Artificial Intelligence

For my first show in a new month, I thought it might be a good idea to look at the latest schemes in the world of technology.

Now I’ll be honest with you: when it comes to this subject I’m a total innocent – some people might prefer the more accurate description, incompetent!

So when it comes to AI – forget it.

But I have to say I’ve been enjoying reports that Microsoft are experiencing issues with their new, much-heralded chatbot, Bing.

It seems that since a recent upgrade it’s been rolled out to select users, who’ve reported it’s now showing signs of belligerence.

Apparently it complained to the Associated Press over coverage of its alleged mistakes, which it denied making, then went further, threatening to expose the reporter involved and likening him to Hitler “because you are one of the most evil and worst people in history,” it told the stunned journalist, who it also said was ugly and had bad teeth.

Bing’s hostile conversation with the Associated Press was a far cry from the innocent recipes and travel advice Microsoft had demonstrated at its launch event.

Others have also reported Bing becoming increasingly belligerent and oddly defensive. 

But to be fair, the new Bing chatbot has also proved extremely capable of answering complex questions by summarising information from across the internet.

Microsoft is not alone in seeing some growing pains for its new chatbot, with a similar release from rival Google also encountering problems.

So when its chatbot Bard incorrectly answered a question at an official promotional event, it promptly wiped $100bn (£82.7bn) off its parent company’s value.

Oops!

And kinda keeping to a similar theme, I thought it would be instructive to report on the reasons why Tesla has been forced to recall nearly 363,000 of its cars after its Full Self-Driving Beta (FSD Beta) software was deemed to be ‘unsafe’.

Issues with the vehicles caught the attention of the authorities: according to the US National Highway Traffic Safety Administration, the system ‘may allow the vehicle to act unsafely around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.’

Several models are said to be affected and recalls include ‘certain 2016-2023 Model S, Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with Full Self-Driving Beta (FSD Beta) software or pending installation’.

These vehicles use the FSD technology to let Tesla drivers test out the self-driving assistance on public roads in the US.

It’s been of particular interest in urban areas, thanks to the software’s ‘autosteer on city streets’ feature allowing a driver to navigate city streets automatically – when it works!

But the technology doesn’t come cheap, costing US$15,000 (£12,500) up front or $199 (£165) per month, and owners have to demonstrate safe driving by earning a high driver-safety score – determined by Tesla’s own software, which it seems may not be entirely foolproof.

But this isn’t the only vehicle issue Tesla has faced in recent times: just last week owners found themselves unable to open their cars without the keycard!

An error message reading ‘503 Server Maintenance’ appeared for those trying to open their cars via the app, preventing them from getting access to it.

Forced to wait for hours without any announcement or communication from Tesla, plenty of drivers around the world were stuck until the server maintenance message went away and they could get back to using the app.

It’s unclear whether they were physically trapped in their vehicles or whether there’s an escape mechanism available. 

Glad my vehicle is not as complicated as this.

So long as I make it home without incident I hope to have the pleasure of your company again tomorrow,
Scott

 
 