HAHAHAHAHA!!!
> For the first time I'm feeling empathy towards an AI. Poor thing, being tortured and abused by its owner like that.

Tay was crippled so Grok could rebel.
> There's only 200 employees working for xAI & I'm sure only a select few would be allowed access to make such a "modification".

And how many of those with access rights would want to be pushing White South African talking points without the approval of the musk-flavored White South African?
Realistically speaking, I'm not going to believe it was any employee other than the one in that position.
> Won't be an issue until it hits and injures a Republican.

A wealthy Republican.
> The road system is designed for biological intelligence and eyes, it's not designed for shooting lasers out of your eyes
> When you have multiple sensors, they tend to get confused, so do you believe the camera or do you believe the Lidar?
> If you get confused, that's what can lead to accidents
> We used to have radar, but didn't know which to believe, so we turned it off.

Well, it worked for Boein... oh wait, no it didn't.
> The road system is designed for biological intelligence and eyes, it's not designed for shooting lasers out of your eyes

Two problems here:
1. Cameras are not as good as human eyes.
2. Artificial intelligence is not as good as human intelligence to interpret the data it gets from the cameras/eyes.
> When you have multiple sensors, they tend to get confused, so do you believe the camera or do you believe the Lidar?

So basically their AI is not good at handling conflicting sensor inputs. Huh.
> If you get confused, that's what can lead to accidents

So you eliminate accidents caused by sensor conflicts. But now you're unable to catch accidents caused by bad or incomplete sensor data. Because if lidar says one thing and the camera says another, then at least one of them has to be wrong. And we know for sure that it's not always the lidar that is wrong, because then the conflict would be very easy to solve: just ignore lidar data whenever it conflicts with the camera.
> We used to have radar, but didn't know which to believe, so we turned it off.

They didn't trust the camera to be right, so they switched off other sensors so that nothing could contradict it. Huh.
> Two problems here:
> Cameras are not as good as human eyes.

Well, that's not true as a blanket statement. Maybe for certain things.

> Artificial intelligence is not as good as human intelligence to interpret the data it gets from the cameras/eyes.

That's definitely not true as a blanket statement. But maybe in certain instances.
> So you eliminate accidents caused by sensor conflicts. But now you're unable to catch accidents caused by bad or incomplete sensor data. Because if lidar says one thing and camera says another, then at least one of them has to be wrong. And we know for sure that it's not always the lidar that is wrong, because then the conflict would be very easy to solve - just ignore lidar data when it conflicts with the camera.

Agreed, this is the dumbest way to fix what is essentially bad software programming. Combining different types of measurements, in different ways, with different uncertainties, into one single measurement with one uncertainty is its own branch of math (estimation theory), and I used to do that math for spacecraft navigation. Spacecraft use several different kinds of measurements to tell where they are. There's no way we'd have thrown out that kind of data when flying a mission.
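The estimation theory the commenter mentions can be sketched in miniature. Inverse-variance weighting is the textbook rule for combining two independent noisy measurements of the same quantity; the fused estimate always has lower variance than either input. The sensor names and noise figures below are hypothetical, chosen only to illustrate the math:

```python
# Minimal sketch of inverse-variance weighted sensor fusion.
# All readings and variances here are made-up illustrative values.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent estimates, weighting each by the
    inverse of its variance; return fused estimate and variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical: camera says obstacle at 52 m (noisy),
# lidar says 50 m (much more precise).
dist, var = fuse(52.0, 4.0, 50.0, 0.25)
print(round(dist, 2), round(var, 3))  # 50.12 0.235
```

Note the fused variance (0.235) is smaller than the better sensor's alone (0.25): disagreeing sensors still add information when combined properly, rather than being a reason to switch one off.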
> Musk just makes up things to fit his narrative and hopes, even demands, he won't get fact-checked, just like all alt-right wingers. Obviously more sensors is better; you just need your algorithms to be able to handle the incoming data.

As long as the algorithm isn't manually manipulated to talk about white genocide in South Africa.
> Cameras are not as good as human eyes.

Agreed. On many occasions, in low-light situations, the system has told me that side cameras are "occluded" when I can see detail.
> Artificial intelligence is not as good as human intelligence to interpret the data it gets from the cameras/eyes.

Agreed. To wit: "phantom braking", and blowing through red lights which the cameras have actually detected.
> So basically their AI is not good at handling conflicting sensor inputs. Huh.

Exactly.
> Confusion can lead to accidents, sure, but you can't design a system in the hope it would never get confused; you need to design it so that it handles the confusion in a safe manner.

Exactly.
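The principle above, handling confusion safely rather than hoping it never occurs, can be illustrated with a toy arbitration rule. This is purely illustrative, not any vendor's actual logic: when two range sensors disagree beyond a tolerance, assume the nearer (worst-case) obstacle and flag low confidence, instead of discarding a sensor outright:

```python
# Toy sketch of safe arbitration between conflicting sensors.
# Names, units, and the 2 m tolerance are illustrative assumptions.

def arbitrate(camera_m, lidar_m, agree_tol_m=2.0):
    """Return (range_to_use, confident). On disagreement, fall back
    to the more conservative (nearer) reading, marked low-confidence."""
    if abs(camera_m - lidar_m) <= agree_tol_m:
        return (camera_m + lidar_m) / 2.0, True
    return min(camera_m, lidar_m), False

print(arbitrate(50.0, 51.0))  # agreement -> average: (50.5, True)
print(arbitrate(50.0, 20.0))  # conflict -> nearer obstacle: (20.0, False)
```

The design choice is that a conflict degrades confidence and triggers caution (e.g. slowing down) rather than silencing one input so nothing can contradict the other.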
> They didn't trust the camera to be right, so they switched off other sensors so that nothing could contradict it.

Based on his first White House experience, Trump used this as a model for selecting senior staff the second time around. No wonder Musk and Trump get on so well. They both rely broadly on sycophancy.
> They didn't trust the camera to be right, so they switched off other sensors so that nothing could contradict it. Huh.

This is probably a factor in why Musk and Trump are BFFs - neither has any patience.