Tesla Master Plan: Part Deux

  • Thread starter CodeRedR51
  • 1,521 comments
  • 117,439 views
Yeah, adding a hashtag to your profile is such a terrible thing.
If it's meant as an attempt at market manipulation, then it is. When your actions are as thoroughly scrutinized as Elon Musk's are, even trivial things can turn into big deals and certainly he knows that.

But given the general volatility of the cryptocurrency market and the fact that Elon is no stranger to hyping up crypto (he was plugging Dogecoin recently as well, of all things), it's not exactly concrete evidence of intentional wrongdoing either.
 
that Elon is no stranger to hyping up crypto
Until December he certainly was a stranger to it; his only comments about Bitcoin were that he'd never bought or sold any and had no particular interest in it, but that he held 0.25 BTC a friend had given him once.

Now, after Tesla has converted stock into Bitcoin (incidentally, that also means that if you buy a Tesla with Bitcoin, you're technically buying a Tesla with Tesla stock), he's all over it.

If it's meant as an attempt at market manipulation, then it is. When your actions are as thoroughly scrutinized as Elon Musk's are, even trivial things can turn into big deals and certainly he knows that.
Nah, it's just crazy disruptive Elon disrupting things from the fuddy-duddy norm (like when he uses medical-grade IT in automotive settings, rather than automotive-grade; disruption!). So what if he hypes up stock he holds between buying the stock and revealing he's bought the stock? Disruption!


Love this Twitter thread:
 
Highly doubt it. But then again if he sneezes weird the market freaks out. People just love to scrutinize his every move like it affects their personal lives in some way.
Well what do you expect? He's the richest guy on Earth, with a massive fanbase that he can very easily influence. Of course he's scrutinized. Of course the market freaks out when he starts doing weird ****.

And it's not even the first time he's tried to manipulate the market, come on now. He's not some innocent kid who's trying to have fun online. He's very damn aware of what he's doing.
 
Last edited by a moderator:
Well what do you expect? He's the richest guy on Earth, with a massive fanbase that he can very easily influence. Of course he's scrutinized. Of course the market freaks out when he starts doing weird ****.

And it's not even the first time he's tried to manipulate the market, come on now. He's not some innocent kid who's trying to have fun online. He's very damn aware of what he's doing.
My last sentence still stands.
 
It is quite clearly market manipulation. I've no idea if that's against the rules or not when it comes to crypto. Since it's the currency favoured by criminals and wrongdoers, I'm guessing there aren't many rules?
 
I'm not defending Elon Musk in how he tweets and how he can move markets. With these recent tweets though, I don't think the SEC will go after him because I don't see this as fundamentally being much different than what Jim Cramer and the rest of the CNBC talking heads do to move markets on a daily basis. Not saying it's right or wrong necessarily, just that I don't see anything coming out of this.
 
With these recent tweets though, I don't think the SEC will go after him because I don't see this as fundamentally being much different than what Jim Cramer and the rest of the CNBC talking heads do to move markets on a daily basis

I don't think the SEC regulates crypto, so they can't really do anything anyway.
 
I'm not defending Elon Musk in how he tweets and how he can move markets. With these recent tweets though, I don't think the SEC will go after him because I don't see this as fundamentally being much different than what Jim Cramer and the rest of the CNBC talking heads do to move markets on a daily basis. Not saying it's right or wrong necessarily, just that I don't see anything coming out of this.
Looks like he could be, according to this article?
Elon Musk is once again under the watchful eye of the SEC after his company Tesla invested $1.5 billion in Bitcoin, according to Doug Davidson, a former Branch Chief of the Securities and Exchange Commission's Division of Enforcement. With Musk's tweets regarding the cryptocurrency resulting in a rise in its value, the timing of these tweets could be brought into question.
https://www.teslareporter.com/3065/...zg4I6UFrzBDCgkQkhh-Cj1SPljD7lWL5BRjLQ5jC8aIwI
 




I still think having LIDAR is incredibly important, and not using it seems like Not A Good Idea. I'm taking computer vision at my university right now, though, and I think we might talk about it.
 




I still think having LIDAR is incredibly important, and not using it seems like Not A Good Idea. I'm taking computer vision at my university right now, though, and I think we might talk about it.

I also notice that this Tesla ran into another Tesla. Why weren't the two cars talking to each other and sharing data? All semi- or fully-autonomous cars should have some sort of data sharing transponder system.
 
Probably a pre-production test car, so I wouldn't get your hopes up just yet.
If it's a pre-production car, would they not want to be testing the new wheel before reaching production? Also, the wheel in the picture is exactly the same as the facelifted wheel seen on their website, so I think there's a high chance that the regular wheel will at least be an option.

[Image: Tesla steering wheel from their website]




I also notice that this Tesla ran into another Tesla. Why weren't the two cars talking to each other and sharing data? All semi- or fully-autonomous cars should have some sort of data sharing transponder system.
I'm unaware of any brand doing any data sharing yet. I think it would definitely help with accidents like that one, but it's still probably a long way off, especially when the number of "autonomous" cars on the road is minuscule.


Anyways, Matt Farah recently ranted about driver aid tech, and I wholeheartedly agree with his argument





(language warning for the last part of the rant)

 
Anyways, Matt Farah recently ranted about driver aid tech, and I wholeheartedly agree with his argument





(language warning for the last part of the rant)



Or maybe not everyone is a driving enthusiast and there's a huge market for self-driving cars? Just a thought! He's trying really hard to make this complicated; it's not.
 
Last edited:
I think it would definitely help with accidents like that one, but it's still probably a long way off, especially when the number of "autonomous" cars on the road is minuscule.
On the whole that's true, but it's plain to see that Teslas are so common in some parts of the country that they can crash into each other. They should definitely be data sharing. If they were, they might have been able to eliminate the errors that caused this accident, just like GPS satellites do with each other. Since we're all talking about disaster preparedness lately, how about we talk about how Tesla could've prevented this crash through technology, which is sort of their thing.
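No automaker actually exposes a vehicle-to-vehicle API like this today, so purely as a sketch of the kind of "transponder" message sharing being suggested here (every field, name, and threshold below is made up for illustration):

```python
import json
import math
from dataclasses import dataclass, asdict

# Hypothetical V2V ("transponder") message; nothing like this is standardized
# across brands today, so every field here is purely illustrative.
@dataclass
class V2VStateMessage:
    vehicle_id: str
    lat: float          # degrees
    lon: float          # degrees
    speed_mps: float    # metres per second
    heading_deg: float  # 0 = north, clockwise

    def to_wire(self) -> bytes:
        """Serialize for broadcast (over DSRC/C-V2X or similar in a real system)."""
        return json.dumps(asdict(self)).encode()

def too_close(me: V2VStateMessage, other: V2VStateMessage,
              warn_distance_m: float = 50.0) -> bool:
    """Crude proximity check: flag any other broadcaster within warn_distance_m."""
    # Equirectangular approximation; fine over tens of metres.
    dx = math.radians(other.lon - me.lon) * 6_371_000 * math.cos(math.radians(me.lat))
    dy = math.radians(other.lat - me.lat) * 6_371_000
    return math.hypot(dx, dy) < warn_distance_m

me = V2VStateMessage("tesla_A", 37.4275, -122.1697, 20.0, 90.0)
parked = V2VStateMessage("tesla_B", 37.4276, -122.1695, 0.0, 90.0)
print(too_close(me, parked))  # True: the stationary car ahead gets flagged
```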
 
Last edited:
Or maybe not everyone is a driving enthusiast and there's a huge market for self-driving cars? Just a thought! He's trying really hard to make this complicated; it's not.
Of course not everyone is a driving enthusiast, but not being a driving enthusiast is unrelated to properly training people to operate a 4,000 pound piece of metal. With proper training, better infrastructure, and regulations, there would be fewer accidents right now. This would also not be as difficult as having big tech companies create complicated algorithms to try to cover every single possible edge case. A long time ago, I remember hearing a lot of journalists saying everyday drivers in Germany have a much different perspective on driving than Americans. They are more attentive and have more training and more regulations (they banned the use of Tesla's screen while driving because it's dangerous! https://electrek.co/2020/08/04/tesla-wiper-controls-ruled-illegal-germany-crashed/). They don't have autonomous cars yet, but the results speak for themselves:

[Image: road crash deaths chart]


Of course, you could say that's because Germans drive less than Americans, but they don't drive 40% less (https://www.odyssee-mure.eu/publica...ctor/transport/distance-travelled-by-car.html, 8.7k miles vs 13k miles a year, which works out to roughly a third less).

Additionally, not everyone can afford an expensive new autonomous car just so that they are safer on the roads. More training and better infrastructure will benefit everyone equally. Also, current autonomous car technology is not at a level where it can save lives. It may even be more harmful, as current systems are just driver aids. They are not autonomous. They still require the driver's attention. A quick search and you'll find many drivers sleeping or being inattentive while using Tesla's Autopilot. There have been lethal accidents because of this. If they want to save lives, maybe beta testing on untrained drivers is a bad and unsafe idea?

Of course the technology isn't there yet, and in the future it could be helpful, but if you want to save lives, why not work on things that will work now, without spending years of research and development?
 
Last edited:
Of course not everyone is a driving enthusiast, but not being a driving enthusiast is unrelated to properly training people to operate a 4,000 pound piece of metal. With proper training, better infrastructure, and regulations, there would be fewer accidents right now. This would also not be as difficult as having big tech companies create complicated algorithms to try to cover every single possible edge case. A long time ago, I remember hearing a lot of journalists saying everyday drivers in Germany have a much different perspective on driving than Americans. They are more attentive and have more training and more regulations (they banned the use of Tesla's screen while driving because it's dangerous! https://electrek.co/2020/08/04/tesla-wiper-controls-ruled-illegal-germany-crashed/). They don't have autonomous cars yet, but the results speak for themselves:

[Image: road crash deaths chart]


Of course, you could say that's because Germans drive less than Americans, but they don't drive 40% less (https://www.odyssee-mure.eu/publica...ctor/transport/distance-travelled-by-car.html, 8.7k miles vs 13k miles a year, which works out to roughly a third less).

Additionally, not everyone can afford an expensive new autonomous car just so that they are safer on the roads. More training and better infrastructure will benefit everyone equally. Also, current autonomous car technology is not at a level where it can save lives. It may even be more harmful, as current systems are just driver aids. They are not autonomous. They still require the driver's attention. A quick search and you'll find many drivers sleeping or being inattentive while using Tesla's Autopilot. There have been lethal accidents because of this.

Of course the technology isn't there yet, and in the future it could be helpful, but if you want to save lives, why not work on things that will work now, without spending years of research and development?

You're way over-thinking this.

Lots of people don't want to drive in lots of situations. Therefore, there is a big market for self-driving tech.

That's it!
 
You're way over-thinking this.

Lots of people don't want to drive in lots of situations. Therefore, there is a big market for self-driving tech.

That's it!
I don't think that's the argument he was trying to make or what I am trying to say.

Sure, I agree, people have a desire for self-driving technology, but brands are marketing these technologies as a solution for safety rather than as a way to avoid an inconvenience.
 
I don't think that's the argument he was trying to make or what I am trying to say.

Oh I know. You were both trying to suggest that self-driving tech is created to solve the problem of car crashes. It isn't.

Sure, I agree, people have a desire for self-driving technology, but brands are marketing these technologies as a solution for safety rather than as a way to avoid an inconvenience.

...so you don't like the marketing? Your argument here is not that self-driving is not safer, but that by marketing a safe technology as being safer, they are suggesting that this is the only reason for the technology, or a main reason for the technology, which you say is wrong. But the assumption is wrong. They're marketing (AFAIK correctly) that self-driving is safer, and then not taking that next step to suggest that the whole point of developing self-driving technology is to address the problem of safety. But even if they did, your beef would be with that particular statement, not the tech.

So we're back where I was to begin with. There is big demand because people don't want to drive in lots of situations. And the tech at least has the potential to make people safer. So we're good right?

Edit:

For a complete analysis here, this is a strawman:

1) Pretend that the whole point of self-driving is something it isn't
2) Attack that made up position
3) Come to the desired conclusion that self-driving is a waste of time and money.
 
Last edited:
I have an entirely different way to reach the conclusion that self-driving is a waste of time and money -

It will never reach Level 5 status with typical American highway infrastructure
and Levels 3 & 4 will never be truly, safely achievable because of the inherently compromised nature of the squishy humans in the driver's seat. Maybe, maybe Level 4, but again it's hard for me to see how it could work with our road infrastructure.
That leaves level 2.5 parading as "Full Self Driving" (and always 2 years away from Level 5) for the foreseeable future.

I do think that Levels 4 and 5 will be possible, perhaps even routine, for low-speed (35-40 mph max) urban vehicles like buses and taxis.
 
It will never reach Level 5 status with typical American highway infrastructure
I definitely agree with this. All roads, highways and surface streets alike, have nowhere near the standardization levels needed for full autonomy. Surface streets especially would have to be completely redesigned to fit any sort of standard model. Right now the only thing one road really has in common with another is the color of the lines.
 
It will never reach Level 5 status with typical American highway infrastructure

Doesn't have to be a huge deal. If the car doesn't know what to do, it can safely pull off and wait for a human to take over. I'd say "never" goes a bit too far as well; maybe just not "soon". Though... honestly... computers are moving fast.
 
Doesn't have to be a huge deal. If the car doesn't know what to do, it can safely pull off and wait for a human to take over. I'd say "never" goes a bit too far as well; maybe just not "soon". Though... honestly... computers are moving fast.

What you are describing isn't truly level 5 though, which by its own definition requires the car to be able to drive in any condition that a human could drive in - if it's waiting for a human to take over (thus implying the conditions are beyond its capability/understanding, but not beyond a human's), it isn't level 5. That's level 4. I still hedge on level 4 because I think that's possible, but only with an operator/driver that is being paid to be attentive. It's basically the subway operator model.

Level 3 is a non-starter in my opinion - I'd guess basic consumers could/would be lulled into thinking a Level 3 system is infallible and very likely go to sleep or at the very least not be prepared to take over - even if they are intending to be alert/aware. That's the best case. I think a reasonable or even typical case would be people fully ignoring the "be alert" warning and crawling into the backseat for a proper nap.

1 & 2: Great and consumer ready
3: Fundamentally, intrinsically unsafe - should never happen even if it is theoretically quite possible.
4: Aspirational, but probably best suited for commercial applications. Will need additional infrastructure.
5: Almost assuredly geo-constrained and low speed, purely commercial/transit use. Will need additional infrastructure.
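
For anyone losing track of what the numbers actually mean, here's a rough paraphrase of the SAE J3016 definitions being argued over (not SAE's exact wording, just the gist):

```python
# Paraphrase of the SAE J3016 driving-automation levels; see SAE's own
# document for the authoritative wording.
SAE_LEVELS = {
    0: "No automation: the human does all of the driving.",
    1: "Driver assistance: steering OR speed is assisted (e.g. adaptive cruise).",
    2: "Partial automation: steering AND speed are automated, but the human "
       "must supervise the whole time (this is where Autopilot sits today).",
    3: "Conditional automation: the system drives within its domain, but the "
       "human must take over whenever it asks.",
    4: "High automation: the system drives AND handles its own fallback, but "
       "only inside a limited operational design domain (e.g. geofenced).",
    5: "Full automation: the system can drive anywhere, in any conditions, "
       "that a human driver could.",
}

for level, meaning in SAE_LEVELS.items():
    print(f"Level {level}: {meaning}")
```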
 
What you are describing isn't truly level 5 though, which by its own definition requires the car to be able to drive in any condition that a human could drive in - if it's waiting for a human to take over (thus implying the conditions are beyond its capability/understanding, but not beyond a human's), it isn't level 5. That's level 4. I still hedge on level 4 because I think that's possible, but only with an operator/driver that is being paid to be attentive. It's basically the subway operator model.

It's not a big deal. A system which can drive on 98% of what is out there and just pulls over and waits patiently for a user to bypass whatever the problem is would still be extremely useful. It doesn't have to be someone who is paid to be attentive. It could be a remote operator, or someone who travels to the particular site to guide vehicles through. But it could also be a passenger who is asked to come up front to help out.


Level 3 is a non-starter in my opinion - I'd guess basic consumers could/would be lulled into thinking a Level 3 system is infallible and very likely go to sleep or at the very least not be prepared to take over - even if they are intending to be alert/aware. That's the best case. I think a reasonable or even typical case would be people fully ignoring the "be alert" warning and crawling into the backseat for a proper nap.

I don't know whether the "caravan" concept from Tesla is level 2, 3, or 4. The caravan concept is that there is a lead truck, with a driver, probably using something like level 2 autonomy. Behind that there are multiple follow vehicles that are driverless, but which follow the piloted vehicle in front of them very closely (potentially extremely closely) and potentially share telemetry. I don't know where exactly that fits. Maybe it's level 3.

I agree that asking someone to stay alert while not requiring them to stay alert 99% of the time is a recipe for failure. Humans are not good at that.
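
Tesla hasn't published how a caravan mode would actually work, so this is only an illustration of the basic idea of a follow vehicle using telemetry the lead broadcasts; every name and number in it is made up:

```python
from dataclasses import dataclass

# Illustrative only: a follow vehicle in a hypothetical "caravan" keeping a
# fixed time gap behind the lead vehicle, using telemetry the lead broadcasts.
@dataclass
class LeadTelemetry:
    speed_mps: float      # lead vehicle's current speed
    brake_applied: bool   # lead is braking right now

def follow_speed_command(gap_m: float, lead: LeadTelemetry,
                         time_gap_s: float = 0.5) -> float:
    """Return a target speed for the follow vehicle.

    Simple proportional rule: match the lead's speed, then trim it up or down
    depending on whether the current gap is larger or smaller than the desired
    time gap. A real platooning controller would be far more sophisticated.
    """
    if lead.brake_applied:
        # React to the lead's brake signal immediately, before the gap closes.
        return max(lead.speed_mps - 2.0, 0.0)
    desired_gap_m = max(lead.speed_mps * time_gap_s, 2.0)  # never closer than 2 m
    gap_error = gap_m - desired_gap_m
    return max(lead.speed_mps + 0.5 * gap_error, 0.0)

# Cruising at 25 m/s with a 15 m gap where ~12.5 m is desired: speed up slightly.
print(follow_speed_command(15.0, LeadTelemetry(25.0, False)))  # ~26.25
# Lead brakes: back off right away even though the gap hasn't changed yet.
print(follow_speed_command(15.0, LeadTelemetry(20.0, True)))   # 18.0
```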
 
Next significant release will be in April. Going with pure vision — not even using radar. This is the way to real-world AI.
I don't understand why they don't use all the different types of sensors so they can all cross-reference each other. Fully autonomous driving will require redundancy to be effective, just like aviation does. I don't feel comfortable operating automated systems that aren't redundant. In semi-autonomous systems, I am the redundancy, which is fine, but in the future they're planning cars without controls, and that's pushing it in my opinion. It's a great idea if done properly, but I feel like all these various detection systems need to be onboard and cross-referencing each other.
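
To make the "cross-referencing" idea concrete, here's a toy sketch of redundant range estimates from different sensors being checked against each other before the car acts on them. It's not how Tesla or anyone else actually fuses sensor data, just the gist:

```python
from statistics import median
from typing import Dict, Optional

def fused_obstacle_range(readings: Dict[str, Optional[float]],
                         max_disagreement_m: float = 3.0) -> Optional[float]:
    """Cross-check range estimates (in metres) from independent sensors.

    Toy redundancy scheme: ignore sensors that returned nothing, take the
    median of the rest, and refuse to report a value at all if the sensors
    disagree by more than max_disagreement_m (alert / hand control back).
    """
    values = [r for r in readings.values() if r is not None]
    if len(values) < 2:
        return None  # no redundancy left, don't trust a single source
    if max(values) - min(values) > max_disagreement_m:
        return None  # sensors disagree: flag a fault instead of guessing
    return median(values)

# Camera, radar and lidar roughly agree: report the median.
print(fused_obstacle_range({"camera": 41.8, "radar": 40.2, "lidar": 40.9}))  # 40.9
# Camera thinks the road is clear while radar sees something: don't guess.
print(fused_obstacle_range({"camera": 120.0, "radar": 38.5, "lidar": None}))  # None
```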
 
Last edited: