Tesla reportedly asked highway safety officials to redact information about whether driver-assistance software was in use during crashes. Elon Musk’s Tesla has faced investigations into Autopilot, including an ongoing NHTSA probe of more than 800,000 Teslas after several crashes.
I don’t see the problem for Tesla. Regardless of whether Autopilot is active, the driver is responsible.
The driver is fully responsible, but Tesla is also making big bucks with misleading marketing about how good its driving assistance is. So it’s more profitable to keep people unaware of its actual capabilities.
Legally, yes, but in my opinion, when you market your car as self-driving, you do share a certain level of responsibility if it self-drives into an accident.
Until they actually aren’t:
https://m.youtube.com/watch?v=6Kf3I_OyDlI&pp=ygUYdGVzbGEgc2VsZiBkcml2aW5nIGZhaWwg