Automotive
Auto News
Bad First Day: Navya Self-Driving Shuttle Ends Up in Accident With a Semi
[QUOTE="Eunos_Cosmo, post: 12054836, member: 137826"]
I agree. I find these "ethics" questions to be kind of a false binary. It's not even an ethical question in my opinion; it's a calibration question. The 'question' is not between 'careen into the concrete pole' or 'murder innocent pedestrians' but more about how to do neither of those things. I have difficulty imagining a real scenario in which there isn't a third option with less dismemberment, and a computer with gigs of RAM should be able to find that third option rather more quickly than our lizard brains. I imagine in most cases the car will just be instructed to brake 100%, as swerving is seldom advisable anyway. If that's not enough, then the crash was likely unavoidable.

That all being said, I think the real problem (possibly intractable) is making autonomous cars work in situations they were not designed to encounter. Do autonomous cars have any sort of intelligence, or do they just follow pre-programmed commands based on sensor input? I see this as the biggest obstacle to autonomous cars going faster than 20 mph.
[/QUOTE]
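The "brake first, swerve only if braking can't stop in time" policy described in the quote can be sketched roughly as below. This is a toy illustration, not how Navya or any real autonomous-vehicle stack works; the function names, the 8 m/s² deceleration figure, and the decision thresholds are all hypothetical.

```python
# Toy sketch of a "brake first" collision-avoidance policy.
# All names and numbers here are hypothetical, for illustration only.

def stopping_distance(speed_mps: float, decel_mps2: float = 8.0) -> float:
    """Distance needed to brake to a stop at a constant deceleration
    (assumed ~8 m/s^2, roughly a hard stop on dry pavement)."""
    return speed_mps ** 2 / (2 * decel_mps2)

def choose_action(speed_mps: float, obstacle_distance_m: float,
                  clear_lane_available: bool) -> str:
    """Prefer full braking; consider swerving only when braking alone
    cannot stop the vehicle before the obstacle."""
    if stopping_distance(speed_mps) <= obstacle_distance_m:
        return "brake"             # braking alone avoids the obstacle
    if clear_lane_available:
        return "brake_and_swerve"  # the "third option" the post mentions
    return "brake"                 # unavoidable: still brake to shed energy
```

For example, at roughly 20 mph (about 8.9 m/s) the assumed deceleration stops the vehicle in about 5 m, so with 10 m of clearance the policy simply brakes; at twice that speed and the same clearance it would consider a swerve only if an adjacent lane is clear.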