Another autonomous car worry

It's been about a year since the posting in this thread specifically about autonomous cars, so I thought I would create a new one.

Video shows an autonomous car being fooled by projected street signs (YouTube video at link). In the demo, a drone projects (after dark, to be sure) a 60 kph speed limit sign onto a building within a 30 kph zone. While the driver kept full control of the car, a second computer monitoring the car's autopilot showed that it had read the sign and reset its maximum speed to 60 kph. Worse, the drone could flash the sign for as little as 0.1 seconds: long enough for the car's computer to detect and "correctly" interpret it, but short enough that a driver would likely never notice it unless he happened to be looking right at the projection site.

Cory Doctorow, the author of the linked BoingBoing.net article, pointed out some other possible dangerous scenarios, and the car maker explained that a single sign would be unlikely to override the other checks within their semi-autonomous driving system. But how about this as a science-fictiony scenario:

You are basically riding along with your Tesla Model X's autopilot more-or-less in full control when your car begins to shift to the slow lane of an unfamiliar freeway, then takes an off-ramp onto a side road. Knowing that the autopilot will head to the nearest charging station when the car's batteries are getting low, you ignore this change in direction until you look up and find yourself traveling on a small winding road in a heavily wooded region. Eventually your car turns into a large field where there are ten other Teslas and a half-dozen men carrying large rifles. One of them orders you out of your car while another strips you of any valuables, then tells you to join the other Tesla drivers and passengers whom you spot tied up in one corner of the field.

LONG afterwards you find out that what the robbers had done was flash a series of signs warning of "DETOUR AHEAD" along with other appropriate sign instructions directing any autonomous car system to that remote field.
dv
Stupid worry.

1) Your GPS knows what the speed limit is without reading it. It would be one thing to flash an orange speed limit sign at an SDC to get it to slow down, but any speed limit higher than the known value for the road you're on should be ignored or require user approval.

2) Making the SDC as "slow" as a human (signs need to be visible for X seconds, need to be visible more than once, etc.) would be a pretty easy way around this "hack."

Inputs should be double-checked and validated. Think about it - when you're driving through a small town you know damn well is a speed trap, and you know you're in a 25, then you see a sign for 55, do you immediately accelerate to 60 or do you kinda poke along tentatively somewhere between 25 and 55 until you see a second sign to confirm? (Traffic permitting.) Shouldn't SDCs be equally suspicious?

3) Route changes and construction zones, etc., should require human intervention anyway, possibly disabling self-driving altogether. Otherwise you have to intentionally create a decision tree where the car thinks driving 55mph on the shoulder in the wrong direction on an interstate is a good idea. That's called "tempting fate."
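The validation rules in the post above (trust the map over the camera, require persistence, require confirmation for anything that raises the limit) can be sketched as a few lines of code. This is a hypothetical illustration, not any real vehicle's logic; the function name, thresholds, and units are all invented for the example.

```python
# Hypothetical sketch of the idea above: treat a camera-detected
# speed-limit sign as untrusted input, and validate it against the
# map's known limit plus a minimum on-screen persistence before
# acting on it. All names and thresholds here are made up.

def accept_speed_sign(detected_kph, map_limit_kph, seconds_visible,
                      min_visible=2.0):
    """Return the speed limit the car should adopt.

    A sign that flashes for a fraction of a second, or that claims a
    limit above what the map says for this road, is ignored (a real
    system might instead queue it for driver approval).
    """
    if seconds_visible < min_visible:
        return map_limit_kph   # too brief to trust: likely spurious
    if detected_kph > map_limit_kph:
        return map_limit_kph   # higher than the known limit: ignore
    return detected_kph        # lower limits are the safe case to honor

# The article's attack: a 60 kph sign projected for 0.1 s in a 30 zone.
print(accept_speed_sign(60, 30, 0.1))   # 30 -- rejected
# A genuine, persistent 25 kph school-zone sign is honored:
print(accept_speed_sign(25, 30, 5.0))   # 25
```

Note that even a persistent projected "60" fails the second check, which is the point of rule 1: the map is the authority for anything that would make the car go faster.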
You are simply ignoring many other possibilities. The group doing this experiment chose the most obvious case of a dangerously high speed limit in a district not designed to handle it, but what about some of the other examples, such as projecting a "SCHOOL ZONE AHEAD / SLOW TO 25 MPH" sign onto a superhighway, or, like my example above (expanding on Doctorow's fragment), someone projecting detour signs leading cars to unsafe places? (Although I think Doctorow was thinking more along the lines of a road filled with unsafe conditions, as opposed to my idea of hijackers misdirecting selected traffic.)
ukimalefu Rebel? resistance? why not both?
Quote:
they were able to defeat a Renault Captur’s “Level 0” autopilot (Level 0 systems advise human drivers but do not directly operate cars)


https://arstechnica.com/cars/2019/06/sp ... eet-signs/

-

Recently I saw news of many people getting lost in some dessert because the GPS told them to take a shortcut, turns out the maps in the devices were outdated and didn't show the road was not in use anymore. Also, people are stupid. Human error.

-

Technology may not be perfect, but people are definitely not perfect.

Still waiting for the movie about the killer self driving car.
obvs precoupado
I think it's interesting that we treat hypothetical things like this as major stumbling blocks, but don't think about the fact that almost everyone drives around looking at their phones and taking their focus off the road, which is a far bigger stumbling block.
C. Ives Lacks Critical stick fiddling Thinking
ukimalefu posted:


Recently I saw news of many people getting lost in some dessert because the GPS told them to take a shortcut, turns out the maps in the devices were outdated and didn't show the road was not in use anymore. Also, people are stupid. Human error.

Image
C. Ives posted:
ukimalefu posted:


Recently I saw news of many people getting lost in some dessert because the GPS told them to take a shortcut, turns out the maps in the devices were outdated and didn't show the road was not in use anymore. Also, people are stupid. Human error.

Image

I don't know the context but that cheesecake looks delicious.
Edit: oh yeah typo. Still looks good, but can be improved on with a topper of chocolate and caramel drizzle and crushed peanuts. Maybe a bit of sea salt too.

Last edited by Betonhaus on Sun Jul 07, 2019 5:53 pm.

dv
DEyncourt posted:
You are simply ignoring many other possibilities. The group doing this experiment chose the most obvious case of a dangerously high speed limit in a district not designed to handle it, but what about some of the other examples, such as projecting a "SCHOOL ZONE AHEAD / SLOW TO 25 MPH" sign onto a superhighway, or, like my example above (expanding on Doctorow's fragment), someone projecting detour signs leading cars to unsafe places? (Although I think Doctorow was thinking more along the lines of a road filled with unsafe conditions, as opposed to my idea of hijackers misdirecting selected traffic.)


If you think I ignored that, you didn't read or understand my post.
dv posted:
DEyncourt posted:
You are simply ignoring many other possibilities. The group doing this experiment chose the most obvious case of a dangerously high speed limit in a district not designed to handle it, but what about some of the other examples, such as projecting a "SCHOOL ZONE AHEAD / SLOW TO 25 MPH" sign onto a superhighway, or, like my example above (expanding on Doctorow's fragment), someone projecting detour signs leading cars to unsafe places? (Although I think Doctorow was thinking more along the lines of a road filled with unsafe conditions, as opposed to my idea of hijackers misdirecting selected traffic.)


If you think I ignored that, you didn't read or understand my post.

You understand that an automated driving system (ADS) would HAVE to respond relatively quickly to a LOT of signs such as "SCHOOL ZONE AHEAD". Sure, an ADS could be programmed to ignore THAT particular sign while on a superhighway, but how about "ROAD CLOSURE AHEAD / FOLLOW DETOUR / ==>"? How do you program around THAT kind of unexpected situation?

I agree that any ADS should ultimately fall back on the driver's feedback, but we already know how reliable that can be.
dv
DEyncourt posted:
You understand that an automated driving system (ADS) would HAVE to respond relatively quickly to a LOT of signs such as "SCHOOL ZONE AHEAD".


A useful system wouldn't need to respond faster than a human would, and should certainly be expected to double or triple-check inputs enough to realize a sign disappeared.
dv posted:
DEyncourt posted:
You understand that an automated driving system (ADS) would HAVE to respond relatively quickly to a LOT of signs such as "SCHOOL ZONE AHEAD".


A useful system wouldn't need to respond faster than a human would, and should certainly be expected to double or triple-check inputs enough to realize a sign disappeared.

Again, good suggestions, but what about when, traveling on a superhighway, a projected sign appears on an overpass JUST as your car is passing by? Your car cannot backtrack to check for longer persistence. Sure, the ADS could wait for confirmation further along the road, but I'm certain that (especially evil-intending) hackers would take that into consideration.
DukeofNuke FREE RADICAL
ukimalefu posted:


Still waiting for the movie about the killer self driving car.

Ahem ...

Image

https://www.imdb.com/title/tt0085333/
Or (indeed demonstrating literal autonomous driving):

Image

Not that I EVER watched that TV show (I was already too old, at least in temperament, when it was broadcast).
obvs precoupado
It was so funny to see Mr. Feeny be the voice of a car.
obvs posted:
It was so funny to see Mr. Feeny be the voice of a car.

Sorry, but your "obviously" current cultural reference did this to me: Image

I have no idea who Mr. Feeny is/was. I am guessing that you must be referring to KITT and not Christine (which I believe lacked any human voice, though I have only seen brief fragments of ANY version of that movie while channel-surfing), but I lack the motivation even to check IMDB.com's entry for "Knight Rider" to find that actor's name and see where and when he played Mr. Feeny, to understand why that might be funny.
Yeah, yeah Image Get off my lawn!
obvs precoupado
:lol:

That cultural reference is 26 years old.
Pariah Know Your Enemy
The solution seems obvious: since all SDCs will be networked, road information will be crowd-sourced automatically and pushed to the map servers at nanosecond speed. Redundant sign sightings will be fed into an algorithm, and spurious one-off contradicting readings will be discarded.
There will be master cars that can be used to map new construction zones and detours, and it will be a legal requirement to register the intent to create a traffic diversion. These registrations will be entered into the system automatically, along with the already-digitized construction plans.

Or something like that :shrug:
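The "discard spurious one-off readings" part of that scheme is easy to picture as a quorum vote over pooled sightings. A minimal, purely hypothetical sketch (the function name and quorum size are invented for the example):

```python
# Hypothetical sketch of the crowd-sourcing idea above: pool sign
# sightings reported by many networked cars and discard any reading
# that wasn't confirmed by a minimum number of independent cars.

from collections import Counter

def consensus_limit(sightings, quorum=3):
    """Return the speed limit most cars reported, or None if no
    value was seen by at least `quorum` independent cars."""
    counts = Counter(sightings)
    value, n = counts.most_common(1)[0]
    return value if n >= quorum else None

# Nine cars read the painted 30 kph sign; one caught the projected 60:
print(consensus_limit([30] * 9 + [60]))   # 30
# A single unconfirmed reading is discarded entirely:
print(consensus_limit([60]))              # None
```

A drone-projected sign that only one passing car ever sees never reaches quorum, which is exactly the property the post is counting on.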
Pariah posted:
The solution seems obvious:


Yeah. Trains, buses, and taxis. Done.
obvs posted:
:lol:

That cultural reference is 26 years old.

Yeah, but that means I was about 34 when Mr. Feeny happened, so perhaps not age-appropriate for me?
DEyncourt posted:
obvs posted:
:lol:

That cultural reference is 26 years old.

Yeah, but that means I was about 34 when Mr. Feeny happened, so perhaps not age-appropriate for me?

It was prime time TV...
macaddict4life posted:
DEyncourt posted:
obvs posted:
:lol:

That cultural reference is 26 years old.

Yeah, but that means I was about 34 when Mr. Feeny happened, so perhaps not age-appropriate for me?

It was prime time TV...

Doesn't mean that I watched such.
And YOU Image Get off my lawn!
ukimalefu Rebel? resistance? why not both?
DukeofNuke posted:
ukimalefu posted:


Still waiting for the movie about the killer self driving car.

Ahem ...

Image

https://www.imdb.com/title/tt0085333/


I know about that. Maybe a remake with a Tesla.