Uber self-driving fatality raises questions about autonomous vehicles


Uber puts self-driving trials on hold after a pedestrian was hit and killed by one of its cars in autonomous mode.


An Uber self-driving car has hit and killed a pedestrian in the US, marking the first known death of a pedestrian struck by a car in autonomous mode on a public road.

The accident, which happened in Phoenix on Sunday night (local time), is still being investigated by Arizona police.

So far, reports suggest the 49-year-old pedestrian was not using a designated crossing when she was struck by the Uber-controlled Volvo SUV, which had a “human back-up operator” behind the wheel at the time but was operating in autonomous mode at about 60km/h.

Some early reports quote police as saying that Uber is likely not at fault in the accident. (Other reports suggest the Uber was speeding slightly at the time, and showed no signs of braking before hitting the pedestrian.)

Uber, meanwhile, has suspended all public road testing trials of its self-driving cars, and speculation has begun over the impact the incident will have on the self-driving industry, as well as the broader AI tech sector.

As Bloomberg reports, the accident comes at a critical time for the AV industry, with big names like Alphabet (Google), General Motors, and Tesla tipping in billions of dollars, alongside Uber, to develop the technology.

And as MIT Technology Review notes, the accident also comes amid what looked like rapid progress on the technology, and a push to loosen legal restrictions. Waymo, the Alphabet subsidiary spun out of Google, was moving to take safety drivers out of its AVs and had broached plans to launch a driverless taxi service in Phoenix later this year, MIT Technology Review said.

But all this hinges on the companies holding the confidence of both consumers and the regulators in cities like Phoenix who have opened their roads to public testing of the technology.

As Inc.com has put it, this was the incident all of the above companies were dreading: “when a self-piloting car, with a human behind the wheel or not, plowed into someone because there was some combination of conditions that programmers didn’t anticipate.”

Certainly, it raises all sorts of questions, including the most stinging to the industry (and the subject of a Fox News online poll today): should the autonomous vehicles program continue after the deadly crash? At the very least, there are calls to slow it down.

Bryan Reimer, a research scientist at MIT who studies automated driving, says the accident offers “clear proof” the technology is not ready for commercial roll-out.

“Until we understand the testing and deployment of these systems further, we need to take our time and work through the evolution of the technology,” he said.

On the other hand, many in the pro-autonomous camp have been quick to point out that pedestrians are hit and killed by human-operated cars on an almost hourly basis in some US states.

But the fact remains that one of the key public sales points of this type of autonomous vehicle technology is safety: that computers and AI systems can drive better and react faster than any person.

As Bloomberg reporter Eric Newcomer put it, the repercussions will come down to a “deeper question of do we expect self driving cars to operate more effectively than human drivers?”

 

The Arizona accident is also drawing comparisons with the Tesla Model S collision with a truck in Florida in May 2016, which claimed the life of the car’s owner – who was behind the wheel at the time, but with the car set to Autopilot.

In that case, the fallout of the fatality was largely contained to Tesla, says Bloomberg, and didn’t seem to hold the industry back.

But experts are warning that the Uber crash is a whole different ball-game.

“A lot of us were surprised that the Tesla fatality did not have greater consequences. The Uber fatality could turn out to be the thing that makes the general public more skeptical,” said Bryant Walker Smith, a professor at the University of South Carolina’s School of Law who studies driverless car regulations.

“In Tesla’s Florida crash, the car was purchased by and used by the victim. In the Arizona crash, the vehicle was a test vehicle under the control in every sense by Uber, and the victim was an ordinary person,” Smith told Bloomberg.

“People are going to be aware of this tragedy and this death, even if they are unaware of the hundreds of other people who died in motor crashes today,” he said.



30 Comments
  1. Joe 2 years ago

    People die on the roads every day and no one besides the victim’s immediate family and friends gives it a second thought. An autonomous vehicle gets involved in a road death and it is like ‘end of days’ with full scale public blowback.

    • Stan Hlegeris 2 years ago

      Exactly. 100 additional Americans, more or less, many of them pedestrians, were killed on the same day by meat-bag driven vehicles. The same again today, and tomorrow. This is news only because it suits the interests of those who profit from delaying progress.

      • Wallace 2 years ago

        Your overall argument is good but in this case there’s a problem which must be corrected. This car should have, at the minimum, been hard braking when it struck the person.

        There’s no reason a well developed self-driving system should not detect something as large as a human moving into its direction of travel. There may be instances where some person or animal darts out from between parked vehicles or from behind vegetation and enters the vehicle’s path too fast and close to physically stop the vehicle. But they should be detected and brakes applied.

        Of course we need to withhold judgement until we have more facts. A Tesla on Autopilot was reported to have hit another car a week or so ago. After the facts were gathered, it turned out that the Tesla in question was not fitted with Autopilot capabilities. Jump-the-gun reporting.

  2. MaxG 2 years ago

    Typical flat-earther response… we already know that self-driving cars are involved in fewer accidents than humans… I’d rather see humans removed from the driver’s seat in order to stop the carnage.

    • Matthew Tucker 2 years ago

      Statistics, please. The stats I have say the opposite.

      • Andy Bowe 2 years ago

        Matthew, when Tesla upgraded their Autopilot software over 12 months ago, the research found that a Tesla with it running (not even in control) had a five-fold lower accident rate. So yes, MaxG is correct with the available info. PS: insurance companies are recognising it too, and offering discounts. Please note: if you ask others for stats while claiming yours are different, perhaps providing yours first and then asking for a counter is a better approach?

      • MaxG 2 years ago

        Sure you do… 🙂 no problem…
        Think climate change: there is research pro and con; it depends which is perceived as weighing in heavier 🙂

        The minute I see something ‘attacked”, I always ask if the same rigour has been applied to the claimed ‘good’ aspect.

        I have done professional analysis of problems for decades and have to admit that any statistics without stating the baseline assumptions and constraints are useless (hence, they can be skewed in any direction desired).

        Some sources:
        A pretty good one:
        http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0184952

        Fair enough:
        https://www.axios.com/humans-cause-most-self-driving-car-accidents-1513304490-02cdaf3d-551f-46e6-ad98-637e6ef2c0b9.html

        The typical on-the-fence (you need to do this and that, and some psychology):
        https://www.scientificamerican.com/article/are-autonomous-cars-really-safer-than-human-drivers/

        • Wallace 2 years ago

          Bull. The data confirming climate change and the driving forces is overwhelming.

  3. Tom 2 years ago

    Let’s see kilometers driven per pedestrian killed and compare whether human drivers are better.
    Fifty bucks says they are much worse.

    • MaxG 2 years ago

      I bet a hundred. 🙂
      We do not need cars to kill a person, take a gun and visit a school — as they do in the US.

      • Joe 2 years ago

        ……30,000 people a year are shot to death in The Land of the Free, Home of the Brave.

    • Andy Saunders 2 years ago

      I’m sure you’re right, but unfortunately that’s mostly irrelevant.

      See the airline industry, where accident rates like those experienced on the roads would cause immediate, huge outrage. So they are forced to invest to reduce risk to minuscule probabilities. So it will be with autonomous vehicles.

    • Matthew Tucker 2 years ago

      There is one death on the road per 160 million km driven. The total number of km driven by fully autonomous vehicles is around 560,000. So, at the moment, the ratio is much, much worse for autonomous vehicles.

      • Tom 2 years ago

        Interesting. Links to the stats?
        We need some more km to make it statistically significant.

  4. MaxG 2 years ago

    I drive a car with a list of features that provide some autonomous functionality: it does stop-and-go traffic by itself, and has a LIDAR-based adaptive cruise control and collision-prevention system. Based on this experience, I have seen (or more so imagined) situations which neither a computer nor I could have avoided (had they occurred).
    Think for a moment: a car at 60km/h covers about 16.7 metres per second. If something suddenly enters its path within that distance, the car cannot stop on physics alone, even before taking driver or system reaction time into account.
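The 60km/h figure in the comment above can be checked with basic kinematics. A minimal sketch, assuming an illustrative braking deceleration of 7 m/s² (roughly dry asphalt) and a hypothetical 1.5 s human reaction time; neither value comes from the article or any particular vehicle:

```python
# Stopping-distance arithmetic for the 60 km/h example above.
# The 7 m/s^2 deceleration and 1.5 s reaction time are illustrative
# assumptions, not measured values for any particular vehicle.

def stopping_distance(speed_kmh, reaction_s, decel=7.0):
    """Reaction distance plus braking distance, in metres."""
    v = speed_kmh / 3.6                 # km/h -> m/s
    reaction = v * reaction_s           # distance covered before braking starts
    braking = v ** 2 / (2 * decel)      # from v^2 = 2 * a * d
    return reaction + braking

print(f"60 km/h = {60 / 3.6:.1f} m/s")
print(f"Instant braking: {stopping_distance(60, 0.0):.1f} m")
print(f"1.5 s human reaction: {stopping_distance(60, 1.5):.1f} m")
```

Even with zero reaction time, a car needs roughly 20 m to stop from 60km/h under these assumptions, which is the commenter's point: anything entering the path inside that envelope cannot be avoided by braking alone.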

    • Rod 2 years ago

      I guess we will never know the full circumstances, but if I was driving late at night (or early morning) and noticed a stationary pedestrian or cyclist on the median or footpath, I would instinctively slow and cover my brakes. I would be very surprised if the programming in AVs covers this scenario.
      There is too much money tied up in this to stop it, but every death will be scrutinised much more than in a non-AV incident.
      I can see a day when pedestrians and cyclists, due to their low footprint, will be required to carry a transponder.
      Maybe an implanted chip. I’ll send the suggestion to Peter Dutton and Broader Farce.

      • MaxG 2 years ago

        If you would even notice it: dusk, greenery, obstructions, etc.
        Like you said, we will never know, and the media has nothing better to do than crank up every nuisance.

      • Wallace 2 years ago

        ” I would be very surprised if the programming in AVs covers this scenario.”

        The programming should cover this. The car should calculate distance to ‘target’ and speed. And closely observe for any movement in the direction of the vehicle’s path.

        The superiority of self-driving (once mature) will be that the vehicle will be able to very closely monitor the potential problem while simultaneously monitoring the rest of the environment for problems. A human driver might carefully watch the identified problem and not see something happening on the other side of the street.
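The distance-and-speed check described in the comment above can be sketched as a simple time-to-collision (TTC) test. The 2.5 s threshold and the example inputs are illustrative assumptions, not values from any real AV stack:

```python
# Minimal time-to-collision (TTC) brake decision, sketching the
# "calculate distance and speed" logic described in the comment above.
# The 2.5 s threshold is an illustrative assumption.

def should_brake(distance_m, closing_speed_ms, ttc_threshold_s=2.5):
    """Brake if the object would be reached within the TTC threshold."""
    if closing_speed_ms <= 0:       # object is not closing on the vehicle
        return False
    ttc = distance_m / closing_speed_ms
    return ttc < ttc_threshold_s

# A pedestrian 30 m ahead, car closing at 16.7 m/s (60 km/h):
print(should_brake(30.0, 16.7))     # TTC ~ 1.8 s, so True
```

A real system would fuse multiple sensors and track object trajectories rather than use a single range reading; this only illustrates why a detected pedestrian in the vehicle's path should trigger braking well before impact.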

        • Rod 2 years ago

          Yes, it should, but does it now?
          If it is dark and late at night, is the logic there to say “drunks about”? If it is raining, can the software adjust multiple parameters, like travelling below the speed limit? In this incident the car was actually 3mph above the speed limit, at night!
          Can it give a reasonable estimation of whether a stationary object might move into its path unexpectedly?
          Is it capable of scanning far ahead to predict a problem like a good, experienced driver can? Almost a sixth sense.
          As a cyclist I long for the day we take idiots out of cars and most of the cars disappear due to car sharing and AV hailing.
          But until the safety superiority is proven I remain sceptical and concerned.

          • Wallace 2 years ago

            There are several companies developing self-driving systems, and we don’t know how advanced any of them are.

            I want to know more about this incident.

          • Rod 2 years ago

            Vulnerable road users are undoubtedly harder to “see”.
            There was a story about a year ago about a Tesla using the earlier software killing a cyclist in Ireland. It won’t take many similar incidents before the population demands public roads cease being used as test tracks.

          • Wallace 2 years ago

            Based on what I see in the video she should have never been struck. We have technology to see into the dark – radar, lidar, ultrasound and IR cameras.

            I’m leaning toward Volvo being at fault but will wait to see whatever other facts might be released.

          • Rod 2 years ago

            Agreed, she was moving across several lanes rather than off the curb or from behind an object. Hard for a human to see at night but the various sensors should have.

          • Wallace 2 years ago

            Here’s an image taken in the dark with an infrared scouting camera.

            https://uploads.disquscdn.com/images/ae39bf77f12db13054674e28cf50fb83af6d6e8eac270ebfaa73a8f50bb2cf0a.jpg

            You can turn a cheap digital camera into an infrared camera by eliminating the IR filter in front of the sensor. I can’t see why a self-driving vehicle could not have seen into the dark and detected an object moving toward the car’s path.

  5. palmz 2 years ago

    Not bagging out AVs, but I am surprised that it was doing 38mph in a 35mph zone.

    Hopefully this accident can be used to improve the programming of these cars and help prevent similar incidents. (As stated, it did have a human backup driver, so we cannot say a human would have done better.)

  6. MaxG 2 years ago

    On another note, referring to road accidents: in an accident in Germany there will be no 100% right or wrong, at fault or not, based on the premise that the minute you participate in traffic you accept the inherent risk of an accident. E.g. in the extreme case of an oncoming vehicle, if you do nothing to avoid the collision you are partially at fault.
    In the context of self-driving cars and this accident: stepping onto the road without regard for anything certainly does not constitute innocence (as in ‘not at fault’) on behalf of the victim. Please think about this for a moment…

    • Wallace 2 years ago

      Correct, the person who was killed bears some portion of the blame. Perhaps a large percentage if she quickly darted in front of the vehicle.

      But if the car did not slow and apply the brakes before the pedestrian was struck, then the self-driving system is not ready for use.

      We don’t yet know enough about the incident.

  7. Ian 2 years ago

    I want a self driving car, stupid woman, she could jeopardise all that. In the future, if I send my car to fetch me a beer, I don’t really care if it knocks down 10 people ,just don’t get in its way. People die on the roads every day … an autonomous vehicle gets involved… and it’s like ‘end of days’ – Seriously? People’s comments send shudders down my spine.

