
A few weeks ago, when Tesla Motors founder Elon Musk was talking up the proposed merger of his electric vehicle company with the SolarCity business run by his cousin, he boldly predicted that Tesla could become the world’s first trillion-dollar company.


“I think as a combined automotive and power storage and power generation company; I think the potential is there for Tesla to be a $US1 trillion company, market cap company,” Musk said.

“So, if we play a major role in the transition of the world to a new form of energy generation and storage and transport, that’s what kind of happens.”

Musk’s vision is clear – a future energy system that is a mixture of solar power, stationary storage and electric cars that will power the planet, both for its electricity and much of its transport needs.

“Solar power, stationary storage, electric cars; this is Earth’s solution. And we’re going to try to make that happen as fast as possible, and the fundamental good of SolarCity and Tesla will be measured by the degree to which we accelerate that transition.

“Everywhere it’s going to be solar and battery. There will be some wind, some geothermal, some hydro, and there will be some long tail before the final coal plant and the final natural gas plant stop operating.”

Those comments should not be surprising, given the strength of Musk’s vision, the success of Tesla to date and of his SpaceX venture, and his own high personal stakes in the Tesla and SolarCity franchises.

The question that has emerged since then is to what extent that vision is damaged by news of the first death of a driver using the Tesla “autopilot” system.

The accident occurred more than a month before the unveiling of the SolarCity merger, and only came to light after safety authorities advised Tesla they would be conducting investigations into the autopilot technology.

It has sparked a fierce debate about whether this signals the end of the self-drive dream, or is an inevitable, if tragic, accident that could make the technology even safer in the long term.

Joshua Brown

What we know about the accident is that 40-year-old Joshua Brown, a Tesla enthusiast, was driving along a highway in Florida when a large tractor-trailer pulled out in front of his Model S. There seems to be no doubt that the heavy vehicle was at fault, and the truck had no protection such as side underrun guards. This link provides a useful timeline.

What is not understood is why the sensors on the autopilot system failed to detect the tractor-trailer, and kept the Model S at the same speed and heading before it ploughed into the underside of the heavy vehicle, shearing off the roof and killing the driver instantly.

Tesla has been criticised for not releasing details earlier, and for not putting out a warning on the operation of its autopilot, although it does insist that drivers should remain alert and keep their hands on the steering wheel. (There are reports that the driver killed in Florida was watching a video on his iPad at the time of the crash.)

Others say an accident at some time was inevitable, and more such fatalities may be unavoidable in the future. Indeed, there has been a huge debate about how autopilot and self-driving technologies will make decisions when faced with a choice between two collisions, when one, but not both, can be avoided. What decisions should a software program be allowed to make? Who does it decide to kill?

Will it halt the rise of Tesla and self-drive vehicles? Not likely. As we have reported, Apple is investing billions in the technology, as are Google and other leading automakers.

As Morgan Stanley has predicted, the question in future may not be so much a choice between self-drive and human drive, but whether humans are even allowed to drive if autonomous vehicles prove to be much safer over time.

As one pundit noted, it is possible to write a software program to avoid a repeat of the Florida crash; it is almost impossible to teach humans not to repeat the mistakes that have led to other fatalities.

The auto industry didn’t stop because of fatalities, although they surely sparked controversy at the time. This Wikipedia list itemises some of the first and most significant road deaths as first the steam car, and then the internal combustion engine, took to the roads.

It didn’t stop the automobile industry, and still doesn’t, even with more than a million deaths a year across the world. The promoters of autonomous vehicles claim that the death toll will be all but eliminated with software at the controls, rather than humans.
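That claim can be put in rough numbers. Here is a minimal back-of-envelope sketch, assuming illustrative figures only – roughly 1.25 million road deaths a year worldwide, and an assumed 20 trillion vehicle-kilometres travelled. The `projected_deaths` helper and both constants are hypothetical for the sake of the arithmetic; they are not data from Tesla or from the safety investigators.

```python
# Back-of-envelope comparison of human vs hypothetical autonomous
# fatality rates. All figures below are illustrative assumptions.

HUMAN_DEATHS_PER_YEAR = 1_250_000   # rough global annual road toll
GLOBAL_KM_PER_YEAR = 2e13           # assumed vehicle-km travelled worldwide

# Deaths per vehicle-kilometre with humans at the wheel.
human_rate = HUMAN_DEATHS_PER_YEAR / GLOBAL_KM_PER_YEAR

def projected_deaths(km_per_year: float, relative_risk: float) -> float:
    """Deaths expected if autonomous systems crash at `relative_risk`
    times the human rate (e.g. 0.1 means ten times safer)."""
    return km_per_year * human_rate * relative_risk

print(f"human baseline: {projected_deaths(GLOBAL_KM_PER_YEAR, 1.0):,.0f} deaths/yr")
print(f"ten times safer: {projected_deaths(GLOBAL_KM_PER_YEAR, 0.1):,.0f} deaths/yr")
```

On these assumptions, even a system ten times safer than the human baseline still implies a six-figure annual death toll worldwide – which is exactly the tension the commenters below wrestle with: merely beating the human record may not feel acceptable when the machine, not a person, is at fault.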

For the moment, investors appear to have sided with Tesla. After being sold off slightly on the initial reports, the stock has since rebounded, nearly regaining the losses it has suffered since the proposed merger with SolarCity was unveiled.

At $US30 billion, that’s a fair premium for a company yet to turn a profit, and one based on an optimistic outlook for the future. It’s still got a long way to go to $US1 trillion.

Published by Giles Parkinson

Giles Parkinson is a journalist of 30 years’ experience, a former Business Editor and Deputy Editor of the Financial Review, a columnist for The Bulletin magazine and The Australian, and the former editor of Climate Spectator.

27 replies on “Will autopilot death derail Elon Musk’s trillion dollar Tesla dream?”

  1. First of all, the driving-assist system is NOT Auto Pilot; it is merely a system that will keep a vehicle inside its white lines. While it may have some ability to detect an intrusion into the lane, it does not have laser scanning, vertical and horizontal, nor does it have communication with other vehicles using an identification tag.
    So to think this is some kind of total guidance system is deluded.
    Until a system of GPS identification of all vehicles is used, enabling all onboard systems to be aware of other vehicles, the basic message as given to drivers has to be observed.

    The auto-assist system has to be monitored at all times: “you must keep vigilant at all times while using this system”.

    Frankly, every vehicle should be built with emitting identification systems in place. This would permit a safe guidance system to be put in place and do away with the need for traffic enforcement police, as the job could be done automatically.

    Think about freeing up 90% of the police force to actually go and do the job of being your local friend when you have a problem, instead of spending most of their time on traffic duty.

  2. If this sort of incident with new automotive technologies had hindered their progress we would be knee deep in horse shit instead of bull shit.

  3. As autopilot by default is turned off with a Tesla and drivers are advised to keep their hands on the wheel and be paying attention, it’s unfortunate that the system is known as autopilot. It’s really “driver assist” which I think is the official name. But driver assist is problematic: when drivers find it hard to pay attention even without driver assist, it must be so hard to pay careful attention when your car drives itself perfectly competently most of the time. Watching a video is pushing it though.

    Tesla’s autopilot partner Mobileye’s comments and Tesla’s response

    Comments on Lidar (google’s system) compared to Mobileye and V2V (vehicle-to-vehicle communication)
    People say that autopilot is so far away but I’d imagine V2V will be a powerful inclusion in the mix. Additionally, autopilot doesn’t have to be perfect – it only has to beat the crash record of human-error-laden driving – can’t be too hard, surely.

    1. “it only has to beat the crash record of human-error-laden driving”

      Yes and no. From a macro perspective that’s absolutely correct; however, from an individual perspective, would you be happy getting into a machine that randomly kills 1,200 people annually (Australia)? Fault is implied when human drivers crash, even if there isn’t any.

      1. Humans are the only ones who drive cars, where else does the blame lie?

        Animal strikes and sudden health issues such as stroke or heart attack while driving are the primary ones I can think of. Mechanical failure is another. These would have to be very low percentages though.

        1. Oh I just meant that a driver might be killed through no fault of their own, due to the complete fault of a second driver. However, people don’t really consider (but do ultimately accept) the random chance of another driver nodding off at the wheel and crossing lanes. The difference is we all accept humans as imperfect, whereas a faulty autopilot has the potential to be perfect. The levels of expectation are different.

      2. No I wouldn’t. OK, significantly beat human-error-laden driving which probably still is not so difficult. Whatever, I wouldn’t expect autopilot to be 100% perfect and I would expect a certain number of fatalities in the development process – I wouldn’t think that everything could be worked out in modelling and testing.

  4. Tesla cars have no lidar. Google driverless cars have an ugly spinning lidar on the roof – which would have seen the truck.

    We should also ban windscreens from being painted on the back of trucks – as a human, I find it unacceptable.

  5. I don’t use my self drive capability much, except for demos. -People are impressed.
    But I really enjoy the exhilaration of driving the car myself.
    What the car does have, however, is smart cruise control that has an early warning anti-collision function that will automatically brake to avoid a stack.
    This also works off cruise. I’ve been alerted by my car several times when the vehicle in front slowed suddenly. So, on the whole, it’s much safer than most cars anyway, for lots of reasons.
    Generally best not to be watching Harry Potter…

  6. It would be most illogically conservative if it held back Tesla much at all. It would be an example of the frequently seen flawed thinking that holds newer technologies to account on a vastly different scale than existing technologies – just because we’re used to something doesn’t mean it should get a free pass.

    Andrew Bolt stated the conservative viewpoint on The Weekly – I’m paraphrasing here: “Why do we have to change things so quickly when we’ve gotten to where we are now with what we have available now?” You can’t argue with the second half of that statement, but it’s the “why” which is illogical in almost every way. It makes the almightily grand assumption that we’re at some kind of peak of humanity and that change is invariably for the worse. Maybe they should have said that prior to the industrial revolution?

    How many royal commissions should we have into human-driven automobiles each year?

    1. Let us look at the quote you posted from Andrew Bolt: “Why do we have to change things so quickly when we’ve gotten to where we are now with what we have available now”.

      This is exactly what is wrong with people who do not know that the present system is so dismally dysfunctional.

      I would caution everyone not to look to the said person for any kind of guidance as to a sensible future, as his words do not give guidance and look backward to the 1950s era.

      Apologies to you DevMac

      1. So true.

        The modern world: it’s as good as it gets, but at the same time it’s a disaster.

  7. Humans may recognize dangerous situations that auto systems aren’t yet able to recognize. (Think a burning petrol tanker a km down the road.) People may also recognize that there is a sensor or system failure. Safety will be maximized by having a driver and an auto system that are each able to override the other under some circumstances. The system will be better still if the car is able to check that the driver is awake and competent.

  8. When the first planes had autopilot I presume there was a time period during which pilots hovered over the controls. Surely we’re currently in the same situation with cars – even though the Tesla doesn’t have ‘autopilot’.

  9. Two different technologies, one car. The Tesla is primarily an electric vehicle using battery-stored energy. This part is the pointy end of the stick that will be required to poke holes in oil’s stranglehold. The Harry Potter hands-free thingy is just a tragic gimmick!

  10. I would like to know which sensors failed. I would assume there would be radar, infrared, laser and sonic sensors that all should have detected a large object blocking the road.

    1. I do not think it has radar or infrared or laser or sonic sensors, because it is a guidance system, or driver-assist system, not an autopilot system.
      To have an autopilot system, every vehicle would have to have emitting and receiving systems, as well as every possible type of scanning device – all those you mention – in place and not disabled.

      1. I find that astounding; my car has front and rear parking sensors that beep like crazy when I am getting too close to an object. Just about all fishermen in a boat have a sonar fish finder, and all motion-sensor lights have infrared sensors. So the tech is there and it’s cheap.

  11. The headline is a bit sensationalist. The issues I can see are technical, behavioural, and moral.

    Technically, an autopilot system is becoming more and more feasible. But whilst we are a lot more advanced than a few years ago, there is still a long way to go.

    Behaviourally, drivers still have responsibility, as do aircraft pilots, but unlike pilots there is much less training, and more scope for humans to place too much reliance on these systems. There was a story making the rounds a few years ago (possibly urban myth) of a Japanese driver who died in a car crash and was found in the toilet of his campervan. The suspicion was that he had cruise control on and thought it would be ok to relieve himself.

    Morally is perhaps the most interesting conundrum. Would it be OK for a car to divert onto a sidewalk and risk killing one child, rather than ploughing into a peloton of cyclists who have crashed and sprawled across the road, and risk killing a lot more?

    1. The Japanese tourist myth likely stems from a lawsuit in the USA (where else?) regarding a motor home and a driver that got out of the seat to make a coffee.

      Autopilot on an aircraft doesn’t stop you flying into a mountain. This was demonstrated in 1979, when an Air New Zealand Antarctic sightseeing flight flew into Mt Erebus after the flight plan was “corrected” – removing a previous, beneficial data-entry error and placing the flight path over Erebus.

  12. Have we forgotten that Tesla is an electric car company and the ‘Autopilot’ in question, an accessory?

  13. There is one overriding rule: nothing is perfect!
    … and nothing ever will be.
    And then there is the human preference to “blame somebody else” rather than take responsibility.
    Go Tesla, for all the reasons pointed out, e.g. driver-“assist”, watching a movie while driving, etc.

  14. I think self-driving is not going anywhere, but it doesn’t matter. Electric cars are displacing gasoline cars and Tesla’s the market leader.

  15. Actually, all the evidence is pointing to Joshua Brown being at fault. He appears to have been (a) speeding massively (a witness has claimed he passed her while she was driving at 85 mph) and (b) not paying attention (since he made no attempt whatsoever to hit the brakes).

  16. Engineers endeavour to do their best. All new technologies have a learning curve. Unfortunately lives are lost in the process – for example, aeroplanes. The essence of life is difficult to comprehend. As an engineer I respect this element of human nature but do not understand it. Scares the S$%&# out of me.

Comments are closed.
