Self-Driving Car
In June 2015, Google co-founder Sergey Brin confirmed that there had been 12 collisions as of that date: eight in which the car was rear-ended at a stop sign or traffic light, two in which the vehicle was side-swiped by another driver, one in which another driver rolled through a stop sign, and one in which a Google employee was controlling the car manually. In July 2015, three Google employees suffered minor injuries when the self-driving car they were riding in was rear-ended by a car whose driver failed to brake at a traffic light; this was the first self-driving car collision to result in injuries. On 14 February 2016, a Google self-driving car struck a bus while maneuvering to avoid sandbags blocking its path. Google accepted partial responsibility, saying, “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” and characterized the crash as a misunderstanding and a learning experience.
In August 2012, Google announced that its self-driving cars had completed over 300,000 miles (500,000 km) of autonomous driving accident-free, that it typically had about a dozen cars on the road at any given time, and that it was starting to test them with single drivers instead of in pairs. In late May 2014, Google revealed a new prototype of its driverless car, which had no steering wheel, gas pedal, or brake pedal and was fully autonomous. As of March 2016, Google had test-driven its fleet of driverless cars in autonomous mode a total of 1,500,000 mi (2,400,000 km). In December 2016, Alphabet (Google’s parent company) announced that the self-driving car technology would be spun off into a new company called Waymo.
In 2015, a questionnaire survey by Delft University of Technology explored the opinions of 5,000 people from 109 countries on automated driving. Respondents, on average, found manual driving the most enjoyable mode of driving, and 22% did not want to spend any money on a fully automated driving system. Respondents were most concerned about software hacking and misuse, followed by legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicles transmitting data.
An autonomous car (also known as a driverless car, self-driving car, or robotic car) is a vehicle that is capable of sensing its environment and navigating without human input. Many such vehicles are being developed, but as of May 2017 the automated cars permitted on public roads are not yet fully autonomous; they all require a human driver at the wheel who is ready at a moment’s notice to take control of the vehicle.
In mid-October 2015, Tesla Motors rolled out version 7 of its software in the U.S., which included the Tesla Autopilot capability. On 9 January 2016, Tesla rolled out version 7.1 as an over-the-air update, adding a new “summon” feature that allows cars to self-park at parking locations without the driver in the car. Tesla’s autonomous driving features, widely regarded as among the most advanced in the industry at the time, can be classified as somewhere between level 2 and level 3 under the U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) five levels of vehicle automation: the car can act autonomously but requires the full attention of the driver, who must be prepared to take control at a moment’s notice. Autopilot should be used only on limited-access highways, and it sometimes fails to detect lane markings and disengages itself. In urban driving the system will not read traffic signals or obey stop signs, nor does it detect pedestrians or cyclists.
Other disruptive effects will come from the use of autonomous vehicles to carry goods. Self-driving vans have the potential to make home deliveries significantly cheaper, transforming retail commerce and possibly rendering hypermarkets and supermarkets redundant. The U.S. government currently defines automation in six levels, starting at level 0, in which the human driver does everything, and ending at level 5, in which the automated system performs all driving tasks. Under current law, manufacturers bear all the responsibility to self-certify vehicles for use on public roads, meaning that as long as a vehicle is compliant with the regulatory framework, there are no specific federal legal barriers to a highly automated vehicle being offered for sale. Iyad Rahwan, an associate professor at the MIT Media Lab, said, “Most people want to live in a world where cars will minimize casualties, but everyone wants their own car to protect them at all costs.” Furthermore, industry standards and best practices are still needed before such systems can be considered reasonably safe under real-world conditions.
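The six-level taxonomy described above can be sketched as a small enumeration. This is an illustrative encoding only; the class and helper names are assumptions for the sketch, not part of any official API.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Illustrative encoding of the six U.S./SAE driving-automation levels."""
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # system assists with steering or speed
    PARTIAL_AUTOMATION = 2      # system controls steering and speed; driver monitors
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over on request
    HIGH_AUTOMATION = 4         # system drives within a limited operating domain
    FULL_AUTOMATION = 5         # system performs all driving tasks

def requires_human_fallback(level: AutomationLevel) -> bool:
    """At levels 0-3 a human must remain ready to drive;
    at levels 4-5 the system itself handles the fallback."""
    return level <= AutomationLevel.CONDITIONAL_AUTOMATION
```

Under this encoding, an Autopilot-style system classified “between level 2 and level 3” still requires a human fallback, while only levels 4 and 5 do not.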
California’s DMV had long frustrated the self-driving car industry, which felt state regulators were holding back innovation that could improve public safety. The DMV had previously missed a deadline for autonomous vehicle rules, and when it released rules in December 2015, it excluded fully self-driving vehicles, citing safety concerns.
In a 2014 U.S. telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper; 31.7% said they would not continue to drive themselves once an autonomous car was available instead.
According to Tesla, starting 19 October 2016 all Tesla cars have been built with hardware to allow full self-driving capability at the highest safety level (SAE Level 5): eight surround cameras and twelve ultrasonic sensors, in addition to a forward-facing radar with enhanced processing capabilities. The system operates in “shadow mode” (processing sensor data without taking action) and sends data back to Tesla to improve its abilities until the software is ready for deployment via over-the-air upgrades. After the required testing, Tesla hopes to enable full self-driving by the end of 2017 under certain conditions.
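A “shadow mode” control loop could, in principle, look like the sketch below: the policy’s decision is computed and logged for later comparison against what the human driver actually did, but is never sent to the actuators. All names here are hypothetical; this is not Tesla’s implementation.

```python
def control_step(sensor_frame, policy, actuator, shadow_mode=True, log=None):
    """One tick of a hypothetical control loop.

    In shadow mode the policy's decision is only recorded for offline
    analysis and the vehicle is untouched; in live mode the decision
    is passed to the actuator callback.
    """
    decision = policy(sensor_frame)  # e.g. {"steer": -0.02, "brake": 0.0}
    if shadow_mode:
        if log is not None:
            log.append({"frame": sensor_frame, "decision": decision})
        return None  # no action is taken on the vehicle
    actuator(decision)
    return decision
```

For example, running a tick with `shadow_mode=True` appends one record to the log and returns `None`, regardless of what the policy decided.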
The first fatal accident involving a self-driving vehicle took place in Williston, Florida on 7 May 2016, while a Tesla Model S electric car was engaged in Autopilot mode. The occupant was killed in a crash with an 18-wheel tractor-trailer. On 28 June 2016, the National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled-access highway and the car failed to apply the brakes; the car continued to travel after passing under the truck’s trailer. The NHTSA’s preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, covering an estimated population of 25,000 Model S cars. On 8 July 2016, the NHTSA requested that Tesla Motors provide the agency with detailed information about the design, operation, and testing of its Autopilot technology, as well as details of all design changes and updates to Autopilot since its introduction and Tesla’s planned update schedule for the next four months.
There are differing opinions on who should be held liable in the case of a crash, particularly one in which people are hurt. Many experts hold the car manufacturers themselves responsible for crashes that occur due to a technical malfunction or design defect. Beyond the fact that the manufacturer would be the source of the problem when a car crashes due to a technical issue, there is another important reason to hold manufacturers responsible: it would encourage them to innovate and to invest heavily in fixing such issues, not only to protect their brand image but also because of the financial and criminal consequences. However, some argue that those using or owning the vehicle should be held responsible, since they ultimately know the risks involved in using such a vehicle. Experts suggest introducing a tax or insurance scheme that would protect owners and users of autonomous vehicles from claims made by accident victims. Other parties that could be held responsible in the case of a technical failure include the software engineers who programmed the code for the autonomous operation of the vehicle and the suppliers of its components.
In June 2011, the Nevada Legislature passed a law authorizing the use of autonomous cars, making Nevada the first jurisdiction in the world where autonomous vehicles could be legally operated on public roads. Under the law, the Nevada Department of Motor Vehicles (NDMV) is responsible for setting safety and performance standards and for designating areas where autonomous cars may be tested. The legislation was supported by Google in an effort to legally conduct further testing of its driverless car. The Nevada law defines an autonomous vehicle as “a motor vehicle that uses artificial intelligence, sensors and global positioning system coordinates to drive itself without the active intervention of a human operator,” and acknowledges that the operator will not need to pay attention while the car is operating itself. Google had further lobbied for an exemption from the ban on distracted driving to permit occupants to send text messages while sitting behind the wheel, but this did not become law. Nevada’s regulations also require one person behind the wheel and one in the passenger seat during tests.