By Vaheh Manoukian
Whether it is Tesla, Google, or Uber, technology companies are racing to advance the technology and policies behind the autonomous vehicle (“AV”) movement. Waymo, a project that originated at Google, is leading the way in this $6 billion industry. Waymo’s current mission includes safely improving and scaling AVs through robust testing, and its Arizona-based pilot program uses no human operators in its Chrysler Pacifica minivans. Recently, Elon Musk, CEO of Tesla, promised his customers a fully autonomous Tesla vehicle by the end of 2020. According to Musk, the future Tesla vehicle will be so capable that the rider can take a nap while the vehicle drives itself to the destination.
Consumer interest in autonomous vehicles is increasing around the globe. In 2016, a U.S. poll by the Massachusetts Institute of Technology found that only 19% of respondents between the ages of 35 and 44, and only 15.4% of respondents age 75 or older, were willing to use autonomous vehicles. Thus, despite potential advantages, customer concerns still loom over the technology, as users are unsure whether they are ready to ride in, or share the road with, driverless vehicles. While AVs should be safer in theory, traffic does not exist in a vacuum. Human interactions such as eye contact, honking, and hand gestures are often vital to the flow of traffic (and are often required by law). Additionally, maneuvers that are technically against the law may occasionally be necessary to avoid dangerous situations. AVs may never be able to merge seamlessly into traffic with human drivers, because humans are often unpredictable in their routes and driving habits.
For the purposes of regulating current vehicle technology, California Vehicle Code Section 38750 defines an AV as “any vehicle equipped with autonomous technology that has been integrated into that vehicle.” Section 38750 does not currently allow an AV to be driven on public roads without human assistance. Under Section 38750, autonomous vehicles may be tested on public roads only if they are operated by a licensed driver who is “designated by the manufacturer of the autonomous technology” and “[is] seated in the driver’s seat . . . capable of taking over immediate manual control of the autonomous vehicle in the event of an autonomous technology failure or other emergency.”
Lawmakers are hesitant to allow AV testing on public roads because drivers already face numerous challenges on the road, and car accidents are one of the leading causes of death in America. The biggest safety advantage of an autonomous vehicle is that a robot is not human: it is programmed to obey all the rules of the road, it will not speed, and it cannot be distracted by a text message. Some argue that AV technology will actually promote carelessness in traffic and introduce new problems for the cities that adopt it. However, hypothetically at least, AVs can detect what humans cannot, especially at night or in low-light conditions.
How Future Laws and Regulations for Driverless Technology Can Ensure Public Safety
One motivation for developing AVs is to reduce greenhouse gas emissions and encourage efficient land use. California’s Senate is currently reviewing SB 59, a bill focused on establishing a policy for AVs that promotes both. The purpose of this bill is to “maximize ride-sharing and shared-use vehicles as an alternative to personal car ownership by encouraging pooling, prioritizing pooled vehicles’ mobility, and providing for shared-vehicles passenger safety and comfort.”
The National Highway Traffic Safety Administration released guidelines for self-driving vehicles in September 2017 in its Automated Driving Systems 2.0: A Vision for Safety guidance. These guidelines not only provide a template for legislatures and state highway officials to follow, but also clarify that carmakers and startups with AV technology need not wait for federal legislation to test or deploy their systems.
One area of inquiry is the anticipated impact of AVs on legal liability and insurance policies. Will insurance cover AV accidents? Will the operator, the owner, or the manufacturer be held liable? Liability rules applying to AVs will need to define roles, determine fault, and fix compensation for harm, as current law does for non-automated vehicles. Automobile accident liability cases are most often decided on theories of negligence or strict liability, which include no-fault statutes in some states.
Safety Obstacles Facing AVs
Public safety is one of the biggest issues preventing AVs from being released onto our roads. In 2018, an autonomous Uber vehicle with a safety driver aboard struck and killed a pedestrian in Arizona. Uber quickly suspended all testing of its autonomous fleet while it investigated the causes of the crash. A few days later, the driver of a Tesla operating in Autopilot mode died when the vehicle crashed into a highway median in Mountain View, California. Tesla did not suspend the feature in its vehicles while the company and the National Highway Traffic Safety Administration (NHTSA) investigated the causes of that crash.
Because proponents highlight the safety improvements promised by driverless cars, these fatalities will invite stricter scrutiny of the technology’s claims. After Uber’s fatal crash, Toyota built a new facility to test its vehicles’ responses to “edge cases”: extreme situations too dangerous to test on public streets.
AVs will have more systems and processors than the non-autonomous vehicles currently on the road, and this increasingly sophisticated technology will make AVs more vulnerable to hackers. Although it may seem like an issue for the future, car-hacking is already happening: Fiat Chrysler recalled 1.4 million Jeep Cherokees after finding the cars were vulnerable to hacking. Security researchers were able to wirelessly control functions such as acceleration and braking, creating an incredibly dangerous situation. Until these problems are solved, fully autonomous cars may pose a serious risk to other road users. At the moment, driverless cars are only truly safe when tested and operated around other driverless cars in a controlled environment.
We can be quite confident that AVs will be common on city streets sometime in the future. Questions surrounding this technology are grounded in the safety and security of passengers and pedestrians. Lawmakers are currently hesitant to promote testing AVs on public roads because of the dangers the technology still poses. Thus, lawmakers and the auto industry must first come together to build a safe and efficient platform for developing AV technology. Once AVs are sufficiently developed and ready to be integrated into traffic, traffic laws and safety standards must adjust to address the concerns of pedestrians and other drivers on the road. The benefits of AVs are undeniable: they will be better for the environment, they will decrease traffic, and they will be safer than the cars on the road today, so creating an effective regulatory scheme is essential.
 Waymo’s “Early Rider Program” in Metro Phoenix Allows People to Use Self-driving Cars to Go Places Every Day in Exchange for Feedback on the AVs, WAYMO, https://waymo.com/apply/ (last visited Mar. 31, 2019).
 Andrew J. Hawkins, Waymo’s Fully Driverless Minivans Are Already Putting People to Sleep, THE VERGE (Mar. 14, 2018), https://www.theverge.com/2018/3/13/17114194/waymo-driverless-minivan-arizona-early-rider-video.
 Aarian Marshall, Elon Musk Promises a Really Truly Self-Driving Tesla in 2020, WIRED (Feb. 19, 2019), https://www.wired.com/story/elon-musk-tesla-full-self-driving-2019-2020-promise/.
 See id.
 John Laporte, Autonomous Vehicle Technology, STATISTA, https://www.statista.com/topics/3573/autonomous-vehicle-technology/ (last visited Mar. 31, 2019).
 This broad definition encompasses any vehicle with one or more collision avoidance systems, such as those using electronic blind spot assistance, adaptive cruise control, or lane departure warning.
 Cal. Vehicle Code § 38750.
 Alissa Walker, Are Self-Driving Cars Safe for Our Cities?, CURBED (Sept. 21, 2016), https://www.curbed.com/2016/9/21/12991696/driverless-cars-safety-pros-cons.
 See id.
 Senate Bill No. 59 (2019) (citing a proposal to amend Cal. Gov’t Code, tit. 7, § 65589.5).
 See id.
 Jack Karsten et al., The State of Self-Driving Car Laws Across the U.S., BROOKINGS (May 1, 2018), https://www.brookings.edu/blog/techtank/2018/05/01/the-state-of-self-driving-car-laws-across-the-u-s/.
 After a US federal investigation, it is thought that the car did not stop because the system put in place to carry out emergency stops in dangerous situations was disabled. Meriame Berboucha, Uber Self-Driving Car Crash: What Really Happened, FORBES (May 28, 2018), https://www.forbes.com/sites/meriameberboucha/2018/05/28/uber-self-driving-car-crash-what-really-happened/#47643a04dc41.
 See id.
 Tesla reported that the driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. Jason Green, Tesla: Autopilot Was on During Deadly Mountain View Crash, MERCURY NEWS (Mar. 30, 2018), https://www.mercurynews.com/2018/03/30/tesla-autopilot-was-on-during-deadly-mountain-view-crash/.
 See id.
 Andrew J. Hawkins, Toyota Will Test Self-Driving Car “Edge Cases” At New Proving Ground in Michigan, THE VERGE (May 3, 2018), https://www.theverge.com/2018/5/3/17314778/toyota-self-driving-car-test-proving-ground-michigan.
 Paul Lazarra, Fiat Chrysler Recalls 1.4m Jeeps Affected by Hack Attack, ALPHR (2015), https://www.alphr.com/cars/1001240/fiat-chrysler-recalls-14m-jeeps-affected-by-hack-attack (last visited Mar. 31, 2019).