Of 392 incidents recorded by the National Highway Traffic Safety Administration from July 1 of last year through May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode or any of their associated features were involved in 273 of the crashes.

The disclosures are part of a federal effort to determine the safety of advanced driving systems as they become increasingly common. Beyond the futuristic allure of self-driving cars, dozens of automakers have rolled out automated features in recent years, including ones that let drivers take their hands off the steering wheel under certain conditions and ones that help them parallel park.

In Wednesday’s announcement, NHTSA disclosed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.

“These technologies hold great promise to improve safety, but we need to understand how these vehicles perform in the real world,” said Steven Cliff, the agency’s administrator. “This will help our investigators quickly identify potential defect trends that emerge.”

Speaking with reporters ahead of Wednesday’s release, Dr. Cliff also cautioned against drawing conclusions from the data collected so far, noting that it does not account for factors such as the number of cars from each manufacturer that are on the road and equipped with these types of technologies.

“The data may raise more questions than they answer,” he said.

About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assistance technologies, a possible explanation for why Tesla vehicles accounted for nearly 70 percent of all reported crashes. Ford, GM, BMW and others have similar advanced systems that allow hands-free driving under certain conditions on highways, but far fewer of those models have been sold. These companies, however, have sold millions of cars over the last two decades that are equipped with individual components of driver-assistance systems. Those components include so-called lane keeping, which helps drivers stay in their lane, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.

Dr. Cliff said NHTSA would continue to collect crash data on such features and technologies, noting that the agency would use it as a guide in setting any rules or requirements for how they should be designed and used.

The data was collected under an order NHTSA issued a year ago that required automakers to report crashes involving cars equipped with advanced driver-assistance systems, also known as ADAS or Level 2 automated driving systems.

The order was prompted in part by crashes and fatalities over the past six years that involved Teslas operating with Autopilot. Last week, NHTSA widened an investigation into whether Autopilot has technological or design flaws that pose a safety risk. The agency has been investigating 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped and had their lights flashing.

Under the order issued last year, NHTSA also collected data on crashes or incidents involving fully automated vehicles that are still largely in development but are being tested on public roads.
Manufacturers of these vehicles include GM, Ford and other traditional automakers, as well as technology companies such as Waymo, which is owned by Google’s parent company.

These types of vehicles were involved in 130 incidents, NHTSA found. One resulted in serious injuries, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were fender benders or bumper taps, because the vehicles operate mainly at low speeds and in city driving.

Waymo, which runs a fleet of driverless taxis in Arizona, was involved in 62 incidents. GM’s Cruise division, which has just started offering driverless taxi rides in San Francisco, was involved in 23. One minor crash involving an automated test vehicle made by Pony.ai, a start-up, resulted in a recall of three of the company’s test vehicles to correct their software.

The NHTSA order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with the auto industry.

“The agency is gathering information to determine whether, in the field, these systems constitute an unreasonable risk to safety,” said J. Christian Gerdes, a professor of engineering and a director of the Center for Automotive Research at Stanford University.

Issues with Tesla’s Autopilot system

Claims of safer driving. Tesla cars can use computers to handle some aspects of driving, such as changing lanes. But there are concerns that this driver-assistance system, called Autopilot, is not safe. Here is a closer look at the issue.

Driver assistance and crashes. A 2019 crash that killed a college student highlights how gaps in Autopilot and driver distraction can have tragic consequences. In another crash, a Tesla hit a truck, killing a 15-year-old California boy. His family sued the company, claiming the Autopilot system was partly responsible.

Shortcuts on safety? Former Tesla employees say the automaker may have undermined safety by designing its Autopilot driver-assistance system to fit the vision of Elon Musk, its chief executive. Mr. Musk is said to have insisted that the system rely solely on cameras to monitor a car’s surroundings, rather than also using additional sensing devices. Other companies’ systems for self-driving vehicles typically do use such additional sensors.

Information gap. A lack of reliable data also hinders assessments of the system’s safety. Reports that Tesla publishes every three months suggest that crashes are less frequent with Autopilot than without, but the figures can be misleading and do not account for the fact that Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets.

An advanced driver-assistance system can steer, brake and accelerate a vehicle on its own, though drivers must stay alert and ready to take control of the vehicle at all times. Safety experts are concerned because these systems allow drivers to relinquish active control of the car and could lull them into thinking their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may be unprepared to take control quickly.

The NHTSA order required companies to provide data on crashes in which advanced driver-assistance systems and automated technologies were in use within 30 seconds of impact. Though this data provides a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety. The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations.

“The question is: What is the baseline against which we’re comparing this data?” said Dr. Gerdes, the Stanford professor, who from 2016 to 2017 was the first chief innovation officer of the Department of Transportation, of which NHTSA is a part.

But some experts say comparing these systems with human driving should not be the goal.

“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less than other planes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.

“Crashes on our roads are equivalent to several plane crashes every week,” he added. “Comparison is not necessarily what we want. If there are crashes that these driving systems are contributing to, crashes that would not otherwise have happened, that is a potentially fixable problem that we need to know about.”