By the end of this week, potentially thousands of Tesla owners will be testing the latest version of the automaker's "Full Self-Driving" beta software, Version 10.0.1, on public roads, even as federal regulators investigate the system's safety following a number of notable crashes.
A new study from the Massachusetts Institute of Technology lends weight to the idea that the FSD system, despite its name, may not actually be that safe; it is an advanced driver assistance system (ADAS) rather than an autonomous system. Researchers studying glance data from 290 human-initiated Autopilot disengagement epochs found that drivers may become inattentive when using a partially automated driving system.
"Visual behavior patterns change before and after [Autopilot] disengagement," the study states. "Before disengagement, drivers looked less on the road and focused more on areas unrelated to driving compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving was not compensated by longer glances ahead."
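The comparison the study describes can be illustrated with a small sketch. This is not the researchers' actual code; the glance labels, regions, and window data below are illustrative assumptions, showing only how an off-road glance proportion could be computed for windows before and after a disengagement.

```python
# Illustrative sketch (not the study's code): compute the share of glance
# time spent off-road in a window of annotated glances, so that windows
# before and after an Autopilot disengagement can be compared.

def off_road_proportion(glances):
    """glances: list of (duration_s, region) tuples; 'road' means on-road."""
    total = sum(d for d, _ in glances)
    off = sum(d for d, region in glances if region != "road")
    return off / total if total else 0.0

# Hypothetical example windows around a disengagement.
before = [(1.2, "road"), (2.0, "instrument_cluster"), (0.8, "road")]
after = [(3.0, "road"), (0.5, "mirror"), (2.5, "road")]

print(off_road_proportion(before))  # 0.5  (half the window off-road)
print(off_road_proportion(after))   # ~0.083
```

A higher proportion before disengagement than after, across many epochs, is the pattern the study reports.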
Tesla CEO Elon Musk has said that not everyone who has paid for the FSD software will get access to the beta, which promises more automated driving functions. First, Tesla will use telemetry data to capture personal driving metrics over a seven-day period to ensure drivers are paying sufficient attention. The data may also be used to implement a new safety rating page that tracks the owner's vehicle and is linked to their insurance.
MIT's research provides evidence that drivers may not be using Tesla's Autopilot (AP) as recommended. Because AP includes safety features such as traffic-aware cruise control and Autosteer, drivers become less attentive and take their hands off the wheel more often. The researchers found that this kind of behavior may be the result of misunderstanding what AP features can do and what their limitations are. Drivers whose tasks are automated for them may also naturally become bored after attempting to sustain visual and physical alertness.
The report, entitled "A model for naturalistic glance behavior around Tesla Autopilot disengagements," tracked Tesla Model S and X owners for over a year across the greater Boston area. The vehicles were equipped with a Real-time Intelligent Driving Environment Recording data acquisition system that continuously collected data from the CAN bus, a GPS and three 720p video cameras. These sensors provided information such as vehicle kinematics, driver interaction with the vehicle controls, mileage, location, and the driver's posture, face, and the scene in front of the vehicle. MIT collected nearly 500,000 miles' worth of data.
The point of this research is not to shame Tesla, but rather to advocate for driver attention management systems that can give drivers real-time feedback, or adapt automation functionality to the driver's level of attention. Currently, Autopilot uses a hands-on-wheel sensing system to monitor driver engagement, but it does not monitor driver attention via eye or head tracking.
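The kind of system the researchers advocate can be sketched in a few lines. Everything here is a hypothetical illustration rather than Tesla's or the study's design: the class name, the thresholds, and the stubbed gaze signal are all assumptions, showing only the idea of escalating real-time feedback and adapting automation as continuous off-road glance time grows.

```python
# Hypothetical sketch of a driver attention manager: accumulate continuous
# off-road glance time from a (stubbed) eye-tracking signal and escalate
# from no action, to a driver alert, to limiting automation.
from dataclasses import dataclass

@dataclass
class AttentionManager:
    warn_after_s: float = 2.0    # soft alert after 2 s continuously off-road
    limit_after_s: float = 4.0   # adapt/limit automation after 4 s
    off_road_s: float = 0.0      # accumulated continuous off-road time

    def update(self, gaze_on_road: bool, dt: float) -> str:
        """Advance one camera tick of length dt; return the action to take."""
        if gaze_on_road:
            self.off_road_s = 0.0
            return "none"
        self.off_road_s += dt
        if self.off_road_s >= self.limit_after_s:
            return "limit_automation"   # e.g. slow down, demand takeover
        if self.off_road_s >= self.warn_after_s:
            return "alert_driver"       # real-time feedback to the driver
        return "none"

manager = AttentionManager()
# Simulate 0.5 s ticks: driver looks away for 2.5 s, then back at the road.
actions = [manager.update(on_road, 0.5) for on_road in [False] * 5 + [True]]
print(actions)
# ['none', 'none', 'none', 'alert_driver', 'alert_driver', 'none']
```

The design choice worth noting is that the counter resets the moment gaze returns to the road, so only sustained inattention, not brief mirror checks, triggers escalation.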
The researchers behind the study said that the behavioral model they developed from this naturalistic data captures how driver attention changes under automation, and can support the development of solutions that ensure drivers remain sufficiently engaged in the driving task. It could help driver monitoring systems handle "atypical" glances, and could also serve as a benchmark for research into the safety benefits of automation on driver behavior.
Companies such as Seeing Machines and Smart Eye are already working with automakers such as General Motors, Mercedes-Benz and Ford to bring camera-based driver monitoring systems to vehicles with ADAS, and also to address problems such as drunk or impaired driving. The technology exists. The question is whether Tesla will use it.
Source: MIT study finds Tesla drivers become inattentive when Autopilot is activated – TechCrunch