vowed that his so-called Department of Government Efficiency, or DOGE, would operate with “maximum transparency.” DOGE’s website is meant to be a testament to that commitment, as the Tesla and SpaceX CEO, now a White House adviser, has frequently pointed out. There, the organization lists canceled grants and budget cuts, an ongoing accounting of its activities.
In recent weeks, however, The New York Times has reported that DOGE has not only posted significant errors on its website (at one point claiming to have saved $8 billion from a canceled contract that was actually worth $8 million, of which $2.5 million had already been disbursed) but has also worked to obscure those inaccuracies after the fact, removing identifying details about its cuts from the site, and later from its underlying code, making the claims harder for the public to verify and track.
For road safety researchers who have watched Musk over the years, this pattern feels familiar. DOGE “put out some numbers, they didn’t smell right, and then they altered them,” says Noah Goodall, an independent transportation researcher. “That feels very Tesla. It gives the impression they’re not really interested in accuracy.”
For nearly a decade, Goodall and others have been tracking Tesla’s public statements about its Autopilot and Full Self-Driving features, advanced driver-assistance systems designed to reduce the stress of driving and improve safety. Over the years, researchers say, Tesla has released safety data without the context needed to interpret it; promoted figures that outside experts cannot verify; touted favorable safety metrics that later proved misleading; and even retroactively altered safety statistics it had already published. The inconsistencies have led Tesla Full Self-Driving enthusiasts to crowdsource their own performance metrics.
Rather than transparent data releases, “what we have is vague snippets that, when put into context, actually seem quite dubious,” says Bryant Walker Smith, a law professor and engineer at the University of South Carolina who studies autonomous vehicles.
Government-Facilitated Missteps
Tesla’s first and most notable numbers mix-up came in 2018, when it published its first Autopilot safety statistics following the first documented death of a driver using Autopilot. Almost immediately, researchers noted that while the data seemed to show that drivers using Autopilot were much less likely to crash than other American drivers, the figures lacked crucial context.
At that time, Autopilot combined adaptive cruise control, maintaining a preset distance from the car ahead, and steering assistance, which kept the vehicle centered within lane markings. However, the comparison did not account for vehicle type (luxury cars, the only models made by Tesla at the time, are generally less prone to accidents), the demographics of the drivers (Tesla owners were typically wealthier and older, and thus less accident-prone), or the types of roads where Teslas were being driven (Autopilot was designed for divided highways, while crashes are more common on rural roads and particularly on connectors and local roads).
The confusion did not end there. After the fatal Autopilot crash, Tesla did share some safety statistics with the National Highway Traffic Safety Administration, the country’s road safety regulator. Using those figures, NHTSA released a report claiming that Autopilot led to a 40 percent reduction in crashes. Tesla promoted the favorable statistic, even citing it in 2018 after another person died while using Autopilot.