February 29, 2016
On Friday, the Office of the Inspector General threw another report on the stack of official criticisms of the way the National Highway Traffic Safety Administration’s Office of Defects Investigation operates.
As its title suggests, Additional Efforts are Needed to Ensure NHTSA's Full Implementation of OIG's 2011 Recommendations was a look at the progress the agency didn't make over the last five years on 10 recommendations for process improvements that would track consumer complaints, thoroughly document Defect Assessment Panels' decisions on which risks to investigate, achieve its timeliness goals for completing investigations, create a systematic process for determining when to involve a third party or the Vehicle Research and Test Center for assistance, train its staff, and keep identifying information out of public files, among others.
Five years later, the OIG found that NHTSA had satisfactorily completed three of the action items: it conducted a workforce assessment, it boosted its communication and coordination with foreign countries on safety defects, and it reviewed and tracked consumer complaints associated with specific investigations. But the OIG also noted that the agency was lagging on some of the most important process improvements. Sure, NHTSA created a bunch of systems to address these deficiencies, but it did little to ensure that those systems were used with consistency:
Although NHTSA took actions to address all 10 of our 2011 recommendations, our review determined that ODI lacks sufficient quality control mechanisms to ensure compliance with the new policies and procedures, and lacks an adequate training program to ensure that its staff have the skills and expertise to investigate vehicle safety defects. Earlier this year, NHTSA stated that it will “aggressively implement” the 17 recommendations from our June 2015 report. The results of this review of NHTSA’s implementation of OIG’s 2011 recommendations can provide lessons learned as NHTSA makes important decisions regarding future process improvements.
In the aftermath of the first big safety crisis of the modern era (the Firestone tire tread separation failures that caused the tippy Ford Explorer to roll over), the General Accounting Office and, more often, the DOT Office of the Inspector General began, in response to Congressional inquiries, to pump out reports. But with each successive wave of high-profile safety problems (Toyota unintended acceleration, the General Motors ignition switch failures, and exploding Takata airbag inflators) the pace has accelerated.
There have been seven censorious assessments since 2002, with critiques of NHTSA's data collection and analysis, its recall management practices, its lack of investigative and decision-making processes, and its enforcement and transparency. We have been covering these reports, as well as documenting these problems:
Inspector Agrees with SRS: NHTSA Ain’t Right
Elective Warning Reports: When Manufacturers Don’t Report Claims
Elective Warning Reports Redux
How NHTSA and NASA Gamed the Toyota Data
What NHTSA Doesn’t Want You to Know about Auto Safety
The most recent take-down, Inadequate Data and Analysis Undermine NHTSA's Efforts to Identify and Investigate Vehicle Safety Concerns, was released by the OIG a mere seven months ago. That report rapped the agency for ODI's lack of process, for prioritizing probes by the chances of a successful recall rather than the threat to safety, for its lack of transparency, for its failure to audit manufacturers' EWR reports, and for its lack of enforcement.
To determine NHTSA's progress on the 10 action items it cited in 2011, OIG investigators pored through agency records, looking for evidence that agency staff was documenting its reasons for not meeting its deadlines to complete investigations, or its decisions to move forward with defect investigations, and found pretty spotty performance.
For example, NHTSA had agreed to start putting its defect screening meeting minutes and other pre-investigation information, such as data from insurance companies, in its Advanced Retrieval Tire, Equipment, Motor Vehicle Information System (ARTEMIS), the system originally implemented in July 2004 to analyze and identify trends in the early warning reporting data. But the OIG found that NHTSA managers hadn’t developed any processes to ensure that staff was actually putting the stuff in. Out of a sample of 42 issue evaluations opened in 2013, 42 percent had no documentation of any pre-investigative work.
The documentation for failing to meet a timeliness goal was worse: more than 70 percent of the delayed investigations the OIG reviewed included no justification for missing ODI's goals for timely completion of investigations.
Under policies ODI established in 2013, managers developed a checklist to ensure that all evidence associated with an investigation, "such as consumer complaints and information exchanged with manufacturers," was documented. The OIG reviewed documentation for 36 preliminary evaluations and six engineering analyses opened between March 2013 and December 2013 and found that ODI used the checklist for four preliminary evaluations and zero engineering analyses. (This may explain why FOIAs to the agency regarding its investigatory activities often turn up next to nothing.) The OIG concluded: "As a result, ODI may not be capturing all evidence associated with an investigation, potentially hampering its ability to assess or support the adequacy of its investigations."
Less egregious but still inconsistent were NHTSA's contractor's redactions of files for public consumption (the OIG found that nine out of 62 investigation documents were not fully redacted and contained birth dates, driver's license numbers, and e-mail addresses) and ODI's filing of minutes from defect screening meetings (17 percent of the 21 panel meetings held in 2013 and 2014 were not appropriately documented).
One of the most troubling observations was the lack of training for ODI staff. Back in 2011, NHTSA argued that a formal training program wasn't necessary, but it agreed to offer basic training in automotive technology, ODI policies and processes, computer skills for data analysis, and ARTEMIS. But when OIG investigators paid a visit, they found this:
During our audit, ODI’s pre-investigative staff told us that they received little or no training in their areas of concentration, some of which can be quite complex. For example, ODI staff charged with interpreting statistical test results for early warning reporting data told us they have no training or background in statistics.
Since data are at the root of all NHTSA activities, allowing the people who make the first cut at those data to work with no training and no understanding of statistics seems counterproductive, to say the least. (Although, given NHTSA's many numerically dubious claims, The Safety Record cannot say that it is surprised.)
In the last year, we have seen a lot of positive changes at NHTSA, and we know that jack-hammering a better agency out of decades of calcified practice will take time. But, if this five-year progress report is any indication, it's going to be a long time before we see the OIG's 17 recommendations from June come to fruition.