Highway agency takes a hit over safety report on electronic billboards
Posted with permission from Fairwarning.org
Why did the billboard cross the road?
It sounds like the opening line of a corny joke, but it’s actually a question raised by a baffling glitch in a Federal Highway Administration study on the safety of electronic billboards. Billboards that seem magically to have moved from one side of the highway to the other are part of a detailed critique by a former FHWA researcher, who says the federal report is so badly flawed that no one should rely on its conclusions.
The $859,000 FHWA study had been eagerly anticipated by local agencies across the country, including some that held up permit decisions on electronic billboards to await federal guidance. They hoped the report would shed light on whether the visually stunning digital signs, which change messages every few seconds, might pose a threat to traffic safety.
The FHWA study, performed for the agency by the consulting firm Leidos, used sophisticated eye-tracking equipment to time drivers’ glances at billboards and other visual features along assigned routes in Reading, Pa., and Richmond, Va. Finally released years behind schedule, the study found that the digital signs did not prompt drivers to look away from the road long enough to increase the risk of crashes. The billboard industry, which has aggressively pushed to install more of the lucrative displays, trumpeted the results as confirmation of their safety.
But the lengthy critique issued last month by Jerry Wachtel, who worked for the highway administration in the 1970s and ’80s and served as an adviser in a preliminary phase of the study, says the research is so riddled with errors and contradictions that it should be disregarded. (A summary of his critique can be found on the website of the Eno Center for Transportation think tank.) His report lists experts from the U.S. and five other countries who reviewed his conclusions, including one who wrote: “It is highly disappointing, even irresponsible, that a study anticipated for so long on such an important question has been so poorly executed.”
For their part, officials of the FHWA and Leidos, which formerly was known as SAIC, have refused to answer questions about the federal study. “FHWA has no additional comment beyond those made in the report itself,” spokesman Doug Hecox said in an email.
Wachtel, who heads The Veridian Group, a consulting firm, says he wasn’t hired to dissect the report, but decided to do it on his own. He said he was concerned that local officials lacking the technical background to analyze the FHWA study would simply read the conclusions “and begin to promulgate regulations based on this faulty data.”
“The breadth and depth of the mistakes and errors were so substantial,” he said, “that it was either very, very poor science, or there was something about it that they were trying to hide.”
Seizing on safety issue
While billboard foes oppose digital signs mainly on aesthetic grounds, they’ve also seized on the safety issue, and are touting Wachtel’s critique. “We know that the issue of digital billboards and traffic safety is far from settled,” said Mary Tracy, president of the anti-billboard group Scenic America, in a prepared statement. “Any public agency considering allowing the bright, blinking signs on their roadsides should take this critique into account first.” The Outdoor Advertising Assn. of America, the billboard industry trade group, did not respond to requests for comment.
It’s the latest, and most prominent, black eye for the federal research, which was announced with fanfare in 2007.
As reported by FairWarning, records obtained under the Freedom of Information Act show that a lengthy hold-up resulted when peer reviewers shredded a draft of the federal study in spring 2011. They said the eye-glance times recorded for the test drivers were far too brief to be credible, suggesting serious problems with the equipment or mistakes in analyzing the data.
“The reported glances to billboards here are on the order of 10 times shorter than values reported elsewhere,” one reviewer wrote. Said another: “The data reported as average glance durations are not plausible.” The reviewers “have serious concerns” about glance data that “greatly undermine their confidence in the report,” former FHWA official Christopher Monk told the lead author, William A. Perez, in a May 2011 email released in response to FairWarning’s Freedom of Information Act request.
“Suffice to say that if we cannot adequately address these concerns either through counterargument or through re-analyzing, I doubt OST [Office of the Secretary of Transportation] will let it go out and it will be perceived (correctly) as a failure on our part.”
By then, agency officials were being peppered with inquiries about the status of the overdue study. Records show that they repeatedly answered, euphemistically, that it was under review. “Have no idea when we can change that message (do you?) but we will plan to continue to sound like a broken record,” wrote one official in an email to another. “Wish we could end this.”
Changes not explained
The study was finally released more than two years later, on Dec. 30, 2013. It featured major adjustments to the eye-glance data, without explaining how they were recalculated. It said the longest recorded glance at an electronic billboard was 1.34 seconds—less than the two seconds that some authorities say raises crash risks.
According to the study, “The results did not provide evidence” that electronic billboards, “as deployed and tested in the two selected cities, were associated with unacceptably long glances away from the road.”
The Outdoor Advertising Assn. of America quickly embraced the finding. “Studies have long shown that digital billboards do not cause distracted driving behavior,” said its president and CEO, Nancy Fletcher, “and this new study comes to the same conclusion.”
Much of Wachtel’s critique is steeped in bone-dry technical argot, though some puzzling details and factual discrepancies are apparent to an ordinary reader.
It notes, for example, that the federal study did not measure glances for the entire time drivers were approaching billboards. And while the draft report had listed a combined 20 electronic and 10 standard billboards on the driving courses of the two cities, the final report included only eight signs of each type. There was no explanation for discarding the rest.
The sizes of billboards and distance of setbacks from the road were also changed from one version to the other, the critique said. It added: “Perhaps the greatest concern for a reader attempting to understand the findings of this study is that, between the draft and final reports, some target billboards appear to have crossed from one side of the road to the other.”
FairWarning (www.fairwarning.org) is a nonprofit, online news organization focused on issues of safety, health and government and business accountability.