Graphs test official reports
Text analysis picks apart complex chains of reasoning in inquiry dossiers.
15 August 2003
John Whitfield
Visualizing arguments can help build consensus. © Corbis
A new analytical technique tests the conclusions and
distils the arguments of complex documents. Designed to
scour accident-inquiry reports, it could probe other
long, controversial accounts, such as the UK
government's Iraq dossier, or the US government's report
on the terrorist attacks on 11 September 2001.
"This is a way to check that things aren't being
oversimplified or hidden," says its developer, Chris
Johnson, an accident analyst at the University of
Glasgow, UK. The approach reveals whether or not a
report's content supports its conclusions.
The analysis produces diagrams of conclusions,
analysis and evidence. If, for example, a report blames
an accident on human error, a computer search of the
whole report pulls out references to this. A reader then
follows the trail of argument leading to each
conclusion, and pinpoints the evidence for or against
it.
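The article gives only an outline of how such a tool works, but the core step it describes can be illustrated with a short, hypothetical Python sketch. The function names (find_mentions, build_argument_graph), the keywords and the sample text below are illustrative assumptions, not Johnson's software: each stated conclusion is linked to the sentences that mention its key terms, producing a rough map of evidence that a reader could then trace.

    import re
    from collections import defaultdict

    def find_mentions(report_text, keywords):
        """Return the sentences in the report that mention any of the keywords."""
        sentences = re.split(r'(?<=[.!?])\s+', report_text)
        hits = defaultdict(list)
        for sentence in sentences:
            lowered = sentence.lower()
            for keyword in keywords:
                if keyword in lowered:
                    hits[keyword].append(sentence.strip())
        return hits

    def build_argument_graph(conclusions, report_text):
        """Link each conclusion to the passages that mention its key terms,
        so a reader can follow the trail of evidence for or against it."""
        return {conclusion: find_mentions(report_text, keywords)
                for conclusion, keywords in conclusions.items()}

    if __name__ == "__main__":
        # Invented sample text, loosely echoing the ambulance-dispatch example.
        report = (
            "The dispatch system failed under load. "
            "Testing of the software had been limited. "
            "Operators reported extensive testing during the previous year. "
            "Human error contributed to delayed responses."
        )
        conclusions = {
            "Technical failure caused by lack of testing": ["testing"],
            "Human error": ["human error"],
        }
        for conclusion, evidence in build_argument_graph(conclusions, report).items():
            print(conclusion)
            for keyword, sentences in evidence.items():
                for s in sentences:
                    print(f"  [{keyword}] {s}")

Run on a real report, a search like this would surface both the passages that support a conclusion and those that cut against it, which is the kind of inconsistency described in the example that follows.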
In this way, Johnson has recently found
inconsistencies in the 80-page document discussing the
breakdown of London's computerized ambulance dispatch
system. It blamed technical failure brought on by lack
of testing. But Johnson's diagrams picked up references to
the system's extensive testing throughout the
report.
The technique has a subjective element, Johnson
admits. "If two different people analyse the same report
using the same technique, their diagrams will differ.
But by visualizing the arguments, you can help build a
consensus - it's a tool for encouraging agreement."
"It's a very helpful technique," says Peter Ladkin of
Bielefeld University, Germany, who also develops
computer programs for analysing reports. "Before this,
people had to keep all the hypotheses and
counterhypotheses in their heads, and most of us aren't
good at doing that."
Accident reports sorely need such review, he says.
"In about 50% of cases the official reports give
misleading indications of what the causes were."
"Major accidents tend to be done very well," says
Graham Braithwaite, director of the Safety and Accident
Investigation Centre at Cranfield University, UK. "But
more companies are looking at day-to-day accidents, and
that's where the weaknesses come in. Any tools and
guidance they can get is a good thing."
Representing reasoning
The UK government is currently defending the accuracy
of the dossier it produced on Iraq's weapons of mass
destruction last year. Following the death of government
weapons adviser David Kelly, who discussed the report
with journalists, a prominent judge, Lord Hutton, is
investigating the case.
Document-analysis techniques could usefully be
applied to the Iraq case, says Ladkin: "Good methods of
representing reasoning are absolutely crucial to
understanding any major issue."
"It might be the sort of thing that should be brought
to Lord Hutton's attention," says computer scientist
John McDermid of York University, UK, who studies safety
and security systems. But one would need all of the
drafts of the report to follow the chain of argument, he
says, not just the final version.
The technique can be applied to other types of data.
Last week, at the International System Safety Conference
in Ottawa, Canada, Johnson presented a comparison of the
media coverage of the 2000 Paris Concorde crash with the
2002 official inquiry [1].
He thinks that journalists converged on the conclusion
the inquiry reached two years later so quickly that
someone with inside knowledge of the investigation must
have been briefing them.