Low pay, long hours, meager prospects for advancement - these are the
reasons academic lab workers often wonder how much they're valued.
Well, now it's possible to put a dollar value on the life of a lab
technician - $31,000. That's the amount UCLA was fined after the death of a 23-year-old technician in an academic lab accident last December.
To call it simply a "fatal accident", though, does not capture the reality of what happened. The technician, Sheri Sangji, was handling a syringe of a pyrophoric chemical, one that ignites spontaneously on contact with air. She spilled it on her sweater, which burst into flames, leaving burns over nearly half her body. After what must have been 18 agonizing days in the hospital, she died of her injuries.
Naturally this incident has sparked a lot of chatter about safety in
academic laboratories. In his statement, Patrick Harran, the
professor in charge of the lab where the accident occurred, mentioned
something about establishing a "culture of safety". Presumably, this
means a culture that is proactive about minimizing hazards, foreseeing
risks, and taking safety issues seriously. Most scientists will agree with these things in principle, but do we in fact have such a culture in academic science? Please. A culture of lip service to safety, maybe, but when was the last time you saw anyone in academic science (other than Sangji) suffer any kind of real consequence as a result of a safety issue? Take myself as an example: as a graduate student, I forgot about a warm water bath in which I was heating a
plastic tray. The water evaporated and the plastic melted, filling
the building with fumes. And what was the consequence to me? A few
people made jokes about it in the hallways for the next week. Another
time, I forgot about a Bunsen burner I had left burning on a lab bench that was cluttered with paper. Again, no consequence beyond a verbal
reprimand.
My advisors have not exactly been safety zealots, either.
Over the years I've seen them get
irritated when people spend time cleaning and organizing, dodge
questions about hazards, and complain about the expense of buying
latex gloves. In general I haven't found that people in academic
science take safety very seriously. My "X-ray safety training"
consisted largely of a guy telling me not to stick my finger in the
beam. More than once, I've seen scientists treat safety precautions as a joke: "Hey, look at Mr. Safety Dork in the lab goggles!" There is pressure to cut corners, a distaste for formal
procedures, and a tendency towards workaholism; if anything, these
things make for a culture of un-safety.
So should I be worried? Could I be the next victim? Yes, I could,
but no, I am not worried. Despite the lack of a culture of safety,
and even with occasional accidents like Sangji's, the truth is that
academic science remains one of the safer occupations.
From a quick glance at the CDC database, I get the impression that accidents - though they do happen in science - are much more common in blue-collar professions.
The reason we have a low fatality rate in academic science has nothing to do with a "culture of safety" - it is because what we do is just not that inherently dangerous. We generally don't work with heavy machinery or climb in high places. We sometimes work with dangerous chemicals, but in limited amounts (Sangji was killed by a spill of two ounces of t-butyllithium; imagine an industrial operation involving two tonnes of the stuff).
Saying that academic science is low-risk, though, is different from
saying that it is no-risk. The risk will never go all the way to
zero, but could we push it any closer? If currently our
accidental-death rate is one every five years, could we push it to,
say, one death every twenty years? I think we could - in fact I think it would be relatively straightforward. It's done routinely in other industries that are more inherently dangerous (think of air travel, for example). We could put better procedures in place, and make sure they are followed (a lab coat would almost certainly have prevented Sangji's death). We could institute both positive and negative incentives, e.g. rewards for proactive minimization of risks, and real consequences for negligence. So why don't we do more of these things in academic science? The reason, I think, is simple -
we don't want to. The cost is too high. Safety, on the whole, is a
trade-off, and despite our penchant for win-win solutions, making our
labs substantially safer would involve substantial inconveniences and
would mean less productivity in terms of gathering data and writing
grants. I think we've collectively made the calculation that the
trade-offs that could have prevented an accident like Sangji's are not
worth the cost.
So instead of a real culture of safety, we have something similar to what computer security expert Bruce Schneier calls 'security theater' -
practices that give the appearance of going to great lengths to
mitigate risks, but that in reality do little. One tell-tale sign
that a proclaimed security measure is really security theater is when
it involves little cost to the person putting it in place. For
example, in the years after 9/11 the Department of Homeland Security put up signs everywhere promoting vigilance against terrorism ("If you see something, say something"). This certainly gave the appearance
that the government had made counter-terrorism a high priority, but
did anybody ever really demonstrate that these signs (which
must have been a minor expense, considering the size of the federal
budget) actually made anybody safer? How about the "safety division" at your university? Its existence gives the impression that your university cares about safety, without costing much (perhaps five salaries in an institution that employs thousands). But does the safety
division really make you safer? As with the terrorism signs, it's
very hard to know, because the risk is small to begin with. Such
situations encourage security theater, because it's difficult to prove or disprove the effectiveness of any approach. If a bad thing happens, say, only once every five years at worst, it's difficult to prove that you've made it even rarer.
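To see concretely why, here is a back-of-the-envelope sketch in Python. All the numbers are assumptions invented for illustration (a baseline of one fatal accident per five years, an improved rate of one per twenty), and modeling accidents as a Poisson process is a simplification:

```python
import math

# Illustrative, assumed numbers only - not real accident statistics.
old_rate = 1 / 5    # assumed baseline: one fatal accident per 5 years
new_rate = 1 / 20   # assumed improvement: one per 20 years

def p_zero_accidents(rate, years):
    """Probability of observing zero accidents over `years` years,
    modeling accidents as a Poisson process with the given rate."""
    return math.exp(-rate * years)

# How long must an accident-free streak last before it becomes
# implausible (< 5% probability) under the old, unimproved rate?
years_needed = math.log(20) / old_rate
print(f"Accident-free years needed for ~95% confidence: {years_needed:.0f}")  # ~15

# After a decade with no accidents, plain luck under the old rate is
# still quite plausible, and hard to tell apart from a real improvement:
print(f"P(no accidents in 10 yrs | old rate) = {p_zero_accidents(old_rate, 10):.2f}")  # ~0.14
print(f"P(no accidents in 10 yrs | new rate) = {p_zero_accidents(new_rate, 10):.2f}")  # ~0.61
```

With events this rare, even a decade of clean records barely separates a genuinely safer lab from a lucky unsafe one; that ambiguity is exactly the environment in which theater thrives.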
When security theater fails (as it always does), and an accident
happens, and we're not willing to admit that we took a calculated
risk, then we need somebody to blame. The most telling part of
Harran's statement is where he points out Sangji's role in causing the
accident: "I underestimated her abilities in handling such materials",
he says. But this is the paradox of safety: individual accidents are
always the result of individual actions, but overall accident rates
are the result of environments and policies. This point was made
humbly and elegantly in the autobiography of General Omar Bradley,
a commander in World War II.
If you looked at any one casualty, he said, there was always some way it might have been prevented - if the soldier had been wearing his helmet, if he had been more alert, and so on. But on a large scale these things always averaged out, and though individual casualties could always be blamed on individual soldiers, overall casualty rates for whole armies never could; those were solely the result of the decisions of the highest-level commanders.
I think scientists have a
hard time understanding this, because of their tendency to see things
in terms of a Darwinian struggle for survival of the fittest. For
despite their well-known tendency to vote Democratic, scientists have
at least one thing in common with the right-wing crowd: a reluctance
to acknowledge systemic causes for individual problems. Newt Gingrich
looks at a woman receiving welfare to support her children, and
concludes that she must be lazy. Never mind that her city has nowhere near enough jobs to meet the demand for employment. A professor
looks at a postdoc who can't find a tenure-track job, and concludes
that it must be because he's not trying hard enough or is not smart
enough. Never mind that an overwhelming majority of postdocs today
will not find tenure-track jobs. So is it any surprise that a professor looks at a technician who died in a lab accident and concludes that she must have been careless?
I'm afraid the accident at UCLA will bring on little more than a surge
in security theater, but let me suggest something else. Maybe it's
unrealistic to think that we'll ever have a real culture of safety in
science, but at least we could spread the blame and consequences
around a bit more fairly. The accident, after all, was not entirely
Sangji's fault. Professor Harran states that she did this kind of
experiment all the time, without incident. Perhaps he hopes this will
exculpate him, but instead it demonstrates that a fatal accident could
have happened in his lab at any time. So it seems unfair that Sangji
should bear the blame and consequences alone. After all, at the time
of this writing, Sangji remains dead, while Harran's career seems to
have continued with little more than a hiccup - a recent article about
the accident said that Harran was "at a conference and unavailable for
comment". I'm guessing it wasn't a conference on lab safety. I
think it would be appropriate for Harran to suffer a serious career
setback as a result of this accident. That may sound harsh, and it
is. I don't think Harran sounds like such a bad guy. His
shortcomings in safety precautions don't sound any worse than those of
the average scientist. But the same could be said of the technician
who died in his lab. Without a true culture of safety, every so
often, awful things are going to happen to people who don't deserve
it. Just ask Sheri Sangji.