Saturday, June 6, 2009

A Culture of Safety (?)

Low pay, long hours, meager prospects for advancement - these are the reasons academic lab workers often wonder how much they're valued. Well, now it's possible to put a dollar value on the life of a lab technician: $31,000. That's the amount UCLA was fined after the death of a 23-year-old technician in an academic lab accident last December. Just to call it a "fatal accident", though, does not capture the reality of what happened. The technician, Sheri Sangji, was handling a syringe of a pyrophoric chemical, one that ignites spontaneously on contact with air. She spilled it on her sweater, which burst into flames, leaving burns over nearly half her body. After what must have been 18 agonizing days in the hospital, she died of her injuries.

Naturally this incident has sparked a lot of chatter about safety in academic laboratories. In his statement, Patrick Harran, the professor in charge of the lab where the accident occurred, mentioned something about establishing a "culture of safety". Presumably, this means a culture that is proactive about minimizing hazards, foreseeing risks, and taking safety issues seriously. Most scientists will agree to these things in principle, but do we in fact have such a culture in academic science? Please. A culture of lip service to safety, maybe, but when was the last time you saw anyone in academic science (other than Sangji) suffer any real consequence as a result of a safety issue? Take myself as an example: as a graduate student, I forgot about a warm water bath in which I was heating a plastic tray. The water evaporated and the plastic melted, filling the building with fumes. And what was the consequence to me? A few people made jokes about it in the hallways for the next week. Another time, I forgot about a Bunsen burner I had left ignited on a lab bench cluttered with paper. Again, no consequence beyond a verbal reprimand.

My advisors have not exactly been safety zealots, either. Over the years I've seen them get irritated when people spend time cleaning and organizing, dodge questions about hazards, and complain about the expense of buying latex gloves. In general, I haven't found that people in academic science take safety very seriously. My "X-ray safety training" consisted largely of a guy telling me not to stick my finger in the beam. More than once, I've noticed that scientists think safety precautions are funny, as in, "hey, look at Mr. safety dork in the lab goggles!" There is pressure to cut corners, a distaste for formal procedures, and a tendency toward workaholism; if anything, these things make for a culture of un-safety.

So should I be worried? Could I be the next victim? Yes, I could, but no, I am not worried. Despite the lack of a culture of safety, and even with occasional accidents like Sangji's, the truth is that academic science remains one of the safer occupations. From a quick glance at the CDC database, I get the impression that accidents, though they do happen in science, are much more common in blue-collar professions. The reason we have a low fatality rate in academic science has nothing to do with a "culture of safety" - it is because what we do is just not that inherently dangerous. We generally don't work with heavy machinery or climb in high places. We sometimes work with dangerous chemicals, but in limited amounts (Sangji was killed by a spill of about two ounces of t-butyl lithium; imagine an industrial operation involving two tonnes of the stuff).

Saying that academic science is low-risk, though, is different from saying that it is no-risk. The risk will never go all the way to zero, but could we push it closer? If our accidental-death rate is currently, say, one death every five years, could we push it to one every twenty? I think we could - in fact, I think it would be relatively straightforward. It's done routinely in industries that are far more inherently dangerous (think of air travel, for example). We could put better procedures in place and make sure they are followed (Sangji's death would almost certainly have been prevented by wearing a lab coat). We could institute both positive and negative incentives, e.g. rewards for proactive minimization of risks, and real consequences for negligence. So why don't we do more of these things in academic science? The reason, I think, is simple: we don't want to. The cost is too high. Safety, on the whole, is a trade-off, and despite our penchant for win-win solutions, making our labs substantially safer would involve substantial inconveniences and would mean less productivity in terms of gathering data and writing grants. I think we've collectively made the calculation that the trade-offs that could have prevented an accident like Sangji's are not worth the cost.

So instead of a real culture of safety, we have something similar to what computer security expert Bruce Schneier calls "security theater" - practices that give the appearance of going to great lengths to mitigate risks, but that in reality do little. One tell-tale sign that a proclaimed security measure is really security theater is that it involves little cost to the person putting it in place. For example, in the years after 9/11, the government put up signs everywhere promoting vigilance against terrorism ("if you see something, say something"). This certainly gave the appearance that the government had made counter-terrorism a high priority, but did anybody ever demonstrate that these signs (a minor expense, considering the size of the federal budget) actually made anybody safer? How about the "safety division" at your university? Its existence gives the impression that your university cares about safety, without costing much (perhaps five salaries in an institution that employs thousands). But does the safety division really make you safer? As with the terrorism signs, it's very hard to know, because the risk is small to begin with. Such situations encourage security theater, because it's difficult to prove or disprove the effectiveness of any approach. If a bad thing happens, say, only once every five years at worst, it's difficult to prove that you've made it even rarer.

When security theater fails (as it always does) and an accident happens, and we're not willing to admit that we took a calculated risk, then we need somebody to blame. The most telling part of Harran's statement is where he points out Sangji's role in causing the accident: "I underestimated her abilities in handling such materials," he says. But this is the paradox of safety: individual accidents are always the result of individual actions, but overall accident rates are the result of environments and policies. This point was made humbly and elegantly in the autobiography of General Omar Bradley, a commander in World War II. If you looked at any one casualty, he said, there was always some way it might have been prevented - if the soldier had been wearing his helmet, if he had been more alert, and so on. But on a large scale these things always averaged out, and though individual casualties could always be blamed on individual soldiers, overall casualty rates for whole armies never could; those were solely the result of the decisions of the highest-level commanders.

I think scientists have a hard time understanding this, because of their tendency to see things in terms of a Darwinian struggle for survival of the fittest. For despite their well-known tendency to vote Democratic, scientists have at least one thing in common with the right-wing crowd: a reluctance to acknowledge systemic causes for individual problems. Newt Gingrich looks at a woman receiving welfare to support her children and concludes that she must be lazy. Never mind that her city doesn't have nearly enough jobs to meet the demand for employment. A professor looks at a postdoc who can't find a tenure-track job and concludes that he must not be trying hard enough, or must not be smart enough. Never mind that the overwhelming majority of postdocs today will not find tenure-track jobs. So is it a surprise that a professor looks at a technician who died in a fatal lab accident and concludes that she must have been careless?

I'm afraid the accident at UCLA will bring on little more than a surge in security theater, but let me suggest something else. Maybe it's unrealistic to think that we'll ever have a real culture of safety in science, but at least we could spread the blame and consequences around a bit more fairly. The accident, after all, was not entirely Sangji's fault. Professor Harran states that she did this kind of experiment all the time, without incident. Perhaps he hopes this will exculpate him; rather, it demonstrates that a fatal accident could have happened in his lab at any time. So it seems unfair that Sangji should bear the blame and consequences alone. After all, at the time of this writing, Sangji remains dead, while Harran's career seems to have continued with little more than a hiccup - a recent article about the accident said that Harran was "at a conference and unavailable for comment". I'm guessing it wasn't a conference on lab safety. I think it would be appropriate for Harran to suffer a serious career setback as a result of this accident. That may sound harsh, and it is. I don't think Harran sounds like such a bad guy; his shortcomings in safety precautions don't sound any worse than those of the average scientist. But the same could be said of the technician who died in his lab. Without a true culture of safety, every so often, awful things are going to happen to people who don't deserve them. Just ask Sheri Sangji.

Saturday, March 7, 2009

Over the Hill

In 1905, his annus mirabilis, Albert Einstein published four papers, each of which laid foundations for the next century of physics. One of them described special relativity. He was twenty-six years old at the time.

Rosalind Franklin was the X-ray crystallographer whose experiments allowed Watson and Crick to publish their famous structure of DNA in 1953. She did that work in her early 30's.

Évariste Galois, originator of many important modern mathematical ideas such as group theory, died at age 20.

In the Manhattan Project - the federal government project that moved from concept to working atomic bomb in about the time it now takes to get a PhD - the average age of a participating scientist was 29.

I see a pattern here and a lesson for all the thirty-something postdocs out there: if you haven't had your blockbuster Nobel-prize-worthy idea by now, chances are you never will. I'm not saying you can't continue doing wonderful, creative things - things that younger people couldn't dream of doing - into your 40's, your 50's, or your 80's. I am saying that making a great fundamental stride in science probably isn't one of those things.

Maybe I'm wrong, but if I'm right, then our academic system does not reflect this reality. These days, scientists (at least in my field) are still considered "in training" well into their 30's, when it may very well be that the natural peak in performance, at least as far as scientific imagination is concerned, comes in one's 20's, and that the appropriate time for training is grade school, not grad school. It would be as if, to compete in the Olympics, you had to stay in training until you were 35, waiting in line behind all the athletes who came before you. Athletes have learned to adapt to this reality: they recognize that there is a peak period in life for the kind of performance they're after, and they have designed their careers around it. There's a place for older athletes too, just not at center court any more; instead, they coach, they commentate, they go on TV to sell cars and razor blades. Can't scientists do the same? Can't we recognize that in some things, age does matter? Let's have a place for our middle-aged and older scientists that recognizes their increased maturity and wisdom. But is it realistic to have them submitting application after application to funding agencies, claiming to be on the cusp of a revolutionary new idea?

Thursday, March 5, 2009

Hard-core

Recently a strange individual began hanging around my lab, using the equipment and reagents and making demands on my time. It turns out he is unemployed and trying to figure out what to do next, working for my advisor for free while deciding whether to go to graduate school. He now commutes half an hour each way to spend long hours gathering data for someone else's grant applications, in exchange for no money and no progress toward any kind of degree or certification. My advisor is as intimidating and demanding toward him as toward anybody else. His motivation, I presume, is the thought that someone will surely notice his intelligence and hard work, and that it will pay off somehow.

Now that is some extra-strength Kool-Aid.

Does this happen in any other industry?

Wednesday, March 4, 2009

Laziness vs. Leadership

The lab I work in, like many others, has been living in fear of bankruptcy - the last few grant proposals have been rejected, and what money we have is stretched tight. I often get the feeling that my advisor thinks this is the fault of the postdocs and grad students - that if we would just work a little harder for a little less, our funding problems would be solved.

University research labs are not alone in blaming their predicament on the greed and laziness of their lowest-paid workers - the leadership of the American auto industry does the same thing when it complains about the high price of unionized American auto workers. I suggest that those leaders also look at their own decisions over the past few decades for an explanation of their current crisis: for example, the decision to develop large, fuel-guzzling, once-profitable trucks and SUV's while neglecting fuel-efficient compact cars and hybrids. The lowest-paid workers were in no position to challenge that lapse in judgement, and no amount of hard work or sacrifice on their part could have made up for it.

So, professors, do you really think the problem is that your American postdocs are too greedy and lazy? Perhaps the financial strain science labs are experiencing reflects on scientific leadership, not just on the people who work in those labs. After all, many professors seem eager to make it clear that they are in charge - my advisor certainly has no qualms telling postdocs how to spend their time. That's fine, but if you're the decider, then the next time one of your grant proposals is rejected, you need to reflect on the quality of your decisions. They're probably more important than whether or not I come in to the lab on weekends.

Friday, February 27, 2009

A rare glimpse

I have seen the enemy, and he is us.

I went to lunch with my advisor and a visiting speaker, and was mainly a spectator to the unusually candid conversation between them. A great deal of it consisted of complaining.

Among other things, they complained that they:

  • Are not paid enough, or fairly
  • Have too many unofficial responsibilities
  • Too often bear the burden of assisting incompetent people while getting little in return
  • Endure abusive treatment backed by threats against their career advancement
  • Want more recognition of their contributions to their departments
  • Sometimes suffer the consequences of "shaft clauses" buried in the fine print of their employment contracts

In other words... they are indistinguishable from grad students and postdocs.

Thursday, February 26, 2009

I declare bankruptcy!

In an episode of the American television series "The Office", the character Michael Scott becomes anxious about his massive credit card debt, prompting him to declare dramatically while standing amidst his colleagues, "I...declare...BANKRUPTCY!" A sympathetic colleague later approaches him privately to say, "You know, Michael, you can't just say the word 'bankruptcy' and expect something to happen. That doesn't mean anything."

Similarly, postdocs and grad students cannot expect anything to come of simply declaring, "I quit science!" (perhaps to a friend or spouse, or to an advisor, or on an anonymous blog). If you're really serious about finding a different way to make a living, you need to start taking concrete steps in that direction. I recommend taking the following steps, in this order:

  • Stop drinking the Kool-Aid. Don't skip this step; otherwise you might as well not bother with the rest. If you're still hanging on to the idea that, because of all your hard work, the world owes you a job as a professor... or if at the deepest levels you still think you will turn things around and become a famous scientist, making all the personal sacrifice worthwhile... you will continue to sabotage or ignore avenues other than academic ones. You will amplify the problems of non-academic jobs, seeing even minor obstacles as evidence that you couldn't possibly succeed outside of academia, even as you continue to cope with massive problems within it. Make sure you're really prepared to leave the cult.
  • Devote a specific time to job-searching. No career-change strategy will work unless you actually do it. After years in the academic pipeline, it can be difficult to get started, and the temptation to procrastinate can be strong. To overcome this, budget a finite amount of time for your non-academic career search, and stick to the plan. Do this even if it comes at the expense of time at the lab bench or working on that manuscript you need to finish. Do it even if you're afraid your advisor will be upset. You probably work hard enough on your science; it's perfectly reasonable to set aside a couple of hours to develop your career. Find a time and place where you can concentrate, and then glue yourself to the chair. Push aside all distractions. If you haven't been on the job market recently (or perhaps ever), you will probably be at a loss for what to do at first. Just start with a blank page that says "My Career Plan" at the top, and try to think of what to do next. If the page stays blank for your first session, don't worry. The important thing at this stage is to avoid getting distracted and discouraged. If you stick with it, the ideas will start to come, I promise.
  • Polish up your resume/CV. Incredibly, some grad students (and even postdocs), despite being highly skilled professionals, do not keep an up-to-date resume. Some have the misconception that a resume isn't important, or that you don't need one until you have a specific job application in mind. People in the private sector, in contrast, seem to understand that a resume needs to be up-to-date at all times, ready at a moment's notice when an opportunity presents itself. The nice thing is, once you get your resume into good shape, it is relatively easy to keep it current. Polishing your resume right now is a modest investment of time and effort that will pay off nicely in the long run. You can find plenty of free advice online about how to write a resume, and there are sources of paid advice as well.
  • Do a thorough self-evaluation. Take stock of your skills, abilities, and accomplishments. Most scientists-in-training sell themselves short, probably because completing a dissertation means setting a very lofty long-term goal, such as "prove the existence of a new sub-Higgs boson particle." We tend to tie our sense of self-worth to our degree of success in meeting that one lofty, esoteric goal, all the while ignoring our more earthly accomplishments. In the course of searching for that frustrating Higgs particle, most scientists develop professional skills such as writing, teaching, and data organization. Though academic departments might not be too impressed by your wizardry with Microsoft Excel, someone in the private sector just might be. As you recall your skills and accomplishments, add them to your resume. Don't worry about adding too much: the more you have, the more easily you can tailor your resume for each position you apply for.
  • Come up with an organization system. When you have an idea, or an opportunity comes your way, you need a ready-made system for keeping track of it so that you can follow through. It doesn't matter so much what your system is, just that it works for you. On my computer I have a folder for every job opening or interesting company that I hear about, containing a tailored resume, a cover letter, and a summary of any correspondence. I have another folder for each professionally interesting person I come in contact with, containing notes on whatever we discussed. If you maintain such a system for tracking your progress, your senses will be keener to pick up opportunities that present themselves. (A small sketch of one way to set this up appears after this list.)
  • Prepare yourself to deal with frustration and failure. Perhaps the hardest part of searching for a job is this: you are going to be turned down, more than once. Many academics have a long track record of passing every test that comes their way, so it's very difficult not to take it as a personal failure when a job application is rejected. Early in my process, I sent a resume and cover letter to a small company I liked. I had already started fantasizing about what it would be like to work there. I got no reply. It was hard not to get discouraged. As it turns out, the rejection probably didn't have much to do with my qualifications: a few months later, that company ran into financial difficulty and announced layoffs. The job market is a market, not a test. The purpose of contacting a potential employer is to exchange information: to see if there can be a trade (your money for my time) that is worthwhile for both parties. The sooner we figure out that no deal is possible right now, the sooner I can move on to the next opportunity. One way to get into this mindset is to make rejection your goal. Set out this week or this month with the goal of applying for three jobs and being turned down for all of them. No despair or complaining is allowed until you have those three (or five, or ten) rejections. This frees you from worrying so much about the outcome (getting a new job) and lets you focus instead on the process: How did you find this opportunity? Using the same method, could you find another? How did the application process go? Could you improve it? Finding a new job is a long-term process; in fact, it will continue for the rest of your career. In asking for a job, as in asking for other things in life, it pays to develop the capability to handle "no" as an answer.
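
Since the organization system described above is just folders and files, it can even be automated. Below is a minimal sketch in Python of what such a system might look like if scripted. To be clear, every name in it - the base folder, the master-resume path, the file names - is a hypothetical choice for illustration, not a description of my actual setup; the point is only that creating a new, pre-seeded folder for an opportunity should take a single command, so following through on a fresh lead costs nothing.

    import shutil
    from datetime import date
    from pathlib import Path

    # Hypothetical layout: one folder per job opening or contact, seeded
    # with a copy of a master resume (to be tailored for that position)
    # and a notes file for logging correspondence.
    BASE = Path.home() / "career_search"
    MASTER_RESUME = BASE / "master_resume.txt"  # assumed to already exist

    def new_opportunity(name: str) -> Path:
        """Create and seed a folder for one job opening or contact."""
        folder = BASE / name.lower().replace(" ", "_")
        folder.mkdir(parents=True, exist_ok=True)

        # Start from the master resume; this copy gets tailored by hand.
        tailored = folder / "resume.txt"
        if MASTER_RESUME.exists() and not tailored.exists():
            shutil.copy(MASTER_RESUME, tailored)

        # Running log: cover letters sent, replies received, people met.
        notes = folder / "notes.txt"
        if not notes.exists():
            notes.write_text(name + "\nOpened: " + str(date.today()) + "\n\n")

        return folder

    if __name__ == "__main__":
        new_opportunity("Acme Biotech staff scientist")  # example entry

The specific tool matters much less than the habit: if the folder for a new lead exists within a minute of your hearing about it, the follow-through is far more likely to happen.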

Saturday, January 31, 2009

Newspeak

We scientists are using some odd language to describe our careers these days: we are undergrads, then grad students, then postdocs, then perhaps PI's. We're sponsored by the NSF and the NIH.

Why can't we speak in English? Does it really save so much time to leave four letters out of the phrase "graduate school"? Perhaps it has more to do with a distaste for being reminded that we're still in school, despite having already graduated.

Neologisms can change the way we think. Take, for example, the term "NIH". Most people outside of science don't know what the NIH is, but, told that it stands for "National Institutes of Health", they could probably infer that it is a part of the federal government concerned with medical matters. Those of us who are intertwined with the NIH know that this description is far too general. For sure, the NIH supports health scientists, but the term "science" has a much broader meaning than the kinds of activities the NIH actually supports. For example, imagine a scientist who proposes an exciting yet unlikely hypothesis (let's say a 1 in 10 chance of being right). To test the hypothesis, the scientist designs a set of experiments, which will be pursued to completion no matter what the "preliminary" results show. The results of the experiments will be published whether they confirm or refute the hypothesis. Is this "science"? I think yes, but anyone seriously hoping to get NIH funding would never submit a proposal like that. To get NIH funding, you submit preliminary results in support of a hypothesis you know is probably correct. This may be a good way to do science, at least some of the time - it's not that the approaches promoted by the NIH are bad, it's just that for many scientists the NIH is so dominant that it's easy to forget there are other approaches. "NIH" is not just an institution but a whole culture of science, indeed a way of life for some. The acronym therefore has a meaning far more precise and far less flexible than can be conveyed by the mere English phrase "National Institutes of Health". Perhaps by concentrating on our "NIH R01", rather than on our "National Institutes of Health research proposal", we can avoid the unpleasant thought that we lack the freedom to define "research", "proposal", and "science" as broadly as the English language would allow.

That's why when someone talks about their "pubs", I question whether they are really just trying to save the time it takes to say "-lications". At least we're not calling ourselves "FirstAuth" yet.

That neologisms should be a red flag was pointed out, of course, half a century ago:

Even in the early decades of the twentieth century, telescoped words and phrases had been one of the characteristic features of political language; and it had been noticed that the tendency to use abbreviations of this kind was most marked in totalitarian countries and totalitarian organizations. Examples were such words as Nazi, Gestapo, Comintern, Inprecorr, Agitprop. In the beginning the practice had been adopted as it were instinctively, but in Newspeak it was used with a conscious purpose. It was perceived that in thus abbreviating a name one narrowed and subtly altered its meaning, by cutting out most of the associations that would otherwise cling to it. The words Communist International, for instance, call up a composite picture of universal human brotherhood, red flags, barricades, Karl Marx, and the Paris Commune. The word Comintern, on the other hand, suggests merely a tightly-knit organization and a well-defined body of doctrine. It refers to something almost as easily recognized, and as limited in purpose, as a chair or a table. Comintern is a word that can be uttered almost without taking thought, whereas Communist International is a phrase over which one is obliged to linger at least momentarily. In the same way, the associations called up by a word like Minitrue are fewer and more controllable than those called up by Ministry of Truth. This accounted not only for the habit of abbreviating whenever possible, but also for the almost exaggerated care that was taken to make every word easily pronounceable.

In Newspeak, euphony outweighed every consideration other than exactitude of meaning. Regularity of grammar was always sacrificed to it when it seemed necessary. And rightly so, since what was required, above all for political purposes, was short clipped words of unmistakable meaning which could be uttered rapidly and which roused the minimum of echoes in the speaker's mind... The use of them encouraged a gabbling style of speech, at once staccato and monotonous. And this was exactly what was aimed at. The intention was to make speech, and especially speech on any subject not ideologically neutral, as nearly as possible independent of consciousness. For the purposes of everyday life it was no doubt necessary, or sometimes necessary, to reflect before speaking, but a Party member called upon to make a political or ethical judgement should be able to spray forth the correct opinions as automatically as a machine gun spraying forth bullets. His training fitted him to do this, the language gave him an almost foolproof instrument, and the texture of the words, with their harsh sound and a certain wilful ugliness which was in accord with the spirit of Ingsoc, assisted the process still further. So did the fact of having very few words to choose from.

-- George Orwell, "1984"