Vials of bioterror bacteria have gone missing. Lab mice infected with deadly viruses have escaped, and wild rodents have been found making nests with research waste. Cattle infected in a university’s vaccine experiments were repeatedly sent to slaughter and their meat sold for human consumption. Gear meant to protect lab workers from lethal viruses such as Ebola and bird flu has failed, repeatedly.
A USA TODAY Network investigation reveals that hundreds of lab mistakes, safety violations and near-miss incidents have occurred in biological laboratories coast to coast in recent years, putting scientists, their colleagues and sometimes even the public at risk.
Oversight of biological research labs is fragmented, often secretive and largely self-policing, the investigation found. And even when research facilities commit the most egregious safety or security breaches — as more than 100 labs have — federal regulators keep their names secret.
Of particular concern are mishaps occurring at institutions working with the world’s most dangerous pathogens in biosafety level 3 and 4 labs — the two highest levels of containment that have proliferated since the 9/11 terror attacks in 2001. Yet there is no publicly available list of these labs, and the scope of their research and safety records are largely unknown to most state health departments charged with responding to disease outbreaks. Even the federal government doesn’t know where they all are, the Government Accountability Office has warned for years.
A team of reporters who work for the USA TODAY Network of Gannett newspapers and TV stations identified more than 200 of these high-containment lab facilities in all 50 states and the District of Columbia operated by government agencies, universities and private companies. They’re scattered across the country from the heart of New York City to a valley in Montana; from an area near Seattle’s Space Needle to just a few blocks from Kansas City’s Country Club Plaza restaurant and shopping district.
High-profile lab accidents last year with anthrax, Ebola and bird flu at the Centers for Disease Control and Prevention and the discovery of forgotten vials of deadly smallpox virus at the National Institutes of Health raised widespread concerns about lab safety and security nationwide and whether current oversight is adequate to protect workers and the public. On Wednesday, the Department of Defense disclosed that one of its labs in Utah mistakenly sent samples of live anthrax — instead of killed specimens — to labs across the USA and to a military base in South Korea, where 22 people are now being treated with antibiotics because of their potential exposure to the bioterror pathogen. As many as 18 labs in nine states received the samples, the CDC said Thursday.
“What the CDC incidents showed us … is that the very best labs are not perfectly safe,” says Marc Lipsitch, a Harvard University professor of epidemiology. “If it can happen there, it certainly can happen anywhere.”
Some people take little reassurance from the fact that nobody was sickened in the CDC accidents, from the historically low numbers of serious infections among lab workers generally, or from the fact that infections spreading into communities surrounding labs have been rarer still.
“Many of us think that’s really a matter of good fortune,” said Beth Willis, who chairs a citizen lab advisory panel in Frederick, Md., home to one of the nation’s largest high-containment research campuses at the Army’s Fort Detrick.
The country’s best labs have robust safety programs, said Kenneth Berns, co-chair of a panel of outside lab safety advisers currently examining biosafety at CDC and other federal labs. Yet the systemic safety problems identified at the CDC’s prestigious labs have raised questions about what’s happening elsewhere. “It’s a matter of some concern,” said Berns, a distinguished professor emeritus of molecular genetics and microbiology at the University of Florida.
“You’re talking about something that has the ability to take off, and we could not be confident of being able to contain it,” he said.
David Relman, a Stanford University microbiologist, said that not enough is known about the state of safety at labs that perform infectious disease research but emphasized that the kinds of labs drawing concern are the same ones the public needs to discover important new treatments and vaccines. “We have to find some happy blend of minimized risk and enhanced benefit,” he said.
At the high-containment labs identified by USA TODAY, experiments are underway involving drug-resistant tuberculosis, exotic strains of flu, the SARS and MERS viruses, plague, anthrax, botulism, ricin and the Ebola and Marburg hemorrhagic fever viruses, according to interviews and more than 20,000 pages of internal lab safety records and incident reports obtained from labs across the country.
Studies are also being done on a wide range of bioterrorism pathogens that are less known to the public, such as the agents that cause exotic diseases like tularemia, Q fever and melioidosis. Still others are focused on pathogens that pose serious economic risks to agriculture, such as foot-and-mouth disease, brucellosis and “mad cow” disease.
At a few labs, experiments have been done with strains of flu and other viruses purposely made to be more dangerous in studies that seek to understand how they might mutate naturally. White House science advisers called for a temporary halt to that kind of “gain-of-function” research last fall while expert scientific panels spend the next year studying its risks and benefits.
The research at BSL-3 and BSL-4 labs — which use special equipment, negative air pressure and numerous safety and security procedures — seeks to better understand how organisms cause disease and ways to protect against them. It’s the kind of work that the public doesn’t give much thought to until people with Ebola arrive on planes in the United States from an outbreak in Africa, or the current avian flu outbreak forces farmers to kill millions of chickens, raising the specter of higher egg prices.
It’s impossible to obtain a full accounting of lab accidents or lab-acquired infections because there is no universal, mandatory requirement for reporting them and no system to analyze trends to assess emerging biosafety risks and disseminate lessons learned on a regular basis.
The Federal Select Agent Program, which inspects and regulates the subset of research labs that experiment with about four dozen types of pathogens deemed to pose bioterror threats, requires labs to report potential exposure or release incidents, as well as thefts or losses of specimens.
From 2006 through 2013, labs notified federal regulators of about 1,500 incidents with select agent pathogens and, in more than 800 cases, workers received medical treatment or evaluation, limited public data in program annual reports show. Fifteen people contracted laboratory-acquired infections and there were three unintended infections of animals, according to the reports, which do not identify labs and mostly provide aggregated counts of incidents by type. Reported incidents involve events ranging from spills to failures of personal protective equipment or mechanical systems to needle sticks and animal bites.
The program, jointly run by the Centers for Disease Control and Prevention and the U.S. Department of Agriculture, refuses to release copies of detailed incident reports, citing a 2002 bioterrorism law.
Incident records the USA TODAY Network obtained directly from individual labs provide a window on the kinds of mistakes that happen. An animal caretaker in Georgia was potentially exposed to a bird flu virus that kills 60% of the people it infects when a defective respirator hose supplying purified air detached from its coupling in September. A researcher in Wisconsin was quarantined for seven days in 2013 after a needle stick with a version of the same H5N1 influenza virus. A lab worker in Colorado failed to ensure specimens of the deadly bacterium Burkholderia pseudomallei had been killed before shipping them in May 2014 to a co-worker in a lower-level lab who handled them without critical protective gear. None of the workers was infected.
The public and the lab community tend to learn only about the rare instances of serious or fatal lab infections, which sometimes are published as case reports in scientific journals or make national news.
In 2009, Malcolm Casadaban, a University of Chicago scientist with an underlying medical condition, died from an infection with a weakened strain of plague bacteria. In 2012, 25-year-old researcher Richard Din died after being infected during vaccine research involving Neisseria meningitidis bacteria at a lab inside San Francisco’s VA medical center. Both of their deaths involved research in biosafety level 2 labs, where pathogens are considered to be less dangerous than those worked with in high-containment labs.
Din, who became a researcher to cure diseases like the cancer that killed his mother, developed a fever and started feeling dizzy while out to dinner with friends. He had no idea how serious his symptoms were, his friends and family told USA TODAY. By morning, Din was covered in a splotchy rash and could barely talk, recalled Lawrence Tsai, who raced to Din’s apartment to help.
Tsai carried his friend down two flights of stairs and drove him to the hospital. “His body was very hard, very straight,” Tsai said. “Only his eyes were open. He could not say anything.”
A few hours later, Din was dead. And Tsai said he and his friends were told they, too, were at risk and needed to take antibiotics because of their close contact with him. The bacterium that killed Din can spread from person to person through direct contact with respiratory secretions. About two dozen emergency room workers also were treated with antibiotics as a precaution, according to a presentation about the case at a scientific conference. Nobody else was sickened.
Federal workplace safety investigators, who investigated because the case involved a death, said Din died because the VA failed to adequately supervise and protect workers in the research lab. Among the “serious” issues they cited: Din and other workers in the lab were manipulating specimens of the dangerous bacteria out on tabletops — not inside protective biosafety cabinets that would have reduced potential exposures to droplets or splashes. The lab also failed to train workers about warning signs of infection, violation records show.
Although lab-created outbreaks that spread to people or animals in the surrounding community are rare, they have happened.
“That’s what you would worry about,” said Gigi Kwik Gronvall, of the UPMC Center for Health Security, an independent think tank that studies biosecurity and epidemics. “But even then the consequences up to now have been limited to the very close contacts of the person who was infected.”
A small, deadly outbreak of severe acute respiratory syndrome in China in 2004 was traced to lab workers at the National Institute of Virology in Beijing. In 2007, an outbreak of foot-and-mouth disease among cattle in England that required herds to be slaughtered was blamed on leaking drainage pipes at a nearby research complex.
In Louisiana, tests are underway to make sure a deadly bioterror bacterium hasn’t colonized the soil and water around the Tulane National Primate Research Center near New Orleans. Late last year, the bacterium, Burkholderia pseudomallei, got out of one of the center’s BSL-3 labs, likely hitching a ride on workers’ clothing, sickening two monkeys that lived in outdoor cages and later infecting others. Tulane will spend the next five years testing its outdoor monkey colony, as well as wildlife and feral cats around the 500-acre facility, to ensure the bacteria haven’t contaminated the environment. The CDC and Tulane say they think the bacteria spread only inside the center’s buildings, and so far outdoor tests have not detected B. pseudomallei, which can cause severe and difficult-to-treat illness in people and animals that come into contact with contaminated soil or water.
On a global scale, a lab accident is considered by many scientists to be the likely explanation for how an H1N1 flu strain re-emerged in 1977 that was so genetically similar to one that had disappeared before 1957 it looked as if it had been “preserved” over the decades. The re-emergence “was probably an accidental release from a laboratory source,” according to a 2009 article in the New England Journal of Medicine.
However, most pathogens studied in labs, unlike the flu, don’t spread easily from person to person. Often, to become infected a person needs to have direct contact with a pathogen, which is why lab workers are most at risk, experts said. For example, people can become infected with anthrax by inhaling the bacterium’s spores, but once sickened they are not contagious, according to the CDC.
“I don’t think the public needs to be too concerned,” said Marian Downing, president of the American Biological Safety Association. “There are multiple levels of checks and balances in place.”
Beyond accidental lab-associated outbreaks, federal auditors consider the deliberate theft and misuse of a deadly pathogen to be one of the most significant risks of biolab research. That’s what the FBI says happened in the 2001 anthrax letter attacks that killed five and sickened 17. Bruce Ivins, a biologist and anthrax researcher at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) at Fort Detrick, Md., was the perpetrator, the FBI concluded.
The GAO, the investigative arm of Congress, has issued repeated warnings since 2007 that the proliferation of BSL-3 and BSL-4 laboratories has increased the aggregate risk of accidental or intentional releases of viruses, bacteria or toxins.
No single agency tracks the overall number or location of these labs, the GAO has said. Little is known about high-containment labs working with dangerous pathogens such as tuberculosis, the MERS virus and others that aren’t on the select agent list and thus aren’t tracked by the Federal Select Agent Program.
National standards for constructing and operating these kinds of labs are lacking, which means lab design and operation vary with local building requirements. While voluntary guidance exists for safe lab design and operations, the GAO has found it is not universally followed.
The documents obtained by USA TODAY show that power failures at BSL-3 labs at Texas A&M University during 2013 repeatedly caused the labs to lose negative air pressure, a key safety feature among several used to keep pathogens contained inside the lab. The CDC’s labs in Atlanta also have had airflow problems over the years, the newspaper previously reported.
“The public is concerned about these laboratories because exposing workers and the public to dangerous pathogens, whether deliberate or accidental, can have disastrous consequences,” the GAO’s Nancy Kingsbury told Congress at a hearing on the CDC lab incidents last summer.
Lab regulators at the Federal Select Agent Program — whose departments often fund the research they oversee — would not grant interviews despite repeated requests since last year. The program oversees 262 organizations that operate BSL-3 labs and eight that operate BSL-4 labs.
The two federal agencies that jointly run the program — the CDC and USDA — operate their own labs, which have been involved in recent high-profile incidents.
“We believe the current system of inspecting/overseeing laboratories is adequate, but we are always open to continued improvements,” the CDC said in an emailed statement. USDA officials also declined to be interviewed.
Lab safety officials at the National Institutes of Health, a major research funding agency that operates its own labs and helps set national biosafety guidelines, also declined interview requests.
“There is no ‘zero-risk’ proposition in the conduct of research,” the agency said in a statement. “NIH works extremely hard to minimize all research-related risks.”
More than 100 labs experimenting with potential bioterror agents have been cited by regulators at the CDC and USDA for serious safety and security failings since 2003, USA TODAY has learned.
Yet much of select agent oversight is cloaked in secrecy, making it difficult to assess regulators’ effectiveness in ensuring safety. In several instances, troubled labs and even federal regulators appeared to misrepresent the significance of the government’s enforcement efforts.
Since 2003, the CDC has referred 79 labs for potential enforcement actions by the U.S. Department of Health and Human Services’ Office of Inspector General. It has levied fines against 19 of them totaling more than $2.4 million, the CDC said in response to questions.
Some are repeat offenders. Five labs have had “multiple referrals” for enforcement actions, the CDC said. Two labs have been kicked out of the program, and five labs have been suspended from doing any select agent research, the agency said.
Which labs repeatedly failed to address safety problems? The CDC won’t name names — not even for the two labs kicked out of the select agent program. The CDC and its regulatory partners at the USDA say the 2002 bioterrorism law requires keeping this information secret.
Yet earlier this year, the CDC publicly announced its suspension of the Tulane National Primate Research Center — after the center’s accidental release of a bioterror bacterium became publicly known and was the subject of news reports. The CDC said it balances the public’s right to transparency with the risk posed by information being made available to those who might use it to threaten public health or security.
Currently seven labs are under the extra scrutiny of a federal select agent lab performance improvement program, the CDC said. The program is offered as a voluntary alternative to suspension or other regulatory action, the agency said, for labs with a “repeated failure to correct past observation, biosafety and security concerns” or failures to comply with extra security requirements for work with “Tier 1” select agents. Tier 1 agents are those deemed to pose the greatest risk of deliberate misuse with the most significant potential for mass casualties or devastating economic effects.
While under scrutiny of the program, an individual researcher or project must halt the research that has been found in violation, but other select agent research at the institution generally is allowed to continue, the CDC said.
Thirty-three labs have been put on performance improvement programs since 2008, the CDC said. Their names are secret, too.
Dozens more labs have faced regulatory actions from the USDA, which takes the lead overseeing select agent labs primarily working with animal or agricultural pathogens. The USDA says it has conducted 48 investigations that have resulted in $116,750 in fines.
The USDA said all of its enforcement records about these fines are required to be kept secret because of the 2002 bioterrorism law. The USDA did release a spreadsheet it says documents its actions, but the agency redacted almost all the information on it: lab names, violation types and even dates. Only a few references to warning letters and fines were spared the agency’s black marker.
The Federal Select Agent Program says no law or regulation bars the labs themselves from discussing their select agent research. And universities and other research institutions routinely publish their research on select agent pathogens in scientific journals.
Registered labs just aren’t supposed to share details of specific security measures, such as the locations of keys and codes that would give access to pathogens. The CDC and USDA said there is nothing that prohibits labs from releasing information or answering questions about any regulatory problems they’ve had. Yet few were willing to readily discuss violations or failed inspections.
Labs at the University of Hawaii-Manoa are among those in the federal performance improvement program, at least as of January, records obtained by USA TODAY show. Although the secrecy provisions of the 2002 bioterrorism law apply only to certain federal agencies, officials at the state-run university cited that law among their reasons for denying requests for records about safety violations and the performance improvement program.
The university inadvertently confirmed that its Honolulu labs had been put in the performance improvement program in records it filed in January with Hawaii’s Office of Information Practices, which is deciding USA TODAY’s public records appeal. The university wrote that being put on a PIP is something it is “proud” of.
“We do not believe entering into the program is an embarrassment, we think it should be showcased, but that would be improper because as participants in the Federal Select Agent Program, we are obligated to keep this information private,” the university wrote to the appeals agency, adding that it “has been an exemplary participant in the Federal Select Agent Program.”
University of Hawaii officials declined to be interviewed.
Last year, two labs agreed to pay fines handed down by the HHS Office of Inspector General for select agent violations, records show.
A lab that federal officials would describe only as an “Arizona research university” agreed in 2014 to pay a $165,000 fine for failing to keep accurate inventory records for select agents and not having biosafety procedures adequate for the risks associated with the pathogens they worked with. The lab, the USA TODAY Network’s reporting found, was Northern Arizona University in Flagstaff. Lab director Paul Keim said the issues date back to 2010 when the university had difficulty keeping up with changing federal regulations. Since then the university’s labs have passed several inspections, he said.
An unnamed Florida laboratory agreed to pay $50,000 to resolve violations that included failing to ensure accurate inventories of select agents and failing to notify the CDC and appropriate law enforcement agencies after discovering a missing select agent.
The inspector general’s office, citing regulations stemming from the 2002 bioterrorism law, redacted the names of these labs, as well as all other labs receiving fines, in documents it provided to USA TODAY under the Freedom of Information Act. Other labs that have been fined over the years for select agent violations are located in Alabama, California, Missouri, South Dakota, Texas, Virginia and Wisconsin, records show.
As a way of providing some oversight, Congress requires a report each year on the number of thefts, losses and releases of bioterror pathogens at labs regulated by the Federal Select Agent Program.
Yet regulators provide scant details of their activities and the problems identified at labs. Usually just three pages long plus a cover page, the reports contain only aggregated counts of lab incidents by type, plus vague information on a few serious incidents.
The select agent program told Congress it had “imposed a $425,000 civil money penalty” on an unnamed lab where a serious biosafety lapse in 2008 had resulted in a cow in a nearby disease-free herd becoming infected with Brucella bacteria, which cause brucellosis.
Brucellosis is a contagious and economically significant agricultural disease that causes cattle and other livestock to abort their fetuses, produce less milk, and suffer weight loss, infertility and lameness. It has been the subject of eradication efforts for decades.
The $425,000 fine would have been one of the largest in the overall select agent program’s history — if it had actually been imposed.
But it wasn’t imposed, USA TODAY’s investigation found, and the USDA never corrected the record with Congress.
USA TODAY was able to identify the Brucella research program at Louisiana State University’s AgCenter in Baton Rouge as the likely recipient of the $425,000 fine by examining USDA animal health reports that tallied which states reported brucellosis cases in 2008. Louisiana, which had a case that year, had been declared brucellosis-free in 2000.
LSU officials spent months denying USA TODAY access to its records about the incident, citing among other things select agent regulations unrelated to the requested information. In statements and interviews, LSU downplayed its violations and provided information that was later contradicted by federal records.
“The incident was not found to be caused by a violation of federal regulations; no fines were imposed upon LSU, and the regulatory agencies had uncertainty as to whether the strain of bacteria in the affected cow was the same strain that was being used in the LSU research,” LSU officials said in a November 2014 email to USA TODAY.
Yet, in December 2014, when USA TODAY received copies of the incident investigation reports from the USDA and Louisiana’s state agriculture department, the documents showed no uncertainty.
USDA records show that investigators documented serious violations. In levying the $425,000 fine, regulators cited LSU for failing to have adequate biosafety measures, resulting in the release of the bacteria that caused the cow’s infection. The USDA also cited LSU for violating regulations by sending Brucella-infected cattle that had been part of select agent vaccine experiments to an unregistered slaughter facility where their meat was sold for human consumption.
LSU’s Phil Elzer, who at the time ran the Brucella studies and now is a university administrator, said in an interview the practice of sending research cattle to slaughter was declared in the lab’s operating procedures that were reviewed and signed off on at each inspection by Federal Select Agent Program regulators. “To all of a sudden say we were doing it wrong was very surprising,” Elzer said. LSU appealed, and the USDA eventually dropped the fine, he said.
In January 2010, records show, the USDA sent a letter to LSU saying the case was being closed but reiterating the issues with the infected cow and the use of the unauthorized slaughter plant.
USDA officials acknowledge that they never imposed the $425,000 fine and made a mistake touting it in their report to Congress.
“It should have stated that we were proposing a fine, instead of stating we issued a fine,” Freeda Isaac, USDA’s director of Agriculture Select Agent Services, said in an emailed statement. Isaac added last fall that the USDA suspended a portion of LSU’s select agent registration because of the Brucella incident and that “that portion of the registration is still suspended.”
For those labs not in the select agent program — and even those that are — self-policing is the front line of biosafety. Biosafety committees at research institutions, often staffed by scientists’ colleagues, assess the risks of proposed research and grant or deny approval for studies. Labs also have other safety staff who may do internal inspections and lab audits, plus additional committees overseeing the use of animals in research.
Yet some researchers appear ignorant of their institutions’ biosafety rules. Others brazenly ignore repeated requests by biosafety staff to stop experiments and address issues.
Documents obtained by the USA TODAY Network include at least 50 incidents since 2012 in which researchers were conducting experiments with genetically manipulated organisms without proper approval from internal safety committees. In some cases, records show researchers flouting their institutions’ rules.
• At the University of Tennessee Health Science Center in Memphis, biosafety staff concluded in a 2013 report that the root causes of a researcher failing to get her experiments approved included “general indifference of the investigator to institutional rules governing the need for biosafety compliance” as well as a “lack of oversight of research activities.” The scientist, the investigation revealed, knowingly launched unapproved experiments — exposing mice to a genetically manipulated strain of Burkholderia thailandensis — in a quest to get a vaccine study manuscript published that reviewers said needed additional data. The research was halted after veterinarians found several cages containing dead and dying mice, yet none of the cages was labeled with the infectious agent and they were in an area not approved for experiments with a BSL-2 pathogen. The incident was “an extremely unusual event,” said Sheila Champlin, an assistant vice chancellor at the center, noting corrective actions were taken before the scientist was allowed to resume research.
• At the University of Iowa, a biosafety officer in February 2014 discovered that a scientist had been conducting experiments with a genetically manipulated strain of the MERS virus since September 2013 without biosafety committee approval. The biosafety officer ordered the investigator to stop all experiments, and the scientist was put on probation and received increased safety monitoring. The work was being done in a BSL-3 lab at the time it was discovered, but had started in a BSL-2 lab, the safety officer’s investigation found. The university concluded that the scientist did not “effectively communicate” to his staff the importance of getting safety committee approval before starting the experiments with the virus, which can cause a deadly, contagious respiratory disease in people.
• At the University of California-Irvine, a researcher ignored repeated notices from biosafety staff during 2012 and 2013 that a research project’s approval had expired, that it needed further revisions and that all work must cease — yet the scientist continued the lentivirus experiments anyway in the BSL-2 lab. As a result of the incident, the university now sends researchers four notices starting 90 days before approvals expire, said James Hicks, the university’s associate vice chancellor of research. As the deadline nears, Hicks is copied on the notices so he can intervene if necessary. “We take a very strong view and a very correct view of the importance of following the regulations and the guidelines,” he said in an interview.
• At the University of Nebraska, a biosafety officer in 2013 found that a researcher had continued growing plants as part of an experiment using a transgenic tobacco rattle virus vector — despite being told repeatedly over two months that additional approval was needed from the biosafety committee before research could begin. As a result of the incident, the university said it revised its biosafety guidelines to describe consequences of unapproved research and sent a letter to faculty. “This was an isolated instance that was fully and successfully resolved,” the university said.
• At the University of Hawaii-Manoa, biosafety staff discovered a scientist was doing a type of cancer research in 2012 despite being denied biosafety committee approval and being repeatedly told not to do the experiments. Separately, at a March 2013 biosafety committee meeting at the university, members discussed the need for penalties when researchers fail to comply with biosafety rules, stating “there must be some consequence and corrective action other than an email” to the scientist, the minutes say.
Labs that receive funding from the National Institutes of Health and some other federal agencies are required to report incidents to the NIH involving certain types of genetically engineered organisms and recombinant DNA technology. From 2010 through 2014, the NIH received 644 reports of lab incidents during this kind of research.
Most of the reports the NIH receives are for what it says are non-serious incidents, such as small spills, splashes, cuts and equipment failures. Failures to obtain required biosafety committee approvals for this type of research are among the more common types of non-compliance.
Although it is not a regulatory agency, the NIH said in a statement that agency staff have made site visits to 100 institutions in recent years in an effort to help improve biosafety committee resources and adherence to the NIH Guidelines for operating their labs.
“Most instances of non-compliance result from a lack of full understanding of the requirements of the NIH Guidelines, rather than willful disregard, and our emphasis has been on corrective actions through education, which institutions seem uniformly responsive to,” the NIH said.
In September 2014, the NIH contacted the University of Louisville after a whistle-blower alleged the university had knowingly failed to report lab incidents as required, according to records obtained under the federal Freedom of Information Act. In response, the university told the NIH that it discovered three incidents that were not reported to the NIH but should have been, the records show.
The records indicate that University of Louisville biosafety officials were aware of some of the unreported incidents as much as six months before the NIH opened its inquiry. William Pierce Jr., the university’s executive vice president for research and innovation, in a statement to USA TODAY, said “there was apparent confusion regarding the authority and responsibility for reporting violations to the NIH.” Pierce said the university has hired an outside firm to oversee its biosafety committee and created training courses for scientists. “We feel confident the current system is working,” he said.
The NIH closed its inquiry after the university answered the agency’s questions, filed reports on the previously unreported incidents and agreed to take actions to ensure better reporting in the future.
“In investigating the incident, we did not find any evidence of willful non-compliance,” the NIH said in response to USA TODAY’s questions.
For some residents living near labs, the lack of transparency is frustrating — and worrisome. It’s not enough to tell the public the labs have robust safety procedures. “What people are really interested in is how well it’s working,” said Beth Willis, the citizen lab safety representative near Fort Detrick. “The more people in the community feel that there’s secrecy, the more they’re distrustful, whether their distrust is warranted or not.”