Whenever President Donald Trump is questioned about why the United States has nearly three times more coronavirus cases than the entire European Union, or why hundreds of Americans are still dying every day, he whips out one standard comment. We find so many cases, he contends, because we test so many people. The remark typifies Trump’s deep distrust of data: his wariness of what it will reveal, and his eagerness to distort it. In March, when he refused to allow coronavirus-stricken passengers off the Grand Princess cruise liner and onto American soil for medical treatment, he explained: “I like the numbers being where they are. I don’t need to have the numbers double because of one ship.” Unable—or unwilling—to fix the problem, Trump fixes the numbers instead.
The administration has failed on so many fronts in its handling of the coronavirus that the overall impression is one of sheer mayhem. But a common thread runs through these government malfunctions. Precise, transparent data is crucial in the fight against a pandemic—yet through a combination of ineptness and active manipulation, the government has depleted and corrupted the key statistics that public health officials rely on to protect us.
In mid-July, just when the U.S. was breaking and rebreaking its own records for daily counts of new coronavirus cases, the Centers for Disease Control and Prevention found itself abruptly relieved of its customary duty of collating national numbers on COVID-19 patients. Instead, the Department of Health and Human Services instructed hospitals to funnel their information to the government via TeleTracking, a small Pittsburgh firm started by a real estate entrepreneur who has frequently donated to the Republican Party. For a while, past data disappeared from the CDC’s website entirely, and although it reappeared after an outcry, it was never updated thereafter. The TeleTracking system was riddled with errors, and the newest statistics sometimes appeared only after delays. This has severely limited the ability of public health officials to determine where new clusters of COVID-19 are blooming, to notice demographic patterns in the spread of the disease, or to allocate ICU beds to those who need them most.
To make matters more confusing still, Jared Kushner moved to start a separate coronavirus surveillance system run out of the White House and built by health technology giants—burdening already-overwhelmed officials and health care experts with a needless stream of queries. Kushner’s assessments often contradicted those of agencies working on the ground. When Andrew Cuomo, New York’s governor, asked for 30,000 ventilators, Kushner claimed the state didn’t need them: “I’m doing my own projections, and I’ve gotten a lot smarter about this.”
One of the administration’s most consequential failures was that it didn’t establish uniform reporting standards for states and counties. All the numbers from local agencies were just tipped into a mass of detail that was “inconsistent, incomplete, and inaccessible,” according to a report published by the American Public Health Association, the Johns Hopkins Center for Health Security, and Resolve to Save Lives, a nonprofit led by a former CDC director. Trump, for his part, urged authorities to slow down coronavirus testing altogether. “Instead of 25 million tests, let’s say we did 10 million tests,” he told CBN News. “We’d look like we were doing much better because we’d have far fewer cases. You understand that.”
While Anthony Fauci, the government’s leading expert on COVID-19, was undergoing surgery and conveniently anesthetized in late August, the CDC changed its guidelines to stop recommending testing for asymptomatic people, even those who had been in contact with carriers of the virus. Two federal health officials told The New York Times that the instruction came from higher-ups at the White House, even though experts believed the U.S. needed more testing at that point in the pandemic, not less. The situation became so overtly politicized that when Dr. Rick Bright, the former director of the Biomedical Advanced Research and Development Authority, came up with a plan for a national testing infrastructure, he was sidelined by his superiors. He resigned from his position at the National Institutes of Health on October 6. “He can no longer countenance working for an administration that puts politics over science to the great detriment of the American people,” Bright’s attorneys said in a statement.
Even details about Trump’s own bout of COVID-19—whether the president had been on oxygen, or a clear timeline of how long he had been infected—were suppressed or spun, by the admission of his own doctor. In real time, as the coronavirus blazes through the country, Americans are witnessing the chaos and dangers that ensue when the integrity of data is leached away.
The frightening thing is that Trump’s war on data isn’t limited to the pandemic. It has been waged throughout the federal government, warping policy and enfeebling institutions from the inside. Over nearly four years, his administration has defunded, buried and constrained dozens of federal research and data collection projects across multiple agencies and spheres of policy: environment, agriculture, labor, health, immigration, energy, the census. “It scares me,” said Katherine Smith Evans, a former administrator of the Economic Research Service, an agency under the U.S. Department of Agriculture. “There are enough chances to make bad policy without lacking the data to make good policy.” We are witnessing a widespread act of erasure.
The impulse to ascribe this to a Republican devotion to small government is a mistaken one. “I don’t see an all-hands-on-board effort to get rid of everything,” Katherine Wallman, who was the chief statistician of the U.S. from 1992 to 2017, told me. “What I do see is that they’re taking on the inconvenient data. Or trying to get data that could help a particular point.” The ERS, which Evans ran until 2011, is a prime case. Sonny Perdue, the secretary of agriculture, complained last year that the agency’s research—which, among many other things, tells America how crop prices are moving, what school lunches ought to contain and who needs food stamps—was “based on political science rather than strong science.” The ERS was finding repeatedly that trade deals benefit U.S. farmers and that federal spending on food stamps had dropped steadily since 2013, flatly contradicting the administration’s claims on both counts.
In June 2019, Perdue told the ERS that its offices would be relocating from Washington, D.C., to Kansas City, Missouri. It was a tactic of brute force, executed in the knowledge that many employees wouldn’t move their lives halfway across the country. Budgets were scheduled to be cut in 2020, in any case. By October 2019, two-thirds of ERS positions were vacant. “I think it’s a decent hypothesis,” Evans said, “that some research results were uncomfortable or inconvenient, and that may have led to a desire to see the agency cut.”
The meticulous assembly of numbers is one of the government’s most overlooked functions, but it’s also one of the most vital. Federal statistics inform the administration about what problems have arisen, who is in distress, and where resources need to go. Citizens aggregate themselves in public data—forcing the state to heed them when individually they might be muted or ignored, and holding officials accountable if their needs aren’t met. By gutting the collection of federal statistics, the Trump administration is burning away the government’s capacity to regulate. By attacking numeracy, it is attacking democracy.
A third of the way into the 20th century, the U.S. went through a revolution of statistics. Until then, the methods of quantifying a country had changed slowly and incrementally. In 1921, President Warren Harding found that he had no unemployment numbers, so he called a conference to canvass opinions on how many working-age Americans didn’t have jobs. Then the figure was put to a vote. The most popular guess—between 3.5 million and 5 million—was published in the conference’s report. That was how hard up for data the government was.
Accurate unemployment statistics could have significantly eased the pain of the Great Depression. In 1930, President Herbert Hoover’s administration relied on a federal agency that counted new hires but not layoffs. It should have turned instead to the Bureau of Labor Statistics, which was summing up lost jobs and thus capturing the actual, terrifying scale of the economic collapse. The same year, when the government tried to distort the numbers from a census of unemployment, the economist in charge resigned in protest. Lacking the kind of data that revealed the severity of the Depression, the state came up with pitifully inadequate solutions. In 1930, Hoover’s government sponsored two bills, totaling just an extra $100 million in public works spending. Then Congress adjourned over the summer without enacting any additional laws, “in the face of desperate need,” as the American Federationist reported. Throughout 1930, the first full year of the Depression, the administration soothed itself with dodgy data. By 1931, 8 million Americans were out of work, up from 1.5 million in 1929, and the economy had become a calamity.
The revolution began in 1933, after Franklin D. Roosevelt took office. New statistical agencies were opened. Their work was better funded, and more sophisticated methods took root. Roosevelt’s government sampled urban housing, workers on welfare, and consumer purchasing; a health study surveyed 700,000 families in 83 cities and 23 rural counties; the cost of living index was revised and revised again; when Social Security was introduced, its data was stored on punch cards for easy analysis. The New Deal’s pump was primed by statistics.
Among those who work with public data, the canonical tale—the example that statisticians, economists, civil servants and academics recalled to me most often to illustrate how government data produces policy that indisputably helps hundreds of millions of people—is that of lead in blood. For nearly half a century now, white semitrailers with “National Health and Nutrition Examination Survey” stenciled on their sides have been roaming through the U.S. to find out how healthy Americans are. A quartet of semis pulls into a county, and their trailers are connected to form a large mobile clinic. Over the next few weeks, clinicians ask about the diet, sleep and medication of participants who volunteer. They run blood panels, check teeth and perform ultrasounds. They administer questionnaire after questionnaire; the one on hearing ability alone has 41 questions. The surveys now occur on a two-year cycle; the 2015-16 edition examined nearly 10,000 people. Not many broader surveys are this deep; not many deeper surveys are this broad. The data collected by NHANES is one of the federal government’s richest resources in shaping health policy.
In 1976, for the first time, NHANES began to test for levels of lead. After blood was drawn from a vein, it was frozen, packed into dry ice, and shipped to a lab in Atlanta. The results stunned officials. The blood of Americans held far more lead than anyone had expected. “We knew there was lead in paint, and that children were ingesting paint,” said Charles Rothwell, the director of the National Center for Health Statistics, which conducts NHANES. “But there were fairly high levels even in areas that probably weren’t impacted by paint.” The reason, of course, was leaded gasoline, which had been known for decades to be toxic. But both leaded paint and leaded gas had been kept on the market by bullying lobbies and pliant governments. Until the 1970s, no federal funds had been devoted to the study of lead levels at all. Industries were producing their own data, arguing that a base measure of lead in the blood was normal, and that atmospheric lead didn’t poison people. NHANES proved them wrong. In adults, lead damages the kidneys, causes problems during pregnancies, harms the nerves and triggers anemia. In children, lead markedly stunts the brain, resulting in slowed growth, learning and speech defects, and impeded intelligence. No amount of lead in a child’s blood, we now know, can be considered safe.
A ban on lead-based paint went into effect in 1978, soon after a phased reduction of leaded gas. As NHANES continued its work, it was able to chart how lead concentrations in its blood samples were dropping rapidly. These statistics helped first to defeat a petition from the Lead Industries Association to relax the law, and then, in 1985, to persuade the Environmental Protection Agency to demand that gas companies cut their lead content by 90 percent. The potency of good data, Rothwell said, “put appropriate regulations in place, and kept them in place.”
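The arc from mobile clinic to federal regulation is, at bottom, a simple statistical argument: measure the same quantity in every survey cycle until the trend becomes undeniable. As a rough illustration, here is a minimal sketch in Python (using the pandas library) of that kind of cycle-by-cycle summary; the cycle labels and blood-lead values are invented placeholders for demonstration, not actual NHANES figures.

```python
# A minimal sketch of the trend analysis that repeated survey cycles make
# possible. All values below are illustrative placeholders, not real
# NHANES measurements.
import pandas as pd

# Hypothetical records: one row per participant, with the survey cycle
# and a blood lead level in micrograms per deciliter (ug/dL).
samples = pd.DataFrame({
    "cycle": ["1976-80"] * 4 + ["1988-91"] * 4 + ["2015-16"] * 4,
    "lead_ug_dl": [14.2, 16.8, 12.5, 15.1,
                   3.1, 2.4, 3.9, 2.2,
                   0.9, 0.7, 1.2, 0.8],
})

# Summarize each cycle. A mean that falls, cycle after cycle, is the kind
# of statistical signal that regulators could hold up against industry's
# claim that a base measure of lead in the blood was "normal."
trend = samples.groupby("cycle")["lead_ug_dl"].agg(["mean", "median", "count"])
print(trend)
```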
The spine of federal data has always been the decennial census, the latest edition of which is being conducted this year. The kind of cross-section the census provides to officials at every level is impossible to beat, said Joe Salvo, the director of the population division in New York City’s Department of City Planning: “We may complain about the census, its warts and so on. But when we walk through a neighborhood, we see the data come alive.” Every city department uses census statistics as a baseline, and builds more layers of information atop it. The health department, for example, tracks asthma cases to see if they’re higher in some pockets of the city—but it must first know how many people live in these pockets.
Salvo told me a story of how census data helps New York react to emergencies. In 2012, as Hurricane Sandy was preparing to make landfall, the city’s health department realized that not every emergency shelter could be equipped with a generator. “Health came to us, wanting to know the neighborhoods with significant numbers of vulnerable people, aged 75 or older,” Salvo said. The census told him that 108,000 people in that demographic lived in areas liable to be inundated. The shelters near their homes received generators first. “The city also has an evacuation fleet of 50 buses,” he said. “Where should we put them? If we had to guess, can you imagine what a mess that would be?” Laying census data over sea level data furnished the answer: Brighton Beach, southern Brooklyn, northern Queens, parts of Staten Island. “Turns out, you can really do well with 50 buses if you have the right data,” Salvo said.
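The overlay Salvo describes amounts to a join between two public datasets: population counts by area from the census, and risk flags derived from sea level data. Here is a toy sketch of that operation in Python; every neighborhood name and number in it is hypothetical, invented for illustration rather than drawn from city records.

```python
# A toy sketch of layering census data over flood-risk data to decide
# where generators and evacuation buses should go first. All names and
# figures are hypothetical.
import pandas as pd

# Hypothetical census-style table: residents aged 75+ per neighborhood.
population = pd.DataFrame({
    "neighborhood": ["Brighton Beach", "South Brooklyn", "North Queens", "Midtown"],
    "age_75_plus": [12000, 9500, 8700, 2100],
})

# Hypothetical flood-zone table derived from sea level data.
flood_zones = pd.DataFrame({
    "neighborhood": ["Brighton Beach", "South Brooklyn", "North Queens"],
    "flood_risk": [True, True, True],
})

# Overlay: keep only flood-prone areas, then rank them by the number of
# vulnerable residents. The top of this list gets resources first.
at_risk = population.merge(flood_zones, on="neighborhood", how="inner")
priority = at_risk.sort_values("age_75_plus", ascending=False)
print(priority[["neighborhood", "age_75_plus"]])
```

Without the census denominator, there is nothing to join against, and the allocation reverts to the guesswork Salvo dreads.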
The 2020 census will be Salvo’s fourth. Every census experiences some kind of friction. New York argued that the 2010 census undercounted the residents of Queens and Brooklyn; the Supreme Court had to weigh in on part of the methodology of the 2000 census; the 1990 census missed 8 million people and double-counted 4 million others. But none of that came close to the anxiety triggered by the Trump administration’s proposal to add a citizenship question to the census, Salvo said. “There’s been a level of fear and apprehension that has gone beyond immigrant communities,” he said, even after the government withdrew its proposal in July 2019. “We have so many people in mixed-status households, where one person is a citizen, one is a legal resident, one or two are perhaps undocumented.” People were afraid that their answers to the census’ questions would be summoned later and used against them.
And when people are afraid, they skip out on the census altogether. In a Pew survey last year, 26 percent of Black adults, 21 percent of Hispanics, and a third of young Americans indicated a reluctance to respond. Salvo couldn’t imagine a worse situation. “It all ultimately comes down to whether you’re in the data. If you don’t answer the census, it’s like you don’t exist,” he said. “If you see your compatriots aren’t answering, it’s like your community doesn’t exist. The ramifications are serious—for dollars budgeted for the city, but also for our ability to get help to you.” Even the threat of asking about citizenship, he said, chilled many people’s intent to participate. Under Trump, the census has already shriveled in other ways. Compared to 2010, the Census Bureau halved the number of local offices it operates around the country, and it hired fewer staff to follow up with people who fail to participate at first. Ultimately, the Supreme Court allowed the Trump administration to end the census early, and it is scheduled shortly to review Trump’s proposal to exclude undocumented immigrants when determining how congressional seats are apportioned. But a resistance to being included in the census is a deeper, more existential quandary. Most Americans, Rothwell told me, don’t think of themselves as being in a position to do something for their government. “I keep wanting to say to them, ‘The information you give us about yourselves in these surveys—this is how it’s being used, these are the benefits that come out of it. This is how you’re helping your neighbors, and your country.’”
The erosion of data across the federal government is particularly insidious because it’s relatively invisible to the public at large. Often, the only people who know the value of these sets of numbers are those who work with them daily. The life-and-death implications of data can be highly technical and hard to convey. But looking at the kinds of data being erased, a clear narrative of political intent emerges.
In 2017, Immigration and Customs Enforcement stopped publishing routine data about its enforcement raids, and it no longer updates the list of deaths occurring in its custody. Health and Human Services has wiped information on how Obamacare impacts public health, and has started to leave out questions about LGBTQ people in surveys that assess the needs of elderly and disabled Americans. The Justice Department has not released any numbers for deaths in correctional institutions since 2016. The first federally funded study on sex trafficking in Native American communities has been shut down. Reports on arms sales to other countries are vanishing. Every instance speaks to a desire to evade accountability, to a narrow ideological impulse, or to an appeasement of commercial interests—and sometimes all three at once.
The damage has perhaps been most prolific on environmental matters. The health risks of a variety of industrial pollutants—coal dust from mountaintop mining, formaldehyde, chemicals called PFAS—were being investigated until the government shut down the studies. Funding for clean energy research was withheld. The Obama administration had ordered the tracking of methane emissions by the oil and gas industry; the Trump administration reversed that order. Companies are taking their cue from the government, said Christopher Sellers, an environmental historian at Stony Brook University. Sellers sits on the coordinating committee of the Environmental Data and Governance Initiative, a network of academics and activists who began archiving tranches of environmental data on publicly accessible servers as soon as Trump took office. So far, EDGI has salted away more than 200 terabytes of data, but Sellers has noticed that the government’s attitude has emboldened companies to stint on fresh statistics as well. “I’ve been looking at other greenhouse gas reporting by oil and gas facilities, and the number of facilities reporting these gas emissions is trending downwards.” The void left by all this vanished data doesn’t just make it difficult to frame policy to fight pollution and climate change; it cements the argument that there’s nothing to fight at all.
What will happen when public facts and figures disappear? Sellers pointed me to the EPA’s decision to pull the funding of a long-term study on a chemical called chlorpyrifos. The study is part of the work of 13 research centers—all defunded now—that examine how adverse health effects in adults might result from childhood exposure to chemicals. For two decades, the Columbia Center for Children’s Environmental Health has tracked hundreds of children in New York, and its studies have shown that chlorpyrifos, a chemical found in pesticides and patented by Dow, can distort brain development in the womb. Children go into seizures, or lapse into lethargy or coma, but their symptoms are particularly difficult to diagnose. Chlorpyrifos is now permitted for use only on agricultural land, and the EPA nearly prohibited even that in 2015. Now the chemical has a new life in the market, and a deficit of data will hinder anyone who wishes to press for a ban. The cognitive skills of children living near farmland and exposed to chlorpyrifos may dip and wane, as an older EPA-funded paper has already suggested. But by abandoning its support of the Columbia study, the government has made it harder to protect such children. “What’s happening with chlorpyrifos,” Sellers said, “is the exact antithesis to what happened with lead and gasoline.”
Every set of public data is valuable twice over—by virtue of the information it holds, and by virtue of being accessible to anyone. Even if the Trump administration is rubbing out the latter, it cannot eliminate the former. Companies are avid consumers of public data; Starbucks crunches government demographics to work out where to locate stores, Deloitte analyzes business trends using the Bureau of Labor Statistics’ inflation index, and crop insurance firms look to ERS research to price premiums. In the absence of these sources, the gathering of such information will merely move into the private sector, to be sold to companies as statistical intelligence. You might still call this public data, but only in the sense that it is data about the public; otherwise, the import of this data will be closely held, the benefits accruing only to those with the money to buy them. This isn’t speculation; it has already happened. In his book “The Fifth Risk,” Michael Lewis describes how the National Weather Service missed spotting a tornado’s path through Moore, Oklahoma, in the spring of 2015, and thus never issued an alert. But AccuWeather, a private company that relies on government data to refine its own forecasting, knew that Moore was in danger. “The big point is that AccuWeather never broadcast its tornado warning,” Lewis writes. “The only people who received it were the people who had paid for it—and God help those who hadn’t.”
The privatization of government data will shape the data itself: what gets counted, how it’s steered, what it conveys. “If we privatize the collection of greenhouse gas numbers, what’s the industry that’s most interested in taking that on?” Sellers said. “Oil and gas companies.” The data becomes vulnerable to manipulation—and that kills the basis of public debate.
In the midst of a fractious election season, confidence in data is vital. Across the political divide, we already find it impossible to agree on what the news tells us and which experts to believe. If data is hollowed out as well, each side will bring its own numbers to every issue, cooked to its own convenience. Consensus on policy is hard enough to reach; without reliable data, we lose consensus on what we should even begin to discuss.
But the consequences of unreliable information resound far beyond the election. The quality of data is hard to separate from the quality of governance. The state’s machinery works only if the data it is using to make its decisions is sound and fair. After all, a nation is an act of invention—an abstract, uncanny idea made real every day by a million concrete things that citizens decide they want for themselves. Food that is edible. Streets that are safe to walk. Air that is clean. Workplaces that treat people well. It is in the measures of these qualities—how edible? how safe? how clean? how well?—that a nation shapes itself. Four more years of data decay will fatally weaken the government and its capacity to help its people. The act of invention falters. The lead stays in the gasoline.