Heading toward a national computer health system?

Guest post by Joseph November

The debate over health care has shut down the federal government, but one thing both parties seem to agree upon is that investing in information technology offers a way to better, cheaper medicine for Americans. In a 2009 op-ed piece titled “How to Take American Health Care From Worst to First,” Republican Newt Gingrich and Democrat John Kerry teamed up with baseball manager Billy Beane to call for “a data-driven information revolution” for “our overpriced, underperforming health care system.” The model for this revolution was provided by Beane, celebrated in the 2011 movie Moneyball for his information-intense, heavily statistical approach to selecting players, known as sabermetrics, which enabled his Oakland A’s to regularly beat teams with much higher payrolls. Pointing to Beane’s success in the ballpark, the politicians jointly called for a numerical, evidence-based approach to improving American medicine, declaring that “a health care system that is driven by robust comparative clinical evidence will save lives and money.”

Bipartisan support for using computers in medical research and care goes all the way back to the late 1950s, and such support has already left an indelible mark on medicine. In Biomedical Computing: Digitizing Life in the United States, which recently won the Computer History Museum Prize, the major book prize in the field of history of computing, I explain what happened when computers and biomedicine first encountered each other. As my research shows, the U.S. Senate’s International Health Study of 1959 (led by Sen. Hubert Humphrey [D-MN] and backed by Sen. Alexander Wiley [R-WI], then the most senior Republican in the Senate) vocally called on the government to channel funds to projects that attempted to incorporate computers into biology and medicine.

In 1960, two recipients of Capitol Hill support—dentist-turned-computer expert Robert Ledley and radiologist-turned-computer expert Lee Lusted—argued that the United States should build a “computer health system,” a nationwide network of computers that would keep track of each American’s medical data. Noting that 20% of the American public changed address each year, and that many more went on extended trips, Ledley and Lusted hoped their network would allow care providers to access patients’ complete medical records no matter where they went in the nation. Looking at the big picture, Ledley and Lusted wrote: “The great significance of such a health network cannot be overestimated, both as an aid to increasing individual good health and longevity (leading to per capita productivity corresponding to a greater GNP), and as a vast new source of medical information concerning mankind.”

They also noted that “much research and planning will be necessary before this health computer network can be a reality.” In 1960, even ARPANET, the Internet’s predecessor, was years away. But Ledley and Lusted also sensed that physicians’ and patients’ resistance to computerizing medical data would have to diminish in the way that public opposition to nationwide sharing of financial data had lessened between the 1930s and 1950s. They joked: “Someone recently observed that we would indeed be living in J.K. Galbraith’s ‘Affluent Society’ if we had a complete central charge service—and this before a central health records service.”

Today, many of the technical obstacles to building a national “computer health system” have been overcome. Meanwhile, politicians in both parties seem enamored of big data. What remains an open question, though, is whether the public is willing to provide that data.

Joseph November is an associate professor of history at the University of South Carolina. At this past weekend’s meeting of the Society for the History of Technology, his widely praised Biomedical Computing won the Computer History Museum Prize offered by SIGCIS, the Special Interest Group on Computers, Information, and Society.