Edward Archer writes for the Martin Center about an “intellectual and moral decline” in academic research.

For most of the past century, the United States was the pre-eminent nation in science and technology. The evidence for that is beyond dispute: Since 1901, American researchers have won more Nobel Prizes in medicine, chemistry, and physics than researchers from any other nation. Given our history of discovery, innovation, and success, it is not surprising that across the political landscape Americans consider the funding of scientific research to be both a source of pride and a worthy investment.

Nevertheless, in his 1961 farewell address, President Dwight D. Eisenhower warned that the pursuit of government grants would have a corrupting influence on the scientific community. He feared that while American universities were “historically the fountainhead of free ideas and scientific discovery,” the pursuit of taxpayer monies would become “a substitute for intellectual curiosity” and lead to “domination of the nation’s scholars by Federal employment…and the power of money.”

Eisenhower’s fears were well-founded and prescient.

My experiences at four research universities and as a National Institutes of Health (NIH) research fellow taught me that the relentless pursuit of taxpayer funding has eliminated curiosity, basic competence, and scientific integrity in many fields.

More importantly, training in “science” is now tantamount to grant-writing and learning how to obtain funding. Organized skepticism, critical thinking, and methodological rigor, if present at all, are afterthoughts. Thus, our nation’s institutions no longer perform their role as Eisenhower’s fountainhead of free ideas and discovery. Instead, American universities often produce corrupt, incompetent, or scientifically meaningless research that endangers the public, confounds public policy, and diminishes our nation’s preparedness to meet future challenges.