Modern Science Didn’t Appear Until the 17th Century. What Took So Long?

The Scientific Revolution of the 17th century yielded the figure of the modern scientist, single-mindedly dedicated to collecting empirical evidence and testing hypotheses against it. Strevens, who studied mathematics and computer science before turning to philosophy, says that transforming ordinary thinking humans into modern scientists entails “a morally and intellectually violent process.” So much scientific research takes place under conditions of “intellectual confinement” — painstaking, often tedious work that requires attention to minute details, accounting for fractions of an inch and slivers of a degree. Strevens gives the example of a biologist couple who have spent every summer since 1973 on the Galápagos, measuring finches; it took them four decades to gather enough data to conclude that they had observed a new species of finch.


This kind of obsessiveness has made modern science enormously productive, but Strevens says there is something fundamentally irrational and even “inhuman” about it. He points out that focusing so narrowly, for so long, on tedious work that may not come to anything is inherently unappealing for most people. Rich and learned cultures across the world pursued all kinds of erudition and scholarly traditions, but didn’t develop this “knowledge machine” until relatively recently, Strevens says, for precisely that reason. The same goes for brilliant, intellectually curious individuals like Aristotle, who generated his own theory about physics but never proposed anything like the scientific method.

According to “The Knowledge Machine,” it took a cataclysm to disrupt the longstanding way of looking at the world in terms of an integrated whole. The Thirty Years’ War in Europe — which started over religion and ended, after killing millions, with a system of nation-states — made compartmentalization look good. Religious identity would be private; political identity would be public. Not that this partition was complete in the 17th century, but Strevens says it opened up the previously unfathomable possibility of sequestering science. The timing also happened to coincide with the life of Isaac Newton, who became known for his groundbreaking work in mathematics and physics. Even though Newton was an ardent alchemist with a side interest in biblical prophecy, he supported his scientific findings with empirical inquiry; he was, Strevens argues, “a natural intellectual compartmentalizer” who arrived at a fortuitous time.

So modern science began, accruing its enormous power through what Strevens calls “the iron rule of explanation,” requiring scientists to settle arguments by empirical testing, imposing on them a common language “regardless of their intellectual predilections, cultural biases or narrow ambitions.” Individual scientists can believe whatever they want to believe, and their individual modes of reasoning can be creative and even wild, but in order to communicate with one another, in scientific journals, they have to abide by this rule. The motto of England’s Royal Society, founded in 1660, is “Nullius in verba”: “Take nobody’s word for it.”

Strevens’s book contains a number of surprises, including an elegant section on quantum mechanics that coolly demonstrates why it’s such an effective theory, deployed in computer chips and medical imaging, even if physicists who have made ample use of it (like Feynman) have said that nobody, themselves included, truly understands it. Strevens also has some pretty uncharitable things to say about the majority of working scientists, painting them as mostly uncreative drones, purged of all nonscientific curiosity by a “program of moralizing and miseducation.” The great scientists were exceptions because they escaped the “deadening effects” of this inculcation; the rest are just “the standard product of this system”: “an empiricist all the way down.”
