Although modern science has become synonymous with experimentation, the historical record tells a different story. Prior to the 1600s, the dominant epistemology relied on ordinary observation and common sense. Aristotelians accepted ancient authority without proof, observed nature only in its undisturbed state, and leapt from a handful of specific observations to general axioms, from which they reasoned deductively. As science strayed further from everyday experience, however, a rift opened between knowledge and the process of knowledge creation, one that continues to strip meaning and context from the science classroom. The following historical examples illustrate such rifts and their scientific consequences:
Copernicus
The Ptolemaic system was superseded by the Copernican model as the dominant planetary system in the mid-1600s. Whereas Ptolemy placed the Earth at the center of the universe, Copernicus’s heliocentric model postulated that the Sun occupies that position. Copernicus proposed that the Earth exhibits three distinct motions: an annual revolution around the Sun, a daily rotation on its axis, and a gradual wobble of the axis itself.
Although Copernicus was not the first to suggest that the Earth moves, he was the first to construct a mathematical model describing its motion. A major hindrance to the initial acceptance of the Copernican system, however, was its inconsistency with common sense: everyday experience tells us that the Earth does not move. It was accordingly difficult for people to place unquestioning faith in a scientific theory founded on complex, unfamiliar principles over the evidence of their own senses.
Galileo
Galileo Galilei, a vehement supporter of the Copernican system, first attempted to buttress Copernicus’s ideas by publishing a series of thought experiments concerning the relativity of motion. In Dialogue Concerning the Two Chief World Systems (1632), Galileo famously argued that if a ship were sailing smoothly at constant velocity, an observer below deck would have no way of telling whether the ship was in motion. By the same reasoning, humans on Earth cannot feel the Earth’s uniform motion.
Optical science played a key role in the widespread adoption of the Copernican framework, as well as in the growing recognition that science is fundamentally incongruent with ordinary experience. Galileo’s refracting telescope allowed him to make a series of revolutionary observations invisible to the naked eye, including the moons of Jupiter, which only made sense within the context of the Copernican system. His discoveries ultimately led to a schism between the scientific community and the Catholic Church, and sparked a movement within science built on the idea that the senses can be deceptive.
Hooke
An interesting consequence of the reliance on ordinary experience was the belief that organisms at the threshold of human perception were fundamentally simple. Human sensation was taken to be the only true point of view: if vision did not reveal intricate details in small organisms, it was because they had no complex structure, not because of the inherent limitations of the human eye. As a consequence, small things were considered largely insignificant.
This idea was turned on its head by Robert Hooke. During the Great Plague of London in the 1660s, Hooke used a compound microscope to view and recreate detailed images of various organisms at the threshold of human perception, most famously the gray drone fly and the rat flea. Hooke’s diagrams were sensational: he depicted the eye of the drone fly, composed of 1900 segments, and the armored body of the rat flea in unprecedented detail. It became clear not only that small organisms were in fact extremely complex, but also that they could be, in some respects, more sophisticated than humans. Unlike a human, for example, the gray drone fly has no visual vanishing points; such organisms see the world in an entirely different way.
Robert Hooke’s discoveries challenged the anthropocentric conception that human sensation was the one “true” viewpoint; much as the Earth was not at the center of the planetary system, humans were no longer at the center of the universe.
Bacon
The discovery that our senses can be misleading ultimately led to a reformation of the scientific method and its capacity to produce knowledge. Francis Bacon, Attorney General and later Lord Chancellor of England in the early 1600s, is credited as the father of modern scientific inquiry. In Novum Organum (1620), Bacon posited that the capacity of science to discover fundamental truths about nature and advance society would remain stymied so long as it continued to be conducted according to the Aristotelian method.
Bacon rejected knowledge accrued through naked-eye observation alone, along with the assumption that science must be consistent with common sense. He believed that instruments could be used to rectify the limitations of the human senses, and thus proposed uniting engineering with philosophy. Bacon argued that knowledge should be rebuilt from a blank slate: only rigorous experimentation could generate well-founded data from which conclusions could be drawn through inductive reasoning.
“The senses judge only of the experiment, the experiment judges of the thing.”
The “Baconian method,” as it came to be known, was widely adopted by fellow intellectuals, leading to the creation of the Royal Society, the development of the modern scientific method, and ultimately, the scientific laboratory as we know it.
This brings us to the modern university science class. Students are expected to memorize information and apply equations. Participation and experiential learning are limited, and students are responsible for seeking out competitive (and frequently unpaid) research opportunities on their own. Moreover, undergraduate students in research labs are often assigned menial tasks rather than experiments or other independent projects, frequently because they lack technical skills. Mandatory laboratory courses offer little opportunity for independent exploration or mistakes, and many upper-year science students forgo laboratory courses altogether because they are time-intensive.
Isolating knowledge from the process of knowledge creation strips all meaning from science, for who truly has a passion for memorization? Much as people initially resisted the Copernican system because it defied their own experience, students find it far more difficult to accept facts uncritically than to understand their origin and evolution.
The product of a Bachelor of Science program is a student who has retained a limited collection of disjointed facts. Compared with students in other disciplines, STEM students are overwhelmingly encouraged to take as many specialized and esoteric courses as they can manage, rather than using some of their credits toward a more well-rounded education. Employment within the biomedical sector ultimately requires the application of knowledge, yet students are largely taught retention.
Universities disproportionately value memorization over qualities such as curiosity, persistence, collaboration, and problem solving. However, a high grade in a memorization-based science course is not an accurate predictor of a good scientist. Accordingly, many senior science undergraduates feel a lack of direction, unsure whether to pursue graduate studies and unaware of opportunities within the scientific sector beyond professional school.
With this in mind, it’s important for science students to take action and enrich their own scientific education, remembering that science originated in the natural human tendency toward curiosity and exploration. Seek out mentorship opportunities and a variety of research and shadowing experiences. Take laboratory courses, as well as courses outside of STEM. Most importantly, critically analyze the information you are being taught, and make connections between what you learn in the classroom and what you experience outside of it.