Oh, science, how low hast thou fallen! Not really. Actually, we’re living in a golden age of science that will hopefully keep progressing indefinitely into the future. But that doesn’t mean that, on the way to such heights, science hasn’t made plenty of mistakes in the past—or in the present. In fact, the self-correcting, self-regulating scientific method relies on making mistakes, not because it is fundamentally a bad method, but because we, as humans, are fundamentally flawed beings. We learn by trial and error, and the truth always hides deep beneath a host of naive assumptions and biased opinions that constantly threaten to corrupt facts and evidence.
Science makes mistakes, there’s no doubt about it. If it claimed to possess perfect knowledge of the world, it would be no better than religion or other dogmatic doctrines. Unlike religion, science doesn’t deal in absolutes, but in probabilities—which is how we handle our everyday knowledge anyway.
However, that doesn’t mean we shouldn’t heed scientists when it comes to issues like evolution, the effectiveness of vaccines, climate change, or, indeed, that the Earth is spheroid-shaped. That’s because those are well-proven facts that present the best possible explanations for all the variables at hand, as no alternative theory can reliably explain the phenomena in question. If science doesn’t have the answer, you can bet religion or alternative medicine doesn’t either.
So, yeah, science (or what used to pass for the equivalent of science at some time or another) makes mistakes. Some past theories have been highly dubious, if not outright preposterous. The important thing is to acknowledge that and to learn from it—that’s the whole point, after all. With that in mind, here are 10 times where science got it all horribly wrong.
Alchemy

“Take this miraculous substance, the philosopher’s stone, and turn these ordinary metals into gold!”—call it the alchemist’s rallying cry.
Long ago, people already suspected we could manipulate the elements to achieve extraordinary results, but they had no idea how it all worked. Alchemy is the protoscientific precursor of chemistry and one of humanity’s earliest attempts to explain the very nature of the world around us. It originated in Greco-Roman Egypt during the first centuries CE, went through many changes and iterations over the millennia, and was finally superseded by the so-called first chemical revolution of the late 18th century.
Given this enormous time frame, it’s no surprise that alchemical dogma figured among the beliefs of many famous scientists, including Isaac Newton.
Newton is remembered for his work in physics, but he also devoted an enormous amount of his time to the quest for the legendary philosopher’s stone, a properly alchemical task. Alchemists like Newton believed there was a substance (the philosopher’s stone) capable of turning base metals, such as mercury, into gold. The same substance was also supposed to grant immortality.
As crazy as alchemy may sound, its practices and studies paved the way for the proper science of chemistry shortly after Newton’s time.
The Phlogiston Theory

“Put that fire out! How? By sucking its phlogiston away, of course!”—probably some 18th-century chemist.
Back in the early 1700s, a particular account of processes like burning and rusting became very popular in the scientific community. The phlogiston theory maintained that combustible substances contain a hypothetical element, phlogiston, which is released when the substance burns. Any substance, like water, that is not “phlogisticated” therefore cannot burn at all. Dephlogistication, according to this theory, is the process by which a thing releases its stored phlogiston, which is then absorbed by the surrounding air. This explained why ashes didn’t burn (all the phlogiston had left them) while wood did.
The rejection of the theory of phlogiston and the advent of Lavoisier’s oxygen theory of combustion, together with the law of conservation of mass, mark the high point of what is known as the first chemical revolution, launching humanity into the contemporary era of scientific thought.
Vitalism

“All living things have an energy field, a vital force, that gives them the very life they live, like the breath of God,” the disgraced vitalist said.
What phlogiston is to chemistry, vitalism is to biology. Well into the 18th and 19th centuries, biologists believed that living organisms were fundamentally different from non-living objects because the former, but not the latter, contain a non-physical “force,” element, or principle that explains their life. This principle, often called the “life spark,” “energy,” or élan vital, is sometimes also equated with the soul.
Vitalists mainly argued that the phenomenon of life could not be explained by mechanical or purely material principles. Other scientists have since proven them wrong: Friedrich Wöhler’s 1828 synthesis of urea, an organic compound, from inorganic materials was an early blow, and today no biologist would want to be labeled a vitalist.
Spontaneous Generation

“See this flea right here? It just suddenly appeared; came out of nowhere!”—the theory of spontaneous generation.
Many scientists once thought complex organisms could form naturally without descending from similar organisms. Not only did they maintain that complex life could arise from nonliving matter, but also that this was an ordinary, commonplace occurrence in several kinds of organisms, like fleas and maggots. The first coherent formulation of the idea came from Aristotle and was taken as fact for over two millennia, even by the scientists of the 1700s. It wasn’t until Louis Pasteur’s experiments in the 19th century that this ubiquitous false belief was definitively disproved. Fortunately, no biologist believes in spontaneous generation nowadays.
The Luminiferous Aether
“Waves cannot travel in empty space. Light is a wave that travels through space. Therefore, space cannot be empty: it must be filled with the aether!”—sounds reasonable, right?
Here’s another mysterious substance: the aether, the theoretical medium through which light supposedly propagates. As the wave theory of light took hold in the 19th century, culminating in Maxwell’s demonstration that light is an electromagnetic wave, scientists proposed the luminiferous (or “light-bearing”) aether to explain how light could travel through seemingly empty space. The aether was actually a reasonable hypothesis for its time, when you consider that scientists then believed no wave could propagate without a medium. Take sound waves, for example, which simply cannot travel in a vacuum. The luminiferous substance was just that necessary medium. Makes sense, doesn’t it?
Problems for the theory came when further studies into the nature of light demanded increasingly contradictory physical properties of the aether. By the end of the 19th century, the famous Michelson–Morley experiment of 1887 failed to detect any trace of it, effectively disproving the existence of this strange substance; Einstein’s special relativity later removed any need for it.
Phrenology

“This bump here on your skull shows you’re actually a melancholic serial killer”—your typical phrenologist.
Today considered nothing but a pseudoscience, phrenology arose within the 19th-century scientific community as the study of human personality and human nature through measurements of the skull. Phrenologists believed that the shape of our heads indicated our individual temperaments, so, with the right classifications of shapes and measurements, they claimed to be able to tell whether someone had criminal tendencies, high intelligence, or aptitude for a given job.
There were at least two problems with this discipline: it was biased and based on unreliable experiments, and it was used to justify racist and sexist beliefs. Fortunately, it was completely discredited in the early 20th century, and no respectable scientist today would buy into it.
Irreversible Cell Differentiation
“Once a skin cell, always a skin cell”—a random biologist, until very recently.
When biologists talk about cellular differentiation, they mean the process by which certain cells change from one cell type to another, usually a more specialized one. This is the process that takes place when stem cells that could potentially become any other kind of cell turn into, say, blood cells specifically.
Until recently, it was assumed that once a stem cell differentiates into, say, a blood cell, its fate is sealed; scientists thought the process was simply irreversible. Enter cloning technology, which proved that the nucleus of a fully differentiated cell could be reprogrammed back to an embryonic state after all. The 1996 cloning of Dolly the sheep effectively demonstrated this.
Irreversible Nerve Cell Damage

“You can’t teach old dogs new tricks”—the pessimistic neuroscientist.
For a long time, scientists believed nerve cells could neither regenerate nor be created once an organism was born. We used to think that, especially within the brain, nerve cell damage was permanent and irreversible, and that no new neurons could arise in adult brains. This was challenged and disproven only recently, when studies showed that although neurogenesis (the birth of new neurons) and neuroregeneration (the repair of damaged nerve cells) are most active during embryonic development, both continue throughout adult life to different degrees.
Homosexuality as an Illness
“Gay people are sick!”—woefully ignorant person.
Historically, there’s been an incredible amount of prejudice against gay people, to the point that active persecutions have been common all around the world. Among the most unfortunate scientific missteps: the American Psychiatric Association included homosexuality in its list of mental disorders in 1952, and the World Health Organization (WHO) classified it as a mental illness in 1977, caving to this unjustified social bias.
Fortunately, after decades of studies that have consistently shown that being homosexual represents no inherent obstacle to a healthy life, and that, on the contrary, same-sex relationships can have positive effects on many people, no respectable scientific body today considers homosexuality an illness. The WHO reversed its stance in 1990, and even the Chinese Society of Psychiatry removed homosexuality from its catalog of mental disorders in 2001.
As you can see, evolution, anthropogenic climate change, and vaccines are not on this list. That’s because those “theories” have been as thoroughly confirmed as any theory could possibly be. Remember that, in science, a theory is not an unproven guess, but a well-substantiated explanation of some aspect of the world, supported by a large body of evidence; under any everyday standard, it would simply count as fact. So there is such a thing as a thoroughly confirmed scientific theory, like the theory of evolution.
At any rate, it’s good to see science has made such extraordinary progress over the past few centuries; going from belief in an aether to our current understanding of the universe is nothing short of exceptional. Grave mistakes are becoming less and less common. False hypotheses, though still ubiquitous, have very little chance of becoming widespread beliefs within the scientific community nowadays. But mistakes, even little ones, will always be present, and any scientific theory can be dethroned if sufficient evidence is presented against it. The fact that science accepts it can be mistaken means it hasn’t gone the way of dogmatism and rigid ideologies. And that’s just about the single best thing about science.