10 Scientific Ideas That Scientists Wish You Would Stop Misusing
Many ideas have left the world of science and made their way into everyday language — and unfortunately, they are almost always used incorrectly. We asked a group of scientists to tell us which scientific terms they believe are the most widely misunderstood. Here are ten of them.
1. Proof
Physicist Sean Carroll says:
I would say that "proof" is the most widely misunderstood concept in all of science. It has a technical definition (a logical demonstration that certain conclusions follow from certain assumptions) that is strongly at odds with how it is used in casual conversation, which is closer to simply "strong evidence for something." There is a mismatch between how scientists talk and what people hear because scientists tend to have the stronger definition in mind. And by that definition, science never proves anything! So when we are asked "What is your proof that we evolved from other species?" or "Can you really prove that climate change is caused by human activity?" we tend to hem and haw rather than simply saying "Of course we can." The fact that science never really proves anything, but simply creates more and more reliable and comprehensive theories of the world that nevertheless are always subject to update and improvement, is one of the key aspects of why science is so successful.
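To see the technical sense Carroll means, here is the classic schoolbook example: a conclusion that follows from assumptions by logic alone, with no experiment anywhere in sight (a standard illustration, not taken from Carroll's remarks).

```latex
\documentclass{article}
\usepackage{amsmath,amsthm}
\begin{document}
% A proof in the technical sense: the conclusion follows from the
% assumptions by pure logic, with no appeal to evidence.
\begin{proof}[Claim: $\sqrt{2}$ is irrational]
Assume $\sqrt{2} = p/q$ for integers $p$ and $q$ with no common factor.
Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$.
Substituting gives $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ is even too,
contradicting the assumption that $p$ and $q$ share no common factor.
Hence $\sqrt{2}$ cannot be written as a ratio of integers.
\end{proof}
\end{document}
```

Notice what is absent: evidence. That is why "prove it" is the wrong demand to make of an empirical science.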
2. Theory
Astrophysicist Dave Goldberg has a theory about the word theory:
Members of the general public (along with people with an ideological axe to grind) hear the word "theory" and equate it with "idea" or "supposition." We know better. Scientific theories are entire systems of testable ideas which are potentially refutable either by the evidence at hand or an experiment that somebody could perform. The best theories (in which I include special relativity, quantum mechanics, and evolution) have withstood a hundred years or more of challenges, either from people who want to prove themselves smarter than Einstein, or from people who don't like metaphysical challenges to their world view. Finally, theories are malleable, but not infinitely so. Theories can be found to be incomplete or wrong in some particular detail without the entire edifice being torn down. Evolution has, itself, adapted a lot over the years, but not so much that it wouldn't still be recognizable. The problem with the phrase "just a theory" is that it implies a real scientific theory is a small thing, and it isn't.
3. Quantum Uncertainty and Quantum Weirdness
Goldberg adds that there's another idea that has been misinterpreted even more perniciously than "theory." It's when people appropriate concepts from physics for new agey or spiritual purposes:
This misconception is an exploitation of quantum mechanics by a certain breed of spiritualists and self-helpers, and epitomized by the abomination, [the movie] What the Bleep Do We Know? Quantum mechanics, famously, has measurement at its core. An observer measuring position or momentum or energy causes the "wavefunction to collapse," non-deterministically. (Indeed, I did one of my first columns on "How smart do you need to be to collapse a wavefunction?") But just because the universe isn't deterministic doesn't mean that you are the one controlling it. It is remarkable (and frankly, alarming) the degree to which quantum uncertainty and quantum weirdness get inextricably bound up in certain circles with the idea of a soul, or humans controlling the universe, or some other pseudoscience. In the end, we are made of quantum particles (protons, neutrons, electrons) and are part of the quantum universe. That is cool, of course, but only in the sense that all of physics is cool.
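The textbook version of that measurement rule makes Goldberg's point plain: the outcome is genuinely random, and the observer appears nowhere in the formula (this is standard quantum mechanics, not anything specific to his column).

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% The Born rule for a single qubit: the probabilities are fixed by the
% state alone, not by anything the observer wants, believes, or intends.
A qubit prepared in the state
\[
  \lvert\psi\rangle = \alpha\,\lvert 0\rangle + \beta\,\lvert 1\rangle,
  \qquad \lvert\alpha\rvert^2 + \lvert\beta\rvert^2 = 1,
\]
gives outcome $0$ with probability $\lvert\alpha\rvert^2$ and outcome $1$
with probability $\lvert\beta\rvert^2$. The observer chooses \emph{what}
to measure, never \emph{which} outcome occurs.
\end{document}
```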
4. Learned vs. Innate
Evolutionary biologist Marlene Zuk says:
One of my favorite [misuses] is the idea of behavior being "learned vs. innate" or any of the other nature-nurture versions of this. The first question I often get when I talk about a behavior is whether it's "genetic" or not, which is a misunderstanding because ALL traits, all the time, are the result of input from the genes and input from the environment. Only a difference between traits, and not the trait itself, can be genetic or learned — like if you have identical twins reared in different environments and they do something different (like speak different languages), then that difference is learned. But speaking French or Italian or whatever isn't totally learned in and of itself, because obviously one has to have a certain genetic background to be able to speak at all.
5. Natural
Synthetic biologist Terry Johnson is really, really tired of people misunderstanding what this word means:
"Natural" is a word that has been used in so many contexts with so many different meanings that it's become almost impossible to parse. Its most basic usage, to distinguish phenomena that exist only because of humankind from phenomena that don't, presumes that humans are somehow separate from nature, and our works are un- or non-natural when compared to, say, beavers or honeybees.
When speaking of food, "natural" is even slipperier. It has different meanings in different countries, and in the US, the FDA has given up on a meaningful definition of natural food (largely in favor of "organic", another nebulous term). In Canada, I could market corn as "natural" if I avoid adding or subtracting various things before selling it, but the corn itself is the result of thousands of years of selection by humans, from a plant that wouldn't exist without human intervention.
6. Gene
Johnson has an even bigger concern about how the word gene gets used, however:
Even biologists find the word hard to pin down. It took 25 scientists two contentious days to come up with: "a locatable region of genomic sequence, corresponding to a unit of inheritance, which is associated with regulatory regions, transcribed regions and/or other functional sequence regions." Meaning that a gene is a discrete bit of DNA that we can point to and say, "that makes something, or regulates the making of something". The definition has a lot of wiggle room by design; it wasn't long ago that we thought that most of our DNA didn't do anything at all. We called it "junk DNA", but we're discovering that much of that junk has purposes that weren't immediately obvious.
Typically "gene" is misused most when followed by "for". There are two problems with this. We all have genes for hemoglobin, but we don't all have sickle cell anemia. Different people have different versions of the hemoglobin gene, called alleles. There are hemoglobin alleles which are associated with sickle cell diseases, and others that aren't. So, a gene refers to a family of alleles, and only a few members of that family, if any, are associated with diseases or disorders. The gene isn't bad - trust me, you won't live long without hemoglobin - though the particular version of hemoglobin that you have could be problematic.
I worry most about the popularization of the idea that when a genetic variation is correlated with something, it is the "gene for" that something. The language suggests that "this gene causes heart disease", when the reality is usually, "people that have this allele seem to have a slightly higher incidence of heart disease, but we don't know why, and maybe there are compensating advantages to this allele that we didn't notice because we weren't looking for them".
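Johnson's hemoglobin example can be made concrete in a few lines. The single-base swap below (GAG to GTG, glutamic acid to valine) is the real substitution behind the sickle cell allele of the beta-globin gene, though the short sequences here are trimmed for illustration rather than being the full gene.

```python
# Toy illustration of "gene vs. allele". The single-base change shown
# (GAG -> GTG, i.e. glutamic acid -> valine) is the real substitution
# behind sickle hemoglobin, but these short sequences are trimmed for
# illustration, not the full HBB gene.

CODONS = {"ACT": "Thr", "CCT": "Pro", "GAG": "Glu", "GTG": "Val", "AAG": "Lys"}

allele_common = "ACTCCTGAGGAGAAG"  # common beta-globin-style allele
allele_sickle = "ACTCCTGTGGAGAAG"  # sickle-style allele: one base differs

def translate(seq):
    """Translate a coding sequence into amino acid names, codon by codon."""
    return [CODONS[seq[i:i + 3]] for i in range(0, len(seq), 3)]

print(translate(allele_common))  # ['Thr', 'Pro', 'Glu', 'Glu', 'Lys']
print(translate(allele_sickle))  # ['Thr', 'Pro', 'Val', 'Glu', 'Lys']

# Same gene (same locus, same job), two alleles: one base changes one
# amino acid, and only one allele is associated with disease. Calling
# this "the gene for sickle cell" gets the relationship backwards.
```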