In my last article, I reviewed some of the top arguments used by sophists and midwits.
I saved the last argument for a separate article, because it’s the one that drives me the craziest. This argument says that people shouldn’t do their own research, at least when it comes to “complex topics” like science.
We now regularly come across media articles telling us not to do our own research:
Here’s Brian Stelter on CNN, discouraging us from doing our own research:
How 'do your own research' hurts America's Covid response
Here’s another lovely example from Forbes:
You Must Not ‘Do Your Own Research’ When It Comes To Science
The author of the Forbes article argues that at least when it comes to issues like vaccinations, climate change, and SARS-CoV-2, it can be “dangerous” to do one’s own research. He says:
The reason is simple: most of us, even those of us who are scientists ourselves, lack the relevant scientific expertise needed to adequately evaluate that research on our own.
In our own fields, we are aware of the full suite of data, of how those puzzle pieces fit together, and what the frontiers of our knowledge is. When laypersons espouse opinions on those matters, it’s immediately clear to us where the gaps in their understanding are and where they’ve misled themselves in their reasoning.
It’s true that being aware of the “full suite of data,” and “what the frontiers of our knowledge is,” can be useful in evaluating scientific literature. But does that mean people shouldn’t do their own research?
Let’s ask this in a different way.
Who should be allowed to do their own research?
At what point should someone be allowed to do their own research, according to the author of the Forbes article?
I’m guessing that everyone would agree that people with the “proper credentials” should be allowed to do their own research, at least within their own fields. So at what point did it become “OK” for these people to do that? Was it as soon as they started attending a formal academic program? If so, what level of program? A master’s program? But undergraduates often engage in research and sometimes even co-author papers. So does that mean it’s OK for an undergraduate to do research, but the rest of the adult public shouldn’t do it because they’re too stupid?
But an undergraduate can get guidance from a professor, you might say. They are, after all, in a formal academic program. In other words, they’re in a kind of “controlled” environment, one that can steer the young, impressionable student away from The Bad Rabbit Holes and towards “the good ones.”
First of all, isn’t this just a road to solidification of dogma? Unless the purpose of schooling is to make lots of little people who are good at regurgitating information?
But more importantly, is it really the case that the main problem with the general public doing their own research is that they don’t have access to this kind of “guidance”?
I’m pretty sure that most of the people who are “outside” mainstream views have access to mainstream views. That’s because the mainstream views are ubiquitous within the media. When it comes to the COVID-19 vaccines, I’m pretty sure the “vaccine skeptics” know that they’re going against the mainstream view. After all, the U.S. government paid news media $1 billion for propaganda to promote the COVID-19 vaccines.
So what are they missing? According to the Forbes writer:
If you “do your own research,” you can no doubt find innumerable websites, social media accounts, and even a handful of medical professionals who are sharing opinions that confirm whatever your preconceived notions about COVID-19 are.
However, do not fool yourself: you are not doing research. You are seeking information to confirm your own biases and discredit any contrary opinions.
He claims that when people do their own research, they’re just “seeking to confirm” their own biases. Sure, that’s probably the case sometimes, but surely the author must know that “professional” scientists and researchers are also susceptible to confirming their own biases.
So does he think that the general public is more susceptible to this than “professional” researchers?
I’m not sure that this has even been studied. It just sounds like a faith-based statement.
I will say that of the people I’ve met who became skeptical about the COVID-19 vaccines, a lot started out neutral and open-minded about the vaccines, and only later grew skeptical after seeing the propaganda, inconsistencies, and lies coming from the media and health officials. In some cases, people who got vaccinated later changed their minds and regretted getting the vaccine; I fall into this camp. These people obviously did not start out biased against the vaccines.
That phrase “even a handful of medical professionals” is also interesting. When it comes to the COVID-19 vaccines, we have more than just a “handful” who disagree with the mainstream view. I suppose the Forbes writer simply thinks that these people, who include well-credentialed doctors and scientists like Robert Malone, Peter McCullough, and Luc Montagnier, are just the “wrong people” to listen to. According to mainstream sources, we should be listening to the likes of Anthony Fauci, Eric Topol, or Peter Hotez instead.
That’s just a matter of disagreement, isn’t it?
It’s not that things can’t go wrong when people do research outside their fields. It’s true that most people don’t know how to critically evaluate scientific papers.
But this is all a matter of degree, and unfortunately, a formal academic program is no guarantee of quality control. I don’t know what percentage of doctors and PhDs are horrible at evaluating scientific papers, but I’ve come across too many who just take the abstract of a paper at face value and call it a day. I’ve met people who dropped out of high school or college who were more insightful readers of scientific literature than many MDs.
As for statistics, I can’t count how many times I’ve heard researchers or doctors regurgitating statistical myths. Well, no one’s telling these researchers to stop doing research because they don’t know basic statistics.
Degree not required
Remember learning about Benjamin Franklin? He made key discoveries in the field of electricity, invented the lightning rod and bifocal glasses, and even made contributions to oceanography. Apparently his formal education ended when he was 10. Most of his learning was self-motivated.
You may have also heard of Thomas Edison, who gave us the practical incandescent light bulb and the phonograph, among other trifling things. Apparently he attended school for only a few months. Most of his learning was through reading on his own.
Then there were the Wright brothers. Apparently it’s somewhat contentious who was the very first to fly an airplane, but I think we can agree that they were, at the least, “accomplished” in the field of aviation. Anyway, both of the Wright brothers dropped out of high school.
Is it that school didn’t matter as much back then, but does now? Sure, some disciplines have gotten more specialized, or more technical. On the other hand, people have never had more access to information, whether in the form of books or articles or online courses or videos or podcasts.
It’s not that formal academic programs don’t provide value. One of the most important things they can provide is access to people who can give you great advice. But even here, you don’t need to be part of an academic program to build a good tribe of mentors.
Incentives and motivation matter
I’d be willing to bet that people who engage in doing their own research are often more motivated to learn than a good chunk of those enrolled in academic programs. I’d be willing to bet that often these people are learning more; after all, they’re doing the research because they want to. Often it’s about something that affects them personally, like their own illnesses. I can’t think of anyone more motivated to find out the truth about an illness than someone who suffers from it, or has a loved one who does.
Occasionally these people figure something out about their illnesses that their doctors missed. This isn’t surprising to me at all, because no one is more motivated to solve their illnesses than they are. Here’s just one example of a couple that did their own research on their chronic illnesses. They eventually found something that helped after Western mainstream medicine had failed them for years.
Is it really good for society to tell these people not to do their own research just because they don’t have the proper degrees?
Think about which subjects you’ve actually retained information in. How much do you remember from your formal education, where you were spoon-fed information? Now think about something you were really interested in, where the learning was largely self-initiated and “open source.” Maybe it was how to fix your car, or build a deck, or build your business, or treat your autoimmune disorder. Do you have more mastery of the subjects you learned through school, or of the stuff you sought out on your own?
Should we really deny the opportunity for people to learn science by doing their own research? If you were a teacher and a student really wanted to do their own research on something, wouldn’t you be thrilled? Isn’t that what every teacher wants? And yet our media discourages this for some reason.
Meanwhile, there isn’t enough discussion of the bad incentives in academic research: the kind of incentives that hinder the ability to get to the truth. For more on that, watch this starting at around 27:30:
It’s not that expertise doesn’t matter
None of this is to say that expertise doesn’t matter. Of course it does. And there are certain types of expertise that someone who isn’t part of a larger institution probably won’t be able to gain. For one thing, they won’t have access to the laboratories or equipment that might be needed to do certain kinds of experiments.
But there are different types of expertise. There are theorists, and those who synthesize information, and those who have interdisciplinary knowledge. We need these people as much as the experimentalists or technicians, aka the experts on the limits of detection of the machinery or methods used to gather data.
By the way, if you have good theoretical knowledge, but don’t have a deep knowledge of the machinery you’re using, you’d be in the company of people like Nobel Laureate and physicist Carlo Rubbia, who erroneously thought he’d made a significant discovery in particle physics because he didn’t listen to the technicians who were telling him that their particle detectors were fooling them (hear this account starting at around 25:15).
Mistakes will be made
Whether it’s world-renowned Nobel laureates or people doing their own research, mistakes will be made. There’s no way to avoid that entirely. The answer is not to restrict people from doing their own research, or to censor or restrict dialogue. The answer is more dialogue, and more openness.
There is no perfect solution; it’ll always be a tradeoff. If we have a society where people can do their own research, including “outside their lane,” sure, probably more mistakes will be made. But one of the best ways to learn is through making mistakes. That seems preferable to gate-keeping research to a select few.
We should also foster a culture that makes it easy to “save face” if someone makes a mistake, so that they can quickly course correct.
What I learned from experiencing imposter syndrome during grad school
When I first started my PhD, although I had spent a lot of time building up “theoretical” knowledge of biology, I had close to zero laboratory or hands-on experience. I’d had some laboratory exposure as a component of some of my undergraduate biology courses but had retained very little from it because I found lab work boring at the time. Maybe this was because I was learning lab techniques for the sake of learning techniques, instead of learning them because I needed them to answer a burning question I had.
Anyway, I didn’t know if my level of lab experience was “typical” for a new PhD student or if perhaps I was “behind” when it came to lab skills. In short, I had a sort of imposter syndrome. I remember feeling sheepish asking questions about “basic” lab techniques because I was worried I would say something that would reveal that I was missing vital pieces of information that I “should” know.
I now realize that I had implicitly believed that there was a body of “Things You Should Know” that existed, a so-called canon of knowledge, like what you’d find in a textbook that’s considered “foundational” in the field.
After a few months of fumbling around in a highly unstructured environment in which I had the freedom to take my research in almost any direction, even if it went outside my advisor’s areas of expertise (my PhD advisor was very “hands off”), I realized that there wasn’t really a canon.
I realized it was OK not to know about something, even if it seemed really “basic,” because what was “basic” depended on which sub-field you were in. I discovered that there was this field called “systems biology,” which involved computational modeling of biological systems, and that the modelers knew very little about “wet lab” stuff, aka the stuff molecular biologists do in the lab, like PCR, protein quantification, and cloning. On the other hand, most of the molecular biologists were iffy on things like ecology or field work, and most knew nothing about the software underpinning a lot of the analyses they did. The professors who had been around the longest, aka the tenured ones with long publication histories, tended not to know about the latest technologies. And the technicians knew the most about the quirks of the equipment we were using. But all of these people added value, and appeared on papers together.
I suspect that part of the reason there wasn’t really a “canon” or “script” for young researchers to follow was that I was in a field that was relatively young, or at the very least “less developed” than, say, neuroscience or cancer or plant sciences. I was doing research on unicellular algae called diatoms, and by “less developed” I mean that there aren’t many people in this field. It’s not very institutionalized. There are few textbooks on the subject. For my research, I drew not only from studies done in algae, but also from studies in other unicellular organisms, plants, animals, and fungi, and I even ended up taking inspiration from cancer research.
I realize now that this was also due to the style of my PhD advisor. Over the years he had jumped from subject to subject, ranging from biology to biochemistry to biogeochemistry to biological oceanography to geology to materials science and even astrobiology. In fact, he always told his PhD students that if they were planning on continuing in academia, they should work on something completely different from what they’d worked on during their PhD. He thought this would help them grow. Although he and I didn’t agree on everything, on this we agreed.
I’m sure there are some programs that are much more “formalized” than what I had experienced. I imagine most medical programs are this way. There’s a “core” set of classes that everyone takes, and a handful of textbooks that are considered foundational that everyone knows, and a “canon” of knowledge that “everyone should know,” and standardized tests that people have to pass.
This might have its benefits. But I wonder: in those more institutionalized fields, how much of the stuff in the textbooks, the so-called “canon” of knowledge, will turn out to be wrong? Especially in light of the fact that we’ve seen cases where the “consensus” was wrong, as mentioned in my last article?
And who chooses what gets to be in the “canon,” or what’s considered “basic” knowledge in the field? Does this have anything to do with why so many medical doctors don’t know much about nutrition? If someone else had developed the medical “canon” could it have been another way?
What’s an expert?
When someone says “don’t do your own research, at least when it comes to science,” I think they’re implying that there is this canon of research that the “experts” know, and unless you know this canon, you shouldn’t have a say.
Again, it’s not that expertise or education doesn’t matter. If you’ve barely spent any time learning biology, you’ll have a hard time understanding biology papers. But who’s an expert?
We should instead be asking who could contribute to our understanding.
Take this analysis of a clinical trial here, which incidentally appears in a Substack called Do Your Own Research. The author isn’t a biologist or a specialist in clinical trials in any formal sense, but his background in computer science came in handy in finding flaws in the trial (see the section “An Algorithm’s Tale”).
But if someone has pored over this trial, picked it apart, and researched it and trials similar to it, who’s to say he’s not an expert in it, of sorts?
In fact, I’m going to guess that the way he conducted his analysis was broadly similar to how many PhDs are conducted: by staring at the data, doing lots of literature searches, poring over papers, and asking people for help when something goes over your head. The point is that there’s more than one way to become an expert in something, and “going to school for it” is just one way. Another way is to dive right in and do the research, without the requisite “formal classes.”
In the end, the credentials or formal education don’t matter as much as whether you’ve gotten closer to the truth.
Disciplines should welcome criticism from the outside
We should be wary of “gate-keeping.”
Take a look at this article about “epistemic trespassing.” From the abstract:
Epistemic trespassers judge matters outside their field of expertise. Trespassing is ubiquitous in this age of interdisciplinary research and recognizing this will require us to be more intellectually modest.
I do agree that it’s good to be “intellectually modest.” And that includes being open to what people outside of your field have to say.
What’s hilarious is that one of the first examples the article gives of someone who “epistemically trespassed” is Linus Pauling:
Linus Pauling, the brilliant chemist… won two Nobel Prizes… Later, Pauling asserted that mega-doses of vitamin C could effectively treat diseases such as cancer and cure ailments like the common cold. Pauling was roundly dismissed as a crackpot by the medical establishment after researchers ran studies and concluded that high-dose vitamin C therapies did not have the touted health effects. Pauling accused the establishment of fraud and careless science. This trespasser did not want to be moved aside by the real experts.
So I guess Pauling was “roundly dismissed” at the time, but it looks like his ideas on vitamin C and cancer are being considered seriously again; take a look at some examples of papers on it here or here, or listen to this podcast at around 12:45. And as for vitamin C and the common cold, I haven’t looked into it enough to tell you that Pauling’s been “vindicated,” but it was never a “crackpot” theory because there’s evidence behind it, even if it’s noisy.
So maybe Pauling wasn’t a crackpot. Maybe, instead, he’s what you’d call a polymath, a type that elitist academics are sometimes unable to recognize and commonly mistake for a crackpot.
Contributions can even come from “naive” people.
In this interview with evolutionary biologists Heather Heying and Bret Weinstein, at around 25:50, Heying describes how their undergraduate advisor had once given them a piece of advice about what kinds of jobs they might want if they planned to stay in academia:
Do not accept a job in which you are not exposed to undergraduates because teaching undergraduates means exposing yourself and the thinking that you are presenting, to naive minds who will throw curve balls at you, and some of those curve balls are going to be nuisances and maybe they’ll waste your time, but some of them are likely to reveal to you the frailty in your own thinking or in the thinking of the field, and that is the way that progress is made.
So who we call “peers” is up for discussion.
Finally, I love this video on how disciplines can end up disconnected from reality by not getting feedback or criticism from outside:
When the criticism that’s allowed, is not criticism that is across disciplines- it’s only those people who are professional philosophers or only those who are professional physicists or professional economists that are allowed to comment on the ideas of the professionals, or the theologians who only accept criticism from other theologians that already agree with the basics, that already agree with the rules of the field- that’s where you get this dogmatism. That’s where you get this parade of delusion and nonsense, where the emperor is wearing no clothes…
Don’t give respect to people who have the credential… It doesn’t matter how long the aikido practitioner has been practicing, it doesn’t matter how many trophies he has, how many accolades he has, and it doesn’t matter how many awards the economist has won… Neither of those things matter because they are not representative of that person having a sound connection to reality.
Mastering a discipline where you only exist in that discipline does not mean in any way that you understand the basics of what you’re talking about or that your ideas are grounded in truth and reality.
What makes a good researcher?
Although formal education can obviously be helpful to research, as mentioned above, it’s not what matters most.
I think one of the most important traits of a good researcher is initiative. This includes the initiative to engage in research in the first place, but also the initiative to reach out to people for mentorship or advice. People who are not afraid to approach others to ask questions will probably get more answers.
The second is intellectual humility. I’m going to guess that people who have a high degree of intellectual humility have gotten to know a complex subject deeply enough to know that the state of the field is much less settled and straightforward than what the textbooks say. That has taught them to be careful in reaching conclusions, whether in their field, or a new field that they’ve just gotten into.
It’s the opposite of the person who’s quick to say “It’s settled science.”
Unfortunately, it’s possible for someone to get a PhD in something without having thought about their own field deeply at all.
In the case of the COVID vaccines, my opinion is that the people who were questioning the safety of the vaccines, the so-called “skeptics,” generally had more humility than the ones who pooh-poohed the safety concerns.
The skeptics were the ones who tended to acknowledge that biology was more complicated than the simplistic theory behind the mRNA vaccines: that perhaps using a single antigen (the spike protein) could lead to unintended consequences like escape variants; that immunity was not the same thing as antibody levels; that it was bad that we didn’t have long-term safety data; that it was bad that we didn’t know how much spike protein was produced; that the lipid nanoparticles could have unintended consequences; or that substituting pseudouridine for uridine in the “mRNA” could have unintended consequences of its own, like the “mRNA” lasting a lot longer in the body than “normal” mRNA.
Who were the more sophisticated thinkers in this case? The ones who brushed these possibilities aside? Or the ones who thought that there was more to the story; the ones doing their own research?
Hopefully, time will tell.
In the meantime, may your research be fruitful, and multiply.
Reader comments
I'm a computer programmer, and "don't do your own research" makes zero sense in my field. By the time you're done with formal training, there are entirely different ways of doing things. You *have to* constantly update your skills on your own if you want to be at the bleeding edge of the industry. Moreover, it's entirely possible for someone who dropped out of high school to learn these concepts and be better at them than someone with a PhD from a top school. Your worth comes from how well you accomplish the tasks in front of you, and education is merely a tool to help.
This reminds me of an article I read during college about the AIDS epidemic. I cited it as an avenue into explaining what's going on today, since at that time there was so much scientific uncertainty that doctors were trying a lot of different things, often at the behest of the afflicted community. Eventually, many in the gay and Black communities started "doing their own research," examining a lot of the scientific evidence alongside medical doctors, and it got to the point that many members of these communities actively participated in medical meetings and engaged in the science themselves.
Of all things, I think COVID made the public generally aware of their naivete when it comes to science and medicine. I think it created a huge dichotomy between those who wanted to dig a little deeper and those who just wanted to follow the approved narrative.
Unlike other times, the free dissemination of information through open-access journals meant that more people were allowed entry into the world of science, and many gatekeepers did not like that because controlling the route of information meant controlling the narrative. Allowing these people in meant that they could now question studies and criticize things that didn't seem right. I think that's why there was such a hard clampdown on fact-checking.
At the same time, I do believe there are people who are still learning about the "idea" of science more than engaging in scientific rigor, but even so, wanting to engage is a step in the right direction.
I do find it interesting that media pundits are the ones telling us not to do our own research since we aren't experts. I have to wonder how much scientific research they did themselves before disseminating COVID information to the public.