This week’s BEACON Researchers at Work blog post is by University of Texas graduate student Amir Shahmoradi.
Summary: In a world in which science and technological breakthroughs dominate almost every aspect of individual human life, scientists and researchers are under ever-increasing pressure to expand the borders of human knowledge. As new discoveries require higher levels of precision and reproducibility, excess workload and hyper-competitive work environments have made researchers more prone to human cognitive biases. A solution to this emerging problem is to familiarize all graduate students in STEM fields with the limitations of the human mind and of scientific instruments, and with their potential role in false-positive discoveries and misconduct in scientific research. I suggest that a full-semester course covering the relevant topics, including those mandated by NSF as Responsible Conduct of Research, should be developed and tailored to each individual STEM field and offered as an integral core course of every graduate program across the world.
Growing up in a traditional and highly religious society, I was drawn from an early age to the romantic mystique of ancient religious and philosophical writings. I joined study sessions and participated in lively discussions with religious scholars. But living in an academic household, I gradually developed a sense of scientific skepticism that led me to question the basic tenets of that knowledge. By contrast, science and mathematics captivated me as a teenager for a very simple reason: science is based on observation, evidence, and mathematics. It is universal, independent of people, societies, religions, and ideologies.
My passion for science, in particular astronomy, physics, and biology, kept growing until I stumbled on a post titled “The Same Color Illusion” on Astronomy Picture of the Day (APOD), which has profoundly changed the way I perceive the world ever since. This APOD post showcased a simple example of human cognitive bias and how it can affect our perception of similar and different colors, with a simple, clear message: what human senses perceive of the world does not necessarily reflect reality.
The psychological literature is full of studies demonstrating how our limited senses can produce cognitive flaws and biases in our understanding of the universe. In fact, psychologists have pinpointed many types of biases that affect not only the way we see but also how we think about and react to the world around us. Confirmation bias, for example, is the tendency to notice, accept, and remember data that confirms what we already believe, and to ignore, forget, or explain away data that contradicts our beliefs. To make things worse, add the (often unknown) limitations of the instruments by which humans probe the universe. The combined effects of human and instrumental biases can lead to erroneous conclusions and predictions.
Fortunately, many such biases are now well understood by scientists, in particular by experimental physicists, biologists, and observational astronomers. A worked-out example is the well-known Malmquist bias in observational astronomy: in a brightness-limited survey, intrinsically faint objects at large distances fall below the detection threshold, so the surviving sample is systematically skewed toward luminous objects. Nevertheless, as our circle of knowledge expands, so does the circumference of darkness surrounding it, bringing with it new types of instrumental and cognitive biases that can distort our understanding of natural phenomena.
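The Malmquist bias is easy to reproduce numerically. The following is a minimal toy sketch (not real survey code; the population parameters and flux limit are invented for illustration): sources with log-normal luminosity scatter are placed uniformly in a sphere, and a flux-limited “survey” keeps only those brighter than a threshold, so the observed mean luminosity comes out biased high.

```python
import math
import random

random.seed(42)

# Toy population: log-normal scatter in luminosity, sources
# distributed uniformly in the volume of a sphere of radius 100
# (all units are arbitrary and chosen only for illustration).
N = 100_000
lums = [random.lognormvariate(0.0, 0.5) for _ in range(N)]    # true luminosities
dists = [100 * random.random() ** (1 / 3) for _ in range(N)]  # uniform in volume

# Flux-limited survey: a source is detected only if its apparent
# flux L / (4 pi d^2) exceeds the survey threshold, so distant
# faint sources drop out of the sample.
FLUX_LIMIT = 1e-3
detected = [L for L, d in zip(lums, dists)
            if L / (4 * math.pi * d ** 2) > FLUX_LIMIT]

mean_true = sum(lums) / len(lums)
mean_obs = sum(detected) / len(detected)
print(f"true mean luminosity:     {mean_true:.3f}")
print(f"observed mean luminosity: {mean_obs:.3f}")  # biased high
```

Nothing about the instrument is “wrong” here; the bias is purely a selection effect of the flux cut, which is why it must be modeled rather than engineered away.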
Today, we live in a world that relies heavily on science and technology. As a result, the number of scientists has grown rapidly over the past century. With limited funds and resources available to the scientific community, competition and work-related stress among researchers have also increased steadily.
In such a hyper-competitive atmosphere, scientists are more prone to perceptual and cognitive biases due to excess workload and stress. There already exist websites, such as Retraction Watch, that regularly report retractions of flawed scientific papers, including papers containing fabricated or irreproducible results, forgery, and plagiarism.
The two major funding sources for science in the United States, the National Science Foundation (NSF) and the National Institutes of Health (NIH), have already stepped in to mitigate the increasing irreproducibility of scientific discoveries and retractions of scientific articles, before scientists lose the public’s trust in their work. Examples of actions taken include new rules for validating scientific discoveries and mandatory Responsible Conduct of Research (RCR) training for all students and postdocs supported by NIH and NSF funds.
Personally, I find it hard to believe that any scientist would intentionally fake results, commit plagiarism, or engage in any other unethical action. Over the past decade, I have witnessed how human cognitive biases can affect the minds and scientific results of numerous scientists. I have seen scientists insist on the accuracy of their erroneous discoveries, and in many cases I have become convinced that no personal intention was involved in their stance. I have been fortunate to work on research projects that opened my mind to many of the limitations that we humans and our scientific instruments face in probing and understanding the universe.
I personally believe the RCR training mandated by NIH and NSF could become even more effective if it were instead offered as a mandatory, comprehensive full-semester course for all graduate students in all STEM fields — a course that would also cover the myriad human cognitive biases and instrumental limitations that can interfere with every scientist’s reasoning and understanding of natural phenomena. Regardless of where these students end up, whether in academia or industry, and whether or not they are funded by NSF or NIH, every student in a science program should learn about the limitations of the human mind and their potential adverse effects on scientific reasoning and discovery.
For more information about Amir’s work, you can contact him at a dot shahmoradi at gmail dot com.