The Minutiae of Scientific Integrity

Scientific integrity might seem obvious and uncomplicated on the surface. You should not commit fraud, plagiarize, or mishandle those involved in your research, be they animals, human beings, or ancient clay tablets.1 Right? But when I was an aspiring young researcher myself, eagerly awaiting whether that new-fangled light-bulb fad would go anywhere, I found myself rather intrigued by certain aspects of this necessary building block of any academic endeavor. During a course on the subject, I was struck by the heartbreaking and curious stories that were told, as well as by the fascinating questions that could still be asked and would probably not soon receive an answer that might satisfy any and all moral intuitions. And today we are going to explore some of these stories and discuss a number of those questions!

To be crystal clear – unlike much academic writing, which is often forced to be obscure – this is not a piece about how one can maintain scientific integrity as a professional or a layperson.2 I might write such a blog one day, when I have earned enough of your trust that you dutifully read everything I type. But until then, I confine myself to the more outlandish, the entertaining, and – sometimes – even the educational. And with thorough integrity, naturally!

This blog is also available in Dutch.

The Basics of Scientific Integrity

During the aforementioned course, I was referred to roughly two kinds of resources on the topic – in addition to lectures and discussions with other students that combined both. In the first place, there were the documents drafted by the organizations that have grown around the modern scientific endeavor, such as the various academies of science in the United States of America, the Royal Netherlands Academy of Arts and Sciences (Koninklijke Nederlandse Academie van Wetenschappen), and the German Research Foundation (Deutsche Forschungsgemeinschaft). These documents mostly laid out the tenets of conscientious and reliable research in a systematic manner, peppered with some interesting case studies and examples.3 A few of the books and essays by individual researchers on the topic were written in this vein as well.4 In the second place, there were those books and essays that interrogated all the ways in which the history of and the culture adjacent to the sciences – including some of the ideas that were or are still held in high regard by those who are lucky enough to practice this craft – used to actively hamper scientific integrity and might still do so today.5

The basic tenets of scientific integrity that can be gleaned from these resources are often formulated as lists. (And who does not like a good listicle?) Such lists might refer to best (and worst!) practices, useful attitudes that could explicitly be cultivated, and pertinent values one may consider. Common recommendations include principles like independence, responsibility, and transparency, as well as research designs that center honest communication and proper reporting.6 And as a general framework, such tenets are important, essential even. Because science and scholarship are based on trust. I have to trust, to a certain extent, the sources I cite. Because I cannot re-excavate Ugarit on my own, I cannot operate the technological marvels that made the scorched scrolls from Herculaneum legible, and I cannot re-read all the literary works that fueled the insights of the environmental humanities. Notwithstanding their importance, it should also be said that none of the tenets laid out in such documents – neither the practices, nor the attitudes, nor the values – will likely be able to prevent all breaches of scientific integrity. There will still be instances in which the work of researchers is employed for questionable ends without their intent, or in which people who work in academia willingly ignore the very basics of the relevant ethics. And we may, in addition, still encounter conundrums surrounding scientific integrity that do not have a clear-cut answer, or at least no resolution that might fit a list – and they probably never will, until all academics are replaced by an A.I. which subsequently deems flawless research impossible and shuts the whole thing down.7

Best Intentions, Worst Consequences

The choices of those occupied by science and scholarship, even if they were made with the best intentions, can have an adverse impact on society.9 Think, for example, of the weird fiction stories in which scholars tinker with forgotten machinery, or rummage through old books and papers, and inadvertently release eldritch monstrosities unto the world.10 And while I myself have never incurred the wrath of an unremembered supernatural being by reading the wrong dusty tome at a new moon, there are real examples where scientists and scholars created something, or discovered hitherto lost information, that was not only used but also abused by others.

An illustrative case is the experience of Arthur W. Galston.11 Galston was a graduate student who wrote a thesis on a synthetic chemical that could help plants grow at lower temperatures than they normally required. In higher concentrations, however, it proved to be toxic, causing the plants to lose their leaves instead. Galston was one of those lucky fellows whose graduate work was taken seriously by others in his field. Sadly, those others were military researchers from the United States of America, who used his work to develop the notorious defoliant Agent Orange. From 1962 onwards, 50,000 tons of this compound were used in the Vietnam War to open up dense jungles, with devastating consequences for the health of the people living there. Aghast, Galston went on to convince the scientific community, and eventually the relevant politicians, of the terrible consequences of this abuse of his research. Subsequently, the use of the compound was phased out five years before the war ended.

Galston would later write about his insight that researchers should remain involved with their research until the very end, as even no longer working on a project could mean that your efforts are employed to inflict harm.12 And the same can – to a different degree – be said for those who chose scholarship over science, I reckon. The spectacular finds from Mesopotamia, for example, have earned this area and part of world history monikers like ‘the cradle of civilization’ and adjacent high-minded designations. But such a privileging of this small part of the story of humankind – which also happened to ancient Greece, the Roman Empire, and the European Middle Ages, for instance – can cause the marginalization of other historical phases and places.13 It is also not entirely accurate. A lot had to happen during prehistoric times and further afield before the most famous polities of Mesopotamia could come to the fore – as such, their existence and achievements constitute just one of many equally important developments.14 And this is only part of the baggage from the early study and popularization of the polities of the ancient Near East.15 For these and similar reasons, it is perhaps imperative that those working in the relevant fields keep communicating about the context, nuances, and relative importance of their discoveries. Even if we ourselves practice integrity, we may have to ensure that the interpretations of our research follow suit.

When Integrity Is Optional

Yet sometimes it is the people working in science or engaging in scholarship themselves who willingly throw integrity to the proverbial wind. Most of us may readily recall some of the more famous instances of such perversions of the trust that colleagues and the public at large place in science and scholarship. In the Netherlands, there was a highly respected social psychologist, one Diederik Stapel, who “had been inventing research data and fabricating his own experiment results for years” before a group of PhD students found out.16 Some decades earlier, across the Atlantic, we find one William Summerlin, who drew on mice with a felt-tipped pen in order to create the impression of successful skin transplants.17 In both these cases, the missteps involved not only the perpetrator, their research, and the general confidence in science and scholarship, but also the fate of PhD students and animals. Especially when such third parties are jeopardized by the lack of integrity of humans with power over them, things get ugly.

Such famous cases, where remedies and rectifications would eventually be implemented, should not make us forget that willful breaches of integrity sometimes go undetected.18 And again, it will often be the less powerful parties that are harmed. The most blatant case that almost escaped notice – as it was only uncovered through a series of mind-bending coincidences – is that of Vijay Soman.19 In 1978, a junior researcher named Helena Wachslicht-Rodbard submitted a paper on anorexia nervosa in women to the New England Journal of Medicine. The journal, as is the proper procedure, sent the paper out to anonymous reviewers. One of them, Philip Felig at Yale University, pushed this chore onto one of his protégés, the aforementioned Vijay Soman. At the time, Soman was working on a subject similar to Wachslicht-Rodbard’s. And so he copied her paper in order to pass it off as his own work and recommended that Felig reject the original. Soman’s plagiarized paper – which also carried the name of his boss as an author – was submitted to the American Journal of Medicine, which naturally sent it out for review. One of those reviewers was one Jesse Roth, who pushed the chore onto his protégée. The name of that protégée? Helena Wachslicht-Rodbard!

One can imagine the shock and surprise felt by Wachslicht-Rodbard.20 But this discovery of Soman’s fraud – which was only possible because the spurious paper happened to be sent for review to Jesse Roth, who decided to delegate the task to the very person who had been plagiarized – also shows the other kinds of harm that breaches of scientific integrity can inflict. Wachslicht-Rodbard had to fight an uphill battle to get any recourse, a fight that took two years, and she would eventually leave academia entirely.21 Scientific integrity, we can therefore state, is never a mere hypothetical subject that solely concerns abstract interests, like the trust in science or scholarship. The fates of humans and non-human animals are involved – including those who stand to suffer from the faulty science or scholarship produced. As such, choices regarding scientific integrity might seem easy – common sense even. But when you get down to the people who figured in real-life examples, those choices are not always viewed as obvious by everybody, by researchers as well as by people outside of academia. It should therefore not have come as a surprise to me that, after my research into this matter, some questions kept lingering in my mind.

Conclusion: Lingering Questions

Most of my remaining questions had to do with those aforementioned ideas that were or are held in high regard and could still hamper science and scholarship today. I confine myself here to problematic notions of productivity and talent. One of the guides for scientific integrity that I read introduced the three loyalties of a scientist: to the field, to society, and to themselves. But with regard to the latter loyalty, it was mainly said that a lack of scientific integrity might jeopardize achievements such as procuring degrees and grants.22 I would say, rather, that you fail yourself by not acting with integrity because you have not adequately acquired and disseminated knowledge, and because you potentially inflict harm and deprive society as a whole. In another book on scientific integrity, the matter of ‘overambition’ was brought up, which was characterized as a flaw of the very talented who lack a sense of patience.23 But the assumption underlying this potential cause of unwitting breaches of scientific integrity is that talent – well – exists. This is a controversial hypothesis, however, that has long been debated in ethics and philosophy.24 One might therefore wonder whether a distinction between such a sense of overambition and ‘misplaced ambition’ – which would refer to those who overreach relative to their capacities and consequently conduct shoddy research – is productive when discussing the causes of breaches of scientific integrity.25 One could, for example, first investigate whether a young researcher is sufficiently embedded within an academic structure and has received proper supervision, before the matter of their aptitude for science is even brought into question. And these are not nitpicks, as such problematic notions of productivity and talent can themselves be part of a culture which inhibits the tenets of scientific integrity. If one has to prove one’s talent continuously, and mostly through productivity, some cut corners – like not checking the citations you adapt from other works – may suddenly seem an uncomfortable necessity.26

Those were some of the minutiae of scientific integrity. There is, of course, a lot more to talk about. From the mistreatment of test subjects – and sometimes even their remains! – to the academic battles that had to be waged in the previous century against the utmost evil, the vile bigotry, and the sheer stupidity of the bogus idea of eugenics, which used to be considered – believe it or not – part of mainstream science.27 And such stories, just like those about Agent Orange and our not-so-merry band of fraudulent academics, are more than gruesome history! They also constitute – to a certain extent – cautions for those of us doing research today. Because, despite the existence of handy listicles and step-by-step guides, a lot can still go wrong with regard to scientific integrity, intentionally or inadvertently. And the same holds true for a blog that purports to open up scholarship for the digital masses. But I will try my best to continue to be honest, responsible, and overly generous with footnotes!


References

  1. Kees Schuyt, Scientific Integrity: The Rules of Academic Research, translated by Kristen Gehrman (Leiden: Leiden University Press, 2019), p. 9-10.
  2. Helen Sword, Stylish Academic Writing (Cambridge: Harvard University Press, 2012), p. vii-viii.
  3. See for example: Committee on Science, Engineering, and Public Policy of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine of the National Academies, On Being a Scientist: A Guide to Responsible Conduct in Research (Third Edition) (Washington: The National Academies Press, 2009); Committee for the Revision of the Netherlands Code of Conduct for Research Integrity, Netherlands Code of Conduct for Research Integrity (The Hague: Koninklijke Nederlandse Academie van Wetenschappen, 2018); Klaus-Michael Debatin et al., Guidelines for Safeguarding Good Research Practice: Code of Conduct (Bonn: Deutsche Forschungsgemeinschaft, 2022).
  4. For a valuable example that mostly follows this model, see: Schuyt, Scientific Integrity.
  5. For perhaps the most comprehensive example, see: Jonathan Marks, Why I Am Not a Scientist: Anthropology and Modern Knowledge (Berkeley: University of California Press, 2009).
  6. Committee for the Revision of the Netherlands Code of Conduct for Research Integrity, Netherlands Code of Conduct for Research Integrity, p. 13, 17-18.
  7. Irony aside, the generative large language models that most people currently reference when they talk about A.I. may have an impact on scientific integrity – good, bad, or both – see for example: Annette Flanagin et al., “Nonhuman ‘Authors’ and Implications for the Integrity of Scientific Publication and Medical Knowledge”, JAMA 2023, 329 (8), p. 637-639.
  9. A fact that narrative media seem all too aware of. Recall the stark warning in the 1993 movie Jurassic Park about the paths science could but perhaps shouldn’t take. See: Vickie J. Williams, “The ‘Jurassic Park’ Problem: Dual-Use Research Of Concern, Privately Funded Research And Protecting Public Health”, Jurimetrics 2013, 53 (3), p. 361-362.
  10. William F. Touponce, Lord Dunsany, H.P. Lovecraft, and Ray Bradbury: Spectral Journeys (Lanham: Scarecrow Press, 2013), p. 79-81.
  11. Committee on Science, Engineering, and Public Policy of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine of the National Academies, On Being a Scientist, p. 49.
  12. Arthur W. Galston, “Science and Social Responsibility: A Case History”, Annals of the New York Academy of Sciences 1972, 196 (4), p. 223.
  13. Nonetheless a prime opportunity for your regular reminder to think about the Roman Empire!
  14. Mario Liverani, The Ancient Near East: History, Society and Economy, translated by Soraia Tabatabai (New York: Routledge/Taylor & Francis Group, 2014), p. 4-5.
  15. John Maier, “The Ancient Near East in Modern Thought”, in: Jack M. Sasson (ed.), Civilizations of the Ancient Near East (New York: Charles Scribner’s Sons, 1995), p. 107. For more specific examples, see some of the earlier chapters in: Agnès Garcia-Ventura & Lorenzo Verderame (eds.), Perspectives on the History of Ancient Near Eastern Studies (University Park, PA: Pennsylvania State University Press, 2020). Such baggage, which still influences the views of people today, can also be seen in the study of ancient myths, see: Eric Csapo, Theories of Mythology (Malden: Blackwell Publishing, 2005).
  16. Schuyt, Scientific Integrity, p. 31.
  17. Marks, Why I Am Not a Scientist, p. 179-181; Schuyt, Scientific Integrity, p. 32.
  18. Marks, Why I Am Not a Scientist, p. 180.
  19. Schuyt, Scientific Integrity, p. 72.
  20. Ibidem.
  21. Marks, Why I Am Not a Scientist, p. 181.
  22. Committee on Science, Engineering, and Public Policy of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine of the National Academies, On Being a Scientist, p. 2.
  23. Schuyt, Scientific Integrity, p. 77.
  24. John Rawls, A Theory of Justice (Cambridge: Harvard University Press, 1971), p. 73-74; Norman Daniels, “Democratic Equality: Rawls’s Complex Egalitarianism”, in: Samuel R. Freeman (ed.), The Cambridge Companion to Rawls (Cambridge: Cambridge University Press, 2003), p. 245.
  25. Schuyt, Scientific Integrity, p. 77.
  26. On what is known as “plagiarism of secondary sources”, see: Brian Martin, “Plagiarism Struggles”, Plagiary: Cross-Disciplinary Studies in Plagiarism, Fabrication and Falsification 2008, 3 (1), p. 4. For an elaboration, see: Brian Martin, “Plagiarism and Responsibility”, Journal of Tertiary Educational Administration 1984, 6 (2), p. 183-190.
  27. Marks, Why I Am Not a Scientist, p. 64-70, 120-124, 135-136, 137-139, 198-227.