How Do Scientists Perceive the Relationship Between Ethics and Science? A Pilot Study of Scientists’ Appeals to Values

Caleb L. Linville

1 Department of Philosophy, Kansas State University, 1116 Mid Campus Dr North, 201 Dickens Hall, Manhattan, KS 66506-0803, United States

Aidan C. Cairns

2 Physics Education Research, Department of Physics, Kansas State University, Manhattan, United States

Tyler Garcia

Bill Bridges

Jonathan Herington

3 Department of Philosophy, University of Rochester, Rochester, United States

James T. Laverty

Scott Tanona

Associated Data

De-identified, coded data is available upon request to the corresponding author.

Code availability: Not applicable.

Abstract

Efforts to promote responsible conduct of research (RCR) should take into consideration how scientists already conceptualize the relationship between ethics and science. In this study, we investigated how scientists relate ethics and science by analyzing the values expressed in interviews with fifteen science faculty members at a large midwestern university. We identified the values the scientists appealed to when discussing research ethics, how explicitly they related their values to ethics, and the relationships between the values they appealed to. We found that the scientists in our study appealed to epistemic and ethical values with about the same frequency, and much more often than any other type of value. We also found that they explicitly associated epistemic values with ethical values. Participants were more likely to describe epistemic and ethical values as supporting each other, rather than trading off with each other. This suggests that many scientists already have a sophisticated understanding of the relationship between ethics and science, which may be an important resource for RCR training interventions.

Introduction

One challenge to promoting ethical behavior in science is that scientists sometimes view ethics as being external to science (Hempel, 1965; Lacey, 1999; Betz, 2013). Douglas (2000) notes the influence of the long tradition of the “value-free ideal” that holds that value-laden decision-making about science is limited to the choice of projects and application of products in society. However, there is a growing set of analyses of science that suggest value questions cannot be ignored in science (Douglas, 2009; Brown, 2013; Biddle, 2017). These analyses suggest that ethical science requires attention to the consequences of decision-making at a variety of stages within science, and that what it means for a research project to be “good science” is not merely a matter of epistemic norms. On these views, ethics is integral to science, since good scientific methodology requires attention to both epistemic and non-epistemic values. Research fraud, for example, not only undermines the epistemic goals of science, but wastes future researchers’ time and funding, imposes opportunity costs, and risks harm to society through actions that might be based on falsified research. Even an apparently non-problematic methodological decision such as a choice of statistical method can carry ethical consequences, for example by influencing the ratio of false positives to false negatives.

Given these arguments, we hypothesize that knowledge of how scientists view the relevance of ethics to their work, and especially to their epistemic goals as researchers, is critical to efforts to promote ethical behavior in science. For example, do they view ethics primarily as external standards separate from methodological criteria for achieving epistemic goals such as accuracy? Or do they rather think of ethics as a way of working towards epistemic goals? While there is a body of research that suggests research ethics training is largely ineffective (Antes et al., 2010; Kalichman, 2014; Mumford et al., 2015), there is little work that investigates how we might improve ethical behavior with knowledge of how scientists use and move between scientific, ethical, and epistemic values.

This paper investigates how scientists think about the role of ethical values in science, and how they relate ethical values to other values, both epistemic and non-epistemic, by identifying the values scientists invoke in answers to questions about ethics in their own work and about research ethics vignettes. 1 To that end, we investigate how scientists relate ethics and science in three ways: (1) we look into the types of values that scientists appeal to when faced with ethical questions; (2) we analyze how explicitly scientists relate epistemic values with ethical values; and (3) we explore the relationships that scientists express between different values.

Theoretical Framework and Background

Approach to Research and Learning

Modern theories of learning suggest that people actively make their own knowledge, building from what they have previously learned (Steffe & Gale, 1995). From this perspective, we argue that efforts to teach scientists to engage in research more responsibly will be more effective if they are built around scientists’ existing ideas and practices. Such an approach assumes that individuals’ conceptions are made up of small pieces of reasoning (referred to as “resources”). Resources are neither right nor wrong by themselves, but may be properly or improperly applied within a given context (Hammer et al., 2005). Interventions built around these ideas focus on identifying individuals’ productive reasoning and providing them opportunities to apply and refine the application of that reasoning. This lies in contrast to many existing training programs for RCR, which focus on delivering information about ethical standards. In this paper, we work to identify resources that scientists use when reasoning about ethical concerns: namely, the values they invoke and the relationships between those values.

Values in Science

The resources of interest for this study are the values of scientists that, together with their beliefs about the world and evidence they uncover, drive their decision-making. Hausman (2011) describes values in terms of preferences, wherein if a person prefers one thing to another, they value that thing. It follows from this that values are relational; things are being valued, and someone is doing the valuing. This further implies that there can be hierarchies of values, that some values might be foundational, and that a goal is a type of intentional value. 2 Moreover, values can have different relationships to one another. Values can conflict; for example, if a person would prefer to realize two things, but it is only possible to realize one. Values can be supportive as well; for example, if the achievement of one value serves as an instrument for the pursuit of some other value.

Scientific values are the aims scientists try to achieve qua scientist and the things they prioritize in practice. Science is not a monolith, and these vary with discipline, institution, and even individual, down to the motives, incentives, and goals that drive scientists’ decision making. These values may include epistemic goals, personal aims, or ethical principles. Scientists might value truth as a general goal of science, and thus value the uncovering of particular facts as a specific goal for their research. They might have particular career interests. They might be motivated to do their work to help society.

Prior Work on Scientists’ Beliefs About Values in Science

Which values may appropriately drive scientific inquiry, and how scientists should attend to non-epistemic values in the pursuit of science, have been matters of debate. The so-called value-free ideal holds that science itself should remain value-free, even if the reasons for doing that science may be value-driven. On this view, while scientific practice might be constrained by ethical rules and principles, and some choices of projects or applications might be value-laden, science itself pursues purely epistemic goals such as attaining knowledge, understanding, and truth. Responsible conduct of research might require behaviors such as good record-keeping for the purpose of furthering science, but responsible conduct here is an instrument for epistemic goals rather than for independent ethical reasons.

Douglas and others (e.g., Douglas, 2009; Brown, 2013; Biddle, 2017) suggest that this view is mistaken, that the value-laden implications of decisions are unavoidable, and that scientists should more actively consider non-epistemic values. For example, Douglas’s (2000) examination of dioxin studies on rats demonstrates how decisions throughout research can affect the balance of the “inductive risk” of wrongly accepting or rejecting a hypothesis, with correspondingly different potential effects on human health. Douglas argues that scientists’ decisions have unavoidable ethical dimensions and are value-laden whether scientists attend to those values or not. She further argues that the appropriate response to the recognition of the impact of those choices is to explicitly attend to those potential consequences in at least some stages of scientific decision-making. The suggestion that value-ladenness is unavoidable picks up on a range of literature suggesting the inevitable entanglement of science and values (e.g., Rudner, 1953; Graham, 1979; Kuhn, 1996; Myrdal, 1970).

More recent values-in-science literature identifies different approaches to the involvement of non-epistemic values. While Douglas (2000, 2009) suggests that the reality of inductive risk undermines the distinction between epistemic and non-epistemic values, she argues for only an indirect role of values, where the ramifications of wrongly accepting or rejecting a hypothesis should be considered in the choice of methods, but valuations of potential consequences should not be used directly to determine conclusions (e.g., data should not be rejected because they do not favor a hypothesis). Steel (2010) retains the distinction but argues that non-epistemic values should have a role in scientific inquiry because of inductive risk; he claims that social costs associated with acceptance or rejection of hypotheses should determine evidential thresholds. Other theorists hold that non-epistemic values should be restricted from science. For example, Lacey (1999) argues normatively that science should be value neutral, and that theories should be evaluated solely on epistemic merit, and Resnik and Elliott (2019) suggest that rejecting the value-free ideal risks undermining the integrity of scientific research.

While there has been considerable normative and philosophical work on values in science, empirical studies of what scientists think of the role of values in science are more limited. While there has been some discussion (O’Rourke & Crowley, 2013; Robinson et al., 2019; Beebe and Dellsén, 2020; Schindler, 2022) about the relative importance scientists place on different, mostly epistemic, values, few studies focus on both epistemic and non-epistemic factors. 3 However, there is evidence that some scientists recognize non-epistemic values as having a place in scientific inquiry. For example, Steel et al. (2018) found that the value-free ideal is not an unequivocally dominant viewpoint. They found a tendency for scientists in their survey to hold that science could be objective and guided by societal values simultaneously. They also found that participants who identified as female and participants in non-natural sciences were more likely to depart from the value-free ideal. We extend this research by identifying the types of values that scientists invoke specifically in the case of ethical questions. If scientists generally hold to the value-free ideal, then we should mostly see appeals to epistemic values. However, if they do not hold that view, then we should expect to see a mix of epistemic, ethical, and other values.

Some studies suggest that scientists view science as being constrained by certain non-epistemic values. Kempner et al. (2005) state that scientific inquiry might be restricted because certain questions can only be addressed using unethical means. Additionally, non-epistemic concerns, such as social pressure and criticism, might restrict science in concert with ethics. Wolpe (2006) claims that scientists might avoid thinking about ethics because they view ethics as arbitrary restrictions. We want to see whether scientists articulate epistemic and ethical values as being in positive or negative relationships with each other. If scientists consistently describe ethical and epistemic values as trading off with each other, that could imply that scientists see ethics as a restraint on science. If they see them as supportive, we want to know how and in what ways: for example, do they view research ethics primarily as something needed to help achieve epistemic goals as a community? Alternatively, does potential benefit to society motivate the epistemic values of science?

Pennock and O’Rourke (2017) suggest a value approach to integrating ethics into science using the concept of scientific virtues: character traits that are conducive to achieving the goals of science. They claim that scientific virtues can be implemented through theory-centered, exemplar-centered, or concept-centered methods. We look at a broad range of values, including scientific and epistemic virtues together with explicitly ethical and social values, as well as how scientists relate them. A better understanding of how scientists relate values can help us identify appropriate methods for integrating ethics in science.

Research Questions

We identified three research questions.

  • What types of values do scientists appeal to when reasoning about ethics?
  • What types of values do scientists explicitly associate with “ethics”?
  • How do scientists relate ethical and epistemic values?

Definitions of Categories

We defined seven different types of value, utilizing a range of existing definitions in the literature. Following Steel (2010), we separate epistemic and non-epistemic values. For our purposes in this study, we construe the epistemic category broadly to include not only appeals to truth and other empirical concerns, but also appeals to values like simplicity and explanatory power. Non-epistemic values include all other values that may influence scientific decision making.

We differentiate non-epistemic values into several categories (see Table 1). Some of the categories (Ethical, Communitarian, RCR/Legal, and Self-interest) are based on Rest et al.’s (2000) neo-Kohlbergian model of ethical reasoning, which differentiates between reasoning based upon personal interest, maintaining norms, and postconventional moral values. While we reject a strict ordering of values, we want to differentiate between broadly ethical values such as explicitly utilitarian or deontological values and other types of other-directed values such as reciprocal or rule-following reasoning. Following the general line that postconventional thinking is more properly identified as “ethical”, we differentiated communitarian values from ethical values in order to distinguish ethical motivation from the desire for social approval. We also included a category to capture cases where researchers valued maintaining institutional and legal standards, including “RCR” rules of conduct set by professional or granting agencies. The category of practical values was introduced after we noticed that interviewees would appeal to values that were related to furthering other goals, but were not clear about the nature of the further goals; all other categories had been defined prior to coding. Subcategories of values were developed in the first pass of categorizing, to allow more fine-grained analysis. A more detailed list of categories and their definitions can be found in the Appendix.

Definitions of Categories of Value Appeals

The seven categories of values and their definitions. Definitions of subcategories are in the Appendix.

Participants

Our data come from interviews with scientists who took part in a year-long fellowship program oriented around improving RCR by using explicit discussions of the values of science. This fellowship occurred at a large midwestern university, and had fifteen participants and one facilitator. There were six participants from the discipline of biology, three from chemistry, three from physics, two from biochemistry, and one from geology. Three of them were female, and twelve were male. All participants held tenure-track positions, and eleven of the participants already had tenure. Participants were recruited using a snowball methodology involving email, word of mouth, recommendation, and explicit invitations by the fellowship organizers. Recruitment intentionally tried to promote diversity in terms of gender, race, and academic status.

The fellowship consisted of meetings throughout the academic year that focused on goals and values of science, such as the relative priority of truth, predictive accuracy, and social benefit, or the consequences of choices regarding statistics. Participants were interviewed about ethics both before and after the fellowship. Interviewees were told that the interview was not a test to judge whether they were behaving ethically, but rather to learn how they reason with ethical and epistemic values. This study focuses on the pre-fellowship interviews.

Data Collection

Data were collected from fifteen interviews with science faculty focused on the relationship between ethics and science. The interviews took place before the fellowship sessions. The interviews were conducted in private by one of the authors, who is a male philosophy professor. Two of the participants had had prior professional interactions with the interviewer through an education training project; two had had previous personal interactions with the interviewer; and two had prior knowledge of the interviewer. The interviews took place either in the campus office of each participant or (in one case) in the interviewer’s office, and were video and audio recorded. Field notes were taken during the interviews, but not used in this study. All of the interviewees responded to the same interview questions, divided into two main sections: the first featured questions about their general experience with ethics in their own careers, and the second asked questions about fictional research ethics vignettes (see Table 2). We included each set of questions to see how scientists reason about ethics in both their direct experience and hypothetical dilemmas. Additionally, the vignettes were designed to elicit thinking about tradeoffs. Transcripts were not returned to participants for correction or revision after the interviews.

Summary of the Interview Questions

Responses to the first question were not analyzed in this study.

There were four questions about the interviewee’s general experience with ethics, and three fictional vignettes. Two of the vignettes were adapted from the Ethical Decision Making Measure (Mumford et al., 2006) and had subparts, each of which focused on a different RCR topic. In total, there were six topics. The four questions and the topics in each of the three vignettes are listed in the tables below. In the portion of the interview focused on general experience, follow-up questions were asked based on the responses to the original questions. In the section with the vignettes, a fictional vignette was presented and the interviewee was then asked what they would do in that situation.

Data Coding

All fifteen interviews were transcribed using an automated transcriber (otter.ai). The transcripts were updated for accuracy by the coders as needed during the coding process. Coding was performed by viewing the videos alongside the transcripts. We first coded all fifteen interviews based only on the general experience questions, and then coded the vignette questions. The answers to one interview question were omitted, since it asked about the interviewee’s research in general. That question did not address our larger inquiry, since it was only about research, and not about ethical issues.

While viewing the interviews, coders looked for statements that implied the interviewee valued something or had a goal of achieving something. Since a value is something that guides or motivates actions, coders looked for statements where the interviewee stated that they did, would do, tried to do, or wanted to do something, or simply desired or preferred something. We operationalized this process by fitting the value into the sentence “The interviewee has a goal of doing X” or “The interviewee cares about X.” By operationalizing in this way, we ensured that we identified objects of valuing rather than general statements about the world, and avoided coding descriptive statements made by the interviewees. The search for value statements relied solely on the videos of the interviews, each of which was watched twice. When we identified a value that fit into one of those sentences, we identified the surrounding quote in the transcript, fixed the transcribed quotation as necessary, and then copied that quotation into a spreadsheet to document it. Often, the quotations were parts of a longer monologue by the interviewee; when this was the case, the entire discourse was not included in the spreadsheet. We included enough of what the interviewee said to give appropriate context for the value or goal being appealed to, and we ended the quote when either the interviewer started talking or the interviewee shifted to a different subject.

The specific values we identified from the quotes were documented alongside them in the spreadsheet, along with time stamps. The values were documented as close to the original wording of the interviewee as possible: we avoided using synonyms, and we also avoided paraphrasing, over-summarizing, or inferring implicit values. When the interviewee expressed a value not actually held by him or her (for example, one interviewee spoke about one of his students valuing precision), we did not include that value in the analysis, because its status as a motivation for the scientist was unclear. A quote could have multiple values. Examples of this coding can be found in Table 3.

Examples of coding and categorization of quotes from the interview data

The specific part of the quote that the value is pulled from is in bold.
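To make the documentation procedure concrete, here is a minimal sketch of how one documented value appeal might be represented. The field names and record layout are our illustration, not the authors’ actual spreadsheet schema.

```python
from dataclasses import dataclass

@dataclass
class ValueAppeal:
    """One documented appeal to a value (illustrative schema, not the authors' own)."""
    interview_id: str      # which of the fifteen interviews the quote came from
    timestamp: str         # position in the recording, e.g. "00:14:32"
    quote: str             # surrounding quotation, trimmed to give context
    value: str             # the value, kept close to the interviewee's wording
    explicit_ethics: bool  # True if the interviewee explicitly tied it to ethics

# One quote can yield several rows, since a single quote could contain multiple values.
```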

When an interviewee related one value to another, the two values would be documented separately, with a relationship recording how the interviewee connected them. We used three different relationships: supporting, tradeoff, and prioritization. Supportive relationships were documented when the realization of one value was conducive to the realization of another value. Tradeoffs were documented when the interviewee indicated they held two values that could not be realized simultaneously. Prioritization relationships were documented in the case of a tradeoff where the interviewee indicated one value should be prioritized over the other. If the interviewee did not express a relationship between multiple values, we simply listed the values. Examples of relationship coding can be found in Table 4.

Examples of Relationships

The specific part of the quote that the values and relationship are pulled from is in bold.
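A hypothetical sketch of the three relationship codes described above follows; the relation definitions mirror the paper’s, while the record structure itself is our own illustration.

```python
from dataclasses import dataclass
from enum import Enum

class Relation(Enum):
    SUPPORTING = "supporting"          # realizing one value is conducive to realizing the other
    TRADEOFF = "tradeoff"              # the two values cannot be realized simultaneously
    PRIORITIZATION = "prioritization"  # a tradeoff where one value is ranked above the other

@dataclass
class ValueRelationship:
    value_a: str        # for PRIORITIZATION, the value the interviewee prioritized
    value_b: str
    relation: Relation

# Example drawn from the paper's own coding, discussed later in the text:
rel = ValueRelationship("Getting the truth out of the results",
                        "Simply getting the results",
                        Relation.PRIORITIZATION)
```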

We also noted when the interviewee explicitly associated values with ethics. When an interviewee was directly responding to the question “What does ethics mean to you?” or when they explicitly stated something was an ethical issue, we noted it. We wanted to capture these views in order to better understand how faculty thought about ethics.

An example of a value that the interviewee explicitly called ethical is “I think what I mean by that is, so, ethics is so fundamentally it’s like, do no harm or you know, work towards a common good.” In this case, we can infer the values of avoiding harm and working towards the common good, which are both ethical values. Another interviewee responded to the question “What does ethics mean to you?” with “Not falsifying data. Making sure you’re reporting accurate findings. Making sure your work is rigorous and reproducible. That sort of thing.” This interviewee answered with epistemic values.

Data Analysis

After documenting the quotes, values, and times, we placed the values into the theoretical categories defined above in the Definitions of Categories section. These categories are Epistemic, Ethical, Communitarian, Self-Interest, RCR/Legal, Practical, and Economic. Each category had several subcategories (for example, the category Ethical was divided into Rights, Fairness, Equality, Social Good, Virtue, and Care). All of the categories and their subcategories can be found in the Appendix. When placing a value into a category, we placed it into a subcategory first, which was tied to the larger category. To determine the category, both the value itself and the quote from which it was extracted must be considered. We categorized the values based on how the interviewee explained the motivation for holding that value, which could be found in the quote from which the value was extracted.
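As a rough sketch of this two-step assignment (subcategory first, then the category it belongs to), one might maintain a lookup like the following. Only a few subcategories are shown, and the mapping is assembled by us from the Appendix, not taken from the authors’ materials.

```python
# Illustrative fragment of a subcategory -> category lookup; see the Appendix
# for the complete lists of categories and subcategories.
SUBCATEGORY_TO_CATEGORY = {
    "Rights": "Ethical",
    "Social Good": "Ethical",
    "Alethic": "Epistemic",
    "Methodological": "Epistemic",
    "Peer approval": "Communitarian",
    "Standard RCR": "RCR/Legal",
}

def categorize(subcategory: str) -> str:
    """Each value is placed into a subcategory first; the category follows from it."""
    return SUBCATEGORY_TO_CATEGORY[subcategory]

print(categorize("Alethic"))  # -> "Epistemic"
```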

For example, one interviewee said:

Oftentimes we say that science is really about trying to find the truth about how things work. And so, if you’re not doing things in the right way or in the proper way, you can definitely get the wrong results, or actually cause you to go down a path that’s not right and proper. If you’re simply about getting the results instead of actually getting the truth out of the results, then that can lead to unethical behavior.

From this quote, we pulled four values and two relationships. First, the interviewee implies that consequences can arise from not doing things in the right or proper way; we coded the value of “Doing things in the right way,” and categorized it as Ethical. Second, he says that not doing things in that way can cause you to get the wrong results. We coded this as “Getting the right results,” which was categorized as Epistemic. Those two values are related to each other, since he implies that doing things in the right way supports getting the right results.

The interviewee also claims that it is a mistake to care more about simply getting results than about getting truth from the results. We coded the value of “Getting the truth out of the results,” and categorized that as Epistemic in the subcategory of Alethic. We also coded the value of “Simply getting the results,” which was categorized as Epistemic. These two values were connected by the relationship of “Getting the truth out of the results” being prioritized over “Simply getting the results.”

Another interviewee said:

Ethics, um, that’s a good question. I mean, as a working scientist, I think it has a lot to do with reproducibility of results. Like if we’ve said we’ve done this experiment, and we get this result, I should have some sort of confidence that we’ve described the experiment well enough, and that we sort of understand the sources of error and the like well enough, that if someone else sets out to reproduce our work they should be able to.

From this, we noted first that the interviewee is explicitly referring to ethics in this quote. He says that ethics has a lot to do with reproducibility, so we coded the value as “Reproducibility,” which was categorized as Epistemic in the subcategory of Methodological, with an explicit association with Ethics.

After categorizing the data, we ran an inter-rater reliability check for the categorization of the values. The checks on categories were only done on values that had been agreed on by all coders. Inter-rater reliability checks on categories were done with three coders. For these checks, one coder would send a series of quotes that he or she had analyzed. The two other coders would analyze the quotes and record their categorization of the interviewee’s values. The original coder would compare their work to the other coders’ work, and then all the coders would meet and discuss which categorizations to accept when there was a disagreement. The categories were predefined, and all coders used the same list. Inter-rater reliability checks were done on 45 values; the Fleiss’ kappa was 0.89, which signifies “almost perfect agreement” (Landis & Koch, 1977).
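The paper does not show the computation, but a Fleiss’ kappa of this kind can be reproduced with statsmodels. The ratings matrix below is invented for illustration: rows stand for checked values and columns for the three coders.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Invented example data: rows = checked values, columns = the three coders,
# entries = integer codes for the assigned category (0 = Epistemic, 1 = Ethical, ...).
ratings = np.array([
    [0, 0, 0],
    [1, 1, 1],
    [1, 1, 0],
    [2, 2, 2],
    # ...one row for each of the 45 checked values
])

# aggregate_raters converts per-coder labels into per-item category counts,
# which is the input format fleiss_kappa expects.
counts, _categories = aggregate_raters(ratings)
kappa = fleiss_kappa(counts)
print(f"Fleiss' kappa: {kappa:.2f}")  # 0.81-1.00 is "almost perfect" per Landis & Koch (1977)
```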

After categorizing the values in the spreadsheet, we calculated the frequency and relative percentage of each category, as well as of the subcategories. We calculated these across the fifteen interviews. Additionally, we calculated the frequency of the different relationships, along with the two value categories each relationship tied together. We did these calculations twice: once with all of the values, and once with only the values explicitly associated with ethics. Participants did not provide feedback on findings.
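The frequency and percentage calculations described here are straightforward tallies; a minimal sketch, reusing category labels like those produced by the earlier hypothetical records, might look like this.

```python
from collections import Counter

def category_percentages(categories):
    """Tally appeals per category and convert the counts to relative percentages."""
    counts = Counter(categories)
    total = sum(counts.values())
    return {cat: 100 * n / total for cat, n in counts.items()}

# Run once over all coded values, and once over only the values that were
# explicitly associated with ethics, as described above.
all_cats = ["Epistemic", "Ethical", "Epistemic", "Practical"]  # invented example input
print(category_percentages(all_cats))  # {'Epistemic': 50.0, 'Ethical': 25.0, 'Practical': 25.0}
```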

Results

As seen in Fig. 1, the two largest value categories are Epistemic and Ethical. All of the other categories accounted for markedly fewer value appeals than the Epistemic and Ethical categories. Overall, epistemic values were appealed to more often than ethical values; however, in the vignettes, slightly more ethical values were appealed to than epistemic values.

Fig. 1 Number of Appeals to Values by Category

Epistemic values appealed to include such goals as reproducibility and confidence in the data being reported, as in this example:

“Until we know exactly what’s happening, we just call that the end of the line. So if we aren’t 100% certain on what we believe, the results, and the data we have is repeatable and conclusive, we won’t publish at all.”

Ethical values expressed often appealed to betterment of society or avoidance of harm, such as in this example:

The right thing means, not just for me to profit, but for humanity to be better off, and for sure, for humanity not to suffer from what I do.

The quantities of each category are shown in Table 5.

Epistemic and Ethical are also the largest categories of values that interviewees explicitly associated with ethics (see Fig. 2). Notably, epistemic values accounted for the largest category when only these values are considered. The interviewees never explicitly denoted practical or self-interested values as ethical. These numbers are reflected in Table 6.

Fig. 2 Number of Appeals to Values Explicitly Associated with Ethics by Category

An example of epistemic values explicitly associated with ethics is found in the following quotation:

It would be unethical to disseminate information that you’re not confident is reliable… I think it’s our responsibility to report data that we have seen consistently, and, not necessarily, this goes along with not falsifying, you know, maybe massaging data, to complete your story, the hypothesis that you love.

The values expressed by the interviewee are epistemic: reliability, repeatability, and truth of the data. However, the quote begins by framing these as ethical concerns, indicating that the epistemic goals are of ethical value.

Scientists in the study expressed more supportive relationships between epistemic and ethical values than negative relationships (see Fig. 3): 10.1% of values were connected by tradeoffs, and 31.4% were connected by supportive relationships, for a total of 41.5% connected by some relationship. The most frequent supportive pairing was epistemic values with other epistemic values. Ethical values were expressed as supporting epistemic values at a similar rate as ethical values supporting other ethical values. Fewer tradeoffs were expressed than supportive relationships; the interviewees mentioned fewer examples of ethical values trading off with epistemic values than of those types of values supporting each other, and no instances of ethical values trading off with other ethical values were mentioned. Counts and percentages of the relationships between value categories can be found in Table 7.

Fig. 3 Relationship between Value Categories

Relationships Between Value Categories

‘Value-value relationship’ refers to possible combinations of value categories interacting with each other; for example, ‘Epistemic-Ethical’ refers to epistemic and ethical values interacting with each other.

Tradeoffs between epistemic and ethical values were expressed seven times. For example:

This is a truth-seeking exercise. And getting the truth out is more important than maintaining someone’s ego.

The interviewee here was stating that the epistemic value of truth-seeking should be prioritized over the ethical value of caring for others’ emotional state, suggesting that in some circumstances the two values might conflict.

Another example:

I think there’s all sorts of interactions about how you interact socially and professionally with the postdocs and grad students, you know how do you manage, you want to be pushing their research productivity, but you don’t want to be making them miserable either.

The interviewee here expresses an ethical value of avoiding harm to postdocs/graduate students, and an epistemic value of maximizing research productivity. In this example, the interviewee does not indicate that one value should always be prioritized over the other, but just indicates that they could come into conflict.

On other occasions, interviewees indicated that ethical values supported epistemic aims. For example, one interviewee said the following:

There is an assumption anytime you submit a grant that it will be reviewed without bias, and that system only works when there is voluntary reporting of bias. So it’s kind of a greater-good scenario, where if you want the granting system to work you have to report times when it would be advantageous for you to act.

Here the voluntary reporting of bias is an ethical, “greater-good” value that supports the epistemic value of having a bias-free grant reviewing system.

Discussion

These results allow us to address our three research questions: what types of values do scientists appeal to when talking about ethics; which of those values do they explicitly associate with ethics; and how do they relate epistemic and ethical values?

The scientists in our study reasoned about ethical problems using both ethical and epistemic values. We observe that the appeals to epistemic values and appeals to ethical values occur at roughly the same frequency, and that these two categories account for substantially more appeals than any other category. These overall patterns appear in both the questions about researchers’ general experience with ethics and the questions about the fictional vignettes. These patterns suggest a model of approaching research ethics not merely in terms of applied ethical principles, but also in terms of epistemic issues. We also found that scientists in this study rarely invoked legal or regulatory ramifications when considering research ethics. Grouped together, appeals to legal values and RCR rules only accounted for 4.4% of all value appeals. Together, these results cast doubt on the view that scientists view ethics as being external to scientific practice.

These were the values appealed to in reasoning about ethical problems. As to which values were explicitly associated with ethics, we found that the scientists in our study explicitly associated epistemic and ethical values with ethics more frequently than other types of values. This suggests that epistemic values are not just employed in ethical reasoning, but that ethical and epistemic values might be conceptually linked, and that their association might be available as a resource for research ethics training.

With regard to the question of how scientists relate epistemic and ethical values, our data do not support the view that scientists view ethics as a restraint on science, as suggested by Kempner et al. and Wolpe. Our sample expressed more examples of ethical and epistemic values supporting each other than trading off with each other. However, our study did not indicate whether this supportive relationship was generally directional, with ethics viewed more as an instrument for producing epistemically sound results or epistemic soundness viewed more as an ethical instrument for good.

Our study also did not uncover much explicit recognition of value tradeoffs in general. Understanding where and how values trade off against one another (for example, between autonomy and common good in many ethical dilemmas) is a critical aspect of good value-laden reasoning, but this type of reasoning did not appear in our study as a clearly available resource for use in research ethics training.

Implications

Although our study is only a preliminary investigation of scientists’ value-laden reasoning, if its results prove to be robust, they have a number of implications for efforts to promote research ethics. First, the relatively low rate of appeal to regulatory reasoning suggests that regulations and authority might not be a substantial part of researchers’ individual decision-making; if scientists were primarily motivated by impositions of authority, then, all else being equal, we would expect them to refer to regulatory guidelines at a higher rate than other considerations. If regulation is not a substantial motivator, appeals to institutional ethics rules in education or practice will fail to engage with scientists’ common resources for ethical reasoning. Ethics should not be presented as a set of rules or guidelines external to science, since our evidence suggests that scientists do not perceive ethics as relating to scientific inquiry in that way. The more common association of ethical and epistemic values suggests an alternative resource for research ethics training.

Since research ethics promotion should be most effective when it allows scientists to employ resources they already have in their thinking about ethics and science, research ethics training should recognize both epistemic and ethical values. Failure to engage with epistemic values, for example by focusing only on general ethical principles or a set of standard RCR guidelines, misses the opportunity to elicit the epistemic dimension of the ethical reasoning scientists already do. The association of ethical and epistemic values suggests a benefit to emphasizing the importance of good research practice for both epistemic and ethical reasons. Connecting ethical and epistemic reasoning can also help make explicit the value science has for society and the responsibilities scientists have in virtue of their work’s potential for societal good or harm.

To make use of this resource, efforts to ensure responsible conduct of research can offer narratives of how research ethics relates to epistemic aims, such as emphasizing the epistemic value of good record-keeping, and the social value of finding the truth. For example, in a discussion of plagiarism, instead of focusing on its deceptiveness or unfairness to those that actually did the work, a program might invoke discussion about its effect on the ability of others to trace the progression of ideas, and how that might hinder other scientists. Such a discussion would invoke resources scientists already have, of thinking about research ethics in epistemic terms, and could make explicit the distinctive responsibilities of scientists. It also would allow entry into more complex, entangled value-laden discussions found in the values-in-science literature, such as how decisions about tradeoffs between false positives and false negatives might depend on consequences of each in specific contexts, rather than by choosing a standard level of significance.

These possibilities suggest a promising research avenue of exploring the effect of stimulating scientists’ reasoning about both epistemic and ethical values. More work is needed on how responsible conduct could be promoted by improving and building on recognition of value relationships.

Limitations

There are some limitations on what can be inferred from our data. This study was done on a small sample, and selection bias was possible. There were more men than women in the sample, which is particularly notable given the evidence of a correlation between gender identity and commitment to the value-free ideal (Steel et al., 2018). The relative importance scientists place on certain values cannot be properly inferred from the counts of the value categories, since the frequency of the categories is dependent on the structure of the individual interview, and there is some interpretation required to assign categories to statements. Similarly, no study of this kind can identify what the interviewees actually care about; we can only partially infer how they construct their reasoning according to the values they invoke in response to the general questions and hypothetical vignettes. As with any interviews on culturally loaded topics, there is a possibility of social-desirability bias; interviewees might give answers that they think others want to hear. To reduce this effect, it was made clear to interviewees that the interview data would be anonymized and that the interviews were not tests of their ethicality. We also note that the coding is not an exhaustive list of the values that scientists appeal to: different scientists may appeal to a further set of values. Further research at a larger scale is therefore needed to establish the robustness of these results.

Conclusion

After conducting interviews about ethics with fifteen science faculty members, we analyzed the interviews for expressions of values. The values were identified by a scientist’s expression of a goal or ideal that they would like to realize. Those values were categorized as ethical, epistemic, economic, communitarian, RCR/legal, self-interest, or practical, and the frequency of the categories was calculated. We found evidence that scientists think of ethical concerns in research in terms of a range of values, rather than only in terms of ethical values, with epistemic and ethical values as the two categories most appealed to. Additionally, we found that the scientists in our study explicitly associated both epistemic and ethical values with ethics. We found little evidence that suggests scientists think about ethical problems in research in terms of legal ramifications. We also found that scientists frequently think of epistemic values and ethical values in terms of supportive relationships. This suggests that research ethics programs should potentially focus on the justificatory role of both epistemic and ethical values in responsible conduct of research.

Appendix: Lists of Categories

Types of Values

Ethical

  • Rights (autonomy, respect, rights): The appeal is based on treating people in a certain way because of some intrinsic feature.
  • Fairness/Equality (fairness, equality, especially regarding distributions of goods): The appeal is based on treating each person in the same way.
  • Welfare/Social good (“good for people”, “improves lives”, “saves lives”): The appeal is based on doing something that will benefit society or a group of people, without any specific mention of the individuals that will be affected.
  • Virtue (e.g., honesty, generosity, trustworthiness, character): The appeal is based on fulfilling some character trait that the interviewee believes to be desirable.
  • Interpersonal care/pro-social (“need to listen to people in your lab”): The appeal is based on the interest of another person; the interviewee has a desire to maximize the welfare of a specific person.

RCR/Legal

  • Standard RCR (credit for authorship, falsification, plagiarism, conflicts of interest, etc.): A value is appealed to because of regulations and guidelines set forth by RCR training.
  • Legal/regulatory (“it’s important we follow IRB rules”, “we have to follow biosafety laws”): An appeal is based on specific threat of punishment or sanction by a governing entity.

Communitarian

  • Social approval: An appeal is based on getting approval from a given community.
  • Peer approval: An appeal is based on getting approval from one’s peers.

Epistemic

  • Alethic (e.g., “the truth”, “know”): An appeal is based on pursuing or clarifying knowledge or truth about something.
  • Explanatory/Understanding (e.g., “explain reality”, “understand our world”): An appeal is based on understanding some process.
  • Methodological (e.g., control, falsifiability, testability, reproducibility): An appeal is based on some specific part of the scientific method.
  • Aesthetic (e.g., “elegant theory”, “beautiful result”): An appeal is based on having a theory or explanation of some scientific phenomenon that is pleasing in some way.
  • Predictive (e.g., predicts, forecasts): An appeal is based on conducting science in order to make predictions about the future.
  • Empirical (data-driven, open data, sharing information): An appeal is based on either the dissemination of data, or the proper use of data.
  • Technological/Applications: The appeal is based on using science to have some technological application, but with no other explicit use or societal application.
  • Neutrality/Objectivity.

Economic

  • Economic (i.e., “use resources wisely”, “grow the economy”): The appeal is based on using somewhat fixed resources.

Self-interest

  • Self-interest (e.g., “get tenure”): An appeal is based on some benefit to the interviewee.

Practical

  • Practical (clean labs, preparedness, and other things that are instrumental for science generally but do not fit the other categories): An appeal is based on actions that are otherwise value-neutral, but must be done in order for science to be conducted. This category was added due to a number of expressed values that referenced what was needed to get things done, but without an explicit explanation of why those things needed to be done, or what their ultimate purpose would be. Because these values dealt with the need to get day-to-day tasks done, but did not reference broader goals (such as epistemic or ethical ones), we decided to place them in their own category.

Funding

This material is based upon work supported by the National Science Foundation under Grant No. 1835366.

Declarations

This research was determined as exempt from human subject research review by the Institutional Review Board of Kansas State University (Proposal # 9356). Subjects provided informed consent to their participation and were able to withdraw from the study at any time.

1 We use “research ethics” to refer specifically to ethics in research contexts, where the context might not distinguish between ethics in research and ethics or ethical principles more broadly. Otherwise, we use “ethics” in a general sense, which the context might further specify (as in “ethics in science”).

2 Values in this sense are not restricted to, but do include, specific “values” such as Schwartz’s (2004) principles that are the primary influences on decision-making, or cultural “values” as described in Sanderson et al. (2018).

3 For the purposes of this study, we treat theoretical virtues such as simplicity and understanding as epistemic values, broadly construed, while recognizing that for many other purposes these should be differentiated from truth and empirical adequacy.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • Antes AL, Wang X, Mumford MD, Brown RP, Connelly S, Devenport LD. Evaluating the effects that existing instruction on responsible conduct of research has on ethical decision making. Academic Medicine. 2010;85(3):519–526. doi:10.1097/ACM.0b013e3181cd1cc5.
  • Beebe JR, Dellsén F. Scientific realism in the wild: An empirical study of seven sciences and history and philosophy of science. Philosophy of Science. 2020;87(2):336–364. doi:10.1086/707552.
  • Betz G. In defence of the value free ideal. European Journal for Philosophy of Science. 2013;3(2):207–220. doi:10.1007/s13194-012-0062-x.
  • Biddle JB, Kukla R. The geography of epistemic risk. In: Elliott KC, Richards T, editors. Exploring inductive risk: Case studies of values in science. Oxford University Press; 2017. doi:10.1093/acprof:oso/9780190467715.003.0011.
  • Brown MJ. Values in science beyond underdetermination and inductive risk. Philosophy of Science. 2013;80(5):829–839. doi:10.1086/673720.
  • Douglas H. Inductive risk and values in science. Philosophy of Science. 2000;67(4):559–579. doi:10.1086/392855.
  • Douglas H. Science, policy, and the value-free ideal. University of Pittsburgh Press; 2009.
  • Graham LR. The multiple connections between science and ethics. The Hastings Center Report. 1979;9(3):35–40. doi:10.2307/3560796.
  • Hammer D, Elby A, Scherr R, Redish EF. Resources, framing, and transfer. In: Mestre JP, editor. Transfer of learning from a modern multidisciplinary perspective. Information Age Publishing; 2005. pp. 89–119.
  • Hausman DM. Preference, value, choice, and welfare. Cambridge University Press; 2011.
  • Hempel CG. Aspects of scientific explanation and other essays. 1st ed. Free Press; 1965.
  • Kalichman M. Rescuing responsible conduct of research (RCR) education. Accountability in Research. 2014;21(1):68–83. doi:10.1080/08989621.2013.822271.
  • Kempner J, Perlis CS, Merz JF. Forbidden knowledge. Science. 2005;307(5711):854. doi:10.1126/science.1107576.
  • Kuhn TS. The structure of scientific revolutions. 3rd ed. University of Chicago Press; 1996.
  • Lacey H. Is science value free? Values and scientific understanding. 1st ed. Routledge; 1999.
  • Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159. doi:10.2307/2529310.
  • Mumford MD, Devenport LD, Brown RP, Connelly S, Murphy ST, Hill JH, Antes AL. Validation of ethical decision making measures: Evidence for a new set of measures. Ethics & Behavior. 2006;16(4):319–345. doi:10.1207/s15327019eb1604_4.
  • Mumford MD, Steele L, Watts LL. Evaluating ethics education programs: A multilevel approach. Ethics & Behavior. 2015;25(1):37–60. doi:10.1080/10508422.2014.917417.
  • Myrdal G. Objectivity in social research. London: Gerald Duckworth & Co; 1970.
  • O’Rourke M, Crowley SJ. Philosophical intervention and cross-disciplinary science: The story of the Toolbox Project. Synthese. 2013;190(11):1937–1954. doi:10.1007/s11229-012-0175-y.
  • Pennock RT, O’Rourke M. Developing a scientific virtue-based approach to science ethics training. Science and Engineering Ethics. 2017;23(1):243–262. doi:10.1007/s11948-016-9757-2.
  • Resnik DB, Elliott KC. Value-entanglement and the integrity of scientific research. Studies in History and Philosophy of Science Part A. 2019;75:1–11. doi:10.1016/j.shpsa.2018.12.011.
  • Rest JR, Narvaez D, Thoma SJ, Bebeau MJ. A neo-Kohlbergian approach to morality research. Journal of Moral Education. 2000;29(4):381–395. doi:10.1080/713679390.
  • Robinson B, Gonnerman C, O’Rourke M. Experimental philosophy of science and philosophical differences across the sciences. Philosophy of Science. 2019;86(3):551–576. doi:10.1086/703553.
  • Rudner R. The scientist qua scientist makes value judgments. Philosophy of Science. 1953;20(1):1–6. doi:10.1086/287231.
  • Sanderson MR, Bergtold JS, Stamm H, Caldas JL, Ramsey MM, Aistrup J. Climate change beliefs in an agricultural context: What is the role of values held by farming and non-farming groups? Climatic Change. 2018;150(3):259–272. doi:10.1007/s10584-018-2283-2.
  • Schindler S. Theoretical virtues: Do scientists think what philosophers think they ought to think? Philosophy of Science. 2022;89(3):542–564. doi:10.1017/psa.2021.40.
  • Schwartz SH, Boehnke K. Evaluating the structure of human values with confirmatory factor analysis. Journal of Research in Personality. 2004;38(3):230–255. doi:10.1016/S0092-6566(03)00069-2.
  • Steel D. Epistemic values and the argument from inductive risk. Philosophy of Science. 2010;77(1):14–34. doi:10.1086/650206.
  • Steel D, Gonnerman C, McCright AM, Bavli I. Gender and scientists’ views about the value-free ideal. Perspectives on Science. 2018;26(6):619–657. doi:10.1162/posc_a_00292.
  • Steffe LP, Gale J, editors. Constructivism in education. Hillsdale, NJ: Routledge; 1995.
  • Wolpe PR. Reasons scientists avoid thinking about ethics. Cell. 2006;125(6):1023–1025. doi:10.1016/j.cell.2006.06.001.

Chapter 1: How Ethics and Values Intersect in Science

Chapter 1 of Michael Pritchard and Theodore Goldfarb's instructor guide, "Ethics in the Science Classroom."

The roles of ethics in science

A book devoted to advocating the infusion of ethics/values into the teaching of science rests on the assumption that ethics and values play a significant role in science and that ignoring this fact will diminish a student's comprehension of the true nature of the scientific enterprise. But this is not an assumption that is accepted and appreciated by most secondary school students, nor by all of their teachers. When asked about the connection between ethics and science, many science teachers will make reference to such issues as scientific fraud and plagiarism that have occasionally made dramatic headlines. They will generally view such behavior as the exception rather than the rule and profess a belief that science is for the most part an objective and value-free activity practiced by honest, moral individuals. Our point is not to deny that fraudulent behavior among scientists is unusual, but rather to emphasize the fact that science is the product of human activity, and as such it inevitably involves a wide variety of value-laden choices and judgements, many of which have ethical dimensions.

What is science? Professor John Ziman of the Imperial College of Science and Technology, London, one of the most influential writers on the practice of science, points out that definitions given by professional scientists, historians of science, philosophers of science, and representatives of other related disciplines tend to emphasize "different aspects of the subject, often with quite different policy implications." (8) Philosophers might emphasize the methodological aspects of science focusing on experimentation, observation and theorizing as elements of the means by which reliable information about the natural world is gleaned through the practice of science. Historians are prone to view science as the accumulation of knowledge, stressing its archival aspect as a significant historical process worthy of special study. Ziman concludes that: "...science is all these things and more. It is indeed the product of research; it does employ characteristic methods; it is an organized body of knowledge; it is a means of solving problems." (9)

The fact that the practice of science is a human social activity is a central theme of a booklet entitled "On Being a Scientist," initially published in 1989. This booklet was written by the Committee on the Conduct of Science under the auspices of the National Academy of Sciences as a description of the scientific enterprise for students who are about to begin to do scientific research. The reader is instructed that:

Scientists have a large body of knowledge that they can use in making decisions. Yet much of this knowledge is not the product of scientific investigation, but instead involves value-laden judgements, personal desires, and even a researcher's personality and style. (10)

Debunked is the notion of a rigid Baconian scientific method by which scientists derive truth about the universe by making observations with no preconceptions about what they may discover. Instead the authors claim that:

...research is as varied as the approaches of individual researchers. Some scientists postulate many hypotheses and systematically set about trying to weed out the weaker ones. Others describe their work as asking questions of nature: "What would happen if ...? Why is it that...?" Some researchers gather a great deal of data with only a vague idea about the problem they might be trying to solve. Others develop a specific hypothesis or conjecture that they then try to verify or refute with carefully structured observations. Rather than following a single scientific method, scientists use a body of methods particular to their work. (11)

The booklet includes several real-life stories that illustrate the fallibility of scientists, and the ways in which they can be influenced by personal or social values.

Mentioned as examples of the values that can distort science are attitudes regarding religion, race and gender. Assurance is given that science has social structures and mechanisms that tend to limit and correct the influences of such biases. The peer review process, the requirement that experiments be replicable and the openness of communication are claimed to serve this purpose. The booklet ends with a strong appeal for scientists to exercise social responsibility. A second edition of this booklet, revised by a joint committee of the National Academy of Sciences, the National Academy of Engineering and the Institute of Medicine, was published in 1995 and retains much of the discussion of the role of values in science.

The claim that the peer review process and openness of communication significantly reduce the influences of bias in science assumes a set of historic norms for the behavior of scientists that are less descriptive of scientific behavior today than when they were codified by the eminent sociologist R. K. Merton in 1942. Merton's norms, as expressed by Ziman (12), include the principles of communalism (that science is public knowledge available to all), universalism (there are no privileged sources of scientific knowledge), and disinterestedness (science is done for its own sake). In today's world, where the vast majority of scientific research is funded by corporate or other private interests, which often place rigid restrictions on the publication of scientific results and the exchange of scientific information, and where academic scientists find themselves in a highly competitive environment, these norms can no longer be viewed as generally applicable to the practice of science.

The tendency of many scientists and teachers of science to portray science and scientists in an idealistic and unrealistic manner is often motivated by the belief that this will result in a greater willingness on the part of students and the public to accept scientific, rational thought as a powerful tool for learning about, and understanding, the world and the universe. There is no evidence to support this view. On the contrary, when students are taught that scientists are mere mortals who are subject, in their work as well as in their private lives, to the same social pressures and temptations that influence all human endeavor, they are more likely to identify with scientists. The powerful methods that science offers for seeking knowledge about the universe then become personally accessible rather than a set of exotic tools available only to the members of an elite priesthood.

Recent surveys have shown that despite a renewed interest in mysticism, and growing concern about the contribution of technological development to environmental degradation, public regard for science and technology remains very high. This is particularly true in the United States and other industrialized nations, but also in the developing world. While a high regard for science is certainly a desirable public attitude, it can be associated with an uncritical acceptance of any conclusion or opinion that is presented in the name of science. This is contrary to the essence of the scientific approach to knowledge, which seeks to engender a critical/skeptical attitude and recognizes that all of the results of science are to be viewed as subject to further verification and revision.

By presenting science to students as the product of the work of fallible human agents, rather than as a body of unassailable factual knowledge about the universe, gleaned by means of value-free observation and deduction, we can teach students proper respect for science, while nurturing an appropriate attitude of skepticism. Bringing scientists down from a pedestal is necessary if students are to recognize their own humble efforts in school science laboratories as requiring the same honesty in the reporting of observations and treatment of data that they assume was employed in the deduction of the scientific knowledge contained in their textbooks.

Examples of ethics and values issues in science

In an essay entitled "The Ethical Dimensions of Scientific Research" (13) the widely published logician and philosopher of science Nicholas Rescher attacks the view that science is value free, and shows how ethical considerations enter into many aspects of the practice of scientific research. Rescher describes ethical problems and issues in science under several headings. We will use Rescher's headings, describing the major ethical issues that he discusses, and adding a few that he doesn't mention:

  • Choosing research goals.

Rescher states, "Perhaps the most basic and pervasive way in which ethical problems arise in connection with the prosecution of scientific research is in regard to the choice of research problems, the setting of research goals, and the allocation of resources (both human and material) to the prosecution of research efforts." (14) At the national level, he asks whether we are morally justified in committing such a large fraction of the federal research budget to space exploration at the expense of larger appropriations for the advancement of knowledge in medicine, agriculture and other fields of technology bearing directly on human welfare. Other major value-laden choices that he doesn't mention are the balance between the funding of military versus non-military research and between the funding of fossil fuel and nuclear energy investigations as opposed to those involving renewable energy sources. A recent issue that has divided the public, politicians and the scientific community is the extent to which "BIG SCIENCE" projects like the supercollider subatomic particle accelerator or the Human Genome Project should be funded as compared to funding a broader variety of more modest "small" science endeavors.

At the institutional level of the department, laboratory or research institute, Rescher mentions the issue of support for pure, or basic, versus applied, or practical, research. Today, with an increasing fraction of research being done by, or funded by, industry, the constraints imposed by corporate interests on the choice of research projects, or on the direction of the research, are becoming an increasingly significant ethical issue.

At the individual level Rescher cites difficult, and even painful, ethical decisions that often must be made. These include the choice between pure and the frequently more lucrative applied research, and, for those who choose applied science, such questions as whether to work on military projects. Recently the media have publicized the moral dilemma of whether former researchers for the tobacco industry should violate secrecy agreements by revealing that the industry knew more about the addictive nature of nicotine than was claimed in sworn testimony by company spokespeople.

  • Staffing of research activities.

Rescher includes under this heading the ethical concerns that arise when scientists become administrators of large sums of public money that are needed to fund most forms of contemporary scientific research. As he points out, the increasing administrative burden imposed on scientists is an ethical issue in and of itself, because it impairs a scientist's ability to devote his or her energies to the practice of science. In research at universities, the employment of graduate students to do research raises issues about whether the assigned research is optimal in terms of the education and training of the student. An additional ethical concern related to staffing a research group is the fact that women and minority members have historically been under-represented in scientific research. Making good on commitments to equal opportunity is a serious moral obligation of the scientist as research administrator.

  • Research methods.

The ethical concerns related to the use of human subjects and animals in research are the focus of Rescher's remarks about issues related to the methods of research. We will discuss the topic of human subjects in some detail, both in the next chapter and in connection with the case study about the Tuskegee syphilis experiment in Chapter 4. The heightened public concern about animals as research subjects resulting from the animal rights movement is an issue familiar to most science teachers, particularly biology teachers. The deletion of experiments using animals in school science laboratories, due to moral objections by teachers, students, parents or the community, is becoming an increasingly common occurrence.

Other ethics and values issues related to research methods include such questions as whether a double-blind protocol is needed in cases where subjective interpretations of research data may influence experimental results. Additionally, there are issues related to the manipulation and presentation of data, many of which are discussed in connection with the Millikan case study in Chapter 4. The use of placebos in tests of the effectiveness of a new drug can raise ethical issues associated with the withholding of a potentially effective treatment of a serious illness.
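To make the mechanics of blinding concrete, here is a minimal sketch (written in Python; the subject IDs and group sizes are invented purely for illustration) of one way a double-blind assignment can be arranged so that the person scoring the data sees only coded IDs:

import random

# A minimal, illustrative sketch of double-blind assignment: neither the
# subjects nor the scorer knows group membership, and a third party holds
# the key until all subjective judgments have been recorded.
subject_ids = [f"S{i:02d}" for i in range(1, 9)]   # hypothetical subjects
random.shuffle(subject_ids)
half = len(subject_ids) // 2
blinding_key = {s: "treatment" for s in subject_ids[:half]}
blinding_key.update({s: "control" for s in subject_ids[half:]})

# The scorer works from coded IDs alone, with no group labels attached;
# blinding_key is consulted only after scoring is complete.
scoring_sheet = sorted(blinding_key)
print(scoring_sheet)

The point of the sketch is simply that blinding is a procedural safeguard: the subjective judgments the passage worries about are made before anyone doing the judging can know which group a data point came from.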

  • Standards of proof and the dissemination of research findings

Rescher discusses the issue of the amount of evidence a scientist must accumulate before announcing his or her findings. As he states, "This problem of standards of proof is ethical, and not merely theoretical or methodological in nature, because it bridges the gap between scientific understanding and action, between thinking and doing..." Personal factors, such as the need to publish in order to advance career goals, may tempt a scientist to exaggerate the certainty of scientific results. The fact that positive results are often rewarded by increased funding from research sponsors increases this temptation.

In most cases, the science establishment scorns the scientist who chooses to announce his or her findings via public media before they have been published in a peer-reviewed journal. As discussed by Rescher, there is good reason to be concerned about premature publicity about findings that have not been accepted as valid by the scientific community. Well known researchers or research institutions can use the sensationalism, which is as much a characteristic of science reporting as other types of journalism, to influence public opinion and governmental funding agencies. The media's emphasis on such values as the novel and the spectacular, if translated into more funds for this type of study, can distort the development of science.

Other types of ethical conflict, not mentioned by Rescher, may result from publication standards. A scientist may be convinced that the results of a study are valid, and may have significant, perhaps even urgent, social value, although they do not quite meet the often rigid standards set by his or her peers. One such standard is the generally accepted requirement that, in order to be considered valid, a result derived from statistical analysis of data must have less than a 5% probability of being due to chance. Suppose a scientist analyzes some geological data that show that some natural disaster is likely to occur at the 93% rather than the 95% statistical confidence level. No possibility exists of doing further studies that might increase the certainty of the result. Peer reviewers at the relevant scientific journal reject the report because it fails the 95% test. The scientist must make the decision whether to accept this judgment or risk the opprobrium of colleagues and make the results known by seeking the help of news-hungry science journalists.
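To see the arithmetic behind this scenario, consider the short sketch below (Python with SciPy; the test statistic is invented for the example and corresponds to no real dataset). A one-sided test that lands near the 93% confidence level fails the conventional 95% standard, however high the practical stakes may be:

from scipy import stats

# Illustrative only: a one-sided z statistic chosen so that the result
# sits near the 93% confidence level described in the example above.
observed_z = 1.48

# Confidence that the effect is real = 1 minus the one-sided p-value.
confidence = stats.norm.cdf(observed_z)
print(f"confidence level: {confidence:.1%}")          # about 93.1%

# The conventional one-sided 95% standard corresponds to z >= 1.645.
print("meets the 95% standard:", confidence >= 0.95)  # False

Nothing in the computation tells the scientist what to do with a 93% result; the threshold itself is a convention, which is precisely why the decision described above is an ethical one rather than a purely statistical one.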

  • Control of scientific "misinformation".

Rescher affirms that scientists have a duty to control and suppress scientific misinformation. This obligation extends to preventing erroneous research findings from misleading their colleagues and, perhaps more urgently, to protecting against the danger that false results may endanger the health or welfare of the public.

On the other hand, Rescher warns against misusing this need to censor misinformation in a way that stifles novelty and innovation. Too often in the history of science, scientists, particularly those who are young and not yet well-established, have found it very difficult to gain acceptance for revolutionary discoveries that do not fit within the prevailing disciplinary paradigm.

Rescher also raises the issue of science versus pseudo-science. Whereas the need to control misinformation would logically extend to pseudo-science, he points out that the distinction between what is accepted as science and what some members of the scientific community would label as pseudo-science is not always clear. Examples of contemporary problems in this area include the scientific standing of various forms of extra-sensory perception; herbal and other non-Western, "traditional" medicines; acupuncture; and the recent controversy over the validity of "cold fusion." Rescher urges caution to those who would settle such disputes through censorship and suppression of views that they fear might damage the public image of science. He suggests, instead, that scientists have faith that truth will "...win out in the market place of freely interchanged ideas..."

  • Allocation of credit for scientific research achievements.

For obvious reasons, scientists are no less interested than those in any other field of endeavor in receiving appropriate credit for their work. Rescher mentions the bitter disputes that have arisen over the years with regard to decisions about who should receive credit for a particular discovery or invention. The agreement by the international scientific community to give such credit to the scientist(s) whose report of the discovery is first submitted to an appropriate journal has provided a means for resolving most, but not all, such disputes. The recent controversy over the discovery of the virus that causes AIDS demonstrates that this procedure is not infallible, at least in cases where it may be difficult to determine if research reports from different laboratories are describing the same phenomenon. Furthermore, since different laboratories frequently make nearly simultaneous, independent discoveries of the same scientific result or phenomenon, the question arises as to the ethical justification for giving all of the credit to the one who just happens to be first to submit the results for publication.

As Rescher points out, since scientific work is usually a collaborative effort, either within a single research facility or involving several laboratories, the issue of allocating credit can be very complicated. This has become an even more problematic issue since Rescher first wrote his essay in 1965. In some fields, like high energy nuclear physics, the list of authors can exceed ten, or even twenty. Cases where junior colleagues or graduate students believe that a senior researcher has usurped credit that they deserve are not uncommon. Even issues like the order of the names on a published research article -- should they be listed alphabetically, in decreasing order of the contribution made, or in order of seniority -- can result in controversy.

A current ethical issue related to credit, and to authorship of research reports, is the extent to which a scientist whose name appears as an author should be held responsible for all the data and results reported in a published paper. This issue emerged from cases where data in a paper have been challenged as being wrong and perhaps fraudulently represented. If the work is a collaborative effort, involving researchers from different scientific disciplines, is it reasonable to expect all of them to vouch for the entire content of the paper? If not, should each author's contribution be clearly stated in the paper, or in a footnote?

One source of disputes concerning credit for research ideas and ownership of intellectual property is the peer review process. The National Science Foundation reports that accusations that a peer reviewer appropriated an experimental or theoretical idea or result from a research proposal or paper he or she was sent to evaluate are the largest category of scientific misconduct complaints that it receives. Of course, the number of such serious accusations is only a very small fraction of all the proposals and papers that are reviewed.

This completes our discussion of ethical issues related to the practice of science under the headings in Rescher's essay. It is by no means an exhaustive list of issues of the types he discussed. There are also other important categories of ethical concerns not mentioned by Rescher. For example, there are ethical concerns related to the relative importance of cooperation and competition in scientific research, and the related issue of the extent to which scientists are obliged to share their data. (This issue is discussed in Chapter 4 in connection with the case study on the discovery of the structure of DNA.)

Rescher explicitly states that he chose to ignore ethical issues related to societal uses of science as opposed to those associated with the practice of science. He claims that issues related to the exploitation of science "are not ethical choices that confront the scientist himself." This very assertion has been a continuing subject of dispute, both within the scientific community itself and among philosophers, historians and sociologists of science. Not only is the obvious point made that scientists are members of society, and are therefore confronted by questions related to the social uses of science, but a more controversial ethical claim is made by those who take issue with Rescher's disclaimer. They assert that scientists, because of their special knowledge, and because of the support they demand from society, have a social obligation to concern themselves with the uses that society makes of science, and to help the lay public make informed choices about technological issues.

Independent of this question concerning the social responsibility of the scientist, we believe that the introduction of ethical issues in the secondary school science curriculum should definitely include those related to the social uses, as well as the "doing" of science. Most students will not become scientists, but all students will need to participate, as citizens, in making informed choices about the uses of science.

We will mention two major contemporary developments in which numerous ethics and values issues related to the uses of science arise. The first is the rapidly developing field of bioengineering, including the application of the powerful techniques associated with modern genetics research. The results of the massive international Human Genome Project will further expand the need to confront a long list of extremely controversial social uses of this work. With increasing frequency, front page headlines and prime time TV news stories draw public attention to these controversies. Should society condone, or even encourage, the cloning of animals, and perhaps human beings? Should prospective parents be able to buy embryos, with specific genetic pedigrees, for implantation into a woman's uterus? Should an individual's genetic code be kept on file by the government, and if so, to whom should it be available?

A second contemporary development that poses numerous ethics and values choices related to applied science is the worldwide concern about the potential conflict between industrial development and the ecological health of the planet. The growing list of serious local, regional and global environmental problems, including the pollution of air, water and land, acid precipitation, soil erosion, stratospheric ozone depletion and global warming, has spawned an increased sense of urgency among the world's people and their political leaders about the present and future health of the earth's ecosystems. Decisions concerning what to do about these problems involve an evaluation of the scientific facts in the context of many other value-laden social and political factors. Should the developing nations of the world be denied the benefits of the technologies that have resulted in serious pollution problems as a result of their widespread use by the developed nations? Is it appropriate to base environmental decisions on cost-benefit analysis when this requires measuring such human values as life, health and beauty in economic terms? Should the use of a chemical be banned when it is estimated to cause one death in a million, ten thousand or one thousand exposed people? What roles should scientists, political leaders and informed citizens play in making environmental decisions?

Further discussion of ethics and values issues related to the "doing" and "using" of science will be found in connection with the examples used in Chapters 2 and 3, and in more detail in association with the case studies presented in Chapter 4. We certainly make no claims to present, in this brief text, a comprehensive treatment of the vast terrain occupied by the intersection between science and ethics/values issues. Our purpose in this and succeeding chapters is to demonstrate the important and essential need to teach science in a manner that illuminates its ethical content. One reason for doing this is the practical result discovered by the teachers who attended our Summer Institutes: it heightens the interest of their students because of the "humanizing" effect of incorporating ethics into science teaching. But we believe that a more important reason is our obligation as teachers to convey to our students the true nature of the human enterprise that we call science. As Rescher states in the final section of his essay, "It is a regrettable fact that too many persons, both scientists and students of the scientific method, have had their attention focused so sharply upon the abstracted 'logic' of an idealized 'scientific method' that this ethical dimension of science has completely escaped their notice. This circumstance seems to me particularly regrettable because it has fostered a harmful myth that finds strong support in both the scientific and the humanistic camps -- namely the view that science is antiseptically devoid of any involvement with human values." (15)

  • (8) John Ziman, An Introduction to Science Studies: The Philosophical and Social Aspects of Science and Technology (Cambridge, England: Cambridge Univ. Press, 1984), p. 1.
  • (9) Ziman, p. 2.
  • (10) Committee on the Conduct of Science of the National Academy of Sciences, On Being a Scientist (Washington: National Academy Press, 1989), p. 1.
  • (11) Ibid, p. 2.
  • (12) Ziman, p. 84.
  • (13) Nicholas Rescher, in Philosophy and Science: the wide range of interaction, Frederick E. Mosedale, editor (Englewood Cliffs, New Jersey: Prentice Hall, 1979).
  • (14) Ibid, p. 317.
  • (15) Rescher, in Mosedale, p. 325.


The Oxford Handbook of Philosophy of Science

12 Ethics in Science

David B. Resnik, PhD, JD, MA, Bioethicist, National Institute of Environmental Health Sciences, Research Triangle Park, North Carolina

  • Published: 02 September 2014

This chapter provides an overview of the ethics of scientific research. Topics covered include: a review of significant historical events, trends, and cases pertaining to scientific ethics; a discussion of the philosophical foundations of science’s ethical norms; a description of science’s ethical norms; and an examination of some common ethical dilemmas that arise in such areas of research as reporting and investigating misconduct, sharing data and materials, assignment of authorship and credit, management of conflict of interest, peer review of publications, research with human beings and animals, and social responsibility. The chapter also discusses conflicts among ethical norms; empirical versus conceptual approaches to studying ethical norms; international variations in research practices; and institutional and government efforts to promote integrity in research.

1 Introduction

Ethics questions, problems, and concerns arise in many different aspects of scientific research ( Shrader-Frechette, 1994 ; Resnik, 1998a ; Macrina, 2005 ; Shamoo and Resnik, 2009 ). Ethical issues, such as reporting misconduct, sharing data, assigning authorship, using animals or humans in experiments, and deciding whether to publish results that could be used by others in harmful ways, impact the day-to-day activities of scientists and frequently draw the attention of the media, policymakers, and the public ( Steneck, 2007 ; National Academy of Sciences, 2009 ; Briggle and Mitcham, 2012 ). These issues occur in many different types of research disciplines, including biology, medicine, physics, chemistry, engineering, psychology, sociology, anthropology, economics, mathematics, and the humanities (including philosophy). Research ethics is a type of professional ethics, similar to medical or legal ethics ( Shamoo and Resnik, 2009 ). Philosophers have tended to focus on fundamental questions concerning the relationship between science and ethics, such as whether value judgments influence concept formation and theory-choice ( Rudner, 1953 ; Kuhn, 1977 ; Longino, 1990 ; Kitcher, 1993 ; Harding, 1998 ; Elliott, 2011 ). While these issues are important, they are not the main concern of this chapter, which will focus, for the most part, on practical and policy issues related to the conduct of science.

2 Historical Background

Questions about the ethics of science are not a uniquely modern concern. In Reflections on the Decline of Science in England, Charles Babbage (1830) scolded his colleagues for engaging in deceptive research practices he described as trimming, cooking, and forging. Trimming, according to Babbage, involves reporting only the data that support one’s hypothesis or theory. Cooking involves conducting an experiment that is designed only to obtain a specific result consistent with one’s hypothesis. Since the experiment is rigged in advance, it is not a genuine test of a hypothesis. Forging involves making up or fabricating data. One of the first major ethical scandals involving science began in 1912, when Charles Dawson claimed to have discovered parts of a skull in the Piltdown gravel bed in Sussex, England, which he said was a “missing link” between humans and apes. Though paleontologists doubted that the skull was genuine because it was inconsistent with the hominid fossil record, the skull was not definitively proven to be a fake until 1953, when laboratory tests indicated that the bones had been artificially aged with chemicals. The artifact was actually composed of a human skull, an orangutan jaw, and chimpanzee teeth ( Shamoo and Resnik, 2009 ).

Ethical issues came to the forefront in World War II. American and British physicists were concerned that Germany would develop an atomic bomb that it could use to win the war. Albert Einstein wrote a letter to President Roosevelt warning him about this threat and urging him to support scientific research to develop an atomic bomb. Roosevelt took this advice and initiated the top secret Manhattan Project in 1941. Robert Oppenheimer led a group of scientists, including Enrico Fermi and Richard Feynman, who worked on developing an atomic bomb in Los Alamos, NM. Germany surrendered to the Allied forces in May 1945, but Japan continued to fight. In August 1945, the US dropped atomic bombs on Hiroshima and Nagasaki, and the Japanese surrendered shortly thereafter. During the war, physicists wrestled with their ethical responsibilities related to the atomic bomb. After the war ended, Robert Oppenheimer and other scientists led an “atoms for peace” movement that sought to stop the proliferation of nuclear weapons and encourage the use of nuclear energy for non-violent purposes, such as electric power generation ( Resnik, 1998a ; Briggle and Mitcham, 2012 ).

During the war crimes trials at Nuremberg, Germany, the world learned about horrific experiments that German physicians and scientists had conducted on concentration camp prisoners. Most of these experiments caused extreme suffering, injury, and death. The prisoners did not consent to these experiments. Some experiments involved exposing subjects to extreme environmental conditions, such as low air pressure or freezing cold temperatures. Other experiments involved injuring subjects with shrapnel or bullets to study wound healing. One of the most notorious researchers, Josef Mengele, injected chemicals into children’s eyes in an attempt to change their eye color, amputated limbs and organs without anesthesia, and subjected prisoners to electroconvulsive therapy. In 1947, at the conclusion of the war trials, the judges promulgated the Nuremberg Code, a set of ten principles for the conduct of ethical experimentation involving human beings. The Code requires that subjects provide their informed consent to participate in research, that experiments should be expected to yield socially valuable results that cannot be obtained by other means, and that experiments minimize suffering and injury. In 1964, the World Medical Association adopted the Helsinki Declaration to provide ethical guidance for medical research, and in the 1970s and 1980s the United States adopted laws and regulations governing research with human subjects ( Shamoo and Resnik, 2009 ).

By the 1980s, most scientists acknowledged that ethical issues arise in the conduct of science, but they viewed these as having mostly to do with science’s interactions with society. Science itself was viewed as objective and ethically pristine. All of this changed, however, when several scandals emerged involving fraud or allegations of fraud in the conduct of federally funded research. The most famous of these became known as the Baltimore Affair, after Nobel Prize–winning molecular biologist David Baltimore, who was associated with the scandal. In 1986, Baltimore, Thereza Imanishi-Kari, and four co-authors associated with the Whitehead Institute published a paper in Cell on using gene transfer methods to induce immune reactions ( Weaver et al., 1986 ). The study was funded by the National Institutes of Health (NIH). Margot O’Toole, a postdoctoral fellow in Imanishi-Kari’s lab, had trouble repeating some of the experiments reported in the paper, and she asked for Imanishi-Kari’s lab notebooks. When O’Toole could not reconcile data recorded in the lab notebooks with the results reported in the paper, she accused Imanishi-Kari of misconduct. After an internal investigation found no evidence of misconduct, the Office of Scientific Integrity (OSI), which oversaw NIH-funded research, began investigating the case. A congressional committee looking into fraud and abuse in federal agencies also investigated the case, and the New York Times reported the story on its front pages. In 1991, the OSI concluded that Imanishi-Kari falsified data and recommended she be barred from receiving federal funding for ten years. However, this finding was overturned by a federal appeals panel in 1996. Though Baltimore was not implicated in misconduct, his reputation was damaged and he resigned as President of Rockefeller University due to the scandal. Imanishi-Kari acknowledged that she was guilty of poor recordkeeping but not intentional wrongdoing ( Shamoo and Resnik, 2009 ).

Other scandals that made headlines during the 1980s included a dispute between Robert Gallo, from the NIH, and Luc Montagnier, from the Pasteur Institute, over credit for the discovery of the human immunodeficiency virus (HIV) and patent claims concerning a blood test for HIV; a finding by the OSI that Harvard Medical School postdoctoral fellow John Darsee had fabricated or falsified data in dozens of published papers and abstracts; and a finding by the NIH that University of Pittsburgh psychologist Stephen Breuning had fabricated or falsified data on dozens of grant applications and published papers. Breuning was convicted of criminal fraud and sentenced to sixty days in prison and five years of probation ( Shamoo and Resnik, 2009 ).

In response to these scandals, as well as growing concerns about the impact of financial interests on the objectivity and integrity of research, federal agencies, chiefly the NIH and National Science Foundation (NSF), took additional actions to address ethical problems in the conduct of science ( Steneck, 1999 ). In 1989, the NIH required that all extramurally funded graduate students receive instruction in the responsible conduct of research (RCR). It later extended this requirement to include post-doctoral fellows and all intramural researchers. In 2009, the NSF expanded its RCR instructional requirements to include all undergraduate or graduate students receiving NSF research support. During the 1990s, federal agencies adopted and later revised policies pertaining to research misconduct. In 2000, federal agencies agreed upon a common definition of research misconduct as fabrication of data, falsification of data, or plagiarism (FFP) ( Office of Science and Technology Policy 2000 ). During the 1990s, federal agencies adopted policies concerning the disclosure and management of financial interests related to research. The federal government also reorganized the OSI and renamed it the Office of Research Integrity (ORI). The ORI expanded the scope of its mission beyond oversight and investigation and started sponsoring research on research integrity and conferences on scientific ethics ( Steneck, 1999 ).

Although the United States is the world’s leader in research ethics oversight, regulation, and policy, many other countries have now adopted rules and guidelines pertaining to scientific integrity and RCR instruction. Scientific journals, professional associations, and universities also have developed rules and policies that cover a number of different topics, including the ethical conduct of research involving humans or animals; authorship, publication, and peer review; reporting and investigating misconduct; and data ownership and management ( Resnik and Master, 2013 ).

3 Philosophical Foundations of Science’s Ethical Norms

Science’s ethical norms are standards of behavior that govern the conduct of research. Many of science’s norms are embodied in laws and regulations, institutional policies, and professional codes of conduct. Science’s ethical norms have two distinct foundations. First, ethical norms in science can be viewed as rules or guidelines that promote the effective pursuit of scientific aims. When scientists fabricate or falsify data, they propagate errors and undermine the search for truth and knowledge. Conflicts of interest in research are an ethical concern because they can lead to bias, fraud, or error (Resnik, 2007). Ethical norms also indirectly foster the advancement of scientific aims by helping to promote trust among scientists, which is essential for collaboration, publication, peer review, mentoring, and other activities. Unethical behavior destroys the trust that is vital to social cooperation among scientists (Merton, 1973; Whitbeck, 1995).

Ethical norms also help to foster the public’s support for science. Science takes place within a larger society, and scientists depend on the public for funding, human research participants, and other resources ( Ziman, 2000 ). Unethical behavior in science can undermine funding, deter people from enrolling in studies, and lead to the enactment of laws that restrict the conduct of research. To receive public support, scientists must be accountable to the public and produce results that are regarded as socially valuable ( Resnik, 1998a ). Since science’s ethical norms stem from the social aspects of research, they can be viewed as part of the social epistemology of science ( Longino, 1990 ; Resnik, 1996 ; Goldman, 1999 ).

Second, science’s ethical norms are based on broader, moral norms. For example, rules against data fabrication and falsification can be viewed as applications of the principle of honesty, a general rule that applies to all moral agents (Resnik, 1998a). Rules for assigning credit in science can be viewed as applications of a moral principle of fairness, and rules pertaining to the treatment of human subjects in research can be viewed as applications of respect for autonomy, social utility, and justice (Shamoo and Resnik, 2009). Although science’s ethical norms are based on moral norms, science’s norms are not the same as the norms that apply to all people. For example, honesty in science differs from honesty in ordinary life because it is more demanding. It is morally acceptable, one might argue, to stretch the truth a little bit when telling a friend about a fish one caught at a lake. However, even a little bit of stretching of the truth is unethical in science because minor changes in data can impact results and mislead the scientific community. Further, honesty in science is different from honesty in ordinary life because scientific honesty is defined by technical rules and procedures that do not apply to ordinary life. For example, falsification of data includes the unjustified exclusion of data from research. Data may be excluded only when they result from experimental error or are statistical outliers. To determine whether a data point from a scientific study can be excluded, one must therefore have an expert understanding of the field of research.
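
To make this concrete, the following is a minimal sketch in Python (with made-up numbers) of one common convention, Tukey’s 1.5 × IQR rule, applied so that exclusions are flagged and reported rather than silently dropped. The rule and threshold here are illustrative assumptions; any real exclusion criterion must be justified by the standards of the field and, ideally, specified before the data are inspected:

```python
import numpy as np

def flag_outliers_iqr(values, k=1.5):
    """Flag points more than k*IQR beyond the quartiles (Tukey's rule).

    Returns a boolean mask rather than deleting anything, so every
    exclusion can be reported and justified in the write-up.
    """
    q1, q3 = np.percentile(values, [25, 75])
    fence = k * (q3 - q1)
    return (values < q1 - fence) | (values > q3 + fence)

measurements = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 14.7])  # made-up data
mask = flag_outliers_iqr(measurements)
print("excluded:", measurements[mask])    # report what was dropped, and why
print("retained:", measurements[~mask])
```

The design point is that the code separates flagging from deleting: the excluded points remain available to be disclosed, which is exactly what distinguishes a justified exclusion from falsification by omission.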

4 Science’s Ethical Norms

Ethical norms that are widely recognized by working scientists and endorsed by philosophers and other scholars include ( Resnik, 1998a ; Shamoo and Resnik, 2009 ; Elliott, 2012 ):

Honesty . Do not fabricate or falsify data. Honestly report the results of research in papers, presentations, and other forms of scientific communication.

Openness . Share data, results, methods, and materials with other researchers.

Carefulness . Keep good records of data, experimental protocols, and other research documents. Take appropriate steps to minimize bias and error. Subject your own work to critical scrutiny and do not overstate the significance of your results. Disclose the information necessary to review your work.

Freedom . Support freedom of inquiry in the laboratory or research environment. Do not prevent researchers from engaging in scientific investigation and debate.

Due credit . Allocate credit for scientific work fairly.

Respect for colleagues . Treat collaborators, students, and other colleagues with respect. Do not discriminate against colleagues or exploit them.

Respect for human research subjects . Respect the rights and dignity of human research subjects and protect them from harm or exploitation.

Animal welfare . Protect and promote the welfare of animals used in research.

Respect for intellectual property . Do not plagiarize or steal intellectual property. Respect copyrights and patents.

Confidentiality . Maintain the confidentiality of materials that are supposed to be kept confidential, such as articles or grant proposals submitted for peer review, personnel records, and so on.

Legality . Comply with laws, regulations, and institutional policies that pertain to research.

Stewardship . Take proper care of research resources, such as biological samples, laboratory equipment, and anthropological sites.

Competence . Maintain and enhance your competence in your field of study. Take appropriate steps to deal with incompetence in your profession.

Social responsibility . Conduct research that is likely to benefit society; avoid causing harm to society. Engage in other activities that benefit society.

A few comments about the above list of scientific norms are in order. First, the list is not intended to be complete; there may be other norms not included here. However, the list probably captures most of science’s important norms.

Second, most of these norms imply specific rules and guidelines that are used to apply the norms to specific situations. For example, respect for human research subjects implies rules pertaining to informed consent, risk minimization, confidentiality, and so on. Due credit implies rules concerning authorship on scientific papers; animal welfare implies rules for minimizing the pain and suffering of animals; and so on.

Third, the norms pertain not only to individuals but also to institutions and organizations, which play a key role in promoting and enforcing ethical conduct in research (Shamoo and Resnik, 2009). For example, university contracts with pharmaceutical companies can help to ensure that researchers have access to data, and that companies cannot suppress the publication of data or results. University policies pertaining to the reporting and investigation of illegal or unethical activity can help ensure that researchers comply with the law and abide by ethical standards. Journal policies concerning authorship can play an important role in sharing credit fairly in scientific publications. Conflict of interest rules adopted by funding agencies can help ensure that grants are reviewed fairly, without bias.

Fourth, sometimes scientific norms conflict with each other or with other moral or social norms. For example, openness may conflict with respect for human research subjects when scientists are planning to share private data about individuals. One way of handling this conflict is to remove information that can identify individuals (such as name or address) from the data. Alternatively, one can share data by requiring recipients to sign an agreement in which they promise to keep the data confidential. When a conflict among norms arises in a particular situation, scientists should use their reasoning and good judgment to decide how best to resolve the conflict in light of the relevant information and available options (Shamoo and Resnik, 2009). The possibility of conflicts among ethical norms in science implies that RCR instruction must involve more than teaching students how to follow rules: it must also help students learn to use their reasoning and judgment to deal with ethical dilemmas (Shamoo and Resnik, 2009). Ethical dilemmas in research will be discussed at greater length below.

Fifth, science’s ethical norms can be studied from an empirical or conceptual perspective. Empirical approaches to scientific norms attempt to describe the norms that are accepted by scientists and explain how they function in the conduct of research. Sociologists, psychologists, and historians have gathered data on scientific norms and developed explanatory hypotheses and theories. For example, Brian Martinson and colleagues surveyed 3,247 NIH-funded scientists at different stages of their careers concerning a variety of behaviors widely regarded as unethical or ethically questionable. They found that 0.3% admitted to falsifying or cooking data in the last three years; 0.3% admitted to ignoring major aspects of human subjects requirements; 1.4% said they had used someone else’s ideas without permission or giving proper credit; 10% said they had inappropriately assigned authorship credit; and 27.5% admitted to keeping poor research records (Martinson et al., 2005). In the 1940s, Robert Merton (1973) described four different scientific norms that he regarded as shaping scientific behavior: universalism, communism, disinterestedness, and organized skepticism. He later added originality to his list of norms (Ziman, 2000).

Conceptual approaches examine the justification, definition, and meaning of scientific norms. Conceptual approaches attempt to evaluate and criticize norms that are accepted by the scientific community rather than simply describe them. Philosophers, ethicists, and working scientists have taken a conceptual approach to scientific norms. Philosophers have tended to focus on epistemic norms, such as simplicity, explanatory power, testability, empirical support, and objectivity, rather than ethical ones ( Quine and Ullian, 1978 ; Kitcher, 1993 ; Thagard, 1993 ). Only a handful of philosophers have conducted extensive studies of science’s ethical norms ( Shrader-Frechette, 1994 ; Resnik, 1998a ; Whitbeck, 1998 ). Working scientists have tended to address specific normative issues in research, such as authorship ( Rennie et al., 1997 ), misconduct ( Kornfeld, 2012 ), conflict of interest ( Morin et al., 2002 ), data integrity ( Glick and Shamoo, 1993 ), and human subjects research ( Hudson et al., 2013 ).

Sixth, variations in scientific practices raise the issue of whether there is a core set of norms that applies to all sciences at all times. For example, different scientific disciplines have different practices concerning authorship ( Shamoo and Resnik, 2009 ). In some biomedical fields, authorship is granted for providing biological samples for analysis; in other fields, providing biological samples, without making any other contribution, would not merit authorship ( Shamoo and Resnik, 2009 ). There are also variations in informed consent practices for research. In the United States and other industrialized nations, the research subject must provide consent. If the subject is incapable of providing consent, the subject’s legal representative may provide it. However, in some African communities tribal elders make all the important decisions, including consent for medical treatment or research. In other communities, a woman does not provide consent for herself, but consent may be provided by her husband (if she is married) or an older relative (such as her father) if she is not ( Shamoo and Resnik, 2009 ).

There are also variations in scientific practices across time. Robert Millikan won the Nobel Prize in Physics in 1923 for measuring the smallest electrical charge (i.e., the charge on an electron) and for his work on the photoelectric effect. To measure the smallest electrical charge, Millikan sprayed oil drops through electrically charged plates. When a drop was suspended in the air, the electrical force pulling the drop up was equivalent to the force of gravity pulling it down. By calculating these forces, Millikan was able to determine the smallest possible charge. Historian of science Gerald Holton (1978) examined Millikan’s laboratory notebooks for these experiments and compared them to the data presented in the published paper. In the paper, Millikan said that he had reported all the data. However, he did not report 49 out of 189 observations (26%). Holton also discovered that Millikan had graded his observations as “good,” “fair,” and “poor.” Other physicists who have attempted to measure the smallest electrical charge have obtained results very close to Millikan’s. Even though Millikan got the right answer, some commentators have argued that Millikan acted unethically by excluding observations that should have been reported. At the very least, he should have mentioned in the paper that not all observations were reported and explained why (Broad and Wade, 1983). Others have argued that Millikan should be judged by the ethical standards of his own time. While it would be standard practice today to explain why some data were excluded, this was not the practice in Millikan’s time. Science has become more honest, open, and rigorous since then (Franklin, 1981).
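
In idealized form, the force balance Millikan exploited is simple. The sketch below omits the buoyancy and Stokes-drag corrections his actual analysis required (the drop’s mass m was itself inferred from its fall velocity):

```latex
% Suspended drop: upward electric force balances gravity.
% q = drop charge, m = drop mass, g = gravitational acceleration,
% E = V/d = field between parallel plates at voltage V and separation d.
qE = mg
\quad\Longrightarrow\quad
q = \frac{mg}{E} = \frac{m g d}{V}
```

Measured over many drops, the inferred charges cluster at integer multiples of a smallest value, the elementary charge (about 1.6 × 10⁻¹⁹ C).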

How can one account for these social and historical variations in scientific norms? Two different questions arise. The first is empirical: are there variations in the norms accepted by scientists? The answer to this question appears to be in the affirmative, given the examples previously discussed. The second is normative: should there be variations in scientific norms? One might argue that variations in scientific norms are ethically acceptable, provided that they are based on different interpretations of core ethical standards ( Macklin, 1999 ). This would be like claiming that variations in laws pertaining to murder in different societies are acceptable even though a general prohibition against murder should apply universally. For example, in all fields of research, authorship should be awarded for making a significant intellectual contribution to a research project, even though different disciplines may interpret “significant contribution” differently. Informed consent is an ethical norm that applies to all human subjects research even though it may be interpreted differently in different cultures. Honesty applies to all fields of research, but different disciplines may have different understandings of what it means to fabricate or falsify data. Honesty was still an important concern during Millikan’s time even though the standards for reporting data may not have been the same as they are today ( Shamoo and Resnik, 2009 ).

5 Ethical Dilemmas and Issues in Research

As noted previously, ethical dilemmas can arise when scientific norms conflict with each other or with other social or moral norms. The following are some of the ethical dilemmas and issues that frequently arise in research.

Research Misconduct.

Research misconduct raises a number of issues. The first one is definitional: what is research misconduct? In general, research misconduct can be viewed as behavior that is widely regarded as highly unethical. In many cases, a finding of misconduct has legal and career consequences. For example, an individual who is found to have committed misconduct while conducting federally funded research may be barred from receiving federal funding for a period of time. Individuals who are found to have committed misconduct may also have their employment terminated by their institution. In extreme cases, misconduct could lead to prosecution for criminal fraud. As noted earlier, US federal agencies have defined misconduct as fabrication, falsification, or plagiarism. However, before the agencies agreed upon this common definition, many agencies also included “other serious deviations” from accepted scientific practices in the definition of misconduct. Many US universities still include this category of behavior in the definition of misconduct. Additionally, other countries define misconduct differently. While most include FFP in the definition of misconduct, some include significant violations of rules pertaining to research with humans or animals in the definition (Resnik, 2003). One might object that the US government’s definition is too narrow to deal with all the highly unethical behaviors that undermine the integrity of scientific research. However, if the definition of misconduct is too broad, it may be difficult to enforce, due to lack of clarity and the sheer number of violations that need to be handled (Resnik, 2003).

Another difficult issue confronts those who suspect that someone has committed misconduct. Although scientists have an ethical obligation to report suspected misconduct in order to help protect the integrity of research, honoring this obligation often comes at some personal cost. Whistleblowers will usually need to expend considerable time and effort in providing testimony, and they may face the threat of retaliation or harassment. They may develop a reputation as a troublemaker and have difficulty finding employment ( Malek, 2010 ). Of course, not reporting misconduct may also have adverse consequences. If an individual is involved in a research project in which they suspect a collaborator of misconduct, they could be implicated if the misconduct is discovered by someone else. Even if the misconduct is not discovered, the individual would have to live with the fact that they allowed fraudulent or erroneous research to be published.

A third issue concerns the distinction between misconduct and scientific disagreement ( Resnik and Stewart, 2012 ). Sometimes scientists accuse their peers of misconduct because they disagree with the assumptions their peers have made or the methods they have used. These sorts of disagreements frequently occur in research. Science sometimes makes significant advances when researchers propose radical ideas or use untested methods ( Kuhn, 1962 ). Of course, radical ideas and untested methods may also lead researchers astray. In either case, the best way to handle these sorts of disagreements is through honest and open scientific debate, rather than by accusing one’s peers of misconduct. The US federal definition of misconduct distinguishes between misconduct and scientific disagreement and error. Misconduct involves intentional or reckless deviation from commonly accepted scientific standards ( Office of Science and Technology Policy, 2000 ).

Sharing Data and Materials.

Scientists frequently receive requests to share data, methods, and materials (such as chemical reagents or biological samples). As noted earlier, the norm of openness obliges scientists to share their work. Sharing is important in science to make prudent use of resources and achieve common goals. Sharing is also essential to scientific criticism and debate. Knowledge can be obtained more readily when scientists work together (Munthe and Welin, 1996). However, sharing may conflict with the desire to achieve scientific credit or priority (Shamoo and Resnik, 2009). If a chemistry laboratory shares preliminary data with another laboratory working on the same topic, the other laboratory may publish first and win the race for priority. If a molecular biology laboratory has devoted considerable time and effort toward developing a transgenic mouse model of a human disease, it may be hesitant to share the mouse with other laboratories unless it can receive credit for its work. A social scientist who has amassed a considerable amount of demographic and criminological data concerning a population may not want to share this data with other researchers if she is concerned that they could use it to publish papers on topics on which she is planning to publish. Scientists have developed different strategies for dealing with these dilemmas. For example, most scientists do not share preliminary data; data are shared widely only after publication. Many scientists also reach agreements with researchers who receive data or materials. The molecular biology lab that developed the transgenic mouse could collaborate with other laboratories that use the mouse and receive some credit. The social scientist with the large database could share the data with the understanding that the recipients would not publish papers on certain topics.

Authorship.

As discussed previously, authorship raises ethical issues. Indeed, authorship disputes are some of the most common ethical controversies in everyday science (Shamoo and Resnik, 2009). Although deciding the authorship of a scientific paper is not as important to society as protecting human subjects from harm, it is very important for scientists, since careers may hang in the balance. The adage “publish or perish” accurately describes the pressures that academic researchers face. The number of authors per scientific paper has increased steadily since the 1950s (Shamoo and Resnik, 2009). For example, the number of authors per paper in four of the top medical journals increased from 4.5 in 1980 to 6.9 in 2000 (Weeks et al., 2004). Some papers in experimental physics list more than 1,000 authors (King, 2012). One of the reasons why the number of authors per paper is increasing is that science has become more complex, collaborative, and interdisciplinary in the last few decades. While this is clearly the case, the desire for career advancement undoubtedly helps drive this trend (Shamoo and Resnik, 2009).

Two different authorship issues frequently arise. First, sometimes researchers are not listed as authors on a paper even though they have made a significant intellectual contribution. Students and technicians are especially vulnerable to being exploited in this way, due to their lack of power. Indeed, some senior scientists believe that technicians should not be included as authors because they are paid to do technical work and do not need publications (Shamoo and Resnik, 2009). This attitude is unjustified, since authorship should be based on one’s contributions, not on one’s professional status. Sometimes individuals who have made significant contributions to the paper are not listed as authors in order to make the research appear unbiased. Pharmaceutical companies frequently hire professional writers and statisticians to help prepare articles submitted for publication involving research they have funded. One study found that 7.9% of articles published in six leading medical journals had ghost authors (Wislar et al., 2011).

A second type of ethical problem related to authorship occurs when individuals are listed as authors who have not made a significant intellectual contribution to the research. This phenomenon, known as honorary authorship, is fairly common: one study found that 17.6% of papers published in six leading medical journals have at least one honorary author ( Wislar et al. 2011 ). There are several reasons why honorary authorship is fairly common. First, some laboratory directors insist that they be named as an author on every paper that comes out of their lab, even if they have not made a significant intellectual contribution to the research. Second, some researchers demand that they receive authorship credit for supplying data or materials, even if they make no other contribution to the research. Third, some researchers include well-known scientists on the authorship list in order to enhance the status of the paper and increase its chances of being read or taken seriously. Fourth, some researchers have informal agreements to name each other as authors on their papers, so they can increase their publication productivity ( Shamoo and Resnik, 2009 ).

Most scientific journals have adopted authorship policies to deal with the previously mentioned problems. Policies attempt to promote two important values: fair assignment of credit and accountability (Resnik, 1997). Assigning credit fairly is important to make sure that researchers receive the rewards they deserve. Accountability is important for ensuring that individuals who are responsible for performing different aspects of the research can explain what they did, in case questions arise concerning the validity or integrity of the results. If reviewers or readers detect some problems with the data, it is important to know who conducted the experiments that produced the data. Many journals follow authorship guidelines similar to those adopted by the International Committee of Medical Journal Editors (ICMJE). According to the ICMJE (2013) guidelines, authorship should be based on:

1. Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work; AND

2. Drafting the work or revising it critically for important intellectual content; AND

3. Final approval of the version to be published; AND

4. Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

The ICMJE also recommends that each author be accountable for the work he or she has done and should be able to identify parts of the research for which other authors are accountable. Individuals who meet some but not all of the authorship criteria should be listed in the acknowledgments section of the paper ( ICMJE, 2013 ). Some journals go a step farther than the ICMJE recommendations and require authors to describe their specific contributions to the research ( Shamoo and Resnik, 2009 ).
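
The conjunctive structure of these criteria is worth emphasizing: authorship requires all four, while meeting only some of them warrants acknowledgment instead. The toy sketch below, in Python, illustrates that logic; the field names are illustrative shorthand, not an official ICMJE schema:

```python
ICMJE_CRITERIA = (
    "substantial_contribution",  # conception/design, or acquisition/analysis/interpretation of data
    "drafting_or_revising",      # drafted the work or revised it critically
    "final_approval",            # approved the version to be published
    "accountability",            # agrees to be accountable for all aspects of the work
)

def credit_for(contributor: dict) -> str:
    """Authorship is a conjunction: all four criteria must hold.

    Contributors meeting some but not all criteria belong in the
    acknowledgments section rather than the author list.
    """
    met = [contributor.get(c, False) for c in ICMJE_CRITERIA]
    if all(met):
        return "author"
    return "acknowledgment" if any(met) else "no credit"

# A statistician who analyzed data but never approved the final version:
print(credit_for({"substantial_contribution": True, "drafting_or_revising": True}))
# -> acknowledgment
```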

Conflict of Interest.

A conflict of interest (COI) in research is a situation in which an individual has financial, professional, or other interests that may compromise his or her judgment or decision-making related to research. COIs are ethically problematic because they can lead to bias, misconduct, or other violations of ethical or scientific standards, and can undermine the public’s trust in research (Resnik, 2007). The most common types of interests that create COIs include financial relationships with companies that sponsor research, such as funding, stock ownership, or consulting arrangements, and intellectual property related to one’s research. An institution can also have a COI if the institution or its leaders have interests that could compromise institutional decision-making. Non-financial COIs can occur when a scientist has a professional or personal relationship with a researcher whose work he or she is reviewing. For example, if a scientist reviews a grant application from another researcher at the same institution, the scientist would have a COI. (COIs in peer review will be discussed more below.)

Scientific journals, federal agencies, and research institutions have rules that require COI disclosure. Such disclosure can help to minimize ethical problems by providing interested parties with information that can be useful in evaluating research (Resnik and Elliott, 2013). For example, if the author of a review article on hypertension claims that a drug made by company X is superior to the alternatives, and the author has a significant amount of stock in company X, then a reader of the article may be skeptical of the author’s conclusions. Disclosure can also help promote public trust in research. If a researcher has an undisclosed COI that later comes to light, the revelation may cause members of the public to distrust the researcher because they feel that they have been deceived. Disclosing COIs brings problematic relationships out into the open, for all to see and assess.

Sometimes disclosure may not be a strong enough response to a COI. If a COI is particularly egregious or difficult to manage, the best option may be to prohibit the COI. For example, most grant agencies do not allow reviewers to evaluate research proposals from colleagues at the same institution or from former students or supervisors. However, sometimes prohibiting a COI has adverse consequences for science or society that are worse than the consequences of allowing the COI. For example, suppose that a speech pathology researcher, who is a stutterer, has invented a device to help control stuttering. He has several patents on the invention, which he has transferred to the university in exchange for royalties. He has also formed a start-up company, with university support, to develop, manufacture, and market the device. The researcher and the university both own a significant amount of stock in the company. The researcher is planning to conduct a clinical trial of the device on campus.

One could argue that this COI should be prohibited, given this complex web of individual and institutional interests in the research. However, prohibiting the COI might adversely impact the development of this device, because the researcher is best qualified to conduct the clinical trial. A clinical trial of the device conducted by another researcher on another campus is not a feasible option. Since development of the device can benefit science and the public, the best option might be to permit the trial to take place on campus but require that it has additional oversight from a group of independent investigators, who could make sure that scientific and ethical standards are upheld ( Resnik, 2007 ).

Peer Review.

Peer review is a key part of science’s self-correcting method (Ziman, 2000). Journals and research funding organizations use peer review to ensure that papers (or research proposals) meet appropriate scientific and ethical standards. Peer review can improve the quality of papers and proposals and promote scientific rigor and objectivity. The Philosophical Transactions of the Royal Society of London instituted the first peer review system in 1665, but the practice did not become common in science until the mid-twentieth century (Resnik, 2011). When a paper is submitted to a journal, it will usually be reviewed by an editor. If the editor determines that the paper falls within the journal’s scope and has the potential to make an important contribution to the literature, he or she will send it to two or more reviewers who have expertise in the paper’s topic area. After receiving reports from the reviewers, the editor will decide whether it should be published, revised, or rejected. The entire peer review process is supposed to be confidential in order to protect the author’s unpublished work.

Although most scientists believe that the peer review system is an indispensable part of the evaluation of research, peer review is far from perfect. Studies have shown that reviewers often fail to read manuscripts carefully, provide useful comments, or catch obvious errors. Reviewers may be influenced by various biases, such as the author’s gender, ethnicity, geographic location, or institutional affiliation ( Resnik, 2011 ). Studies have also shown that reviewers often have very different assessments of the same paper: one may recommend that it be rejected, while another may recommend acceptance without any changes ( Resnik, 2011 ). Finally, research indicates that a number of ethical problems can impact peer review: reviewers may breach confidentiality; unnecessarily delay the review process in order to publish an article on the same topic or stifle competitors; use data or methods disclosed in a manuscript without permission; make personal attacks on the authors; require unnecessary references to their own publications; and plagiarize manuscripts ( Resnik et al., 2008 ).

Scientists and scholars have proposed some reforms to improve the reliability and integrity of peer review. First, reviewers should inform editors if they have any COIs related to the paper. For example, a reviewer would have a conflict of interest if he or she is from the same institution as the author, has a significant professional or personal relationship to the author, or has a financial interest related to the manuscript, such as ownership of stock in a company sponsoring the research or a competitor. The editors can review disclosures and decide whether the reviewer should review the paper. In some cases, it may be desirable to have a reviewer review a paper despite a COI, due to the limited availability of other experts to review the paper ( Shamoo and Resnik, 2009 ).

A second reform is to experiment with alternative methods of peer review. Most scientific disciplines use a single-blind approach: the reviewers are told the identities of the authors but not vice versa. Some journals use a double-blind approach: neither the reviewers nor the authors are told the others’ identity. One of the advantages of this approach is that it could help to minimize gender, ethnic, geographic, or institutional biases, as well as COIs. However, studies have shown that more than half of the time reviewers are able to accurately identify authors when they are blinded ( Resnik, 2011 ). Hence, a double-blind system may give editors and authors a false sense of security. Some journals have begun using an un-blinded (or open) approach: both authors and reviewers are told the others’ identity. The chief advantage of this approach is that it will hold reviewers accountable. Reviewers will be less likely to plagiarize manuscripts, unnecessarily delay publication, or commit other ethical transgressions, if they know that the authors can hold them accountable. They may also be more likely to read the manuscript carefully and provide useful comments. One of the problems with the open approach is that scientists may not want to reveal their identities to the authors out of fear of possible retribution. Reviewers also may not give candid comments if their identities will be revealed. Since the current system has numerous flaws, journals should continue to experiment with different methods of peer review and other reforms ( Resnik, 2011 ).

Research with Human Subjects.

Research with human subjects raises many different ethical issues. Since entire volumes have been written on this topic, only a couple of issues will be discussed in this chapter (see Shamoo and Resnik, 2009 , Emanuel et al., 2011 ). Various ethical guidelines, such as the Nuremberg Code and Helsinki Declaration (mentioned above), and legal regulations, such as the US Common Rule ( Department of Health and Human Services, 2009 ), require researchers to take steps to protect human subjects, such as obtaining informed consent, minimizing risks, and protecting confidentiality. However, these regulations and guidelines do not cover every situation and are subject to interpretation. Hence, ethical dilemmas can still arise even when researchers comply with applicable rules.

Most of the ethical dilemmas in research with human subjects involve a conflict between protecting the rights and welfare of research participants and advancing scientific knowledge and benefiting society. A recurring example of this dilemma occurs when researchers use placebo control groups in clinical trials. The placebo effect is a well-documented phenomenon in which individuals tend to respond better when they think they are getting an effective treatment. To control for this effect, researchers can use a placebo group when testing a new drug or other treatment. When this happens, neither the subjects nor the investigators will be told who is receiving a placebo or the experimental treatment. To reduce bias, subjects are randomly assigned to different groups ( Emanuel and Miller, 2001 ).

Using a placebo control group does not raise any significant ethical issues when there is no effective treatment for a disease. However, it does create an ethical dilemma if there is an effective treatment, because some subjects would be foregoing treatment in order to help advance medical research. Foregoing treatment could adversely impact their health and violate the physician’s ethical obligation to provide his or her patients with medical treatment. Some commentators have argued that placebos should never be used when there is an effective treatment, because promoting the welfare of the subjects should take priority over advancing research. If there is an effective treatment for a disease, then the experimental treatment should be compared to a currently accepted effective treatment. Others have argued that it is more difficult to conduct scientifically rigorous clinical trials that compare two treatments, because such studies will usually find only small differences between groups. In statistics, the sample size needed to obtain significant results grows rapidly as the effect size shrinks (roughly with the inverse square of the standardized effect): the smaller the effect, the larger the sample needs to be. By using a placebo control group in a clinical trial, one can conduct a smaller study with scientific rigor. Smaller studies are easier to execute and take less time and money than larger ones. They also expose fewer subjects to risks (Emanuel and Miller, 2001). For these reasons, some commentators argue that placebos can be used even when effective treatments exist in two situations: (a) foregoing an effective treatment is not likely to cause subjects any significant pain or permanent harm and (b) the new treatment has the potential to significantly benefit society. For example, a placebo group could be used to study the effectiveness of a pain-control medication for moderate arthritis, because subjects could forego arthritis treatment during a clinical trial without significant pain or permanent harm. Placebo control groups could also be used in testing a treatment that has potential to significantly benefit society, such as an inexpensive method for preventing mother-child transmission of HIV (Resnik, 1998b).
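
To make the statistical point concrete, here is a minimal sketch in Python of the standard normal-approximation formula for the number of subjects needed per arm to detect a standardized difference in means (Cohen’s d) with a two-sided test at the stated significance level and power. The numbers are illustrative; a real trial’s sample size depends on the outcome type, the test used, and design details such as dropout:

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Approximate subjects per arm for a two-sample comparison of means
    (two-sided z-test): n = 2 * (z_{1-alpha/2} + z_{power})**2 / d**2.
    """
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / effect_size ** 2)

# Halving the effect size roughly quadruples the required sample:
for d in (0.8, 0.4, 0.2):
    print(f"d = {d}: about {n_per_arm(d)} subjects per arm")
```

Halving the effect size roughly quadruples the required sample, which is part of why head-to-head comparisons of two effective treatments (small differences) demand far more subjects than placebo-controlled comparisons (larger differences).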

Another situation that creates a conflict between the rights/welfare of human subjects and the advancement of knowledge/social benefit is the use of deception in behavioral research. Subjects in an experiment may alter their behavior if they know what is being studied. Psychologists sometimes use some form of deception to minimize this type of bias. The most famous experiments involving deception were conducted in the early 1960s by Stanley Milgram, a psychologist then at Yale University. Milgram wanted to understand why people follow orders given by those they view as authorities. This was an important topic to address, especially since many German soldiers, prison guards, and government officials who committed atrocities during World War II claimed they were only following orders. Milgram’s experiments involved three people: a scientist, a helper, and a test-taker. The test-taker was hooked up to some electrodes connected to a machine that would produce a shock. The helper would ask the test-taker a series of questions. If the test-taker got the wrong answer, the scientist instructed the helper to administer a shock. The test-taker would react by expressing discomfort or pain. The shock would be increased each time, eventually reaching levels classified as “dangerous” or “potentially life-threatening.” Milgram found that a majority of the helpers continued to give shocks when instructed to do so, even when the shocks reached “life-threatening” levels. They also gave shocks even when the test-takers pleaded with the scientist to end the experiment. Whenever the helpers expressed doubts about giving a shock, the scientist would ask them to continue and stress the importance of completing the experiment. Fortunately, no one got a real shock; the test-takers were faking pain and discomfort. The point of the experiment was to see whether the helpers would obey an authority (the scientist), even when they were asked to harm another person. Deception was a necessary part of the experiment, because it would not have been possible to test the helpers’ tendency to obey authority if they had known that the test-takers were not getting shocked (Milgram, 1974).

Milgram debriefed the helpers (the actual subjects) after the experiments were over by telling them what had really taken place. Some of them were distraught after the experiments ended. Although they were glad that they hadn’t hurt anyone, they were upset that they had been willing to harm someone in order to complete the experiment. They found out something about their character they did not want to know. Many of them were upset that they had been manipulated to answer a scientific question. Milgram (1974) defended the use of deception in these experiments on the grounds that (a) it was necessary to answer an important scientific question; (b) it did not significantly harm the subjects; (c) the subjects did consent to be in an experiment; and (d) the subjects were debriefed. However, critics have argued that the experiments significantly harmed some of the subjects and that the knowledge Milgram sought could have been obtained without resorting to deception. For example, researchers could have developed a role-playing game with different subjects placed in positions of authority. By observing people play the game, they would learn about obedience to authority (Shamoo and Resnik, 2009).

Research with Animals.

The most important ethical question with research involving animals is whether it should be conducted at all. Scientists who work with animals regard this research as necessary to produce knowledge that benefits people. Animal research can provide us with a basic knowledge of physiology, anatomy, development, genetics, and behavior that is useful for understanding human health and treating human diseases. Animal experiments also play an important role in the testing of new drugs, medical devices, and other medical products. Indeed, research laws and guidelines require that investigators provide regulatory authorities with data from animal studies before they can test a new drug in human beings (Shamoo and Resnik, 2009). Although animal research has produced many important benefits for human societies, it has been embroiled in controversy for many years. Animal rights activists have expressed their opposition to animal research by protesting on university campuses, disrupting experiments, infiltrating laboratories, turning laboratory animals loose in the wild, and threatening researchers. Most universities now tightly control access to animal research facilities to protect experiments, laboratory animals, and researchers (Shamoo and Resnik, 2009).

Peter Singer (1975), Tom Regan (1983), and other philosophers have developed arguments against animal experimentation. Singer’s argument is based on three assumptions. First, many species of animals used in laboratory research can suffer and feel pain. Second, animal pain and suffering should be given equal moral consideration to human pain and suffering. Third, the right thing to do is to produce the greatest balance of benefits over harms for all beings deserving of moral consideration (utilitarianism). From these three assumptions, Singer concludes that animal experimentation is unethical because the benefits it produces for people do not outweigh the pain and suffering it causes to animals (Singer, 1975). Observational research on animals (i.e., studying them in the wild) may be acceptable, but research that involves inflicting pain or suffering or causing disability or death should not be allowed.

Regan (1983) approaches the topic from a moral rights perspective, not a utilitarian one. He argues that the function of moral rights is to protect interests: the right to life protects one’s interest in life; the right to property protects one’s property interest; etc. According to Regan, animals have rights because they have interests, such as the interest in living, obtaining food, and avoiding suffering. Although research with human subjects may adversely impact their interests, this research does not violate their rights because people can make an informed decision to sacrifice some of their interests in order to benefit science and society. Animals cannot give informed consent, however. Hence, research on animals is unethical because it violates their rights. Like Singer, Regan would have no problems with studying animals in the wild, but he would object to controlled experiments that cause pain, suffering, and death.

While evaluating Singer’s and Regan’s arguments is beyond the scope of this chapter (see LaFollette and Shanks, 1996; Garrett, 2012), it is worth noting that other issues arise in animal research besides the fundamental question of whether it should be done at all. Scientists who work with animals and policymakers have developed laws and guidelines for protecting animal welfare in research. These rules deal with issues such as feeding, living conditions, the use of analgesia and anesthesia, and euthanasia. Some widely recognized ethical principles that promote animal welfare in research are known as the three Rs: reduction (reduce the number of animals used in research wherever possible); refinement (refine animal research techniques to minimize pain and suffering); and replacement (replace animal experiments with other methods of obtaining knowledge, such as cell studies or computer models, wherever possible). One of the more challenging ethical issues for scientists with no moral objections to experimenting on animals involves the conflict between minimizing pain/suffering and advancing scientific knowledge (LaFollette and Shanks, 1996). For example, consider an experiment to study a new treatment for spinal injuries. In the experiment, researchers partially sever the spinal cords of laboratory mice. The experimental group receives the treatment; the control group receives nothing. The scientists measure variables (such as the ability to walk, run a maze, etc.) that reflect the animals’ recovery from the spinal injury in both groups. Because analgesia medications could interfere with the healing process and confound the data, they will not be administered to these animals. Thus, it is likely that the animals will experience considerable pain or discomfort in this experiment. Does the social value of this experiment—to develop a treatment for spinal injuries—justify inflicting harm on these animals?

Social Responsibility.

The final group of issues considered in this chapter pertains to scientists’ social responsibilities. Scientists interact with the public in many different ways: they give interviews to media, provide expert testimony in court, educate the public about their work, and advocate for policies informed by their research. Some of the most challenging issues involve conducting and publishing potentially harmful research. As noted earlier, the nuclear weapons research conducted at Los Alamos was kept secret to protect national security interests. Military research raises many issues that will not be considered here, such as whether scientists should work for the military and whether classified research should be conducted on university campuses (see Resnik, 1998b ). This chapter will focus on issues related to conducting and publishing non-military research that may be used for harmful purposes.

Biomedical researchers have published numerous studies that have potential public health benefits but also may be used by terrorists, criminals, or rogue nations to develop weapons that could pose a grave threat for national security, society, and the economy (also known as “dual use research”). Some examples include studies that could be used to develop smallpox viruses that can overcome the human immune system’s defenses ( Rosengard et al., 2002 ); a demonstration of how to make a polio virus from available genetic sequence data and mail-order supplies ( Cello et al., 2002 ); a paper describing how to infect the US milk supply with botulinum toxin ( Wein and Liu, 2005 ); and research on genetically engineering the H5N1 avian flu virus so that it can be transmissible by air between mammals, including humans ( Imai et al., 2012 ; Russell et al., 2012 ).

Dual-use research raises several concerns for scientists, institutions, journals, and funding agencies: (1) Should the research be conducted? (2) Should the research be publicly funded? (3) Should the research be published? As noted earlier, freedom, openness, and social responsibility are three of science’s ethical norms. Restricting research to protect society from harm creates a conflict between freedom/openness and social responsibility (Resnik, 2013). To manage this conflict, one needs to consider the potential benefits and harms of the research and the impact of proposed restrictions on the scientific community (such as a chilling effect). Because freedom and openness are vital to the advancement of scientific knowledge, arguments to restrict publication must overcome a high burden of proof. The default position should be that unclassified research will normally be published in the open literature unless there is a significant chance that it can be used by others for harmful purposes. If restrictions on publication are warranted, the research could be published in redacted form, with information necessary to repeat experiments removed. The full version of the publication could be made available to responsible scientists (Resnik, 2013). Another option would be to classify the research. The decision not to fund, or to defund, research does not need to overcome as high a burden of proof as the decision to restrict publication, because freedom of inquiry does not imply a right to receive funding. Researchers who cannot obtain funding for their work from a government agency are still free to pursue other sources of funding, such as private companies or foundations.

6 Conclusion

Research ethics is of considerable concern to scientists, the media, policymakers, and the public. The ethical issues that scientists face are important, challenging, complex, and constantly evolving. To date, philosophers have focused mostly on fundamental questions concerning the relationship between science and human values and have had little to say about the day-to-day ethical questions and problems that confront scientists. Hopefully, this trend will change and more philosophers will take an interest in investigating and analyzing ethical issues in science. The benefits for science, society, and the philosophical profession could be significant.

Acknowledgments

This article is the work product of an employee or group of employees of the National Institute of Environmental Health Sciences (NIEHS), National Institutes of Health (NIH). However, the statements, opinions, or conclusions contained therein do not necessarily represent the statements, opinions, or conclusions of NIEHS, NIH, or the United States government.

Babbage, C. ( 1830 ) [1970]. Reflections on the Decline of Science in England (New York: Augustus Kelley).


Briggle, A., and Mitcham, C. (2012). Ethics in Science: An Introduction (New York: Cambridge University Press).

Broad, W., and Wade, N. (1983). Betrayers of Truth: Fraud and Deceit in the Halls of Science (New York: Simon and Schuster).

Cello, J., Paul, A., and Wimmer, E. (2002). “Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template.” Science 297(5583): 1016–1018.

Department of Health and Human Services. ( 2009 ). “ Protection of Human Subjects. ” Code of Federal Regulations 45, Part 46.

Elliott, K. C. ( 2011 ). Is a Little Pollution Good for You?: Incorporating Societal Values in Environmental Research (New York: Oxford University Press).

Emanuel, E. J. , Grady, C. C. , Crouch, R. A. , Lie, R. K. , Miller, F. G. , and Wendler, D. D. , eds. ( 2011 ). The Oxford Textbook of Clinical Research Ethics (New York: Oxford University Press).

Emanuel, E. J. , and Miller, F. G. ( 2001 ). “ The Ethics of Placebo-Controlled Trials—A Middle Ground. ” New England Journal of Medicine 345(12): 915–919.

Franklin, A. ( 1981 ). “ Millikan’s Published and Unpublished Data on Oil Drops. ” Historical Studies in the Physical Sciences 11: 185–201.

Garrett, J. R. , ed. ( 2012 ). The Ethics of Animal Research: Exploring the Controversy (Cambridge, MA: Massachusetts Institute of Technology Press).

Glick, J. L., and Shamoo, A. E. (1993). “A Call for the Development of ‘Good Research Practices’ (GRP) Guidelines.” Accountability in Research 2(4): 231–235.

Goldman, A. ( 1999 ). Knowledge in a Social World (New York: Oxford University Press).

Harding, S. ( 1998 ). Is Science Multicultural?: Postcolonialisms, Feminisms, and Epistemologies (Bloomington, IN: Indiana University Press).

Holton, G. ( 1978 ). “ Subelectrons, Presuppositions, and the Millikan-Ehrenhaft Dispute. ” Historical Studies in the Physical Sciences 9: 166–224.

Hudson, K. L. , Guttmacher, A. E. , and Collins, F. S. ( 2013 ). “ In Support of SUPPORT—A view from the NIH. ” New England Journal of Medicine 368(25): 2349–2351.

Imai, M., Watanabe, T., Hatta, M., Das, S. C., Ozawa, M., Shinya, K., Zhong, G., Hanson, A., Katsura, H., Watanabe, S., Li, C., Kawakami, E., Yamada, S., Kiso, M., Suzuki, Y., Maher, E. A., Neumann, G., and Kawaoka, Y. (2012). “Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets.” Nature 486(7403): 420–428.

International Committee of Medical Journal Editors. (2013). Recommendations for the Conduct, Reporting, Editing and Publication of Scholarly Work in Medical Journals. Available at: http://www.icmje.org/ . Accessed: August 31, 2013.

King, C. (2012). Multiauthor Papers: Onward and Upward. ScienceWatch Newsletter, July 2012. http://archive.sciencewatch.com/newsletter/2012/201207/multiauthor_papers/ . Accessed: August 31, 2013.

Kitcher, P. ( 1993 ). The Advancement of Science (New York: Oxford University Press).

Kornfeld, D. S. ( 2012 ). “ Perspective: Research Misconduct: The Search for a Remedy. ” Academic Medicine 87(7): 877–882.

Kuhn, T. S. (1962). The Structure of Scientific Revolutions (Chicago: University of Chicago Press).

Kuhn, T. S. ( 1977 ). The Essential Tension (Chicago: University of Chicago Press).

LaFollette, H., and Shanks, N. (1996). Brute Science: Dilemmas of Animal Experimentation (New York: Routledge).

Longino, H. ( 1990 ). Science as Social Knowledge (Princeton, NJ: Princeton University Press).

Macklin, R. ( 1999 ). Against Relativism: Cultural Diversity and the Search for Ethical Universals in Medicine (New York: Oxford University Press).

Macrina, F. ( 2005 ). Scientific Integrity . 3rd ed. (Washington, DC: American Society of Microbiology Press).

Malek, J. ( 2010 ). “ To Tell or Not to Tell? The Ethical Dilemma of the Would-Be Whistleblower. ” Accountability in Research 17(3): 115–129.

Martinson, B. C., Anderson, M. S., and de Vries, R. (2005). “Scientists Behaving Badly.” Nature 435(7043): 737–738.

Merton, R. ( 1973 ). The Sociology of Science: Theoretical and Empirical Investigations (Chicago: University of Chicago Press).

Milgram, S. ( 1974 ). Obedience to Authority (New York: Harper and Row).

Morin, K. , Rakatansky, H. , Riddick, F. A. Jr. , Morse, L. J. , O’Bannon, J. M. , Goldrich, M. S. , Ray, P. , Weiss, M. , Sade, R. M. , and Spillman, M. A. ( 2002 ). “ Managing Conflicts of Interest in the Conduct of Clinical Trials. ” Journal of the American Medical Association 287(1): 78–84.

Munthe, C., and Welin, S. (1996). “The Morality of Scientific Openness.” Science and Engineering Ethics 2(4): 411–428.

National Academy of Sciences. (2009). On Being a Scientist: A Guide to Responsible Conduct in Research. 3rd ed. (Washington, DC: National Academy Press).

Office of Science and Technology Policy. (2000). “Federal Research Misconduct Policy.” Federal Register 65(235): 76260–76264.

Quine, W. V. , and Ullian, J. S. ( 1978 ). The Web of Belief. 2nd ed. (New York: McGraw-Hill).

Regan, T. ( 1983 ). The Case for Animal Rights (Berkeley: University of California Press).

Rennie, D. , Yank, V. , and Emanuel, L. ( 1997 ). “ When Authorship Fails. A Proposal to Make Contributors Accountable. ” Journal of the American Medical Association 278(7): 579–585.

Resnik, D. B. ( 1996 ). “ Social Epistemology and the Ethics of Research. ” Studies in the History and Philosophy of Science 27(4): 566–586.

Resnik, D. B. ( 1997 ). “ A Proposal for a New System of Credit Allocation in Science. ” Science and Engineering Ethics 3(3): 237–243.

Resnik, D.   1998 a. The Ethics of Science (New York: Routledge).

Resnik, D. B. ( 1998 b). “ The Ethics of HIV Research in Developing Nations. ” Bioethics 12(4): 285–306.

Resnik, D. B. ( 2003 ). “ From Baltimore to Bell Labs: Reflections on Two Decades of Debate about Scientific Misconduct. ” Accountability in Research 10(2): 123–135.

Resnik, D. ( 2007 ). The Price of Truth (New York: Oxford University Press).

Resnik, D. ( 2009 ). Playing Politics with Science (New York: Oxford University Press).

Resnik, D. B. ( 2011 ). “ A Troubled Tradition. ” American Scientist 99(1): 24–28.

Resnik, D. B. ( 2013 ). H5N1 “Avian Flu Research and the Ethics of Knowledge. ” Hastings Center Report 43(2): 22–33.

Resnik, D. B. . and Elliott, K. C. ( 2013 ). “ Taking Financial Relationships into Account When Assessing Research. ” Accountability in Research 20(3): 184–205.

Resnik, D. B. , Gutierrez-Ford, C. , and Peddada, S. ( 2008 ). “ Perceptions of Ethical Problems with Scientific Journal Peer Review: An Exploratory Study. ” Science and Engineering Ethics 14(3): 305–310.

Resnik, D. B. , and Master, Z. ( 2013 .) “ Policies and Initiatives Aimed at Addressing Research Misconduct in High-Income Countries. ” PLoS Med 10(3): e1001406.

Resnik, D. B. , and Stewart, C. N. Jr. ( 2012 ). “ Misconduct Versus Honest Error and Scientific Disagreement. ” Accountability in Research 19(1): 56–63.

Rosengard, A. M , Liu, Y. , Nie, Z. , and Jimenez, R. ( 2002 ). “ Variola Virus Immune Evasion Design: Expression of a Highly Efficient Inhibitor of Human Complement. ” Proceedings of the National Academy of Sciences 99(13): 8808–8813.

Rudner, R. ( 1953 ). “ The Scientist Qua Scientist Makes Value Judgments. ” Philosophy of Science 21(1): 1–6.

Russell, C. A. , Fonville, J. M. , Brown, A. E. , Burke, D. F , Smith, D. L. , James, S. L. , Herfst, S. , van Boheemen, S. , Linster, M. , Schrauwen, E. J. , Katzelnick, L. , Mosterín, A. , Kuiken, T. , Maher, E. , Neumann, G. , Osterhaus, A. D. , Kawaoka, Y. , Fouchier, R. A. , and Smith, D. J. ( 2012 ). “ The Potential for Respiratory Droplet-Transmissible A/H5N1 Influenza Virus to Evolve in a Mammalian Host. ” Science 336(6088): 1541–1547.

Shamoo, A. E. , and Resnik, D. B. ( 2009 ). Responsible Conduct of Research . 2nd ed. (New York: Oxford University Press).

Shrader-Frechette, K. ( 1994 ). Ethics of Scientific Research (Lanham, MD: Rowman and Littlefield).

Singer, P. ( 1975 ). Animal Liberation (New York: Random House).

Steneck, N. H. ( 1999 ). “ Confronting Misconduct in Science in the 1980s and 1990s: What Has and Has Not Been Accomplished? ” Science and Engineering Ethics 5(2): 161–176.

Steneck, N. H. ( 2007 ). ORI Introduction to the Responsible Conduct of Research . Rev. ed. (Washington, DC: Department of Health and Human Services).

Thagard, P. ( 1993 ). Computational Philosophy of Science (Cambridge, MA: Massachusetts Institute of Technology Press).

Weaver, D. , Reis M. H. , Albanese, C. , Costantini F. , Baltimore, D. , and Imanishi-Kari, T. ( 1986 ). “ Altered Repertoire of Endogenous Immunoglobulin Gene Expression in Transgenic Mice Containing a Rearranged Mu Heavy Chain Gene. ” Cell 45(2): 247–259.

Weeks, W. , Wallace, A. , and Kimberly, B. ( 2004 ). “ Changes in Authorship Patterns in Prestigious US Medical Journals. ” Social Science and Medicine 59(9): 1949–1954.

Wein, L. , and Liu, Y. ( 2005 ). “ Analyzing a Bioterror Attack on the Food Supply: The Case of Botulinum Toxin in Milk. ” Proceedings of the National Academy of Sciences 102(28): 9984–9989.

Whitbeck, C. ( 1995 ). “ Truth and Trustworthiness in Research. ” Science and Engineer Ethics 1(4): 403–416.

Whitbeck, C. ( 1998 ). Ethics in Engineering Practice and Research (New York: Cambridge University Press).

Wislar, J. S. , Flanagin, A. , Fontanarosa, P. B. , and Deangelis, C. D. ( 2011 ). “ Honorary and Ghost Authorship in High Impact Biomedical Journals: A Cross Sectional Survey. ” British Medical Journal 343: d6128.

Ziman, J. , 2000 . Real Science: What It Is, And What It Means (Cambridge: Cambridge University Press).

  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Institutional account management
  • Rights and permissions
  • Get help with access
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.

Einstein, Ethics and Science

Alex C. Michalos

In celebration of Einstein’s remarkable achievements in 1905, this essay examines some of his views on the role of “intellectuals” in developing and advocating socio-economic and political positions and policies, the historical roots of his ethical views, and certain aspects of his philosophy of science. His life and ideas as an outstanding academic and public citizen continue to provide good examples of a life well used and worth remembering.

Michalos, A.C.: 2004. Einstein, Ethics and Science. Journal of Academic Ethics, 2, pp. 339–354. © Springer 2005.



Acknowledgements

An earlier version of this paper was presented at a regional meeting of the Royal Society of Canada, held at the University of Guelph, Ontario, May 2, 2005. I would like to thank O.P. Dwivedi for inviting me to write the paper and Deborah C. Poff for helping me clarify some ideas in it.

Author information

Alex C. Michalos, University of Northern British Columbia, Prince George, BC, Canada

Corresponding author: Alex C. Michalos.


Copyright information

© 2017 Springer International Publishing AG

About this chapter

Michalos, A.C. (2017). Einstein, Ethics and Science. In: Connecting the Quality of Life Theory to Health, Well-being and Education. Springer, Cham. https://doi.org/10.1007/978-3-319-51161-0_14



Science, ethics and responsibility, focus of the World Science Forum


The scientific community gathered at the World Science Forum to focus on the ethical problems scientists face and on the responsibility of researchers for the consequences of their scientific results. In the 21st century, ways of separating the scientific method from values, beliefs and opinions are no longer self-evident, and the complex realities of science call for a greater consensus on the ethical principles of scientific research. The Forum brings together scientists, policy-makers and industry representatives from over 100 countries in Budapest, Hungary, from 20 to 23 November 2019 to share experiences and reflect on the future of science.

“We need science that responds to the needs and aspirations of societies, science that is open and participatory, science that is more connected to society and grounded in ethics and human rights,” explained Shamila Nair-Bedouelle, Assistant Director-General of UNESCO for the Natural Sciences. “We expect that the results of this World Science Forum will feed into these timely global reflections at UNESCO on the future of science.”

This edition of the Forum marks the 20th anniversary of the first conference that launched the World Science Forum. Organized in 1999 by UNESCO and the International Council for Science (ICSU), the interdisciplinary conference was also an invitation to reflect on the issues of the role and responsibility of science, and the ethical questions of scientific research.

" When talking about freedom of research we must address questions like: How to share resources between basic research and innovation? How can science funding be made more transparent and just? Why is it so crucial for scientists to take part in the policy making process? " stressed László Lovász, President of the Hungarian Academy of Sciences, in his welcome speech.

The Fora are organized every two years to bring the global science community together. “Nations do not work in different sciences: we share the building we are mutually constructing,” said János Áder, President of Hungary, in his opening address. “What one begins, the other continues, and ultimately the result can be claimed by all who have contributed to its creation.”

Two scientific prizes were awarded during the opening of the Forum. Australian science writer and journalist Karl Kruszelnicki received the UNESCO Kalinga Prize for the Popularization of Science in recognition of his longstanding commitment to firing up people’s curiosity for science and sharing his passion for the subject. “Dr Karl” has been a science communicator for over 30 years, using television, radio, podcasts, print media, books and social media. The UNESCO Sultan Qaboos Prize for Environmental Conservation was awarded to the Bangalore-based Ashoka Trust for Research in Ecology and the Environment (ATREE) in recognition of its socially just environmental conservation and sustainable development activities that make science accessible to all.

UNESCO is organizing several sessions and will launch the Charter of Ethics of Science and Technology in the Arab Region during the Forum. Her Royal Highness Princess Sumaya bint El Hassan, President of the Royal Scientific Society of Jordan and UNESCO Special Envoy for Science for Peace, will participate in the launch and in the press conference presenting the Charter. In addition to the traditional programme, the plenary sessions will be complemented by keynote lectures from eminent scientists on recent and inspiring discoveries and on new approaches that are shaping, or will shape, our future.

UNESCO is involved in the following sessions:

Opening: 20 November, 17:30–22:00; Venue: Müpa Budapest, Béla Bartók National Concert Hall

  • Opening addresses  
  • UNESCO Kalinga Prize for the Popularization of Science - Ceremony  
  • UNESCO Sultan Qaboos Prize for Environmental Conservation – Ceremony

  • The urgent responsibility of science for the 2030 Sustainable Development Agenda (21 November 2019, 11:30–13:00, Small Lecture Hall)
  • Open Science - The Future of Science and Science for the Future (21 November 2019, 17:00–18:30, Ceremonial Hall)
  • Gender Equality in Science, Technology and Innovation (STI): an Ethical Issue (21 November 2019, 17:00–18:30, Large Lecture Hall)
  • Ethics of Artificial Intelligence – Global Considerations and Potential for Africa (21 November 2019, 17:00–18:30, Library Conference Room)
  • Science Diplomacy for Global Challenges: International Frames, Norms and Ethical Principles (22 November 2019, 17:00–18:30, Large Lecture Hall)
  • Launch of the Charter of Ethics of Science and Technology in the Arab Region (press conference: Friday 22 November 2019, 13:30, Hungarian Academy of Sciences, Kisterem room)

Contact information: UNESCO, Ms. Jana El-Baba (j.el-baba@unesco.org)



Understanding Scientific and Research Ethics


How to pass journal ethics checks to ensure a smooth submission and publication process

Reputable journals screen for ethics at submission—and inability to pass ethics checks is one of the most common reasons for rejection. Unfortunately, once a study has begun, it’s often too late to secure the requisite ethical reviews and clearances. Learn how to prepare for publication success by ensuring your study meets all ethical requirements before work begins.

The underlying principles of scientific and research ethics

Scientific and research ethics exist to safeguard human rights, ensure that we treat animals respectfully and humanely, and protect the natural environment.

The specific details may vary widely depending on the type of research you’re conducting, but there are clear themes running through all ethical requirements for research and reporting:

  • Documented 3rd party oversight
  • Consent and anonymity
  • Full transparency

If you fulfill each of these broad requirements, your manuscript should sail through any journal’s ethics check.

Documented 3rd party oversight

If your research is 100% theoretical, you might be able to skip this one. But if you work with living organisms in any capacity—whether you’re administering a survey, collecting data from medical records, culturing cells, working with zebrafish, or counting plant species in a ring—oversight and approval by an ethics committee is a prerequisite for publication. This oversight can take many different forms:

  • For human studies and studies using human tissue or cells, obtain approval from your institutional review board (IRB). Register clinical trials with the World Health Organization (WHO) or International Committee of Medical Journal Editors (ICMJE).
  • For animal research, consult with your institutional animal care and use committee (IACUC). Note that there may be special requirements for non-human primates, cephalopods, and other specific species, as well as for wild animals.
  • For field studies, anthropology and paleontology, the type of permission required will depend on many factors, like the location of the study, whether the site is publicly or privately owned, possible impacts on endangered or protected species, and local permit requirements.

TIP: You’re not exempt until your committee tells you so

Even if you think your study probably doesn’t require approval, submit it to the review board anyway. Many journals won’t consider retrospective approvals. Obtaining formal approval or an exemption up front is worth it to ensure your research is eligible for publication in the future.

TIP: Keep your committee records close

Clearly label your IRB/IACUC paperwork, permit numbers, and any participant permission forms (including blank copies), and keep them in a safe place. You will need them when you submit to a journal. Providing these details proactively as part of your initial submission can minimize delays and get your manuscript through journal checks and into the hands of reviewers sooner.

Consent & anonymity

Obtaining consent from human subjects

You may not conduct research on human beings unless the subjects understand what you are doing and agree to be a part of your study. If you work with human subjects, you must obtain informed written consent from the participants or their legal guardians. 

There are many circumstances where extra care may be required in order to obtain consent. The more vulnerable the population you are working with, the stricter these guidelines will be. For example, your IRB may have special requirements for working with minors, the elderly, or developmentally delayed participants. Remember that these rules may vary from country to country. Providing a link to the relevant legal reference in your area can help speed the screening and approval process.

TIP: What if you are working with a population where reading and writing aren’t common?

Alternatives to written consent (such as verbal consent or a thumbprint) are acceptable in some cases, but consent still has to be clearly documented. To ensure eligibility for publication, be sure to:

  • Get IRB approval for obtaining verbal rather than written consent
  • Be prepared to explain why written consent could not be obtained
  • Keep a copy of the script you used to obtain this consent, and record when consent was obtained for your own records
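For teams that keep these records electronically, a minimal sketch of a consent log follows; the field names, file path, and helper function are illustrative assumptions, not a journal or IRB requirement. Use coded participant IDs rather than names so the log itself carries no direct identifiers.

```python
# A minimal, illustrative consent log (assumed field names and path).
import csv
from datetime import date
from pathlib import Path

LOG = Path("consent_log.csv")  # hypothetical location, kept apart from study data

def record_consent(participant_id: str, method: str, witness: str) -> None:
    """Append one consent event; write a header row if the log is new."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["participant_id", "method", "witness", "date"])
        writer.writerow([participant_id, method, witness,
                         date.today().isoformat()])

# Example: verbal consent obtained with an IRB-approved script.
record_consent("P-014", "verbal (IRB-approved script v2)", "field assistant")
```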

Consent and reporting for human tissue and cell lines

Consent from the participant or their next-of-kin is also required for the use of human tissue and cell lines. This includes discarded tissue, for example the by-products of surgery.  

When working with cell lines transparency and good record keeping are essential. Here are some basic guidelines to bear in mind:

  • When working with established cell lines, cite the published article where the cell line was first described.
  • If you’re using repository or commercial cell lines, explain exactly which ones, and provide the catalog or repository number.
  • If you received a cell line from a colleague, rather than directly from a repository or company, be sure to mention it. Explain who gifted the cells and when.
  • For a new cell line obtained from a colleague there may not be a published article to cite yet, but the work to generate the cell line must meet the usual requirements of consent—even if it was carried out by another research group. You’ll need to provide a copy of your colleagues’ IRB approval and details about the consent procedures in order to publish the work.

Finally, you’re obliged to keep your human subjects anonymous and to protect any identifying information in photos and raw data. Remove all names, birth dates, detailed addresses, or job information from files you plan to share. Blur faces and tattoos in any images. Details such as geography (city/country), gender, age, or profession may be shared at a generalized level and in aggregate. Read more about standards for de-identifying datasets in The BMJ .
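As a concrete illustration of that guidance, the sketch below de-identifies a tabular dataset using pandas. The file and column names ("name", "dob", "address", "age", "city") are hypothetical and would need to match your own data dictionary; treat this as a starting point, not a complete de-identification standard.

```python
# De-identification sketch for a hypothetical participant table.
import pandas as pd

df = pd.read_csv("participants.csv")  # assumed input file

# Drop direct identifiers entirely.
df = df.drop(columns=["name", "dob", "address"])

# Generalize quasi-identifiers: report age in bands, not exact years.
df["age_band"] = pd.cut(df["age"], bins=[0, 18, 35, 50, 65, 120],
                        labels=["<18", "18-34", "35-49", "50-64", "65+"])
df = df.drop(columns=["age"])

# Share geography only in aggregate; drop the city column here.
df = df.drop(columns=["city"])

df.to_csv("participants_deidentified.csv", index=False)
```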

TIP: Anonymity can be important in field work too

Be careful about revealing geographic data in fieldwork. You don’t want to tip poachers off to the location of the endangered elephant population you studied, or expose petroglyphs to vandalism.

Full Transparency

No matter the discipline, transparent reporting of methods, results, data, software and code is essential to ethical research practice. Transparency is also key to the future reproducibility of your work.

When you submit your study to a journal, you’ll be asked to provide a variety of statements certifying that you’ve obtained the appropriate permissions and clearances, and explaining how you conducted the work. You may also be asked to provide supporting documentation, including field records and raw data. Provide as much detail as you can at this stage. Clear and complete disclosure statements will minimize back-and-forth with the journal, helping your submission to clear ethics checks and move on to the assessment stage sooner.

TIP: Save that data

As you work, be sure to clearly label and organize your data files in a way that will make sense to you later. However close you are to the work while conducting your study, remember that two years could easily pass between capturing your data and publishing an article reporting the results. You don’t want to be stuck piecing together confusing records in order to create figures and data files for repositories.

Read our full guide to preparing data for submission .
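One way to keep files traceable over those years is to generate a simple manifest alongside the data. The sketch below is one possible approach under stated assumptions (the folder name and manifest format are illustrative, not part of any journal requirement): it records a checksum, size, and timestamp for every file, so you can later verify that a deposited file is byte-for-byte the one you analyzed.

```python
# Build a manifest of data files with checksums, sizes, and timestamps.
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

DATA_DIR = Path("study_data")  # hypothetical project data folder

with open("MANIFEST.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["path", "sha256", "bytes", "recorded_utc"])
    for path in sorted(DATA_DIR.rglob("*")):
        if path.is_file():
            writer.writerow([
                path.as_posix(),
                hashlib.sha256(path.read_bytes()).hexdigest(),
                path.stat().st_size,
                datetime.now(timezone.utc).isoformat(timespec="seconds"),
            ])
```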

Keep in mind that scientific and research ethics are always evolving. As laws change and as we learn more about influence, implicit bias and animal sentience, the scientific community continues to strive to elevate its research practice.

A checklist to ensure you’re ethics-check ready

Before you begin your research

Obtain approval from your IRB, IACUC or other approving body

Obtain written informed consent from human participants, guardians or next-of-kin

Obtain permits or permission from property owners, or confirm that permits are not required

Label and save all of your records

As you work

Adhere strictly to the protocols approved by your committee

Clearly label your data, and store it in a way that will make sense to your future self

As you write, submit and deposit your results

Be ready to cite specific approval organizations, permit numbers, cell lines, and other details in your ethics statement and in the methods section of your manuscript

Anonymize all participant data (including human and in some cases animal or geographic data)

If a figure does include identifying information (e.g., a participant’s face), obtain special consent


Ethical Implications of the John Money Experiment: A Critical Analysis

This essay about the ethical implications of the John Money experiment critically examines the tragic case of David Reimer, who was raised as a girl following a failed circumcision and subsequent medical advice. It discusses issues of informed consent, the harm caused by unethical medical practices, and the misuse of authority in clinical research. The text highlights the severe psychological impact on Reimer and critiques the ethical lapses in handling his case, emphasizing the need for compassion and strict ethical standards in medical practices and research.

The story of the John Money experiment is a cautionary tale of the interplay between medical ethics and the complexities of gender identity. This episode in the history of psychology and medical science revolves around the tragic case of David Reimer, originally born as Bruce Reimer, who was raised as a girl following catastrophic medical advice and treatment initiated by Dr. John Money. The ethical implications of this case are profound, impacting notions of consent, the responsibilities of healthcare professionals, and the psychosocial dynamics of gender identity.

In 1965, Canadian twin boys, Bruce and Brian Reimer, were born healthy. However, during a routine circumcision, Bruce’s penis was irreparably damaged. Dr. Money, a psychologist and sexologist who advocated for the theory that gender identity is primarily learned through social and environmental cues rather than biological, inherited traits, advised the distraught parents to raise Bruce as a girl. Consequently, Bruce was renamed Brenda, surgically altered, and raised as a female. This case presented Money with a unique opportunity to further his research and theories on gender identity and fluidity.

From an ethical standpoint, the first major issue was the lack of informed consent. Although the Reimers were desperate for a solution, they were arguably not fully informed of the potential risks and the experimental nature of the treatment proposed. Money’s assurance of success was based on theoretical assumptions rather than concrete evidence. The principle of informed consent is critical in medical ethics, ensuring that patients or, in the case of minors, their guardians, are fully aware of all potential risks and outcomes associated with a medical intervention. The Reimers’ decision was influenced heavily by Money’s authority and the promise of a normal life for their child, which clouds the authenticity of their consent.

Secondly, the experiment underscores the ethical responsibility of healthcare professionals to avoid harm—primum non nocere (first, do no harm). Money’s experiment, though initially seeming to show Brenda adapting well, eventually resulted in severe psychological distress and confusion for her as she grew. The dissonance between Brenda’s biological sex and imposed gender identity led to significant behavioral and emotional issues, which Money reported misleadingly to support his hypothesis. This manipulation of data for theoretical validation, rather than patient welfare, starkly contravenes medical ethics.

Moreover, the experiment raises critical questions about the ethical treatment of children in medical research. Children are a vulnerable population, and any medical intervention, especially those involving identity and psychological wellbeing, must be approached with extreme caution and ethical rigor. Money’s continuation of the experiment, despite evident adverse effects on Brenda’s mental health, highlights a grievous ethical lapse—the prioritization of research outcomes over the patient’s wellbeing.

The psychological toll on David Reimer (who reassumed his male identity in adolescence after learning the truth of his medical history) was immense. He suffered long-term consequences, including depression and identity struggles, ultimately leading to his tragic suicide at the age of 38. This outcome serves as a potent reminder of the ethical responsibility towards long-term welfare in medical decision-making, particularly in cases involving non-urgent, non-life-threatening conditions.

Analyzing this case through the lens of virtue ethics, which emphasizes the moral character of the practitioner rather than the ethicality of an act, presents an additional dimension of critique. The virtues of empathy, humility, and integrity, essential for ethical medical practice, were conspicuously lacking in Money’s handling of the Reimer case. His determination to prove a theory seemingly took precedence over the humane treatment of Brenda/David.

In conclusion, the John Money experiment with David Reimer exposes profound ethical violations, highlighting the necessity for rigorous ethical standards in medical and psychological research. This case study is a critical reminder of the potential human cost when ethical boundaries are overlooked in the pursuit of scientific advancement. It underscores the enduring need for compassion, rigorous adherence to informed consent, and the prioritization of individual welfare in all medical practices. This tragic narrative invites ongoing discourse on medical ethics, reinforcing the imperative to safeguard the most vulnerable among us from experimental practices devoid of empirical validation and ethical integrity.

