If the Tuskegee Syphilis Study were unique, we might have less concern about the ethics of human research. Sadly, many such violations have occurred in medical (Beecher, 1970) and psychological (Ad Hoc Committee on Ethical Standards of Psychological Research, 1973) research.

The review of human research guidelines that led to the National Research Act of 1974 stemmed from other major scandals besides the Tuskegee Syphilis Study. In the 1960s, live cancer cells were injected into patients without their informed consent at the Jewish Chronic Disease Hospital in Brooklyn.

Also in the 1960s, drug research without subject consent took place at the Milledgeville State Hospital in Georgia. Such incidents made it clear that human subjects needed more protection from researchers (Hershey & Miller, 1976).



Ethics involves the study of right and wrong conduct. This branch of philosophy has produced numerous ethical theories throughout recorded history. For example, Plato held that proper conduct led to harmony. But people's desires tend to conflict rather than harmonize. Thus the central ethical question asks how we are to reconcile two or more conflicting preferences.

This dilemma appears in a dialogue from the Republic between Socrates and Thrasymachus, in which the latter says that "justice is nothing else than the interest of the stronger."

Later philosophers became unhappy both with idealized but impractical ethics and with realistic ethics defined by brute force. Some tried to resolve the dilemma by the scientific-seeming method of utilitarianism, which seeks a rational assessment and balancing of the costs and benefits of behavior.

This approach identifies proper behavior as that which produces the maximum benefit for the least cost, taking into account the costs and benefits of all concerned.


Applied to human research, this means that we must weigh the benefits of the knowledge to be gained against the costs, including those to the subject. However, this approach requires that someone place weights or values on the benefits and costs.

In terms of the Tuskegee Study, what value shall we place on one work-year of forgone treatment for syphilis? How shall we measure the worth of knowledge about racial differences in the development of syphilis, the rationale for the study? Placing values on such intangibles can prove quite difficult, and ultimately this task must move to the political arena for resolution.

The philosophical effort to discover proper behavior by logic and science has not produced a consensus. Researchers want to gain knowledge with minimal resistance. Human subjects want the right to avoid or leave potentially risky research. Not surprisingly, researchers and subjects will usually place different values on these competing goals.

Other naturally occurring conflicts in the research process can pose similar ethical dilemmas. For example, a researcher may falsify data or claim another's words as his or her own. These and other types of misconduct result from conflicting interests.


Such conflicts, although serious, seldom produce shocking moral tragedies of the magnitude of the Tuskegee Study. Perhaps for that reason, efforts to protect human subjects have produced a more detailed body of principles and procedures than have the other ethical conflicts in research.

As a result we will focus first on these subject protection principles and the procedures designed to uphold them.

Nuremberg Code: The First Legal Effort

In a utilitarian framework, the valuation of costs and benefits will likely depend on the customary beliefs and values of the culture. How does the culture judge the relative importance of the individual subject and the society's right to scientific knowledge? Even in the brief history of large-scale social research, different societies have disagreed on such values and the research practices that flow from them.


For example, during World War II, human research in Germany took place under official policy in various concentration camps. After the war, the cruelty of some of this research shocked the Allies. Some of these concentration camp researchers stood trial before the Nuremberg Military Tribunal, which needed a set of ethical guidelines reflecting the values of the Allied governments. This first legal need for ethical guidelines in social research produced the Nuremberg Code.

The Nuremberg Code has spawned many sets of ethical principles for the protection of human subjects. Almost every national and world medical and social science organization has devised its own code. These codes have tended to converge on principles that protect the subject's rights to safety and informed consent. By announcing such rules, these groups hope to reduce the uncertainty and disorder that had marked the researcher-subject relationship. However, unlike the Nuremberg Code, most of these codes expressed not laws but guiding principles to which researchers would voluntarily conform.