The concept of the "wicked problem" was coined by Horst W. J. Rittel and Melvin M. Webber in their article "Dilemmas in a General Theory of Planning," Policy Sciences 4 (1973): 155-169. We define wicked problems as problems that can be formulated, explained, or approached in a number of conflicting ways—a number that cannot be determined a priori—each of which can be justified by an argumentation that seems reasonable; these conflicting argumentations usually rest on distinctly different interests, world-views, or values.
Firearms on university campuses
On Dec. 19, 2012, five days after the Sandy Hook Elementary School shooting in Newtown, Connecticut, that killed 20 children and six adults, then State Representative-elect Charles Gregory (R) of Marietta proposed House Bill 29, calling to “enact Georgia Campus Carry Act of 2013.” This is the suggested name for a “BILL to be entitled an Act to amend Part 3 of Article 4 of Chapter 11 of Title 16 of the Official Code of Georgia Annotated, relating to carrying and possession of firearms, so as to repeal prohibitions against carrying a firearm on to postsecondary institution campuses.”
Meanwhile, Georgia State Senator Vincent Fort (D) of Atlanta announced: “I'm going to introduce a bill to ban assault weapons.”
Imagine that representatives of Georgia Tech have been invited to outline their views on firearms on university campuses at a hearing at Georgia’s capitol. President Peterson forms a task force composed of the members of your team. The task force is charged with providing a broad and deep understanding of the case in question. The task is not to develop just one position but to take into account the broadest variety of possible positions, and only then to develop what seems to be the most convincing one.
Geo-engineering the Earth's climate
Global warming holds the prospect of potential disaster, but governments—at both the national and global levels—have so far proved unable to put in place the measures that seem necessary to keep the planet’s climate in a safe equilibrium. Alarmed by what they perceive as political failure, some scientists and engineers have for a number of years been proposing major “last-minute” schemes. If these schemes, they argue, were properly developed and assessed in advance, they could be available for rapid deployment in case of imminent, catastrophic, and possibly irreversible increases in global temperatures. Opponents, on the other hand, argue that geoengineering presents unpredictable risks, questions about responsibility for its effects and for future generations, justice issues regarding who will benefit most, and many other concerns. Since both sides provide ethical arguments for their positions, the challenge is to weigh the power of these arguments from an ethical point of view.
Lethal Autonomous Robots (LARs)
Ronald Arkin, Director of the Mobile Robot Laboratory and Associate Dean for Research in the College of Computing at Georgia Tech, has devised an algorithm for an “ethical governor” that he says could one day guide whether an aerial drone or ground robot uses lethal force in accordance with internationally agreed-upon rules of war. The hope is to develop systems whose ethical behavior is superior to that of humans, whose decisions on the battlefield are influenced by stress and emotions. There is also the possibility of designing remote-controlled systems that could take command of themselves if remote control breaks down. Of course, the “ethical governor” algorithm for future lethal autonomous robots is just an early contribution to a discussion that can develop in many different ways.
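To make the core idea concrete: a governor of this general kind acts as a veto layer that checks a proposed lethal action against explicitly coded constraints before permitting it. The following Python sketch is purely illustrative; the class, constraint names, and thresholds are our assumptions for discussion, not Arkin's actual design.

```python
from dataclasses import dataclass

@dataclass
class Engagement:
    """A proposed lethal action, as seen by the governor (illustrative fields)."""
    target_confirmed_combatant: bool  # discrimination: is the target positively identified?
    expected_collateral: int          # estimated harm to non-combatants
    military_necessity: int           # assessed value of the action (0 = none)

def ethical_governor(e: Engagement, max_collateral: int = 0) -> bool:
    """Permit an engagement only if every coded constraint is satisfied.

    The constraints here (discrimination, proportionality, necessity) are
    stand-ins for rules-of-war requirements; a real system would encode
    them far more carefully.
    """
    if not e.target_confirmed_combatant:        # discrimination constraint
        return False
    if e.expected_collateral > max_collateral:  # proportionality constraint
        return False
    return e.military_necessity > 0             # necessity constraint

# The governor vetoes by default: any single failed constraint blocks the action.
```

Even this toy version surfaces the discussion questions below: who chooses the constraints and thresholds, and who is responsible when the encoded rules are wrong or the inputs are mistaken?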
Imagine your group is part of an expert team that the U.S. Department of Defense convened to inform its decision making concerning new development goals and future budget requests to Congress. The objective of your team is to develop a comprehensive understanding of the ethical issues that would be at stake when there is, at some point in time, the need to decide whether autonomous machines that make life-or-death decisions on the battlefield can or should be deployed and, if so, under which conditions.
To prepare your work, your first task is to study the current status of research on ethical decision making in lethal autonomous robots. The second task is to sort out the multitude of ethical and policy issues involved. For example, here are a few issues to get you started:
- What are the possibilities of ethics algorithms? What are the limits?
- What could happen if enemies hack into lethal robot software? How can you mitigate this risk?
- Who is ultimately responsible for the actions of the robots? How do these responsibilities change from development to deployment?
Synthetic biology
Synthetic biology is a new and growing field of engineering-based biology that will make it possible to build living machines from off-the-shelf chemical ingredients that are, at the same time, biological organisms with the ability to procreate and evolve.
“Among the potential applications of this new field is the creation of bioengineered microorganisms (and possibly other life forms) that can produce pharmaceuticals, detect toxic chemicals, break down pollutants, repair defective genes, destroy cancer cells, and generate hydrogen for the postpetroleum economy.” (Tucker, 2006)
Many of the technologies used for synthetic biology have existed for several years. They are employed, for instance, in the genetic engineering of crops and bacteria.
“The main difference between genetic engineering and synthetic biology is that whereas the former involves the transfer of individual genes from one species to another, the latter envisions the assembly of novel microbial genomes from a set of standardized genetic parts. These components may be natural genes that are being applied for a new purpose, natural genes that have been redesigned to function more efficiently, or artificial genes that have been designed and synthesized from scratch.” (Tucker, 2006)
The scientific possibilities for synthetic biology are making headlines:
- “Discovery in synthetic biology a step closer to a new industrial revolution” –Imperial College London, 2/1/2013 
- “How to turn living cells into computers” –Nature, 2/13/2013 
- “Cell circuits remember their history” –Science Daily, 2/11/2013 
- “Synthetic biology: Stanford, UC Berkeley engineering a new frontier” –San Jose Mercury News, 2/17/2013 
Imagine that representatives of Georgia Tech have been invited to outline their views on a few projects in synthetic biology that are in an early stage of development at a hearing at Georgia’s capitol. President Peterson forms a task force composed of the members of your team. The task force is charged with providing a broad and deep understanding of the projects in question. The task is not to develop just one position but to take into account the broadest variety of possible positions on these projects, and only then to develop what seems to be the most convincing one.
Brain mapping: Understanding neuronal activity through advanced brain imaging
In April 2013, the launch of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative was announced, which has the goal of developing technologies to map neuronal activity in the brains of both non-human animals and humans. The interdisciplinary project is in its early stages and is currently being spearheaded by a working group from the National Institutes of Health (NIH), the Defense Advanced Research Projects Agency (DARPA), and the National Science Foundation (NSF), led by neuroscientists Cornelia Bargmann and William Newsome. The BRAIN initiative—and other similar projects—pose myriad ethical questions, including: Should we allocate resources to brain mapping at the expense of other causes? If neuronal activity can be successfully mapped, how will intellectual property rights be assigned? Similarly, what are the implications of brain mapping for privacy? What are the implications of brain mapping for military practices and technology (particularly given DARPA’s involvement)? How will advances in brain mapping impact issues of distributive justice?
Imagine your team is a task force established by NIH, NSF, and DARPA that is charged with drafting funding guidelines for the BRAIN initiative. What should these guidelines include? Should specific research directions be prevented? Should certain activities be required by research teams that apply for grants? How could public investments be distributed to the greatest possible benefit without harming anybody?
I know who you are, where you are, and what you are doing: Facial Recognition Technology in the public sphere
Facial recognition technology (FRT) identifies people simply from their images. Some companies want to use FRT so that purchasing goods requires only a nod or another preprogrammed response; others want to scan for known criminals and identify new ones; and still others just want to organize their personal lives better by keeping a highly searchable visual record of their activities. But all of this pivots on taking pictures of people—in public or private, with or without consent, online or offline. Does FRT hold the key to a safer, more efficient, and smarter world? If so, at what cost? Is opting out of this type of technology even feasible? What kinds of governance structures are helpful for regulating FRT?
Imagine you are a task force that is charged with drafting a law that regulates the design or use of facial recognition technology in the public sphere. What should such a law include? The task includes taking into account the broadest variety of possible proposals, based on a stakeholder analysis, and then developing what seems to be the most convincing proposal.
Robotic Caregivers and the Elderly
Given that certain populations in the world are rapidly aging, especially in the United States, Western Europe, and Japan, there is a pressing need to address their health care needs. One potential option is to provide these individuals with robots that could assist them with their medications and other health-related tasks. Yet there are many ethical issues to examine relating to this “technological fix”, including whether it may decrease the amount of human contact that the elderly receive, and privacy issues arising from the fact that these robots will collect data on the people in their care. The effects on employment are also relevant.
Imagine you are a task force that is charged with drafting a law that regulates the design or use of robots for the elderly. What should such a law include? The task includes taking into account the broadest variety of possible proposals, based on a stakeholder analysis, and then developing what seems to be the most convincing proposal.
(The original version of this problem description was provided by Dr. Jason Borenstein, Georgia Institute of Technology)
Colonizing the moon and other planets
As the time nears when it may become technologically possible to place a human colony on the moon or another planet, it becomes important to address the ethical issues relating to this step. Key issues include whether the long-term health effects are worth the benefits of this endeavor, which countries would have sovereignty over a human colony and how it would be governed, and whether colonizing space would lessen the incentive to improve conditions on Earth. (The original version of this problem description was provided by Dr. Jason Borenstein, Georgia Institute of Technology)
Brain-computer interfaces: DARPA and its Silent Talk program
Advances in brain-related science and technology have had clear and profoundly positive impacts. From transcranial direct current stimulation (tDCS) to treat such conditions as Parkinson’s disease, to cochlear implants for the deaf and hard of hearing, their benefits can in some cases—quite literally—be life changing. Despite these success stories, there are advances that seem worthy of significant pause. Such is the case with recent interest by the Defense Advanced Research Projects Agency (DARPA) in developing technology to decode neural signals and allow silent, non-vocalized communication between soldiers on the battlefield. The project, which has been labeled Silent Talk, aims to realize a non-invasive brain-computer interface (BCI), which could be housed in soldiers’ helmets and analyze electroencephalography (EEG) patterns to enable communication. It remains unclear whether, and on what timeline, such a technology could be achieved. Nonetheless, its potential poses serious ethical questions, including: Could soldiers be required to use Silent Talk technology? If use were to be voluntary, how would questions of consent be addressed? And how would coercion be avoided? What kinds of information would the technology collect? Who would retain ownership of that information? And how would soldier privacy be protected? How would the accuracy of the technology be ensured? Does the technology raise novel questions about responsibility? For example, is it possible that the technology could incorrectly analyze a commander’s EEG activity and transmit an erroneous order, which could cause unintended results? What would be the implications of this technology if it made its way into the private sector?
National, local and personal security and the Internet of things
Given the historic and ongoing problems with security and privacy on the Internet, what can we expect when all of our things—our utilities, cars, homes, appliances, medical devices, etc.—are connected to the Internet? What new challenges to our security and privacy will arise? Will the promised social improvements of efficiency and convenience be achieved and, if so, at what societal and personal cost? How can we ever predict the consequences of interconnecting everything with both everything else and everyone? What policies need to be developed now to ensure that the Internet of Things evolves with input from everyone?
Neuro-enhancement: The rise of pharmaceutical cognitive neuro-enhancers
Pharmaceutical cognitive enhancers have long been prescribed for people with various medical conditions, including attention deficit hyperactivity disorder (ADHD) and dementia. These drugs aim to increase attention, memory, or both. Another class of cognitive enhancers aims to block memories. Such drugs are being used to treat people with Post-Traumatic Stress Disorder (PTSD). Pharmaceutical cognitive enhancers for each of these purposes are being heavily researched and developed, and there is growing interest in expanding their use. A growing number of healthy people—those without cognitive impairment—are using pharmaceutical cognitive enhancers, such as methylphenidate (Ritalin) and amphetamine (Adderall), to enhance their abilities. Perhaps nowhere is the use of pharmaceutical cognitive enhancers more prolific than on college campuses. Studies suggest not only that an increasing number of students are using the drugs, but that they are using them at younger ages. Moreover, there is reason to believe that scientific advances will yield pharmaceutical cognitive enhancers with increasing efficacy. As their use and enhancement abilities increase, society will increasingly be faced with—and increasingly unable to ignore—serious ethical questions posed by these drugs. These questions might include: Are these drugs safe? How can we assess the safety of drugs that interact with such a complex, and still poorly understood, part of the human body as the brain? What level of enhancement is appropriate? And in what contexts is enhancement appropriate? For those contexts in which enhancement is not appropriate, how can use be regulated? Will these drugs disproportionately advantage the privileged? Are these drugs addictive? If so, how should this impact their development and use? Will these drugs change how we value such things as work and memories? Could people be coerced into using these drugs (e.g., as a condition for employment)?
Tri-state water dispute
For the last two decades, Alabama, Georgia, and Florida have been involved in a dispute over water management in the Apalachicola-Chattahoochee-Flint (ACF) River Basin (the so-called Tri-State Water Wars). How could this conflict be resolved?
More problems ...
... were drafted by investigators for a National Science Foundation Ethics Education in Science and Engineering project (Award ID SES-0832912, PI Dr. Roberta Berry) and have been made available for instructor use under a Creative Commons license in two repositories: the National Center for Professional & Research Ethics, Ethics CORE (http://nationalethicscenter.org/resources/808), and the National Academy of Engineering, Online Ethics Center (http://www.onlineethics.org/Resources/TeachingTools/Modules/27534.aspx).
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
In any permitted copying, distribution, or transmittal of this work under the above Creative Commons License, please attribute this work to: Georgia Tech AGORA FIPSE Project ( http://agora.gatech.edu/).
To request a use of this work not permitted by the above Creative Commons License, please contact Dr. Michael Hoffmann.
See http://www.legis.ga.gov/Legislation/en-US/display/20132014/HB/29. For an overview of all bills related to firearms in the 2013-2014 session of the Georgia General Assembly, see http://www.legis.ga.gov/Legislation/en-US/CommitteeLegislation.aspx?Committee=129&Session=23.