Environmental disasters are becoming increasingly frequent and severe globally (IPCC 2012). Although those who respond to disasters (“responders”) continue to improve their preparation for and response to crises, integrating the best available scientific information and expertise is vital for efficient and effective decision making (Lubchenco et al. 2012). Yet incorporating scientific information and expertise into decision-making processes in the United States is challenging because industry and governmental decision makers, often from multiple agencies, operate under systems of rules and rewards that differ across agencies and from those of academic institutions (Janssen et al. 2010). Our results indicate that these disparities must be addressed to enable effective collaboration, and that response should be informed by an understanding of the complex human behaviors present during crises.
Human-centered design is a problem-solving methodology that focuses on the needs and behaviors of stakeholders who are either driving the problem or have the authority to enact interventions (Giacomin 2014). By comparison, systems thinking is a set of synergistic analytic skills used to improve the capability of identifying and understanding systems as sets of dynamic interdependent relationships, predicting their behaviors over time, and devising modifications to them in order to produce desired effects (Arnold and Wade 2015). The systems thinking approach contrasts with traditional analysis, which studies systems by breaking them down into their separate elements. When human-centered design is strategically integrated with systems thinking, it is uniquely suited to address complex problems. Importantly, this integrated design process is not a science, but rather a method of iteratively framing, understanding, and deconstructing human-system challenges (Brown 2008). We present a methodology for applying human-centered design and systems thinking to crisis preparedness and response, demonstrate how this methodology is applicable to other social and environmental challenges, and propose a solution, the Science Action Network.
Despite the increasing popularity of human-centered design, or “design thinking,” across a number of fields, its methodology remains a relatively neglected area of research as applied to environmental challenges (Razzouk and Shute 2012, Santo et al. 2015, Sorice and Donlan 2015). We explore the convergence of human-centered design and systems thinking (henceforth called the Deep Change Method) as tools for tackling complex problems rooted in human behavior (Brown and Wyatt 2010). Also referred to as systemic design, this intersection “brings human-centered design to complex, multi-stakeholder service systems” (Jones 2014:93). Both human-centered design and systems thinking have received attention as potent tools for addressing current societal problems, but literature on the coapplication of these methods is scarce (Jones 2014). The Deep Change Method, created by Stanford ChangeLabs, combines the two methods into a formal, cohesive process for addressing complex systems challenges and producing scalable interventions (Banerjee 2014). We define a complex system as a set of elements interconnected in such a way that they produce their own internal dynamics that are difficult to understand, predict, manage, and/or change (Magee and De Weck 2004, Meadows and Wright 2008). Thus, designing an intervention for changing how system components interact requires the integration of multiple stakeholder perspectives, and a high level of comfort with developing and testing solutions in the face of uncertainty and continually shifting system dynamics.
The Deepwater Horizon oil spill disaster (DWH, 2010) was the largest marine oil spill, and the largest mobilization of resources to address an environmental emergency, in U.S. history (Lubchenco et al. 2012). DWH was an unprecedented disaster: oil spilled for 87 days in an extreme ocean environment, spreading rapidly throughout the water column and to the atmosphere. The constantly changing crisis presented novel policy challenges and required unprecedented engagement and collaboration among scientists from multiple disciplines across government, academia, and industry (Lubchenco et al. 2012, McNutt et al. 2012).
Large oil spills, like many environmental challenges, are often classified as “wicked problems” (Rittel and Webber 1973, Buchanan 1992). Wicked problems are extremely difficult to solve and are characterized by a lack of information, overlapping and difficult-to-map drivers, and conflicting value systems among actors (Rittel and Webber 1973, Buchanan 1992). Seven factors contributed to DWH being a wicked problem. First, DWH was the largest marine oil spill in U.S. history, and its geographic breadth, far-reaching impacts, and the urgency of the response were unprecedented (Mabus 2010). Second, there were many unknown unknowns, e.g., the flow rate of oil escaping from the damaged well, affecting multiple dimensions of the response. Third, complex dynamics within and between stakeholder groups affected the response: traditional stakeholders, e.g., multiple federal and state authorities and a designated responsible party, engaged with new stakeholders, including academic scientists. The convergence of different missions, cultures, and perceptions posed coordination challenges. These differences were sometimes magnified by 24/7 media coverage, which placed additional pressure on the stakeholders (National Commission 2011). Fourth, the human cognitive bias known as “discounting,” i.e., prioritizing present and near-term rewards over longer term, potentially greater rewards, negatively affected the amount of organizational, financial, and human resources invested in sustainable long-term solutions. Fifth, the diverse actors who became involved in the response reported to different authorities, some outside the National Contingency Plan, contributing to confusion and communication challenges. Sixth, governmental, industry, academic, and public interests had overlapping priorities, yet differing levels of ability to solve the problems that emerged during the disaster (Lubchenco et al. 2012).
Finally, extrinsic factors further influenced and complicated the implementation of solutions, including funding constraints, legal restrictions, and bureaucracy.
In examining the diverse scientific, social, and institutional complexities that amplified the challenge of responding to DWH, crisis decision makers consistently identify cross-sectoral collaboration as one of the most critical challenges that, if addressed, could greatly improve the social and ecological outcomes of future environmental crises (Lubchenco et al. 2012, McNutt et al. 2012). DWH revealed the wealth of science and technology resources available within the broader scientific community, their strong desire to help, and how crises can spur the rapid advancement of valuable new scientific knowledge. Yet it also exposed weaknesses in the system of information dissemination and exchange among scientists from government, academia, and industry (Machlis and McNutt 2011, Lubchenco et al. 2012). Traditional communication mechanisms, shaped by policy, previous experience, and culture, constrained the sharing of information between responders and the broader scientific community. Lack of communication across sectors and legal constraints complicated collaboration and the ability of academic scientists to rapidly mobilize, assist with spill response, and evaluate the potential social-ecological impacts. In the words of one scientist interviewed as part of this project and who was involved in DWH: “The many stakeholders involved did not share a common language, timeframe, set of values, or pre-existing relationships.” The lessons learned from prior spills such as Exxon Valdez (1989) were helpful, but ultimately did not create an effective infrastructure to support rapid collaboration among federal, industry, and academic scientists during a future spill. In addition, DWH was the first spill to occur at significant depth (~1500 m below the sea surface), which introduced new response challenges.
Wicked problem characteristics are common across complex environmental problems, including climate change, extractive resource use, and biodiversity loss (Levin et al. 2012). The Deep Change Method is uniquely suited to address these problems because it considers (1) system complexity, (2) the behavioral psychology of human stakeholders, and (3) adaptability to unknown unknowns (Banerjee 2015). When designers use a systems thinking lens, they can identify interventions that will influence complex, multistakeholder challenges. Both human-centered design and systems thinking can be applied to any human experience to solve wicked problems within and across domains, putting the human behaviors that drive the problem, and that are necessary for sustainably solving it, at the center of the process (Buchanan 1992). In contrast to scientific disciplines that use linear processes to distill deterministic components of a problem and a solution, a design process assumes nonlinearity and unpredictable system dynamics. It is structured to yield novel solutions that have not previously been generated or applied to the problem. The Deep Change Method, like other design methods, explicitly avoids taking any single disciplinary approach, and instead employs tools that integrate multiple points of view to address seemingly intractable challenges.
Using the Deep Change Method in the context of environmental disasters presents an opportunity: How might we design a solution to the relationship failures that emerge in high-stress, high-stakes disasters? We pinpoint why and how design research and the Deep Change Method can be leveraged to address diverse and complex environmental problems. Specifically, we present a series of methods applied to the challenge of scientific collaboration before, during, and after environmental crises, focusing on large oil spills such as DWH.
Our process consisted of six phases: challenge framing, ethnography, synthesis, concept generation and idea selection, prototyping, and testing and solution refinement (Fig. 1), using oil spills as our test case. These six phases map to the phases included in the human-centered design process and the Deep Change Method (Brown 2009, Banerjee 2015). However, we adapt them here based on the time frame and scope of our project challenge. Throughout the process, we hosted in-person workshops and conference calls with a group of advisors with broad experience with DWH and other environmental crises across government agencies and academia to guide our design. We had four target outcomes for the project focused on environmental response and remediation (instead of the engineering challenges posed by deep-water drilling, such as source containment and kill operations): (1) minimal oil exists in the marine and coastal environments because responders prevent or stop the flow of oil and/or released oil is contained; (2) the environmental and human impacts of the oil spill are mitigated through preparation, response, and/or restoration; (3) response efforts in future large oil spills are effective and efficient in achieving harm reduction goals; and (4) increased scientific understanding of environmental and human health impacts of an oil spill improves long-term ecosystem management. During concept generation and solution refinement, we vetted our material against these target outcomes, which provided the basis for excluding or including insights and ideas intended to address the challenge. For example, we rated how shifting a set of incentives or behaviors would likely impact each target outcome and then prioritized ideas that scored highly across all four.
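The outcome-based vetting step can be illustrated with a minimal sketch. The idea names and scores below are hypothetical, invented purely for illustration; the project's actual ratings were qualitative and are not reproduced here:

```python
# Hypothetical sketch: rating candidate ideas against the four target
# outcomes (1 = weak effect, 5 = strong effect). Outcome order:
# (1) minimal oil in the environment, (2) impacts mitigated,
# (3) effective/efficient response, (4) improved long-term management.
ideas = {
    "pre-crisis research partnerships": [3, 4, 5, 5],
    "rapid-contracting templates":      [2, 4, 4, 2],
    "shared data portal":               [2, 3, 4, 5],
}

def priority(scores):
    # The text prioritizes ideas that score highly across ALL four
    # outcomes, so rank by the weakest outcome first, then by the total.
    return (min(scores), sum(scores))

ranked = sorted(ideas, key=lambda name: priority(ideas[name]), reverse=True)
print(ranked[0])  # -> pre-crisis research partnerships
```

Ranking by the minimum score first (rather than the sum alone) encodes the "scored highly across all four" requirement: an idea with one weak outcome cannot win on the strength of the other three.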
At the beginning of a design project, the challenge framing should be broad enough to allow the project team to discover areas of unexpected value (Seelig 2012). Our initial framing centered on the question of how we might enable “effective, scientific communication between academic scientists and federal responders during large oil spill crises.” During preliminary challenge framing, we mapped the federal and state government and academic actors and resources for disaster preparedness and response, including the flow of funding, information, and partnerships. These maps helped us identify where new solutions were most critical. The exercise also expanded our understanding of the complex and overlapping drivers that influenced human relationships during DWH. As a result of the insights gathered during this preliminary stage, the final challenge framing was then revised and refined after the design ethnography stage.
During ethnography, a process to observe and understand the users for whom researchers are designing, two of the authors and members of the internal project team (Mease and Gibbs-Plessl) identified stakeholder interviewees. These stakeholder groups were identified during pilot interviews and through a review of the literature on preparedness and response structures. This project identified several types of stakeholders involved in the response and recovery from Deepwater Horizon: academic scientists, government scientists, industry scientists, responders, and government administrators. Key stakeholders included those (1) who directly influenced the relationship failures that occurred between government administrators and responders, government scientists, and academic scientists; (2) who had decision-making authority within the government or academic structures that our project sought to assess, e.g., elected or appointed officials, university deans; and/or (3) whose work was hampered by insufficient scientific support during the disaster response cycle, e.g., local nongovernmental organizations.
The internal project team conducted semistructured phone interviews with 72 stakeholders, including academic scientists, government agency staff, elected officials, and industry representatives (Fig. 2). The interviews included narrative-style questions similar to vignettes in sociological research (Bloor and Wood 2006) to elicit feelings and underlying interests within disaster scenarios. The majority of the questions were structured to identify the goals, motivations, and perceptions of the respondents to contribute to our understanding of the needs and mindsets driving their behavior (Table A1.1). One researcher conducted the interview while a second researcher took detailed notes. Finally, the team used existing literature on scientific crisis preparation and response from other disaster types to guide our ethnography process, such as research on the science community’s engagement during influenza pandemics and the Fukushima nuclear meltdown, as well as the broader literature on crisis communication.
The internal project team used a postinterview capture form to interpret and synthesize the insights from the interviews. The form captured a range of information: compelling insights, challenge framing, primary responsibilities of the interviewee, resources controlled, self-perceived agency of decision making in a response situation, relationship with other system stakeholders, key underlying emotions communicated, role in the system, familiarity with existing solutions, and potential new solutions envisioned for the project challenge. From this information, we created persona profiles for four primary stakeholders: (1) academic scientist local to the disaster site, i.e., working at an academic institution in the state affected by the disaster, (2) academic scientist not local to the disaster site, (3) U.S. Coast Guard administrator, and (4) National Oceanic and Atmospheric Administration (NOAA) Scientific-Support Coordinator. These profiles included the stakeholders’ motivations and goals, intrinsic and extrinsic barriers to their agency, and dominant perceptions of other primary stakeholders (Fig. A1.1). Human-centered design often employs user personas to connect solutions to the user needs refined from ethnography, and personas have been shown to improve the usability of final design products (Long 2009). In this study, persona profiles served as generalized, fictitious representations of stakeholders the project team could reference to guide evaluation of stakeholder needs, motivations, and behaviors.
From final challenge framing, shaped by interviewees, we constructed a root cause map. Root cause analysis investigates and links observable phenomena to underlying drivers. Addressing root causes can mitigate negative cascading effects and amplify positive impacts. We used the online concept mapping software Kumu (https://kumu.io/) to link the drivers of the challenge to each other and order them hierarchically, to visually depict the dominant root causes adjacent to one another in a tree format (Fig. 3).
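Conceptually, a root cause map of this kind can be represented as a directed graph in which edges point from an observable problem to the drivers beneath it; the leaves of the graph are the underlying root causes. A minimal sketch, with hypothetical node names standing in for the much larger map actually built in Kumu:

```python
# Hypothetical sketch of a root cause map as a directed graph.
# Each key is an observed problem; its list holds the drivers beneath it.
drivers = {
    "slow academic mobilization": ["no pre-existing relationships",
                                   "contracting hurdles"],
    "no pre-existing relationships": ["no joint planning between crises"],
    "contracting hurdles": [],
    "no joint planning between crises": [],
}

def root_causes(node, graph):
    """Follow driver links down to leaf nodes, i.e., the root causes."""
    children = graph.get(node, [])
    if not children:          # no further drivers: this is a root cause
        return {node}
    roots = set()
    for child in children:
        roots |= root_causes(child, graph)
    return roots

print(sorted(root_causes("slow academic mobilization", drivers)))
# -> ['contracting hurdles', 'no joint planning between crises']
```

Addressing the leaves returned by such a traversal, rather than the symptom at the top, is what the text means by mitigating negative cascading effects at their source.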
Following root cause analysis, we identified the leverage points within the system of oil spill preparation and response and the associated nongovernmental science, e.g., academic and industry science (Table 1). Leverage points are places within a complex system where a small change in one component will produce a large change across the system (Meadows and Wright 2008). After creating a list of hypothesized leverage points (Table A1.2), we narrowed our focus based on the following criteria: (1) impact (shifting this leverage point will create significant positive results across the four desired outcomes), (2) feasibility (the leverage point can be effectively shifted within 1–5 years with existing resources and minimal funding), and (3) scalability (the leverage point applies to other scales and types of disaster response situations). The team mapped the resulting strategic leverage points onto the root cause map to identify interventions that also address root causes. This process enables a design team to revisit the preliminary challenge framing and tighten it to reflect the critical root causes and stakeholder needs discovered through design ethnography.
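As an illustrative sketch, the three-criterion screen can be expressed as a simple boolean filter. The candidate leverage points and the yes/no judgments below are hypothetical, not the project's actual assessments:

```python
# Hypothetical sketch of screening leverage points on the three criteria
# named in the text: impact, feasibility (shiftable within 1-5 years
# with existing resources), and scalability to other disaster types.
candidates = [
    {"name": "joint agency-academic tabletop exercises",
     "impact": True, "feasible_1_5yr": True, "scalable": True},
    {"name": "rewrite the national contingency plan",
     "impact": True, "feasible_1_5yr": False, "scalable": True},
]

# Keep only leverage points that pass all three criteria.
strategic = [c["name"] for c in candidates
             if c["impact"] and c["feasible_1_5yr"] and c["scalable"]]
print(strategic)  # -> ['joint agency-academic tabletop exercises']
```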
We also referenced written material and distilled insights from interviews with stakeholders from analog challenge contexts. Analogs are systems that mirror a key component of the target system, and from which designers can derive ideas and reframe the problem. An example of an analog system is the public health response to the Ebola and H1N1 outbreaks, with their respective patterns of resource mobilization and allocation.
To generate solutions to the science collaboration challenges that emerge during large environmental disasters, the team identified concept generation prompts based on the strategic synthesis of the ethnography findings. Where root causes overlapped with system leverage points, the team constructed “How Might We” statements. How Might We statements frame a challenge as surmountable and spur the design team to generate a wide range of possible solutions through brainstorming. For example, the root cause of “academic scientists and government staff do not codevelop priority research questions between crisis events” is an example of a “length of delays, relative to the rate of system change” leverage point (the delay in the communication of research needs from responders to academic scientists during oil spills). The corresponding How Might We statement was “How might we enhance academic participation in preparedness and contingency planning?” For the root cause of “a lack of consistent scientific interest in applied spill response research,” the leverage point is the “size of buffers” (an increased buffer of time for spill response research between spills), and the How Might We is “How might we create long-term funding cycles for oil response research?”
The internal project team and a group of 15 expert project advisors (including coauthors Reddy, Ludwig, and Lubchenco) conducted concept generation during a two-day in-person workshop, generating 15 solution ideas. The internal project team then brainstormed over 50 additional ideas, based on the system leverage points. The 15 project advisors scored the solution ideas across four criteria: impact, feasibility, novelty, and applicability to other disaster types (on a scale of 1–5). Based on this scoring, three ideas were selected to prototype. Novelty was evaluated to gauge the innovation potential of the idea, as an indicator of shifting a system from its status quo.
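The advisor scoring step can be sketched as follows. The idea names and mean scores are invented for illustration; the actual 65+ ideas and their scores are not reproduced here:

```python
# Hypothetical sketch: mean advisor scores (1-5) per criterion for a
# handful of invented solution ideas; the top three advance to prototyping.
scores = {
    "science liaison network":   {"impact": 4.5, "feasibility": 4.0,
                                  "novelty": 3.5, "applicability": 4.5},
    "rapid-response fund":       {"impact": 4.0, "feasibility": 3.0,
                                  "novelty": 3.0, "applicability": 4.0},
    "shared equipment registry": {"impact": 3.0, "feasibility": 4.5,
                                  "novelty": 2.5, "applicability": 3.5},
    "embedded science reporters": {"impact": 2.0, "feasibility": 3.5,
                                   "novelty": 4.0, "applicability": 2.5},
}

# Sum the four criteria per idea and keep the three highest totals.
totals = {idea: sum(crit.values()) for idea, crit in scores.items()}
top_three = sorted(totals, key=totals.get, reverse=True)[:3]
print(top_three)
# -> ['science liaison network', 'rapid-response fund',
#     'shared equipment registry']
```

An unweighted sum treats the four criteria as equally important; a real screen might weight impact or feasibility more heavily.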
The internal project team built three prototypes to test in person with a broader group of users at the 2015 Gulf Oil Spill and Ecosystem Science Conference in Houston, Texas, attended by individuals across the target stakeholder groups. A prototype is an early model of a concept, process, or product developed to explore and test its efficacy in achieving the desired design outcomes (Brown 2009). For example, at the conference the primary prototyping tool was a “user journey scenario,” presented in the form of an interactive printed booklet. The prototypes were scenario-based and walked users through an experience flow of their imagined role in the new crisis decision-making system and the change in their behavior if the solution’s value was realized. The booklets solicited interaction through periodic reflection questions, both written and through verbal discussion with surrounding participants. Prototyping is useful for testing specific hypotheses about the value that a proposed solution is designed to create. Prototypes allow designers to incorporate user feedback quickly, iteratively, and inexpensively. Prototyping also helps designers identify how real-world human or system constraints compromise an idea’s integrity. Importantly, prototypes often start conversations with potential users that uncover additional insights and help inform improved solutions.
During the group workshop at the conference, 36 participants were arranged into multistakeholder tables (academic scientists, responders, and oil industry representatives) to facilitate cross-sector dialogue. The participants were guided through the interactive booklets, which posed the storyline, hypothetical decisions, and potential rewards associated with each idea. We collected participant feedback through audio recording and transcription of the participant conversations and through the written feedback each participant provided in the interactive booklets. Additional prototyping feedback was received from individual 30- to 75-minute conversations with academic and government stakeholders at the conference. We synthesized this feedback to refine the solution structure of all three prototypes.
The last phase of the design process was to select and refine a single proposed solution based on prototyping and testing with key system stakeholders. This phase also included implementation planning and identifying target outcomes for the proposed solution. Importantly, during the testing phase designers must remain curious and open to user feedback as they continue to refine the quality and power of the proposed solution. As a greater diversity and number of users are engaged, designers continually enhance the robustness of the idea, tweaking the solution to serve stakeholder needs and create additional value.
The internal project team tested the final solution idea through one- to two-hour phone interviews with 45 system stakeholders and seven webinar presentations. The proposed solution was described in detail, highlighting the value created for a variety of users and the structural components that underpin the solution’s feasibility, impact, and scalability. We asked interviewees scenario-based questions to identify how the proposed solution might or might not achieve its target value. We then refined the proposed solution based on the feedback received over the course of three months. For example, based on user feedback from government stakeholders, the project team shifted the solution’s focus from integrating nongovernmental scientific involvement in actual disaster response activities toward disaster preparedness activities, e.g., table-top exercises. In addition to phone interviews, one in-person, two-day workshop was held with nine project advisors to refine the features of the solution, build an implementation plan, and identify key benchmarks for its implementation.
The key stakeholders of the focus challenge were continuously involved in the framing, brainstorming, and testing of our project solutions. This method of “codesign” with users and stakeholders is distinct from traditional processes of product design, where interaction between the designer and user is discrete and limited. We build on the definition of codesign given by Sanders and Stappers (2009) to refer to the creative act of designers and people not trained in design intentionally working together at multiple points of the design development process. Codesign is a valuable and important aspect of systems thinking because it harnesses the insights of users directly involved in the challenge being addressed. In a systems solution, there is almost never a single “end user”; instead, there are multiple stakeholders who will interact with the solution. Designers must create distinct solution features to effectively engage and generate value for these different stakeholders. The complexity of this design space necessitates a participatory approach in which the users become coproducers of viable solutions and the design team takes the role of process facilitator. This methodology builds upon the scholarship of participatory design, which involves giving prototypes to nondesigners and adaptively incorporating their feedback (Björgvinsson et al. 2010). In the case of experience design, this can be done by iteratively proposing new decision points and revising the prototype to reflect differing stakeholder experiences. The close involvement of the project advisors in our design challenge—who are ultimately the users of the design solution itself—is also an example of using participatory design to ensure the outcome is grounded in the human needs and realistic constraints of the challenge system.
Our targeted ethnography uncovered key insights about how scientific collaboration is incentivized and motivated among governmental decision makers and academic scientists. Through interviews with key stakeholders, idea generation with project advisors, and identification of desired project outcomes, we identified a set of key system root causes and leverage points from which we could address the problem.
We synthesized the challenge framings articulated by our informants to narrow and refine the scope of the challenge. Determining the final scope of the challenge framing is critical because it determines the scope of the solution. Many interviewees drew parallels between the challenges of collaboration in marine oil spills and other environmental disasters, thus we expanded our challenge framing to include a wider set of environmental crises (chemical spills, rail-based oil spills, hurricanes, tsunamis, and severe winter storms). Further, we heard that many of the barriers to collaboration during response were rooted in the obstacles to relationship building and communication that exist between crises, i.e., before and after disasters occur. Thus, we extended our final challenge framing to include the collaboration challenges that emerge before, during, and after crises.
Our synthesis of ethnographic insights revealed a number of key system characteristics that, if achieved, may enable greater scientific collaboration between our target users: academic scientists and government agencies tasked with disaster preparation and response activities. These characteristics served as our design specifications as we generated potential solutions to our design challenge. We included a qualitative analysis of these design specifications when evaluating solutions for their impact, feasibility, and scalability. Specifically, we analyzed the extent to which the solution (1) builds trust between academia and the government response community before a crisis occurs; (2) maps to the existing motivations of target users by creating genuine and tangible value on relevant time scales; (3) remains active between crises to foster the relationships necessary to rapidly identify, investigate, and communicate about scientific unknown unknowns; and (4) decreases the effort (time and money) responders must expend on engaging with academic scientists during a response operation.
The root cause map revealed 78 system attributes that drive undesirable individual, institutional, or system behaviors relative to our target outcomes. Of the 78 root causes mapped, 27 were related to the leverage point of shifting mindsets, 23 were related to material or information flows and their buffers (elements that maintain system stability), 13 were related to feedback loops, 8 were related to changing system goals, and 7 were related to changing the rules of the system. We found that stakeholder perceptions, particularly critical or unfavorable perceptions of other stakeholder groups’ values, were impeding collaboration and relationship building. For instance, government administrators tend to operate with a mindset that they know how to manage spills without additional input from academic scientists, and academic scientists tend to have a mindset that their research will be used regardless of how they communicate and engage around that research. We also found that extrinsic forces, such as limited research funding for crisis science, legal and contracting hurdles, and time constraints, exacerbate stakeholder cultural conflicts.
Solution ideas were generated from How Might We statements based on the 20 leverage points identified as uniquely powerful in shifting crisis collaboration toward our four desired outcomes (Table 2). The final project solution explicitly incorporates features that act on 16 of these leverage points to create tangible and durable value for all system stakeholders. A full list of the 44 system leverage points identified for enabling system change can be found in Appendix A1.3.
The multistakeholder prototyping workshop at the 2015 Gulf Oil Spill and Ecosystem Science Conference was valuable for provoking generative, productive exchanges between stakeholders and identifying opportunities to improve the solution ideas by creating additional mutual value. One-on-one user testing interviews were also important for receiving targeted feedback on specific solution components. We observed stakeholders adopting a focused, creative mindset when given a tangible solution prototype to interact with, as opposed to discussing solutions in the abstract, a frequent pitfall of solution processes for complex systemic challenges. The workshop and other small group prototyping sessions also provided a space to initiate and strengthen cross-stakeholder relationships important for solution implementation.
The final solution selected by the internal planning team was the Science Action Network, a community of academic and professional scientists who are linked to regional government planning and response bodies to coordinate and streamline scientific input for decision making before and during disasters. The Science Action Network would enable increased cross-disaster preparedness and would support response decision making through novel academic-agency partnerships, resource sharing, and coordination of scientific input (see Table 3 for descriptions of how the Science Action Network meets the project design specifications).
The project team selected the Science Action Network as the final project solution based on prototyping feedback and our filtering criteria (Table 4). Based on what we heard from key stakeholders while testing the network’s potential value, we adapted the network to be more heavily focused on precrisis relationship building through collaborative research and preparedness efforts. Further, we refined the network structure to include both a geographic organization (via 10 hubs) and a disciplinary organization, e.g., academics in the field of crisis communication connected across regions. A disciplinary structure enables liaisons to streamline identification of disciplinary expertise necessary for a particular incident, which may not exist locally. It also ensures cross-pollination of key research priorities, opportunities, and accomplishments across the network by piggybacking on existing disciplinary social networks and professional societies.
Our methodology was well suited to the complexity and difficulty of our project challenge. The human-centered design and systems thinking techniques we used uncovered the goals and perceptions of our target users and informed a final solution grounded in the real-world needs and behaviors of those who have the potential to improve crisis decision making through increased cross-sectoral scientific collaboration. We found that stakeholders in the system of scientific decision making for disasters have differing motivations to engage and, partly because of extrinsic forces such as funding and intrinsic cultural forces, lack the trust necessary for collaboration or communication in high-stakes contexts. These insights illuminated why collaboration succeeded in some instances and failed in others during Deepwater Horizon, and provided a springboard for reimagining how we might create more consistent, mutual value among stakeholders. Finally, our codesign approach enabled rapid iteration on our key insights, the generation of ideas grounded in the perceptions of the target users, and greater buy-in to the final solution.
Our ethnography uncovered goals, motivations, perceptions, and structural barriers impeding consistent collaboration among our target stakeholder groups before and during large environmental crises. First, a profound perception gap exists between academic scientists and government administrators and responders, leading to missed opportunities for collaboration before, during, and after a disaster event. Of the 82 root causes identified, the highest proportion related to the entrenched mindsets and perceptions of stakeholders regarding the role of science during response and the value of sharing information across sectors (n = 27). This gap in perceptions about the role of science in decision making, expectations for collaboration, and the potential mutual value that could be created among stakeholders hinders trust building. The gap is exacerbated because the two groups have dramatically different reward systems and priorities, which deepen the cultural divide between them and weaken their willingness to engage productively. We also found that extrinsic forces, such as limited funding for rapid, opportunistic science during crises, legal constraints around governmental data transparency or academic contracting, and time constraints inherent to any crisis response, all amplify cultural conflicts and impede the ability of stakeholders to collaborate. By mapping how these forces affected collaborative relationship building, we were able to pinpoint the root causes of our challenge and craft a solution that addressed both the structural barriers to collaboration, e.g., lack of communication channels, and the cultural barriers erected by polarized mindsets, e.g., the mindset of some government staff that there is no role for academic scientists in response decision making.
Using the Deep Change Method, which combines human-centered design and systems thinking techniques, to address our complex project challenge revealed how these tools may have broad relevance to other socio-environmental challenges. Complementing the traditional, assessment-based research process with design methods and principles may be a powerful approach for integrating the real-world behaviors and motivations of the people affected by or driving system interactions. Methodologically, we took away several key lessons.
Design is, by nature, a flexible method that may look dramatically different in theory and practice across problems. When seeking solutions to complex system challenges, design processes rely methodologically on the ability to prototype in real-world conditions. If the design team does not have access to end users or real challenge scenarios in which to prototype ideas, the effectiveness of the final solution is limited. In this case study, we were unable to test the final solution in a real-world disaster context, and our prototyping was limited to likely scenarios and stakeholder memories of past incidents. Future work would benefit from testing our final solution during an actual event or interagency tabletop exercise. Finally, the fields of human-centered design and systems thinking often use language and tools that can render the process impenetrable to stakeholders unfamiliar with it. It is therefore critically important to involve stakeholders early, provide clear definitions, and cocreate process goals to ensure buy-in and meaningful participation from participants and codesigners alike.
The fields of human-centered design, systems thinking, and the Deep Change Method are still nascent. Further research is necessary to enhance the credibility, usability, and efficacy of the tools and processes used in this study. We outline five dimensions of a research agenda that we believe are uniquely valuable in advancing the theory and practice of complex systems thinking.
First, research is needed to understand how engaging stakeholders from analogous challenge types, e.g., representatives from medical crisis response, is useful in each phase of the design process. Seeking inspiration from analogous challenges is a basic innovation tool for designers because it seeds the process with new ideas and helps ensure solutions scale to other contexts. In our project, we conducted literature reviews and held several conversations with experts in the fields of nuclear disaster management, public health epidemics, and cybersecurity threats. However, we did not invite stakeholders from these analogous sectors to participate as core project advisors or as participants in project workshops. It would be valuable to understand how such analogous stakeholders or experts influence the design process and the novelty of the solutions produced.
Second, research is needed to test and identify additional techniques for multistakeholder system prototyping. Creating a tangible product or experience to represent a system intervention is difficult and complex. More tools are needed for prototyping systemic solutions, and for understanding their efficacy in testing solution assumptions at both the human and system scales.
Third, codesign is becoming more widely applied across multiple fields, e.g., policy creation and urban planning, and research is needed to test and compare codesign practices and principles. Additionally, little is known about the long-term impact of the participatory design process on the stakeholders who take part in it: how it changes their perceptions of the challenge and the system, and of themselves and their own motivations.
Fourth, there are few metrics for evaluating the diffuse and often indirect impacts of system-oriented design solutions, or for holding the process and designers accountable for those impacts. Designers often see their work as done once the solution proposal is finished and are minimally involved in solution implementation. In complex systems, it is difficult to assess the distributed effects of interventions. Designers need methods for tracking the impact of implemented solutions, both to assess the efficacy of their methods and to increase their connectivity with stakeholders who are involved in or affected by the implementation of their proposed solutions.
Finally, as systems thinking tools and methods proliferate across disciplines and problem contexts, research on the efficacy of mixed qualitative and quantitative methods is needed. Design is both an art and a science. A better understanding of how various methodologies influence design outcomes would be valuable, particularly as design metrics and standards for accountability become formalized.
In this study we provide a framework for using design methods in novel and diverse applications across the environmental and social sectors. Using oil spills as our focal case study, we demonstrated the value of the Deep Change Method, which combines human-centered design, behavioral psychology, and systems thinking, for creating a systemic solution for scientific collaboration during environmental crises. Codesign methods created active buy-in to the final solution from relevant stakeholder groups. Project interviewees identified human relationships, organizational culture, and trust as critical barriers to collaboration; thus, human motivations and organizational culture were centered in the final solution concepts. Most problems are, at their root, caused by humans or can be remedied by them, so integrating human behavior and incentives into solution design is critical for addressing them. Finally, the final solution is efficient and innovative because it leverages existing resources and stakeholder motivations to generate new value for multiple stakeholder groups. These outcomes demonstrate the potential for the Deep Change Method to be applied to many complex social and environmental problems. We hope that some of the tools and insights discussed here serve as a foundation for considering the system complexity that drives human challenges.
This project began during a seminar about the Deepwater Horizon oil spill disaster that was given by JL when she was the Peter and Mimi Haas Distinguished Visitor in Public Service at Stanford in the spring of 2013. TGP participated in the seminar and proposed the idea of using the ChangeLabs methodology to identify ways to improve future disaster response. The internal project team is deeply grateful to Tara Adiseshan for their collaboration in shaping the early stages of this project. The authors thank the David and Lucile Packard Foundation for a grant to undertake this project and enable participation of a wide range of participants and interviewees. We thank the Center for Ocean Solutions and ChangeLabs for their oversight and support. And finally, we are indebted to our project advisors for their many hours of insightful guidance, which helped shape our final solution: Jane Lubchenco, Thad Allen, Marcia McNutt, David Kennedy, Chris Reddy, Steve Murawski, Dave Westerholm, Dana Tulis, Debbie Payton, Scott Lundgren, Nancy Kinner, LaDon Swann, Gary Machlis, and Bob Haddad. We are pleased to acknowledge that the project is now led by Nancy Kinner at the University of New Hampshire, with colleagues at NOAA, USCG, and DOI, who are implementing the Science Action Network and other recommendations we identified.
Arnold, R. D., and J. P. Wade. 2015. A definition of systems thinking: a systems approach. Procedia Computer Science 44:669-678. http://dx.doi.org/10.1016/j.procs.2015.03.050
Banerjee, S. B. 2014. Large scale integrated innovation. Pages 71-87 in C. Bason, editor. Design for policy. Ashgate, Farnham, UK.
Banerjee, S. B. 2015. Our methodology: Deep Change Model. Stanford University, Stanford, California, USA. [online] URL: http://changelabs.stanford.edu/resources/our-methodology
Björgvinsson, E., P. Ehn, and P. Hillgren. 2010. Participatory design and “democratizing innovation.” Pages 41-50 in Proceedings of the 11th Biennial Participatory Design Conference. Sydney, Australia. http://dx.doi.org/10.1145/1900441.1900448
Bloor, M., and F. Wood. 2006. Keywords in qualitative methods: a vocabulary of research concepts. Sage, London, UK. http://dx.doi.org/10.4135/9781849209403
Brown, T. 2008. Design thinking. Harvard Business Review June:1-10.
Brown, T. 2009. Change by design: how design thinking transforms organizations and inspires innovation. Harper Business, New York, New York, USA.
Brown, T., and J. Wyatt. 2010. Design thinking for social innovation. Stanford Social Innovation Review 31-35. [online] URL: https://ssir.org/articles/entry/design_thinking_for_social_innovation
Buchanan, R. 1992. Wicked problems in design thinking. Design Issues 8:5-21. http://dx.doi.org/10.2307/1511637
Giacomin, J. 2014. What is human centred design? Design Journal 17:606-623. http://dx.doi.org/10.2752/175630614X14056185480186
Intergovernmental Panel on Climate Change (IPCC). 2012. Managing the risks of extreme events and disasters to advance climate change adaptation. A Special Report of Working Groups I and II of the Intergovernmental Panel on Climate Change. C. B. Field, V. Barros, T. F. Stocker, D. Qin, D. J. Dokken, K. L. Ebi, M. D. Mastrandrea, K. J. Mach, G.-K. Plattner, S. K. Allen, M. Tignor, and P. M. Midgley, editors. Cambridge University Press, Cambridge, UK.
Janssen, M., J. Lee, N. Bharosa, and A. Cresswell. 2010. Advances in multi-agency disaster management: key elements in disaster research. Information Systems Frontiers 12:1-7. http://dx.doi.org/10.1007/s10796-009-9176-x
Jones, P. H. 2014. Systemic design principles for complex social systems. Pages 91-128 in G. S. Metcalf, editor. Social systems and design. Springer, Tokyo, Japan. http://dx.doi.org/10.1007/978-4-431-54478-4_4
Levin, K., B. Cashore, S. Bernstein, and G. Auld. 2012. Overcoming the tragedy of super wicked problems: constraining our future selves to ameliorate global climate change. Policy Sciences 45:123-152. http://dx.doi.org/10.1007/s11077-012-9151-0
Long, F. 2009. Real or imaginary: the effectiveness of using personas in product design. Proceedings of the Irish Ergonomics Society Annual Conference 14 May, Dublin, Ireland.
Lubchenco, J., M. K. McNutt, G. Dreyfus, S. A. Murawski, D. M. Kennedy, P. T. Anastas, S. Chu, and T. Hunter. 2012. Science in support of the Deepwater Horizon response. Proceedings of the National Academy of Sciences 109:20212-20221. http://dx.doi.org/10.1073/pnas.1204729109
Mabus, R. 2010. America’s Gulf Coast: a long-term recovery plan after the Deepwater Horizon oil spill. United States Environmental Protection Agency, Washington, D.C., USA.
Machlis, G. E., and M. K. McNutt. 2011. Ocean policy: Black swans, wicked problems, and science during crises. Oceanography 24:318-320. http://dx.doi.org/10.5670/oceanog.2011.89
Magee, C. L., and O. L. De Weck. 2004. Complex system classification. INCOSE International Symposium 14:471-488. http://dx.doi.org/10.1002/j.2334-5837.2004.tb00510.x
McNutt, M. K., S. Chu, J. Lubchenco, T. Hunter, G. Dreyfus, S. A. Murawski, and D. M. Kennedy. 2012. Applications of science and engineering to quantify and control the Deepwater Horizon oil spill. Proceedings of the National Academy of Sciences 109:20222-20228. http://dx.doi.org/10.1073/pnas.1214389109
Meadows, D. H., and D. Wright. 2008. Thinking in systems: a primer. Chelsea Green, White River Junction, Vermont, USA.
National Commission. 2011. Deep Water: the Gulf oil disaster and the future of offshore drilling. Report to the President. National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling.
Razzouk, R., and V. Shute. 2012. What is design thinking and why is it important? Review of Educational Research 82:330-348. http://dx.doi.org/10.3102/0034654312457429
Rittel, H. W. J., and M. M. Webber. 1973. Dilemmas in a general theory of planning. Policy Sciences 4:155-169. http://dx.doi.org/10.1007/BF01405730
Sanders, E. B.-N., and P. J. Stappers. 2008. Co-creation and the new landscapes of design. CoDesign 4(1):5-18. http://dx.doi.org/10.1080/15710880701875068
Santo, A. R., M. G. Sorice, C. J. Donlan, C. T. Franck, and C. B. Anderson. 2015. A human-centered approach to designing invasive species eradication programs on human-inhabited islands. Global Environmental Change 35:289-298. http://dx.doi.org/10.1016/j.gloenvcha.2015.09.012
Seelig, T. 2012. InGenius: a crash course on creativity. Harper One, New York, New York, USA.
Sorice, M. G., and C. J. Donlan. 2015. A human-centered framework for innovation in conservation incentive programs. Ambio 44:788–792. http://dx.doi.org/10.1007/s13280-015-0650-z