Participatory processes have become popular in environmental management and governance policy in recent decades (Reed 2008, Bryson et al. 2013, Bobbio 2019). From an instrumental or management perspective this reflects both a recognized need to link knowledge systems and the desire to generate broad-based support for public decisions in these areas (Benham 2017). From a normative perspective the democratic dimension of participation is equally important, with a focus on negotiating colearning, coproduction, and codevelopment in the journey together (Reed 2008, Le Heron et al. 2019a). Given these divergent rationales, claims about and definitions of success will vary widely.
Despite varying rationales for the use of participation, it is often referred to as though it were a specific identifiable action or task. We stress that regardless of social purpose, participatory processes often involve many work streams and multiple stakeholders. Such processes have implicit objectives relating to membership formation, dynamics, relationships among the different groups involved, and collective action aims that are integral to the effectiveness of attaining their outcomes (Israel et al. 1995, Schulz et al. 2003). These elements must be assembled in place-specific ways that play out over time, featuring distinct temporalities, territorialities, and conceptions of success. This complexity demands practical guidance for those embarking on, or engaged in, the management of these participatory processes. In practice, however, the management of participatory processes is often hampered by a lack of clarity around the different components that underpin them, and a lack of tools to guide and evaluate progress in these areas (e.g., Piltan and Sowlati 2016).
Against this background we explore the development of a rubric as a design and assessment framework to support the management of these processes. Rubrics provide a methodology for articulating the key elements of a task or behavior so that they can be evaluated against desired outcomes or demonstrations of different levels of competence (Allen et al. 2018). Moreover, engaging those involved in participatory processes in the development of such assessment frameworks also helps them to talk about the different social, cultural, political, technical, and management dimensions within the setting of their own process (Allen and Knight 2009). These guided discussions can help actors in participatory processes assess and develop skills and pathways to improve the quality of their activities over time.
We begin by introducing marine spaces as an arena for multistakeholder involvement in decision making in Aotearoa New Zealand[1] (Aotearoa NZ), and the trend toward increased use of participatory processes in these contested spaces. We describe the wider research program informing the paper, the project involving trialing participatory processes relating to Ecosystem Based Management (EBM), and our choice of five case studies to examine participatory processes. We outline our approach and introduce rubrics as an instructional and assessment tool to help participants develop awareness of participatory processes. We develop an indicative rubric for a generic participatory process and outline how it is both a process and a product with distributed effects. We end by discussing the benefits and challenges of using rubrics to help different stakeholder groups work collaboratively toward governing objectives. We argue that self-defined, ongoing learning evaluation, where criteria for success are debated and created by users, contributes to empowering participatory processes. Success seen in this way is not about meeting an externally imposed set of standards, which characterizes bureaucratic governance, but about meeting a set of standards developed collectively by those involved to best meet their needs.
In the 21st century, the “participatory turn” in coastal nations around the world has been directed toward resolving contestation regarding multiuse/r coastal and marine spaces (Pomeroy and Douvere 2008, Parlee and Wiber 2018). Similarly, since at least 1990, participatory processes have been a strong feature of marine planning and management around Aotearoa NZ[2] (Bremer and Glavovic 2013, Davies et al. 2018). The country’s marine spaces have a strong indigenous interpretative tradition of decision making, with active community involvement, deliberation, and consensus building (Waitangi Tribunal 1988). Accordingly, the contemporary Aotearoa NZ scene is quite distinctive in its coexistence of Māori and non-Māori worldviews, types of knowledge, and modes of governing (Le Heron et al. 2019a), facilitated by the Treaty of Waitangi legislated partnership model of cogovernance between iwi (Māori tribe/s) and the Crown.
The research for this paper emerges from the “Testing EBM-supportive participatory processes for application in multi-use marine environments” project[3] (henceforth the Participatory Processes project). EBM is a knowledge corpus that recognizes the full array of interactions within an ecosystem, including human and more-than-human actors, rather than considering issues, species, or services in isolation. The project examined how participatory processes, as social technologies of governance, contributed to (or hindered) EBM on a number of fronts. This focus highlighted the tensions and interplays between top-down governance associated with existing institutions and bottom-up, relatively autonomous governance contributions as exemplified by participatory processes. Despite governance being one of many recognized dimensions of EBM (Long et al. 2015, Hewitt et al. 2018), it has been largely overlooked, arguably because governance goes beyond the biophysical concerns that are largely the substance of science efforts (Parlee and Wiber 2014, Gluckman 2018, Foley et al. 2020). However, insights from both the wider participatory literature and socio-theoretical writings dealing with organizing governance in neo-liberalizing conditions reveal a convergence on making power and politics key in establishing governance framings and in translating governance concerns in management settings. The emphasis, however, has been on critically documenting the design of externally conceived technologies relating to organizations and industries[4].
EBM is often held to be a coherent set of principles that should be bureaucratically enforceable through top-down technologies and replicable from place to place (Long et al. 2015). However, recent literature suggests otherwise, arguing instead for a recognition of multiple possible EBM practice starting points and directions (Gelich et al. 2018, Le Heron et al. 2019a). These realizations had implications for how we positioned the rubric research reported in this paper. This conceptual realignment revealed lacunae in the literature around the assessment of proposed and implemented EBM approaches from bottom-up perspectives, and throws open the very idea of naming and proclaiming success. The project confronted the dual, frequently intersecting, and situated moves to support both top-down and bottom-up governance of EBM, which constantly brought to center stage claims by different interests and stakeholders about the effectiveness and success of the contrasting approaches. We focus on the development of rubrics to facilitate discussions about what success might look like to different interests in these contested settings, and to determine whether and how success, however formulated, is playing out in practice.
The research approach was developed to be attentive to a Treaty partnership mode, distinguished by attention to Māori voices and self-governance and collaboration with participants (Le Heron et al. 2020a). The wider project (Fig. 1) began by identifying and mapping marine-based participatory processes by regional council area, compiling background information on their origins, evolution, purpose, and so on (Le Heron et al. 2019a). The initial list comprised mainly single-issue processes, although a few were wider in scope and extremely complex. This initial scoping led to a short list of 15 multiuse/user case studies for which document analysis was undertaken, with five case studies chosen for in-depth research. The five case studies covered a range of issues: from developing processes and mechanisms to holistically support a harbor, contesting Environmental Protection Authority (EPA) decisions, and developing marine spatial plans, to engaging with notions of ownership and belonging (Le Heron et al. 2020b).
As part of a semistructured interview process, three to six people for each case study were interviewed individually. The emergence and internal and external dynamics of the participatory processes were probed by asking why, what, by whom, for whom, where, and with what implications various decisions were being made about EBM, and what barriers were hindering implementation. These questions were intended to reveal aspects of the interface between governance issues and the actual “how” and “how nots” of management. In total 31 in-depth interviews were undertaken and then transcribed. Following the interviews, a thematic analysis revealed aspects of participatory processes evident across the case studies. Detailed tables were developed to illustrate cross-sectional relationships between thematic issues, actors, place, and the Aotearoa NZ context. These analyses were robustly discussed by the research team. This comparative approach allowed new insights to emerge and to crystallize into the key ingredients of participatory processes outlined in Le Heron et al. (2019b). These were subsequently reformulated as key performance criteria for the participatory processes rubric developed in this paper. The case studies each experienced different process needs, depending on their particular time and place. This recognition that not all needs are experienced at the same time is why we developed a diagram focusing on “ingredients for negotiated change”, and why we take a broad approach to both the ingredients/criteria and the rubric.
The diverse and independent character of the participatory case studies encouraged a customized interviewing approach to reveal insights into relationships, power, and context. The methodology conferred flexibility in accessing insights such as what ingredients might be necessary for participatory process achievement, who was experiencing greater or lesser satisfaction, and what changing experiences in this process look like. The reflexive nature of the conversational interviews, where the interviewers and those being interviewed added to a deep and productive dialogue, meant that momentum grew during the interviews, fresh perspectives were offered, and contrary views could be gently teased out. It also revealed sometimes contentious power-politics and conflicts, which when voiced carried implicit judgements about success expectations.
The rubric was developed using data from the research and analysis phases of the Participatory Processes project (Fig. 1) and associated literature. We have therefore briefly outlined the methodologies involved in those earlier phases to make clear that the research and thematic analysis underpinning this paper rest on a sound and rigorous methodology. This paper, however, seeks to extend that previous work and develop a rubric to help practitioners further explore the process side of participation. See Figure 1 for where this paper fits within the wider project.
An action research-based approach (Kemmis 2009) centered on rubric construction was used to develop practical lessons and constructive practice change from our case studies. Action research seeks transformative change through the simultaneous process of taking action and doing research, linked together by critical reflection (AERA [date unknown]). This action is simultaneously directed toward self-change and toward restructuring the institutional or cultural setting within which the practitioner works. Thus the aim of action research is not just to understand the social and organizational arrangements in place, but also to effect change as a path to generating new knowledge about participation and collaboration and to empower the participants in the study (Huang 2010). The process is thus potentially performance enhancing, especially as it supports reflective practice (Plummer and Armitage 2007, Podestá et al. 2013). This participatory approach to developing a rubric supports researchers to critically reflect on factors that foster or impede cooperative production of knowledge, and to change their practices accordingly.
A rubric is a form of assessment that can also be thought of as a guide or an evaluation tool that lists specific criteria for assessing performance. Developing rubrics involves articulating and clarifying “the things that matter” in a complex task or behavioral platform, which can encompass aspects related to the performance, quality, usefulness, and effectiveness of the initiative’s activities, services, or products (Allen et al. 2018). These aspects are not necessarily “things that can be counted” but are elements that are considered by those involved in the project as important to pay attention to.
Developing rubrics requires defining the task or behavior to be assessed, which involves specifying both the criteria and the assessment scales (Andrade 2000). First, a list of criteria to be assessed needs to be defined; these should represent the component elements required for successful achievement of the task to be rated. This can include consideration of outputs (things completed) and tasks/processes (level of participation, required behaviors, etc.). Second, assessment scales need to be developed. These scales should be based on gradations of quality that describe how well any given task or process has been performed, e.g., excellent, good, adequate, or poor.
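As a concrete illustration of these two building blocks, the minimal sketch below (in Python, purely illustrative and not part of the authors' method) represents a rubric as a list of criteria plus an ordered assessment scale. The criterion names, descriptions, and indicators are hypothetical examples that loosely echo criteria discussed later in this paper; a real rubric would be codeveloped by participants.

```python
from __future__ import annotations

from dataclasses import dataclass, field

# Gradations of quality, ordered from weakest to strongest, matching the
# scale described above (poor, adequate, good, excellent).
ASSESSMENT_SCALE = ["poor", "adequate", "good", "excellent"]


@dataclass
class Criterion:
    """One component element required for successful achievement of the task."""
    name: str                      # e.g., "Shared goals and visions" (hypothetical label)
    description: str               # what this criterion covers
    success_indicators: list[str]  # outputs and processes that would signal success


@dataclass
class Rubric:
    title: str
    criteria: list[Criterion] = field(default_factory=list)
    scale: list[str] = field(default_factory=lambda: list(ASSESSMENT_SCALE))


# Illustrative criteria only; participants would negotiate their own.
example = Rubric(
    title="Participatory process rubric (sketch)",
    criteria=[
        Criterion(
            name="Shared goals and visions",
            description="Time spent up front defining goals, processes, and evaluative metrics",
            success_indicators=["agreed long-term goals", "shorter term action plans"],
        ),
        Criterion(
            name="Inclusion and diversity",
            description="Who is around the table, and who is silent or absent",
            success_indicators=["partner and stakeholder map", "Treaty partners engaged"],
        ),
    ],
)

for criterion in example.criteria:
    print(f"{criterion.name}: rate as one of {', '.join(example.scale)}")
```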
Codeveloping rubrics helps clarify the expectations of those involved for different aspects of performance by providing detailed descriptions of collectively agreed-upon measures. Rubrics not only formulate standards for key areas of accomplishment, but can be used to make these areas clear and explicit to all those with an interest in improving participatory process performance. They can also operate as an instructional tool to support planning by helping participants understand the targets for their learning and the standards of quality for an assigned task (Allen et al. 2018). Equally, they can help participants make transparent and informed judgments about their own work that can inform revision and improvement (Reddy and Andrade 2010). We found that using governance-oriented quotes from the case study interviews was a powerful way to ground the formulation of criteria for the rubric, give interpretive depth to discussion, and provide reflections on the nature of claims about success.
There were three main steps in the development of our participatory processes rubric (Fig. 1). We developed key performance criteria from our New Zealand case study work. We then triangulated these criteria against the participatory process literature. Finally, we constructed the rubric itself in the course of writing this manuscript.
The process of developing this rubric for participatory processes was designed to support critical reflective practice by the research team. By critical we mean reflection that goes beyond the more typical process questions of what worked and what did not (Chiu 2006), to consider a broader range of social dynamics influencing the processes. Individual discussions and focused workshops were used to develop the criteria to be assessed, and to provide an initial introduction to the development of rubrics. This mixed-method approach to generating critical reflection and encouraging active involvement in the rubric and paper writing was chosen as an efficient and effective way to meet the time and budget constraints facing a multidisciplinary social science research team dispersed around the country.
Developing key performance criteria is the first step in rubric development. As per Figure 1, this is not discussed in detail here: the criteria were developed by the research team using thematic analysis (Le Heron et al. 2019b). The thematic analysis was grounded in the case studies, conference and workshop feedback, related conversations, and original research, and was developed into a graphic, termed the Ingredients Figure (published elsewhere: Le Heron et al. 2019b). The ingredients/criteria[5] were presented to and workshopped with practitioners and actors from a range of backgrounds who are also involved in or interested in participatory processes[6]. This provided a first check on the applicability of the criteria. The feedback we received indicated an appetite for this kind of guidance to help expand practitioners’ views of the range of functions that underpin successful participatory processes. At this point we identified that further work on the criteria was necessary before they could be used as a tool for assessing process. We thus began the rubric creation process.
The literature review and associated discussions, which also built on the research team’s experience, suggested that two other key criteria underpinning good practice in managing participatory processes were (i) learning-based monitoring and evaluation; and (ii) consideration of different phases or steps in any participatory process cycle. These elements were subsequently added to the mix. Figure 2 outlines the final criteria underpinning the rubric, acknowledges a feedback loop of reflection and evaluation, and provides a space to consider the importance of phases and timing for participatory process activities.
Step two in the rubric development process was to triangulate our criteria against international scholarship in this area both to ascertain that our choice of criteria was robust for use in a rubric-type assessment tool, and to begin to identify how good practice in each of these areas might manifest. Each of the criteria is expanded upon below with reference to both team discussions and participatory process literature, highlighting salient points that could usefully help practitioners and proponents of participatory processes. These criteria iterate and link with each other and should be read accordingly.
In a participatory process, different stakeholder groups can share their ideas about the future of the marine environment, discuss interests in common, and identify shared long-term goals. Developing shared goals is especially challenging in marine contexts where there is little regulation or, conversely, where there is extensive regulation that is both disconnected and overlapping, making lines of responsibility and accountability unclear. It is important to spend time up front defining goals, management processes, and evaluative metrics (Tholke 2003, van Huijstee et al. 2007). In turn, initial dialogue enables better appreciation of each other’s values, meaning systems, aspirations, and expectations (Oliver 2002, San Cristóbal Mateo et al. 2017). Once a set of agreed goals has been identified, these can be translated into shorter term action plans for that environment (Kusters et al. 2018). Although the objective of stakeholder dialogue is to build shared understanding and a platform for collaboration, it is also a forum for different stakeholders to move beyond a focus on general principles to develop a package of solutions that meets the different needs of those involved, and can foster cooperation and ongoing active engagement (Lynam et al. 2007).
The success of place-based processes often depends on the balance between the goals chosen and the social and institutional contexts that underpin the opportunities and constraints of participatory processes (Eastwood et al. 2017, Baker and Chapin III 2018). Social and cultural contexts relate to, among other things, the traditions, social networks, property rights, and peer influences that affect the willingness of individuals, groups, communities, and cultures to participate and the ways in which they do so. Institutional contexts include organizational culture, such as openness, statutory obligations, the strength of organized interests, and the geographic scales at which problems can be addressed (Reed et al. 2018). Individuals, whether in organizations or members of broader stakeholder groupings, will respond to their own history of experience with engagement, fatigue, funding shortfalls, or their trust/distrust of the other parties and individuals involved (De Vente et al. 2016). Many of these factors can be addressed by improving the design of more effective processes that include skills, capacity, and capability building (Le Heron et al. 2011). “Connections” refers to the fact that participatory processes exist and work within a patchwork of other participatory initiatives: they connect, interconnect, and have dependencies and synergies. The multiplicity and relevance of different histories in any given context must also be acknowledged and addressed, especially where there are concerns and tensions over past grievances and the priorities of indigenous peoples in land-sea interactions.
Who is around the negotiation table—and who is silent or silenced or absent—critically affects the dynamics of participatory processes and their outcomes (Kaufman et al. 2014). Who should be involved in a participatory process is a relative judgement and context-specific; a stakeholder or partner is always defined relative to the particular issue or goal that the process is set up to achieve, and this will be place-specific. Project stakeholders or partners may also change as particular issues evolve over time. The term stakeholder is commonly used in situations where industries or government agencies are negotiating an outcome, but there is also a growing desire to recognize a human rights-based approach to development, particularly in respect of indigenous peoples (Rodhouse and Vanclay 2016). In Aotearoa NZ the Treaty settlement process goes further than requiring consent, as suggested by the UN Declaration on the Rights of Indigenous Peoples (United Nations 2007), and recognizes that Māori are “partners” who expect to be able to exercise rangatiratanga or authority in decision making in the management and sustainability of a natural resource as a right, not only because of their long-term presence in a location, but also because of their responsibilities to future generations (Ruckstuhl et al. 2014). The usual silence of future generations is thus partly addressed by a kaitiakitanga[7] approach.
Geography, functional groupings and sectors, local politics and history should all be used to construct partner and stakeholder maps (Glicken 2000). One new and emerging approach is the use of legal personality to protect water systems in law through the granting of legal rights to rivers and other iconic environmental entities and ecosystems (O'Donnell and Talbot-Jones 2018). In such ways, the issue of the silence of those who cannot speak (more-than-human, rivers, oceans, etc.) can be partly addressed.
Participatory processes are collective journeys that take time. Experience shows that they need to be designed with expectations of a shared outcome, even though this cannot be fully specified in advance (Fraser et al. 2014). Although local context affects process outcomes, process design matters more (Newig et al. 2016), highlighting the value of comprehensive and broad engagement for robust decision making. Three broad process principles underpin more successful outcomes: (i) who participates; (ii) how communication and decision-making processes among participants are organized; and (iii) how process discussions are linked to policy and management action (Fung 2006). In turn, these principles are enacted through specific process elements, such as problem scoping, stakeholder selection, professional facilitation and engagement skills, allowing the time and scope needed for group building, a mix of formal and informal engagement spaces, social learning processes and frameworks, spaces that support reflection on task and process, and acknowledging the stages of process (van de Kerkhof and Wieczorek 2005, Allen et al. 2011, Schauppenlehner-Kloyber and Penker 2015). Around the world there are now numerous technical guides to best practice participatory process design and facilitation (e.g., https://www.theweave.info/images/TheWeave-V1-High-July2011.pdf; von Korff et al. 2010, Te Tari Taiwhenua/Department of Internal Affairs [date unknown]). However, there remain institutional and organizational cultures that are dismissive of participation.
Social learning is foundational for participatory approaches seeking to manage complex environmental problems within their larger social context (Stringer et al. 2006). Social learning emphasizes social interactions among stakeholders, reflection on what is being learned, and iterative attempts to apply what is being learned to the problem or opportunity under discussion (Whitfield and Reed 2012, Bautista et al. 2017). Increasingly, local and indigenous types of knowledge are recognized in these processes alongside conventional scientific knowledge in natural resource management decision making (Kettle et al. 2014, Mantyka-Pringle et al. 2017). However, in the court-centered planning environment of Aotearoa NZ, scientific evidence-based knowledge is still privileged over local and indigenous knowledge (Hikuroa et al. 2021). Social learning and drawing from and building on all knowledge systems can be a struggle given the tendency to privilege science and the techno-scientific-legalistic discourse that prevails in the regulatory sphere of environmental decision making (Parlee and Wiber 2014).
Within any knowledge system there are both tacit and explicit types of knowledge (Kothari et al. 2012). Tacit knowledge is informed and influenced by values, i.e., preferences relating to actions or particular outcomes; and associated world views, i.e., the frameworks that people use to make sense of the world, e.g., kaitiakitanga, ecosystem-based management. Explicit knowledge is often in the form of information (that can be tested and is drawn from observations or experiments, including mātauranga Māori[8]). It follows that all available types of knowledge should be at the table and be given due weight. Supportive tools that facilitate integrative negotiation processes contribute to helping groups work through a collaborative and experiential learning cycle (Barnaud and van Paassen 2013).
Power is often thought of as individuals or groups exercising influence, control, or authority over others. However, a wider conception stresses intentionality, discourse, and political and economic interests (Brown and Dillard 2015). Top-down approaches to expert analysis and stakeholder engagement can be disempowering and politically marginalizing. Developing clear, authoritative, and prescriptive recommendations often comes at the expense of evaluative diversity that recognizes competing views (Stirling 2008). Participatory processes have developed in response to a quest for a more inclusive and empowering involvement of a wider range of interested and affected parties (Allen et al. 2014, López-Bao et al. 2017). Participatory process proponents and designers aim to bring together multiple stakeholders in an attempt to account for diverse social perspectives, opinions, and values (Bixler et al. 2015). They do so in an effort to enable more equitable distribution of benefits, costs, and obligations, and to strengthen the legitimacy of decisions in a transparent and trust-building process (Turner et al. 2016).
Just including multiple interest groups, however, does not ensure good governance (Rauschmayer et al. 2009, Felipe-Lucia et al. 2015). Many participatory processes incorporate power asymmetries and undemocratic exclusions, often by default rather than intention (McGuire 2006, Kallis et al. 2009). Genuinely encouraging and facilitating legitimate, equitable, and democratic processes is challenging, given it is a “how” question that few authors address (Johnson et al. 2004, Barnaud and van Paassen 2013).
As Barnaud and van Paassen (2013) point out, if designers claim to have a neutral posture and be independent of particular interests, they run the risk of being seen as manipulated by the more powerful stakeholders, and if they decide to be nonneutral and empower some particular stakeholders, their legitimacy to do so can be questioned. Solutions to this dilemma are not clear cut, and most put the onus on the process designers and facilitators to ensure transparency and reflective practice stances (Sirajuddin and Grudens-Schuck 2016). This suggests that the designers should make their underlying assumptions and objectives explicit so that participants can question them and reject or accept them as being legitimate for the process and participants. At the same time, it is important to use tools, such as Gaventa’s (2006) power cube, to work with participants so different stakeholder groups begin to think about power imbalances in discourses, and how they might reposition themselves (Bradley 2017).
The overall design of any process should consider the importance of building in multiple levels of participation and engagement (Brackertz and Meredyth 2009, Vines et al. 2013) to encourage the involvement of stakeholders and interested and affected parties, and wider community buy-in. These different levels of engagement require clarity on where parties fit into the process and into democratic practice, taking account of the different phases of the participatory process, and ensuring that the process aligns with the socio-political and cultural context and with appropriate group development and decision-making stages over its course (Schauppenlehner-Kloyber and Penker 2015).
The criteria mix of Figure 2 is a tableau for ongoing planning, monitoring, and evaluation. Knowing whether a group’s collective efforts to manage each of these criteria are succeeding, and how and when they need adapting if they are not, is a central tenet of iterative participatory process design (Bryson et al. 2013). Understanding the different performance measures that might be associated with each criterion is therefore important; and linked instructive and evaluative tools, such as rubrics, offer mechanisms to take action and improve the working relationships that are central to efficient and effective collaboration (Schulz et al. 2003). It is also important that evaluations take account of the different cultures and knowledge systems that may be present, and tailor evaluation approaches appropriately. For example, Kaupapa Māori theory, i.e., carrying things out properly from a Māori standpoint, has provided a theoretically sound platform from which unique evaluation theory and practices have been developing in Aotearoa New Zealand (Kerr 2012). The multistakeholder process evaluation literature agrees that performance evaluation and adaptation are key to a productive and continuing initiative (Piltan and Sowlati 2016), and ongoing learning in this way contributes to collaborative success (Gray and Stites 2013, Bryson et al. 2015).
Step three of developing the rubric was to populate each of the “performance criteria” with exemplary details of what a well thought out, inclusive, negotiated participatory process might consider. Each of the criteria boxes is intended to provide indicative guidance on “what success might look like” for that key performance criterion. This can include consideration of both outputs (things completed) and processes (level of participation, appropriate behaviors, etc.). These examples are generated from both the team’s research into and experience of participatory processes, and the reviewed international literature. They are also grounded in the Aotearoa NZ setting that recognizes the Treaty of Waitangi legislated partnership model of cogovernance between iwi and the Crown.
The format for displaying the information described here is termed a single-point rubric (Fluckiger 2010, Gonzalez 2014), chosen because it provides constructive help in several ways. It does not try to cover all the aspects of an activity that could go well or poorly. Instead, it describes in different ways what success may look like for each criterion but does not place boundaries on the ways in which participants may demonstrate good performance in achieving such success. It gives guidance and then allows the actors involved to approach their task or behaviors in creative and unique ways, meaning that they build on lessons tailored for them and their situation.
The rubric has three main parts (see Fig. 3). Column One (Performance criteria) details suggested practices and behavior; Column Two (Assessment) allows groups to self-assess their progress and practices; and Column Three (Evidence of performance) provides space to explain why the score was given, thus encouraging those involved to discuss and acknowledge progress, identify any gaps, and consider how to address them.
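To make this three-column, single-point layout tangible, the sketch below (again in Python, and again only an assumption-laden illustration rather than the published rubric) records one self-assessed row with the three parts described above; the example criterion, score, and evidence text are invented for illustration and are not drawn from the case studies.

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class RubricRow:
    performance_criterion: str  # Column One: suggested practices and behavior
    assessment: str             # Column Two: the group's self-assessed progress
    evidence: str               # Column Three: why the score was given, gaps, next steps


def render(rows: list[RubricRow]) -> str:
    """Lay out rows as a simple three-column text table, e.g., for a workshop handout."""
    header = f"{'Performance criteria':<55} | {'Assessment':<10} | Evidence of performance"
    lines = [header, "-" * len(header)]
    for row in rows:
        lines.append(f"{row.performance_criterion:<55} | {row.assessment:<10} | {row.evidence}")
    return "\n".join(lines)


# A single hypothetical, partly completed row for illustration.
rows = [
    RubricRow(
        performance_criterion="Shared goals translated into shorter term action plans",
        assessment="good",
        evidence="Vision agreed by all parties; action plan drafted but milestones not yet set",
    ),
]
print(render(rows))
```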
We ask, “what does success look like?” and answer with a sketched-out indicative rubric available for others to copy, adapt, and use. To be clear, problematizing the notion of success is the work of the paper and the rubric, not of the earlier research. We believe that the discussion around what that success might actually look like in specific places and contexts is a key contribution to the process side of participation. There is very little scholarly research on evaluative or instructional frameworks designed through a bottom-up approach to assess and adapt ongoing marine participatory processes.
Below are selected quotes from the research that illustrate elements of the criteria in action in the situated Aotearoa context. The quotes speak to the narratives of achievement that the rubric seeks to engage with, that is, what success might look like. These quotes give depth and authenticity to the contingent nature of how each dimension might be experienced. Although other contexts and countries will have different drivers of and experiences of participatory processes, the quotes tie each criterion back into the context we are writing from and show how they are shaped by our Aotearoa NZ experiences of participation.
Getting to shared values in a diverse group can be difficult. One respondent half-jokingly remarked that you know you are approaching shared values when “the shouting stops”. This, however, illustrates the work that is needed to stay around the table and in communication to achieve shared goals and visions.
In another scenario, the initial vision was altered to reflect a Māori worldview, and over time this became a vision that all were proud of:
I remember my mum saying the vision they had all worked hard on and agreed on was “not Māori enough,” she took it away and “Māori-ed it up,” the others in the team didn’t argue with the result, and the Māori team members were pleased. Now [years later] I’m pleased that the words and meanings that my mum included in the vision have come to be really understood by Te Korowai members and used, for example, kaitiaki.
The case studies illustrated a number of ways to take into account history, connections, and context. One respondent reminded us that the history of an issue always starts earlier than the moment at hand: “Awaroa was taken as part of Wakefield agreement in 1840,” and that acknowledgement of Māori history, grievance, and rights in the area was a key success in this project: “There was always notification to whānau, hapū, iwi, right up front, before the media.”
In a different initiative, a key player reflected the success of their initiative revolved around the approach taken:
Change makes people anxious. The collaborative model is a more comfortable way of dealing with change ... People don’t like change and they don’t like fast change. They can cope with change if you give them a chance to come along for the ride.
Participants themselves reflected on the difficulties of inclusion and diversity in practice, and in doing so illustrated that the process of asking these questions sets up ways to answer them that are appropriate for the project in question. For example, one respondent said,
The government could look at our initiative and ask: did you have iwi? Commercial fishers? Etc? Yes, yes. We covered the bases to include. [He then reflected further on this:] Some sectors were left out... But they always will be...
In other initiatives respondents pondered the silences of the nonhuman actors in policy and plan making and attempted to address this by imagining themselves into a kaitiaki role. For example, “We thought of ourselves as being the voice of the gulf.”
Strong meeting protocols and culture are important components in participatory processes but will look different in different contexts. One respondent reflected on the benefits that following “marae protocols” gave their long-term initiative:
Being at the marae [community meeting place/tribal complex] enabled two important things ... [One] Manaakitanga [hospitality, kindness, generosity, support] demanded a respect and acceptance of guests you have invited in. This kept iwi restrained from going into battle. [Two] Most people were sufficiently unsettled and unfamiliar to keep on their best behavior. I always appreciated the intelligent discussions you can have and then go and have kai [together].
Navigating uncertainty, complexity, and contestation can be an eye-opening experience for those involved. This participant reminded us that participatory processes can have beneficial outcomes along the way:
Before I thought they [other people] were all rape and pillage, now I know we need to understand [each other] in order to make progress. Different people have different viewpoints and need to be included.
When attention is paid to process design, facilitation, and leadership, the participatory process can be remarkably effective. This respondent reflects how they took what they learned from one initiative and applied it elsewhere:
I appreciated seeing a collaborative model working in practice. ... I’d been there right from the beginning and seen it evolve over time. Now I’m working with a range of stakeholders to codesign projects, and a community of farmers. I’m using the same techniques: start with facilitator; tease out aspirations of individuals; give everyone an opportunity to speak what’s important to you, what do you want; find common ground, agreements, or disagreements; ask how can we build a set of projects around common ground; park more difficult things. Using all these techniques is very effective.
Acknowledging and respecting different values and types of knowledge requires us to reflect on the weight or respect given to them in practice. It is important to query which types of knowledge and values are dominant and learn how to create a more equal balance. Respondents in some initiatives critiqued the dominance and/or capture of science: “The solutions to these issues are not technical science, they are social solutions.”
It raises broader questions of New Zealand science culture’s [in]ability to resist interference of vested interests. Science providers are under pressure. They are no longer a source of neutral public good information. We weren’t able to solicit frank and fearless science advice.
Other initiatives dealt with the interface of mātauranga Māori and Western science, and how this works in practice:
One is quite heavy and weighted. The stress is getting the balance, so my work plan is building the kaitiakitanga ... There is a central difference in how we measure things.
Even when local, science, and indigenous knowledge systems are incorporated, it is not an easy road. It is however, one that many people are keen to be involved in. A lead koro (Māori elder or grandparent) reflected on the difficulties:
How do I balance the two cultures? The two cultures never meet but there are parallels. Everybody wants to see how mātauranga Māori fits in but there is a long way to go... Even Māori are wanting to restore their cultural practices. But the follow-through of how they work and educate is essential, including how people do things according to the seasons and planting by the moon. This is how we are trying to plan. We need to sit down together and draft a cultural aspect for our individual work plan to promote that to the council. Cogovernance: can we find a better way? Where everyone chooses their aspirations.
Recognizing the forces that lead to “messiness” (Law et al. 2014) involves placing power at the forefront of building a useful process. One participatory process was acknowledged by many as having created safe spaces. As one respondent put it, “[this] created an environment where it’s safe to speak. You’re not going to be criticized for your space, whether that’s being a commercial fisher or sand mining.”
Other initiatives had moments of significant tension around different issues. One participant noted that “what you can live with” in consensus decision making may be influenced by a reluctance to dig into hard issues: “‘What you can live with’ was not talking about the Treaty.”
What you can live with also brought up other issues of process, power, and politics. An iwi-initiated stoppage (The Pause) of one process initially caused concern from others, but was later agreed by many parties to have been beneficial:
What we didn’t want and what we could do was stand behind a plan that we didn’t agree with ... at the time of The Pause it was seen as the Mana Whenua [those who have territorial rights, power from the land, authority over land or territory, jurisdiction over land or territory] Roundtable overstepping, but later The Pause was well-received.
When asked “how do you engage differently and recalibrate dominant paradigms?” a koro elaborated:
After the hui [gathering or meeting] I wondered why I didn’t present it in Te Reo Māori [the Māori language], which would have been a challenge because councils like to see strategic outcomes they can manage; but because they hold the power, I was forced to give them the white man’s, the colonial, picture ... For my report on last year’s funding, perhaps I should go the cultural way and give a waiata [song], just do a mihi [speech of greeting, acknowledgement, or tribute].
To ensure community buy in, informants noted that it is important to bring the community along on the journey:
The beauty of the structure is that people can come and go, it has a fluid flow (the breadth of group grew, included many NGOs, individuals, diverse backgrounds).
Informants also emphasized that ensuring bidirectional information sharing is an important component of building community support, as this initiative found:
The field days empowered group learning: Other farmers told what had been their turning point, invited neighboring farmers to say what they were not doing well, and how to turn it around. It was community building and sharing opinions, networking and empowering each other; “This is really good stuff and the best way to help the Kaipara Harbour” ... XX was born from the iwi and the community: we report back to the community. For example, the marine report on the work done to achieve the vision, we had a day on the marae to discuss the vision, principles, and long-term objectives; we still hold quarterly hui to report to the community.
Further, as the rubric suggests, being able to see enthusiasm and passion in the group and wider community is a good indication that community support has been built: “They represent so many parts of the community. Going up against XX is picking a fight you’re probably going to lose.”
The rubric suggests that collective planning, milestones that achieve shared goals, regular checking in, and applying lessons learned in reflection and evaluation are key to developing success. As these three quotes indicate, the case study initiatives applied these principles in different but effective ways. The first highlights the effectiveness of record keeping to help those involved to respond and reflect on decisions made:
A strength of the process that’s often overlooked - all these minutes that capture the in-betweenness of the milestone moments, you can see that voices were included, it makes decisions transparent, you can see if things had a bearing on final decisions. That’s why decision-making documentation is so important - we can say “yes we heard you, but we have made a decision.”
Another initiative organized for an independent review to assess and report back to the initiative on their processes as they went, thus capturing weaknesses in time to adapt: “The Independent Review Panel was appointed by the Project Steering Group as guardians of process to ensure outcome.”
Finally, those involved in this initiative recognized the ongoing nature of participatory work, and that, as Figure 4 suggests, there needs to be a continual revisiting of the different phases involved: “XX is an ongoing process ... [the plan is] a piece of paper to work with, it needs an implementation process, a review process ...”
Often participation is treated as a limited set of events: a workshop, a seminar, or just one or two meetings. However, if a participatory process is to be more than consultation it must be treated as a process that takes some time, and it is often the beginning of a continuing engagement (Allen et al. 2002). Although successful approaches generally have been individually tailored to encourage stakeholders’ involvement in each situation, there are some common, albeit overlapping, phases that make these participatory approaches flow constructively (Fig. 4).
We have proposed an open-ended framing of participatory processes that recognizes their probable phasing. For example, the same questions will not be asked about politics and power at the beginning of a participatory process as at the end, and questions around goals and visions could be markedly different at the beginning and middle of a process than at the end. Further, the stakeholders involved in participatory processes frequently change over time. Figure 3 provides a range of criteria that can be used to problematize the Participatory Processes Rubric at different phases. These need to be reflected on, and different aspects planned for, at the beginning and end of each phase, asking critical questions such as “If it works, what might happen?” “What is needed to make these aspects happen?” and “How well did that go and why?”
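As a minimal sketch of how such phase-bound reflection might be organized in practice (our assumption, not part of the published rubric), the snippet below attaches the critical questions quoted above to the start and end of each phase. The phase names are placeholders, since the phases of Figure 4 are not enumerated in the text; a group would substitute the phases it identifies for itself.

```python
from __future__ import annotations

# Hypothetical phase names; replace with the phases a group actually identifies.
PHASES = ["scoping", "engagement", "implementation", "reflection and evaluation"]

# Critical questions from the paragraph above, split by when they are asked.
QUESTIONS_AT_START = [
    "If it works, what might happen?",
    "What is needed to make these aspects happen?",
]
QUESTIONS_AT_END = [
    "How well did that go and why?",
]


def phase_checklist(phase: str) -> dict[str, object]:
    """Return the reflective prompts a group might work through for one phase."""
    return {"phase": phase, "at_start": QUESTIONS_AT_START, "at_end": QUESTIONS_AT_END}


for phase in PHASES:
    checklist = phase_checklist(phase)
    print(f"{checklist['phase']}: start -> {checklist['at_start']}; end -> {checklist['at_end']}")
```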
In this paper we set out to achieve two main goals. Our multidisciplinary research team has been provided an opportunity to critically engage with the notion of success in participatory processes. We aimed to provide some guidance on “what success might look like” (building on our research looking at participation in multiuse, multiuser marine environments) and clearly articulate some examples of that success in practice. The findings advance knowledge about linking top-down and bottom-up perspectives of governance with particular reference to advancing EBM initiatives. In this regard we have inserted new elements of approach into socio-theoretic understandings of governing mentalities that are grounded in actual relational dynamics of agency.
We stress that our results are intended to be indicative, providing an initial guide for those involved in participatory processes to develop their own performance assessment rubric. In this way the indicative rubric provided in this paper can be used by others as both an instructional and a performance assessment tool. The intention is that this and similar rubrics can be used to provoke reflection and enable critical questions to be asked, based on a prerequisite set of key criteria that underpin successful processes. In this way we have facilitated the articulation of what is important and what processes need to be included, and offered guidance through the medium of a rubric for practitioners and for those who identify a need for, or are engaged in, a participatory initiative.
Our work contributes to developing collective forward-looking understandings and the recognition that participatory processes endure over time. The Ingredients Figure (Le Heron et al. 2019b) started collective conversations, and the participatory processes rubric outlined here delves more broadly and deeply into the basis of collective understandings, providing suggestions for structuring continuing discussions and initiatives. Participatory processes in different settings and on different issues can be utilized as comparative collective initiatives. By having different sites and case studies, the research team was able to look across and between them. If the knowledge ownership of participatory processes is taken seriously, then regardless of context, there is the opportunity in the collective framework of the rubric to establish some sense of why and how successful progressive steps are being assembled and institutionalized over time.
Through creating a rubric we aim to give some illustrative guidance. We asked a challenging question of participatory processes: “how do you know if you’ve collectively achieved anything meaningful, and what does it mean to invoke the word success?” But, and it is important to stress this, we have not attempted to provide an external or universal assessment. Rather, we offer an indicative example of an assessment approach that can be crafted to work with, alongside and for, different parties in varied settings. This “sketch” or “indicative rubric” came from the deeply embedded and embodied examples we have worked with. Although we have tested the rubric criteria we provide here through our research, workshopping, and engagement with the literature, the rubric itself remains an exploratory tool. The diversity of aims underlying participatory processes means that it is impossible to provide “one rubric to rule them all” (Bryson et al. 2015). Each needs to be embedded and situated in people, place, and context in order to be meaningful or helpful.
The facilitated discussion (the process) that goes into the development of any rubric needs to be seen by agencies and operation managers as being as important as the developed rubric itself (the product). In this regard, rubrics can be seen as both a process and a product (Allen et al. 2018). The participatory processes rubric we have developed in this paper should therefore be introduced to participatory process actors by those with the process skills to help multistakeholder groups to refine them for their own individual context, and to help the group utilize the tailored rubric to guide planning, implementation, and evaluation. Rubrics are most helpful when intimately understood and self-generated, not simply used to document process. Our research and accompanying discussions highlight that an indicative rubric is a meaningful contribution to reflexive participatory practice in Aotearoa NZ; a guide about how to ask questions, to reflect on tools and practices, to enable a deeper level of reflection and critical questioning to be present in participatory processes.
Some of the research team were initially concerned that we may have created an unnecessarily complex depiction of the lived circumstances of the participatory processes examined, but the engaged responses we received to the original Ingredients Figure (Le Heron et al. 2019b) reassured us that confronting complexity was essential. We recognize, however, that some practitioners may not see these ingredients particularly clearly in relation to themselves and the case study initiatives, despite the findings being drawn from research on their experiences. Our rubric was born from the desire to better involve an array of stakeholders in complex initiatives in a way that helps each actor realize they are part of a bigger shared picture. It is this process of engagement that gives the research team the confidence and skills to be comfortable with the rubric as a guide and assessment for current and future participatory activities. The greatest asset of the rubric is that it is not a universal assessment that is externally imposed; rather, it is a tool that can be creatively formed by those involved to negotiate, navigate, and facilitate discussion about grounded experiences that can be labeled outcomes of successful participation and organizational initiatives.
In writing this paper, we discovered a tension between what an indicative participatory processes rubric would look like if generated in practice and what work we would like it to do in a journal paper. There are different expectations in these two contexts, and yet we wish the rubric to be able to navigate both spaces. Using a rubric in practice requires adaptation, expansion, and discussion; and most fundamentally, active and meaningful involvement of participants. Writing about the development of a rubric, and its use as an indicative framing to enable critical questions to be asked, requires thinking on how the rubric was and could be developed, and how best to communicate the main points of tension, politics, and application. This tension was present throughout our development of both the paper and the rubric, and this was the cause of much debate and reflection. We bring this to the reader’s attention because the complexity of participatory process design and evaluation is belied by the apparent simplicity of indicative diagrams. These things are not easy to reconcile! But debating each point brought greater exploration and deeper understanding for the research team, and this is the mind-set with which we would encourage readers to approach the indicative rubric.
The tension between formative evaluation (which prioritizes ongoing evaluation as a learning process) and summative evaluation (external, one time) is power-laden. We have chosen to develop a formative, self-defining, process-oriented example because we wish to stress that the coproductive, ongoing, and self-described nature of such an evaluation is far more empowering. When success is defined by those to whom it matters and is applied, it becomes a useful metaphor rather than a constraining one. In this way the use of a rubric such as this changes the nature of governance in these participatory marine initiatives.
Related to these reflections, the subtle work of the business-as-usual metaphor of success came to our attention. Success is mostly accepted uncritically and too often as some obvious state of affairs. Yet, as Foucault’s (1975) scholarship suggests, success has to be seen as a new kind of organizing device in itself. The discourse of success is often narrow, with attempts to widen how success might be framed and named resisted by those in power. It is a governing mentality that favors the status quo: you must show us how you will convince us of your success. With this paper we contribute to locating success narratives in their possible institutional and other settings. Thinking about success for whom, by what means, in what context, and defined by whom, takes the paper deeper into democratic spaces and away from business-as-usual mainstream hierarchical politics that characterize much natural resource management.
Any discussion on negotiating and navigating and asking how we know runs counter to the forces of top-down socio-technical governance; the rubric is a democratic and collectivist move. We sought to problematize and bring into the open the notion of success by making the narratives of achievement visible, through which new political and analytical spaces become potentially possible and wider agendas are made available in new ways. The strength that a multidisciplinary team brought to developing the rubric made this distinction clear. The rubric is meant to be played with, stretched, and transformed, so that those using it can find ways of articulating the version of success that means the most to their communities. In this way the politics of the metaphor of success is more likely to be recognized and negotiated, empowering those involved to define their own success.
Although much scholarship on participatory processes exists, and the criteria identified in the rubric are not new per se, the paper has brought them together in a novel way by thinking holistically about participatory processes as governance frameworks supportive of EBM. The elements themselves are well known in the literature and people will be familiar with many of them already; however, they are often treated in disparate, unconnected, or partial ways. A strength of the paper is that the rubric brings them together in an accessible fashion.
Our work on the democratic and participative component of rubrics in participatory processes links with other work on ethics and justice. For example, Bennett and colleagues’ (2019) discussion of just sustainability recognizes that justice has recognitional, procedural, and distributional elements. The rubric developed here touches on all three: success is multifaceted, different for different agents, and about both process and empowered decision making. There is much potential for future engagement between the rubric, practitioner experiences, and theoretical developments. It is our hope that the rubric continues to spark conversations, structure discussions, and empower those involved in multiple and diverse contexts.
__________
[1] Although Aotearoa is a Māori name for New Zealand’s North Island, to reflect the nation’s bicultural foundation it is commonly used in this context, e.g., Aotearoa New Zealand, to mean all of New Zealand.
[2] There is growing conflict between the many uses of Aotearoa NZ’s marine environment, including its important marine economy (fisheries, aquaculture, tourism, oil and gas, minerals, renewable energy, shipping), and the protection of that environment. The fourth-largest exclusive economic zone in the world and 20 times larger than the landmass, Aotearoa NZ’s marine estate is an important part of the country’s culture, economy, lifestyle, and spiritual well-being.
[3] Sustainable Seas Ko ngā moana whakauka National Science Challenge in Aotearoa New Zealand 2014–2019.
[4] This genre of research, drawing especially on Foucault, includes foundational work on governmentality (Dean 2010, Larner and Le Heron 2002a), investigations of standards, benchmarking, and audit regimes in the economy (Power 1996, Campbell and Le Heron 2007), and the extension of such ideas into social organizations such as universities (Shore 2008, Larner and Le Heron 2002b). Although these advances confront top-down governance formations and problematize the nature of performance and success, they offer little to partnership research in participatory processes, which is the concern of this paper.
[5] The “ingredients” in the Ingredients Figure are the same as the “criteria” in the Participatory Processes Rubric. The different terms reflect the nature of the Figure versus the Rubric. The Figure provides initial conversation prompts across a range of elements, with the advice “it’s not a recipe but there are ingredients” (Le Heron et al. 2019b). The Rubric uses “criteria” because this is the standard terminology for rubrics. In this context, the criteria detail what each ingredient done well might look like, i.e., what success for that criterion/ingredient might be.
[6] Blackett, P., E. Le Heron, R. Le Heron, J. Logie, C. Lundquist, and team. 28 November 2018. Webinar: Participatory processes for Sustainable Seas: a review of initiatives in New Zealand’s ocean domain. Invited presentation to Regional Councils, hosted by Waikato Regional Council; Blackett, P. 2018. Plenary: Our science in an increasingly complex world. NZ Marine Sciences Society Conference, Napier, 3-6 July 2018; Le Heron, R., and team. 2018. Transformational participatory processes in multi-use/r marine spaces: is Aotearoa New Zealand (ANZ) leading the world? NZ Marine Sciences Society Conference, Napier, 3-6 July 2018; Hikuroa, D. 2018. Transformative co-leadership through participatory processes using ki uta ki tai and mountains to seas knowledge. NZ Geographical Society Conference, Auckland, 11-14 July 2018; Le Heron, R., and team. 2018. Navigating negotiated change through participatory processes in multi-use/r marine spaces. Sustainable Seas Annual Conference, 6-8 November 2018.
[7] Kaitiakitanga is adaptive and collective decision making and action, tailored to local conditions, that realizes the principles of reciprocity and intergenerational sustainability through the practices undertaken, drawing from mātauranga Māori, within a Māori worldview.
[8] Mātauranga = knowledge, wisdom, understanding, skill (https://maoridictionary.co.nz/). Mātauranga Māori = Māori knowledge, wisdom, etc.
AUTHOR CONTRIBUTIONS
ELH and WA jointly led the writing of the manuscript. PB and RLH initiated and led the wider research program that this research contributes to. ELH led the case study interviews and was assisted by MJL, AG, WA, KD, BG, and DH. WA provided the rubrics methodology and facilitation. All authors contributed to the final version of the manuscript.
ACKNOWLEDGMENTS
The authors would like to acknowledge the generous contributions of time, insights, and reflection by the interviewees and participants in this research, without whom this work would not have been possible. Two anonymous reviewers provided helpful comments on earlier drafts of the manuscript. This research was funded by the Sustainable Seas National Science Challenge as part of Ministry of Business, Innovation and Employment contract C01X1515.
DATA AVAILABILITY
The raw data (transcripts) developed through in-depth interviewing that contribute to the findings of this study are held under ethical protocol No. 0004-2017-NIWA. None of the data are publicly available because they contain information that could compromise the privacy of research participants. Ethical approval for this research study was granted by the National Institute of Water & Atmospheric Research (NIWA), New Zealand, for the work done under the “Testing EBM-supportive participatory processes for application in multi-use marine environments” project, which was itself part of the Sustainable Seas/Ko ngā moana whakauka New Zealand National Science Challenge. For more information on the ethical protocol and the storage/release of project data contact the project manager: Paula Blackett, NIWA - Paula.Blackett@niwa.co.nz.
Allen, S., and J. Knight. 2009. A method for collaboratively developing and validating a rubric. International Journal for the Scholarship of Teaching and Learning 3(2):10. https://doi.org/10.20429/ijsotl.2009.030210
Allen, W., A. Fenemor, M. Kilvington, G. Harmsworth, R. G. Young, N. Deans, C. Horn, C. Phillips, O. Montes de Oca, J. Ataria, and R. Smith. 2011. Building collaboration and learning in integrated catchment management: the importance of social process and multiple engagement approaches. New Zealand Journal of Marine and Freshwater Research 45(3):525-539. https://doi.org/10.1080/00288330.2011.592197
Allen, W., A. Grant, L. Earl, R. MacLellan, N. Waipara, M. Mark-Shadbolt, S. Ogilvie, E. R. (L.) Langer, and M. Marzano. 2018. The use of rubrics to improve integration and engagement between biosecurity agencies and their key partners and stakeholders: a surveillance example. Pages 269-298 in J. Urquhart, M. Marzano, and C. Potter, editors. The human dimensions of forest and tree health: global perspectives. Palgrave Macmillan, Cham, Switzerland. https://doi.org/10.1007/978-3-319-76956-1_11
Allen, W., M. Kilvington, and C. Horn. 2002. Using participatory and learning-based approaches for environmental management to help achieve constructive behaviour change. Prepared for Ministry for the Environment, Landcare Research Contract Report LC0102/057, Lincoln, New Zealand.
Allen, W., S. Ogilvie, H. Blackie, D. Smith, S. Sam, J. Doherty, D. McKenzie, J. Ataria, L. Shapiro, J. McKay, E. Murphy, C. Jacobson, and C. Eason. 2014. Bridging disciplines, knowledge systems and cultures in pest management. Environmental Management 53:429-440. https://doi.org/10.1007/s00267-013-0180-z
American Education Research Association (AERA). [date unknown]. What is action research? AERA, Washington, D.C., USA. [online] URL: https://sites.google.com/site/aeraarsig/Home/what-is-action-research
Andrade, H. G. 2000. Using rubrics to promote thinking and learning. Educational Leadership 57(5):13-19.
Baker, S., and F. S. Chapin III. 2018. Going beyond “it depends:” the role of context in shaping participation in natural resource management. Ecology and Society 23(1):20. https://doi.org/10.5751/es-09868-230120
Barnaud, C., and A. Van Paassen. 2013. Equity, power games, and legitimacy: dilemmas of participatory natural resource management. Ecology and Society 18(2):21. https://doi.org/10.5751/ES-05459-180221
Bautista, S., J. Llovet, A. Ocampo-Melgar, A. Vilagrosa, Á. G. Mayor, C. Murias, V. R. Vallejo, and B. J. Orr. 2017. Integrating knowledge exchange and the assessment of dryland management alternatives—a learning-centered participatory approach. Journal of Environmental Management 195:35-45. https://doi.org/10.1016/j.jenvman.2016.11.050
Benham, C. F. 2017. Aligning public participation with local environmental knowledge in complex marine social-ecological systems. Marine Policy 82:16-24. https://doi.org/10.1016/j.marpol.2017.04.003
Bennett, N. J., J. Blythe, A. M. Cisneros-Montemayor, G. G. Singh, and U. R. Sumaila. 2019. Just transformations to sustainability. Sustainability 11(14):3881. https://doi.org/10.3390/su11143881
Bixler, R. P., J. Dell'Angelo, O. Mfune, and H. Roba. 2015. The political ecology of participatory conservation: institutions and discourse. Journal of Political Ecology 22(1):164-182. https://doi.org/10.2458/v22i1.21083
Bobbio, L. 2019. Designing effective public participation. Policy and Society 38(1):41-57. https://doi.org/10.1080/14494035.2018.1511193
Bradley, H. 2017. Community development and co-production: thinking critically about parameters and power. Concept 8(3):10.
Brackertz, N., and D. Meredyth. 2009. Community consultation in Victorian local government: a case of mixing metaphors? Australian Journal of Public Administration 68(2):152-166. https://doi.org/10.1111/j.1467-8500.2009.00627.x
Bremer, S., and B. Glavovic. 2013. Exploring the science-policy interface for integrated coastal management in New Zealand. Ocean and Coastal Management 84:107-118. https://doi.org/10.1016/j.ocecoaman.2013.08.008
Brown, J., and J. Dillard. 2015. Dialogic accountings for stakeholders: on opening up and closing down participatory governance. Journal of Management Studies 52(7):961-985. https://doi.org/10.1111/joms.12153
Bryson, J. M., B. C. Crosby, and M. M. Stone. 2015. Designing and implementing cross-sector collaborations: needed and challenging. Public Administration Review 75(5):647-663. https://doi.org/10.1111/puar.12432
Bryson, J. M., K. S. Quick, C. S. Slotterback, and B. C. Crosby. 2013. Designing public participation processes. Public Administration Review 73(1):23-34. https://doi.org/10.1111/j.1540-6210.2012.02678.x
Campbell, H., and R. Le Heron. 2007. Big supermarkets, big producers and audit technologies: the constitutive micro-politics of food legitimacy and food system governance. Pages 131-153 in D. Burch and G. Lawrence, editors. Supermarkets and agri-food supply chains. Edward Elgar, Cheltenham, UK.
Chiu, L. F. 2006. Critical reflection: more than nuts and bolts. Action Research 4(2):183-203. https://doi.org/10.1177/1476750306063991
Davies, K., A. A. Murchie, V. Kerr, and C. Lundquist. 2018. The evolution of marine protected area planning in Aotearoa New Zealand: reflections on participation and process. Marine Policy 93:113-127. https://doi.org/10.1016/j.marpol.2018.03.025
De Vente, J., M. S. Reed, L. C. Stringer, S. Valente, and J. Newig. 2016. How does the context and design of participatory decision making processes affect their outcomes? Evidence from sustainable land management in global drylands. Ecology and Society 21(2):24. https://doi.org/10.5751/ES-08053-210224
Dean, M. 2010. Governmentality: power and rule in modern society. SAGE, Thousand Oaks, California, USA.
Eastwood, A., A. Fischer, and A. Byg. 2017. The challenges of participatory and systemic environmental management: from aspiration to implementation. Journal of Environmental Planning and Management 60(9):1683-1701. https://doi.org/10.1080/09640568.2016.1249787
Felipe-Lucia, M. R., B. Martín-López, S. Lavorel, L. Berraquero-Díaz, J. Escalera-Reyes, and F. A. Comín. 2015. Ecosystem services flows: why stakeholders’ power relationships matter. PLoS ONE 10(7):e0132232. https://doi.org/10.1371/journal.pone.0132232
Fluckiger, J. 2010. Single point rubric: a tool for responsible student self-assessment. Delta Kappa Gamma Bulletin 76(4):18.
Foley, P., E. Pinkerton, M. G. Wiber, and R. L. Stephenson. 2020. Full-spectrum sustainability: an alternative to fisheries management panaceas. Ecology and Society 25(2):1. https://doi.org/10.5751/es-11509-250201
Foucault, M. 1975. Surveiller et punir. Gallimard, Paris, France. [Translated as Discipline and punish, Alan Sheridan (translator). 1977. Pantheon, New York, New York, USA.] https://doi.org/10.14375/NP.9782070729685
Fraser, C., A. Fenemor, J. Turner, and W. Allen. 2014. The wheel of water research programme: designing collaborative catchment decision-making processes using a water wheel—reflections from two case studies. Prepared for Ministry of Business, Innovation and Employment, No. C1205601. Aqualinc Research Limited, Christchurch, New Zealand.
Fung, A. 2006. Varieties of participation in complex governance. Public Administration Review 66:66-75. https://doi.org/10.1111/j.1540-6210.2006.00667.x
Gaventa, J. 2006. Finding the spaces for change: a power analysis. IDS Bulletin 37(6):23-33. https://doi.org/10.1111/j.1759-5436.2006.tb00320.x
Gelcich, S., F. Reyes-Mendy, R. Arriagada, and B. Castillo. 2018. Assessing the implementation of marine ecosystem based management into national policies: insights from agenda setting and policy responses. Marine Policy 92:40-47. https://doi.org/10.1016/j.marpol.2018.01.017
Glicken, J. 2000. Getting stakeholder participation ‘right’: a discussion of participatory processes and possible pitfalls. Environmental Science and Policy 3(6):305-310. https://doi.org/10.1016/s1462-9011(00)00105-2
Gluckman, P. 2018. The role of evidence and expertise in policy-making: the politics and practice of science advice. Journal and Proceedings of the Royal Society of New South Wales 151(467/468):91-101.
Gonzalez, J. 2014. Know your terms: holistic, analytic, and single-point rubrics. Cult of Pedagogy, 1 May. Web site. [online] URL: https://www.cultofpedagogy.com/holistic-analytic-single-point-rubrics
Gray, B., and J. P. Stites. 2013. Sustainability through partnerships: capitalizing on collaboration. Network for Business Sustainability, Ivey Business School, London, Ontario, Canada. [online] URL: http://www.wageningenportals.nl/sites/default/files/resource/nbs-systematic-review-partnerships.pdf
Hewitt, J., L. Faulkner, A. Greenaway, and C. Lundquist. 2018. Proposed ecosystem-based management principles for New Zealand. Resource Management Journal 10-13.
Hikuroa, D., R. Le Heron, E. Le Heron, and Participatory Processes Research Team. 2021. Re-commoning in the spirit of Ki Uta Ki Tai (Mountains to the sea): towards generative economic-environment transitionings. In R. H. M. Prince, A. Gallagher, C. Morris, and S. Fitzherbert, editors. Markets in their place. Routledge, London, UK, in press.
Huang, H. B. 2010. What is good action research? Why the resurgence? Action Research 8:93-109.
Israel, B. A., K. M. Cummings, M. B. Dignan, C. A. Heaney, D. P. Perales, B. G. Simons-Morton, and M. A. Zimmerman. 1995. Evaluation of health education programs: current assessment and future directions. Health Education & Behavior 22(3):364-389. https://doi.org/10.1177/109019819402200308
Johnson, N., N. Lilja, J. A. Ashby, and J. A. Garcia. 2004. The practice of participatory research and gender analysis in natural resource management. Natural Resources Forum 28(3):189-200. https://doi.org/10.1111/j.1477-8947.2004.00088.x
Kallis, G., M. Kiparsky, and R. Norgaard. 2009. Collaborative governance and adaptive management: lessons from California’s CALFED Water Program. Environmental Science and Policy 12(6):631-643. https://doi.org/10.1016/j.envsci.2009.07.002
Kaufman, S., C. P. Ozawa, and D. F. Shmueli. 2014. Evaluating participatory decision processes: Which methods inform reflective practice? Evaluation and Program Planning 42:11-20. https://doi.org/10.1016/j.evalprogplan.2013.08.002
Kemmis, S. 2009. Action research as practice-based practice. Educational Action Research 17(3):463-474. https://doi.org/10.1080/09650790903093284
Kerr, S. 2012. Kaupapa Māori theory-based evaluation. Evaluation Journal of Australasia 12(1):6-18. https://doi.org/10.1177/1035719x1201200102
Kettle, N. P., K. Dow, S. Tuler, T. Webler, J. Whitehead, and K. M. Miller. 2014. Integrating scientific and local knowledge to inform risk‐based management approaches for climate adaptation. Climate Risk Management 4-5:17-31. https://doi.org/10.1016/j.crm.2014.07.001
Kothari, A., D. Rudman, M. Dobbins, M. Rouse, S. Sibbald, and N. Edwards. 2012. The use of tacit and explicit knowledge in public health: a qualitative study. Implementation Science 7:20. https://doi.org/10.1186/1748-5908-7-20
Kusters, K., L. Buck, M. de Graaf, P. Minang, C. van Oosten, and R. Zagt. 2018. Participatory planning, monitoring and evaluation of multi-stakeholder platforms in integrated landscape initiatives. Environmental Management 62:170-181. https://doi.org/10.1007/s00267-017-0847-y
Larner, W., and R. Le Heron. 2002a. From economic globalisation to globalising economic processes: Towards post-structural political economies. Geoforum 33(4):415-419. https://doi.org/10.1016/s0016-7185(02)00044-1
Larner, W., and R. Le Heron. 2002b. The spaces and subjects of a globalising economy: a situated exploration of method. Environment and Planning D: Society and Space 20(6):753-774. https://doi.org/10.1068/d284t
Law, J., G. Afdal, K. Asdal, W. Y. Lin, I. Moser, and V. Singleton. 2014. Modes of syncretism: notes on noncoherence. Common Knowledge 20(1):172-192. https://doi.org/10.1215/0961754X-2374817
Le Heron, E., R. Le Heron, P. Blackett, K. Davies, J. Logie, W. Allen, A. Greenaway, and B. Glavovic. 2019b. It’s not a recipe... but there are ingredients: navigating negotiated change through participatory processes in multi-use/r marine spaces. Planning Quarterly 213:32-37.
Le Heron, E., R. Le Heron, and N. Lewis. 2011. Performing research capability building in New Zealand’s social sciences: capacity-capability insights from exploring the work of BRCSS’s ‘sustainability’ theme, 2004-2009. Environment and Planning A: Economy and Space 43(6):1400-1420. https://doi.org/10.1068/a43303
Le Heron, E., R. Le Heron, J. Logie, A. Greenaway, W. Allen, P. Blackett, K. Davies, B. Glavovic, and D. Hikuroa. 2020b. Participatory processes as 21st century social knowledge technology: metaphors and narratives at work. Chapter 11 in E. Probyn, K. Johnston, and N. Lee, editors. Sustaining the seas: oceanic space and the politics of care. Rowman and Littlefield, Lanham, Maryland, USA.
Le Heron, E., R. Le Heron, L. Taylor, C. Lundquist, and A. Greenaway. 2020a. Remaking ocean governance in Aotearoa New Zealand through boundary-crossing narratives about ecosystem-based management. Marine Policy 122:104222. https://doi.org/10.1016/j.marpol.2020.104222
Le Heron, E., J. Logie, W. Allen, R. Le Heron, P. Blackett, K. Davies, A. Greenaway, B. Glavovic, and D. Hikuroa. 2019a. Diversity, contestation, participation in Aotearoa New Zealand’s multi-use/user marine spaces. Marine Policy 106:103536. https://doi.org/10.1016/j.marpol.2019.103536
Long, R. D., A. Charles, and R. L. Stephenson. 2015. Key principles of marine ecosystem-based management. Marine Policy 57:53-60. https://doi.org/10.1016/j.marpol.2015.01.013
López-Bao, J. V., G. Chapron, and A. Treves. 2017. The Achilles heel of participatory conservation. Biological Conservation 212:139-143. https://doi.org/10.1016/j.biocon.2017.06.007
Lynam, T., W. De Jong, D. Sheil, T. Kusumanto, and K. Evans. 2007. A review of tools for incorporating community knowledge, preferences, and values into decision making in natural resources management. Ecology and Society 12(1):5. https://doi.org/10.5751/ES-01987-120105
Mantyka-Pringle, C. S., T. D. Jardine, L. Bradford, L. Bharadwaj, A. P. Kythreotis, J. Fresque-Baxter, E. Kelly, G. Somers, L. E. Doig, P. D. Jones, K. E. Lindenschmidt, and the Slave River and Delta Partnership. 2017. Bridging science and traditional knowledge to assess cumulative impacts of stressors on ecosystem health. Environment International 102:125-137. https://doi.org/10.1016/j.envint.2017.02.008
McGuire, M. 2006. Collaborative public management: assessing what we know and how we know it. Public Administration Review 66:33-43. https://doi.org/10.1111/j.1540-6210.2006.00664.x
Newig, J., D. Schulz, and N. W. Jager. 2016. Disentangling puzzles of spatial scales and participation in environmental governance—the case of governance re-scaling through the European Water Framework Directive. Environmental Management 58(6):998-1014. https://doi.org/10.1007/s00267-016-0753-8
O'Donnell, E. L., and J. Talbot-Jones. 2018. Creating legal rights for rivers: lessons from Australia, New Zealand, and India. Ecology and Society 23(1):7. https://doi.org/10.5751/ES-09854-230107
Oliver, P. 2002. Natural resource and environmental management partnerships: panacea, placebo or palliative. Pages 333-336 in National Coastal Management Coast to Coast Conference. Tweed Heads, Australia. [online] URL: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.554.4008&rep=rep1&type=pdf
Parlee, C. E., and M. G. Wiber. 2014. Institutional innovation in fisheries governance: adaptive co-management in situations of legal pluralism. Current Opinion in Environmental Sustainability 11:48-54. https://doi.org/10.1016/j.cosust.2014.09.012
Parlee, C. E., and M. G. Wiber. 2018. Using conflict over risk management in the marine environment to strengthen measures of governance. Ecology and Society 23(4):5. https://doi.org/10.5751/ES-10334-230405
Piltan, M., and T. Sowlati. 2016. A multi-criteria decision support model for evaluating the performance of partnerships. Expert Systems with Applications 45:373-384. https://doi.org/10.1016/j.eswa.2015.10.002
Plummer, R., and D. Armitage. 2007. A resilience-based framework for evaluating adaptive co-management: linking ecology, economics and society in a complex world. Ecological Economics 61(1):62-74. https://doi.org/10.1016/j.ecolecon.2006.09.025
Podestá, G. P., C. E. Natenzon, C. Hidalgo, and F. R. Toranzo. 2013. Interdisciplinary production of knowledge with participation of stakeholders: a case study of a collaborative project on climate variability, human decisions and agricultural ecosystems in the Argentine Pampas. Environmental Science and Policy 26:40-48. https://doi.org/10.1016/j.envsci.2012.07.008
Pomeroy, R., and F. Douvere. 2008. The engagement of stakeholders in the marine spatial planning process. Marine Policy 32(5):816-822. https://doi.org/10.1016/j.marpol.2008.03.017
Power, M. 1996. Making things auditable. Accounting, Organizations and Society 21(2/3):289-315. https://doi.org/10.1016/0361-3682(95)00004-6
Rauschmayer, F., S. van den Hove, and T. Koetz. 2009. Participation in EU biodiversity governance: how far beyond rhetoric? Environment and Planning C: Government and Policy 27(1):42-58. https://doi.org/10.1068/c0703j
Reddy, Y. M., and H. Andrade. 2010. A review of rubric use in higher education. Assessment and Evaluation in Higher Education 35(4):435-448. https://doi.org/10.1080/02602930902862859
Reed, M. S. 2008. Stakeholder participation for environmental management: a literature review. Biological Conservation 141(10):2417-2431. https://doi.org/10.1016/j.biocon.2008.07.014
Reed, M. S., S. Vella, E. Challies, J. de Vente, L. Frewer, D. Hohenwallner-Ries, T. Huber, R. K. Neumann, E. A. Oughton, J. Sidoli del Ceno, and H. van Delden. 2018. A theory of participation: what makes stakeholder and public engagement in environmental management work? Restoration Ecology 26:S7-S17. https://doi.org/10.1111/rec.12541
Rodhouse, T., and F. Vanclay. 2016. Is free, prior and informed consent a form of corporate social responsibility? Journal of Cleaner Production 131:785-794. https://doi.org/10.1016/j.jclepro.2016.04.075
Ruckstuhl, K., M. Thompson-Fawcett, and H. Rae. 2014. Māori and mining: indigenous perspectives on reconceptualising and contextualising the social licence to operate. Impact Assessment and Project Appraisal 32(4):304-314. https://doi.org/10.1080/14615517.2014.929782
San Cristóbal Mateo, J. R., E. Díaz Ruiz de Navamuel, and M. A. González Villa. 2017. Are project managers ready for the 21th challenges? A review of problem structuring methods for decision support. International Journal of Information Systems and Project Management 5(2):43-56.
Schauppenlehner-Kloyber, E., and M. Penker. 2015. Managing group processes in transdisciplinary future studies: how to facilitate social learning and capacity building for self-organised action towards sustainable urban development? Futures 65:57-71. https://doi.org/10.1016/j.futures.2014.08.012
Schulz, A. J., B. A. Israel, and P. Lantz. 2003. Instrument for evaluating dimensions of group dynamics within community-based participatory research partnerships. Evaluation and Program Planning 26(3):249-262. https://doi.org/10.1016/S0149-7189(03)00029-6
Shore, C. 2008. Audit culture and illiberal governance: universities and the politics of accountability. Anthropological Theory 8(3):278-298. https://doi.org/10.1177/1463499608093815
Sirajuddin, Z., and N. Grudens-Schuck. 2016. Bridging power asymmetries in facilitating public participation. In J. Goodwin, editor. Confronting the challenges of public participation in environmental, planning, and health decision-making. Iowa State University Summer Symposium on Science Communication, Ames, Iowa, USA. https://doi.org/10.31274/sciencecommunication-180809-16
Stirling, A. 2008. “Opening up” and “closing down” power, participation, and pluralism in the social appraisal of technology. Science, Technology, and Human Values 33(2):262-294. https://doi.org/10.1177/0162243907311265
Stringer, L. C., A. J. Dougill, E. Fraser, K. Hubacek, C. Prell, and M. S. Reed. 2006. Unpacking “participation” in the adaptive management of social-ecological systems: a critical review. Ecology and Society 11(2):39. https://doi.org/10.5751/es-01896-110239
Te Tari Taiwhenua Department of Internal Affairs. [date unknown]. Good practice participate. Te Tari Taiwhenua Department of Internal Affairs, Wellington, New Zealand. [online] URL: https://www.dia.govt.nz/Good-Practice-Participate
Tholke, M. 2003. Collaboration for a change: a practitioner’s guide to environmental nonprofit-industry partnerships. Erb Environmental Management Institute, Ann Arbor, Michigan, USA. [online] URL: http://users.homebase.dk/~nat/t10/afgang/SF/TholkePartnershipReport.pdf
Turner II, B. L., K. J. Esler, P. Bridgewater, J. Tewksbury, N. Sitas, B. Abrahams, F. S. Chapin III, R. R. Chowdhury, P. Christie, S. Diaz, et al. 2016. Socio-environmental systems (SES) research: what have we learned and how can we use this information in future research programs. Current Opinion in Environmental Sustainability 19:160-168. https://doi.org/10.1016/j.cosust.2016.04.001
United Nations. 2007. United Nations Declaration on the Rights of Indigenous Peoples. United Nations, New York, New York, USA. [online] URL: https://www.un.org/development/desa/indigenouspeoples/declaration-on-the-rights-of-indigenous-peoples.html
Van de Kerkhof, M., and A. Wieczorek. 2005. Learning and stakeholder participation in transition processes towards sustainability: methodological considerations. Technological Forecasting and Social Change 72(6):733-747. https://doi.org/10.1016/j.techfore.2004.10.002
Van Huijstee, M. M., M. Francken, and P. Leroy. 2007. Partnerships for sustainable development: a review of current literature. Environmental Sciences 4(2):75-89. https://doi.org/10.1080/15693430701526336
Vines, J., R. Clarke, P. Wright, J. McCarthy, and P. Olivier. 2013. Configuring participation: on how we involve people in design. Pages 429-438 in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, New York, USA. https://doi.org/10.1145/2470654.2470716
Von Korff, Y., P. d'Aquino, K. A. Daniell, and R. Bijlsma. 2010. Designing participation processes for water management and beyond. Ecology and Society 15(3):1. https://doi.org/10.5751/ES-03329-150301
Waitangi Tribunal. 1988. Report of The Waitangi Tribunal on the Muriwhenua Fishing Claim. Waitangi Tribunal, New Zealand.
Whitfield, S., and M. S. Reed. 2012. Participatory environmental assessment in drylands: introducing a new approach. Journal of Arid Environments 77:1-10. https://doi.org/10.1016/j.jaridenv.2011.09.015