
Case Study 2: City Center Hospital

Organizational Problem:

City Center Hospital is a large outpatient surgery center in a mid-size urban area of Kansas. The CEO, Director of Nursing (DON), and the HR manager are concerned about a morale issue among the organization's nursing staff. During the past year, the hospital has been trying to change its culture to become more profit-oriented and patient-centered.

Organizational Design:

The hospital's full-time staff covers its operational roles, including the CEO, DON, Human Resources Management (HRM), Finance, Risk Management, and unit managers for each section of the surgery center. Nursing staff within the center is outsourced, and nurses are hired on as conditional permanent staff. The doctors are independent contractors with the hospital on yearly contracts.

Challenge and Request Criteria:

The CEO, DON, and the HRM believe the morale of nurses is at an all-time low and that the relationship between doctors, unit managers, and nurses is damaging to the mission of the hospital. The CEO is concerned about losing physicians due to the lack of leadership skills displayed by the unit managers. The unit managers are concerned about the nurses' stress level and cite the arrogant behavior of the physicians as its cause.

The CEO, DON, and HRM request an OD intervention for all internal managers with the intent of developing a common focus of leading a united team. The leadership team requires that this be addressed in a one-day workshop and that a visible improvement be seen over the next 12 months.

As defined by the leadership team, the intervention is to develop stronger leadership skills among the unit managers and must result in visible improvement over the subsequent 12 months.

The paper must be four to five pages in length and include level headings to lead the reader through the specific areas of content. Level headings also provide the reader with an outline to ensure all areas of the project are covered. You may refer to the Introduction to APA webpage and read the information under “Level Headings” for assistance with level headings. The paper must include the course textbook as a reference, as well as four additional scholarly, peer-reviewed, and/or credible sources to support the content of the paper.

Action Research:
The Planning Phase
Learning Outcomes
After reading this chapter, you should be able to:
Describe action research and compare Lewin’s model with those of at least two other
OD theorists.
State the importance of considering multiple levels of analysis in the planning phase.
Identify the steps of the planning phase.
Describe different types of research.
Describe different types of research methodologies.
Explain how to prepare for and manage the feedback meeting, including how to address
confidentiality concerns and manage defensiveness and resistance.
Discuss five methods of gathering organization data, including strengths and weaknesses of each.
Discuss methods of analyzing the data collected.
© 2020 Zovio, Inc. All rights reserved. Not for resale or redistribution.

In Chapter 3, the QuickCo vignette provided one example of how OD consultants work. Jack,
the internal OD consultant at QuickCo, led his clients, Ned (the shipping supervisor) and Sarah
(the manufacturing manager), through an action research process to solve communication and
teamwork problems in the shipping department. Action research, the process OD consultants
follow to plan and implement change, follows three general phases:
1. Planning. Data is collected, analyzed, and shared with the client to determine
corrective action.
2. Doing. Action is taken to correct the problem.
3. Checking. The effectiveness of the intervention is evaluated, and the cycle is
repeated as needed.
Let us return to the QuickCo vignette and examine the action research steps taken. Ned and
Sarah met with Jack to outline how employees were at each other’s throats, letting conflicts
fester, and failing to work well together. Their first meeting incorporated their planning phase.
As explained in Chapter 3, this initial meeting is known as contracting. During the meeting,
Jack asked questions to begin identifying the root cause of the conflicted department. The three
struck a collaborative agreement and worked to devise a plan for resolving the issues.
The first action they took was to collect data. Jack reviewed the performance trends and customer complaints from the shipping department and interviewed the employees individually
about their views on the problems.
The planning also involved analyzing the data Jack collected to arrive at a diagnosis. When he
met with Ned and Sarah to share feedback from the data collection, Jack presented his analysis,
noting, “Ned and Sarah, you have a dysfunctional team on your hands. They have no ground
rules, collaboration, or means of handling conflict. Everyone needs to be more understanding
and respectful toward each other. It would also be helpful to create some guidelines for how the
team wants to operate and manage conflict. Ned, you also need to take a more active role in
resolving issues.”
Jack laid the problems out in a matter-of-fact, nonjudgmental way. Once all the analyzed data
was presented, the three worked jointly to plan an intervention to address the problems. They
agreed to take the group through a facilitated process to address communication and team
effectiveness. They also agreed that Ned would benefit from individualized executive coaching to
help him learn behaviors that would be more productive for dealing with conflict.
The second phase of action research, doing, occurred when Jack, Ned, and Sarah scheduled the
intervention with the shipping department and implemented it. The outcome of the intervention
was a tangible plan for the department for how to be more effective, including specific actions
they would take to address conflict.
The final phase, checking, involved Ned, Sarah, and Jack continuing to monitor the shipping
department after the intervention. Ned helped the department uphold its new ground rules on a
daily basis and coached employees to help them stick to the plan. He also asked for regular feedback on his own management skills as part of his ongoing coaching. Ned, Sarah, and Jack
reviewed departmental data on productivity and customer complaints and learned that the
timeliness and accuracy of shipped orders
had significantly improved. Jack followed up
a few months later by conducting individual
interviews with shipping department members. He discovered that the solutions had
been maintained. If and when new conflicts
arise, or new members join the team, it may
be time to start the action research process
over again to address new issues.
The QuickCo vignette demonstrates all three
phases of the action research process. This
chapter focuses on the first phase, planning. Chapters 5 and 6 provide a similarly
detailed look at the second and final phases,
doing and checking, respectively. But before
turning to the planning phase, let us review
action research.
Following the action research process helped
the QuickCo shipping department resolve
employees’ interpersonal conflicts.
4.1 A Review of Action Research
Chapter 1 defined OD as a process of planned change that is grounded in a humanistic, democratic ethic. This specific process of planned change is known as action research.
Defining Action Research
Action research is a recurring, collaborative effort between organization members and OD
consultants to use data to resolve problems. As such, it involves data collection, analysis,
intervention, and evaluation. Essentially, it is a repeating cycle of action and research, action
and research. However, the words action research reverse the actual sequence (Brown, 1972),
in that “research is conducted first and then action is taken as a direct result of what the
research data are interpreted to indicate” (Burke, 1992, p. 54). Moreover, the cycle yields new
knowledge about the organization and its issues that becomes useful for addressing future
problems. It thereby allows organizations to improve processes and practices while simultaneously learning about those practices and processes, the organization, and the change process itself.
Action research provides evidence, which enables a consultant to avoid guesswork about
what the issue is and how to resolve it. According to French and Bell (1999),
Action research is the process of systematically collecting research data about
an ongoing system relative to some objective, goal, or need of that system;
feeding these data back into the system; taking actions by altering selected
variables within the system based both on the data and on hypotheses; and
evaluating the results of actions by collecting more data. (p. 130)
Action Research Is a Democratic Approach to Problem Solving
Many theorists have characterized action research as democratic and collaborative:
“Action research is a participatory, democratic process concerned with developing
practical knowing in the pursuit of worthwhile human purposes, grounded in a participatory worldview” (Reason & Bradbury, 2008, p. 1).
“Action research is the application of the scientific method of fact-finding and experimentation to practical problems requiring action solutions and involving the collaboration and cooperation of scientists, practitioners, and laypersons” (French &
Bell, 1999, p. 131).
“Action research approaches are radical to the extent that they advocate replacing
existing forms of social organization” (Coghlan & Brannick, 2010, p. 6).
In addition, Coghlan and Brannick (2010) identified broad characteristics of action research:
Research in action, rather than research about action
A collaborative, democratic partnership
Research concurrent with action
A sequence of events and an approach to problem solving (p. 4)
These definitions are similar in that they all characterize action research as a democratic,
data-driven, problem-solving, learning-based approach to organization improvement. Some
other examples of how organizations apply action research include a nonprofit organization
that surveys donors or beneficiaries before engaging in strategic planning, a government
department that conducts a needs analysis prior to a training program, or a corporation that
conducts exit interviews before initiating recruitment for positions.
Consider This
Can you recall a project in your organization that involved members in a collaborative problem-solving mission? Chances are it was action research, even if that terminology was not
used. Can you think of any other examples?
Action Research Helps Clients Build Capacity for Future Problem Solving
Although typically guided by a consultant, action research engages key stakeholders in the
process. Indeed, its effectiveness depends on the active engagement and accountability of
the stakeholders. As discussed in Chapter 3, OD consultants are responsible for influencing
the action research process while at the same time exercising restraint to avoid solving the
problem for the client.
An example can illuminate how action research helps the client build problem-solving capacity. Suppose an organization introduces a process of assimilating new leaders when they join
it (action). The organization hires a consultant to survey team members about this initiative’s
effectiveness (research). The client and the consultant collaborate to develop the survey and
analyze the results. What is learned informs continued assimilation of new leaders and the
way the process gets modified (action). The client is initially engaged to learn the process so
that it can be repeated in the future without the help of a consultant. The action research process helps the organization collect, analyze, and apply data to make informed decisions and
not waste time and money on inappropriate interventions. Helping organizations become
proficient at the action research process is the outcome of effective consulting, because the
best consultants work themselves out of a job.
Models of Action Research
Recall from Chapter 1 that action research originated with the work of Kurt Lewin, the father
of OD. Lewin’s model (1946/1997) includes a prestep (in which the context and purpose of
the OD effort are identified), followed by planning, action, and fact finding (evaluation). Several models of action research generally follow Lewin’s, although the number and names of
steps may vary. See Table 4.1 for a comparison.
Table 4.1: Comparison of action research models to Lewin's original model

Lewin's (1946/1997) original action research steps:
0. Prestep: context and purpose of the OD effort
1. Planning
2. Action
3. Fact finding

Cummings and Worley (2018):
1. Entering and contracting
2. Diagnosing
3. Planning and implementing change
4. Evaluating and institutionalizing change

Coghlan (2019):
Prestep to determine context and purpose
1. Constructing: determining what the issues are
2. Planning action
3. Taking action
4. Evaluating action

Stringer (2013):
1. Look
   a. Gather relevant data
   b. Build a picture; describe the situation
2. Think
   a. Explore and analyze
   b. Interpret and explain
3. Act
   a. Plan
   b. Implement
   c. Evaluate
Figure 4.1: Plan, do, check action research cycle
The plan, do, check model of action research was popularized by the total quality movement. The contemporary action research cycle has more steps, although it essentially accomplishes the same work of diagnosing and designing (plan), implementing (do), and evaluating (check).
The model of action research used in this book has three phases, paralleling Lewin’s
(1946/1997) model (Figure 4.1): planning, doing, and checking. (See Who Invented That?
Plan, Do, Check Cycle to read about the person who originally developed plan, do, check.) Each
phase has substeps derived from multiple action research models:
1. Planning (the discovery phase)
a. Diagnosing the issue
b. Gathering data on the issue
c. Analyzing the data gathered
d. Sharing feedback (data analysis) with the client
e. Planning of action to address the issue
2. Doing (the action phase)
a. Learning related to the issue
b. Changing related to the issue
3. Checking (the evaluative phase)
a. Assessing changes
b. Adjusting processes
c. Ending or recycling (back to the planning stage) the action research process
The action research steps may look simple, and it may appear that planning change is a neat,
orderly, and rational process. In reality, though, it can be chaotic, political, and shifting, with
unexpected developments and outcomes. Nevertheless, learning the action research process
equips consultants with a proven method for navigating such shifts as they work with clients
on organization challenges.
Who Invented That? Plan, Do, Check Cycle
Although often attributed to quality guru W. Edwards Deming, the plan, do, check cycle was
created by Walter A. Shewhart of Bell Labs. Shewhart was an American physicist, engineer, and
statistician who was one of the originators of statistical quality control, which preceded the
total quality movement.
Consider This
In your life, what example do you have of action research? How have you employed plan, do,
check? What actions or adjustments were necessary?
4.2 Planning: The Discovery Phase
When beginning an OD intervention, the initial steps taken to identify the problem and gather
data about it are known as planning. The planning phase is a diagnostic one. The client and
consultant work with other organization stakeholders to study the problem and determine
the difference between desired outcomes and actual outcomes. The discrepancy between
what is and what should be is known as a performance gap. For example, if an organization
aspires to be first in quality in the industry but lags behind in second or third place, that would
be a performance gap. The organization would have to engage in performance improvement
practices to close the gap with its competitors. Or, perhaps a leader receives feedback that she
is not as skilled at leadership as she had thought. The leader begins to work with a mentor or
coach to identify what behaviors she needs to be more effective. By improving listening, recognition, and delegation behaviors, the leader begins to narrow the gap between her current
and desired future leadership performance.
Organizations perform gap analysis to assess reasons for a gap between reality and the
desired outcome. The performance gap idea can also be applied to yourself. Let us say you
aspire to a managerial position but have not achieved it. Upon analyzing the gap, you realize
you lack the training and experience to attain the position. If you decide to eliminate the gap,
you might enroll in a graduate program, earn a leadership certificate, or find a mentor to help
you attain your goal. Consider a performance gap you have experienced and complete the
chart in Figure 4.2.
Figure 4.2: Performance gap analysis
Use this chart to assess your own performance gap. Identify a desired reality—perhaps running a 5K.
Next, honestly note your current performance: Can you run around the block? Run or walk for a
mile? Once you determine the gap, fill out the middle column with specific action steps to move closer
to your goal—how will you close the gap?
Current reality
Steps to close the gap
Desired reality
Now that you have applied the gap analysis to yourself, let’s think about using it in an organization setting. Identify a desired reality—perhaps being first to market with a new technology. Next, honestly note the organization’s current reality. In the case of introducing the
technology: Does it have the right people to do the work? Is the technology ready for market?
Is the marketing campaign ready to go? Once you determine the gap, fill out the middle column with specific action steps to move the organization closer to its goal—how will you close
the gap? What would be the desired reality in your own organization? How equipped is it to
close the gap? What other performance gaps have you experienced?
Benefits of the Planning Phase
Planning is a critical phase of OD, because poor plans will result in poor outcomes such as fixing the wrong problem, wasting time and resources, and frustrating organization members.
The benefits of good planning include setting the OD process up for success through careful
analysis and diagnosis of the problem; engaging organization members from the beginning in
the processes of collaboration, ongoing learning, and capacity building in the action research
process; and prioritizing issues. See Tips and Wisdom: Alan Lakein to read and apply tips
about planning.
Tips and Wisdom: Alan Lakein
Time management guru Alan Lakein is credited with coining the phrase “Failing to plan is planning to fail” (as cited in Johnson & Louis, 2013, para. 1). This advice is to be heeded in OD. Planning is key to effective interventions. How does Lakein’s quotation apply to your experience?
Levels of Analysis
Before we delve into the steps of the planning phase, we should understand the location of
the OD effort—that is, the level at which the action research might occur. This is known as
the level of analysis. The OD effort might focus on the individual, group, organization, or system. Each level comes with its own issues, needs, and appropriate interventions. These levels,
along with appropriate interventions, were discussed in Chapter 2.
All levels of analysis, from the individual to the system, face similar issues. Cockman, Evans,
and Reynolds (1996) categorized organization issues according to purpose and task, structure, people, rewards, procedures, or technology:
Purpose and task refers to identifying the reason the organization exists and how its
members advance its mission.
Structure pertains to reporting relationships and how formal and informal power
relations affect the organization.
People issues relate to relationships, leadership, training, communication, emotions,
motivation and morale, and organization culture.
Rewards systems include financial and nonfinancial incentives available for performance and perceived equity among employees.
Procedures include decision-making processes, formal communication channels, and
policies; these form an important category for analysis.
Technology involves assessing whether the organization has the necessary equipment, machinery, technology, information, and transport to accomplish its tasks.
Table 4.2 identifies questions to ask about each area of Cockman, Evans, and Reynolds’s levels
of analysis.
Table 4.2: Cockman, Evans, and Reynolds's organizational issues and diagnostic questions

Purpose and tasks:
• What business are we in?
• What do people do?

Structure:
• Who reports to whom?
• Where is the power?

People:
• How are relationships managed?
• What training is provided?
• Who communicates with whom?
• How do people feel?
• How high is motivation and morale?
• What is the culture?

Rewards:
• What are the incentives to perform well?

Procedures:
• What are the decision-making procedures?
• What are the channels of communication?
• What are the control systems?

Technology:
• Does the organization have the necessary equipment, machinery, information technology, transport, and information?

Source: From Client-Centered Consulting: Getting Your Expertise Used When You're Not in Charge, by P. Cockman, B. Evans, & P. Reynolds, 1996, New York, NY: McGraw-Hill.
Identify a performance gap you are aware of personally or professionally and see if you can
answer Cockman, Evans, and Reynolds’s questions.
Steps in the Planning Phase
The steps in the planning phase include identifying the problem area, gathering data, analyzing the data, sharing feedback, and planning action. These steps illuminate the core problem
and identify key information for making an intervention.
Step 1: Preliminary Diagnosis of the Issue
When an OD process is initiated, it is imperative that the problem be correctly defined. Doing
so involves a process of diagnosis. A consultant’s job is to push the client to identify the
root cause of the problem, rather than its symptoms. Considering the QuickCo example, it
might have been easy for Ned to decide to put the department through a customer service
training based on the symptoms of late, erroneous orders. Had he done so, however, it likely
would have worsened matters, because no amount of customer service training would fix
the department’s interpersonal conflicts, poor communication, and ineffective conflict resolution. It may take intensive study and data collection to accurately diagnose a problem, but
doing so is well worth it.
The action research process begins by defining a problem that warrants attention. Consultants must ask good questions to illuminate a problem’s source. They can then move on to
the next step in the planning phase. Questions a consultant might ask a client include the following:
“What do you think is causing the problem?”
“What have you tried to fix it?”
“How has this attempt to fix the problem worked?”
“What has been stopping you from fully addressing this issue?”
In addition to asking questions to pinpoint the issue, consultants must ask questions about
who else will be involved in the OD effort. Also, as Chapter 3 explored, a consultant needs to
uncover the client’s expectations regarding the duration of the project and make sure the client is willing to assume an equal responsibility for outcomes.
Good questioning enhances one’s authenticity as a consultant. How have you diagnosed
problems in your organization? Have you ever misdiagnosed an issue? What were the consequences?
Step 2: Gathering Data on the Issue
Once QuickCo diagnosed the team’s lack of communication and interpersonal effectiveness
as the source of the problem, it was ready to collect information to inform next steps. This is
known as data gathering. Data can be gathered in many ways. The most common data collection methods in action research include interviews, questionnaires, focus groups, direct
observation, and document analysis.
Jack, the internal QuickCo consultant,
took several steps to better understand
the problem. He reviewed performance
trends and customer complaints, interviewed department members, and
relied on his own working knowledge
and observations of the department to
formulate a solid understanding of the
issues. What types of data have you
gathered to better understand organization issues? Methods of data gathering
are explored in detail in the next section
of this chapter.
Collecting data ensures the OD process is evidence based.
Step 3: Analyzing the Data
Once data has been collected, it must be turned into something meaningful and useful for the
client. Data collected to provide information about a problem is not useful until it is interpreted in ways that inform the issue and provide clues to possible interventions. For example,
a survey is not helpful unless it is examined within the organization’s context. Data analysis
will be more fully defined in the data analysis methods section later in this chapter.
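To make this concrete, here is a minimal, hypothetical sketch of one analysis step: aggregating questionnaire ratings (on a 1-to-5 scale) by category so that low-scoring areas stand out. The categories and scores are invented for illustration; as the paragraph above notes, a real analysis would also interpret these numbers within the organization's context.

```python
# Hedged sketch: aggregate hypothetical 1-5 survey ratings by category
# so a consultant can see which areas score lowest.
from statistics import mean

# Each response maps a question category to a 1-5 rating (invented data).
responses = [
    {"communication": 2, "conflict handling": 1, "morale": 3},
    {"communication": 3, "conflict handling": 2, "morale": 2},
    {"communication": 2, "conflict handling": 2, "morale": 3},
]

# Average each category across all respondents.
averages = {
    category: round(mean(r[category] for r in responses), 2)
    for category in responses[0]
}

# Sort lowest first: likely candidates for intervention.
for category, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{category}: {avg}")
```

Sorting the averages lowest first gives the consultant a starting list of candidate issues to explore in the feedback meeting, not a finished diagnosis.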
Step 4: Sharing Feedback With the Client
Once data has been collected and analyzed, a feedback meeting is scheduled in which results
are presented to the client. In the QuickCo example, Jack met with Ned and Sarah to share
his analysis. Feedback meetings require careful planning to keep the consultancy on track.
Consultants should decide on the key purpose and desired outcomes for the meeting. For
example, do they want the client to better understand the problem? Agree on a course of
action? Confront some issues affecting the problem? Sharing feedback with the client involves
determining the focus of the feedback meeting, developing the agenda for feedback, recognizing different types of feedback, presenting feedback effectively, managing the consulting presence during the meeting, addressing confidentiality concerns, and anticipating defensiveness
and resistance.
Step 5: Planning Action to Address the Issue
The last step of the planning or discovery phase is to plan the action that will be taken. This
planning might occur during the feedback meeting, or you might schedule a time at a later
date to give the client an opportunity to digest the data analysis and feedback. The outcome of
the planning is to design the activity, action, or event that will be the organization’s response
to the issue. This is known as an intervention. The type of intervention selected depends on
the organization’s readiness and capability to change, the cultural context, and the capabilities
of the OD consultant and internal change agent (Cummings & Worley, 2018). The intervention
will also target strategy, technology and structure, and human resource or human process
issues. The consultant and the client will collaboratively plan the appropriate intervention(s)
to address the issue. Chapter 5 will address interventions in detail.
4.3 Types of Research
OD is a joint endeavor between the client and the consultant that includes data gathering and
analysis. Involving clients in the data collection process reinforces their commitment to the
OD process. The consultant’s role in this process is to help the client focus on the root cause of
the problem and to organize the data collection and interpretation. A consultant’s objectivity
can be very helpful to clients, enhancing their understanding of how they might be contributing to the problem or how the issue plays out within the broader organization context.
Einstein is credited with saying, “If we knew what it was we were doing, it would not be called
research, would it?” (as cited in Albert Einstein Site, 2012, para. 4). People conduct research
when they have questions that do not have obvious answers. Depending on the question they
wish to answer, there are differing types of research.
Basic Research
The word research might evoke images of people working in labs, examining petri dish cultures, and making new discoveries. This type of research is known as basic research, and it
generally creates or extends the knowledge base of a discipline such as medicine, physics, or
chemistry through experiments that allow researchers to test hypotheses and examine perplexing questions. Basic research results in new discoveries and theories and includes innovations such as testing cures for cancer, establishing scientific laws such as gravity, or refuting
previously held beliefs such as the world being flat. There are other types of research beyond
basic, and they vary based on the type of question being asked.
Applied Research
When people seek to answer questions such as “What is the best way to facilitate learning
during change?” or “How do we motivate employees to embrace new technology?” they are
usually seeking to improve practice within a certain field. This is known as applied research
because its results are germane to problems and issues within a particular setting such as
business. This type of research is practical and helps people solve problems, but unlike basic
research, it does not necessarily yield new knowledge. OD is applied research because it asks
questions about challenges that are unique to the individual organizational context in which
they are located but does not necessarily expand our understanding of human behavior in general.
Action Research
Action research explores specific problems within a locality such as an organization or community. It might ask questions such as “How can we prevent employees from leaving Company A at a rate three times higher than the industry standard?” “How can Hospital B implement an electronic health record with minimal disruption to patient care?” or “How can we
lower poverty rates in Community C?” As the name implies and we have already covered,
action research involves recurring cycles of study and action regarding a problem within a
specific context. Action research is participative because it usually involves members of the organization or community under study.
OD generally engages in both applied research and action research because it aims to improve
practice (applied) within a specific context (action). When you engage in action research, you
are conducting a systematic inquiry on a particular organization problem by methodically
collecting and analyzing data to provide evidence on which to base your intervention. When
people do research in organizations, they are seeking not so much to generate new knowledge (or cure diseases) as to improve the quality of organization life. Action research is therefore a form of applied research because it seeks to directly address organization problems
and respond to opportunities in ways that improve the organization for all its stakeholders.
Evaluation Research
People may also want to judge the quality of something like an educational program, conference, or OD intervention. Here they might ask, “How was the learned information applied?”
“What was the most effective mode of delivery of instruction?” or “What are people doing
differently as a result of the intervention?” This type of research is known as evaluation
research. Evaluation seeks to establish the value of programs or interventions and judge
their usefulness. Evaluation can occur during the OD process, especially when the process is
being evaluated before, during, or after the intervention. We will learn more about evaluation
research in OD in Chapter 6. Refer to Table 4.3 for further description of the different types
of research.
Table 4.3: Different types of research

Basic research
• Contributes to knowledge base in field (basic, pure)
• Experimental
• Tests hypotheses

Applied research
• Improves practice in discipline
• Seeks to describe, interpret, or solve problems within specific settings
• Will not necessarily create new knowledge

Action research
• Addresses a particular, local problem (action research)
• Systematic inquiry into a specific problem within a specific setting
• Often involves participants
• Focused on practical problems, social change

Evaluation research
• Assesses value
• Measures worth or value of a program, process, or intervention
• Judges usefulness and effectiveness
• Establishes value
4.4 Research Methodology
In addition to the four types of research based on the types of questions asked, research can
also be classified according to the type of methodology that is used to collect data. Methodology represents the overarching philosophy and approach to collecting data.
Qualitative Research Methodology
When seeking to understand “how” a phenomenon occurs or unfolds (“How do leaders best
develop?”) or inquire into the nature or meaning of something (“How does participation on a
high-performing team affect individual identity and performance?”), a qualitative methodology is appropriate. Qualitative methodology is concerned with “understanding the meaning people have constructed” (italics in original, Merriam & Tisdell, 2015, p. 15) and has been described as “an
umbrella term covering an array of interpretive techniques which seek to describe, decode,
translate, and otherwise come to terms with the meaning, not the frequency, of certain more-or-less naturally occurring phenomena in the social world” (Van Maanen, 1979, p. 520).
Qualitative inquiry is not generally quantifiable but rather provides convincing evidence.
Qualitative data is generated from methods such as interviews, focus groups, or observations
that are commonly conducted as part of the discovery phase of action research. Qualitative
methods are rooted in constructivist philosophy—the idea that people build meaning from
experience and interpret their meanings in different ways. For example, two people would
likely define the meaning of life differently.
Qualitative research occurs within the social setting or field of practice, and data collection
is often referred to as “fieldwork” or being in the “field.” Qualitative approaches can effectively address organization members’ everyday concerns, help consultants understand and
improve their practice, and inform decisions. Examples of qualitative questions asked in OD
include “Why are employees dissatisfied with Organization Y?” and “What specific concerns
do employees have about anticipated changes in the organization?” Qualitative methodology
uses techniques that allow deep exploration of social phenomena through interviews, observations, focus groups, or analysis of documents.
Qualitative Research Characteristics
Qualitative research focuses on building meaning and understanding about social phenomena. The researcher (generally the consultant in OD) is the primary instrument for data collection and analysis. This means that it is the consultant who conducts interviews, focus
groups, or observations and then interprets or analyzes their meaning. Interpretation is considered an inductive process—that is, meaning is inferred from the data through a process
of comparison, reflection, and theme building. Unlike quantitative methodology, where study
participants are often selected at random, qualitative participants are chosen purposefully
and are individuals who can provide informed accounts of the topic under study. For example,
if a consultant wants to know about the experiences of new employees, he or she obviously
needs to ask new employees.
Qualitative Analysis and Results
Qualitative analysis provides a detailed account of the phenomenon. Direct quotations from
participants and a full depiction of the setting, issue, or individuals under study are known as
rich description. The design of a qualitative study is emergent and flexible, meaning that the
questions may change as new insights are gained. For example, if Sarah is conducting focus
groups on issues faced by new employees, a topic may arise that she wants to query future
groups about as she collects data.
Quantitative Research Methodology
When people want to know “how much” or “how many” of something, they generally seek a
quantitative methodology. For example, a researcher might ask, “What are the percentage
breakdowns of employee satisfaction in Organization Y, from very dissatisfied to very satisfied?” or “What is our organization’s productivity rate compared with the industry standard?”
Quantitative methods assume there is one correct answer to a question. This type of research
yields statistical descriptions and predictions of the topics under study.
Recall from earlier coverage in this book the process of survey feedback, in which employees
are given a questionnaire about the organization’s management, culture, or atmosphere. Surveys are regularly used in OD to assess issues such as attitudes, individual performance, and
technology needs or to evaluate certain functions or products. Surveys provide quantifiable
data, such as what percentage of employees feel management is doing a good job or what
percentage of employees plan to look for other work in the coming year.
Quantitative Research Characteristics
Quantitative techniques include surveys, questionnaires, and experiments that may involve
testing with control groups. For example, Team A might be trained on effective team dynamics and facilitation procedures. Its productivity and performance might then be measured
against Team B, which received no prior training. Quantitative studies are carefully designed,
and once data collection begins, they are not changed. For example, if Jonas were administering a survey to a population, he would not change the questions halfway through data collection. Samples in a quantitative study are random and large. A corporation of 40,000 employees being surveyed on their opinions about health benefits would target a smaller number
of randomly selected workers to provide a representation of what the majority of workers
would likely prefer.
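The random sampling described above can be sketched in a few lines of Python. This is a minimal illustration, not the textbook's method; the population size, sample size, and use of employee IDs are all hypothetical:

```python
import random

def draw_sample(population_size, sample_size, seed=None):
    """Draw a simple random sample of employee IDs without replacement."""
    rng = random.Random(seed)  # seeded so the sample is reproducible
    return rng.sample(range(population_size), sample_size)

# A corporation of 40,000 employees might survey a randomly selected subset
sample = draw_sample(40_000, 500, seed=1)
print(len(sample))  # 500 distinct employee IDs
```

Seeding the generator is a common practice so that another analyst can reproduce exactly which employees were selected.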
Quantitative Analysis and Results
Quantitative data is analyzed using a deductive process in which the numbers or statistics
will be used to determine an understanding of what is being studied. Assuming a benefits
survey was conducted in the previous example, the organization might learn that 60% of
employees prefer managed care, 40% want vision, and only 30% want dental insurance. The
company would use this information to modify its benefits packages.
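The deductive tally behind percentages like these can be sketched in Python. The responses below are invented for illustration; note that with multi-select questions the percentages can sum to more than 100:

```python
from collections import Counter

# Hypothetical multi-select responses: each employee picks the benefits they want
responses = [
    {"managed care", "vision"},
    {"managed care"},
    {"managed care", "dental"},
    {"vision"},
    {"managed care", "dental"},
]

def preference_rates(responses):
    """Percentage of respondents selecting each benefit option."""
    counts = Counter(benefit for r in responses for benefit in r)
    n = len(responses)
    return {benefit: round(100 * c / n) for benefit, c in counts.items()}

rates = preference_rates(responses)
print(rates["managed care"], rates["vision"], rates["dental"])  # 80 40 40
```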
Table 4.4 compares and contrasts qualitative and quantitative methods.
Table 4.4: Comparison of qualitative and quantitative research methods

Research focus
  Qualitative: Quality (nature, essence)
  Quantitative: Quantity (how much, how many)

Philosophical roots
  Qualitative: Phenomenology, symbolic interactionism, constructivism
  Quantitative: Positivism, logical empiricism

Associated phrases
  Qualitative: Fieldwork, ethnographic, naturalistic, grounded, constructivist
  Quantitative: Experimental, empirical

Goal of investigation
  Qualitative: Understanding, description, discovery, meaning, hypothesis generating
  Quantitative: Prediction, control, description, confirmation, hypothesis testing

Design characteristics
  Qualitative: Flexible, evolving, emergent
  Quantitative: Predetermined, structured

Sample
  Qualitative: Small, nonrandom, purposeful
  Quantitative: Large, random, representative

Data collection
  Qualitative: Researcher as primary instrument; interviews, observation
  Quantitative: Inanimate instruments (scales, tests, surveys, questionnaires)

Mode of analysis
  Qualitative: Inductive, constant comparative
  Quantitative: Deductive, statistical

Findings
  Qualitative: Comprehensive, holistic, richly descriptive
  Quantitative: Precise, numerical
4.5 Research Methods
Research methods are procedures used to collect data. They are based on the type of research
methodology used. Methods typically used in OD are profiled in this section.
Interviews
A conversation facilitated by the consultant for the purpose of soliciting a participant’s opinions, observations, and beliefs is an interview. Interviews give participants the opportunity
to explain their experience, record their views and perspectives, and legitimize their understandings of the phenomenon under study (Stringer, 2013). The interviews at QuickCo likely
asked employees about departmental problems, communication, leadership, and so forth.
Conducting interviews requires constructing questions that best address the issues under
investigation. For example, Jack might have asked the QuickCo shipping employees these
“What do you see as the top three challenges in the shipping department?”
“Can you tell me about a specific event that contributed to the problems you face
“What has to change for you to be happy here?”
“What have you tried to resolve the problem?”
“What role have you played in the shipping department?”
“How likely are you to leave your position in the next year?”
Recording interviews can be useful, but make sure you have permission from the participant
(interviewee) and prepare and test the recording equipment in advance. If you are not able
to record, you will want to take notes, but this is not ideal because it distracts you from what
the interviewee is sharing.
Interviews have several strengths. They provide in-depth insight into an interviewee’s opinions, attitudes, thoughts, preferences, and experiences. Interviews allow the interviewer to
probe and pose follow-up questions. Interviews can be done rapidly, particularly by telephone and email, and they tend to elicit high response rates.
Interviews also have several weaknesses, including that they can be costly and time consuming, especially when done in person. Interviewees may answer in ways they think will please
the interviewer rather than tell the truth. The quality of the interview is dependent on an
interviewer’s skill and ability to avoid bias and ask good questions. To avoid bias, an interviewer should set aside expectations about the problem and solutions and truly listen to what
the participants say during data collection. Interviewees may lack self-awareness or forget
important information and thus fail to provide good data. They may also have confidentiality
and trust concerns. Data analysis can also be time consuming.
Questionnaires
A questionnaire is an electronic or paper form that has a standardized set of questions
intended to assess opinions, observations, and beliefs about a specific topic, such as employee
satisfaction. It is a quantitative method. Questionnaires are also known as surveys, and one of
OD’s first interventions was survey research, as was discussed in Chapter 1. Questionnaires
measure attitudes and other content from research participants. The results can be quantified, often to show statistical significance of the responses.
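A sense of how trustworthy a quantified survey result is can be sketched with a normal-approximation confidence interval for a proportion. This is one standard statistical technique, not a method the text prescribes, and the figures below are hypothetical:

```python
from math import sqrt

def proportion_ci(p_hat, n, z=1.96):
    """Approximate 95% confidence interval for a survey proportion
    (normal approximation; z = 1.96 for 95% confidence)."""
    se = sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
    return (round(p_hat - z * se, 3), round(p_hat + z * se, 3))

# If 60% of 500 respondents agree with an item:
low, high = proportion_ci(0.60, 500)
print(low, high)  # 0.557 0.643
```

The interval narrows as the sample grows, which is one reason quantitative studies favor large random samples.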
Questionnaires are commonly administered to employees to inquire about the organization’s culture and climate and their satisfaction levels with their work, management, and
relationships. Participants are usually asked to rate the questionnaire items using a Likert
scale (described in Chapter 1). For example, they might rate an item such as “Management is
concerned with my welfare” on a 5-point scale from “Strongly Disagree” to “Strongly Agree.”
Questionnaires should feature clearly written questions that will yield actionable information.
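Summarizing responses to a Likert item like the one above is straightforward to sketch in Python. The ratings here are invented, and the "agree" cutoff of 4 or above is an assumption for illustration:

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to "Management is concerned with my welfare"
# on a 5-point scale: 1 = Strongly Disagree ... 5 = Strongly Agree
ratings = [4, 2, 5, 3, 4, 1, 4, 5, 2, 4]

def summarize_likert(ratings):
    """Mean rating, response distribution, and percent agreeing (4 or 5)."""
    dist = Counter(ratings)
    pct_agree = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)
    return {"mean": round(mean(ratings), 2),
            "distribution": dict(sorted(dist.items())),
            "pct_agree": pct_agree}

print(summarize_likert(ratings))
# {'mean': 3.4, 'distribution': {1: 1, 2: 2, 3: 1, 4: 4, 5: 2}, 'pct_agree': 60.0}
```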
Questionnaires and surveys have several benefits. They are inexpensive to administer, especially if done electronically or in groups. Software programs make surveys relatively easy to develop and distribute. Questionnaires provide insights into participants’ opinions, thoughts, and preferences. They allow rapid data collection and are generally trusted for confidentiality and anonymity. Questionnaires are reliable and valid when well constructed and permit open-ended data to be collected, as well as exact responses to direct questions.
Surveys and questionnaires are common data collection methods used in OD.
Questionnaires and surveys also pose some
challenges. They should be kept short or
participants may not complete them. Participants may answer in ways they think will please you
instead of telling the truth. They may not respond to certain items at all, especially if the
wording is unclear. Participants may not trust confidentiality or may feel that the survey is
tedious; thus, the response rate may be low. Finally, data analysis can be time consuming for
open-ended items.
Focus Groups
A group of approximately eight to 12 participants assembled to answer questions about a
certain topic is known as a focus group. Focus groups are similar to interviews, but they
are conducted collectively and facilitated by a moderator. Developing targeted questions is
important, as is inviting the right people who possess insight and experience relevant to the
problem. Focus group sessions should be recorded and transcribed verbatim, with participants’ permission.
Focus groups are beneficial for understanding participants’ thinking and perspectives, as well
as for exploring new ideas and concepts. Participants can generate new knowledge and ideas,
especially if they build off each other’s remarks. Focus groups might also yield in-depth information about problems or potential fixes. They can offer insight into the client organization’s
relationships and communications and may provide an opportunity to probe relationship
issues. Focus groups are relatively easy to organize and represent an efficient way to collect
data from several stakeholders simultaneously.
Focus groups also pose challenges. They might be expensive to conduct if participants are
brought in from multiple locations. Finding a skilled facilitator can be difficult. Participants
may be suspicious of the process and have confidentiality concerns. Participants may also be
overbearing, negative, or dominant during the session, so adroit facilitation is needed. If
employees are angry or worried, their emotions can dominate. Focus groups can also generate voluminous findings that may not be generalizable if the participants are not representative of the organization or that may not be relevant to the issue under investigation. Finally,
large amounts of data may be time consuming to analyze. Consultants should hone their focus
group facilitation skills, and resources for building this competency are listed at the end of
this chapter.
Direct Observation
Suppose Nina watches people, meetings, events, work processes, or day-to-day activity in the
organization setting and records what she sees. Nina is undertaking direct observation. This
data collection method involves recording observations in the form of field notes. Stringer
(2013) listed typical observations made in action research:
Places: the contexts where people work, live, and socialize, including the physical environment
People: the personalities, roles, formal positions, and relationships experienced by people in the setting
Objects: the artifacts in our contexts such as buildings, furniture, equipment, and materials
Acts: the actions people take (signing a form, asking a question)
Activities: a set of related acts (e.g., facilitating a meeting)
Events: a set of related activities (e.g., putting on a training session)
Purposes: what people are trying to accomplish
Time: times, frequency, duration, and sequencing of events and activities
Feelings: emotional orientations and responses to people, events, activities, and so forth
Direct observation has several benefits. It allows direct insight into what people are doing,
avoiding the need to rely on what they say they do. Observation offers firsthand experience,
especially if the observer participates in activities he or she observes. This is known as participant observation, and it is just as useful for observing what happens as for what does not
(for example, a manager may tell you she involves employees in decision making, but you may
observe her doing the opposite). An observation might yield valuable details that offer insight
into the organization’s context and politics that organization members may miss. Observational data may also provide a springboard from which to raise issues that people would
otherwise be unwilling to talk about.
Direct observation also poses challenges. It may be impossible to determine a rationale for
observed behavior. If people know they are being observed, they may alter their behavior.
Observations may be clouded by personal bias and selective perception. One must avoid overidentifying with the studied group so that observations are objective (this is especially challenging in the case of participant observation). Doing observation can be time consuming,
and access may sometimes be limited, depending on the type of organization. A consultant
may have to sort through observations that seem meaningless in relation to the problem. Data
analysis can also be time consuming.
See Tips and Wisdom: Effective Observation to read advice about undertaking productive observations.
Tips and Wisdom: Effective Observation

Doing effective observations requires planning, skill, and focus. Here are some tips to make your observations more robust:

Determine the purpose of the observation. Observations can be used to understand a wide range of activities in organizations such as how employees respond to a new office layout, how customers engage with employees, how supervisors relate to their subordinates, or how certain procedures are executed. You should be able to state in one sentence the focus of your observation: The purpose of this observation is to document personal safety equipment usage [specify time, procedure, or location]. Or perhaps you are more interested in the nature of interaction: The purpose of this observation is to understand what types of questions medical professionals ask during a clinic. Specificity saves the consultant from capturing a lot of extraneous data. In the first example, you might note the frequency and types of personal safety equipment used, and the conditions when it is not. In the second example, you might be interested in who is asking the questions, assumptions made about the cases, or what emotions are expressed. Clarity about purpose increases the likelihood of seeing what you are seeking.

Determine what is relevant to the observation. If you are observing participation and team dynamics during a meeting, what occurs during an outside interruption of the meeting is probably irrelevant to what is going on in the team.

Decide how to document the observation. Your choices include videotaping, audiotaping, photography, and notetaking. There is not a perfect method. Technology-assisted video or audio recording might subdue participants who feel self-conscious about the information they are sharing or are fearful of reprisals. Notes can miss key information and quickly lose their meaning. Notetaking can be assisted by creating a shorthand for participants (e.g., “ee” for “employee” and “mgr” for “manager”). Practice taking notes to build skill. Use more than one notetaker and then compare findings. Finally, create a checklist for the observation to make it easy to record items, such as a list of behaviors during meetings (interruptions, new ideas, constructive criticism, building on ideas, etc.).
Avoid interpreting what is observed and instead report it directly. So, if you were
observing personal safety equipment usage, you might say “Person A did not wear
safety glasses” instead of “Person A appeared to be distracted and hurried and forgot
to put on safety glasses.” You will likely have to pair observations with interviews or
focus groups to understand intentions behind behaviors and interactions you witness.
Document Analysis
Document analysis involves reviewing relevant records, texts, brochures, or websites to gain
insight into organization functioning, problem solving, politics, culture, or other issues. Documents might include memoranda, meeting minutes, records, reports, policies, procedures,
bylaws, plans, evaluation reports, press accounts, public relations materials, vision statements, newsletters, and websites. Most organizations produce a vast amount of documentation, so using this type of data requires a clear focus and purpose. For example, Jack, our
QuickCo consultant, reviewed performance trends and customer complaints to better understand the shipping department’s problems. If Jack were trying to help an executive improve
communication skills, he might review his client’s email correspondence to determine how
effectively and respectfully the executive communicates. This type of data collection can significantly inform the OD initiative.
Documents provide several advantages, including access to historical data on people, groups,
and the organization, as well as insight into what people think and do. Document analysis is
an unobtrusive data collection method, which minimizes negative reactions. Certain documents might also prove useful for corroborating other data collected; for example, Jack could
compare the executive’s email communications with colleagues’ accounts collected through interviews.
On the other hand, documents may provide little insight into participants’ thinking or behavior or may not apply to general populations. They can also be unwieldy and overwhelming
in the action research process. Confidential documents may sometimes be difficult to access.
Additional Data Sources
Although interviews, questionnaires, focus groups, direct observation, and document analysis are the most commonly used OD data sources, other sources of information include the following:
Tests and simulations: Structured situations to assess an individual’s knowledge or
proficiency to perform a task or behavior. For example, some organizations might
use an inbox activity to assess delegation skills during a hiring process. Others use
psychological tests to measure ethics, personality preferences, or behaviors. These
instruments can be used in hiring, team development, management development,
conflict resolution, and other activities.
Product reviews: Reviews of products or services from internal or external sources.
These can be useful for addressing quality or market issues.
Performance reviews: Formal records of employee performance. These can be particularly useful for individual interventions that are developmental or for succession
planning on an organization level.
Competitor information and benchmarking: Comparative analyses of what competitors are doing regarding the issue under exploration. Examples might include salary,
market, or product comparisons.
Environmental scanning: Analysis of political, economic, social, and technological
events and trends that influence the organization now or in the future.
Critical incidents: Interviews that ask participants to identify a specific task or
experience and pinpoint when it went well, when it did not go well, and what they
learned. Critical incidents were first used in military pilot training to identify and
eliminate mistakes.
4.6 Methods of Analyzing the Data
The most common types of research in OD
are survey research using quantitative
methods and qualitative inquiry that could
employ interviews, focus groups, observation, document analysis, or a combination
thereof. As you recall, quantitative methods
are used to determine “how much,” while
qualitative methods are used to determine
“how.” We have already identified the many
methods for collecting data; now, what do
you do with it?
Data points are simply bits of information
until they are assimilated in ways that tell
a story or provide deeper understanding of
a phenomenon. For instance, employee responses on a survey about job satisfaction are just
numbers on a page until interpreted. Once you know that 35% of respondents are only moderately satisfied and are clustered within a certain division or job classification, then you can
begin to understand the scope of the problem and consider interventions.
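The kind of interpretation described above, spotting where dissatisfaction clusters, can be sketched in Python. The divisions, satisfaction levels, and cutoffs below are all hypothetical, invented only to show the grouping step:

```python
from collections import defaultdict

# Hypothetical survey rows: (division, satisfaction level)
rows = [
    ("Shipping", "moderate"), ("Shipping", "moderate"), ("Shipping", "low"),
    ("Finance", "high"), ("Finance", "moderate"),
    ("Nursing", "high"), ("Nursing", "high"), ("Nursing", "moderate"),
]

def moderate_or_lower_by_division(rows):
    """Share of each division reporting 'moderate' or 'low' satisfaction."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for division, level in rows:
        totals[division] += 1
        if level in ("moderate", "low"):
            flagged[division] += 1
    return {d: round(100 * flagged[d] / totals[d]) for d in totals}

print(moderate_or_lower_by_division(rows))
# {'Shipping': 100, 'Finance': 50, 'Nursing': 33}
```

A result like this would point the consultant toward Shipping as the place to probe further with interviews or focus groups.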
A consultant’s job is to make sense of data and present it to the client. Such a presentation
should be in plain language and in quantities that the client can easily manage. It is advisable to involve the client and other relevant organization members in the presentation of the
analysis, because doing so promotes buy-in, collaboration, and accurate data interpretation.
There are several steps to analyzing data effectively. These steps differ depending on whether
you are doing qualitative or quantitative analysis. It is beyond the scope of this book to fully
train you as a researcher, so it is a good idea to gain additional training and experience in this
area if it interests you. Until you gain experience with data analysis, it is recommended that
you partner with someone who is an expert. If you have access to a university or other organizational research team, this can be an easy way of both finding a research expert and developing a research partnership. Such relationships help bridge theory and practice and can be
great opportunities to enhance your learning. There are also some suggestions for continued
learning listed at the end of this chapter. Case Study: Data Collection and Analysis at Jolt Transformers offers an example of how to ensure effective data analysis.
Case Study: Data Collection and Analysis at Jolt Transformers
Jo Lee of Design Solutions Consulting receives a phone call from Rex James of Jolt Transformers. “Rex, what can I do for you?” asks Jo, who has done work for Jolt in the past. “Jo, we’ve
got a problem with our technicians,” Rex replies. “We can’t keep them. We hire them and train
them, and then they go work for the competition for more money. Then the cycle repeats and it
seems we wind up hiring folks back again until they can jump ship for more cash. Our management team thinks they need more training.”
“What makes you think that, Rex?” Jo is skeptical that training is the solution in this case. She
listens a bit longer and sets up a time to meet with Rex and his division CEO. During the meeting, Jo asks several questions about the extent of the problem and what steps have been taken
to address it. The three agree that the first step is to collect more data to understand the scope
of the problem. They decide on a three-pronged approach: a survey of technicians, interviews
with key executives, and focus groups with selected technicians. These methods will provide
both quantitative and qualitative data.
Over the coming weeks, Jo and Rex work on developing a survey with a small team that
includes technician supervisors, technicians, and human resource personnel. They administer
the survey to each of the company’s 75 technicians. The survey results show that 70% are dissatisfied with their careers at Jolt and 62% are planning to apply elsewhere in the next year.
Jo and Rex also develop interview questions for the executives and a format and questions for
the technician focus groups.
During the interviews, it becomes clear to Jo that the executives believe the problem is that the
company lacks a training institute for technicians. A couple of executives want her to design a
curriculum to train the technicians more effectively. Jo is highly skeptical of this assumption,
however, because it runs counter to what she is learning from the technicians. Other executives
express concern that the company is not investing appropriately in the development and
retention of its work force. Jo thinks they might be on to something.
During the focus groups with technicians, Jo hears comments such as these:
“There is no clear career path at Jolt. The only way to progress is to go elsewhere.”
“The company doesn’t seem interested in us at all. They just want us to produce—the
faster, the better.”
“The competing companies provide a much better orientation program and connect
you with a mentor to help you develop your skills.”
“It’s a mystery what you have to do to get promoted around here. Instead of moving up,
you might as well just plan to move out.”
During the weeks that Jo collects and analyzes data, she undertakes several measures to promote a thorough, effective analysis. Each is discussed as a tenet of effective analysis related to
the case.
Design a systematic approach; keep a data log. Jo works with a team from Jolt to design a
process for collecting quantitative and qualitative data. As the data collection process unfolds,
Jo keeps a detailed log of the steps taken, especially for the interviews and focus groups. These
notes allow her to tweak the interview and focus group questions based on what she learns.
When you use data logs, you can keep them in the form of a journal or official memoranda that
highlight key steps, decisions, and emerging themes. These logs might include visual images of
what you are learning, such as models, system diagrams, or pictures. Write notes to yourself
as you analyze. Thoroughly documenting your procedures is good practice and should allow
another person to step in and repeat your data collection and analysis procedures.
Allow data to influence what is learned. Jo listens and watches carefully as she collects
data. Her attention to detail offers her new insights into prevailing assumptions at play in
the organization. She is able to add questions to the interviews and focus groups that push
participants to think more broadly about the problem. For example, she pushes executives
to provide evidence that a training institute would result in better retention of employees.
When the executives find they cannot provide clear answers, they reflect more deeply on the
problem. Jo is also able to probe more around the lack of development and retention activities
going on in the organization.
Constantly compare qualitative data. Constant comparison earned its name because it
involves a repetitive process of comparing themes that appear in the data until the researcher
arrives at a cogent list that satisfactorily explains the phenomenon. This involves careful study,
note making, and looking for patterns in the data. Having more than one set of eyes coding the
data and generating themes helps verify the analysis.
In the case of Jolt, two themes were found from the data analysis:
1. Technicians were dissatisfied with
a. the lack of a career path and
b. the level of support for advancement and growth.
2. Jolt was lacking
a. long-term strategies for recruitment, retention, and development of technicians and
© 2020 Zovio, Inc. All rights reserved. Not for resale or redistribution.
b. a strong orientation program.
The focus groups at Jolt began to produce themes and patterns related to technicians’ lack of
clarity regarding career paths.
Often, researchers will use technology to organize and compare qualitative data, such as
NVivo, ATLAS.ti, or the Ruona method (Ruona, 2005). This involves repeatedly reading the
transcripts or other documentation and coding similar issues. For example, if you continually noted issues related to poor leadership, you would assign a code such as “PL” to data that
speaks to that theme. As just mentioned, this approach is known as constant comparison.
Constantly comparing the data allows you to identify themes you may have previously missed
and also validate recurring ones. You should pay attention to all data, even data that does not
make sense or fit the emerging themes. Sometimes, the outliers can provide unique insight
that proves to be helpful in addressing the issue.
Code data. Qualitative data is coded, and the codes will eventually be grouped into themes.
This involves reading passages of the transcript and giving them codes. This is known as code
data. For instance, Jo might have coded Jolt’s transcripts with the following categories:
job dissatisfaction
learning and development issues
career progression
Once a series of themes has been established, it is best to narrow these down to a more manageable three to five ideas. Subgroups can be created under themes if necessary. You will likely
wind up with more themes than you will ultimately want to share with the client. The client
needs to find the initial analysis digestible and in accessible language. You may share additional analysis as it becomes relevant to client needs.
Document findings in an accessible way for the client. Jo had dozens of pages of data,
including survey results and interview and focus group transcripts. She distilled this information into specific key findings and recommendations that were not overwhelming to the client.
Find ways to disseminate findings with other practitioners and researchers. The best
research happens in organizations, yet it rarely gets shared further once the problem is solved.
It is helpful to attend professional meetings and conferences and to write up your findings for
other practitioners to review. This advances the OD knowledge base, promotes discussion, and
elicits new questions.
4.7 Methods of Sharing Feedback (Data Analysis) With the Client
Like the action research process itself, feedback meetings require careful planning to keep the
consultancy on track. As the consultant, you are responsible for identifying a meeting’s key
purpose and desired outcomes. For example, do you want the client to better understand the
problem? Agree on a course of action? Confront some issues affecting the problem?
Sharing feedback with the client involves determining the feedback meeting’s focus, developing the meeting’s agenda, addressing confidentiality concerns, recognizing different types
of feedback, presenting feedback effectively, managing your consulting presence during the
meeting, and anticipating defensiveness and resistance. Each of these is discussed in this section.
Determining the Focus of the Feedback Meeting
Several issues should be considered when planning a feedback meeting. What outcomes do
you seek? Do you want to enhance understanding of the problem? Obtain agreement on a
course of action? Study the issue further? No matter the meeting’s focus, there are at least two
issues that must be incorporated into the feedback meeting design:
1. data analysis presentation and
2. discussion about the analysis and recommendations for future action.
Keep your goals in mind as you plan the meeting. Structure it to help the client move to the
next phase. Allow time to present results, and engage the client in a conversation about the
data. In the spirit of authenticity, plan ways to solicit feedback on your consultation during
the meeting, perhaps by asking questions such as “Are you getting what you need from me?”
“Is this what you expected?” and “What don’t you understand?”
Developing the Meeting Agenda
As you create the agenda, you will want to split the meeting into two parts: (a) data analysis presentation and (b) dialogue about the analysis and next steps. Block’s (1999) meeting
agenda format for a feedback session has been adapted into the following:
1. Restate the original contract (described in Chapter 3).
2. State the purpose, outcomes, and process for the meeting.
3. Present the analysis.
4. Present recommendations.
5. Ask for client reactions. For example, “Are you surprised by anything I’ve said?” or
“Is this meeting your expectations?”
6. Be authentic. Ask the client during the meeting, “Are you getting what you want?”
7. Make a decision on actions or next steps and assign tasks and dates.
8. Address concerns and assess commitment.
9. Reflect on whether your goals were met—conduct a meeting evaluation and ask for
feedback on your consulting.
10. Close with support and a focus on the next steps.
Block (1999) suggested beginning with a compelling statement that explains why the problem exists and outlines the consequences if no action is taken over the short and long term.
Next, recommend solutions in collaboration with the client, identifying anticipated benefits.
Once the feedback meeting is completed, it is a good idea to conduct a meeting postmortem. This involves evaluating the meeting, reflecting on what happened, soliciting input from
stakeholders, and seeking feedback on your consulting.
Addressing Confidentiality Concerns
The agenda should be structured in a way that respects confidentiality and anonymity, especially when presenting data analysis. Consultants will invariably work with sensitive data.
They have an ethical obligation to simultaneously provide the client access to the data but
also protect the confidentiality of the people who provided it. Consultants should verify
data usage and confidentiality expectations in the contracting process outlined in Chapter 3.
Detailing such expectations in writing allows everything to be spelled out should the client
ever insist on viewing the raw data.
Collecting sensitive data—such as an attitude survey in an organization where there is high
employee dissatisfaction—requires taking appropriate research measures to protect the confidentiality and anonymity of participants.
A consultant’s credibility can be compromised if he or she lapses in the area of data confidentiality. It is especially important to protect confidential data when under pressure. A colleague
once worked as an external consultant for a company whose president demanded raw data
from an attitude survey that was highly negative. He insisted that he “owned” the data because
he had paid for the survey. The consultant quit and took the data with her rather than violate
her ethics and turn the data over to the president. Protecting confidentiality will enhance
your integrity and authenticity. You can protect confidentiality by keeping your promise not
to share raw data, protecting your data sources (e.g., completed surveys) by keeping them in
a secure location, and limiting access to the data only to individuals whom you trust and who
need to work with it.
Recognizing Different Types of Feedback
Consultants recognize two broad types of feedback. Positive feedback involves sharing what
the client is doing well or what is working in the organization. Negative feedback involves
sharing what the client is doing badly or what is not working.
It is important that the client hear the good things before the consultant delves too deeply
into the opportunities to improve. However, not all positive feedback is helpful. Positive feedback that undermines problem-solving progress is known as destructive positive feedback.
Examples include offering unwarranted praise or saying what the client wants to hear instead
of what the client needs to hear. Destructive positive feedback is counterproductive to helping clients solve problems because it convinces them they are doing well enough and do not
need to change. In contrast, positive feedback that helps the client is constructive positive
feedback. Examples include describing what the client does well, what others appreciate,
successes, and behaviors that are helpful to others. What types of destructive or constructive
positive feedback have you received?
Feedback that hurts the client is known as destructive negative feedback. Examples
include put-downs, insults, or nonspecific criticism. This type of feedback is not helpful and
may even erode progress on the problem. Negative feedback that helps the client is known
as constructive negative feedback. Examples include outlining what the client does badly,
failures, behavior that hinders others, behavior that is uncomfortable for others, and specific criticism. You can probably recall receiving both forms of feedback and how they made
you feel.
Presenting Feedback Effectively During the Meeting
It is a big job to analyze data and decide what
to share at the feedback meeting. It is likely
that you will not present all of the data collected. Cummings and Worley (2018) suggested that feedback is most useful when it is
relevant to the client and presented in an
understandable and descriptive way. Clients
also want information that is verifiable, significant, timely, and not overwhelming to
digest. You will want to ensure that the data
is balanced; include the success data in addition to the failure data. It is also helpful to
provide comparative data when available,
such as cross-department comparisons or
benchmarking with other competitors or
industries. You should also be willing to collect more data as needed.
It is important that consultants communicate data and analysis effectively to their clients. How might communication methods change from client to client?
Consultants should present feedback in a way that enables the client to hear it. Whether negative or positive, feedback should be constructive, or helpful to the client. Regardless of the
feedback shared, it must be delivered with respect; feedback should never come across as
hurtful or insulting. In all cases, feedback should be based on available evidence—the data
that has been collected and analyzed. Block (1999) urged that it is important to be assertive
and use language that is descriptive, focused, specific, brief, and simple. Avoid language that
is judgmental, global, stereotyped, lengthy, or complicated.
Managing Your Consulting Presence During the Meeting
This book has discussed the need for consultants to be authentic and to complete each phase
of action research. How consultants give feedback is critical because it affects how well the
client will hear and accept the message. Striking an effective stance during feedback involves
being respectful, providing direct and constructive description, and anticipating resistance. It
is imperative to respect the client; hurtful feedback is not productive, so make sure the feedback is constructive and nonjudgmental.
In addition to being respectful, consultants should provide direct, constructive description.
This involves being assertive and straightforward about the analysis. The feedback meeting
is not the time to timidly sugarcoat results, particularly if they are negative. As discussed
under types of feedback, there are ways to constructively deliver negative feedback. The way
feedback is described will affect the client’s receptivity to it. Describing a problem clearly,
directly, and convincingly helps the client absorb the breadth and depth of the issue without
getting overwhelmed by detail. Include data that links to both the root cause and the symptoms (often the presenting problem). It can be helpful to highlight data in areas where the client has the responsibility and authority to make changes. It is also beneficial to include data
that the client views as important and that calls attention to problems where there is a commitment to change. For instance, if the organization has an ongoing initiative, such as work
teams, showing evidence of their activities and results can validate the data and reinforce the
value of the action research process in implementing change.
It is also advantageous to anticipate aspects of the feedback that are likely to cause client
defensiveness and to come prepared to defuse them. Do not allow the client to project frustration about the data onto you—you are just the messenger. You should also identify stakeholders who will be absent from the meeting and plan to follow up with them about the meeting’s
content and outcomes. Be prepared to deal with resistance as directly and constructively as
you presented the data, and invite the client’s assessment of the problems and courses of
action. Anticipating what might come up during the meeting helps a consultant effectively
prepare for the unexpected.
Managing Defensiveness and Resistance During the Meeting
Clients often become defensive about feedback, particularly if it is negative or will require
significant changes. For instance, suppose Janessa was assessing an organization’s retention
issues and had data indicating that women and people of color were leaving the organization
due to discrimination and harassment. She might anticipate denial and defensiveness from
a mostly White, male-dominated organization. Having benchmarking data handy on how
other organizations have dealt with this issue would be one way Janessa could counteract
this defensiveness.
In addition to being defensive, clients might also resist making changes. A consultant might
hear, “It will cost too much,” “We don’t have time to do this!” or “This will never work here.” A
good way to respond to such sentiments is
to push the client to consider the cost of not
adopting the solution. A consultant might
ask, “Do you still want to be dealing with this
problem 6 months from now?” “What are
you afraid of?” or “What is your opposition
really about?” Pursuing a strong line of questioning helps the client see faulty reasoning
in the resistance. Another tactic to thwart
resistance is to invite those you anticipate
will be the most resistant to attend the feedback meeting so they become involved in
determining the intervention and thus
develop buy-in to the solution.
Resistance is to be expected during feedback. Consultants must move clients beyond resistance and get them to collaborate on the solution.
In addition to anticipating defensiveness and resistance, consultants should make sure the feedback is as constructive and descriptive as possible. For example, instead of making negative, destructive, and vague statements such as “Your management structure isn’t working,” a consultant might say, “According
to the survey, employees are confused about lines of authority and the vision for the organization.” The second statement is nonjudgmental and provides more detail than the first. Also,
clearly describing authority lines and vision gives the client something tangible to work on.
Consultants should determine feedback points that are likely to cause defensiveness and
anticipate in advance of a meeting what form that defensiveness might take. They should
also develop questions that will help the client express resistance or defensiveness. These
might include “What points of the feedback concern you?” or “Are there points you disagree
with?” These questions will spark dialogue regarding the aspects of the feedback that are
troubling to the group. When a consultant detects defensiveness and resistance, he or she
should address it swiftly and tactfully, because doing so enhances the consultant’s authenticity and credibility.
Smoothly managing the feedback process is a key competency of an OD consultant. Tips and
Wisdom: Managing the Feedback Process offers some tips to help you develop this skill. Use
Assessment: Develop a Force Field Analysis to anticipate different types of support and resistance a consultant might encounter during a feedback meeting.
Tips and Wisdom: Managing the Feedback Process
• The client has a right to all the information collected (use of data should be established
in the contracting process, including confidentiality).
• Not all of the data collected will be used. It is a consultant’s job to synthesize the data so
that it is useful to the client.
• Include success data in addition to the “failure” data.
• Offer constructive feedback with respect.
• Respect confidentiality and anonymity. This enhances a consultant’s integrity.
• Include data that calls attention to the root cause as well as the symptoms (often the
presenting problem).
• Avoid sugarcoating data that the client may not want to face.
• Highlight data in areas where the client has responsibility and authority to make changes.
• Use data to highlight a manageable number of problems.
• Include data the client will view as important; such data calls attention to problems
where there is a commitment to change.
• Avoid inundating the client with detail.
• Avoid allowing the client to project frustration about the data onto you.
• Be prepared to deal with resistance.
• Invite the client’s assessment of the problems and courses of action.
Assessment: Develop a Force Field Analysis
Lewin (1948, 1951), widely considered the originator of OD, defined behavior as a combination of a person’s personality and his or her perception of the context where he or she is
engaged (situation and/or environment). Lewin saw the context as a field of forces affecting
the person. Lewin considered forces to be imposed from within (internal) or induced by others (external). For example, an aspiring leader is likely to be more motivated to improve his
or her leadership skills from an internal desire to do so, rather than a directive from his or
her boss to become a better leader. Lewin’s force field analysis is a depiction of the influences
that encourage (driving) or impede (restraining) a person from making a change. Returning
to the person wishing to become a better leader, a driving force might be an intrinsic desire
to mentor and coach others to be their best. A restraining force might be the costs and time
associated with developing leadership skills.
Using Lewin’s force field analysis to effect change involves first creating a force field analysis to understand what both drives and restrains change, and second, increasing or decreasing the intensity of the forces in ways that move the person in the direction of the desired
change. Either bolstering the driving forces (desiring to become a better leader, experiencing
positive interactions mentoring, enjoying learning, and so forth) or reducing the restraining
forces (finding resources to pay for leadership development, creating time and opportunities
to practice leadership, and so on) helps the person improve leadership skills. Lewin’s theory
predicted that diminishing restraining forces had more impact in facilitating change. The value
of this approach to change is that it helps individuals and organizations understand not only
the nature of change but also how to accomplish it more effectively (Burke & Noumair, 2015).
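The arithmetic behind a force field analysis is straightforward: sum the driving forces, sum the restraining forces, and compare. This minimal sketch uses made-up forces and intensity scores (on an assumed 1–5 scale) for the leadership example; the specific entries and numbers are illustrative assumptions.

```python
# Hypothetical force field for the goal "become a better leader".
# Intensities are invented scores on a 1-5 scale.
driving = {
    "desire to mentor and coach others": 4,
    "enjoyment of learning": 3,
}
restraining = {
    "cost of leadership development": 3,
    "lack of time to practice leadership": 4,
}

def net_force(driving, restraining):
    """Positive values favor the change; negative values favor the status quo."""
    return sum(driving.values()) - sum(restraining.values())

before = net_force(driving, restraining)            # 7 - 7 = 0: stuck in place

# Lewin predicted that diminishing restraining forces has more impact,
# e.g., blocking out calendar time to practice leadership.
restraining["lack of time to practice leadership"] -= 2
after = net_force(driving, restraining)             # 7 - 5 = 2: change favored
```

Note that the same net shift could come from bolstering a driving force; modeling both options side by side is one way to compare intervention choices.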
Instructions: Develop a force field analysis to anticipate different types of support and resistance a consultant might encounter during a feedback meeting. Visit your e-book to download
an interactive version of this assessment.
List the topic of feedback:
Driving forces of support for the analysis:
Restraining forces against the analysis:
Summary and Resources
Chapter Summary
Action research is a recurring, collaborative effort between organization members
and OD consultants to use data to resolve problems.
The three phases of action research are planning, doing, and checking.
Many OD theorists follow Lewin’s model, although the number and names
of steps may vary.
Planning is an opportunity to conduct a performance gap analysis to examine the
difference between what is and what should be.
Benefits of the planning phase include setting the OD process up for success through
careful analysis and diagnosis of the problem, engaging organization members from
the beginning in the process of collaboration, ongoing learning and capacity building
in the action research process, and prioritizing issues.
The levels of analysis include the individual, group, organization, or system. Issues to
address at each of these levels include purpose and task, structure, people, rewards,
procedures, and technology.
Planning is the first phase of action research and consists of five steps: identifying
the issue, gathering data on the issue, analyzing the data, sharing feedback with the
client, and planning action to address the issue.
Different types of research answer different types of questions. Types of research
include basic, applied, action, and evaluation.
Basic research seeks to create new knowledge based on experiments and hypothesis testing.
Applied research explores practical questions and seeks to improve practice. It may
not necessarily create new knowledge.
Action research addresses particular problems within specific contexts, such as an
organization. It is also applied research because of its practical nature.
Evaluation research assesses the value of programs, processes, or techniques and
judges their effectiveness.
A qualitative research methodology is used to understand how a phenomenon
unfolds or occurs and to create meaning and understanding about the topic
under study.
A quantitative research methodology is focused on measuring how much or how
many of something. Its goal is to interpret statistics so they are meaningful within
the context they are derived from, such as an organization.
An interview is a qualitative data collection method that solicits opinions, observations, and beliefs about a particular social phenomenon by asking the interviewee to
reflect on questions.
Questionnaires are quantitative data collection instruments that survey participants
about their opinions, observations, and beliefs according to a standardized set of questions.
Focus groups bring eight to 12 participants together to collectively reflect on questions that are posed to the group, explore issues, and generate new ideas.
Direct observations are conducted by watching the operations and interactions taking place in the organization.
Document analysis is the use of relevant records, texts, brochures, or websites to
gain insight into the way the organization runs, solves problems, manages politics,
develops culture, and makes decisions.
Multiple data sources exist to provide information when engaging in OD. The key is
to find the methods that will yield the most useful data.
The approach to data analysis will depend on the methods used to collect it.
The case study about data collection and analysis at Jolt Transformers offered a
realistic account of how to analyze data and modeled the following tips: be systematic, keep a data log, let the data influence how you learn and think about the problem, constantly compare data, code the data, document the findings so the client can
understand them, and disseminate what has been learned.
When planning to give feedback to the client, decide the focus of the feedback meeting based on the outcomes the client needs, such as better understanding of the
problem, agreeing on a course of action, or deciding to study the issue further.
Take time to develop a detailed meeting agenda that includes the data analysis and
presentation and a conversation about the analysis and next steps.
Make efforts at every step of the research process to protect confidentiality and
ensure the client is comfortable.
Be cognizant of what type of feedback is shared. Consultants must share both negative and positive feedback with clients. They should avoid feedback that is positively
or negatively destructive, such as saying what the client wants to hear, glossing over
problems, or sharing hurtful information.
Ensure that the information presented to the client is relevant, succinct, verifiable,
timely, and not overwhelming.
Consultants should strike a composed, confident, respectful, and competent stance
during meetings. These are imperative in helping the client view the consultant as
an authoritative partner.
Plan to defuse resistance and defensiveness during the meeting.
Think About It! Reflective Exercises to Enhance Your Learning
1. Recount a time you participated in a data collection in your organization. What was
the method (interviews, questionnaire, etc.)? How was the data used? How well did
the consultant do, based on the principles presented in this chapter?
2. Reflect on your presence as an OD consultant. What are your key strengths and challenges?
3. Recall a time you facilitated problem solving with a group (or anticipate a future
opportunity). What were your biggest strengths and challenges related to overcoming resistance?
4. Identify an issue in your organization that warrants further study using action
research. Outline the types of data you would collect, the participants, analysis, and
how you would go about sharing feedback with the leadership.
Apply Your Learning: Activities and Experiences to Bring OD to Life
1. Identify a challenge in your organization and use the questions in Table 4.2 to identify key variables.
2. Identify a problem in your organization and plan a data collection process to examine the issue. Will you take a qualitative or a quantitative approach? Why?
3. Observation is an important skill to hone in OD. During the next week, play the role
of observer in the organization of your choice. You may want to keep notes using
the tips and chart of observations provided in this chapter. See what you can learn,
particularly the contradictions between what people say and what they do. What
questions might you ask if you were an OD consultant, based on your observations?
Additional Resources
Quantitative Data Analysis

Qualitative Data Analysis

Web Links
• “Basics of Conducting Focus Groups” (Free Management Library), useful for planning, developing, and conducting focus groups, and for the period immediately after the session:
• “Guidelines for Conducting Focus Groups,” by consultant Susan Eliot:
Further Reading
Creswell, J. W. (2017). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks,
CA: Sage.
Flick, U. (Ed.). (2017). The Sage handbook of qualitative data collection. San Francisco, CA: Sage.
Green, J., & Thorogood, N. (2018). Qualitative methods for health research. San Francisco, CA: Sage.
Maxwell, J. A. (2005). Qualitative research design: An interactive approach (6th ed.). Thousand Oaks, CA: Sage.
Neuman, W. L. (2016). Basics of social research (3rd ed.). Upper Saddle River, NJ: Pearson Education.
Roulston, K. (2010). Reflective interviewing: A guide to theory and practice. London, England: Sage.
Wolcott, H. (2008). Writing up qualitative research (3rd ed.). Thousand Oaks, CA: Sage.
Key Terms
applied research Research that is germane to problems and issues within a
particular setting, like an organization; it
explores practical questions and seeks to
improve practice but may not necessarily
create new knowledge.
basic research Research that seeks to
make new discoveries, test hypotheses, and
create new knowledge.
code data Qualitative data is coded, and
the codes will eventually be grouped into
themes. This involves reading passages of
the transcript and giving them codes.
constant comparison The process of
comparing data codes until clear themes
emerge when conducting data analysis.
constructive negative feedback Helpful
feedback about what the client or organization is doing poorly, such as failures or
destructive behaviors, or specific criticism
that is shared with respect.
constructive positive feedback Helpful
feedback about what the client or organization is doing effectively.
constructivist philosophy The idea that
people build meaning from experience and
interpret their meanings in different ways.
data gathering The process of collecting
information related to the client’s problem
or issue that informs the decisions and
actions taken in the OD initiative.
deductive A quantitative data analysis process in which numbers or statistics are used
to determine an understanding of what is
being studied.
destructive negative feedback Hurtful
feedback about what the client or organization is doing poorly, such as put-downs,
insults, or nonspecific criticism.
destructive positive feedback Positive
feedback that undermines progress toward
solving the problem, such as saying what
the client wants to hear or offering undeserved praise.
interview A qualitative data collection
method that solicits opinions, observations, and beliefs about a particular social
phenomenon by asking the interviewee to
reflect on questions.
direct observation The process of watching people, meetings, events, work processes, or day-to-day activities related to
the OD issue or problem in the organization.
methodology The approach to data collection that is grounded in the overarching
philosophy of the researcher and research
diagnosis The determination of the root
cause of a problem; based on a process of
data collection, analysis, and collaboration
with the client.
document analysis The review of relevant
records, texts, brochures, websites, or other
documentation to gain insight into the way
the organization runs, solves problems,
manages politics, develops culture, and
makes decisions.
evaluation research Research that
assesses the value of programs, processes,
or techniques and judges their effectiveness.
feedback meeting A meeting where the
results of data analysis related to the issue
the client is experiencing are shared, along
with an assessment of the positive and
negative aspects of the organization.
focus group A group of approximately
eight to 12 participants who have specialized knowledge or experience relevant to
an issue or problem in the organization and
are led through a series of questions and a
discussion about the issue by a consultant.
gap analysis The process of assessing
reasons for a gap between desired performance or outcomes and reality.
inductive process Inferring meaning
from data through a process of comparison,
reflection, and theme building.
level of analysis The location of the OD
effort at the individual, group or team, organization, or system level or a combination of
levels. Each level has unique needs, issues,
and appropriate interventions.
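The arithmetic behind the gap analysis defined above can be sketched in a few lines of Python. Every metric name and number below is hypothetical, chosen only to illustrate comparing desired performance against reality:

```python
# Minimal gap-analysis sketch: compare desired performance targets
# against measured reality and report each gap, largest first.
# All metrics and values are invented for illustration.
targets = {"patient_satisfaction": 90, "nurse_retention": 85}
actuals = {"patient_satisfaction": 78, "nurse_retention": 70}

# gap = desired performance minus actual performance
gaps = {metric: targets[metric] - actuals[metric] for metric in targets}

for metric, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{metric}: gap of {gap} points")
```

The calculation only surfaces the size of each gap; the glossary definition is careful to say that gap analysis is about assessing the *reasons* for the gap, which requires the qualitative work described elsewhere in this chapter.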
negative feedback Information shared
with the client about what is working
poorly in the organization or what the client
is doing badly.
participant observation Direct observation that includes participation of the
researcher (consultant), such as during a
performance gap The discrepancy
between what is and what should be in
terms of desired organization performance
or outcomes.
planning Also known as the discovery
phase of action research; the initial steps
taken to identify a problem and gather data
about it.
positive feedback Information shared with the client about what the client is doing effectively or what is going well in the organization.
qualitative methodology A form of research into the nature of social phenomena; it usually investigates how something
quantitative methodology A form of research that attempts to quantify data, asking questions related to how much or how many.
questionnaire A paper or electronic series
of questions that survey participants about
their opinions, observations, and beliefs.
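Questionnaire responses are typically tallied before any analysis. The following Python sketch is a hedged illustration, not from the text: the response data are invented, and a 1-5 Likert-style rating scale is assumed:

```python
# Hypothetical tally of questionnaire responses on an assumed
# 1-5 Likert scale, displayed as a simple text histogram.
from collections import Counter

responses = [4, 5, 3, 4, 2, 5, 4, 3, 4]  # invented data
tally = Counter(responses)

for rating in range(1, 6):
    count = tally.get(rating, 0)
    print(f"rating {rating}: {'#' * count} ({count})")
```

A tally like this gives the consultant a quick picture of the distribution of opinions before deciding whether deeper quantitative or qualitative analysis is warranted.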
research methods The procedures used to
collect data.
rich description A detailed account of the
phenomenon, with direct quotations from
participants and a full depiction of the setting, issue, or individuals under study.
Action Research:
The Doing Phase
[Chapter-opening photo: monkeybusinessimages/iStock/Getty Images Plus]
Learning Objectives
After reading this chapter, you should be able to:
• Describe factors that influence a client and organization's readiness for change and promote acceptance of interventions.
• Define an OD intervention, including the different ways to classify interventions and the criteria for choosing an appropriate intervention.
• Discuss common issues related to monitoring and sustaining change, including the reasons that interventions fail, the ethics of the implementation stage, client resistance, and strategies to sustain change.
• Explain the consultant's role in implementing OD interventions and how to promote learning to sustain them.

A major land-grant university received federal funding to promote education among public
health employees in a southern state. As soon as the monies were awarded, several educational
initiatives began to serve multiple stakeholders across the state. One of the projects that James,
the grant’s principal investigator, wanted to initiate was a leadership academy for mid-level
managers with potential to advance to higher levels of public health leadership in the state.
Previous analyses of the organization, including succession p…