How to debrief an OVS
OVS Process Phase 3: Reflect – Debrief
Typically, you will begin by debriefing senior leaders (or, first, those who commissioned the survey), providing information on organizational climate strengths and opportunities for improvement. Then, either you (as an external consultant) or the leader(s) will share the results with organization members.
Leadership Debrief
Usually this process will be conducted in a face-to-face meeting of approximately 60-90 minutes with the organization leader(s). The goal is to frame the feedback in a way that’s useful to moving the client forward and to suggest options to enhance organizational climate. This requires careful thought about the report outcomes and potential next steps. We recommend preparing some options for the client to consider prior to the debrief meeting.
A typical OVS debrief begins with context setting, then follows the outline of the OVS Report.
Set the Context: review the goals set with client in the Engage (project scoping) phase.
OVS overview: review the goal of the OVS and the “OVS Highlights” dashboard page and explore the leaders’ views on these results.
Organizational Orientation: analyze the organization’s style in conjunction with the organization’s goals and objectives.
Engagement and people’s scores: consider the organization members’ engagement, driver, and outcome scores and their distribution.
Drivers & Outcomes: deep dive into each driver and its connected pulse points, outline the potential outcomes associated with the climate factors, and discuss any variance in scores.
Pulse Points, Comparison & Additional Questions: go into detail with the 15 pulse points to understand more about how the organization works and look at comparisons for the subgroups that were defined as well as results from additional questions.
Action Plan & Conclusions: prepare to move ahead
1. Set the Context
Review the “Why, What, Who, How contract” from Phase 1. Be sure to briefly revisit:
Why are you meeting, what is the purpose of this debrief?
Review the scope and your “contract” (how long, next steps, expectations).
Now turn to the report.
2. OVS overview
Why Organizational Vital Signs
On page 2, point out the idea that climate is like the “weather” in the organization – is this an organization where people worry it will rain, or one where they come in expecting lovely weather? The feelings and expectations are tied to many factors such as relationships in the organization, the leadership team, how people fit into the larger organization, and the evolution of the organization.
Welcome to OVS
On page 3, introduce the OVS, its structure and how this process works.
Background
On page 4, recap how this OVS project was set up, who was invited, and when the data was collected, and explain the validity of the data.
OVS Highlights
On page 5, explore the organization’s results from several different perspectives. The first row shows the VS value chain, from the Drivers that create Engagement to the Outcomes that explain the organization’s performance.
You can use this page as a stand-alone summary to share or to use in your presentation.
The OVS Model
Review the VS Model structure and definitions on page 6. It may be helpful to lay this page next to the OVS Highlights or the Combined Snapshot pages.
3. Organization’s Orientation
Assess the implications of the Organization’s Orientation for the challenges and objectives of the organization. This is a snapshot in time: the team style can change depending on the organizational climate and on internal and external circumstances. Review the reflection questions to find out how this can impact the organization in the short and long term.
4. Engagement and the organization’s scores
Engagement
On page 8, explain that “Engagement” is calculated for each respondent based on his/her overall scores. The graph on the Individual view shows what percentage of respondents are fully engaged (for example, in a group of 9, if the graph shows 33% orange, there are 3 people who are fully engaged). Review the definitions of “Engaged,” “Neutral,” and “Disengaged” from the report.
In the OVS normative database, 25% are Engaged, 50% are Neutral, and 25% are Disengaged – so, by definition, this is “average.” Point out the text on the report showing Gallup’s research on the engagement of “world class” organizations.
To provide a summary benchmark on the Organization’s view, the OVS calculates an overall Engagement Index score out of 100. A highly engaged team will have an Engagement Index of at least 80 (this is equivalent to Gallup’s “world class organization” composition of 67% Engaged, 26% Neutral, and only around 7% Disengaged).
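As a quick arithmetic check for the Individual view, the composition percentages can be sketched as follows. This is a hypothetical helper for illustration only – the OVS’s own Engagement Index formula is not reproduced here, just the banding percentages (e.g., in a group of 9, 3 fully engaged people show as 33%):

```python
# Hypothetical sketch: computing the engagement composition from
# per-respondent classifications. Illustrates the percentages only;
# it does NOT implement the OVS Engagement Index formula.
from collections import Counter

def engagement_composition(classifications):
    """Return the percentage of respondents in each engagement band."""
    counts = Counter(classifications)
    total = len(classifications)
    return {band: round(100 * counts[band] / total)
            for band in ("Engaged", "Neutral", "Disengaged")}

# The report's example: in a group of 9, three fully engaged people
# show as 33% on the Individual view graph.
group = ["Engaged"] * 3 + ["Neutral"] * 4 + ["Disengaged"] * 2
print(engagement_composition(group))
# → {'Engaged': 33, 'Neutral': 44, 'Disengaged': 22}
```

The OVS normative composition of 25% Engaged / 50% Neutral / 25% Disengaged is, by construction, what an “average” group would return here.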
Ask exploratory questions, such as:
Does this match your experience of the organization?
How do you see these results playing out?
How do you feel about these scores?
Do you have sufficient engagement in your organization?
Do you know what might be blocking engagement, or fueling it?
Distribution of Scores
Review the VS Model and definitions on page 6. It may be helpful to lay page 6 (The OVS Model) on the table next to page 9 and 10 (Distribution of Scores and Combined Snapshot).
Explain that the OVS is actually TWO questionnaires. One questionnaire asked about climate, the other about outcomes. Statistically, these two move together… but it’s certainly possible that in some teams one is higher or lower (just as a taller person is sometimes lighter than a shorter person, even though these two factors generally move together).
On page 9, the 5 drivers and the 4 outcomes are positioned in a 4-quadrant graph. The vertical axis represents the organization’s average score, from Low Strength to High Strength. The horizontal axis represents how closely each member of the organization agrees with the average score – in other words, how cohesive the team is around those scores – and goes from Low Cohesion (high standard deviation, scores are spread apart) to High Cohesion (low standard deviation, scores are close together).
Knowing the average scores for an organization is important, but the team’s standard deviation around those scores is equally important for understanding how aligned or divided the group is.
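To illustrate why the average alone is not enough, here is a minimal sketch (with made-up scores) of two groups that share the same average but differ sharply in cohesion:

```python
# Two hypothetical groups with the same average score but very
# different cohesion (standard deviation).
from statistics import mean, pstdev

aligned = [98, 100, 102, 101, 99]   # everyone rates similarly
divided = [70, 130, 95, 125, 80]    # wide range of views

for name, scores in (("aligned", aligned), ("divided", divided)):
    print(f"{name}: average={mean(scores):.0f}, SD={pstdev(scores):.1f}")
```

Both groups average 100, but the first has an SD under 2 (high cohesion) while the second has an SD near 24 (low cohesion) – two very different organizations hiding behind the same mean.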
Combined Snapshot
On page 10, the graph shows the 5 drivers and the 4 outcomes in this combined snapshot. Explain that the lower grey band represents the lowest 25% of performance, and the top band represents the highest. 100 is the average, signaled by the black dotted line.
Ask for observations. Help your client notice which scores are higher and lower.
As you go through this section, be sure to focus on relative strengths as well as gaps:
“What might be creating this higher score in ____ (driver)?” “How is that helping the organization get results?”
“Do you have ideas on why this ____(driver) score might be lower?” “How is that inhibiting the organization for optimal performance?”
Discuss the feedback through questions such as:
Does this match your experience of the organization?
Where do you see yourself relative to these scores?
Where would you like the results to be in six months?
How do you see these results playing out?
How do you feel about these scores?
Turn to page 11, Combined Snapshot - Table of Scores. Explain that the graph is based on these averages. In addition, this table shows the Standard Deviation (SD), a measure of how close the individual scores are to the average. If there is a wide range of answers, the SD will be higher. The meaning of an SD varies based on sample size and the possible range. The Standard Deviation of the OVS norm group is 15, meaning this is the average SD. Generally, on the OVS:
An SD of under 15 is more consistent; under 12 represents a consistent perspective (i.e. almost all participants gave a similar rating).
An SD of over 15 is less consistent than average; over 18 represents an inconsistent range of views (i.e. there are some participants who gave a considerably higher rating, and some who gave a considerably lower rating).
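The interpretation bands above can be captured in a small helper – a hypothetical sketch using only the thresholds stated in this guide:

```python
def sd_band(sd: float) -> str:
    """Interpret an OVS standard deviation against the norm-group SD of 15,
    using the thresholds from this guide (12, 15, 18)."""
    if sd < 12:
        return "consistent perspective"
    if sd < 15:
        return "more consistent than average"
    if sd == 15:
        return "average (the norm-group SD)"
    if sd <= 18:
        return "less consistent than average"
    return "inconsistent range of views"

print(sd_band(10))  # → consistent perspective
print(sd_band(17))  # → less consistent than average
print(sd_band(21))  # → inconsistent range of views
```

Numbers near a boundary (say, 14.8 vs. 15.3) should of course be read as “about average” rather than as categorically different – the bands are a reading aid, not a verdict.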
Discuss the implications of this feedback.
5. Drivers and Outcomes
Drivers
From page 11 to 16, define each of the five drivers and their three related pulse points. Each driver is strongly tied to an emotional need; on these pages you can explore that connection and use the data to better understand the team’s strengths and weaknesses.
Outcomes
Explore the link between the climate drivers and outcomes. These Outcomes are strongly connected to the Vital Signs drivers, which explain between 50% and 60% of the variation in outcomes.
Ask which of these outcomes is most important to your client at this time.
On page 10, point out that the outcomes are tied to specific “arrows” on the model. The OVS measures four outcomes. Of course, there are many outcomes of these drivers, and many elements tied to each “dimension” (such as People - Organization). For example, on the Operations side we could look at budget, project tracking, use of systems, speed of execution… on this dimension, the OVS collects feedback about Results. Each Outcome is most strongly connected to three Drivers: Trust plus the two that are closest to it in the OVS model. Show page 6 to explain this point.
From page 17 to 20, each page focuses on one outcome and the three drivers most connected to it, to explore how the organization is performing in that area and to understand which drivers help sustain this performance over time.
Discuss: “How might improving any of the drivers (especially any that were highlighted on each page) affect your ability to increase performance?”
If the drivers are generally higher than outcomes: There is untapped capacity. The leaders can leverage the climate to increase the outcomes. There may be external factors suppressing the outcomes (e.g., a highly-engaged team inside a dysfunctional organization).
If the outcomes scores are generally higher than the drivers: The performance may not be sustainable. The success is coming from something other than the climate (e.g., external factors such as lack of competitors making it easier for the organization to get results).
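The two readings above can be sketched as a simple comparison. This is a hypothetical helper for discussion purposes only; the 5-point margin is an illustrative assumption, not an OVS rule:

```python
def climate_capacity_read(driver_avg: float, outcome_avg: float,
                          margin: float = 5.0) -> str:
    """Rough reading of driver vs. outcome averages.

    The 5-point margin is a made-up threshold for illustration only.
    """
    if driver_avg - outcome_avg > margin:
        return "untapped capacity: leverage the climate to lift outcomes"
    if outcome_avg - driver_avg > margin:
        return "possibly unsustainable: results may rest on factors outside the climate"
    return "drivers and outcomes broadly in line"

print(climate_capacity_read(110, 95))   # drivers well above outcomes
print(climate_capacity_read(92, 108))   # outcomes well above drivers
```

Either way, the comparison is a conversation starter – the discussion questions below are what surface the actual causes.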
Discuss the linkages that may exist between Drivers and Outcomes. Is there a Driver that’s particularly lower? Does it relate to an Outcome being lower?
Are there some Drivers or Outcomes that are particularly high? How might these affect one another, and other factors?
6. Pulse Points, Comparisons (if any), and Additional Questions
The following pages serve to provide additional depth and detail to the findings. You may wish to briefly review these now, and then leave them with your client for further evaluation.
Pulse Points
Discuss the Pulse Points. These 15 scores provide more detail on specific organizational skills and how the drivers are put into action. How do the lowest scores relate to the lower scores on the previous pages – and vice versa? You may wish to use some of the “Reflection Questions” in the report to guide your discussion.
Comparisons
If there are “Personal Data” demographic questions, and there are sufficient responses to categorize respondents, a graph and a table of scores will be shown for each demographic question. Discuss why certain groups might be lower or higher. This is a rich area for the OVS, especially for large organizations where many departments and teams are invited to participate in the survey.
Additional Questions
Depending on how the OVS project was set up, on the next few pages there could be data from additional multiple choice or Likert scale questions, represented with a graph each.
Open Questions are located in a separate export file for ease of use. Remind your client that one person’s strongly worded comment should not receive more weight than other people’s. Encourage the client to look for themes.
7. Action plan & Conclusions
Action Plan
Finally, it’s important to think about the process, and to introduce the Change MAP to reflect on the information and insights that have been examined and to plan how to move forward.
Conclusions
Ask the client to synthesize the report through summary questions, such as:
Overall, what is your take away message from this report?
What is one area where you’re most satisfied with the results?
What is one result you’re most concerned about?
Discuss the plan for next steps – or, if possible, schedule a follow-up meeting to do so.
Will you be facilitating another meeting to review the results? If so, discuss some of the parameters, such as:
What are your client’s goals for that meeting?
Normally you would walk through the results, much as you’ve done in this conversation. Are there any changes to the presentation your client would like? Any areas that need more attention?
What role will your client play in the meeting? Will s/he introduce the meeting? Conclude the meeting? How can s/he support you in this debrief, and, how can you support your client?
After the meeting, how does s/he want the group to feel? What does s/he want the group to do?
Specifically discuss how to handle dissemination of the report: Is your client comfortable with members having access to the full report or certain pages? If not, will members have ways to revisit the report findings?
If you will not be involved in future meetings, discuss how your client will share the results with organization members. It may be most helpful to schedule another meeting with your client to assist with specifics, particularly in complex or larger organizations.
Following are some points to share with organizational leaders who may be presenting the results to their staff:
The purpose of sharing OVS with your staff is to:
Enroll them in improving the organization
Help them "feel heard"
Acknowledge strengths and growth
Provide a vision or goal for improving climate
Get buy-in for next steps
Steps for presenting the results:
Explain that statistics provide questions, not answers. This is not a "scorecard" or judgment, it is a way of looking at all of our different opinions.
Review the graph – explain the highs and lows
High means: “People see this as a strength in the organization.”
Middle means: “People see this as an area with some strength and some room for improvement.”
Low means: “People are concerned about this area.” Low does NOT mean people are “bad at this” or that there is a problem in the department – it means there is a concern or dissatisfaction.
Validate people’s perceptions. “One reason for doing the survey is so the administration could really ‘hear’ your point of view – I appreciate that you were willing to talk about the areas where you are satisfied and also dissatisfied.”
Help people see the value of strengths. “This strength is one reason we’re doing as well as we are. I’d like to see us build on that strength.”
Explore the consequences of the lower areas. “If we don’t improve on ___(low area), how will that affect us? How would it help us to improve that area? Do you want to see this improve?”
Gain commitment. “If we are going to improve in these areas, it is going to take all of us working together on it. If others in the organization commit to improving this, would you be willing to participate in that?”
Action plan. “What is one area to work on first? What is one specific action you’d like to see happen? What’s one step you will take (by when)?”
Remember, quality comes from the inside out. When people feel great, relate to each other well, and care, they provide their best possible work. Caring creates success.
Note: As you share OVS, what's on the inside speaks louder than words. As you present this data, don't criticize, blame, compete, or defend. Instead, listen, reflect, question, challenge, and care.
Dissemination of Results
Typically, the meeting(s) to review the OVS report with members at large will follow a flow much like above. A few tips:
If you are not familiar with the composition of the group, spend a few minutes at the start asking about the roles and functions of those in attendance. “What do you do here?”
As you provide feedback, be sure to frame this as feedback FROM organizational members TO the organization. It’s their assessment, not yours. For example, “Based on the way you responded, the report shows that organization members are highly ____. Does that match your experience?”
Depending on your agreement with the leader(s), be prepared to describe how members will be able (or not) to review the report or be involved going forward.
To summarize the OVS process, here's a convenient checklist for managing the project:
Typically, the OVS implementation follows these steps; often there is a small committee of leaders coordinating (“team”) and one lead consultant.
Step | Lead | Due
Plan the schedule of steps below; get approval from leadership | |
Decide how the OVS will be administered and the timing. If a paper version is required, how will it be collected and input online? If computers are needed, where and how will they be set up? | |
Determine demographic questions & answer options; create custom questions | |
Consultant puts survey online, provides URL to team to review | |
Review the online version and approve | |
Send email to all personnel introducing them to the survey | |
Encourage personnel to take the survey via reminder emails, intranet, voicemail, or other communications | |
If paper copies were used, input them into the online system | |
Create the report, email to team, initial discussion | |
Team meeting to review results with consultant; if needed, schedule consulting with Six Seconds | |
Leadership meeting to review results, either led by consultant/team or using an outside facilitator (contact Six Seconds for advice) | |
Share results back with the organization, either by the team or using an external facilitator | |
Team creates a wrap-up memo back to the organization thanking people for their contribution and summarizing key “take-aways” and action steps | |