
As explained in The OVS Process, the “Engage” step in delivering OVS is building buy-in to a plan. This includes:

  • Scoping the project

  • Gaining client buy-in

  • Agreeing on the terms and conditions of the OVS

  • Determining how the results will be communicated to organizational members

In this phase, before administration, the consultant and client meet in person, by phone, or even via email to define WHY, WHAT, and HOW the tool will be used.

Why:  

Define the goal of this exercise.  What is happening in the organization that would warrant the use of the OVS?

Typically, the purpose of OVS is to improve organizational climate or to establish a benchmark for future measurement in a change process. 

What:

Define the scope of work.  Does the scope include the administration and debrief of the OVS only, or does the client also require recommendations on next-step interventions?  Will you be facilitating meetings to manage the design of the survey (e.g., which demographic groups to include, and where and how people will take the survey)?  Will you be facilitating communication back to the organization (e.g., a memo to all staff or an all-hands meeting)?

How:

Depending on the overall design of the OVS assessment, lay out the process with your client.  Agree on:

  • Is there anyone who needs to be informed, or give approval, before launching this process?  How will that be accomplished and by whom?

  • What will the project be titled (the title will appear on the questionnaire)?

  • Is there a company logo to add to the survey?  (We recommend a jpg, png, or gif, up to 250 pixels wide, with white or clear background.)  Are there special instructions that should appear on the survey itself?

  • Who will be included?

  • What useful and important divisions of this group need to be identified through custom demographics (e.g., department, work location, region, role type, job level)?

  • How will the participants be told about the survey?  Who will do this?  What will they be told? (See the next section for advice on this communication).

  • How will the participants actually take the survey?  Do they all have their own computers or do they need to share?  Is it necessary to use a paper version, and if so, how will it be collected and who will enter that data into the online system?

  • What is the timeline?

    • When will the survey be launched?  

    • When will a reminder be sent, and by whom?  

    • When must it be completed?  (Generally, we recommend a short window, such as 3 days, and then leaving a “pad” of a couple of days in case you need to remind people again.)

    • When will the report be generated?  

    • When will it be shared with the leader(s)? Others?  (The report will be immediately available for download after at least 3 people complete the survey).

  • How will the report be debriefed for the leader(s) and other members?  Who will receive a paper or email copy?  How will printouts be handled?

  • Are there additional custom questions that should be included?  The OVS can have up to 18 free-text and 24 rating-scale questions.  For example:

    • What is one way we could more effectively use our Salesforce database to be proactive with customers? (Open ended)

    • Which of our core values are we most successfully bringing to life in the organization? (Rating scale)

    • We are one organization with one shared mission (Rating scale)

    • My direct supervisor lives up to the organization’s values (Rating scale)

    • As an organization, we are serious about our commitment to green business (Rating scale)

    • Please give an example of why you gave the rating above (Open ended)

Note on Demographic Comparisons:

The OVS is intended for use across one enterprise.  There are times when you might wish to make comparisons between various sub-groups based on role or work site.  For example, if the organization has a main office and a major satellite office in another part of the country, you might set the field name to “Work Location” and give two answer options: “Main Office” and “Satellite.”  These options will appear at the start of the questionnaire, and then in the report as a graph that lets you make comparisons.  Note that you will need at least three respondents in each sub-group for the group to show on the graph and maintain anonymity, so don’t make the sub-groups too narrow or specific.

When you set your project up in the online system, you will have the opportunity to add up to 10 personal data definition questions, each with a maximum of 54 value options (note: we recommend no more than 6 answer options to keep the graphs readable).  For example, you might want to look at responses grouped by years of employment:


These questions then appear at the start of the online questionnaire (see an example on the next page). 

With each Comparison question, participants voluntarily classify themselves.  For example, here is one question as it appears on the survey:


Then, in the report, provided there are sufficient responses in a group, each group will appear on a graph of the 5 climate factors and on another graph of the 4 outcomes, such as this one:
