M3.07. Conducting evaluation of the mentoring process


1. Data Collection

What You Need to Ask Before Collecting Any Data
There are different methods of collecting data, each with distinct advantages and disadvantages, so before you start the data collection process it is important to evaluate these factors and decide which data collection method is best suited to your situation (e.g. within your timeframe and within the limits of your available resources). One way of assessing these factors is for the evaluator, together with the relevant stakeholders, to answer the following questions.

Essential Questions to Ask Before Collecting Data
Evaluation Purpose

  • What are the evaluation questions? Which methods will help answer them and provide the most reliable and valid data?
  • Who is the primary audience for the findings? What types of data will make the most sense and be most useful to them?

Data Sources

  • Who is providing the data? How much information is needed for a reliable evaluation result?

Timing

  • How long is the mentor programme?
  • When did the mentoring programme start/what stage of the life-cycle will the evaluation be carried out?
  • How much time was budgeted for data collection and analysis?
  • When is the right time to collect the data?
  • When will the results become available for reporting to the stakeholders?

Resources

  • What is the evaluation budget? How much of it is for data collection and analysis versus reporting and dissemination of findings?
  • Who should collect the data? Does staff have the time and skills? Should an external evaluator be hired?

Once these questions have been answered, you will have a clearer picture of the evaluation process, and how you will collect the information needed. You may choose qualitative methods, or quantitative methods – or, perhaps a combination of both.

2. Use of Quantitative Methods

Determining if you should use quantitative methods
Depending on the answers to the questions raised above, quantitative methods such as surveys or assessments could be right for you, or qualitative methods such as interviews or focus groups might suit your needs better.
Types of questions that quantitative methods can help answer
Generally, quantitative methods are concerned with what, who and when. Therefore, you should consider quantitative methods if your evaluation questions include inquiries about who participated in and benefited from your programme; what the impact of the programme was; what changes were brought about by your programme; and when the changes occurred.

Examples of quantitative data include:

  • Number of people who participated in programme activities over the course of the year (participation rate)
  • Whether or not programme participants developed new knowledge and skills (percent change in knowledge and skills before and after participation)
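
As a simple illustration of both measures, the participation rate and the percent change in knowledge scores can be worked out from attendance records and matched pre- and post-programme assessment scores. The sketch below is a minimal Python example; all figures and variable names are invented for illustration only.

    # Minimal sketch: participation rate and percent change in knowledge scores.
    # All figures are invented for illustration only.
    invited = 60                        # people invited to programme activities
    participated = 45                   # people who actually took part
    participation_rate = participated / invited * 100

    pre_scores = [52, 61, 48, 70, 55]   # assessment scores before the programme
    post_scores = [68, 72, 63, 78, 70]  # matched scores after the programme

    pre_mean = sum(pre_scores) / len(pre_scores)
    post_mean = sum(post_scores) / len(post_scores)
    percent_change = (post_mean - pre_mean) / pre_mean * 100

    print(f"Participation rate: {participation_rate:.0f}%")
    print(f"Average score before: {pre_mean:.1f}, after: {post_mean:.1f}")
    print(f"Percent change in knowledge scores: {percent_change:+.1f}%")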

Primary audience for the findings
Quantitative methods generate data that appeal to people who prefer information that quantifies impact and provides the "bottom line."

Potential respondents and sample size
Quantitative methods can be advantageous because they can be more cost-effective. For example, a survey can be distributed to 50 people and completed in their own time, whereas a 30-minute interview with each of those 50 people would take considerably more time.
However, you must also consider the disadvantages of the survey method versus face-to-face interviews. Conducting interviews or focus groups with fewer respondents may actually be preferable because it could yield better-quality data, even if it is more time-consuming.

Amount of time for data collection and analysis
Quantitative methods are useful if the amount of time to collect and analyse data is very limited. A survey with closed-ended questions (i.e., you provide response options that respondents can select) that asks people to rate something takes less time to administer than scheduling and conducting interviews and focus groups. The data from questionnaires also take less time to analyse: calculating frequencies, averages or percentages generally takes less time than reviewing, coding and analysing qualitative data (i.e., notes from discussions and interviews). Quantitative methods are equally useful if the programme is long-term, since you can compare baseline data with subsequent data.

Budget and other resources
In general, under comparable conditions (i.e., amount of time for data collection, sample size), quantitative methods can be less expensive than qualitative or mixed methods (which use a combination of quantitative and qualitative methods) for several reasons. Secondary data, such as records of training course attendance, can usually be obtained with very little time or cost. Surveys can be administered electronically, which costs less than the travel typically needed for interviews or focus groups. An analyst needs less time to calculate frequencies and percentages than to read and code text from interviews and focus groups. Also, a different set of skills is required to do these calculations than to code text and generate themes.

2.1. Advantages and Disadvantages of Quantitative Methods

Advantages

  • You can collect data from a large sample of people.
  • You can analyse the data relatively quickly and easily, especially if you are using software packages such as Excel, STATA, SPSS, etc.
  • It does not require a lot of money if a survey is used and administered online or in person.
  • The results can be generalised if the sample is representative of the study population.

Disadvantages

  • Closed answers.
  • No in-depth information. Responses to the questions are generally not explained.
  • Larger scope for possible misinterpretation by the reader.
  • Risk of poor return/completion rate if surveys not completed face-to-face.

2.2. Surveys

There are two major ways of collecting quantitative data: (a) surveys and (b) tests and assessments.

Surveys are one of the most popular ways to collect quantitative data. In a survey, a questionnaire is distributed to a group of people to complete. While such questionnaires can include open-ended questions, closed-ended questions are typically used to collect quantitative data. The advantage of this is that statistical analysis can easily be applied to responses to closed-ended questions.

Methods of Survey Delivery
Once you have developed your evaluation survey, you must now decide your method of delivery. There are a few methods of survey delivery to be considered when evaluating a mentoring programme.

Handout surveys (the questionnaire is handed out for people to complete in paper form or on a tablet, e.g. an iPad). Use when:

  • You want to capitalise on who is available.
  • The people you want to survey may be infrequently available or accessible.

Internet or web-based surveys. Use when:

  • You need results relatively quickly.
  • The people you want to survey are competent Internet users.
  • Your survey is short and simple.
  • Your survey is more complex, with skip patterns (e.g., responses to a question determine which questions are to be answered later).

Face-to-face surveys (you go over the questionnaire in person). Use when:

  • Your survey questions are complex and may need in-person explanation.
  • There is concern that people would not respond willingly unless someone they trust is present to reassure them about the content of the questions.
  • A poor response rate is expected using other survey methods.
  • Resources are not restricted, and competent interviewers are committed to administering the survey consistently and properly.

3. Use of Qualitative Methods

Determining if you should use qualitative methods
Depending on the answers to the evaluation factor questions, you may decide that a qualitative method could be right for you.

Types of questions that qualitative methods can help answer
At the most basic level, qualitative methods are concerned with the why and how, and are therefore useful for an in-depth study of a particular issue rather than a broad study. If your evaluation questions include inquiries about how the participants in your programme applied what they learned to achieve career growth, or why there was a poor attendance rate at mentor/mentee meetings, for example, then you should consider qualitative methods.

Primary audience for the findings
Qualitative methods generate data that appeal to audiences who are curious to know what lies behind the statistics, or the cause of a specific data trend.

Potential respondents and sample size
Qualitative methods are helpful if you are working with a smaller number of people, mainly because conducting interviews and focus groups with a lot of people can be expensive and time consuming. Qualitative methods also help if the people you want to collect information from feel more comfortable expressing their opinions verbally than in written form.

Programme length and amount of time for data collection
Qualitative methods can be useful regardless of programme length; however, the evaluation plan should allow enough time to analyse the data, because analysing data from qualitative methods is more time-consuming than calculating statistics from quantitative data.

Budget and other resources
In general, under comparable conditions (i.e., length of programme, amount of time for data collection, sample size), qualitative methods can be more expensive than quantitative or mixed methods for several reasons. There may be travel costs for the data collector and for the respondents, or costs for specialist outsourced skills that may be required. An analyst also needs more time to read and code text from interview and focus group transcripts and observation notes.

3.1. Advantages and Disadvantages of Qualitative Methods

Advantages

  • Provides understanding and description of in-depth experiences by individuals in your strategy, initiative or programme.
  • Provides you or the evaluator with an opportunity to explain definitions or questions that are unclear to participants.
  • You or the evaluator can easily guide and redirect questions in real time.
  • Findings may be easier to interpret for some of your stakeholders who are uncomfortable with numbers and other forms of quantitative data.
  • A useful approach when no readily available, field-tested survey questionnaires or assessment tools exist for the topic you want to explore.

Disadvantages

  • Not useful if you want to generalise findings to the whole study population (i.e., findings may be relevant only to one group of individuals that the programme serves).
  • Participants may not feel comfortable verbalising and discussing sensitive topics.
  • Collecting and analysing data can be expensive and time consuming.

3.2. Different types of qualitative data collection methods

When collecting qualitative data, there are a number of methods that can be used, each with its own advantages and disadvantages:

  • Interviews (structured and semi-structured)
    In structured interviews, the questions are written out exactly the way they should be asked, and the interviewer should ask every respondent the questions in the same order. In a semi-structured interview, topics are listed and examples of probes are provided, and the interview becomes more of a discussion.
  • Focus groups
    Focus groups are structured discussions to understand people's perspectives, experiences or knowledge about a specific topic. A moderator suggests topics and facilitates the discussion. The goal is to discover the how and why of something, to get contextual responses rather than "yes" or "no" answers.
  • Observations
    Observations are structured means of recording the actions and interactions of participants in an evaluation. They provide an opportunity to collect data on a range of behaviours, capture interactions and openly explore the topic of interest in the evaluation.
  • Review of products (e.g., documents, recordings, videos)
    This method is important because products can be a source of data for the evaluation.
    Such materials can enable you to learn about important shifts in programme development or maturation. Document reviews also can help you formulate questions for a survey or an interview.
  • Use of Mixed Methods
    The trend in evaluation has been shifting toward mixing quantitative and qualitative methods into a single evaluation called mixed-method evaluation. This approach makes sense because each method has its own strengths and weaknesses. Combining them can lead to a stronger, more complete evaluation than a conventional evaluation that uses only one method.

Confidentiality and Anonymity
Confidentiality and anonymity are essential considerations for you and the evaluator. Your respondents' privacy should be protected vigilantly. For example, names of participants should never be revealed in an evaluation report. The terms "anonymity" and "confidentiality" have different meanings and should not be used interchangeably.

  • Anonymity requires you and your evaluator to not know who the participants are. For instance, you don't ask respondents to put their names in a survey or identify themselves in a focus group.
  • Confidentiality means you and your evaluator know who the participants are, but you don't link any answers to the respondents. Any information you have that contains the person's name or personal information must be kept in a locked drawer or stored in a password-protected electronic file.
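
One practical way of maintaining confidentiality during analysis is to separate respondents' identities from their answers and work only with pseudonymous IDs. The Python sketch below illustrates the idea under the assumption that responses are held as simple records; the field names and responses are hypothetical.

    # Minimal sketch: separating identities from responses so that analysis and
    # reporting use only pseudonymous IDs. Field names and data are hypothetical.
    import uuid

    raw_responses = [
        {"name": "Mentee A", "rating": 4, "comment": "Meetings were useful."},
        {"name": "Mentee B", "rating": 2, "comment": "Hard to schedule sessions."},
    ]

    key_file = {}        # name -> pseudonymous ID; keep separately, access-restricted
    anonymised = []      # data set used for analysis and reporting

    for record in raw_responses:
        pseudo_id = uuid.uuid4().hex[:8]
        key_file[record["name"]] = pseudo_id
        anonymised.append({"respondent": pseudo_id,
                           "rating": record["rating"],
                           "comment": record["comment"]})

    # Only the anonymised records are shared with analysts or quoted in reports.
    print(anonymised)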

4. Analysing and Interpreting Data

Once you have collected your data, you need time to analyse and interpret the data, in order to act on what you learned. This process can be complicated and, at times, technical.

4.1. Quantitative Data Analysis and Interpretation of Results

Descriptive statistical analysis
When conducting quantitative analysis, you will need some basic statistical analysis skills.
When you calculate the number and percentage of responses to a particular question or calculate the average rating for questions about the usefulness of the training, you are starting to do descriptive statistical analysis. It is used to examine the responses to a question by calculating and looking at the following things:

  • Distribution of responses or frequency distribution (e.g., how many people checked response option 1, response option 2, response option 3, etc.).
  • Average value, or the mean (i.e., looking at the average rating across the participants' ratings).
  • The most common response, or the mode.
  • The number in the exact middle of the data set, or the median.

Descriptive statistical analysis also provides another piece of information technically referred to as variability, which refers to the following:

  • The spread of your results, including the range (difference between the highest and lowest scores).
  • Variance (shows how widely individuals in a group vary in their responses).
  • Standard deviation (how far, on average, responses fall from the mean).
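
A minimal Python sketch of these calculations, using the standard library statistics module on an invented set of ratings from a closed-ended question, might look as follows.

    # Minimal sketch: descriptive statistics for a closed-ended rating question.
    # The ratings are invented for illustration (1 = poor ... 5 = excellent).
    from collections import Counter
    import statistics

    ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

    frequency = Counter(ratings)                 # distribution of responses
    mean = statistics.mean(ratings)              # average value
    mode = statistics.mode(ratings)              # most common response
    median = statistics.median(ratings)          # middle value of the data set

    value_range = max(ratings) - min(ratings)    # spread between highest and lowest
    variance = statistics.variance(ratings)      # how widely responses vary
    std_dev = statistics.stdev(ratings)          # typical distance from the mean

    print(f"Frequencies: {dict(sorted(frequency.items()))}")
    print(f"Mean {mean}, mode {mode}, median {median}")
    print(f"Range {value_range}, variance {variance:.2f}, std dev {std_dev:.2f}")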

Interpretation
Quantitative findings must be interpreted in the context of the organisation/programme, and the following questions can guide your interpretation:

  • What is the scope of the programme's impact, or how effective was the programme?
  • If you had a lot of missing data or a poor response rate, why and what can be done differently to increase the response rate in the future?
  • Are the results what you expected when you planned the programme? If not, what do you think affected the results? Do you have qualitative data that can provide further insight into the results?
  • Are the results significant to the mentoring programme or not, regardless of statistical significance, and what does it mean? For instance, the difference in responses from two groups of people might not be statistically significant but could still be large enough to motivate change within the programme activities.
  • What implications do the results have on the programme? What actions do you need to take, if any?
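
For the point above about statistical versus practical significance, a simple comparison of two groups' ratings can be run as in the sketch below. This is an illustrative example only: the data are invented and the scipy library is assumed to be available.

    # Minimal sketch: comparing two groups' ratings and looking at both the
    # p-value and the size of the difference. All data are invented.
    from statistics import mean
    from scipy import stats

    group_a = [3, 4, 4, 5, 3, 4, 5, 4]   # e.g. ratings from year-one mentees
    group_b = [4, 4, 5, 5, 4, 5, 5, 4]   # e.g. ratings from year-two mentees

    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    difference = mean(group_b) - mean(group_a)

    print(f"Mean difference: {difference:+.2f} points on a 5-point scale")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    # Even if p is above the usual 0.05 threshold, a difference of this size may
    # still be large enough to justify changes to the programme activities.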

4.2. Qualitative Data Analysis and Interpretation of Results

Qualitative data usually take the form of text. There are four major steps in qualitative data analysis, and these are described below.

  • Review the data
    Before conducting data analysis, you must read and understand the data you have collected, remove the data which is unclear/missing/insufficient, and get clarity before you code the data.
  • Organise the data
    Organise the data in a way which makes it easier for you to reference; e.g. organise the data by question type, or by respondent type, or both.
  • Code the data
    Coding is the process of combing the data for themes, ideas and categories and then marking similar passages of text with a code label so that they can easily be retrieved at a later stage for further comparison and analysis. Coding the data makes it easier to search the data, to make comparisons and to identify any patterns that require further investigation. There are two basic methods of coding, and you could use one, or both, of these:
    • Open coding – when you assign codes based on what emerges from the data.
    • Closed coding – when you already have codes prepared beforehand based on the questions you want to answer.

  • Identify and generate themes
    After the data have been coded, you can then begin to identify themes derived from the information that was coded. This can be difficult, as you may need to review the coded text many times, to determine a theme which accurately reflects the data. It is also useful to provide examples of statements and observations to support the theme and, in order to capture how strong a theme is, you can report on the number and percentage of responses you coded that supported the theme.
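
A minimal illustration of closed coding and theme counting is sketched below in Python: predefined code labels are attached to passages by simple keyword matching, and the number and percentage of responses supporting each code are then counted. In practice, coding is usually done by a careful human reader or with qualitative analysis software; the codes, keywords and responses here are invented for illustration only.

    # Minimal sketch: closed coding by keyword matching, then counting how many
    # responses support each theme. Codes, keywords and responses are invented.
    codebook = {
        "scheduling_problems": ["schedule", "time", "availability"],
        "career_growth": ["promotion", "career", "new role"],
        "communication": ["feedback", "listening", "communication"],
    }

    responses = [
        "My mentor's feedback helped me apply for a new role.",
        "It was hard to find time in my schedule for meetings.",
        "Better communication with my mentor improved my confidence.",
        "Mentor availability was the main problem for me.",
    ]

    coded = {code: [] for code in codebook}
    for text in responses:
        for code, keywords in codebook.items():
            if any(keyword in text.lower() for keyword in keywords):
                coded[code].append(text)

    for code, passages in coded.items():
        share = len(passages) / len(responses) * 100
        print(f"{code}: {len(passages)} responses ({share:.0f}%)")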

Interpret the findings
The next step involves comparing your results to your expected outcomes, original evaluation questions, the goals and objectives of your programme and current state-of-the-art knowledge (e.g., research about mentoring programs). Some questions to guide your interpretation include:

  • Were any of the identified trends or patterns unexpected?
  • What are the factors that might explain the deviations?
  • If you collected both qualitative and quantitative data, do the qualitative findings support the quantitative findings? If not, what are the factors that could explain the differences (e.g., sampling, the way the questions were asked in the survey compared to the interviews, etc.)?
  • Do any interesting patterns emerge from the responses?
  • Do the results suggest any recommendations for improving the program?
  • Do the results lead to additional questions about the program? Do they suggest additional data could be needed?
  • Do you need to change the way the data are collected next time?

Be thoughtful when you are making sense of the data. Don't rush to conclusions or make assumptions about what your participants meant to say.
Involving other people (e.g., programme staff) to discuss what the findings mean may help you make sense of the data.

5. Summarise, Communicate and Reflect on Evaluation Findings

Evaluation findings can be communicated in many different ways to disseminate the results of your evaluation and tell the story of your programme. You might have a standard format that you use for presenting your evaluation findings; nevertheless, you should know why some formats for displaying and communicating your findings can be more effective than others.

There are also various options for displaying your evaluation findings, and the field of data visualisation and visual analytics has grown due to the availability of large amounts of data, along with advances in technology for accessing, handling and displaying data. Ultimately, you need to remain focused on what you want to communicate, why and to whom.
Attending to this stage of the evaluation process is key because effective summary and communication of evaluation findings helps to:

  • Disseminate knowledge
  • Facilitate understanding
  • Confirm or challenge theories or previous ways of thinking
  • Inform decision-making and action

It's not necessary to wait until the end of the programme to share findings and insights. You can share findings in the middle of your programme as long as you clarify that they are interim, preliminary findings.

Communicate and Report Your Evaluation Findings
What to Ask Before Putting the Findings Together
After data collection and analysis, you need to determine how to summarise your findings effectively so that you can communicate them clearly to your stakeholders. Sometimes you will have much more data than you can possibly share effectively.
To help you with the process, complete the following table.

Your answers to these questions will help determine the content and format for summarising and communicating the results and impact of the program.

Tips for dealing with potentially negative findings:

  • From the outset, emphasise the use of evaluation for learning.
  • Involve key stakeholders in the evaluation's design and implementation and communicate throughout the evaluation process so there are no surprises.
  • Think about what to say and how to say it from the perspective of the stakeholders hearing about the evaluation findings.
  • Share any negative findings through a discussion format so you can effectively facilitate the learning process and reduce the likelihood of misunderstandings.
  • Don't start the report or discussion with negative findings. Instead, lead with positive findings and use words and phrases such as "accomplishments," "how we can do better" and "work in progress."

Keep It Simple
A key principle for communicating your findings effectively is to keep the presentation simple. Avoid cluttering information with lines, colours, shades or anything else that might make it look more attractive but draws attention away from the content. To make the material easy for the reader to understand, include only the essential, critical information.

Ways to Display Your Evaluation Findings
If you have a communications plan and understand the importance of presenting findings simply, you can discuss the best way to display the findings based on quantitative and qualitative data. Data visualisation is a growing field and there are lots of resources about how to convey your data effectively. When displaying findings, your intention is to:

  • Draw the viewer's attention to the content, rather than the method, graphic design or something else.
  • Avoid any misrepresentation of the data.
  • Provide clear labels.
  • Avoid small print.

Display Charts
Charts that are effective for showing patterns or trends over time include graphics that show time on one dimension (e.g. the x-axis) and the item for which change is being observed on another dimension (e.g. the y-axis).
Charts that effectively show how responses are distributed along two dimensions include scatter plots and histograms.
Charts that help compare two or more groups include bar charts, clustered bar charts, side-by-side bar charts and stacked bar charts.
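
As an illustration, the sketch below draws two of these chart types with the matplotlib library (assumed to be installed); the attendance figures and satisfaction ratings are invented.

    # Minimal sketch: a line chart for a trend over time and a bar chart for
    # comparing groups. All figures are invented for illustration.
    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    attendance = [18, 20, 17, 22, 24, 23]            # mentees attending each month

    groups = ["Mentees", "Mentors"]
    avg_satisfaction = [4.1, 4.4]                     # average rating per group

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

    ax1.plot(months, attendance, marker="o")
    ax1.set_title("Meeting attendance over time")
    ax1.set_xlabel("Month")                           # time on the x-axis
    ax1.set_ylabel("Participants")                    # observed item on the y-axis

    ax2.bar(groups, avg_satisfaction)
    ax2.set_title("Average satisfaction by group")
    ax2.set_ylabel("Rating (1 to 5)")

    fig.tight_layout()
    plt.show()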

Reflecting on Your Evaluation Findings
Stakeholders are more likely to use the evaluation findings if they understand the purpose of the evaluation and have contributed to its design, implementation, interpretation and use of the findings.
However, only summarising and communicating your findings is not sufficient. It is important to undertake the next step, which involves reflecting upon the findings and their implications and planning ways to put them to use – otherwise, the whole evaluation process loses value. Remember, evaluation must provide usable information to equip you to make informed decisions and shape your programmes to be as effective as possible.
However, there are obstacles to reflecting upon the findings and planning ways to use them, such as:

  • Fear of being judged by stakeholders – especially if you must deliver unfavourable results.
  • Concern about the time and effort involved to convene stakeholders to discuss and reflect on the findings.
  • Resistance to change that could impact the way things have been done in the past.
  • Inadequate communication and knowledge sharing systems that affect how, when and with whom information is shared.
  • Staff who are not interested in the findings for various reasons.
  • Organisational limitations such as limited budget and staff capacity to carry out other functions deemed more important.
  • Concern about negative findings.

Use of Evaluation Findings
Once you have your findings, you must agree on what you will use them for. For example:

  • Improving your strategy, initiative or programme
  • Improving accountability
  • Educating or building awareness
  • Leveraging support (e.g. extra funding)
  • Generating new knowledge (e.g. improving effective practice)
  • Replicating and scaling the programme
  • Developing recommendations for next steps
  • Adjusting the evaluation design and process