Interpret the Patterns for Their Meanings

By Barry Sweeny, 2010



1. Provide copies of the Comparison Charts that have been marked with symbols to the folks who will make the decisions about what these patterns mean. Ask them to compare all the charts with similar data to each other. For example (a simple trend-spotting sketch follows this list):

  • This could include all data on one topic but from different data sources, like mentors, proteges, supervisors, etc.
  • A variation could be data on the same topic but collected from demographic subgroups, like different age, gender, racial, or SES groups.
  • It could include all data on one topic but which was collected using different assessment methods, such as a comparison of local versus standardized assessments.
  • If you have data for the same topic but from different times, like every year in April, comparison across time can be very useful and may show trends.
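The across-time comparison in particular lends itself to a quick automated first pass. Here is a minimal sketch in Python, assuming the same measure was collected each April; the years, scores, and threshold are hypothetical, and the team still makes the final call on what the pattern means.

    # A minimal sketch, assuming the same measure was collected every April.
    # The scores, years, and threshold below are illustrative only, not real data.
    yearly_scores = {2007: 3.1, 2008: 3.4, 2009: 3.6, 2010: 3.9}

    def describe_trend(scores_by_year, threshold=0.1):
        """Give a rough first-pass mark for the across-time pattern."""
        years = sorted(scores_by_year)
        changes = [scores_by_year[b] - scores_by_year[a]
                   for a, b in zip(years, years[1:])]
        if all(c > threshold for c in changes):
            return "+"   # consistent improvement
        if all(c < -threshold for c in changes):
            return "-"   # consistent decline or weakness
        return "?"       # no clear pattern yet; revisit when more data arrive

    print(describe_trend(yearly_scores))  # prints "+" for the numbers above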

2. Look at each data chart to see if a pattern was found.

  • Notice the marks (?, +, -) written in the previous “analysis” step (a simple way to group the charts by these marks is sketched just after this list).
    • ? No pattern evident yet; save for later when more data are available.
    • + A positive pattern – a strength or an improvement.
    • – A negative or decreasing pattern, or a weakness.
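If the marked charts are kept in a simple list, collecting them by mark for the steps that follow takes only a few lines. This is an illustrative sketch only; the chart names below are hypothetical.

    # A minimal sketch, assuming each chart's mark from the analysis step was
    # recorded as a (chart name, mark) pair. All names here are hypothetical.
    marked_charts = [
        ("Mentor survey: meeting frequency", "+"),
        ("Protege retention by year", "?"),
        ("Supervisor ratings of new-hire readiness", "-"),
    ]

    groups = {"?": [], "+": [], "-": []}
    for chart, mark in marked_charts:
        groups[mark].append(chart)

    # "?" charts are saved until more data arrive; "-" and "+" charts move on
    # to the cause-finding and strength-analysis steps described below.
    for mark, charts in groups.items():
        print(mark, charts)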

3. When you see a pattern, write a DRAFT narrative conclusion describing what the pattern shows. This is usually a factual, descriptive, non-interpretive statement that says what the pattern is, but NOT yet why it is. Keep it a draft, because only when all the initial writing is done will you have the full picture of the patterns. Only THEN is it time to finalize the drafts, and THAT is not one of the steps on this page. So draft now and wait until later to refine it. When those are all done, move on …


4. Try to Understand the “Causing Factors” for Everything – Sometimes these will not be evident, but sometimes they will.

Be sure to consider the possible factors that may explain why the data look as they do, such as changes in:

  • staff training amounts, timing, goals, or methods;
  • instructional materials;
  • media use or availability;
  • forms of assessment;
  • hiring or enrollment criteria or practices;
  • demographics;
  • community factors;
  • external economic factors;
  • when funding was cut for programs and the effects of that;
  • when a new leader took over and the changes that caused;
  • when the annual assessment was given;
  • etc.

Look for anything which may have had an impact on learning or support, or caused the changes you see in the patterns.

A. Look at any data patterns that were marked with a question mark (?) because they showed no discernible pattern as of yet. Make a list of these (cut and paste?) and label it to be checked next year, to see whether a pattern IS beginning to show once new data are available.

B. Look at any data patterns that were marked negative (-) because they showed a pattern your team did not like or that caused concern. These data patterns require careful interpretation, looking for factors that may have caused the unacceptable patterns. You will need to collect all of these in one place (cut and paste) and separate them into two groups:

  • a. Those for which the causing factors are not in your control.
  • b. Those for which the causing factors are in your control.

a. Those for which the causing factors are NOT in your control. Deciding this is a real moment of truth because it may be in your control, at least partially, but you may not want the responsibility for it.

Note – It may be in your control, but that does not mean you caused the bad pattern. Other factors may have shaped it too. An example is a group of students failing in math. The causes of failure might include poor instruction previously or a lack of hands-on resources, but environmental factors, family circumstances, and earlier school experiences had an effect as well. Nevertheless, if it is in your control to some extent, you may not “own” the whole problem, but you can still choose to try to impact it and help kids do better in math despite their other issues.

When a causing factor is really NOT in your control, you may still be able to affect the external causes. Using the above example, a school can target parent education and support (OUTside of the school) with the goal of increasing home support for student success (IN school).

If there really is nothing you can do to affect the causing factors, just label that data pattern with that conclusion and set it aside.

b. Those for which the causing factors ARE in your control. Deciding this is a bit scary sometimes. If data tell you that middle managers generally (a pattern) are not even meeting standards, the issues of having to deal with that finding can be overwhelming and frightening. You can’t and wouldn’t want to fire them all. Finding, recruiting, and orienting their replacements would be a huge, expensive, and time-consuming task. On the other hand, training and supporting them all so they improve could be very expensive and very time-consuming too. So what should you do?

When there is such a pattern, it’s definitely better to build up the staff you have than to start from scratch. The same advice applies to a pattern of poor or unprofitable customers a business may have: it takes roughly five times more time and money to win a new customer than to build up the value of an existing, repeat customer. In other words, whether the pattern is in staff performance or in some other area such as customers, processes, services, or products, the same advice holds true. Working to improve what you already have is usually more cost-effective over the long term.

HOW to work with these negative patterns is a topic for a different web page.

C. Look at any data patterns that were marked positive (+) because they showed an improvement or strengths.

“Strengths” are important, even if they are not yet seen as a trend. One piece of data, such as the number of middle managers scoring “exceeds expectations,” may not yet be evidence of improvement from a prior level (since you don’t have data across time), but this strength can still be analyzed for WHY it has happened. The answer may help you improve further. Knowing why good things have happened is critical, because then you can PROACTIVELY CAUSE improvement, not just experience it.

Even if the strength you identify seems isolated, it may be that further analysis can discover ways in which that strength points the way for improving something else.

For example, say a mentor program pilot only served ten proteges the first year, and there are easily 100 people who need mentors. The ten participating mentors and ten proteges can be a strength in a number of ways. Perhaps 2-3 pilot mentors can help train next year’s new crop of mentors. Perhaps 2-3 of the better, more articulate proteges can serve as a panel; because they are “peers,” their message to next year’s new hires will be more powerful, and the problems they experienced and the insights they gained can become useful for ALL the new hires you will have.


5. Write a DRAFT Conclusion Explaining WHY you think the patterns may have happened. Try to keep the conclusion for each pattern to one paragraph.


6. Clearly Label Each Conclusion with its data topic, data source, data type, etc., and link it to its data charts and related conclusion statements with clear cross-references, as sketched below.
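One way to keep those labels and links together is a small structured record for each conclusion. The sketch below is illustrative only; the field names and example values are hypothetical, not a prescribed format.

    # A minimal sketch of a labeled conclusion record that carries its own
    # cross-references. Field names and example values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Conclusion:
        conclusion_id: str                 # e.g. "C-04"
        topic: str                         # what the data are about
        data_source: str                   # mentors, proteges, supervisors, ...
        data_type: str                     # local survey, standardized test, ...
        pattern_mark: str                  # "?", "+", or "-"
        statement: str                     # the one-paragraph draft conclusion
        chart_refs: list = field(default_factory=list)           # linked data charts
        related_conclusions: list = field(default_factory=list)  # other conclusion IDs

    example = Conclusion(
        conclusion_id="C-04",
        topic="Protege satisfaction",
        data_source="Protege survey",
        data_type="Local survey, April administration",
        pattern_mark="+",
        statement="Satisfaction rose each year from 2007 to 2010.",
        chart_refs=["Chart 7", "Chart 8"],
        related_conclusions=["C-02"],
    )
    print(example.conclusion_id, "->", example.chart_refs)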


7. Write a Draft “Evaluation of the Evaluation Process”

Review all that was done from the very beginning of the evaluation process. Ask each participant to reflect on and make notes about the process: what worked well, what didn’t, and how to improve it next time.

  1. What parts of the process need a different timing?
  2. What parts need more or fewer steps?
  3. Was the number of people too many, adequate, too few?
  4. Should data be collected from other sources?   Or fewer?
  5. Did specific assessment items or prompts lead to confusion and need rewriting for clarity? Do users need better directions?

Go back to the outline of the Best Practice Program Evaluation Process.

  1. Were any steps skipped?
  2. Rewrite it to make it YOUR process and suit your specific needs.