David Wilsey

David Wilsey is the Chief Operating Officer with the Balanced Scorecard Institute and co-author of The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

Types of KPIs: The Logic Model and Beyond

By: David Wilsey

Jun 5, 2017

As part of the KPI Basics content series we are developing for the launch of the KPI.org website, I thought I would introduce the different types of key performance indicators (KPIs). As I describe in the accompanying video, I like to use a framework called the Logic Model to describe the first four types.

The Logic Model is a framework that helps differentiate what we produce from what we can only influence. It also helps separate elements that are more operational from those that are more strategic in nature. Every key process consumes resources such as time, money, and raw materials; these are its inputs. Every process also has measurements that can be tied to the process itself. The outputs of a process are what it produces. Ultimately, though, we want our work to create an impact, and outcomes capture that impact.

Let’s look at some examples of these types of measurements in real life. If I am a coffee maker, my input measurements might focus on the coffee, the water, or my time invested. My process measures could relate to anything about the act of making coffee, from efficiency to procedural consistency. The output of my process would be the coffee itself, and I could have a variety of measures around the quality of that output. Finally, my outcome measures would relate to things I can only influence, such as whether my customers enjoy or buy the coffee. There is certainly more value in measuring impact than in measuring operations; if my customers enjoy the coffee, I am doing something right. But you really do need a mix of both to truly understand performance.

To fully understand all of the elements of strategy execution, I can then add a few other broad categories of measures to my story. Project measures monitor the progress of our improvement initiatives and projects, which can be designed to improve either operations or strategic impact. These track things like scope, resources, deliverables, or project risk. In my coffee example, I might track the progress of a new branding campaign to sell my coffee.

Employee measures tell us whether employees are performing well and have the right skills and capabilities. I might measure my employees’ skills in making coffee, for instance.

Finally, risk measures tell us if there has been an important change in a risk factor that could have a significant impact on our organization. For example, I might have a risk indicator that tells me if global coffee bean availability becomes a problem. 

The information that these different types of measures provide can be used to inform decision making. Using a family of measures like this can broadly inform your entire strategy.
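To make these categories concrete, here is a minimal sketch in Python of how a family of measures might be cataloged by type. The measure names, units, and targets are hypothetical illustrations based on the coffee example above, not a prescribed KPI set:

```python
# A minimal sketch of a "family of measures" for the coffee example.
# All measure names and units are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    kpi_type: str   # input | process | output | outcome | project | employee | risk
    unit: str

family_of_measures = [
    Measure("Coffee beans used per week", "input", "kg"),
    Measure("Brewing time per batch", "process", "minutes"),
    Measure("Batches meeting quality spec", "output", "%"),
    Measure("Customers who buy a second cup", "outcome", "%"),
    Measure("Branding campaign milestones completed", "project", "count"),
    Measure("Baristas certified in brewing", "employee", "%"),
    Measure("Global bean price change", "risk", "% month-over-month"),
]

# Group the family by type to check for a mix of operational coverage
# (input/process/output) and strategic coverage (outcome).
by_type: dict[str, list[str]] = {}
for m in family_of_measures:
    by_type.setdefault(m.kpi_type, []).append(m.name)

for kpi_type, names in by_type.items():
    print(f"{kpi_type}: {', '.join(names)}")
```

Grouping by type makes gaps obvious: a family with no outcome measures is all operations and no impact, and vice versa.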

To learn more about Key Performance Indicator development and implementation, please look into one of our KPI training or certification programs or visit kpi.org.


What I Learned About KPIs from My Six-Year-Old

By David Wilsey

May 19, 2016

I arrived to pick up my daughter on the last day of art camp just in time for program evaluations. Since we at the Balanced Scorecard Institute (BSI) use evaluation data for course improvement, I was intrigued to watch a room full of six- to nine-year-olds randomly fill in bubbles and then quickly improve their scores when the teacher noted that if any of the scores were less than three they’d have to write an explanation. 

In the car on the way home, I asked my daughter why she rated the beautiful facilities only a 3 out of 5. She said, “Well, it didn’t look like a porta-potty. And it didn’t look like a palace.” She also said she scored the snack low because she didn’t like the fish crackers and wished they’d had more pretzels. As I giggled at the thought of some poor city program planner or instructional designer trying to make course redesign decisions based on the data, I reflected on the basic principles we try to follow that would have helped the city avoid some of these mistakes.

The first is to know your customer. Obviously, giving small children a subjective course evaluation standardized for adults was ill-advised. Better would have been to ask the students about their experience in their own language: Did they have fun? Which activities were their favorites? Which did they not like as much?

Further, the children aren’t really the customers in this scenario. Since it is the parents who select (and pay for) after-school education for their children, their perspective should have been the focus of the survey. Were they satisfied with the course curriculum? The price? The scheduling? Would they recommend the course to others?

Another important principle is to make sure that your measures provide objective evidence of improvement toward a desired performance result. My daughter’s teacher used descriptive scenarios (porta-potty versus palace) to help the young children understand the scoring scale, but those descriptions heavily influenced the results. Plus, a child’s focus on pretzels versus crackers misses the mark in terms of the likely desired performance result.

Similarly, it is important not to get fooled by false precision. Between some participants superficially filling in bubbles and others changing their answers because they don’t want to do any extra work, the city is simply not collecting data that is verifiable enough to be meaningful.

These might seem like silly mistakes, but they are common problems. We have had education clients who wanted to measure the satisfaction of key stakeholders (politicians and unions) while ignoring their actual customers (parents and students). We see training departments that measure whether participants enjoyed the class but never ask whether their companies are seeing any application of the learning. And we see companies making important decisions based on trends they are only imagining due to overly precise metrics and poor analysis practices.

Even the evaluations for BSI certification programs require an explanation for an answer of 3 or less. I wonder how many of our students ever gave us a 4 because they didn’t want to write an answer. I have also seen evaluations go south simply because of someone’s individual food tastes.

At least I can take solace in the fact that no one ever compared our facilities to a porta-potty.

The Ultimate KPI Cheat Sheet

By: David Wilsey

Jun 9, 2015

We’ve received a lot of interest in our new KPI Certification Program. In fact, one woman said she couldn’t wait until the first scheduled program offering. She also wanted to know if we had a handy list of the most important principles – she wanted a cheat sheet! So in the interest of tiding her (and others) over, below I have compiled a few of the most important KPI tips and tricks. There are many more, of course, so if you think I’ve missed anything, please add it in the comments section below.

Strategy comes first!
A training student told me his organization was struggling to implement measures for brand equity, customer engagement, and a few others because they believed the measures didn’t really apply to their company. I asked him why they were implementing those measures if they didn’t seem to apply, and he said they had found them in a book. They had no strategy or goals of any sort, and yet somehow thought they had a measurement problem.

KPIs found in a book of measures don’t necessarily mean anything in relation to your strategy. If you don’t have a strategy or can’t articulate what you are trying to accomplish, it is too early for KPIs.

KPI Development is a Process
I am embarrassed to admit that the first time I facilitated the development of performance measures with a client, I stood in front of a blank flip chart and asked them to brainstorm potential measures. It was my first consulting engagement as a junior associate and the project lead had stepped out to take an emergency phone call. Even though I had a basic understanding of what good KPIs looked like, I couldn’t help the client come up with anything other than project milestones (“complete the web redesign by August”), improvement initiatives (“we need to redesign the CRM Process”), or vague ideals (“customer loyalty”). What I didn’t understand at the time is that you need to use a deliberate process for developing KPIs, based on the intended results within your strategy. And like any other process, KPI development requires continuous improvement discipline and focus to get better.

Articulate Intended Results Using Concrete, Sensory-Specific Language
Strategy teams have a habit of writing strategy in vague, abstract ideals. As you pivot from strategy to measurement, it is critical to articulate what the strategy actually looks like using concrete language describing things you could see, hear, taste, touch, or smell. A vaguely written strategic objective like Improve the Customer Experience might get translated into checkout is fast or facilities are safe and clean. Improve Association Member Engagement might get translated into a result like members volunteer for extracurricular activities. I’ve seen strategy teams shift from 100% agreement on vague ideals to diametric opposition on potential intended results, indicating that their consensus around strategy was actually an illusion. Use simple language a fifth-grader could understand to describe the result you are seeking. If you spend your time honing this intended result, the most useful performance measures almost jump out at you.

It’s not about the Dashboard!
Dashboard software is great when it is used to support a well-designed strategic management system. Unfortunately, many people are more interested in buying a flashy new tool than in understanding how they are performing (a topic I’ve talked about before). KPIs are not about a dashboard. KPIs are about articulating what you are trying to accomplish and then monitoring your progress toward those goals. A dashboard is a supporting tool, and too much emphasis on technology misses the point and often distracts us from it.

It’s not about the KPIs!
Speaking of people missing the point, we have many clients who think this process begins and ends with the KPIs themselves. Unfortunately, some of these folks are simply trying to meet a reporting requirement or prepare for a single important meeting. This type of approach completely misses the power of KPI development, which is that KPIs provide evidence to inform strategic decisions and enable continuous improvement.

For more about how to improve KPI development in your organization, see our KPI Professional Certification Program or The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

Howard Rohm

Howard Rohm is President and CEO of the Balanced Scorecard Institute and Founder of the Strategy Management Group, Inc., the Institute's parent company. He is a performance management trainer, consultant, and technologist with over 40 years' experience.

Gaming the System at the VA

By: Howard Rohm

Jul 16, 2014

Imagine you take your car to the car dealer to get it serviced. Before you hand your car over to the service manager, you see the following performance statistics posted on the wall:

  • Average time to wait for an appointment after requesting one—27 days
  • Number of people who requested an appointment but didn’t get one—46,000
Not too reassuring, is it? Would you leave your car, or would you look somewhere else?

Some Department of Veterans Affairs (VA) facilities have a performance history like this. According to a recent review of the VA requested by President Obama, the agency is in deep trouble: the average wait time for an appointment is 27 days, and 46,000 veterans never got an appointment after requesting one. Some veterans died while waiting for appointments, although it’s not clear whether the delays in medical attention contributed to the deaths.

At some VA facilities, performance measurement data were misreported to make executives’ performance appear better than it was. Fraudulent performance reporting was used to help justify executive performance bonuses. (A department audit reported that three out of four facilities had at least one instance of false wait-time data, and in some facilities two sets of books were being maintained.)

This type of behavior is called “gaming the system.” It’s a consequence of a culture overly focused on the wrong things (wait times) and a measurement system that emphasizes process performance over outcome performance. We shouldn’t be too surprised by the VA experience: when the wrong things are measured and incentivized, the wrong behaviors almost always result.

Focusing on the wrong measures, and missing or minimizing the right ones, created a climate of misreporting and deceit at some VA facilities, leading some executives to get credit for (and bonuses based on) reported good performance while all along the opposite seems to have been true. Almost $300 million was paid out by the VA in 2013 for performance bonuses to employees, including nearly 300 senior leaders. (Maybe some of these executives should give their bonuses back to the VA for poor performance!) We’ll leave for another discussion the bigger question: what is systemically wrong at the VA that encourages keeping two sets of books on performance?

Some critical questions come to mind. Where does customer satisfaction (veterans and their families are the customers) fit into the performance reporting and incentive equation? Shouldn’t satisfaction with medical service be heavily weighted in determining executive bonuses? If performance and reward are based mostly on process measures—like wait time—and wait time is being misreported, shouldn’t one assume that outcomes like effective medical care would suffer and that cheating to gain bonuses could occur?

How can an organization choose the “right” measures? Start with the end in mind (desired results and accomplishments) and work backwards through the processes that lead to those outcomes, and then to the resources required to produce the outputs that yield them. Make sure the desired results are expressed in unambiguous language. Then test the developed measures to make sure you’re not measuring what doesn’t matter, or worse, measuring the wrong things and incentivizing the wrong behaviors. Whether you are a hospital, a car dealership, or any other business, government agency, or nonprofit, the same principles apply for developing good performance measures.
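To illustrate the backwards-working test, here is a minimal sketch in Python. All measure and result names are hypothetical illustrations, not the VA’s actual measures or process; the idea is simply to trace each candidate measure to a desired result and flag any that don’t trace:

```python
# A minimal sketch of "start with the end in mind": map each candidate
# measure back to a desired result, and flag any that don't trace.
# All names here are hypothetical illustrations.

desired_results = {
    "veterans receive timely, effective care",
}

# candidate measure -> the desired result it is meant to evidence (or None)
candidate_measures = {
    "patient-reported satisfaction with care": "veterans receive timely, effective care",
    "30-day health outcome after treatment": "veterans receive timely, effective care",
    "average wait time for an appointment": "veterans receive timely, effective care",
    "number of forms processed per clerk": None,  # activity, not a result
}

for measure, result in candidate_measures.items():
    if result in desired_results:
        print(f"KEEP   {measure} -> {result}")
    else:
        print(f"REVIEW {measure}: no link to a desired result")
```

A measure that ends up in the REVIEW pile isn’t necessarily useless, but it is a candidate for measuring what doesn’t matter, or for incentivizing the wrong behavior.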

The unintended consequences of doing measurement badly are, in the case of the VA, potentially life threatening. Can your organization afford to do performance measurement badly, or not at all?

You can learn more about developing measures that matter in our book, The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard. You can order the book on our website or on Amazon.

4 Reasons Business Intelligence Systems are Like an (Unused) Gym Membership

By: David Wilsey

May 23, 2014

My business intelligence (BI) and analytics software salesman friend said something interesting to me the other day over lunch. He said, “I don't sell software, I sell gym memberships. When someone joins a gym they are not really buying the membership. They are buying the dream of improved health and a better physique. Their intention is to work out every day and fulfill that dream, despite the fact that few people ever actually follow up. Selling BI software is the same way. I'm not selling the software; I’m selling the dream of improved insight and competitive advantage.”

The unspoken implication was that few people ever get significant benefit from their software system, a conclusion I have also observed over my years in strategic performance management.

There are many common reasons that your strategic performance management software system might be getting less use than the gym membership you bought last January. Below are the top four that I’ve seen, as well as some tips for avoiding them.

Reason 1: You bought into the hype but not the skills
I overheard a CEO recently saying that he needed to buy into the big data craze.  It was clear that this person had no idea what big data or predictive analytics meant, but he definitely needed to buy some.  Many people seem to think if they just buy some software, within weeks a “number cruncher” will magically come down from a mountain with answers to all of their problems. That is like thinking that if I buy a shovel, a garden will magically appear in my back yard. Performance management and statistical analysis skills are critical to creating value in this field.
 
Reason 2: You keep the results a secret
The first question some people ask when considering a performance system is, “How do I keep everyone out of my data?” Security around private customer, employee, and financial information is an absolute must, but a surprising amount of strategic organizational performance information can be shared with leaders and managers. Leaders need information to make decisions, and limiting access can communicate that strategy management is something to be left to a select few. Analyzing data is only the first step. The dialogue around why the results occurred and what should happen next is just as critical.
 
Reason 3: You only use out-of-the-box performance report design
The standard templates provided by software companies are almost always designed to make the software sell well, as opposed to informing YOUR strategic decision making. Good performance reports communicate three things clearly: 1) How is OUR organization currently performing? 2) Why are WE getting the results we are getting? And 3) What are WE doing to improve our results?

Reason 4: You count and report on everything that can be counted
Just because the vendor promises that the tool can handle the volume doesn’t mean that doing so is a good idea. Strategic performance management is about focusing on the most critical things first. I would recommend selecting a handful of critical performance gaps and focusing your data collection, analysis, and improvement efforts on those. Teach everyone in your organization how to do this effectively before you expand to other areas.

There are many more common mistakes, but these four are top of mind for me.  Please share other mistakes you’ve seen in the comments section below.

For more about how to improve your performance analysis, see the Performance Analysis chapter of The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

"Fight" of the Bumblebee

By: David Wilsey

Feb 14, 2014

Have you heard the common legend that scientists have proven that bumblebees, in terms of aerodynamics, can’t fly? This myth came about because, roughly eighty years ago, an aerodynamicist made the claim based on the assumption that a bee’s wings were a smooth plane. The media reported the statement before the aerodynamicist actually looked at a wing under a microscope and found that the assumption was incorrect. While the scientist and the media issued retractions, the legend lives on.

Unfortunately, in the management world, decisions are made every day based on “legends” rather than on real evidence. At a manufacturing company I once worked for, it was a well-known “fact” that it was more profitable to discount prices to increase volume in a particular market. Even after a team of business managers proved discounting was a money loser, certain sales managers continued to vigorously advocate the discount strategy for years. I like to refer to any ongoing argument like this as the “Fight” of the Bumblebee. This fight is most difficult when the bumblebee argument is emotionally compelling (they’re not supposed to be able to fly!) and the truth is difficult to convey (bumblebees’ wings encounter dynamic stall in every oscillation cycle, whatever that means). Everyone loves a discount and can see pallets of product going out the door. Not everyone understands the indirect nuances that contribute to profit.

Winning the fight of the bumblebee is dependent on making sure that you are interpreting, visualizing, and reporting performance information in a meaningful way.  People have to be trained to appreciate the difference between gut instinct and data-driven decision making.  Once they see analysis done well a couple of times, they will start asking for it.

The key to interpreting a measurement is comparison, and the trick is to display the information in a way that effectively answers the question, Compared to what? Visualizing performance over time identifies trends that show the direction and development of the data and provides context for the underlying story relative to strategy. The simplest and most effective way I’ve seen for consistently visualizing data is with a Smart Chart (or XmR chart), a tool showing the natural variation in performance data.
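For readers who want to see the arithmetic behind an XmR chart, here is a minimal sketch in Python with made-up data: the center line is the mean of the individual values, and the natural process limits sit 2.66 average moving ranges above and below it (the standard individuals-chart constant):

```python
# A minimal sketch of the XmR ("Smart Chart") calculation with made-up data.
# 2.66 is the standard XmR scaling constant (3 / d2, where d2 = 1.128 for
# moving ranges of two consecutive points).

values = [52, 48, 55, 50, 47, 53, 49, 51, 46, 54]  # hypothetical monthly KPI values

mean = sum(values) / len(values)                    # center line
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)    # average moving range

upper_limit = mean + 2.66 * avg_mr   # upper natural process limit
lower_limit = mean - 2.66 * avg_mr   # lower natural process limit

print(f"center line: {mean:.1f}")
print(f"natural process limits: {lower_limit:.1f} to {upper_limit:.1f}")

# Points outside the limits signal something beyond routine variation and
# are worth investigating; points inside are likely just noise.
outliers = [v for v in values if not lower_limit <= v <= upper_limit]
print(f"signals: {outliers or 'none - variation looks routine'}")
```

The payoff is exactly the Compared to what? question: instead of reacting to every up or down tick, you only react to points that fall outside the natural variation of the process.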

Once you have a better idea of how to interpret your data, it is important to report the information in a way that is meaningful. Reports should always be structured around strategy, so that people have the right context to understand what the data is about. Reports should answer the basic questions you need answered: What is our current level of performance? Why are we getting that result? And what are we going to do next?

For more about how to interpret, visualize and report performance, see The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.