David Wilsey

David Wilsey is the Chief Operating Officer with the Balanced Scorecard Institute and co-author of The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

Types of KPIs: The Logic Model and Beyond

By: David Wilsey

Jun 5, 2017

As part of the KPI Basics series we are developing for the launch of the KPI.org website, I thought I would introduce the different types of key performance indicators (KPIs). As I describe in the accompanying video, I like to use a framework called the Logic Model to describe the first four types.

The Logic Model is a framework that helps differentiate what we produce from what we can only influence. It is also helpful for distinguishing elements that are more operational from those that are more strategic in nature. Every key process consumes resources such as time, money, raw materials, and other inputs. Every process also has measurements that can be tied to that particular process. The outputs of my process are what I produce. Ultimately, though, I want to create an impact with my work; outcome measures capture that impact.

Let’s look at some examples of these types of measurements in real life. If I am a coffee maker, my Input measurements might focus on the coffee, the water, or my time invested. My Process measures could cover anything about the process of making coffee, from efficiency to procedural consistency. The Outputs of my process would be the coffee itself; I could have a variety of measures around the quality of my coffee output. Finally, my Outcome measures would relate to things I can only influence, such as whether my audience enjoys or buys the coffee. There is certainly more value in measuring impact than in measuring operations: if my customer enjoys the coffee, I am doing something right. But you really do need a mix of both to truly understand performance.
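To make the four categories concrete, here is a minimal Python sketch that tags hypothetical coffee-maker measures with their Logic Model stage. The measure names are invented for illustration and are not a standard taxonomy:

```python
# Hypothetical coffee-maker KPIs grouped by Logic Model stage.
LOGIC_MODEL_KPIS = {
    "input":   ["coffee bean cost per batch", "water used (liters)", "labor hours"],
    "process": ["brew cycle time (min)", "batches meeting procedure checklist (%)"],
    "output":  ["cups produced per day", "cups passing taste test (%)"],
    "outcome": ["customer satisfaction score", "repeat purchases per month"],
}

def stage_of(kpi: str) -> str:
    """Return the Logic Model stage a known KPI belongs to."""
    for stage, kpis in LOGIC_MODEL_KPIS.items():
        if kpi in kpis:
            return stage
    raise KeyError(f"unknown KPI: {kpi}")
```

A balanced family of measures would draw from all four stages rather than, say, outputs alone.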

To fully understand all of the elements of strategy execution, I can then add a few other broad categories of measures to my story. Project measures monitor the progress of our improvement initiatives and projects and can be designed to improve operations or strategic impact. These track things like scope, resources, deliverables or project risk. In my coffee example, I might have a new branding campaign to sell my coffee.

Employee measures tell us if employees are performing well or have the right skills and capabilities needed. I might measure my employees’ skills in making coffee, for instance.

Finally, risk measures tell us if there has been an important change in a risk factor that could have a significant impact on our organization. For example, I might have a risk indicator that tells me if global coffee bean availability becomes a problem. 

The information that these different types of measures provide can be used to inform decision making. Using a family of measures like this can broadly inform your entire strategy.

To learn more about Key Performance Indicator development and implementation, please look into one of our KPI training or certification programs or visit kpi.org.


The Ultimate KPI Cheat Sheet

By: David Wilsey

Jun 9, 2015
We’ve received a lot of interest in our new KPI Certification Program. In fact, one woman said she couldn’t wait until the first scheduled program offering. She also wanted to know if we had a handy list of the most important principles – she wanted a cheat sheet! So in the interest of tiding her (and others) over, I have compiled below a few of the most important KPI tips and tricks. There are many more, of course, so if you think I’ve missed anything, please add them in the comments section below.

Strategy comes first!
A training student told me his organization is struggling to implement measures for brand equity, customer engagement, and a few others because they believed the measures didn’t really apply to their company. I asked him why they were implementing those measures if they didn’t seem to apply, and he said they had found them in a book. They had no strategy or goals of any sort, and yet somehow thought they had a measurement problem.  

KPIs found in a book of measures don’t necessarily mean anything in relation to your strategy.  If you don’t have a strategy and/or can’t articulate what you are trying to accomplish, it is too early for KPIs.

KPI Development is a Process
I am embarrassed to admit that the first time I facilitated the development of performance measures with a client, I stood in front of a blank flip chart and asked them to brainstorm potential measures. It was my first consulting engagement as a junior associate and the project lead had stepped out to take an emergency phone call. Even though I had a basic understanding of what good KPIs looked like, I couldn’t help the client come up with anything other than project milestones (“complete the web redesign by August”), improvement initiatives (“we need to redesign the CRM Process”), or vague ideals (“customer loyalty”). What I didn’t understand at the time is that you need to use a deliberate process for developing KPIs, based on the intended results within your strategy. And like any other process, KPI development requires continuous improvement discipline and focus to get better.

Articulate Intended Results Using Concrete, Sensory-Specific Language
Strategy teams have a habit of writing strategy in vague, abstract ideals. As you pivot from strategy to measurement, it is critical that you articulate what the strategy actually looks like using concrete language describing something you could see, hear, taste, touch, or smell. A vaguely written strategic objective like Improve the Customer Experience might get translated into checkout is fast, or facilities are safe and clean. Improve Association Member Engagement might get translated into a result like members volunteer for extracurricular activities. I’ve seen strategy teams shift from 100% agreement on vague ideals to diametric opposition on potential intended results, indicating that their consensus around strategy was actually an illusion. Use simple language a fifth-grader could understand to describe the result you are seeking. If you spend your time honing this intended result, the most useful performance measures almost jump out at you.

It’s not about the Dashboard!
Dashboard software is great when it is used to support a well-designed strategic management system. Unfortunately, many people are more interested in buying a flashy new tool than they are in understanding how they are performing (a topic I’ve talked about before). KPIs are not about a dashboard. KPIs are about articulating what you are trying to accomplish and then monitoring your progress toward those goals. A dashboard is the supporting tool, and too much emphasis on the technology often distracts us from the point.

It’s not about the KPIs!
Speaking of people missing the point, we have many clients who think this process begins and ends with the KPIs themselves. Unfortunately, some of these folks are simply trying to meet a reporting requirement or prepare for a single important meeting. This type of approach completely misses the power of KPI development, which is that KPIs provide evidence to inform strategic decisions and enable continuous improvement.

For more about how to improve KPI development in your organization, see our KPI Professional Certification Program or The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.


4 Reasons Business Intelligence Systems are Like an (Unused) Gym Membership

By: David Wilsey

May 23, 2014
My business intelligence (BI) and analytics software salesman friend said something interesting to me the other day over lunch. He said, “I don't sell software, I sell gym memberships. When someone joins a gym they are not really buying the membership. They are buying the dream of improved health and a better physique. Their intention is to work out every day and fulfill that dream, despite the fact that few people ever actually follow up. Selling BI software is the same way. I'm not selling the software; I’m selling the dream of improved insight and competitive advantage.”

The unspoken implication was that few people ever get significant benefit from their software system, a conclusion I have also observed over my years in strategic performance management.

There are many common reasons that your strategic performance management software system might be getting less use than the gym membership you bought last January.  Below are the top 4 that I’ve seen as well as some tips for avoiding them.  

Reason 1: You bought into the hype but not the skills
I overheard a CEO recently saying that he needed to buy into the big data craze.  It was clear that this person had no idea what big data or predictive analytics meant, but he definitely needed to buy some.  Many people seem to think if they just buy some software, within weeks a “number cruncher” will magically come down from a mountain with answers to all of their problems. That is like thinking that if I buy a shovel, a garden will magically appear in my back yard. Performance management and statistical analysis skills are critical to creating value in this field.
 
Reason 2: You keep the results a secret
The first question some people ask when considering a performance system is, “how do I keep everyone out of my data?”  Security around private customer, employee, or certain financial information is an absolute must, but a surprising amount of strategic organizational performance information can be shared with leaders and managers.  Leaders need information to make decisions, and limiting access can communicate that strategy management is something to be left to only a select few.  Analyzing data is only the first step; the dialog around why the results occurred and what should happen next is just as critical.
 
Reason 3: You only use out-of-the-box performance report design
The standard templates provided by the software companies are almost always designed to make the software sell well, as opposed to informing YOUR strategic decision making. Good performance reports communicate three things clearly: 1) How is OUR organization currently performing? 2) Why are WE getting the results that we are getting? And 3) What are WE doing to improve our results?

Reason 4: You count and report on everything that can be counted.
Just because the vendor promises that the tool can handle the volume doesn’t mean that this is a good idea. Strategic performance management is about focusing on the most critical things first. I would recommend selecting a handful of critical performance gaps and focusing your data collection, analysis, and improvement efforts on those.  Teach everyone in your organization how to do this effectively before you expand to other areas.

There are many more common mistakes, but these four are top of mind for me.  Please share other mistakes you’ve seen in the comments section below.

For more about how to improve your performance analysis, see the Performance Analysis chapter of The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

The “Words with Friends” Strategy Disruption

By: David Wilsey

Mar 4, 2014
Umiaq is defined as a large open Inuit or Eskimo boat made of skins stretched on a wooden frame, usually propelled by paddles. I looked it up only because my Words with Friends opponent just played that word. There are several possible explanations for this move. Maybe my friend of many years has recently become an expert in the Inuit culture. Maybe his linguistic genius is finally starting to gel, although that seems unlikely after years of unexceptional Scrabble play. Or more likely, he randomly guessed over and over until something was accepted.

Much has been made about “plugging”, the practice of guessing randomly until you stumble upon a word. To a Scrabble purist like me, this is cheating, pure and simple. To seemingly everyone else, this is just part of the game and I need to shut up and stop being a sore loser.

My point here is not to rant about the game. My point is that, for better or worse, sometimes your strategic competitive environment changes. Your favorite political party loses. Your competitors merge. Technology enables your customers to replace your cash cow service for free. A small new competitor comes up with a disruptive new technology that changes the rules in your industry.

This seems almost unfair in the strategic planning and management world because you spend so much time and energy designing and executing a comprehensive strategy around certain assumptions. Just when you think that the initiatives that you are implementing are closing the gaps on your targets, the rules change and you find yourself on the Blackberry end of the iPhone revolution.

There are a few guidelines you can follow to keep this from catching you off guard. First, don’t skimp on your external environmental scan during the Assessment step, and be sure to go back and update that analysis periodically. Some of us work in industries that change abruptly from quarter to quarter, but in most industries change happens gradually enough that an annual update will be adequate.

Second, use scenario planning to help identify strategy alternatives. Scenario planning helps recognize the many factors that combine in complex ways to affect future success, and tries to make sense of how these factors interact and how they drive change, leading to a deeper discussion on better business strategies.

Finally, sometimes planners get too attached to their product and have to be reminded that a dynamic strategy needs to be continuously evaluated to enable the organization to nimbly adapt and change. Evaluation helps organizations understand how well strategies accomplish desired results and how well the strategic management system improves communications, alignment and performance. A more formal evaluation process is usually conducted once a year, although if your organization is in a sector that changes more rapidly than that, more frequent evaluations are needed.

If I don’t like plugging in Words with Friends, I can simply stop playing out of principle. But if my livelihood depends on my ability to adapt to a changing world, I have to be able to quickly and systematically adapt my strategy.  If I am too stuck in my ways, my organization will have a serious problem.  Sort of like being in an umiaq without a paddle.

For more about how to adapt your strategy to a changing world, see The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.

"Fight" of the Bumblebee

By: David Wilsey

Feb 14, 2014
Have you heard the common legend that scientists have proven that bumblebees, in terms of aerodynamics, can’t fly?  This myth came about because, about eighty years ago, an aerodynamicist made that statement based on the assumption that a bee’s wings are a smooth plane.  The media reported it before the aerodynamicist actually looked at a wing under a microscope and found that the assumption was incorrect.  While the scientist and the media issued retractions, the legend lives on.

Unfortunately, in the management world, decisions are made every day based on “legends” rather than on real evidence. At a manufacturing company I once worked for, it was a well-known “fact” that it was more profitable to discount prices to increase volume in a particular market.  Even after a team of business managers proved discounting was a money loser, certain sales managers continued to vigorously advocate the discount strategy for years.  I like to refer to any ongoing argument like this as the "Fight" of the Bumblebee.  This fight is most difficult when the bumblebee argument is emotionally compelling (they’re not supposed to be able to fly!) and the truth is difficult to convey (bumblebees’ wings encounter dynamic stall in every oscillation cycle, whatever that means). Everyone loves a discount and can see pallets of product going out the door.  Not everyone understands the indirect nuances that contribute to profit.

Winning the fight of the bumblebee is dependent on making sure that you are interpreting, visualizing, and reporting performance information in a meaningful way.  People have to be trained to appreciate the difference between gut instinct and data-driven decision making.  Once they see analysis done well a couple of times, they will start asking for it.

The key to interpreting a measurement is comparison, and the trick is to display the information in a way that effectively answers the question: compared to what?  Visualizing performance over time reveals trends that show the direction the data is heading and provides context for the underlying story relative to strategy. The simplest and most effective way I’ve seen for consistently visualizing data is with a Smart Chart (or XmR chart), a tool showing the natural variation in performance data.
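The arithmetic behind an XmR chart is simple enough to sketch. Below is a minimal Python version using the standard XmR construction (centre line plus or minus 2.66 times the average moving range); the sample data is invented for illustration:

```python
def xmr_limits(values):
    """Compute XmR (individuals) chart centre line and natural process limits.

    Limits are the mean plus/minus 2.66 times the average moving range,
    the standard XmR construction.
    """
    if len(values) < 2:
        raise ValueError("need at least two data points")
    centre = sum(values) / len(values)
    # Moving range: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Invented monthly performance data for illustration.
lower, centre, upper = xmr_limits([52, 49, 55, 51, 50, 53, 48])
# A point outside (lower, upper) signals a real change, not routine variation.
```

Points inside the limits reflect routine variation and usually do not warrant a reaction; points outside them are the signals worth investigating.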

Once you have a better idea of how to interpret your data, reporting the information in a meaningful way is important.  Reports should always be structured around strategy, so that people have the right context to understand what the data is about.  Reports should answer the basic questions you need to know: What is our current level of performance? Why are we getting that result? And what are we going to do next?

For more about how to interpret, visualize and report performance, see The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.
Gail Stout Perry

Gail is co-author of The Institute Way with over 20 years of strategic planning and performance management consulting experience with corporate, nonprofit, and government organizations.

Dear Abby-Gail: How Much is Too Much?

By Gail Stout Perry

Oct 29, 2013

There has been a lot of interest in my recent blog post:  “Balanced Scorecard Gone Bad: What’s that Funky Smell?”  Several people have posted comments and questions in various forums, but one in particular deserves special attention.
  
From Gary: I believe a key point in your message is that a strategy is never static due to external changes (e.g., competitor moves, new technologies), so it will require continuous adjusting.  But this raises a different question: as strategic objectives change, or the emphasis of what to accomplish within them changes, some KPIs may be dropped and others added (or their weightings may need to be tweaked). As a result, how much change in KPIs can an organization tolerate?

Dear Gary: This is an excellent question.  When strategy changes, then KPIs will have to change. Organizational tolerance to change is affected by several things. 

(1) Is the scorecard system engrained in the organizational culture such that management trusts the system and uses it to make decisions?  If so, they will have relatively high tolerance for change in the KPIs because they understand that the change is necessary if they are to continue to rely upon the system to make strategically relevant decisions. 

(2) Given that you know you need to adjust the KPI, how quickly can you achieve 7 data points on the new or adjusted KPI?  In other words, is there baseline information available that will help you quickly establish an XmR chart?  If not, can you achieve frequent enough reporting points to have useful trend analysis within 6 months?  If you were using an excellent KPI in the past and then switch to one in which it will be a year (or more) before you have enough data for management to have the 7 data points needed to make statistically sound decisions, this will cause frustration and lower the tolerance for the necessary change.

(3) Can your software system handle these changes without losing your historical performance on the objective (assuming the objective does not change)?  Knowing that you won’t be throwing away historical information increases tolerance for change.

(4) What about rewards tied to KPIs?  How do your Human Resources processes link individual or group performance and incentives to KPI performance?  What will be the result of changing a KPI right now?  If it can’t be changed due to a covenant with employees, can it be removed from the calculation so that you don’t keep working towards an “expired strategy”?

I invite feedback from others.  What else has impacted your organization’s tolerance for needed change in its KPIs?  And does anyone want to share their tips for overcoming resistance to this sort of change?

For more challenges and solutions, we invite you to explore The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.


Garth Brooks and the Music Industry’s Performance Measurement Problem

By David Wilsey

Oct 11, 2013

The rock music industry in 1991 was in transition. The glam-rock and new wave music of the eighties was out, and the industry had not yet settled on alternative rock and grunge as the iconic sound of the decade. Most shockingly, after almost forty years of fans preferring rock music to country music by a reliably constant percentage, sales figures were indicating that preferences were shifting from rock to country.

The industry made what seemed like a very logical assumption: the shift was obviously caused by the incredible crossover appeal of Country superstar Garth Brooks, who had recently taken the music industry by storm. They also took very predictable actions in response: several promising rock bands were dumped while resources were shifted to other country acts.

In the short run, these actions seemed to reinforce the trend, with even more country music sales. But then something very strange happened: the sales numbers slowly drifted back to the exact pre-Garth Brooks percentages, with rock preferred by the same share it had held for decades. Industry analysts were left scratching their heads. What just happened?

What they found after some analysis was surprising. In March 1991, the industry began counting record sales using the Nielsen SoundScan system. Before that, sales were counted by calling stores across the U.S. to collect sales data – an incredibly ineffective collection method. Unfortunately, not all record stores were able to implement the SoundScan system immediately and continued using the old method for months or even years. On the other hand, one behemoth was online immediately: Wal-Mart. In the early days of SoundScan, every single time a Wal-Mart sales associate scanned a CD, it was counted by SoundScan and reordered, whereas record store sales (and reorders) were hit-and-miss. 

Here’s the thing that nobody had thought about before: in 1991, country music fans primarily bought their music from Wal-Mart and rock music fans primarily bought their music from record stores.  Once all of the record stores were online, it became clear that the appearance of a shift in preference was nothing more than a measurement data collection problem.

The lesson of this story is that it is critical to resist the urge for a knee-jerk reaction to data, such as dumping promising rock bands! There is a process discipline to performance analysis and improvement, and the steps are simple. First, a Smart Chart should be used to make sure you are correctly interpreting the data. Then, a root-cause analysis is in order to understand why you are getting the results you are getting; in this case, it would likely have revealed that the SoundScan data was dominated by Wal-Mart sales. Finally, an improvement action plan is implemented and the results are monitored over time.

To learn more about how to interpret, report, and react to your performance data, see the PuMP Blueprint Certification Workshop, or see The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.


Skinny Jeans and the New Math

By Gail Stout Perry

Oct 9, 2013

I am an engineer by training and a math geek at heart.  So articles about girls and math catch my eye.  Did you know that researchers agree that one’s ability to excel at math and science is as much about attitude as it is about “natural gifts” or gender?  This affirms my own less-than-scientific research findings.  I have a daughter and from her earliest years, I showed her how to apply math to everyday activities (baking was our favorite hands-on lesson, of course).  And anytime friends of hers would complain about how hard math was, I’d make them all stand up and shout, “Girls ROCK at math!!!”   It’s all about the attitude.   Of course, I had a good role model for this. My father showed me how fun math was when I was a child as we built motors together and played around with electronics...scribbling equations and schematics as we went.  I never feared math and science...they were FUN!  

In my work life, I’ve discovered that dread of math, especially statistics, is widespread in the business community.   So let’s tackle something fun:  the concept of correlation.

When developing performance measures in business, we sometimes face a stumbling block in that the thing we desire most to measure is, unfortunately, impossible to measure directly.  So, we have to look for a “proxy” measure that is correlated.

Let me illustrate with an example from daily life.  Let’s say I want to know if I am maintaining my ideal weight versus gaining weight.  It’s easy to measure that directly - hop on the bathroom scale.  But, unfortunately, I can’t.  I travel constantly so I do not have a bathroom scale with me most days. 

So I have a correlate that I measure.  I always carry the same pair of skinny jeans with me.  As long as the jeans will button, I am fairly certain of what the bathroom scale would say if I had one.  The fit of my jeans is correlated to my weight.   Now, a statistician will remind us that “correlation does not equal causation.”  This simply means that I need to consider that other things may be causing my jeans not to fit; for example, maybe they shrunk in the wash.  But understanding this, I am reasonably certain that they are a good proxy measure while on the road.
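Whether a candidate proxy actually tracks the direct measure can be checked with a simple correlation calculation. Here is a sketch in Python, using invented weight and jeans-snugness data purely for illustration:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented data: weight (lbs) vs. how snug the jeans feel (arbitrary 1-10 scale).
weight = [130, 132, 131, 135, 138, 134]
snugness = [3, 4, 3, 6, 8, 5]
r = pearson_r(weight, snugness)
# r near +1 suggests the jeans are a usable proxy for the bathroom scale.
```

A value of r near +1 or -1 indicates a strong linear relationship; a value near 0 suggests the candidate proxy is not tracking the direct measure at all.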

See how easy it was to master two important concepts for measuring performance in business - Direct Measure and Correlated Measure?  It’s all about the attitude!!

To learn much, much more about how to develop meaningful performance measures, we invite you to explore The Institute Way or join us at an upcoming training course.


The Post-Retreat Strategic Planning Letdown

By David Wilsey

Oct 2, 2013

On the radio the other day, a mountain climber shared her experience standing atop Mount Everest. She said that while standing on the summit, she was surprised to find that rather than reveling in her achievement and enjoying the view that relatively few people have seen, her thoughts were dominated by an unexpectedly unsettling realization: now I have to get back down. Besides the fact that getting back down was in some ways physically harder than climbing up, the bigger problem was that her primary motivation, to reach the summit, had been achieved. Reaching that summit had been an inspirational goal driving her through each step of the journey, from the mundane strength training years earlier to those final few steps. Her primary motivating factor would now take a very different form: survival.

This type of letdown is common to any major achievement or milestone in life. So it’s not unexpected that a similar phenomenon occurs in the strategic management world. Most commonly, this letdown occurs as soon as the big planning retreat event is over and the resulting documentation has been put together. Once the strategy team has formulated strategy, developed a strategy map, identified performance measures, prioritized initiatives, and rolled everything out to the entire organization, the team stands at the top of that mountain of work and thinks we did it, now what?

Unfortunately, this is the point at which too many organizations realize that the real work was not in writing the plan but in executing all of those grand ideas. They let the process run out of steam and become too distracted by day-to-day problems and operational concerns to follow through.

So how do you avoid the post-retreat strategic planning letdown?  Here are a few tips:

  • Don’t think of strategy as an event: Many people still think that the only time you should talk about strategy is after playing golf during a big retreat.  Strategy management is about making strategy a part of day-to-day management. Try to institutionalize the strategic thinking process that was used to develop the plan. Make strategy everybody’s job instead of just the management team. Incorporate strategy into the day-to-day agenda.
  • Prioritize & keep things simple: No organization can do everything for everyone. Select 3-4 high level goals to focus on to start and a few high-priority initiatives to support each goal. Manage your initiative list down to get to the select few.
  • Focus on process improvement instead of judging people: ownership and accountability are needed, but if you want to develop a continuous improvement culture, employees cannot worry about getting punished every time they report bad news. Underperformance is more often than not the result of a process failure and so that’s where the focus should be.
  • Use technology for analysis and information sharing: Some organizations fail to fully analyze the data they are collecting, or short-circuit their strategy execution success by relying on spreadsheets for performance analysis.  Remember that it isn’t enough for a single analyst to fully understand how the organization is performing: information sharing and dialog are critical in helping turn information into knowledge and understanding so that leaders can make better strategic decisions.

For more suggestions on how to avoid this letdown, see the Sustaining and Managing with the Balanced Scorecard chapter of The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.


PS: Our Balanced Scorecard Saved The U.S. Army $26 Million

By Gail Stout Perry

Sep 30, 2013

I was working with an Army command at Ft. Sam Houston this week and had invited a special guest, Scott Hencshel, to address the group regarding the organizational challenges of implementing a balanced scorecard system within the Army.  (Scott’s command, the Army Medical Department Center & School (AMEDDC&S), an Institute “Award for Excellence” winner, is also stationed at Ft. Sam Houston.)

As Scott was wrapping up, someone asked a final question, “What was the biggest benefit that AMEDDC&S realized after implementing its strategic balanced scorecard?”  Scott talked about alignment, focus, and data-driven decision making.  Then as he was making his way to the door he turned back and said, “Oh yeah, we immediately saved the Army $26 million.” 

Say what?!?!

AMEDDC&S is where the U.S. Army educates and trains all of its medical personnel – over 27,000 soldiers. One of the strategic measures on AMEDD’s balanced scorecard is “attrition rates.”  Before the scorecard was implemented, it was commonly believed that discipline issues were the primary reason for soldiers not completing their training programs, because resolution of those discipline issues was what consumed everyone’s time.  Once the scorecard was implemented, attrition was measured more thoroughly and two discoveries were made:

  1. Attrition was MUCH higher than originally thought.  The traditional calculation was flawed, and attrition was actually over 34%.  That means a third of those entering the medical training programs would drop out, wasting the Army’s investment in their training.
  2. Academic performance, not discipline, turned out to be the primary reason for attrition.

So as the scorecard team delved further, they looked for root causes of poor academic performance resulting in attrition incidents.  They discovered that a major cause was a lack of communication between the Brigade leadership and the AMEDDC&S faculty.  Students in the medical training program were being assigned Brigade duties that prevented them from having proper opportunities to study and prepare for classes and exams.   A prime example was students falling asleep during final exams due to having served Brigade guard duty the night before. 

Once the communications issues were corrected, overall attrition rapidly dropped from 34% to below 20%...thereby saving the U.S. Army $26 million.
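The arithmetic behind a headline number like this can be reconstructed from the figures in the story. The per-soldier training cost is implied rather than stated, so treat this strictly as a back-of-the-envelope illustration:

```python
# Back-of-the-envelope check using only the figures quoted in the story.
cohort = 27_000           # soldiers trained (from the post)
attrition_before = 0.34   # recomputed rate after the flawed calculation was fixed
attrition_after = 0.20    # rate after the communication issues were corrected

# How many fewer soldiers drop out per cohort at the lower rate.
fewer_dropouts = cohort * (attrition_before - attrition_after)

# Implied training investment lost per dropout, given the $26M savings figure.
# This cost is derived, not stated in the post.
implied_cost_per_dropout = 26_000_000 / fewer_dropouts
```

Roughly 3,800 fewer dropouts per cohort at an implied training cost of several thousand dollars each lands in the neighborhood of the $26 million quoted.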

PS:  Did I mention that I have the best job in the world?!?  It is extremely rewarding to hear about results like this.

For more examples of break-through performance, we invite you to read The Institute Way: Simplify Strategic Planning and Management with the Balanced Scorecard.


