By George Tillman, Dwayne Prosko and Ken DeNatale

The balanced scorecard is one of the most widely used and hotly debated management tools in the executive arsenal. Although the original balanced scorecard was devised to assess the health of an entire business, the basic concept has been adapted to fit business units and support organizations. When it works, the scorecard is a powerful resource to help executives understand past and current performance and plan for the future.

Scorecards can be a great resource for managing the IT function. First, considerable numeric data is available to measure systems performance. Second, IT scorecards can be designed to measure end-user benefits and satisfaction. Third, scorecards can be a powerful vehicle for bridging the communication gap between IT professionals and the business customers they serve.

For most senior business executives, delving into IT reports has limited appeal. IT performance reports presented in a form comparable to reports from other business functions offer business people a clearer window into the IT domain, particularly when they reveal to corporate senior management and business unit leaders the value they receive from IT services.

Scorecard Wisdom

At Booz Allen, we introduced a scorecard for our IT department. Based on our experience with our clients, we were able to avoid several common mistakes.

First, we tackled data integrity issues, including dueling data (inconsistent data from multiple sources) and definitional problems. Where no data existed, we re-engineered processes to produce the necessary data. Our scorecard also distinguished two ways of looking at the selection of IT reporting data: key performance indicators (KPIs) are the information we want to know; performance measures (PMs) are the information we can actually get. KPIs represent the performance knowledge we want to have about an asset, process, or group, such as sales, customer satisfaction, and e-mail availability.
PMs are the reporting measures we can get our hands on - such as sales order totals, the number of survey respondents who liked the service, or the percentage of the time e-mail is available when users access it. Scorecards are not effective unless KPIs and PMs are nearly identical; major difficulties occur when their definitions diverge. To avoid this, we spent more time on data definition than on any other single facet of the scorecard. It is a process that never ends: as new systems, processes, or groups are brought into the scorecard, all definitions have to be reviewed and recertified.

Second, we clearly defined our target audiences, which include corporate and business unit management, senior and junior IT managers, and journeyman IT staff. We wanted one report for all of them. But to create one report, we had to address two very different perspectives: IT shops think of themselves as selling technology, but users buy service. This distinction is at the heart of much of what goes wrong in managing IT, especially in IT's relationship with business clients. More IT departments are starting to appreciate how important it is to address these two perspectives when reporting IT's value to the business (see the case study in Exhibit 1 below). Value is not a question of whether one perspective is more important than the other; reporting on both is necessary. What needs to differ is how each is reported.

At Booz Allen, our IT scorecard is multilayered; the top layers have relatively few items and are focused on service offerings - the services a user wants and is willing to pay for. Examples are collaboration and communication, telephones, order processing, and financial accounting. Business users find the highest levels of the scorecard most useful. High-level summaries with data that "drills down" into certain details are common in IT scorecards. What is unusual about ours is the nature of the layered approach.
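As a rough illustration - not our actual implementation - the layered, audience-specific structure can be sketched in a few lines of Python. The level assignments, audiences, and items below are hypothetical examples:

```python
# Illustrative sketch only: the levels, audiences, and items are
# hypothetical examples, not a real production scorecard schema.
from dataclasses import dataclass, field

@dataclass
class ScorecardLevel:
    level: int     # 1 = summary for business leaders; higher = more detail
    audience: str  # who this layer is designed for
    items: list = field(default_factory=list)

scorecard = [
    ScorecardLevel(1, "corporate and business unit management",
                   ["collaboration and communication", "telephones",
                    "order processing", "financial accounting"]),
    ScorecardLevel(2, "senior IT managers",
                   ["e-mail availability", "order-entry response time"]),
    ScorecardLevel(3, "IT managers and staff",
                   ["server uptime", "network latency", "ticket backlog"]),
]

def drill_down(from_level):
    """Return every item at or below the given level of detail."""
    return [item for lvl in scorecard if lvl.level >= from_level
            for item in lvl.items]
```

The point of the structure is that each audience starts at its own level and drills down only as far as it needs.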
Items at each level have been carefully selected for a particular audience. Indeed, creating a scorecard for multiple audiences with such diverse information needs meant we had to do detailed customer segmentation analysis to understand what key performance indicators users of the scorecard wanted to see monitored. Once the key performance indicators were understood, it was relatively easy to identify the performance measures. With a clear understanding of our audience, we were well positioned to identify the data various constituencies needed and develop the right detailed data definitions for our scorecard.

At our company, we knew that if a multiyear IT scorecard project was to succeed, we needed to:

- Keep the communication program about the scorecard project in high gear, and keep pressure on the participants to produce.
- Ensure that all senior IT managers were publicly enthusiastic about the project (even if they were skeptical in private).
- Introduce new functionality/features in the scorecard almost monthly. Each change can be small; the advantage of gradual improvement is that mistakes are smaller and easier to handle, so there is less risk of a spectacular collapse. Steady improvements can also reduce the project's dark days, when enthusiasm and support wane.
- Maintain users' support through formal communication and demonstrations, informal dog-and-pony shows, and hands-on interaction. We try to emphasize the value of the scorecard and manage expectations. As is always the case, it is better to underpromise and overdeliver.

To date, we are pleased with the results. Our IT scorecard provides, on a single page, top-level information about our basic service offerings plus financial, customer satisfaction, and human resources information, using a stoplight metaphor (green is good, yellow means caution, and red indicates there is a problem) - see Exhibit 4. The colors are all backed up with detailed data (numeric where possible).
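To make the stoplight idea concrete, here is a minimal sketch of how a numeric measure can be mapped to a color. The thresholds are hypothetical, not our actual service-level targets:

```python
# Illustrative sketch only: threshold values are hypothetical,
# not actual service-level targets.

def stoplight(value, target, caution):
    """Green if the measure meets its target, yellow if it falls in
    the caution band, red otherwise. Assumes higher is better."""
    if value >= target:
        return "green"
    if value >= caution:
        return "yellow"
    return "red"

# e.g. e-mail availability: target 99.9%, caution threshold 99.0%
print(stoplight(99.95, target=99.9, caution=99.0))  # green
print(stoplight(99.4,  target=99.9, caution=99.0))  # yellow
print(stoplight(97.0,  target=99.9, caution=99.0))  # red
```

The value of the metaphor is exactly this reduction: a business reader sees only the color, while the underlying numbers remain available for anyone who wants to drill down.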
Level Two reports on the same 14 areas but in greater detail - approximately 140 data points - using the same stoplight format. Together, the top two levels make up the organization's summary scorecard, which is sent to senior company management each month. From Level Three down through the lowest levels of the scorecard, the focus shifts to technology components. The data is numerical and is linked to numerical targets. At the lowest levels, the scorecard is organized for use by specific IT managers and can involve thousands of data points. All information is available to all IT staff. Critical to all levels and all data points is a set of definitions specifying what is being measured and what the service-level targets are.

Continuous Improvements

For us, the next steps are quite clear. We are tightly integrating our monthly scorecard with numerous daily and near-real-time scorecards, monitors, and event management systems to prevent dueling data. We plan to enhance the scorecard's ability to predict rather than just report on status; more sophisticated trend analysis means our databases will have to better capture changes over time so we can draw the proper predictive conclusions.

We will also change our internal structure (processes, skills, budgets, even our organization) to better reflect our service offerings rather than our technology components. This means adopting a matrix organizational structure, with our service-offering management structure orthogonal to the traditional IT management silos. Staff will need to develop the ability to report to two managers: the traditional technology manager and the new service-offering manager. And most important, the culture will need to change so that everyone takes to heart that it is the service-offering dimension, not the more comfortable technology dimension, that will drive our decisions and reflect our success or failure.
The adoption of the service-offering orientation of our scorecard is an early phase of this culture change. We recognize that if the IT scorecard is to be more than a simple report - that is, if it is to represent the core of who we are and what we do - then its refinement will be an ongoing process, and not just another short-term project. It is our goal to use the scorecard as the rallying point for improvements in IT service and beneficial change for our company as a whole.