Value Benchmarking Pays Off in Quest to Optimize Power Plant Performance

Issue 11, Volume 107.

by: Dale Probasco, Navigant Consulting Inc.

For electricity generators, benchmarking is a process that is once again coming into its own. Starting with nuclear operators, and now increasingly with fossil operators as well, the practice of comparing performance against a peer group is growing across the industry. More and more practitioners are rediscovering the value of these longstanding techniques.

But for many generators, questions still remain. What exactly is the value of benchmarking, and – more importantly – how can benchmarking be leveraged to increase value? A decade or more ago, early attempts at benchmarking in the electric business drew criticism for unfulfilled expectations. After benchmarking performance against a peer group or a similar industry norm, then what? In the early days, the question of what to do with such intelligence went unanswered all too often, and interest in these once-promising techniques waned in proportion.

The difference this time around is simply that the value proposition is clearer. In essence, benchmarking practitioners today are beginning to understand better how to use information as a guide to asset optimization. The chain of activity that Navigant Consulting has developed and practiced, called Value Benchmarking, illustrates this process.

Budget Allocation at the Plant

Maximizing the value of power plant assets is management’s key responsibility, at both the plant and fleet level. Yet the process by which this is done, the annual and periodic allocation of budgets and resources, has always suffered from a host of practical problems on the ground. How is value to be measured? How good are alternative forecasts of value creation? How are short-term versus life-cycle considerations to be traded off? How should intangible issues and business uncertainty be treated? Who might be gaming the system?

The fact is, every year, in every power plant in the world, a budget gets allocated and decisions get made. And every year, in every power plant in the world, those involved in the process come away frustrated and uncertain about how well they did, unclear about the impact on profitability and yearning once again for a better way.

One of the key problems with budget allocations is a lack of clarity about what is most important. Certainly in all power plants the overriding concern is to get online, stay online and produce profitable megawatt hours as the market dictates. However, it is a real challenge to meet this objective while at the same time improving reliability and operational flexibility, enhancing safety, preparing for changes to environmental requirements, and continuously reducing costs.

One way to discriminate to some degree among these competing objectives is to examine how peers are doing, and to what extent your own plant performance differs from the norm. This is where benchmarking has value, and where Navigant’s Generation Knowledge Service (GKS) and the concept of Value Benchmarking are making contributions.

A Four-Step Process

Value Benchmarking can be explained using the four-step model illustrated in Figure 1.

1. What is the relative position of my assets? – This is Part I of a standard benchmarking exercise, wherein performance is compared across key indicators against a carefully selected peer group, to identify potential problem areas. This can be a static comparison, or it could involve trend analysis. It is also an exercise that should be performed at regular intervals, to assess changes in performance, divergence from peers, deterioration, and the like. The GKS system makes such dynamic analysis particularly convenient.

2. What is the size of the gap? – This is Part II of standard benchmarking. After the key problem areas have been identified, the extent of the problem must be quantified. Quantification takes on a key importance in this approach, because it helps to determine the resources needed to effect change in subsequent steps and determine the economic viability of taking action.

3. What are the alternatives for closing the gap? – This is where traditional benchmarking transitions into the first step of asset optimization. In this activity, creativity and expert judgment – and knowledge, such as that generated through benchmarking – are brought to bear on the problem, and solutions are proposed and evaluated on technical merit.

4. What makes economic sense? – The final step, after screening for technical effectiveness, is to evaluate the economics of the project, or what return the investment will generate. This step will result in the prioritization of proposed projects and the allocation of resources required to address the key problem areas identified at the outset.
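As a rough illustration of steps 1 and 2, the following Python sketch positions a unit against a peer group and sizes its cost gap. All of the numbers – peer costs, the unit's cost, annual generation, and the top-quartile target – are hypothetical assumptions for illustration, not GKS data.

```python
# Illustrative sketch: sizing a unit's benchmarking gap against a peer group.
# All figures below are hypothetical; a real analysis would draw on a
# benchmarking database such as GKS.

def percentile(sorted_vals, p):
    """Linear-interpolation percentile of a pre-sorted list (0-100 scale)."""
    k = (len(sorted_vals) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

# Hypothetical peer-group boiler maintenance costs ($/MWh)
peers = sorted([1.9, 2.1, 2.3, 2.4, 2.6, 2.8, 3.0, 3.1])
unit_cost = 3.8          # this unit's cost, $/MWh (hypothetical)
annual_mwh = 2_500_000   # hypothetical annual generation

top_quartile = percentile(peers, 25)        # a "best in class" target
gap_per_mwh = unit_cost - top_quartile
annual_gap_dollars = gap_per_mwh * annual_mwh

print(f"Gap vs. top quartile: ${gap_per_mwh:.2f}/MWh "
      f"(~${annual_gap_dollars:,.0f}/yr)")
```

Dollarizing the gap in this way is what makes steps 3 and 4 possible: it bounds how much spending on remediation could plausibly be justified.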

In this formulation, benchmarking as typically understood becomes the front end to an asset optimization process, and the knowledge gleaned from the benchmarking exercise helps to guide selection and prioritization of potential remedial actions. The key is to effectively and systematically use the knowledge gained to drive decisions. How exactly is this done?

Benchmarking Insights

The following example illustrates some of the processes by which benchmarking intelligence can be used to better guide asset optimization decisions. This example relies on the industry data and processing facilities of the GKS system, but the principles are universal.

Cost control has always been a key element in the performance of coal-fired generators, as has availability. To be part of the generating mix a unit must keep its costs low, in particular lower than other nearby alternatives. At the same time, being available to meet market demand is imperative, because earnings are dependent on selling megawatt-hours. For a unit such as the one pictured in the benchmark comparison of Figure 2, both cost and reliability measures could stand improvement in comparison with the peer group.

In the first instance it is important for an operator to know how the industry leaders are performing, and a high-level benchmarking exercise can illuminate this. As shown in Figure 2, the best group of plants – as defined by low operating cost and high availability – can be highlighted using a good benchmarking database, even without specific identification.

Once the high-level comparison is established, benchmarking practice often provides a strategy for zeroing in on the key areas driving this unit’s unacceptable performance. As Figures 3a and 3b illustrate, this unit is not a particular outlier with regard to operating costs, but its maintenance costs are notably higher than those of its peers. Figure 4 then steps the analysis down another level, illustrating that boiler maintenance cost is a chief contributor. Further, Figure 5 shows clearly that the boiler is also the major contributor to the high forced outage statistics. Thus, concentrating on the boiler would appear to be the way to fix both cost and availability concerns.

In this situation management is undoubtedly already aware that both boiler maintenance costs and forced outages are high, but it may not be as clear on just how high, nor on how much better peer units perform. Another value of benchmarking is that it can shift management to a new paradigm: that a better, less costly way is possible, since others are demonstrably achieving a higher level of performance. The series of benchmarking comparisons presented above demonstrates how such an analysis can help pinpoint the specifics.

Once the key problem areas have been isolated sufficiently, the core work of asset optimization can begin. In many cases the problem can be clearly identified, but the appropriate technical fix is still uncertain – both in how well it will work and in how much value it will create. First, the degree of improvement in key performance indicators (e.g., availability, cost reduction) from a given project can only be estimated ex ante, and is almost always uncertain. Second, the dollar worth of any given improvement is highly dependent on market conditions and the duty cycle of the unit, and is usually even more difficult to pin down with certainty.

Several mitigation alternatives might be proposed by the engineering staff, such as a control system upgrade (to help with heat rate and cost control) and a boiler wall upgrade (to help with tube failures and forced outage rate). Each can be evaluated using economic and cash flow analyses, which consider both the technical effects of the project and the market value of such a performance upgrade.
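One common form of such an economic screen is a discounted cash flow comparison. The sketch below ranks candidate projects by net present value and by profitability index (NPV per dollar of capital), which matters when a budget constraint forces choices. The project names echo the examples above, but every capital cost, annual benefit, project life, and the discount rate are hypothetical assumptions.

```python
# Illustrative sketch: ranking candidate projects by NPV and by NPV per
# dollar of capital (profitability index). All project figures and the
# 10% discount rate are hypothetical.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

projects = {
    # name: (capital outlay $, annual benefit $, life in years) - hypothetical
    "control system upgrade": (1_200_000, 350_000, 10),
    "boiler wall upgrade":    (3_000_000, 900_000, 12),
    "coal yard automation":   (800_000,   150_000, 8),
}

rate = 0.10
ranked = []
for name, (capex, benefit, life) in projects.items():
    flows = [-capex] + [benefit] * life
    value = npv(rate, flows)
    ranked.append((value / capex, value, name))  # profitability index first

for index, value, name in sorted(ranked, reverse=True):
    print(f"{name:25s} NPV ${value:>12,.0f}  PI {index:.2f}")
```

Under these assumed figures the boiler wall upgrade ranks first on both measures, which is consistent with the boiler having been isolated as the chief cost and outage driver.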

Benchmark analysis can also be used to good effect in this process. One universal difficulty in resource allocation is the need to reliably predict ex ante the expected value impact of a given project. This is difficult enough for the individual project sponsor, and often more so for management charged with choosing among multiple projects being put forward. Such value impacts frequently contain several drivers of uncertainty, making the choice of the optimal project a challenge. A benchmark assessment often helps bound the relevant uncertainties by identifying the ranges of key drivers – in cost reduction, heat rate improvement or availability improvement – achieved by a peer group. State-of-the-art, user-friendly stochastic models can then be employed to optimize the decision process. Typically these models identify the top 8-10 drivers of value and the financial impact of each on the overall metric. The key benefit of a probabilistic asset optimization model is the ability to see the difference between the predicted deterministic value and the expected value. Relative project value can be assessed on a plant basis or on an overall fleet basis to optimize the value metric at the desired level.
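The gap between a deterministic point estimate and a probabilistic expected value can be made concrete with a small Monte Carlo sketch. Whenever the payoff is nonlinear in an uncertain driver – here, a hypothetical cap on usable availability gains – plugging in the single best-guess estimate overstates (or understates) the project's value. The distributions, the cap, and all dollar figures below are hypothetical assumptions; in practice the sampled ranges would be bounded by peer benchmark data.

```python
import random

# Illustrative sketch: deterministic vs. expected project value under
# uncertainty. All distributions and figures are hypothetical; benchmark
# data would inform the ranges (e.g., availability gains achieved by peers).

random.seed(42)

MARGIN = 25.0           # $/MWh gross margin when the unit runs (hypothetical)
CAPACITY_MWH = 40_000   # MWh recovered per availability point (hypothetical)

def project_value(availability_gain_pts, margin):
    # Nonlinear payoff: gains beyond 3 points yield nothing, reflecting a
    # hypothetical dispatch or transmission constraint.
    effective = min(availability_gain_pts, 3.0)
    return effective * CAPACITY_MWH * margin

# Deterministic view: plug in the single best-guess estimate.
deterministic = project_value(2.5, MARGIN)

# Probabilistic view: sample the uncertain drivers over peer-informed ranges.
samples = []
for _ in range(100_000):
    gain = random.triangular(1.0, 4.0, 2.5)   # low, high, mode
    margin = random.gauss(MARGIN, 5.0)
    samples.append(project_value(gain, margin))

expected = sum(samples) / len(samples)
print(f"Deterministic value: ${deterministic:,.0f}")
print(f"Expected value:      ${expected:,.0f}")
```

In this sketch the expected value comes in below the deterministic estimate, because the cap truncates the upside of the availability distribution while the downside remains. That is precisely the kind of difference a probabilistic model surfaces and a point estimate hides.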

The essential asset optimization questions concern value creation. What is the expected economic, risk mitigation, or customer value of these projects? Further, what is the expected performance improvement of each project and how does that reposition the asset in comparison to the peer group? Once a good assessment of project impacts has been developed, the benchmarking exercise can be repeated regularly into the future to gauge competitive improvement.

Getting the Priorities Right

At the end of the day, there will always be real trade-offs in the asset optimization process — assessing opportunities, identifying remediation options, evaluating multiple dimensions and managing risk and uncertainties. It is the role of management to use the information available to make the best decisions possible. This is where benchmarking can provide real value to management — namely better asset optimization and decision quality through improved information and knowledge.

To date, benchmark information and the benchmarking process have been underutilized in this regard – partly because of a historically underdeveloped capability, partly because of a history of missteps, and partly because of a failure of imagination. This is beginning to change, and the timing could not be better.


Dale Probasco is a Director with Navigant Consulting, Inc. He has more than 25 years of experience in managerial and consulting positions in the energy industry, including management positions with Bechtel, Toledo Edison and Utah Power. Probasco is currently leading the Generation Services practice, with responsibility for NCI’s plant operational performance decision tool, the Generation Knowledge Service (GKS). He holds a BS in Business Management from Utah State University and has completed several postgraduate courses.