From NOX box to diverse problem solver
By Rob James and Peter Spinney, NeuCo
Artificial intelligence- (AI-) based optimization software has been used to improve fossil steam power plant boiler operations for more than 15 years. The technology has come a long way since the early advisory systems, which could take more than a year to implement and were difficult to maintain. The industry now uses third-generation closed-loop applications that cover the entire boiler, leverage hybrid technologies, provide enhanced transparency and benchmarking, and allow for custom application extensions. This article looks at how the technology is evolving and at some of the power producers driving the transformation.
Power plant boiler optimization initially focused on reducing nitrogen oxide (NOX) emissions produced during the combustion process in fossil-fired generating units, a complex, highly variable process that is difficult to predict and control. Adoption was driven by the U.S. Environmental Protection Agency’s 1990 Clean Air Act Amendments, which created financial incentives to reduce NOX by imposing an emissions cap along with a market for trading NOX emission allowances.
Combustion optimization systems optimize the distribution of fuel and air in power plant furnaces by biasing control system settings to those that best meet a set of objectives and constraints. To determine the optimal biases, real-time and historical plant data are used to model relationships between variables that impact combustion quality, such as damper, burner tilt and pulverizer settings, and optimization objectives, such as reducing NOX emissions, improving boiler efficiency and controlling carbon monoxide. The process of determining the optimal biases and adjusting them accordingly is continuous and occurs in closed loop; that is, without the need for operator action.
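The bias-selection loop described above can be sketched in simplified form. This is a minimal illustration, not NeuCo's actual algorithm or API: the surrogate model, variable names and coefficients below are invented stand-ins for a trained steady-state model and real plant constraints.

```python
# Hypothetical sketch: pick damper/tilt biases that minimize a weighted
# objective (NOx plus efficiency loss) predicted by a surrogate model,
# subject to a hard CO constraint. All numbers are illustrative.
import itertools

def predicted_outcomes(biases):
    # Stand-in for a trained steady-state model (e.g., a neural network)
    # mapping control biases -> (NOx lb/MMBtu, CO ppm, efficiency loss %).
    damper, tilt = biases
    nox = 0.30 + 0.02 * damper - 0.01 * tilt
    co = 80 + 15 * max(0.0, -damper) + 5 * abs(tilt)
    eff_loss = 0.5 + 0.1 * abs(damper) + 0.05 * abs(tilt)
    return nox, co, eff_loss

def best_bias(candidates, co_limit=200.0, w_nox=1.0, w_eff=0.5):
    # Evaluate each candidate bias pair against the model; keep only those
    # that satisfy the CO constraint, then return the lowest-cost one.
    feasible = []
    for b in candidates:
        nox, co, eff_loss = predicted_outcomes(b)
        if co <= co_limit:
            feasible.append((w_nox * nox + w_eff * eff_loss, b))
    return min(feasible)[1] if feasible else None

# Grid of candidate biases around the current setpoints
grid = list(itertools.product([-1.0, -0.5, 0.0, 0.5, 1.0], repeat=2))
print(best_bias(grid))
```

In a real system the candidate search is continuous and the model is retrained from plant data, but the structure, a predictive model inside a constrained search, is the same.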
Changing Industry, Changing Goals
Today, boiler optimization is about more than NOX. Goals are varied and the systems are asked to address a more diverse problem set. This change is due to several factors:
- NOX Regulation Hiatus: In the U.S., the first wave of NOX regulations has passed and the final form of the EPA Transport Rule (designed to replace the Clean Air Interstate Rule) is still uncertain. NOX reduction still matters, but requirements are more region- and plant-specific and actions tend to be driven by the need to avoid litigation and fines.
- Strong Efficiency Push: Producing more electricity with less fuel is not only a way to save on fuel costs but also the most cost-effective way to reduce carbon dioxide (CO2) output per megawatt-hour. CO2 pressures are expected to increase with the Boiler MACT (Maximum Achievable Control Technology) rule released by the EPA for comment on March 16, 2011, which requires generators to demonstrate “best practices for combustion” for compliance on covered hazardous air pollutants such as dioxins and furans.
- New Operating Complexities: A host of new operating challenges and opportunities are coming to the forefront. One example is the increased demand on base load fossil plants to ramp up and down frequently and run at lower loads due to the addition of intermittent, renewable generation sources on the grid. Another challenge stems from the addition of NOX control hardware; now that plants are running selective and non-selective catalytic reduction systems year round, they are looking for ways to reduce their operating costs and minimize negative impacts on efficiency and availability. Adopting new instrumentation and controls also provides more opportunities to optimize. Finally, the MACT regulations for mercury are going to require sorbent injection for many units, adding further complexity to the combustion process.
- Maturing Market: The technology has now been in the hands of end users for a long time and they realize it is capable of much more than reducing NOX. Even projects that start with a NOX focus often morph to address changing objectives and emerging operational issues. For instance, Deseret’s Bonanza power plant in Utah started using combustion optimization in 2004 to reduce NOX emissions. But following a series of mechanical changes that made NOX less problematic, the optimizer’s priority shifted to improving efficiency and lowering carbon monoxide. More recently the focus has been on balancing oxygen (O2) across their open windbox boiler in order to improve overall operating balance and consistency. Once users see how the technology works, they realize the best way to achieve a bottom line objective, such as heat rate, is often to target their chronic boiler-specific problems.
The definition of boiler optimization has broadened. What was once synonymous with furnace fuel and air mixing now refers to integrated optimization of the combustion and sootblowing processes and includes the furnace and backpass regions of the boiler. Because the combustion and heat transfer processes are highly interrelated, independently optimizing them without knowledge of what the other is doing leaves benefits on the table and, in some cases, causes conflicts. For instance, boiler cleanliness significantly impacts firing intensity and tilt position, and combustion stoichiometries and temperatures affect ash build up, fouling and slag formation.
Another result of the maturing optimization market is the increase in customer-driven, value-add applications that extend beyond a typical boiler optimization project. APS’ Four Corners station started using combustion optimization in 1999 and later expanded its optimization program to include total boiler optimization (including sootblowing) as well as performance and equipment anomaly alerting at all five units. Now the focus is on broadening the solution to deal with unit-specific operational issues, such as an application to minimize ball tube mill pulverizer re-peaking to eliminate the need to supplement firing with natural gas. Other objectives include helping to maintain optimal coal inventory for fineness control and load response and optimizing ball charge. The project entails using neural models and an optimizer to bias pulverizer kilowatt peak to maintain pulverizer decibels (dB) within prescribed limits. Meanwhile, heuristic rules use the current kilowatt peak bias to determine ball level due to wear and suggest ball recharge.
APS and Xcel Energy are also incorporating a “slagging index” into their optimization profiles to capture heat transfer patterns that indicate the onset of slagging so appropriate sootblowing actions can be taken.
Another opportunity stems from the introduction of new instrumentation options. For instance, DTE Energy’s Belle River Unit 2 reported a 1.7 percent heat rate improvement and 20 percent NOX reduction when they integrated their combustion optimizer with a ZoloBOSS laser-based combustion sensor designed for the ultra-harsh combustion zone. Meanwhile, TMPA Gibbons Creek is integrating optimization with their Foster Wheeler SOFA system that uses advanced instrumentation to measure coal flow in each conduit and optimize air flow to each fuel nozzle. They seek to address a broader range of control variables and improve upon NOX, CO, temperatures and O2 balance.
Hybrid Technology Approaches
The biggest enabler of the changing boiler optimization landscape is the broad range of optimization, modeling and advanced control technologies that are now being deployed.
There used to be two distinct technologies for optimizing combustion: artificial neural networks and model predictive control (MPC). Neural networks are nonlinear, multivariable steady-state models that are used to identify the best combinations of variables under varying conditions. MPC uses dynamic models to predict future changes and anticipate the effects of disturbances and future moves. Both approaches have benefits and limitations.
CPS Energy’s Deely Plant implemented a hybrid neural network and model predictive control optimization system.
Today, these technologies and others are being combined. In one combustion optimization approach, neural models are used for the manipulated variables that can be adjusted over time to balance unit operations, while MPC is used for the major gross-air controls that must respond quickly to plant conditions. One of the first to apply this hybrid system was CPS Energy’s Deely Plant in San Antonio, Texas, where the combination of technologies resulted in improved steam temperature control and attemperation sprays compared to a neural-only approach.
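The division of labor described above, with steady-state models trimming slow biases while MPC handles fast-responding controls, can be illustrated with a toy one-variable MPC loop. The first-order plant model, gains and candidate moves below are purely illustrative assumptions, not the Deely Plant implementation.

```python
# Toy MPC sketch for a fast variable such as excess O2: simulate each
# candidate control move over a short horizon with a simple dynamic model
# and pick the move with the lowest predicted tracking cost.
# Model structure and all coefficients are made up for illustration.

def mpc_step(o2_now, o2_target, u_now, horizon=5,
             moves=(-0.2, -0.1, 0.0, 0.1, 0.2)):
    gain = 1.5  # assumed steady-state gain from air demand u to O2

    def predicted_cost(du):
        # First-order response: o2[k+1] = 0.8*o2[k] + 0.2*(gain * u)
        u, o2, cost = u_now + du, o2_now, 0.0
        for _ in range(horizon):
            o2 = 0.8 * o2 + 0.2 * gain * u
            cost += (o2 - o2_target) ** 2
        return cost

    return u_now + min(moves, key=predicted_cost)

# Starting above target, the controller cuts air demand
print(mpc_step(o2_now=3.5, o2_target=3.0, u_now=2.0))
```

A production MPC also penalizes move size and handles multiple coupled variables, but the predict-then-choose structure is the essential idea.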
Rule-based optimization is also playing a key role. Rooted in the expert system side of AI, heuristic models capture and codify human expertise. These rules can be systematically applied by an inference engine, which provides a way to rank a set of possible actions given a set of conditions. Expert rules can also be used to address discrete changes that have long challenged traditional optimization technology, such as automatically swapping in the most recently trained models for a particular combination of pulverizers in-service, or switching between neural and direct search-based optimization schemes.
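A minimal inference-engine sketch shows how such rules can rank actions given conditions. The rule contents, state fields and action names below are invented for illustration; they are not NeuCo's rule set.

```python
# Illustrative rule-based inference: expert rules fire on the current
# plant state and the resulting actions are ranked by priority.
# All rules, fields and priorities here are hypothetical.

def rules():
    # Each rule: (condition predicate, action name, priority)
    return [
        (lambda s: s["mills_in_service"] != s["last_trained_mills"],
         "swap_in_models_for_mill_combo", 10),
        (lambda s: s["process_steady"],
         "use_neural_optimization", 5),
        (lambda s: not s["process_steady"],
         "use_direct_search", 5),
    ]

def infer(state):
    # Collect actions whose conditions fire, highest priority first.
    fired = [(prio, action) for cond, action, prio in rules() if cond(state)]
    return [action for prio, action in sorted(fired, reverse=True)]

state = {"mills_in_service": ("A", "B", "D"),
         "last_trained_mills": ("A", "B", "C"),
         "process_steady": True}
print(infer(state))
```

Here the model-swap rule outranks the choice of optimization scheme, which is exactly the kind of discrete decision the text describes.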
This increased sophistication enables knowledge to be extracted from the operations personnel who have intimate knowledge of their specific units and encoded in software so it is permanently available at units, plants and corporate performance centers on a widely distributed basis. Rules can be seamlessly integrated with neural networks, MPC, first-principles equations and other methods to form more advanced hybrids and applied in the best combination to each problem.
Figure 1 shows an example of the hybrid technology approach to boiler optimization deployed at NRG’s Limestone plant in Texas.
Early optimizers were essentially “black box” systems that took actions in closed loop with little human interaction or understanding. As the industry evolved, the importance of the user-AI interface has risen to the forefront. People need optimizers to do more than complete tasks. They need them to provide insight into the optimization problem.
Optimizers work on difficult problems and often make trade-offs to balance competing goals under tight constraints. Just where those trade-offs lie is often a mystery, even to the most seasoned operator. Today’s graphical user interfaces (GUIs) and analysis tools provide insight into exactly what the optimizer has been told to do and how the process is responding. The ability to expose this in an intuitive way, so that the user can focus only on the key principles involved without being distracted by complex software and theory, is important to system success. This transparency improves optimizer performance, situational awareness and dialogue within the user organization and with the vendor support team.
Although benchmarking is challenging in the noisy multivariate situations that define boiler processes, optimization requires it to establish a real sense of how well things are going. Emerging optimization measurement tools are now capable of spanning causes, disturbances and objectives to form a coherent understanding of complex process change. Users can quickly select, filter, clean and view data for any number of manipulated, disturbance or objective variables, using any combination of boolean conditions, value ranges or time contexts. The results can be compared side-by-side to investigate patterns of change and test hypotheses about cause and effect with relative ease.
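The select-and-filter workflow just described can be sketched with a simple query over historian-style records. The field names and values below are hypothetical, chosen only to show how boolean conditions and value ranges combine to isolate comparable operating periods.

```python
# Illustrative benchmarking query: filter process records with arbitrary
# per-field predicates, then compute a statistic for the matching period.
# Records and field names are invented for this example.

records = [
    {"load_mw": 520, "o2_pct": 3.1, "nox": 0.28, "optimizer_on": True},
    {"load_mw": 515, "o2_pct": 3.4, "nox": 0.33, "optimizer_on": False},
    {"load_mw": 300, "o2_pct": 4.0, "nox": 0.25, "optimizer_on": True},
]

def select(rows, **conditions):
    # conditions map a field name to a predicate,
    # e.g. load_mw=lambda v: v > 400
    return [r for r in rows
            if all(pred(r[field]) for field, pred in conditions.items())]

# High-load hours with the optimizer in service
high_load_on = select(records, load_mw=lambda v: v > 400,
                      optimizer_on=lambda v: v)
avg_nox = sum(r["nox"] for r in high_load_on) / len(high_load_on)
print(round(avg_nox, 3))
```

Real tools add time contexts, data cleaning and side-by-side visualization, but the core operation is this kind of conditional slice through the data.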
The market is also driving improved heat rate measurements within boiler optimization systems. For optimization it is less important to know the exact heat rate at any time than it is to understand the relative change. If the proxy for relative changes in fuel efficiency is directionally valid, repeatable, properly-scaled and sensitive to one’s actions, it can be used for optimization. One example is using a summation of the different losses that are directly affected by boiler optimization into a single index of heat rate loss. The index can then be used to directly target heat rate by including a goal (residual) in the optimization profile, or to view and track improvements in heat rate.
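The loss-summation idea can be made concrete with a small sketch. The loss components and numbers below are illustrative placeholders, not a specific plant's loss accounting.

```python
# Sketch of a heat rate loss index: sum the boiler losses that combustion
# and sootblowing actions actually influence into one relative metric.
# Component choices and values are hypothetical.

def heat_rate_loss_index(dry_gas_loss, unburned_carbon_loss,
                         co_loss, spray_loss):
    """All inputs expressed as % of fuel heat input. The absolute level
    matters less than whether actions move the index consistently."""
    return dry_gas_loss + unburned_carbon_loss + co_loss + spray_loss

baseline = heat_rate_loss_index(4.8, 0.6, 0.1, 0.3)
after = heat_rate_loss_index(4.5, 0.55, 0.12, 0.25)
print(f"relative improvement: {baseline - after:.2f} points")
```

As the text notes, such an index need not match the plant's absolute heat rate; it only has to be directionally valid, repeatable and sensitive to the optimizer's actions.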
Key performance indicator- (KPI-) based reporting is also being used to track benefit and change. By showing the optimization activity and generating unit KPIs over time, it is much easier to identify emerging issues, focus on areas to improve, maintain high system utilization, sponsor dialogue within and between organizations and understand the value being delivered. One example is Deseret Bonanza’s use of shift-specific operator reports, which use optimizer insights to provide operations personnel with timely knowledge about how their decisions impact plant performance and how their shift compares to other shifts, weekly and long-term performance.
While boiler optimization systems have come a long way, the journey is by no means complete. Optimization systems, like the industries they are applied to, will continue to evolve. The systems will continue to improve to meet the demands of a maturing and dynamic market.
Authors: Rob James is product manager, responsible for NeuCo’s Boiler Optimization products. He has been with the company since its inception and has 15 years of experience in applying neural network and other artificial intelligence technologies to industrial boiler processes. Peter Spinney, director of Market and Technology Assessment, has been at NeuCo since its inception. His background includes more than 25 years of combined electric power generation, economics consulting and government agency experience. He is also the chief blogger for www.theOptimizationBlog.com.