EPRI and vendors working to improve sensor technology

Volume 99, Issue 3.

By Joe Weiss and Bob Iveson, Electric Power Research Institute

An EPRI study indicates that current sensor technology has reached its technical limits

The regulated electric utility industry is a basic enabling industry for the U.S. economy, and it traditionally has been conservatively operated. However, with deregulation of wholesale generation, increasing transmission access and the Clean Air Act, utilities can no longer afford to operate conservatively.

The available sensor technology for many utility industry applications (pressure, level, flow, combustion, chemistry, and current and voltage sensing) has been reviewed and found lacking. The development of new instrumentation and control (I&C) technologies is a major initiative that would make generation and power delivery more competitive and reduce the cost of electricity. Advanced sensor and control technology can reduce the cost of electricity, with ultimate benefits to the U.S. economy of up to $11 billion per year.

Developing new I&C hardware carries high technical risk and is a costly endeavor that utilities, traditional I&C equipment suppliers and the Electric Power Research Institute (EPRI) cannot afford alone. EPRI is a non-profit R&D institution founded in 1973, funded through voluntary payments from member U.S. utilities that represent 70 percent of retail kilowatt-hour sales in the U.S.

Currently, the regulated electric power utilities act as quasi-monopolies, with rates of return limited by the state public utility commissions (PUCs). The primary benefits of advanced instrumentation and plant distributed control systems (DCSs) have been perceived as improved plant efficiency and lower fuel costs. However, in many states, fuel costs are passed through to the ratepayer via fuel adjustment clauses. Because there has been no incentive to reduce fuel costs, only 30 percent to 40 percent of the fossil plants in this country have implemented DCS upgrades.

Current industry status

The average age of U.S. fossil plants is 25 years, while the nuclear plants have an average age of 15 years. Almost all of these plants are still using their original I&C systems. As a result, the plants are experiencing increased maintenance and generation costs. From 1988 to 1991, the fuel component of the overall cost of generating electricity decreased by approximately 5 percent, while the maintenance component increased by approximately 6 percent, resulting in a net 1 percent increase in the overall cost of generating electricity.
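
The net effect of that cost shift can be checked with simple arithmetic. In the sketch below, the component changes are read, as in the article, as fractions of the overall cost to generate electricity; the dollar base value is a hypothetical assumption used only for illustration.

```python
# Illustrative sketch of the 1988-1991 cost shift described above.
# The base generation cost is a hypothetical value; only the
# percentage changes come from the article.

base_cost = 50.0                        # hypothetical overall cost, $/MWh
fuel_change = -0.05 * base_cost         # fuel component fell ~5% of total
maintenance_change = +0.06 * base_cost  # maintenance rose ~6% of total

new_cost = base_cost + fuel_change + maintenance_change
net_change_pct = 100.0 * (new_cost - base_cost) / base_cost
print(f"net change in overall generation cost: {net_change_pct:+.1f}%")
# prints: net change in overall generation cost: +1.0%
```

Because both changes are expressed against the same overall-cost base, the net result is independent of the assumed base value.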

Depending upon the plant's vintage, each fossil or nuclear power plant may have up to 5,000 process sensors/gauges located throughout the plant, monitoring the various operating systems and processes. Approximately 1,000 sensors measuring pressure, temperature, level and flow are used to control the plant, while the remainder are used for information.

Improved I&C is a cost-effective way to achieve the increased efficiency and productivity necessary to keep power plants competitive. Since I&C technology is constantly being improved, it is possible to take advantage of these advancements as part of normal equipment replacement.

Generation needs

Many fossil plants have changed from baseload to cycling operation. Cycling affects the way plants must be controlled to generate power optimally and to optimize the life of key components. The faster a plant can change load, the less fuel is burned; however, faster load changes place greater thermal stresses (fatigue) on key components such as boiler tube headers and turbine rotors.

Fuel costs change frequently, so decisions on plant thermal performance also change with time. Emissions control has further complicated plant operation: optimizing heat rate can increase and/or decrease certain emissions. These competing constraints make economic operation of fossil plants extremely demanding, requiring advanced control system hardware, advanced algorithms with new automated functions, advanced instrumentation, improved final control elements and true open system architecture.

The basic sensing cells used for process instrumentation, including pressure, level, flow and temperature, date from the 1950s and 1960s. The only changes since then have been the addition of microprocessors and diagnostics.

Plant DCSs are being implemented to reduce plant operating costs and provide more plant flexibility. The drawbacks of current sensors, which were masked by the shortcomings of the analog and pneumatic control systems, are now part of the "critical path" for control and operation of the plant. This leads to efficiency losses and high maintenance costs. Statistics from nuclear power plants have identified process instrumentation as contributing approximately 12 percent of lost production time. Drift in pressure and differential pressure (dp) sensors is often on the order of 1 percent to 2 percent over a 12- to 24-month period.

A main steam flow measurement that reads 1 percent low can result in a 1 percent increase in plant heat rate (fuel costs), while a reading 1 percent high can cause a 1 percent reduction in unit net load. Data from more than 30 power plants indicate that main steam flow measurement inaccuracies are closer to 3 percent to 5 percent.
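
To put those sensitivities in perspective, the sketch below applies the quoted 1 percent error to a single unit. The unit size, capacity factor and fuel cost are hypothetical assumptions chosen only to make the effect concrete; the 1-percent sensitivities come from the article.

```python
# Hedged sketch of the main steam flow error sensitivities quoted above:
# a 1% low reading -> ~1% heat rate (fuel cost) penalty; a 1% high
# reading -> ~1% net load reduction. Unit parameters are hypothetical.

unit_mw = 500.0           # hypothetical unit net output, MW
capacity_factor = 0.7     # hypothetical
fuel_cost_per_mwh = 15.0  # hypothetical fuel cost, $/MWh

annual_mwh = unit_mw * capacity_factor * 8760.0
annual_fuel_cost = annual_mwh * fuel_cost_per_mwh

flow_error = 0.01  # 1% measurement error
heat_rate_penalty = flow_error * annual_fuel_cost  # low reading case
lost_generation_mwh = flow_error * annual_mwh      # high reading case

print(f"annual fuel penalty from a 1% low reading: ${heat_rate_penalty:,.0f}")
print(f"annual generation lost to a 1% high reading: {lost_generation_mwh:,.0f} MWh")
```

With these assumed parameters, a single percent of flow measurement error is worth several hundred thousand dollars per unit per year, which is why the 3 to 5 percent inaccuracies observed in practice matter.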

Failed boiler tubes are the leading cause of fossil plant boiler outages. According to the North American Electric Reliability Council, coal-fired units, 200 MW or larger, have experienced more than 15,000 boiler tube failures during the past six years. The failures amount to an average equivalent availability loss of 3.3 percent annually, which is approximately $825 million per year.
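
Using only the figures quoted above, the average cost per tube failure can be backed out directly: roughly 15,000 failures over six years against an annual cost of approximately $825 million.

```python
# Consistency sketch using only the article's own figures: the implied
# average cost per boiler tube failure in large coal-fired units.

failures_total = 15_000   # failures over the six-year period
years = 6
annual_cost = 825e6       # $/year, from the article

failures_per_year = failures_total / years
avg_cost_per_failure = annual_cost / failures_per_year
print(f"~{failures_per_year:.0f} failures/yr, "
      f"~${avg_cost_per_failure:,.0f} average cost per failure")
# prints: ~2500 failures/yr, ~$330,000 average cost per failure
```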

Past attempts to address this problem have been based on estimating the tube header stress by monitoring the fluid-metal temperature difference. Although this is a simple way to estimate the tube header stress, the accuracy is dependent on a tuned analytical model.
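
The indirect estimate described above can be sketched with the standard constrained-surface relation, sigma = E * alpha * delta-T / (1 - nu). The material constants below are representative values for a low-alloy steel, assumed here for illustration; a real plant model would be tuned to the specific component, which is exactly the dependency the article notes.

```python
# Minimal sketch of header stress inferred from the fluid-metal
# temperature difference. Material constants are assumed values
# representative of a low-alloy steel, for illustration only.

E = 190e9      # Young's modulus, Pa (assumed)
alpha = 13e-6  # thermal expansion coefficient, 1/K (assumed)
nu = 0.3       # Poisson's ratio (assumed)

def header_thermal_stress(fluid_temp_c: float, metal_temp_c: float) -> float:
    """Estimated surface thermal stress (Pa) from the fluid-metal delta-T."""
    delta_t = fluid_temp_c - metal_temp_c
    return E * alpha * delta_t / (1.0 - nu)

# Example: a 30 C fluid-metal mismatch during a fast load ramp.
stress_mpa = header_thermal_stress(540.0, 510.0) / 1e6
print(f"estimated thermal stress: {stress_mpa:.0f} MPa")
# prints: estimated thermal stress: 106 MPa
```

Even a modest temperature mismatch produces stresses on the order of 100 MPa, which illustrates why fatigue accumulates quickly during cycling operation.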

A more direct and accurate method for determining boiler stresses is to install strain sensors at critical locations. Unfortunately, conventional strain gauges have not proven reliable at temperatures above 1,000 F. In addition, special-application strain gauges are extremely expensive and have not demonstrated long-term reliability.

Electric utility experiences

Testing at Pacific Gas & Electric's Moss Landing oil/gas-fired power plant indicates that a 1 percent temperature measurement error leads to a 3.7 percent decrease in high-pressure turbine efficiency if the sensor reads high, and a loss of component life if the sensor reads low. Existing temperature sensors are at best 99 percent accurate at temperatures above 1,000 F.

Maintenance records at Arizona Public Service Company's Palo Verde nuclear power plant indicate that scheduled surveillance and testing for one Palo Verde unit during the 18-month fuel cycle and refueling outage consumed 26,000 man-hours.

Maintenance records at Tennessee Valley Authority's coal-fired fossil plants indicate that maintaining EPA-required continuous emission monitoring systems consumes 1,000 man-hours per system, per year. Typical maintenance labor rates are $50 per man-hour.

Electric utilities have many sensor needs, including:

- Measuring pulverized coal flow to individual burners for burner balancing, thus improving plant efficiency and reducing emissions.

- Measuring and controlling NOx production at the individual burners.

- Noninvasive technologies for measuring strain in a robust and reliable manner.

- Drift-free sensors that do not require frequent, costly calibrations to maintain required accuracy.

New technologies and the future

Sensors in use today, such as pressure and dp (level and flow) sensors, temperature sensors (thermocouples and RTDs), high-temperature strain sensors, continuous emission monitors, nuclear plant area radiation monitors and nuclear plant hydrogen/oxygen sensors, are expensive to calibrate.

EPRI studies have shown that present sensor technology has reached its technical limits. As an example, pressure transmitters may be able to increase accuracy by a small amount, but cannot significantly reduce frequent calibration requirements or eliminate fundamental failure modes. In order to significantly improve process sensing, new technology is required that does not have the inherent limitations of current designs.

Fiber optic sensing technology has the capability to perform distributed sensing. With filtering or output processing, the entire length of an optical fiber can act as a sensor, providing a profile measurement rather than a point measurement.

Additionally, optical fibers are inherently sensitive to pressure, temperature and strain, which means a single fiber can simultaneously act as a pressure, temperature and strain sensor. Chemical constituents such as CO, CO2, NOx and SO2 have characteristic wavelengths that can be measured using spectroscopic techniques. Microwave and magnetic resonance imaging technologies are also being researched. Many of these technologies are immune to electromagnetic interference and radio frequency interference.
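
The spectroscopic principle mentioned above can be sketched with the Beer-Lambert law, which relates transmitted light intensity to gas concentration along the optical path: I = I0 * exp(-eps * c * L). The absorption coefficient and path length below are hypothetical values chosen only to make the example concrete.

```python
# Hedged sketch of spectroscopic gas sensing via the Beer-Lambert law.
# The coefficient and path length are hypothetical illustration values,
# not data for any particular species or instrument.
import math

def concentration_from_absorption(i0: float, i: float,
                                  eps: float, path_m: float) -> float:
    """Invert Beer-Lambert: c = ln(I0/I) / (eps * L)."""
    return math.log(i0 / i) / (eps * path_m)

eps = 2.0   # hypothetical absorption coefficient, 1/(mole-fraction * m)
path = 1.5  # hypothetical optical path across the flue gas duct, m

c = concentration_from_absorption(i0=1.0, i=0.97, eps=eps, path_m=path)
print(f"inferred concentration: {c:.4f} (mole fraction)")
```

A 3 percent dip in transmitted intensity at a species' characteristic wavelength is enough to infer a concentration on the order of one percent under these assumed parameters, which is the basic mechanism behind optical emission monitoring.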

As a result of utility-identified sensing needs, EPRI issued a competitive request for proposals in late 1992 to develop, demonstrate and commercialize the next generation of advanced power plant instrumentation. The solicitation netted EPRI 31 proposals from 18 vendors.

The purpose of the EPRI program is to perform conceptual design, prototype development and testing of new instrumentation for power plant operation. Using state-of-the-art technology, the EPRI-selected suppliers would research process sensing (pressure, level, flow, temperature), component lifetime expenditure (strain, stress, temperature), combustion (chemistry), control system integration and transmission/distribution needs.

Current utility estimates indicate that improved process instrumentation can improve heat rate by up to 1 percent. This heat rate improvement is worth more than $300 million per year to the fossil utility industry. In addition, a 1-percent availability improvement from advanced instrumentation would be worth more than $3 billion per year.

State-of-the-art control and instrumentation systems will allow electric utilities to use their existing assets to full potential, allowing those assets to remain in productive service longer. In addition, these systems would mean higher worker productivity, lower non-fuel operating and maintenance costs and, in some instances, lower power plant emissions.


Robert Iveson is program manager-strategic planning in EPRI's Power Delivery Group. His main fields of interest are power systems planning, operations, and control and communication systems.

Joe Weiss is manager-controls and automation in EPRI's Generation Group. His main fields of interest are instrumentation and control for generating stations and utility automation.