
You could argue Clift Pompee has adapted to technological change as well as anybody.
In a 13-year stretch at Duke Energy, he helped deploy five products to drive efficiencies in the utility’s operations and generating fleet, from advanced analytics to business intelligence to machine learning applications.
Take work management software, for example. Pompee’s team replaced the sheet of paper workers brought to a plant site with iPads, which gave them access to problem reports, equipment histories, even a digital library of the utility’s maintenance procedures.
Using talk-to-text, workers could make notes quickly rather than waiting for the single computer shared by the whole maintenance shop. They could close out jobs on-site.
“But the challenge…was not in the technical saving of time,” Pompee noted. “The real challenge was in the culture shift.”
He said some workers were hesitant to adopt the tools because of personal routines or concerns about efficiency backfiring.
“We had to convince people why it was better for them and not necessarily just better for the company,” he said.
In other words, new developments and technological change often require a mindset shift.
That’s where Pompee finds himself now, as VP of Power & Emissions for Compass Datacenters, a hyperscale data center developer. More than five months into the role, he has arrived at a moment when AI advances have energy-hungry data centers popping up everywhere. He is an evangelist for the company’s slogan: “Data centers done differently.”
But more on that shortly.
Pompee has all the power generation bona fides one would expect from a 23-year career in the utilities industry, spent at both Florida Power & Light (FPL) and Duke.
He began his career as a steam turbine support engineer and worked on repowering projects that converted oil-fired units into natural gas combined cycle (NGCC) plants. He later spent six years in the nuclear sector, including oversight work at the now-retired Crystal River Nuclear Plant under Progress Energy, before that utility was acquired by Duke.
Pompee also served as a gas turbine program manager focusing on GE 7F turbines, and later became Duke Energy’s managing director of low-carbon technologies such as small modular reactors, hydrogen, long-duration energy storage and carbon capture.
“My career trajectory was actually very well-scripted,” he said.
Even as he worked on the strategy side, he hadn’t considered leaving the utility industry. But in Pompee’s last few years at Duke, the demands of AI became impossible to ignore.
A breakthrough moment came when a hyperscaler asked the utility about securing 750 megawatts (MW) of hydrogen power for its facility, roughly the output of a large gas turbine power plant. At the same time, Duke was already running out of capacity for its own machine learning algorithms.
“It makes perfect sense that these things would require this much power,” said Pompee.
While conducting modeling for Duke Energy’s Integrated Resource Planning (IRP) with a heavy focus on data center-driven load growth, Pompee was connected with Compass Datacenters, which was seeking a leader for its power and emissions initiatives.
Unsurprisingly, Pompee’s initial conversations with Compass centered on the challenges of finding reliable power for data centers. By that time, projections suggested U.S. data center electricity consumption would at least double by 2030.
The opportunity to shift industries and take on this challenge from the data center side had a lot of appeal for Pompee.
He said he initially approached data center power challenges from a utility planning perspective, but soon recognized a more flexible approach was possible. Instead of relying solely on large-scale infrastructure projects, the focus could be on more innovative, creative solutions.
“Not every one of these power challenges requires you to go build 100 miles of transmission and a gigawatt of power,” he said.
The ‘Co-Serve’ Model
Utilities and data center companies are undoubtedly learning more about each other, as securing a grid connection in certain power-constrained markets is no longer as cut-and-dried as it used to be.
Understandably, there are some communication and expectation gaps.
Pompee noted that data center developers, who are requesting ever larger amounts of power, may not always fully account for how utilities interpret those requests. Utilities must plan for the full requested capacity in their resource modeling and secure enough reserves to meet it, even if the data centers aren’t immediately drawing the full amount.
He also pointed out that while data center operators may assume unused capacity can simply be sold to others, regulated utilities cannot operate that way. They must ensure the full committed capacity is available.
“The idea of the realistic load ramp and the load shape is something that both sides really need to get together and have a realistic discussion,” he said.
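To see why that load ramp matters, here is a back-of-the-envelope sketch in Python. All figures are hypothetical, invented for illustration rather than drawn from Duke or Compass: a utility that receives a 500-MW request must plan for the full commitment plus a planning reserve from day one, even while the site’s actual draw ramps up over several years.

```python
# Hypothetical illustration of the load-ramp gap Pompee describes.
# None of these numbers come from Duke or Compass.

requested_mw = 500.0   # full capacity the data center asks the utility to commit
reserve_margin = 0.15  # assumed planning reserve the utility holds on top

# Assumed multi-year ramp of the site's actual peak draw.
load_ramp_mw = {2026: 120.0, 2027: 260.0, 2028: 430.0, 2029: 500.0}

# A regulated utility must plan for the full request plus reserves up front.
planned_mw = requested_mw * (1 + reserve_margin)
print(f"Utility plans for: {planned_mw:.0f} MW")  # 575 MW

for year, actual_mw in load_ramp_mw.items():
    needed_mw = actual_mw * (1 + reserve_margin)
    print(f"{year}: site draws {actual_mw:.0f} MW, "
          f"~{planned_mw - needed_mw:.0f} MW of planned capacity sits ahead of the ramp")
```

The gap in the early years is capacity the utility has committed but the customer is not yet using, which is exactly the mismatch a realistic discussion of load ramp and load shape is meant to close.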
This speaks to a collaborative approach between utilities and data center companies, rather than a transactional one. Pompee said Compass has adopted the former.
Instead of simply requesting large amounts of power, he said, Compass conducts a detailed study to determine the actual power needs of a specific site and engages in an ongoing dialogue with utilities. That includes a “test fit” to assess how much load a given piece of land can support, followed by discussions of infrastructure requirements, such as whether new transmission lines are needed.
“We don’t submit a power request and then go away and wait for the utility to give us an answer,” said Pompee. “We submit a power request, we talk to the utility. We have a lot of back and forth, and we solve the problems together.”
Hence: “Data centers done differently.”
Compass, with 20 active U.S. data center campuses and four more in development, believes in a “co-serve” model: being a more active partner in planning, risk and capital costs.
For one, when utilities commit to million- or billion-dollar generation and transmission investments, Compass believes the burden shouldn’t fall solely on residential ratepayers if speculative data center projects fail to materialize. Instead, data center developers should contribute by covering some costs, such as upfront feasibility and environmental impact studies.
The company also believes data center developers can play a critical role in infrastructure development, since they have greater flexibility than regulated utilities in deploying capital. They can help secure rights of way and build transmission lines and substations more efficiently, potentially accelerating timelines and easing construction bottlenecks.
Compass also advocates more creative approaches to ratemaking, such as utility tariffs that require large-load customers like data centers to pay more.
Peak shaving offers yet another potential solution. Pompee noted that emergency backup generators at data centers, which typically sit idle, could be dispatched during peak demand periods to relieve grid pressure. While stringent air quality permits make this difficult to implement universally, he supports exploring flexible approaches to utilizing backup generation.
Power providers, regulators and the data center industry could collaborate on this effort, devising permitting structures that account for different fuel sources and their emissions profiles. Even updated permitting for hydrotreated vegetable oil (HVO), which Compass’s data centers use for emergency backup instead of diesel, hasn’t been fully explored.
“Let’s figure out a way to make use of those assets that’s not prescriptive,” Pompee said. “I’m not saying, hey, relax air permits and let’s let it rip, but there’s an answer there, and there are assets here.”
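As a rough sketch of what a non-prescriptive framework might permit, consider the threshold rule below. It is illustrative only: the fleet, the stress signal and every number are hypothetical, and any real dispatch would be bounded by a site’s air permit and utility agreements rather than code this simple.

```python
from dataclasses import dataclass


@dataclass
class BackupFleet:
    capacity_mw: float        # total on-site generator capacity
    permit_hours_left: float  # annual runtime hours remaining under the air permit
    fuel: str                 # e.g. "HVO" rather than diesel


def peak_shave_mw(grid_stress: float, site_load_mw: float,
                  fleet: BackupFleet, threshold: float = 0.9) -> float:
    """Return how many MW of site load to move onto backup generation.

    grid_stress is a normalized utility signal in [0, 1]; above the
    threshold, the grid is asking large loads to shed.
    """
    if grid_stress < threshold or fleet.permit_hours_left <= 0:
        return 0.0  # grid is healthy, or the permit leaves no runtime budget
    # Shave no more than the fleet can carry or the site actually draws.
    return min(fleet.capacity_mw, site_load_mw)


# Example: a 30-MW HVO-fueled fleet at a site drawing 42 MW during a heatwave.
fleet = BackupFleet(capacity_mw=30.0, permit_hours_left=120.0, fuel="HVO")
print(peak_shave_mw(grid_stress=0.95, site_load_mw=42.0, fleet=fleet))  # 30.0
```

The point of the permit-hours check is Pompee’s: the assets can help, but only inside whatever runtime and emissions budget regulators set.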
A future breakthrough could come from some data centers, particularly those focused on AI training, flexibly shifting their IT workloads in response to grid demand signals. That could mean adjusting AI training schedules or moving load to other data center campuses to avoid straining the grid during peak times, such as heatwaves or holidays.
Hyperscalers and technology providers have said temporal and spatial computational flexibility is possible if they are given the appropriate signals. However, federal energy officials could not identify examples of grid-aware flexible operation at data centers today, with a few exceptions: the carbon-minimizing geographic optimization Google has employed for several years, recent efforts to respond to energy shortages in the European Union stemming from the Russia-Ukraine war, and flexibility requirements in Ireland.
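To make temporal and spatial flexibility concrete, here is a minimal scheduling sketch. It assumes a normalized grid-stress signal per campus; the campuses, thresholds and numbers are hypothetical, not any operator’s real system.

```python
from typing import Optional


def place_training_job(stress: dict[str, float], home: str,
                       deferrable: bool, limit: float = 0.9) -> Optional[str]:
    """Return the campus to run an AI training job on now, or None to defer.

    stress maps each campus to a normalized grid-stress signal (1.0 = most
    strained); limit is the level above which a grid is asking loads to shed.
    """
    if stress[home] <= limit:
        return home  # home grid is healthy; run locally
    # Spatial flexibility: shift to the least-stressed campus if it has headroom.
    alternate = min(stress, key=stress.get)
    if stress[alternate] <= limit:
        return alternate
    # Temporal flexibility: every grid is strained, so defer if the job allows.
    return None if deferrable else home


normal = {"campus_a": 0.97, "campus_b": 0.55, "campus_c": 0.80}
heatwave = {"campus_a": 0.97, "campus_b": 0.94, "campus_c": 0.96}

print(place_training_job(normal, "campus_a", deferrable=True))    # campus_b (shift)
print(place_training_job(heatwave, "campus_a", deferrable=True))  # None (defer)
print(place_training_job(heatwave, "campus_a", deferrable=False)) # campus_a (must run)
```

Simple as it is, the sketch shows why the "appropriate signals" matter: without a shared grid-stress feed from utilities, neither the spatial nor the temporal branch has anything to act on.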
In a more fundamental way, load flexibility solutions begin with utility-data center collaboration. Right now, there is no universal template for a direct line of communication from system planner to data center operator.
“I think we’ve got all the bones there,” said Pompee. “It’s just a matter of, because we haven’t done it, there’s no real best practices on how to.”