Symphony at work

Two announcements by Boom last week caught the attention of the aviation ecosystem. Both announcements were made simultaneously and represent an interesting pivot in how an aerospace company approaches its product development cycle.

Boom announced a fresh funding round of $300 Mn, not for its aviation business but for a subsidiary, Boom Superpower. Superpower is a 42MW natural gas turbine delivered in a 40-foot container and optimized for AI data centres. The turbine is built on the supersonic technology of Boom's Symphony engine, and the launch came with a 1.21GW order from Crusoe Energy.

The Boom Superpower aeroderivative gas turbine

Boom Supersonic has been developing its Overture airliner since 2014. The Overture is to be the spiritual successor to the legendary Concorde. After years of hunting for the right supersonic engine (a partnership with Rolls-Royce fell through as RR wanted to shoehorn in an existing subsonic engine), Boom decided in 2021-22 to develop its own Symphony supersonic engine. (Full Overture story here: The Boom Overture : A Return to Supersonic | theaviationevangelist.) The engine is projected to develop 40,000 pounds of thrust, and four such engines are slated to propel the Overture to Mach 1.7.

As per CEO Blake Scholl, the power turbine was always on Boom's roadmap; it was just expected to come later, after the Overture went supersonic. The pivot came last week when the gas turbine slotted in before supersonic happened, driven by an urgent need in the energy ecosystem and an opportunity to carve out a niche alongside the primary vision of a return to supersonic. https://x.com/bscholl/status/1998372107215122910?s=61&t=94iDmeURmA5WeYayiLPshA

Data

While the concept of data has always existed, it is the last 25 years that interest us here. Data volumes have grown exponentially from Bytes to Kilobytes, Megabytes, Gigabytes and Terabytes, on through Petabytes and Exabytes, all the way to Zettabytes and Yottabytes. To illustrate the scale of that growth: 1 Zettabyte = 1 billion Terabytes, the unit we are most familiar with.
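As a quick illustration of the scale involved (the unit names and the 1 ZB = 1 billion TB relation come from the text above; the factor-of-1,000 steps are the standard decimal convention):

```python
# Decimal (SI) data units: each step up is a factor of 1,000.
units = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def to_bytes(value, unit):
    """Convert a value in the given unit to bytes."""
    return value * 1000 ** units.index(unit)

# 1 Zettabyte expressed in Terabytes: 1000^7 / 1000^4 = one billion.
print(to_bytes(1, "ZB") / to_bytes(1, "TB"))  # 1000000000.0
```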

Mankind has always known that data and information are power. The advent of computers eased the laborious process of recording massive amounts of data by hand. The next logical step was having the data in one location to ease access.

Early data centres from the 1940s (remember ENIAC) were massive machines that filled entire buildings and needed tremendous amounts of power (hold that thought), generated huge amounts of heat (data centres need a lot of water) and needed tight physical security (to control who had access; we now do all this online!). These data centres were standalone in the pre-internet days. The idea of time-sharing on these massive machines would come about in the 1960s.

Database Management Systems would only arrive in the 1970s, providing a structured way to organize and retrieve data files. These systems were housed in corporate, military and government facilities and came with rigidity and maintenance issues. Furthermore, such data centres continued to be very expensive and were considered rare. Scaling up meant ordering fresh hardware, connecting it to the system and configuring it, a process that could go on for weeks or even months and entailed significant cost. The service bureau model emerged, in which the equipment was owned and maintained by an external organization and space and time were sold to the company that needed them. This is considered a precursor to the cloud model.

John McCarthy first proposed computation as a public utility back in the 1960s. By 1999 Salesforce.com had launched one of the earliest Software as a Service (SaaS) offerings. SaaS, along with Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), forms the three pillars of what we call the cloud!

It is IaaS providers such as Amazon Web Services (AWS) in 2006 and Microsoft Azure in 2010 that took the idea of cloud computing mainstream. 2010 is also the year the Zettabyte era started, with 2ZB processed and stored. In 2025 the figure is projected to reach 181ZB, up from 147ZB in 2024. The availability of easily scalable and cost-effective models directly changed the manner in which organizations work, and decision making increasingly passed from human hands to machines.

As of 2010 the US had approximately 2,100 data centre facilities; that number has since swelled to between 4,100 and 5,400. With cloud computing now enjoying widespread adoption, by 2035 we should see a quadrupling of data needs from 2024 levels to over 500ZB. We are already seeing increased applications of generative AI in our everyday lives, with big data and machine learning working together so that our screens tell us what we need, when we need it.

All this computing power means the power needs of these data centres will quadruple from 25GW in 2024 to over 100GW by 2035. (U.S. data center power demand could reach 106 GW by 2035: BloombergNEF | Utility Dive.)

Energy

Data centers, with their exponentially growing power demands, need sources of cheap and plentiful energy. The primary source of power in the United States is the electricity grid. The issue is that data center growth and grid build-out run on completely different timelines.

In the past the average data center needed 40MW, enough to power 30,000-40,000 homes. The average now stands at a planned power need of 100MW or more, and the largest data centers have projected power needs of 500MW and above, going as high as 1.2GW.
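A quick sanity check on the homes-per-megawatt claim. The roughly 1.2 kW average household draw is my assumption (about 10,500 kWh per year spread over 8,760 hours), not a figure from the article:

```python
# Assumed: an average US home draws ~10,500 kWh/year continuously.
avg_home_kw = 10_500 / 8_760          # ~1.2 kW average draw per home
homes_served = 40_000 / avg_home_kw   # homes served by 40MW (40,000 kW)
print(round(homes_served))            # lands inside the 30-40,000 range above
```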

Before we get to power needs: a typical data center requires a minimum of 10-40 acres of land, and hyperscalers can go as high as 500 acres or more. Companies wanting to set up data centers must look at land prices and water availability in addition to power needs. The combination of land pricing and water availability (taking existing water table loads into account) necessarily means that data centers sprout up in places with minimal population and cheap land, which in turn means power infrastructure may be minimal to unavailable. In such a situation the data center must file an application for a connection to the grid. Between 2010-2014 data centers accounted for 2% of total electricity generated in the US; by 2024 this had gone up to 4%, and the figure is projected to reach 12-15% of all electricity generated by 2035.

Power infrastructure is unable to keep up with current and projected data center growth. By 2024 the grid build-out rate had slowed to roughly 300 miles per year (across the USA) from 1,700 miles per year in 2010, a fall of about 80%, and the projection for 2035 remains at approximately 300-400 miles per year even as data center growth accelerates. The timeline gap only widens from here.

Data centers typically take anywhere between 2-4 years to complete (depending upon size and complexity), and modular builds can be finished in as little as 12-18 months! By comparison, the grid with its slowing build-out rate takes between 7-12 years to fulfil data center connection requests. The grid is already facing severe congestion, and in data center hubs like Virginia, data centers consume 26% of the state's total power, something power authorities across the board are wary of. Furthermore, the costs of laying fresh or upgraded infrastructure run anywhere between $100-300 per kW, making the typical 40MW connection needed today cost anywhere between $4-12 Mn, a cost the developer is expected to bear. (Underground vs. overhead: Power line installation-cost comparison and mitigation.) All these factors come together to delay electric connections to upcoming data centers. But hold on!
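The connection-cost range follows directly from the per-kW figures; a minimal sketch (all inputs are figures stated in this article):

```python
def connection_cost_mn(load_mw, cost_per_kw):
    """Grid-connection cost in $Mn for a given load (MW) and $/kW rate."""
    return load_mw * 1000 * cost_per_kw / 1e6

# A typical 40MW connection at the quoted $100-300/kW range:
print(connection_cost_mn(40, 100))  # 4.0  -> $4 Mn
print(connection_cost_mn(40, 300))  # 12.0 -> $12 Mn
```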

The US is one of the biggest oil producers globally, and natural gas, a by-product of that production, is vented and flared! Approximately 63% of all such gas generated in the US is flared or wasted. What if that gas could be used to power the data centers, with the data centers going BTM (Behind The Meter)? All the growth issues faced by the centers disappear overnight and growth stays on track! The data centers have their answer to cheap energy: gas and aeroderivative gas turbines.

Aeroderivative Turbines

The concept of the gas turbine traces its beginnings to Leonardo da Vinci's sketches around 1500; however, the designs could not be implemented then due to the non-availability of suitable materials. Gas turbines finally arrived in the early 20th century, with the first commercial industrial gas turbine going online in 1939 at a power station in Neuchâtel, Switzerland.

Today gas turbine manufacturers such as GE Vernova and Siemens Energy quote lead times of anywhere between 3-5 years; in Asia the lead time can be as much as 8 years. Better than the electricity grid, but still much longer than the time it takes for an entire data center to come up. The manufacturers have seen several waves of growth and consolidation in their industry as demand peaks and falls off. This cyclical market behavior has made them reluctant to add capacity, and they are happy to carry a backlog for their industrial gas turbines.

By the early 1940s jet engines were in development, and engineers quickly saw multiple applications for this engineering marvel. As early as 1947 the Metropolitan-Vickers G1, an aeroderivative gas turbine based on the F2 jet engine, became the first marine gas turbine when it completed sea trials. By the late 1960s aeroderivative gas turbines had emerged as a distinct turbine category for industrial and marine applications, their strengths being light weight, higher efficiency, compact size and quick start-up times. Most importantly, the engine cores were proven reliable across millions of flight hours, and their modular construction meant maintenance was simpler than for traditional gas turbines and downtime was mitigated. In 1968 GE converted its 1955-vintage turbojet into the LM1500 for industrial and marine use. But what modifications must be made to a jet engine for it to become a gas turbine?

If the engine is a turbojet or low-bypass, the front fan and nacelle are removed and an air filtration system is fitted directly on the engine. If the engine is high-bypass, such as the CF6-80C2 used in the LM6000, the integration is much more comprehensive; in either case the front fan is normally removed. A comprehensive filtration system is a must to avoid foreign object damage (FOD): the engine might be light, tough and powerful, but FOD can result in expensive maintenance.

The most important part of the modification is the free power turbine. In a normal jet engine, hot exhaust gases are expelled out the back of the engine to create thrust. In an aeroderivative gas turbine, the exhaust gases spin the power turbine before venting out. The turbine is connected to a generator via a shaft, and the generator is where electricity is produced.

The engine's fuel system is modified to run on natural gas instead of jet fuel (ATF). The modifications include replacing the fuel nozzles in the combustor, adding gas feed lines along with gas valves and compressors, modifying the combustors to optimize the mixing of air and gas, and reprogramming the engine's fuel-management computer to optimize gas combustion.

Finally the engine is housed in a compact 40 foot container that is easy to transport. Installation needs some ground work, but typically such turbines can be installed in well under two weeks.

Data centers repurpose old jet engines to meet AI’s power demand


The Aeroderivative Gas Turbine Ecosystem

The aeroderivative gas turbine ecosystem is dominated by a clutch of players, among them GE Vernova, Mitsubishi Power, Rolls-Royce, Siemens Energy and Baker Hughes. GE and Baker Hughes dominate with a 63% market share of the current ecosystem. The market is currently valued at just under $4 Bn and is projected to reach $6.79 Bn by 2034; beyond that there is an untapped market yet to be realized. [Latest] Global Aeroderivative Gas Turbine Market Size/Share Worth USD 6.79 Billion by 2034 at a 6.34% CAGR: Custom Market Insights (Analysis, Outlook, Leaders, Report, Trends, Forecast, Segmentation, Growth Rate, Value, SWOT Analysis)

It is into this market that Boom sees a clear opportunity. A quick look at the partners in this pivot. First, Crusoe. Crusoe is a young start-up, about 7 years old, that specialises in providing infrastructure for AI data centers. They have an out-of-the-box approach to the energy needs of data centres: they go where cheap energy is available, a case in point being their site at Abilene, Texas. Abilene is in West Texas, an area active in oil and gas production. Furthermore, a number of nearby wind farms were loss-making until Crusoe stepped in with their data center project, taking in all the electricity produced. Crusoe has placed an order with Boom Superpower for 29 gas turbines that will generate 1.21GW of power, valued at about $1.25 Bn. This order is only part of their order book; they have over 4.5GW worth of orders with GE Vernova & Chevron. (Crusoe is building data centers where power is abundant and cheap – FastForward.)

The second is Darsana Partners, an investment and advisory firm with a portfolio valued at over $4.25 Bn. They are known for fundamental research and for making high-conviction investments in companies and industries undergoing significant change. They are the lead investor in Boom Supersonic's 'Superpower' turbine business, which raised $300 Mn in funding.

Supersonic – Superpower – Supersonic

At the end of my previous piece on the Boom Overture we briefly discussed the Overture's funding and the funding gap that existed (link at the beginning of this article). The Superpower move is an excellent step towards closing that gap.

Boom promises its turbine will cost $1033/kW, which stacks up well against an average of between $930 and $1500/kW for competitors such as the GE LM6000 and Siemens SGT-A45, whose plants generate power in the same 40-50MW range. (How much does it cost to build a gas power plant? – Gas Turbine World) Where the Boom Superpower plant scores over the competition is operation at high ambient temperatures. In summer, when temperatures in many parts of the USA climb over 100°F, Superpower promises to maintain its full 42MW output even at 110°F, while competitors suffer from power derating.
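As a sanity check on these prices, the implied per-unit cost and the per-kW figure implied by the Crusoe order (29 turbines for about $1.25 Bn, quoted earlier in this article) line up well. This is back-of-envelope arithmetic from the article's own figures, not vendor data:

```python
# Implied unit economics from figures quoted in this article.
price_per_kw = 1033              # Boom's promised cost per kW
unit_mw = 42                     # rated output per Superpower turbine

unit_price_mn = unit_mw * 1000 * price_per_kw / 1e6
print(round(unit_price_mn, 1))   # 43.4 -> roughly $43.4 Mn per turbine

# Cross-check against the Crusoe order: 29 turbines for about $1.25 Bn.
crusoe_per_kw = 1.25e9 / (29 * unit_mw * 1000)
print(round(crusoe_per_kw))      # 1026 -> consistent with the $1033/kW promise
```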

Boom expects to commence deliveries to Crusoe by 2027 and to ramp up manufacturing to 1GW per year by 2028, 2GW by 2029 and 4GW p.a. by 2030. At 42MW each, these translate to roughly 24 units in 2028, 48 units in 2029 and 95 units in 2030.

In revenue terms this translates to a topline of $1.03 Bn in 2028, $2.06 Bn in 2029 and $4.13 Bn by 2030. The overall size of the aeroderivative gas turbine market is limited by the number of jet engines available for conversion, and Boom expects (ideally) to carry a backlog through to 2035.

Typical margins in the gas turbine business range between 10-15%. Since Boom Superpower has priced aggressively, a conservative margin could be in the 5-8% range, which translates to a bottom line of between $51.5 Mn and $82.4 Mn in 2028 and between $206.5 Mn and $330.4 Mn in 2030.
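The ramp and margin arithmetic can be reproduced in a few lines. All inputs are figures stated in this article; small differences from the quoted bottom-line numbers are rounding artefacts:

```python
# Boom's stated ramp (GW per year), converted to unit counts, topline and
# a conservative bottom line. The 5-8% margin is the article's assumption.
unit_mw = 42                              # rated output per turbine
price_per_kw = 1033                       # Boom's promised $/kW
ramp_gw = {2028: 1, 2029: 2, 2030: 4}     # annual production targets

for year, gw in ramp_gw.items():
    units = round(gw * 1000 / unit_mw)            # ~24, 48, 95 turbines
    revenue_bn = gw * 1e6 * price_per_kw / 1e9    # $Bn topline at $1033/kW
    lo_mn = revenue_bn * 0.05 * 1000              # $Mn bottom line at 5%
    hi_mn = revenue_bn * 0.08 * 1000              # $Mn bottom line at 8%
    print(f"{year}: {units} units, ${revenue_bn:.2f} Bn topline, "
          f"${lo_mn:.1f}-{hi_mn:.1f} Mn margin")
```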

While the numbers are sizable as a standalone revenue stream, they do not cover the Overture's development costs, and the timeline seems to run parallel to the Overture's. Even so, the Superpower program carries significant add-on benefits.

The Overture is expected in service by 2030, and the Superpower program provides a revenue stream two years before anticipated commercial flights. Blake Scholl appears confident that no more money will be needed to fund the Overture program; for this to truly happen, the turbine program needs to accelerate by at least a year. For now the Superpower Superfactory expected to come up near Denver is still a grass field, but Boom has shown in the past (with its Greensboro factory) that it can move quickly once permitting is complete.

Since the Superpower turbines will use the same Sprint cores as the Symphony, their operation directly translates to real-world use of Boom's technology, accelerating the learning, validation and improvement required for the Symphony engine. Furthermore, scaling up production at the Superpower facility builds supply-chain and vertical-integration experience that can be transferred to the Overture Superfactory.

Overall it looks like Boom has fast tracked their Symphony & Overture development timelines.

This is what pivots are all about!

Before you Leave.

Read More Amazing Content at: https://theaviationevangelist.com keep scrolling down.

Follow me:

LinkedIn : https://www.linkedin.com/company/the-aviation-evangelist/

X : @ManiRayaprolu

Reddit : r/theaviationevangelist

Facebook : https://www.facebook.com/profile.php?id=61583497868441#

Instagram : https://www.instagram.com/theaviationevangelist/?hl=fr