Ball Aerospace is growing rapidly. The backlog for the Westminster, Colorado, company’s portfolio, which includes sensors, spacecraft, data services and components, jumped 20 percent between 2021 and 2022 to reach $3 billion. Another $5 billion in booked Ball contracts had not yet been added to the backlog, the company reported in February.
Jake Sauer, in his newly created role as Ball’s vice president and chief technologist, is identifying the critical technologies that underpin Ball’s work for NASA, the Defense Department, intelligence agencies, military services and commercial customers around the world.
Sauer, who previously served as vice president and general manager of Ball’s Tactical Solutions business, joined Ball in 2012. Before that, Sauer worked at the Massachusetts Institute of Technology’s Lincoln Laboratory. Sauer earned undergraduate degrees in physics and mathematics from Germany’s University of Cologne as well as a master’s degree in physics and a Ph.D. in quantum computing and control from the Georgia Institute of Technology.
What is in your portfolio?
Ball has some core businesses that are underpinned by core technologies. We see convergence between government and commercial customers in certain areas, also convergence among government customers. A good example of the military convergence would be Joint All-Domain Command and Control. JADC2 needs the ability to join up the sensors across domains and cue them to shooters, potentially across domains. It’s no longer just a single service problem.
We’re undergoing a transformation as the customers change. Part of this is rearchitecting systems. In many cases, the users don’t care which platform the data comes from. Yet in years past, different organizations would have platform-specific software and platform-specific interfaces. As the missions start to converge, we have to rearchitect many of the products so they can converge too.
It makes sense to have the ability to look at these critical future technologies across all of the businesses. That’s where I’m focusing.
Why was the chief technologist position created?
There were examples across the company where we modernized product architectures and software. We want that effort to be unified.
Another reason is we want to focus our investments in a few key areas that will have the biggest impact. The new cross-domain capabilities, products and product architectures are big and complicated. We want to make a few choices and do a good job.
Are you more selective because Ball Aerospace is not as large as Boeing, Lockheed Martin or Northrop Grumman?
We have to be more selective because we compete with those groups; we work for them and with them, and occasionally, they work for us. We’ve grown so much over the past few years. The way we’ve been successful is by carefully selecting a few things and then doing the very best job to have a higher chance of success.
Where will you focus your investments?
First and foremost will be modeling and simulation. The value of modeling is the ability to make a decision based on the modeling output. A model that’s easy to run and doesn’t have to be adapted too much can give you a lot of information.
Then there’s the talent, the people who think through the problems. No tool will replace the thoughtfulness that goes into a solid analysis that tees up courses of action for key decision-makers.
In a future where Ball has capabilities and systems and products that operate in a new way, we would like to have a really good way to imagine how that works. We want to do that across all domains with a unified modeling and simulation effort.
Where else are you focusing?
Ball Aerospace has contracts and a backlog that will last for many years. There are certain things that we won’t change. But as we introduce new products and new programs, we are going to be introducing new architectures. In a few key areas of the business, we’ve rearchitected systems to add processing capability where there was none before; we call it edge processing.
Whether it’s a camera, a communication system or a transmitter, it may have limited resources. If you wait for a long feedback loop to point the camera, you might not be good at tracking something that happens to be very fast. It turns out the sensor can do that on its own. The exciting thing about edge processing is that not only can we come up with ways to enable sensors to do more with limited resources, but we can give other sensors that capability too. In a recent example, we’re integrating third-party algorithms to run at the edge. When they need heavy compute, they can draw on the cloud. When something needs very little latency, they process it at the edge.
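The edge-versus-cloud split Sauer describes can be sketched as a simple latency-based dispatcher: tasks with tight timing budgets run on the sensor’s limited onboard compute, while heavier jobs are offloaded. The class names, functions and 50-millisecond cutoff below are illustrative assumptions, not Ball’s actual software.

```python
from dataclasses import dataclass


@dataclass
class Task:
    """A unit of sensor work with a latency budget in milliseconds."""
    name: str
    latency_budget_ms: float


def edge_process(task: Task) -> str:
    # Lightweight, fixed-cost processing on the sensor itself,
    # e.g. a fast tracking update on limited hardware.
    return f"edge:{task.name}"


def cloud_process(task: Task) -> str:
    # Heavier analysis that can tolerate a network round trip.
    return f"cloud:{task.name}"


def dispatch(task: Task, edge_cutoff_ms: float = 50.0) -> str:
    """Route a task: tight latency budgets stay at the edge,
    everything else goes to the cloud for heavy compute.
    (The cutoff value is a placeholder, not a real figure.)"""
    if task.latency_budget_ms <= edge_cutoff_ms:
        return edge_process(task)
    return cloud_process(task)


# A fast-moving target must be tracked within 10 ms, so it stays
# at the edge; a scene-analysis job that can wait 500 ms is
# offloaded to the cloud.
results = [
    dispatch(Task("track", 10.0)),
    dispatch(Task("analyze", 500.0)),
]
```

In this sketch, `results` comes back as `["edge:track", "cloud:analyze"]`; the point is only that the routing decision is made per task against a latency budget, which is the pattern Sauer describes for third-party algorithms running at the edge.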
Any other focus areas?
I would like to optimize some of our processes to increase the pace of discovery, particularly at the system and the system-of-systems level. In the future, I see us having certain product architectures that are almost always running and doing things. We would get to try new ideas by plugging in a new type of hardware or trying a new type of algorithm that allows you to use the sensor or the data in a different way. Then you would get that feedback as you’re developing, as you’re designing, as you’re creating and as you’re dreaming. The trick is to enable groups to work at a faster pace while still working together, of course, but also toward different goals, different customer needs and in different domains.
This interview has been edited for clarity and length.
This article originally appeared in the May 2023 issue of SpaceNews magazine.