
Quantum computing and utilities: Risk and potential rewards
The first quantum computers are a reality, and future machines will be capable of simulating systems at a molecular level. In the meantime, there are security concerns.
By Ben Hargreaves, head of content

Public enemy number one on this year’s Utility Week Intelligence/Marsh McLennan risk report was the risk of a serious cybersecurity breach.
That’s in the here and now.
Are utilities also thinking about what they can do today to stop hacks that will only become feasible in 10 or 15 years’ time?
Perhaps not, but the challenge posed by the emerging technology of quantum computing suggests they should be.
One organisation aware of this security concern is the National Energy System Operator (NESO), which recently announced it was working with Cambridge Consultants, part of Capgemini Invent, and the University of Edinburgh to create a risk management tool to assess quantum threats to the energy network.
“Quantum computing offers many benefits to the energy system, but the nature of these emerging technologies could enable attackers to break encryption that is currently highly secure and potentially open new attack vectors,” NESO says.
More on the benefits later. For now, NESO is concerned about the potential for hackers to seize encrypted information today and use quantum computing technology to decrypt it in the future, known as a "store now, crack later" or "harvest and decrypt" strategy. The US government, too, is preparing to mitigate the risks that a potential future quantum computer would pose to government and critical infrastructure systems.
Both NESO and the US authorities are worried about the potential of quantum computing to break even the toughest encryption algorithms. It is thought that sensitive documents that are impossible to access today could one day be breached thanks to the power of quantum computers. “We can’t wait 15 years to address the problem,” says Daniel Goldsmith, senior quantum technologist at the Digital Catapult here in the UK.
Critical national infrastructure could be at risk, particularly if it relies on bespoke software to operate, he says. “Software that companies have purchased will probably be OK; you can expect vendors to fix it during upgrade cycles.
“But some utilities may have written their own software in the 1980s that uses RSA [an older encryption system for data transmission]. They may not have given any thought to the notion that RSA could be cracked.” Factor in that some encryption may be sitting in hardware, meaning chips also need to be replaced, and it’s potentially a very big problem. “If you’ve got to correct software written in the 1970s or 1980s, who is going to do that?” asks Goldsmith. “It’s worse if the encryption is in hardware.”
He points out that for a utility to itemise all its software and figure out what needs changing is potentially a mammoth undertaking. “Our advice is that you need to be starting this exercise now. Even producing a database of all the software that is at risk might take a couple of years.”
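A first pass at the inventory Goldsmith describes can be as simple as tagging each system's cryptographic algorithms as quantum-vulnerable or not. The sketch below is illustrative only: the system names and algorithm lists are hypothetical, and a real audit would pull this data from scanning tools rather than a hand-written table.

```python
# Minimal sketch of a cryptographic inventory. Algorithms whose security
# rests on factoring or discrete logarithms (RSA, Diffie-Hellman, elliptic
# curves) are the ones a future quantum computer could break.
QUANTUM_VULNERABLE = {"RSA", "DSA", "ECDSA", "ECDH", "DH"}

# NIST's post-quantum standards (e.g. ML-KEM, ML-DSA) and symmetric
# ciphers with long keys are currently believed quantum-resistant.
CONSIDERED_SAFE = {"ML-KEM", "ML-DSA", "AES-256", "SHA-384"}

# Hypothetical utility estate; in practice this would come from discovery.
inventory = {
    "billing-system-1987": ["RSA", "SHA-384"],
    "scada-gateway": ["ECDH", "AES-256"],
    "new-metering-platform": ["ML-KEM", "AES-256"],
}

def at_risk(systems):
    """Return {system: vulnerable algorithms} for systems needing migration."""
    report = {}
    for name, algos in systems.items():
        flagged = sorted(set(algos) & QUANTUM_VULNERABLE)
        if flagged:
            report[name] = flagged
    return report

if __name__ == "__main__":
    for system, algos in at_risk(inventory).items():
        print(f"{system}: migrate {', '.join(algos)}")
```

Even a toy version like this makes the scale of the task visible: every flagged entry implies a migration project, and hardware-embedded encryption will not show up in a software scan at all.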
What we have here, then, is another Y2K-style problem – with the crucial difference that no one knows exactly when it’s going to happen. “This is a bit more difficult because no one can put their hand on their heart and say we’ll have a crypto-relevant quantum computer in 10 years’ time, or 15 years’ time,” points out Goldsmith.
Post-quantum cryptography is already making an appearance, and we can expect utilities to widely adopt these algorithms as they become commercially available.

“If you’ve got to correct software written in the 1970s or 1980s, who is going to do that? It’s worse if the encryption is in hardware.”
Daniel Goldsmith, senior quantum technologist, the Digital Catapult
Why quantum computing?
Back in 1981, pioneering physicist Richard Feynman theorised that to simulate physical systems at the quantum level it would be necessary to build quantum computers. The computational complexity of these types of simulation, where, for example, millions of electrons might be modelled, is beyond the capabilities of even today’s supercomputers.
James Cruise, head of quantum algorithms at Cambridge Consultants, believes we are now a couple of decades from the point where this type of activity becomes a reality. Cryptography is just a small part of the story. “Human history is scattered by points where we gain better control of our world,” says Cruise. “Control over the quantum-mechanical world is now allowing access to a new form of computation built on the physics of quantum mechanics, rather than the physics of digital electronics. What that means is a different paradigm for doing computation.”
“Traditional computing is sequential and logical, and it’s very good at sequential and logical problems,” adds Stephen Chasko, director of energy and utilities at Cognizant. “Quantum computing offers the potential to solve multi-state, multi-dimensional problems almost instantly, or very quickly.”
It’s not a case of quantum computing replacing classical computing. For those sequential, logical problems, digital computing will continue to be the best option. But when it comes to modelling systems at a molecular level, computations that would be impossible on a digital computer – or take a very long time to carry out – could be solved. That opens up the possibility of running simulations of entire chemical systems.
Goldsmith of the Digital Catapult says: “The reason for that is that a chemical system is a quantum system. If you try and simulate it on a classical computer, the number of items in the memory increases exponentially – you run out of space. Feynman knew you would need a different type of computer and all these years later we are realising his dream.”
Several different quantum computing technologies are under development, including trapped ion, superconducting, neutral atom, and photonic computing systems. Cruise explains: “They all have their benefits, and they all have their problems. We don’t know what the clear winner is going to be. It is a bit like classical computing at the start, where we had valves and transistors, and they had different uses at different times.”
Whatever technology prevails, running simulations that were previously unfeasible will be a possibility. “Quantum computing could do the same for chemistry as classical computing did for fluid dynamics. Classical computing allowed us to stop using wind tunnels. Quantum computing offers the same promise [cutting out physical experiments] for chemistry,” Cruise says.
Areas where quantum computing could one day assist utilities include modelling of weather systems, Chasko believes. He also thinks it will be possible to improve the design of turbine blades by helping with complex problems like how pressure and heat deformation affect their aerodynamic properties. Another application is engineering new battery chemistries without having to physically test combinations of materials. “New battery chemistries for storage, for carbon capture, for catalysis, superconducting power lines – these are examples where quantum computers could deliver,” says Cruise.
The Digital Catapult has been working with DNV and Frazer Nash on two potential quantum computing use cases for energy – grid optimisation and quantum machine learning. One possibility, says Goldsmith, is modelling an increasingly complex electricity grid to determine how optimisation affects energy costs and carbon emissions. He also thinks quantum machine learning has the potential to improve predictive maintenance capabilities.
Quantum computing could be used to model the grid and help maintain inertia. “A big problem for the future grid will be its relative instability – that is a big challenge for electrical engineers. When you have a lot of renewable energy online, and the sun goes in, or the wind drops, you lose power – and that is difficult to model. One potential quantum computing use case is to model that in a digital twin.”
All these types of developments are some time away from becoming a reality. But companies such as Microsoft and IBM are already developing nascent quantum computing resources which could be scaled up to make much bigger, quicker systems. “The promise is that in the future quantum computing will do classes of problems that simply can’t be done with classical computers,” Goldsmith says.
“I think we will see a fairly small number of quantum computers that are capable of solving a small number of problems in the next three to five years.”

“Control over the quantum-mechanical world is now allowing access to a new form of computation built on the physics of quantum mechanics, rather than the physics of digital electronics.”
James Cruise, head of quantum algorithms, Cambridge Consultants
How do quantum computers work?
Unlike digital computer bits, which encode either a zero or a one, quantum bits ("qubits") can encode zero, one, or both zero and one at the same time (what is known as a "quantum superposition"). This opens up an exponentially increasing amount of computational power. Whereas a conventional supercomputer splits up different tasks and assigns them to different GPUs working in parallel, a future quantum computer would be capable of “massive parallelism”, says Goldsmith, because it is simultaneously in many different states with many possible outputs.
Each bit added to a classical register extends it by one binary digit, but each added qubit doubles the dimension of the computational "space" the machine can explore. Cruise explains: “If I have two bits, I can only go to four possible outputs, whereas when I am dealing with two qubits, they can be in any proportion of that state. When I am dealing with qubits, there’s a much richer environment.”
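The doubling Cruise describes is also why qubit registers are so hard to simulate on classical hardware: describing the state of n qubits takes 2^n complex amplitudes. A toy sketch of the arithmetic:

```python
# An n-qubit state vector needs 2**n complex amplitudes, so the memory
# required to simulate it classically doubles with every qubit added.

def amplitudes_needed(n_qubits):
    return 2 ** n_qubits

for n in (2, 10, 50):
    amps = amplitudes_needed(n)
    # roughly 16 bytes per complex amplitude (two 64-bit floats)
    print(f"{n} qubits -> {amps} amplitudes (~{amps * 16} bytes)")
```

At 50 qubits the state vector already needs around 18 petabytes of memory, which is the sense in which Goldsmith says a classical simulation "runs out of space".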
Importantly, when the qubit is observed, it always reverts to a traditional binary string, which enables the quantum computer to output usable results. The problem with the current technology is error rates. Physical qubits are buffeted by vibration and other environmental noise, which introduces errors. “Ninety-nine-point-nine percent is good fidelity for a two-qubit gate,” says Goldsmith. “Oxford Ionics has demonstrated 99.97% quantum gate fidelity using trapped ion computers, which is a really exciting result.”
Another challenge is scaling up qubits, with current computers only able to support small numbers. IBM plans to introduce a quantum system with 200 logical qubits capable of running 100 million quantum gates by 2029, with a goal of 2,000 logical qubits capable of running one billion gates by 2033. One more hurdle is that superconducting quantum computing systems require large, energy-intensive cooling systems to help keep error rates down.
One of the first applications of a future quantum computer would be to run Shor’s algorithm, which opens up the possibility of decoding common encryption schemes. But Cambridge Consultants is keen to explore other use cases, with an emphasis on practicality. Cruise says: “When it comes to grid control and grid management, which involve complex differential equations, we know that they are very difficult to simulate.
“There is real hope quantum computing will allow us to get into those types of simulation.”
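The classical mathematics that Shor's algorithm accelerates is period-finding: discover the period r of a^x mod N, and factors of N (and hence the RSA private key) follow. A toy classical version is easy to write but becomes impossibly slow at real key sizes, which is exactly the gap a quantum computer would close:

```python
from math import gcd

# Classical sketch of the number theory behind Shor's algorithm.
# Factoring N reduces to finding the period of f(x) = a**x mod N;
# a quantum computer finds that period exponentially faster than
# the brute-force loop below.

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Try to split n using the period of a mod n; None means retry with another a."""
    common = gcd(a, n)
    if common != 1:
        return common             # lucky guess: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None               # odd period: this a doesn't work
    candidate = gcd(pow(a, r // 2) - 1, n)
    return candidate if 1 < candidate < n else None

# 7 has period 4 mod 15, which yields the factor 3 of 15 = 3 * 5.
print(shor_factor(15, 7))
```

A 2,048-bit RSA modulus makes `find_period` hopeless on any classical machine; a fault-tolerant quantum computer running Shor's algorithm would not be similarly limited, which is why the "harvest and decrypt" threat is taken seriously.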
As qubits are scaled up, other applications will be unlocked, says Chasko of Cognizant. Grid optimisation through quantum computing could help reduce power consumption by the technology sector, potentially alleviating the pressure to bring additional energy sources online, such as Microsoft’s deal to restart the Three Mile Island nuclear reactor, which provides enough power to meet the needs of a medium-sized city, or 800,000 homes.
“Quantum computing is already showing promise for energy consumption issues related to deep learning systems. Once fully realised, this will be a gamechanger.”
