The quantum computers that exist today typically look like large chandeliers, hanging from the ceilings of science labs.

A superposition to change computing

Is quantum computing the next major revolution in computer science or will it remain a dream scenario for the foreseeable future? Sven Størmer Thaulow, EVP Chief Data and Technology Officer, looks into an area that is still surrounded by myths.

The fields of quantum mechanics and quantum computing are difficult to understand, even for people who have studied them at university level. But what are they, and what makes their application in computer science so interesting?

First we need to take a step back. Data processing traditionally operates via a digital computer language; everything – images, sounds, text, etc – is broken down into 1s and 0s. When I write “S” on my keyboard, it is represented as “01010011” – the ASCII character code in binary format. This is done by feeding current into eight transistors in a processor (or chip), with different voltage levels representing the binary states of “1” or “0”. Circuitry inside the computer reads this pattern and displays “S” on my screen.
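To make the idea concrete, here is a minimal Python sketch (my own illustration, not tied to any particular system) of that round trip: a character becomes a number, the number becomes eight bits, and the bits become a character again.

```python
# A minimal sketch of the idea: every character maps to a number,
# and that number is stored as a pattern of bits.
char = "S"
code_point = ord(char)            # ASCII/Unicode code for "S" -> 83
bits = format(code_point, "08b")  # eight binary digits -> "01010011"
print(char, code_point, bits)

# Going the other way: read eight bits and display the character again.
print(chr(int("01010011", 2)))    # -> "S"
```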

Packing transistors

In data processing, building more powerful computers has largely been a matter of packing as many transistors as possible into a chip and pushing the clock frequency (the speed at which the computer computes) as high as possible. Many will be familiar with Moore’s Law describing the increase in processing power: it states that the number of transistors on a chip doubles roughly every two years. It’s hotly debated, but for some years now, many have claimed that Moore’s Law will soon be dead and that we have reached the limit for how many transistors can be packed into a chip. Chip manufacturing is currently down to three-nanometre processes, with the chip in your iPhone built on a five-nanometre process. Attempts to remedy this are being made by designing processors in 3D and using other techniques.

However, increased computing power is not just about the number of transistors on a single chip; today we buy vast amounts of computing power in the cloud and no longer have to rely on having our own computers in-house. This means that we can all easily access vast resources to solve computing problems precisely when needed, no more, no less.

Machine learning behind the demand

Demand for such computing power has grown especially rapidly due to the need to train machine learning algorithms on large datasets. These algorithms try to find an optimum in a system with a large number of dimensions (for example, housing prices) – a big mathematical problem. Just imagine how many variables influence the price of a house. The bigger and more complex the optimisation task, the greater the need for computing capacity – a need that will be difficult to keep up with using conventional data processing techniques. Tasks already exist that are so complex that running them on even the world’s biggest computer cluster is inconceivable. This is where quantum computing is emerging as a promising technology.

Quantum computing is about using quantum mechanics – the theory of how things interact at the smallest scales – to create a computer that is dramatically faster at solving certain problems than a conventional binary computer. A quantum computer does not have bits (no 0s or 1s) but rather qubits, i.e. bits with more states than just 0 or 1. Qubits draw on two properties that distinguish them from regular bits: first, they can be put into a “superposition state” that represents both 1 and 0 simultaneously. Second, multiple qubits can be entangled, meaning that their states become linked, so that operating on or measuring one qubit immediately affects the others.
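To give a feel for what these two properties mean, here is a small numerical sketch using NumPy (my own illustration, independent of any real quantum hardware): a qubit’s state is a pair of amplitudes, a Hadamard gate puts it into an equal superposition, and a CNOT gate entangles two qubits into a Bell state.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a pair of complex
# amplitudes over |0> and |1>; measuring it yields 0 or 1 with
# probabilities given by the squared magnitudes of those amplitudes.
ket0 = np.array([1, 0], dtype=complex)

# Superposition: the Hadamard gate puts |0> "halfway" between 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0                      # amplitudes (1/sqrt(2), 1/sqrt(2))
print(np.abs(plus) ** 2)             # -> [0.5, 0.5]

# Entanglement: a CNOT after the Hadamard yields the Bell state
# (|00> + |11>)/sqrt(2) - measuring one qubit fixes the other.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)    # two-qubit state vector of length 4
print(np.abs(bell) ** 2)             # -> [0.5, 0, 0, 0.5]
```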

Another important difference between quantum computers and conventional processors is how computing power is scaled. To double the computing power in a conventional processor, you essentially need to double the number of transistors. In a quantum computer, the computing power is doubled with every additional qubit. This means that the computing power in a quantum computer grows exponentially when the processor is scaled.
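A rough way to see the scaling argument: describing n qubits on a conventional machine requires keeping track of 2^n amplitudes, so each extra qubit doubles the amount of state. The short Python loop below (an illustration using only the standard library) makes that doubling explicit.

```python
# Illustration of the scaling argument: simulating n qubits classically
# requires tracking 2**n amplitudes, so each extra qubit doubles the
# size of the state a conventional machine has to hold.
for n_qubits in (1, 2, 10, 50):
    print(n_qubits, "qubits ->", 2 ** n_qubits, "amplitudes")
# 50 qubits already correspond to roughly 10**15 amplitudes - far beyond
# what a conventional machine can store exactly.
```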

Combined, these properties enable quantum computers to perform many computing operations simultaneously, churning through computations that would take today’s biggest supercomputers thousands of years to complete.

This sounds incredible, but where are we in the development of quantum computing?

More than 40 years old

Well, the theory of quantum computing is more than 40 years old; in 1981 the American physicist and Nobel laureate Richard Feynman said: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy”.

In many ways, the field looks like it is at the same point the internet was at in the early 1990s. Most work is currently being done in labs, though industry is beginning to grasp its potential. Big Tech companies (such as Google and IBM) have launched separate research programmes. Venture capital firms are investing in quantum startups. The first exclusively quantum companies have gone public. National authorities are investing strategically in the defence sector, among others, after having financed basic research over several decades.

Yet we’re still lagging behind when it comes to application. We’ve not yet reached the point of “quantum advantage”, at which a quantum computer can solve a problem faster than a computer using conventional data processing. Researchers expect the first case of quantum advantage will be achieved some time in the next three to five years.

The aim of quantum computers is to perform computations that no conventional computers can realistically manage. A major task that lies ahead will be to explore their applications. And to do this we need to think differently. New computational strategies must be developed to take full advantage of these totally new devices. The mathematics and the algorithms underlying the tasks to be performed will be fundamentally different.

Easy to miss the mark

Researchers and innovators often miss the mark when predicting how new inventions will be used: Thomas Edison thought that the phonograph he invented would be used primarily for language learning; the developers behind text messaging thought it would primarily be used by courier companies to notify their customers of parcel deliveries. So what do we think quantum computers will be used for? Three likely areas stand out:

  • Large-scale optimisation problems where the task is to maximise an output based on an inconceivably vast number of variables simultaneously. Some practical examples of application are in the transport sector, for finding optimal routes, or in finance for optimising profit based on a seemingly endless list of constraints and variables.
  • Classification tasks using machine learning. A typical example of a classification task involves putting labels on images, for example: “dog”, “tree” and “street”. Quantum computing is expected to handle complex classification tasks more efficiently.
  • Simulation of nature, such as in molecular modelling. Modelling anything other than the most basic chemical reactions is computationally intractable for conventional computers, but with a quantum computer this may be doable. The development of medicines and batteries offers two practical examples of potential application areas.

A supplement

The key point here is that when or if quantum computers become commercially available, they will serve as a supplement to conventional data processing. State authorities, hyperscalers (Big Tech companies) and large universities are expected to be the early adopters of quantum computers, because the machines will probably need to operate at extremely low temperatures in dedicated facilities. So the number of quantum computers will likely be small initially – that is, given today’s technological constraints.

That said, quantum computers will increasingly be offered as a cloud-based service, on a par with conventional computing resources, and be made available through much simpler interfaces (high-level programming languages) than those we have today, where developers in practice need to understand quantum mechanics in order to program the machines.
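As an indication of how such higher-level interfaces already look, here is a hedged sketch using IBM’s open-source Qiskit library (chosen only as one example of a high-level quantum SDK; the exact API varies between versions). It builds the small entangling circuit described earlier without the programmer touching the quantum-mechanical mathematics directly.

```python
# A sketch with Qiskit, one example of a high-level quantum SDK.
# Building a circuit requires no quantum-mechanical maths from the user.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)      # two qubits, two classical bits
qc.h(0)                        # put qubit 0 into superposition
qc.cx(0, 1)                    # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])     # read both qubits out

print(qc.draw())               # inspect the circuit locally; running it
                               # on real hardware goes through a cloud backend
```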

So what does this mean for Schibsted? We will monitor developments, but will probably wait a few years before we start experimenting with the technology – and when that day comes, we will do it using cloud-based quantum computing.

Sven Størmer Thaulow
EVP Chief Data and Technology Officer
Years in Schibsted: 2.5