Recently, there has been more and more talk about quantum computing around the web, and I'm sure that if you are a software developer you are wondering: Am I about to become obsolete? Do I need to start thinking about a career shift? What skills do I need to learn to stay up to date? Is quantum computing better than classical (digital) computing?
The answers to those questions may not be as straightforward as you might hope. Quantum computing does in fact break the current computing paradigm, but so far it is at an experimental stage, and at first its applications may not be as broad as we might think.
Quantum issues in classical models
As you may already know, modern computers are getting smaller and smaller; in fact, microchip components are measured in nanometers. Lately we have been hearing about companies reaching the sub-10 nm range (7 nm, 5 nm, and recently IBM announced a 2 nm chip from its research lab). Originally this measure referred to the transistors inside the chip, not to their overall size but to the separation between the source and the drain, where electrons form a current controlled by their interaction with a gate, see image below.
By reducing the distance between source and drain we can pack more transistors into the same chip. But there is a problem with simply shrinking this distance, and it is related to quantum mechanics, specifically an effect called quantum tunneling: even with the gate "closed" and no physical connection between the source and drain, some electrons find a way to cross the barrier, forming a small leakage current. This may not affect information processing at the moment, but power efficiency is another matter.
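To get a feel for why shrinking the barrier matters so much, here is a toy estimate of tunneling probability using the standard WKB formula for a rectangular barrier. The 1 eV barrier height and the widths are illustrative numbers for this sketch, not real transistor parameters.

```python
import math

# Toy WKB estimate of an electron tunneling through a rectangular barrier:
#   T ~ exp(-2 * kappa * L),  kappa = sqrt(2 * m * (V - E)) / hbar
# Physical constants are real; the barrier height/widths are illustrative.
HBAR = 1.054571817e-34         # reduced Planck constant, J*s
M_ELECTRON = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19           # 1 electronvolt in joules

def tunneling_probability(barrier_width_nm, barrier_height_ev=1.0):
    kappa = math.sqrt(2 * M_ELECTRON * barrier_height_ev * EV) / HBAR
    return math.exp(-2 * kappa * barrier_width_nm * 1e-9)

for width in (5.0, 3.0, 1.0):
    print(f"{width} nm barrier -> T ~ {tunneling_probability(width):.2e}")
```

The exponential dependence on width is the whole story: every couple of nanometers shaved off the barrier makes leakage many orders of magnitude more likely.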
Companies have been investing in redesigned transistor structures to mitigate the impact of quantum tunneling and improve power efficiency, resulting in longer battery life and better performance. That said, measuring transistors has become more complicated because of these differing designs, so "2 nm" may not mean the same thing from company to company; instead, the industry is moving toward quoting transistor density, the number of transistors that can be packed into a given physical area.
Closer to the atom
As good as that sounds, shrinking components means we are getting closer to the size of atoms: a hydrogen atom is about 0.1 nm across, and the silicon atoms used in chips are around 0.2 nm. If we are talking about a 2 nm separation between components in a transistor, that leaves only about 10 atoms between them. At that scale we may face more and more quantum effects beyond tunneling, and this is where we may need a shift in how we do things in technology.
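The back-of-the-envelope arithmetic above is quick to check:

```python
# How many silicon atoms (~0.2 nm across) fit in a 2 nm gap?
gap_nm = 2.0
silicon_atom_nm = 0.2
atoms_across = gap_nm / silicon_atom_nm
print(f"~{atoms_across:.0f} atoms across a {gap_nm} nm gap")
```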
Quantum computing is not a new concept by any means, but scientists have been making significant progress, and we can now see advance after advance that has us eagerly waiting for practical results.
Quantum models in classical computing
Another problem with the current state of computers is that studying the quantum realm is too complex for the classical computing model. The classical model works on a binary base of zeroes and ones, pretty straightforward, but the quantum model is different. A qubit (quantum bit) also takes the values 0 and 1, but each value carries a complex amplitude (the amplitude of the particle's wavefunction), and a qubit can exist in a "superposition", a combination of both values at once that does not resolve to a concrete value until it is measured (a common phenomenon in quantum mechanics). Another quantum effect is "entanglement", where two particles become linked so that measuring one immediately determines the state of the other, no matter the distance. Simulating these effects on a classical machine takes resources that grow exponentially with the number of qubits, so even simple quantum simulations may take years to complete.
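A minimal sketch (not a real quantum simulator) makes both points concrete: an n-qubit state is a vector of 2**n complex amplitudes, which is exactly why classical simulation blows up exponentially, and the entangled Bell state below always yields two bits that agree even though each measurement outcome is random.

```python
import math
import random

# A 2-qubit state is 4 amplitudes, for |00>, |01>, |10>, |11>;
# an n-qubit state needs 2**n of them (the exponential blowup).
# The Bell state (|00> + |11>) / sqrt(2) demonstrates entanglement.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

def measure(amplitudes):
    """Collapse the state: pick a basis state with probability |amplitude|^2."""
    probs = [abs(a) ** 2 for a in amplitudes]
    return random.choices(range(len(amplitudes)), weights=probs)[0]

counts = {}
for _ in range(1000):
    outcome = format(measure(bell), "02b")  # render the index as two bits
    counts[outcome] = counts.get(outcome, 0) + 1
print(counts)  # only "00" and "11" ever appear: the bits are correlated
```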
Welcome to quantum algorithms
Having a basic unit of information (the qubit), we can start working on algorithms that solve the same tasks as classical algorithms but dramatically faster. Two of the most famous quantum algorithms are Shor's algorithm for integer factoring, which is exponentially faster than the best known classical methods, and Grover's algorithm for unstructured search, which offers a quadratic speedup.
Quantum algorithms can be classified by the techniques they use to solve a task; so far we have:
- Algorithms based on amplitude amplification
- Algorithms based on quantum walks
- Algorithms based on the quantum Fourier transform (the heart of Shor's algorithm)
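Amplitude amplification, the core of Grover's algorithm, can be sketched with a plain statevector and no quantum hardware at all. This is a toy simulation under simplifying assumptions: the "oracle" is just a sign flip on the index we are secretly searching for, and the diffusion step is the classic inversion about the mean.

```python
import math

# Toy Grover search over N = 8 items using a plain statevector.
N = 8
marked = 5  # the index we are searching for (illustrative choice)

state = [1 / math.sqrt(N)] * N  # start in a uniform superposition
iterations = round(math.pi / 4 * math.sqrt(N))  # ~(pi/4)*sqrt(N) rounds

for _ in range(iterations):
    state[marked] = -state[marked]       # oracle: phase-flip the marked item
    mean = sum(state) / N                # diffusion: reflect every amplitude
    state = [2 * mean - a for a in state]  # about the mean

probs = [a * a for a in state]
best = max(range(N), key=lambda i: probs[i])
print(best, round(probs[best], 3))  # the marked index now dominates
```

After only 2 iterations (versus an average of N/2 classical guesses), the probability of measuring the marked item is above 90%, which is the quadratic speedup in miniature.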
Quantum programming tools
At this point we already have quite a selection of open-source tools to choose from if we want to start programming on quantum simulators or even real quantum chips, from instruction sets to full programming languages to get familiar with.
As a developer, you may be most interested in the latter option. You can choose between imperative and functional languages, and find your flavor with a simple web search. As a .NET developer, I find Q# the friendliest choice since it integrates with Visual Studio and VS Code, but there are options based on other languages if you prefer a different base (like QCL, which resembles the C programming language).
Since the ecosystem of tools and languages is growing, we'll explore some of these options in future posts to help you make a choice.
Conclusion
Getting into quantum computing may be very application-specific at the moment and may not have many real-life uses yet, but the promise with every technology advance is that we'll find a way to apply it to every possible market. So even if this is a long-term goal, learning quantum computing can benefit us as developers by opening our minds to different paradigms and shaking up our comfort zone for a moment to see what's beyond.