“India’s techade is here!” Prime Minister Narendra Modi declared during his Independence Day address in August 2022. He was highlighting how rapid technological advancements in sectors such as 5G, chip production, and optical fibre networks have delivered impressive results in education and healthcare, and brought change to India’s arguably greatest asset: the common man. “We are bringing a revolution through Digital India to the grassroots level.” The PM stressed that India’s digital transformation is vital for the country to become a developed nation by 2047.


A few months earlier, during the Indian Science Congress in January, PM Modi had hailed quantum computing as the next big focus area for the country. “India is moving fast in the direction of quantum computers, chemistry, communication, sensors, cryptography, and new materials,” he said during his address.


Fast forward to January 2023, when Minister of State for Electronics and IT Rajeev Chandrasekhar said that quantum computing “will be at the core of the growth and expansion in India’s techade”, adding that India is collaborating with partners and players from across the globe to build “an ecosystem of innovation, along with startups, R&D laboratories, and higher education institutions”.


As India is promoting its technological prowess during its G20 presidency, it has perhaps become pertinent to understand what quantum computing actually is. And more importantly, what are its real-life use cases? Let’s take a deep dive.


Quantum Computing: How Is It Better Than Supercomputers?


There are computers. Then there are supercomputers, marked by thousands of classical CPU and GPU cores that give them a clear edge over regular computers in carrying out complex calculations.


Now, there are certain cases where supercomputers are not up to the task. This is where quantum computers come in. 


If we break it down simply, the task of a computer is to solve problems, and the more complex a problem, the more capable a computer needs to be to solve it. As per IBM, quantum computers are able to solve problems with a large number of variables and a high degree of complexity.


Let us try to understand that with a simple use case. In biology and medicinal research, scientists have to make sense of protein sequences. Now, a supercomputer is great at sorting through a big database of protein sequences. However, if we ask it to find patterns within that data to understand how proteins will behave, it may just end up blowing a fuse.


Proteins fold into complex shapes, and it is those shapes that turn them into vital molecular machines. A classical computer, or even a supercomputer for that matter, simply lacks the working memory to calculate the trillions of ways a chain of 100 amino acids might fold.
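

The scale of that folding problem is easy to put in numbers. Here is a back-of-the-envelope sketch in Python, in the spirit of Levinthal’s paradox; the three-conformations-per-bond figure is an illustrative assumption, not a measured value:

```python
# Back-of-the-envelope estimate of the protein-folding search space.
# Assumption (illustrative only): each peptide bond in the chain can
# adopt about 3 distinct conformations.
CONFORMATIONS_PER_BOND = 3
BONDS = 99  # a chain of 100 amino acids has 99 peptide bonds

search_space = CONFORMATIONS_PER_BOND ** BONDS
print(f"~{search_space:.2e} possible folds")  # on the order of 10**47
```

Even checking a billion folds per second, a classical machine would need many orders of magnitude longer than the age of the universe to enumerate them all.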


Quantum computers can handle such calculations because they are built on quantum-mechanical principles that go beyond classical physics. Let’s try and understand this further.


Quantum Computing: How Does It Work?


While traditional computers handle data in binary (1s and 0s) and can only switch between those two values, quantum computing creates multidimensional spaces in which the patterns linking individual data points take shape. So, instead of dealing only with 1s and 0s, you are solving problems through a quantum algorithm that can find patterns and solutions in ways never imagined before.
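

To make the bit-versus-qubit distinction concrete, here is a minimal sketch in plain Python. This is an idealised simulation for illustration, not how IBM’s (or anyone’s) actual quantum stack works: a qubit is modelled as two amplitudes, and a Hadamard gate puts a definite 0 into an equal superposition of 0 and 1.

```python
import math

# A classical bit is exactly [1, 0] ("0") or [0, 1] ("1").
# A qubit can be any normalised pair of amplitudes over those two states.
def hadamard(state):
    """Apply a Hadamard gate, mixing the two basis amplitudes."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

zero = [1.0, 0.0]            # a qubit prepared in the definite "0" state
superposed = hadamard(zero)  # now an equal blend of "0" and "1"

# On measurement, each outcome's probability is its squared amplitude.
probs = [amp ** 2 for amp in superposed]
print(probs)  # both entries come out to roughly 0.5: a 50/50 coin
```

The power comes from scale: one qubit holds two amplitudes, but n qubits hold 2ⁿ of them at once, which is why simulating even a modest quantum processor overwhelms classical memory.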



While processors in a classical computer use bits to solve problems, quantum processors use qubits (illustrated above) to run multidimensional quantum algorithms.


Quantum Computing: Is It Bigger Than Supercomputers? 


Now, this might come as a shock to you, but quantum computers are actually smaller and are much more energy efficient when compared to supercomputers. 


For example, IBM’s quantum processors are about the same size as those found in a typical PC. As for the overall hardware, IBM explains that a quantum system is no bigger than a car, mostly comprising the cooling systems that keep its quantum processor at the ultra-cold temperatures it needs to operate.


Supercomputers, on the other hand, are marked by row after row of cabinets full of processors (pictured below) that are not only power intensive but also need extremely cold temperatures to perform.




Quantum Computing: Real-Life Use Cases


While the protein-folding example helped us somewhat understand how quantum computing is more capable, it still makes sense to look at some real-world use cases.


Let’s consider Mercedes-Benz. In order to reduce its impact on the environment, the carmaker, with a net worth of $83.48 billion, has pledged to make its entire vehicle fleet carbon-neutral by 2039. This means all its cars would essentially need to be electric vehicles (EVs). Now, we all know that EVs are driven by batteries.


While we know how to make batteries, it’s hard to ascertain what actually goes on inside a functioning battery at a molecular level. Quantum computers can help Mercedes-Benz accurately simulate electron interactions within a battery (as well as their effects on surrounding electrons) in order to design more efficient batteries.



Now, let’s talk about the European Organization for Nuclear Research (CERN) in Switzerland. Housing some of the biggest and arguably the most complex machines ever designed, CERN is also home to the Large Hadron Collider (LHC) (pictured above), the world’s largest and highest-energy particle collider. It was here in 2012 that physicists discovered the “God Particle”, or the Higgs boson. 


Within the LHC, scientists smash particles together at just below the speed of light (that’s 299,792,458 metres per second, in case you were wondering). Particles collide at a rate of nearly 1 billion times per second, and LHC data is sent out to over 170 data centres and labs globally.


To address these growing computing needs, CERN has tied up with IBM, gaining cloud access to IBM Quantum systems to help sift through data from LHC experiments and pinpoint significant findings and events.


Now, if we were to consider an example that is perhaps more “real” than particle physics or the chemical reactions within a battery, there is ExxonMobil’s use of quantum computers to help untangle the complexities of shipping liquified natural gas (LNG) all over the globe on a daily basis, predicting the fastest and most efficient routes to deliver fuel as needed.


While these are just a handful of use cases, there are many more sectors where quantum computing is already being employed. 


Quantum Computing: Is It The Future?


My next point might feel out of place in this seemingly scientific (and serious) article. However, I have found that over the years, fiction (films and books, especially of the science fiction genre) has often captured an up-and-coming tech trend before it catches on with the public. Remember the collapsible communicators (read: foldable smartphones) in the 1960s sci-fi TV series “Star Trek”? Or the ahead-of-its-time gene splicing and replication techniques used to create dinosaurs in “Jurassic Park”?


Quantum computing, similarly, has made its way into Marvel’s superhero capers. And that too, in a way that highlights the much-debated ‘death of Moore’s Law’.


What is Moore’s Law? Well, the over-50-year-old postulation by Intel co-founder Gordon E. Moore states that the number of transistors in a dense integrated circuit (IC) doubles roughly every two years. What this essentially means is that computing technology keeps becoming cheaper and smaller. Thanks to chipmakers keeping pace with Moore’s Law over the years, computers are not only present in every household today, but they literally fit in the palm of your hand (smartphones, smartwatches, etc.).
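

As a quick illustration of what “doubling every two years” compounds to, here is the arithmetic in Python, starting from the roughly 2,300 transistors of Intel’s 4004 from 1971. The fixed two-year cadence is the idealised textbook version of the law, not a claim about any specific product line:

```python
# Moore's Law as plain arithmetic: transistor counts doubling every
# two years, starting from the Intel 4004's roughly 2,300 transistors.
year, transistors = 1971, 2300

while year < 2021:
    transistors *= 2
    year += 2

print(f"Idealised projection for {year}: {transistors:,} transistors")
```

Twenty-five doublings lands the count in the tens of billions, which is indeed the order of magnitude of today’s largest commercial chips.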


Moore’s Law has inspired innovation over the years, from the micrometre-scale processes once used to design chips to the nanometre-scale designs seen in virtually every chip today.


However, at present, several experts and chipmakers emphatically claim that Moore’s Law is dying. This means new chip-driven tech is no longer reliably cheaper or better than the models that came before.


This, in turn, has fed the belief that quantum computing is the next paradigm shift chipmakers will see in the coming years.


So, what does that have to do with MCU movies? Well, if you remember the very first “Iron Man” (2008), you might recall how clunky, almost mechanical, Tony Stark’s (Robert Downey Jr.) initial super suit design was. Fast forward to “Avengers: Infinity War” (2018), when Stark started using nanotechnology to design a suit that could not only fit within a box but also regenerate and redesign itself as and when needed.


Now that Stark is dead (as seen in 2019’s “Avengers: Endgame”), the most recent Marvel movie, “Black Panther: Wakanda Forever” (2022), introduced the spiritual successor to Iron Man: Ironheart, yet another heavy-metal super suit, this one rocked by teen genius Riri Williams (Dominique Thorne). In one scene, Williams says that to avoid hacks, she designed 2065-byte encryption on her laptop, and that she had to build a “functional quantum computer” to crack it when she was logged out.


Even if we ignore the absurdly large byte count and the fact that a teenager casually built a quantum computer in her garage, it’s still interesting to see quantum computing mentioned in reference to exactly the kind of problem such technology was conceived to tackle.
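

Just how absurd is a 2065-byte key? A quick sanity check, taking the film’s figure literally purely for fun:

```python
# Taking the film's "2065-byte encryption" literally: that is a key of
# 2065 * 8 = 16,520 bits, i.e. a key space of 2**16520 possibilities.
key_bits = 2065 * 8
key_space_digits = len(str(2 ** key_bits))
print(f"{key_bits} bits -> a key space nearly {key_space_digits} digits long")
```

For comparison, AES-256, considered secure against any foreseeable attack, uses a 256-bit key, whose key space is “only” a 78-digit number.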


Now, let us put fiction aside and get back to the real world. As per a study by industry tracker Precedence Research, the global quantum computing market size is projected to hit around $125 billion by 2030, growing at a compound annual growth rate (CAGR) of 36.89 percent from 2022 to 2030.
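

Those two figures are easy to cross-check. Working backwards from the $125-billion 2030 projection at a 36.89 percent CAGR gives the implied 2022 starting size; this is a sketch of the compound-growth arithmetic, not Precedence Research’s own methodology:

```python
# Compound growth in reverse: if the market hits $125 billion in 2030
# after growing at 36.89% per year from 2022, the implied 2022 base is
# 125 / (1.3689 ** 8) billion dollars.
cagr = 0.3689
target_2030 = 125.0           # billions of USD
years = 2030 - 2022

implied_2022 = target_2030 / (1 + cagr) ** years
print(f"Implied 2022 market size: ~${implied_2022:.1f} billion")
```

That works out to a base of roughly $10 billion in 2022, which puts the projected growth in perspective: a better-than-tenfold expansion in eight years.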



While North America is hailed as the largest market for quantum computing, Asia Pacific is touted as the fastest-growing region. 


In February 2020, India proposed to invest around $1.12 billion in quantum computing research. While several IITs across the country, including IIT Madras and IIT Jodhpur, are researching quantum computers, the Indian Institute of Science (IISc) launched the IISc Quantum Technology Initiative (IQTI) in 2020 to help set up a foundation for quantum technologies as well as create a framework to encourage collaboration between computer scientists, physicists, material scientists, and engineers.


It’s safe to speculate that quantum computing could very easily be the next rung on the computing innovation ladder. How exactly that will take shape remains to be seen.