Sep 18

The key to bigger quantum computers could be to build them like Legos


A startup called Quantum Circuits is networking mini quantum devices together to create computers it claims will be easier to scale up than rival machines.

Visit any startup or university lab where quantum computers are being built, and it’s like entering a time warp to the 1960s—the heyday of mainframe computing, when small armies of technicians ministered to machines that could fill entire rooms.

All manner of equipment, from super-accurate lasers to supercooled refrigerators, is needed to harness the exotic forces of quantum mechanics for the task of processing data. Cables connecting various bits of gear form multicolored spaghetti that spills over floors and runs across ceilings. Physicists and engineers swarm around banks of screens, constantly monitoring and tweaking the performance of the computers.

Mainframes ushered in the information revolution, and the hope is that quantum computers will prove game-changers too. Their immense processing power promises to outstrip that of even the most capable conventional supercomputers, potentially delivering advances in everything from drug discovery to materials science and artificial intelligence.

The big challenge facing the nascent industry is to create machines that can be scaled up both reliably and relatively cheaply. Generating and managing the quantum bits, or qubits, that carry information in the computers is hard. Even the tiniest vibrations or changes in temperature—phenomena known as “noise” in quantum jargon—can cause qubits to lose their fragile quantum state. And when that happens, errors creep into calculations.

The most common response has been to create quantum computers with as many qubits as possible on a single chip. If some qubits misfire, others holding copies of the information can be called upon as backups by algorithms developed to detect and minimize errors. The strategy, which has been championed by large companies such as IBM and Google, as well as by high-profile startups like Rigetti Computing, has spawned complex machines evocative of those room-sized mainframes.

The problem is that error rates are still high. Today’s largest chips have fewer than a hundred qubits, but thousands or even tens of thousands of physical qubits may be needed to produce the same result as a single error-free one. Each qubit needs its own control wiring, so every qubit added makes the system harder to manage, and still more gear is required to monitor rapidly expanding qubit counts. That could drive up the complexity and cost of the computers dramatically, limiting their appeal.
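As a rough back-of-the-envelope illustration of that overhead, here is a minimal Python sketch. The 1,000-to-1 ratio of physical qubits to error-free ones is an assumed figure chosen to fall in the range mentioned above, not a measured number.

```python
# Illustrative error-correction overhead. The 1,000:1 ratio is an assumption
# in the "thousands" range mentioned above, not a real hardware figure.
PHYSICAL_PER_LOGICAL = 1_000

def physical_qubits_needed(logical_qubits: int) -> int:
    """Physical qubits required to stand in for a given number of error-free ones."""
    return logical_qubits * PHYSICAL_PER_LOGICAL

for logical in (1, 50, 1_000):
    print(f"{logical:>5} error-free qubits -> ~{physical_qubits_needed(logical):,} physical qubits")
```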

Robert Schoelkopf, a professor at Yale, thinks there’s a better way forward. Instead of trying to cram ever more qubits onto a single chip, Quantum Circuits, a startup he cofounded in 2017, is developing what amount to mini quantum machines. These can be networked together via specialized interfaces, a bit like very high-tech Lego bricks. Schoelkopf says this approach helps produce lower error rates, so fewer qubits—and therefore less supporting hardware—will be needed to create powerful quantum machines.

Skeptics point out that unlike rivals such as IBM, Quantum Circuits has yet to publicly unveil a working computer. But if it can deliver one that lives up to Schoelkopf’s claims, it could help bring quantum computing out of labs and into the commercial world much faster.

The drive to create longer-lasting qubits

The idea of bolting together smaller quantum building blocks to create bigger computers has been around for years, but it’s never quite caught on. “There’s not been a great, fault-tolerant machine that’s been built yet using the modular approach,” explains Jerry Chow, who manages the experimental quantum computing team at IBM Research. Still, adds Chow, if anyone can pull it off it will be Schoelkopf and his colleagues.

After training as an engineer and a physicist, including stints at NASA and Caltech, Schoelkopf joined Yale’s faculty in 1998 and began to work on quantum computing. He and his colleagues pioneered the use of superconducting circuits on a chip to create qubits. By pumping electrical current through specialized microchips held inside fridges that are colder than deep space, they are able to coax particles into the quantum states that are key to the computers’ immense power.

Unlike bits in ordinary computers, which are streams of electrical or optical pulses representing either a 1 or a 0, qubits are subatomic particles such as photons or electrons that can exist in a kind of combination of both 1 and 0—a phenomenon known as “superposition.” Qubits can also become entangled with one another, which means that a change in the state of one can instantaneously change the state of others even when there’s no physical connection between them.

There’s more background on this in our quantum computing explainer. The main thing to know, though, is that these properties allow qubits to act as if they are performing many calculations simultaneously that an ordinary computer would have to perform sequentially, which means that adding more qubits to a quantum machine boosts its processing capacity exponentially.
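To make the exponential claim concrete, here is a minimal Python sketch (my own illustration, not from the article): the state of an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the amount of information the machine works with at once.

```python
import numpy as np

# A single qubit is described by two complex amplitudes, one each for the
# |0> and |1> basis states; this is an equal superposition of the two.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

def register(n_qubits: int) -> np.ndarray:
    """State vector of n qubits, each prepared in the |+> superposition."""
    state = plus
    for _ in range(n_qubits - 1):
        state = np.kron(state, plus)   # tensor product adds one qubit
    return state

for n in (1, 2, 10, 20):
    print(f"{n:>2} qubits -> {register(n).size:>9,} amplitudes")
# 20 qubits already need 2**20 (about a million) amplitudes to describe,
# which is the sense in which capacity grows exponentially with qubit count.
```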

Schoelkopf has also won plaudits for his work on the problem of noise. The coherence times of qubits—that is, how long they can run calculations before noise disrupts their delicate quantum state—have been improving by a factor of 10 roughly every three years. (Researchers have dubbed this trend “Schoelkopf’s Law” in a nod to classical computing’s “Moore’s Law,” which holds that the number of transistors on a silicon chip doubles roughly every two years.) Brendan Dickinson of Canaan Partners, one of Quantum Circuits’ investors, says Schoelkopf’s impressive track record in superconducting qubits is one of the main reasons it decided to back the business, which has raised $18 million so far.
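For a sense of what “Schoelkopf’s Law” implies, here is a toy extrapolation in Python. The one-microsecond starting coherence time is an arbitrary illustrative value, not a figure from the article or from Quantum Circuits.

```python
# Toy extrapolation of "Schoelkopf's Law": coherence times improving roughly
# tenfold every three years. The 1-microsecond starting point is illustrative.
def coherence_after(years: float, start_microseconds: float = 1.0) -> float:
    return start_microseconds * 10 ** (years / 3.0)

for years in (0, 3, 6, 9):
    print(f"after {years} years: ~{coherence_after(years):,.0f} microseconds")
```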

Ironically, some of the students mentored by Schoelkopf and his cofounders from Yale, Michel Devoret and Luigi Frunzio, are now at companies like IBM and Rigetti that compete with their startup. Schoelkopf is clearly proud of the quantum diaspora that’s come out of the Yale lab. He told me that a few years ago he had looked at all the organizations around the world working on superconducting qubits and found that more than half of them were run by people who had spent time there. But he also believes a kind of groupthink has set in.

The advantages of modular machines

Most researchers working on superconducting machines focus on creating as many qubits as possible on a single chip. Quantum Circuits is taking a very different approach. The core of its system is a small aluminum module containing superconducting circuits made on silicon or sapphire chips. Each module holds the equivalent of five to 10 qubits.

To network these modules together into larger computers, the company uses what sounds like something out of Star Trek—quantum teleportation. It’s a method that’s been developed for shipping data across things like telecom networks. The basic idea involves entangling a microwave photon in one module with a photon in another one and then using the link between them as a bridge for transferring data. (We’ve got a quantum teleportation explainer too.) Quantum Circuits has used this approach to teleport a quantum version of a logic gate between its modules.
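Quantum Circuits’ gate teleportation between modules relies on the same basic ingredients as the textbook protocol for teleporting a qubit’s state: a shared entangled pair plus two classical bits. The NumPy sketch below simulates only that simpler state-teleportation protocol, as an illustration of the idea; it is not the company’s implementation, and the amplitudes are arbitrary.

```python
import numpy as np

# Textbook single-qubit teleportation, simulated with a plain state vector.
# Qubit order: A (state to send), B (sender's half of an entangled Bell pair),
# C (receiver's half). Index of basis state |a b c> is 4*a + 2*b + c.
rng = np.random.default_rng(0)

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

alpha, beta = 0.6, 0.8                        # arbitrary normalized amplitudes
psi = np.array([alpha, beta], dtype=complex)  # state to teleport
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)                    # joint 8-amplitude state of A, B, C

# CNOT with A as control and B as target, built as a permutation matrix.
CNOT_AB = np.zeros((8, 8), dtype=complex)
for a in range(2):
    for b in range(2):
        for c in range(2):
            CNOT_AB[4 * a + 2 * (b ^ a) + c, 4 * a + 2 * b + c] = 1

state = CNOT_AB @ state
state = np.kron(np.kron(H, I), I) @ state     # Hadamard on A

# Measure A and B: pick one of the four outcomes with the right probability,
# then project C onto the corresponding (normalized) two-amplitude block.
probs = [float(np.sum(np.abs(state[4 * a + 2 * b: 4 * a + 2 * b + 2]) ** 2))
         for a in range(2) for b in range(2)]
outcome = rng.choice(4, p=probs)
a_bit, b_bit = outcome // 2, outcome % 2
c_state = state[4 * a_bit + 2 * b_bit: 4 * a_bit + 2 * b_bit + 2]
c_state = c_state / np.linalg.norm(c_state)

# The receiver applies X and/or Z depending on the two classical bits sent over.
if b_bit:
    c_state = X @ c_state
if a_bit:
    c_state = Z @ c_state

print("original:  ", psi)
print("teleported:", np.round(c_state, 6))    # matches the original state
```

In the company’s machines, the entangled pair is formed by microwave photons shared between two modules, and what gets teleported is a logic gate rather than a state, but the role of entanglement as a bridge for quantum information is the same.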

Schoelkopf says there are several reasons that networking modules together is better than cramming as many qubits as possible onto a single chip. The smaller scale of each unit makes it easier to control the system and to apply error correction techniques. Moreover, if some qubits go haywire in an individual module, the unit can be removed or isolated without affecting others networked with it; if they’re all on a single chip, the entire thing may have to be scrapped.

Looking ahead, Quantum Circuits’ modular machines will still need some of the same equipment as rival ones, including supercooling refrigerators and monitoring gear. But as they scale, they shouldn’t require anywhere near the same kind of control wiring and other paraphernalia needed to master individual qubits. So while rival devices could look ever more like those massive early mainframes, the startup’s machines should remain akin to the slimmed-down ones that appeared as conventional computing advanced into the 1970s and beyond.

Listening to Schoelkopf talk through the technology, an image crept into my head: my kids playing with plastic Lego bricks when they were young, bolting them together to build castles and forts.

When I suggested the comparison, Schoelkopf was initially a little wary but then became quite enthusiastic. “In general, every complex device I know,” he said, “is based on having the equivalent of Lego blocks, and you define the interfaces and how they fit together …[Lego bricks] are really cheap. They can be mass-produced. And they always plug together the right way.”

Schoelkopf’s quantum modules have another key advantage. Each contains a three-dimensional cavity that traps a number of microwave photons. These form what are known as “qudits,” and they’re like qubits, except they store more information. While a qubit represents a combination of 1 and 0, a qudit can exist in more than two states—say, 0, 1, and 2 at the same time. Quantum computers with qudits can crunch through even more information simultaneously.
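As a rough illustration of why qudits carry more information, here is a tiny Python comparison of state-space sizes; the level counts are generic examples, not figures from Quantum Circuits.

```python
# n two-level qubits need 2**n amplitudes to describe; n d-level qudits need d**n.
def hilbert_dim(levels_per_unit: int, units: int) -> int:
    return levels_per_unit ** units

for units in (5, 10):
    print(f"{units} qubits (d=2): {hilbert_dim(2, units):>7,} amplitudes")
    print(f"{units} qudits (d=3): {hilbert_dim(3, units):>7,} amplitudes")
```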

Scientists have been experimenting with qudits for some time, but they are tricky to generate and control. Schoelkopf says Quantum Circuits has found ways to create high-quality ones consistently and to reduce errors significantly. (The company claims it’s achieved coherence times using its cavities that are 10 to 100 times longer than for superconducting qubits, which makes it easier to correct errors.) Some qubits are still needed to perform operations on the qudits, and to extract information from them, but his approach requires fewer of these qubits. That, in turn, means less hardware is needed overall.

Schoelkopf says organizations that want to try out algorithms on Quantum Circuits’ system will be able to do so “very soon,” and that at some point it will connect machines to the cloud as IBM and Rigetti have done. The startup isn’t just building computers; it’s also working on software that will help users get the most out of the underlying hardware.

Besides, it’s early days. The quantum algorithms being run on cloud services like IBM’s today are still pretty basic, Schoelkopf notes. The field is wide open for quantum computers and associated software that can really make a difference in a broad range of areas, from turbocharging artificial-intelligence applications to modeling molecules for chemists.

Lots of questions remain. Will Quantum Circuits be able to keep producing robust qubits and qudits as it builds much bigger machines? Can it get its quantum teleportation method to work reliably as it connects more modules together? And will its systems, when they are rolled out for sale, be more cost-effective to operate than those of rivals? Significant physics and engineering challenges still lie ahead. But if Schoelkopf and his colleagues can overcome them, they could prove that the key to getting very big in quantum computing is to think small.

New Posts
  • An updated analysis from OpenAI shows how dramatically the need for computational resources has increased to reach each new AI breakthrough. In 2018, OpenAI found that the amount of computational power used to train the largest AI models had doubled every 3.4 months since 2012. The San Francisco-based for-profit AI research lab has now added new data to its analysis, showing how the post-2012 doubling compares to the historic doubling time since the beginning of the field. From 1959 to 2012, the amount of computing power required doubled every two years, following Moore’s Law. This means compute needs are now doubling more than seven times as fast as they did before. (A short illustrative calculation of these growth rates appears after this list.) The dramatic increase in the resources needed underscores just how costly the field’s achievements have become. Keep in mind that these figures are usually plotted on a log scale; on a linear scale, the 300,000-fold increase in compute usage over the last seven years is even starker. The analysis also notably does not include some of the most recent breakthroughs, such as Google’s large-scale language model BERT, OpenAI’s large-scale language model GPT-2, or DeepMind’s StarCraft II-playing model AlphaStar. In the past year, more and more researchers have sounded the alarm on the exploding costs of deep learning. In June, an analysis from researchers at the University of Massachusetts, Amherst, showed how these increasing computational costs translate directly into carbon emissions. In their paper, they also noted how the trend exacerbates the privatization of AI research, because it undermines the ability of academic labs to compete with much more resource-rich private ones. In response to this growing concern, several industry groups have made recommendations. The Allen Institute for Artificial Intelligence, a nonprofit research firm in Seattle, has proposed that researchers always publish the financial and computational costs of training their models along with their performance results, for example. In its own blog post, OpenAI suggested that policymakers increase funding to academic researchers to bridge the resource gap between academic and industry labs.
  • StarckGate is happy to work together with Asimov, which aims to radically advance humanity's ability to design living systems. Asimov strives to enable biotechnologies with global benefit by combining synthetic biology and computer science. With their help we will be able to grasp the following domains better. Synthetic biology: nature has evolved billions of useful molecular nanotechnology devices in the form of genes across the tree of life, and these genetic components can be cataloged, refined, and remixed to engineer new biological systems. Computational modeling: biology is complex, and genetic engineering unlocks an unbounded design space, so computational tools are critical to design and model complex biophysical systems and to move synthetic biology beyond traditional brute-force screening. Cellular measurement: genome-scale, multi-omics measurement technologies provide deep views into the cell, permitting pathway analysis at the scale of a whole cell and inspection down to single-nucleotide resolution. Machine learning: algorithms that bridge large-scale datasets with mechanistic models of biology, so that artificial intelligence can augment human capabilities to design and understand biological complexity.
  • The use of AI (artificial intelligence) in agriculture is not new and has been around for some time, with technology that spans a wide range of abilities—from tools that discriminate between crop seedlings and weeds to greenhouse automation. Indeed, it is easy to think that this is new technology given the way that our culture has distanced so many facets of food production, keeping it far away from urban spaces and our everyday reality. Yet, as our planet reaps the negative repercussions of technological and industrial growth, we must wonder if there are ways that our collective cultures might be able to embrace AI's use in food production, including as a social response to climate change. Similarly, we might consider whether new technology might also be used to educate future generations about the importance of responsible food production and consumption. While we know that AI can be a force for positive change where, for instance, failures in food growth can be detected and crops can be analyzed in terms of disease, pests, and soil health, we must wonder why food growth has been so divorced from our culture and social reality. In recent years, there has been great pushback within satellite communities, and many villages have been created that focus on holistic methods of food production. Indeed, RegenVillages is one of many examples where vertical farming, aquaponics, aeroponics, and permaculture are part of the community's everyday functioning. Moreover, across the UK are many ecovillages and communities seeking to bring food production back to the core of social life. Lammas, an ecovillage in Wales that I visited seven years ago, has as its core concept the notion of a “collective of eco-smallholdings working together to create and sustain a culture of land-based self-reliance.” And there are thousands of such villages across the planet where communities are invested in working to reduce their carbon footprint while taking back control of their food production. Even Planet Impact’s reforestation programs are interesting, because the links between healthy forests and food production are well known, as are the benefits of forest gardening, which is widely considered a resilient agroecosystem. Oscar Dalvit, COO and founder of Planetimpact.com, reports that his company’s programs are designed to educate as much as to innovate: “With knowledge, we can fight climate change. Within the for-profit sector, we can win this battle.” Forest gardening is a concept that is not only part of permaculture practice but is also an ancient tradition still alive and well in places like Kerala, India, and Martin Crawford’s forest garden in southwest England, where his Agroforestry Research Trust offers courses and serves as a model for such communities across the UK. But how can AI help to make sustainable and local farming practices viable over and above industrial agriculture? Indeed, one must wonder if it is possible for local communities to take control of their food production. So, how can AI and other new tech interfaces bring together communities and food production methods in a sustainable hybrid model of traditional methods and innovative technology? We know already that the IoT (internet of things) is fast becoming the virtual space where AI is being implemented within the latest farming technology.
And where businesses invested in robotics are likewise finding that there is no ethical implementation of food technology, we must be mindful of how strategies are implemented that combine the best of new tech with the best of old tech. Where AI is helping smaller farms become more profitable, all sorts of digital interfaces are transmitting knowledge, education, and the expansion of local farming methods. This means, for instance, that garden maintenance can be continued by others within the community when some members are absent because of vacation or illness. Together with AI, customer experience is as much a business model as it is a local community standard for communication and empowerment. The reality is that industrial farming need not take over local food production, and there are myriad ways that communities can directly respond to climate change and the encroachment of big agriculture. The health benefits of local farming practices are already well known, as are the many ways that smartphone technology can create high-yield farms within small urban spaces. It is high time that communities reclaim their space within urban centers and that urban dwellers consider their food purchasing and consumption habits while building a sustainable future that allows everyone to participate in local food production. As the media has recently focused on AI and industrial farming, we need to ensure that such technology is used to implement local solutions that are far more sustainable and realistic, instead of simply pushing big agriculture.
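As promised in the first item above, here is a quick, illustrative re-derivation of the compute-growth arithmetic it quotes (my own back-of-the-envelope Python, not OpenAI's data):

```python
import math

# Quick arithmetic behind the figures quoted in the first item above
# (an illustrative re-derivation, not OpenAI's own numbers).
old_doubling_months = 24      # pre-2012, roughly Moore's-Law pace
new_doubling_months = 3.4     # post-2012 doubling time reported by OpenAI

# Today's doubling time is roughly seven times shorter than before 2012.
print(f"~{old_doubling_months / new_doubling_months:.1f}x faster doubling")

# A 300,000-fold increase in compute corresponds to about this many doublings.
print(f"~{math.log2(300_000):.0f} doublings")
```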
