Aug 22

Experiments explore the mysteries of ‘magic’ angle superconductors


In spring 2018, the surprising discovery of superconductivity in a new material set the scientific community abuzz. Built by layering one carbon sheet atop another and twisting the top sheet by a “magic” angle, the material allows electrons to flow without resistance, a trait that could make power transmission dramatically more energy-efficient and usher in a host of new technologies.

Now, new experiments conducted at Princeton offer hints about how this material — known as magic-angle twisted graphene — gives rise to superconductivity. In this week’s issue of the journal Nature, Princeton researchers provide firm evidence that the superconducting behavior arises from strong interactions between electrons, yielding insights into the rules that electrons follow when superconductivity emerges.

“This is one of the hottest topics in physics,” said Ali Yazdani, who is the Class of 1909 Professor of Physics and the senior author of the study. “This is a material that is incredibly simple, just two sheets of carbon that you stick one on top of the other, and it shows superconductivity.”

Exactly how superconductivity arises is a mystery that laboratories around the world are racing to solve. The field even has a name, “twistronics.”

Part of the excitement is that, compared to existing superconductors, the material is quite easy to study since it only has two layers and only one type of atom — carbon.

“The main thing about this new material is that it is a playground for all these kinds of physics that people have been thinking about for the last 40 years,” said B. Andrei Bernevig, a professor of physics specializing in theories to explain complex materials.

The superconductivity in the new material appears to arise from a fundamentally different mechanism than that of traditional superconductors, which today are used in powerful magnets and a handful of other applications. Instead, the new material resembles the copper-based high-temperature superconductors discovered in the 1980s, called cuprates, whose discovery earned the Nobel Prize in Physics in 1987.

The new material consists of two atomically thin sheets of carbon known as graphene. Graphene, itself the subject of the 2010 Nobel Prize in Physics, has a flat honeycomb pattern, like a sheet of chicken wire. In March 2018, Pablo Jarillo-Herrero and his team at the Massachusetts Institute of Technology placed a second layer of graphene atop the first, then rotated the top sheet by the “magic” angle of about 1.1 degrees. Physicists had predicted earlier that this angle would produce new electron interactions, but it came as a shock when the MIT scientists demonstrated superconductivity.

Seen from above, the two offset chicken-wire patterns produce a flickering interference effect known as “moiré,” which arises whenever two geometrically regular patterns overlap, and which was once popular in the fabrics and fashions of 17th- and 18th-century royals.
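The geometry behind these patterns is easy to quantify: two identical lattices of spacing a, twisted by a small angle θ, form a moiré superlattice with period L = a / (2 sin(θ/2)), so the smaller the twist, the larger the moiré cell. As a minimal illustrative sketch (not code from the study), the following Python snippet evaluates this standard relation using graphene’s lattice constant of about 0.246 nanometers:

```python
import math

def moire_period(a_nm: float, theta_deg: float) -> float:
    """Moire superlattice period for two identical lattices twisted
    by a small angle theta (standard geometric relation)."""
    theta = math.radians(theta_deg)
    return a_nm / (2 * math.sin(theta / 2))

a_graphene = 0.246  # graphene lattice constant, in nanometers

# At the "magic" angle of ~1.1 degrees the moire period is ~13 nm,
# roughly 50 times larger than the atomic spacing.
for theta in (5.0, 2.0, 1.1):
    print(f"theta = {theta:4.1f} deg -> L = {moire_period(a_graphene, theta):5.1f} nm")
```

At 1.1 degrees the period comes out near 13 nanometers, so each moiré cell spans on the order of ten thousand carbon atoms across the two sheets.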

These moiré patterns give rise to profoundly new properties not seen in ordinary materials. Most ordinary materials fall into a spectrum from insulating to conducting. Insulators trap electrons in energy pockets or levels that keep them stuck in place, while metals contain energy states that permit electrons to flit from atom to atom. In both cases, electrons occupy different energy levels and do not interact or engage in collective behavior.

In twisted graphene, however, the physical structure of the moiré lattice creates energy states that prevent electrons from standing apart, forcing them to interact. “It is creating a condition where the electrons can’t get out of each other’s way, and instead they all have to be in similar energy levels, which is prime condition to create highly entangled states,” Yazdani said.
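A back-of-envelope comparison of energy scales makes this concrete (an illustrative estimate with assumed, representative numbers, not a calculation from the paper): near the magic angle, theory predicts moiré bands only about 10 meV wide, while the Coulomb repulsion between electrons one moiré period apart is of the same order or larger, so electrons cannot spread out enough to avoid one another.

```python
# Illustrative energy-scale comparison for magic-angle graphene.
# All numbers are representative textbook values, not from the study.

e2_over_4pi_eps0 = 1.44  # Coulomb constant, in eV*nm
eps_r = 5.0              # assumed dielectric constant of the surrounding hBN
L_moire = 13.0           # moire period in nm at ~1.1 degrees (see sketch above)

U_meV = e2_over_4pi_eps0 / (eps_r * L_moire) * 1000  # Coulomb energy scale
W_meV = 10.0  # predicted flat-band width near the magic angle (assumed)

print(f"Coulomb scale U ~ {U_meV:.0f} meV, bandwidth W ~ {W_meV:.0f} meV")
print(f"U/W ~ {U_meV / W_meV:.1f}  (ratio above 1: interactions dominate)")
```

When U/W exceeds roughly 1, the kinetic-energy savings of ordinary band motion no longer outweigh the cost of electrons crowding together, which is the regime where the correlated, entangled states described above can form.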

The question the researchers addressed was whether this entanglement has any connection to the material’s superconductivity. Many simple metals also superconduct, but all the high-temperature superconductors discovered to date, including the cuprates, show highly entangled states caused by mutual repulsion between electrons. Strong interaction between electrons appears to be key to achieving higher-temperature superconductivity.

To address this question, Princeton researchers used a scanning tunneling microscope that is so sensitive that it can image individual atoms on a surface. The team scanned samples of magic-angle twisted graphene in which they controlled the number of electrons by applying a voltage to a nearby electrode. The study provided microscopic information on electron behavior in twisted bilayer graphene, whereas most other studies to date have monitored only macroscopic electrical conduction.
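The densities being dialed in are set by the moiré lattice rather than the atomic one, and they are tiny by ordinary-metal standards. A hypothetical estimate (the gate geometry below is assumed for illustration and is not from the paper): completely filling the flat band takes four electrons per moiré cell, which for a roughly 13 nanometer period is only a few trillion electrons per square centimeter, a density a nearby gate electrode can supply with a few volts.

```python
import math

# Electron density needed to fill the moire flat band, plus the gate voltage
# that supplies it in a simple parallel-plate capacitor model.
# The gate geometry below is assumed for illustration only.

L = 13.0e-7                            # moire period: ~13 nm, in cm
cell_area = (math.sqrt(3) / 2) * L**2  # triangular moire unit-cell area, cm^2
n_full = 4 / cell_area                 # 4 electrons per cell fill the band, cm^-2
print(f"full-band density n_s ~ {n_full:.1e} electrons/cm^2")

# Parallel-plate gate model: n = eps0 * eps_r * V / (e * d)
eps0 = 8.854e-14  # vacuum permittivity, F/cm
eps_r = 3.5       # assumed dielectric constant of the hBN spacer
d = 30.0e-7       # assumed gate dielectric thickness: 30 nm, in cm
e = 1.602e-19     # electron charge, C

V_full = n_full * e * d / (eps0 * eps_r)
print(f"gate voltage to reach full filling: ~ {V_full:.1f} V")
```

Superconductivity appears only near particular partial fillings of this band, which is why fine control of the gate voltage matters for these experiments.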

By dialing the number of electrons to very low or very high concentrations, the researchers observed electrons behaving almost independently, as they would in simple metals. However, at the critical concentration of electrons where superconductivity was discovered in this system, the electrons suddenly displayed signs of strong interaction and entanglement.

At the concentration where superconductivity emerged, the team found that the electron energy levels became unexpectedly broad, a signature that confirms strong interaction and entanglement. Still, Bernevig emphasized that while these experiments open the door to further study, more work needs to be done to understand in detail the type of entanglement that is occurring.

“There is still so much we don’t know about these systems,” he said. “We are nowhere near even scraping the surface of what can be learned through experiments and theoretical modeling.”

Contributors to the study included Kenji Watanabe and Takashi Taniguchi of the National Institute for Materials Science in Japan; graduate student and first author Yonglong Xie, postdoctoral research fellow Berthold Jäck, postdoctoral research associate Xiaomeng Liu, and graduate student Cheng-Li Chiu in Yazdani’s research group; and Biao Lian in Bernevig’s research group.

“Spectroscopic signatures of many-body correlations in magic angle twisted bilayer graphene,” by Yonglong Xie, Biao Lian, Berthold Jäck, Xiaomeng Liu, Cheng-Li Chiu, Kenji Watanabe, Takashi Taniguchi, B. Andrei Bernevig and Ali Yazdani, was published Aug. 1 in the journal Nature and released online July 31 (DOI: 10.1038/s41586-019-1422-x). This work was primarily supported by the Gordon and Betty Moore Foundation as part of the EPiQS initiative (grant GBMF4530) and by the U.S. Department of Energy’s Basic Energy Sciences program (grant DE-FG02-07ER46419). Other support for experimental efforts was provided by the National Science Foundation’s MRSEC program through the Princeton Center for Complex Materials (grants NSF-DMR-142054 and NSF-DMR-1608848); ExxonMobil through Princeton’s Andlinger Center for Energy and the Environment; and the Princeton Catalysis Initiative. Additional funds came from the Alexander von Humboldt Foundation; Japan’s Ministry of Education, Culture, Sports, Science and Technology; the A3 Foresight Program (JPMJCR15F3); the Princeton Center for Theoretical Science; the U.S. Department of Energy (grant DE-SC0016239), the Simons Foundation, the David and Lucile Packard Foundation, the Eric and Wendy Schmidt Transformative Technology Fund, and the National Science Foundation (EAGER grant DMR-1643312 and NSF-MRSEC DMR-1420541).
