Jun 13

How to speed up the discovery of new solar cell materials


A broad class of materials called perovskites is considered one of the most promising avenues for developing new, more efficient solar cells. But the virtually limitless number of possible combinations of these materials’ constituent elements makes the search for promising new perovskites slow and painstaking.

Now, a team of researchers at MIT and several other institutions has accelerated the process of screening new formulations, achieving a roughly ten-fold improvement in the speed of the synthesis and analysis of new compounds. In the process, they have already discovered two sets of promising new perovskite-inspired materials that are worthy of further study.

Their findings are described this week in the journal Joule, in a paper by MIT research scientist Shijing Sun, professor of mechanical engineering Tonio Buonassisi, and 16 others at MIT, in Singapore, and at the National Institute of Standards and Technology in Maryland.

Somewhat surprisingly, although partial automation was employed, most of the improvements in throughput speed resulted from workflow ergonomics, says Buonassisi. That involves more traditional systems efficiencies, often derived by tracking and timing the many steps involved: synthesizing new compounds, depositing them on a substrate to crystallize, and then observing and classifying the resulting crystal formations using multiple techniques.

“There’s a need for accelerated development of new materials,” says Buonassisi, as the world continues to move toward solar energy, including in regions with limited space for solar panels. But the typical system for developing new energy-conversion materials can take 20 years, with significant upfront capital costs, he says. His team’s aim is to cut that development time to under two years.

Essentially, the researchers developed a system that allows a wide variety of materials to be made and tested in parallel. “We’re now able to access a large range of different compositions, using the same materials synthesis platform. It allows us to explore a vast range of parameter space,” he says.

Perovskite compounds consist of three separate constituents, traditionally labeled as A, B, and X site ions, each of which can be any one of a list of candidate elements, forming a very large structural family with diverse physical properties. In the field of perovskite and perovskite-inspired materials for photovoltaic applications, the B-site ion is typically lead, but a major effort in perovskite research is to find viable lead-free versions that can match or exceed the performance of the lead-based varieties.

While more than a thousand potentially useful perovskite formulations have been predicted theoretically, out of millions of theoretically possible combinations, only a small fraction of those has been produced experimentally so far, highlighting the need for an accelerated process, the researchers say.
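The scale of that search space is easy to see by enumerating even short candidate lists for each site. The sketch below is purely illustrative — the element lists are hypothetical examples, not the actual search space used in the study:

```python
from itertools import product

# Illustrative only (not the paper's actual candidate lists): a few
# example ions for each site of the ABX3 perovskite structure.
A_SITES = ["Cs", "MA", "FA", "Rb"]        # A-site cations
B_SITES = ["Pb", "Sn", "Ge", "Bi", "Sb"]  # B-site (some lead-free options)
X_SITES = ["I", "Br", "Cl"]               # X-site anions

# Every (A, B, X) choice is a distinct candidate composition.
compositions = [f"{a}{b}{x}3" for a, b, x in product(A_SITES, B_SITES, X_SITES)]
print(len(compositions))  # 4 * 5 * 3 = 60 candidates even from tiny lists
```

Allowing mixed-site alloys (for example, 50/50 blends of two A-site cations) multiplies the space far beyond this, which is why exhaustive synthesis is impractical and throughput matters.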

For the experiments, the team selected a variety of different compositions, each of which they mixed in a solution and then deposited on a substrate, where the material crystallized into a thin film. The film was then examined using a technique called X-ray diffraction, which can reveal details of how the atoms are arranged in the crystal structure. These X-ray diffraction patterns were then initially classified with the help of a convolutional neural network system to speed up that part of the process. That classification step alone, Buonassisi says, initially took three to five hours, but by applying machine learning, this was slashed to 5.5 minutes while maintaining 90 percent accuracy.
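The core of that step is matching a measured diffraction pattern against known structure classes. The paper's classifier is a convolutional neural network; the stand-in below is a much simpler hedged sketch — synthetic patterns and cosine-similarity matching — meant only to show the shape of automated pattern classification, not the actual method:

```python
import math

# Simplified stand-in for a diffraction-pattern classifier: compare a
# measured intensity curve against reference curves by cosine similarity.
# Peak positions and heights below are synthetic, for illustration only.

def pattern(peaks, n=180, width=2.0):
    """Synthesize an intensity-vs-angle curve from Gaussian peaks."""
    return [sum(h * math.exp(-((i - p) ** 2) / (2 * width ** 2))
                for p, h in peaks) for i in range(n)]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Reference patterns for two hypothetical crystal structures.
references = {
    "cubic":      pattern([(30, 1.0), (44, 0.6), (64, 0.4)]),
    "tetragonal": pattern([(28, 1.0), (32, 0.8), (46, 0.5)]),
}

def classify(measured):
    """Return the reference class most similar to the measurement."""
    return max(references, key=lambda k: cosine(measured, references[k]))

print(classify(pattern([(30, 1.0), (44, 0.6), (64, 0.4)])))  # cubic
```

A real CNN learns its own features from labeled patterns rather than relying on hand-picked references, which is what lets it stay accurate across noisy, previously unseen compositions.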

Already, in their initial testing of the system, the team explored 75 different formulations in about a tenth of the time it previously would have taken to synthesize and characterize that many. Among those 75, they found two new lead-free perovskite systems that exhibit promising properties that might have potential for high-efficiency solar cells.

In the process, they produced four compounds in thin-film form for the first time; thin films are the desirable form for use in solar cells. They also found examples of “nonlinear bandgap tunability” in some of the materials, an unexpected characteristic that relates to the energy level needed to excite an electron in the material, which they say opens up new pathways for potential solar cells.

The team says that with further automation of parts of the process, it should be possible to continue to increase the processing speed, making it anywhere from 10 to 100 times as fast. Ultimately, Buonassisi says, it's all about getting solar power to be as inexpensive as possible, continuing the technology's already remarkable plunge in price. The aim is to bring economically sustainable prices below 2 cents per kilowatt-hour, he says, and getting there could be the result of a single breakthrough in materials: "All you have to do is make one material" that has just the right combination of properties — including ease of manufacture, low cost of materials, and high efficiency at converting sunlight.

“We’re putting all the experimental pieces in place so we can explore faster,” he says.

The work was supported by Total SA through the MIT Energy Initiative, by the National Science Foundation, and Singapore’s National Research Foundation through the Singapore-MIT Alliance for Research and Technology.

New Posts
  • An updated analysis from OpenAI shows how dramatically the computational resources needed to reach each new AI breakthrough have increased. In 2018, OpenAI found that the amount of computational power used to train the largest AI models had doubled every 3.4 months since 2012. The San Francisco-based for-profit AI research lab has now added new data to its analysis, showing how the post-2012 doubling compares to the historic doubling time since the beginning of the field. From 1959 to 2012, the amount of compute required doubled every two years, following Moore's Law, which means the doubling time has since shrunk by more than a factor of seven. This dramatic increase in required resources underscores just how costly the field's achievements have become.

The original post's chart uses a log scale; on a linear scale it is clearer that compute usage has increased 300,000-fold in the last seven years. The chart also notably does not include some of the most recent breakthroughs, such as Google's large-scale language model BERT, OpenAI's large-scale language model GPT-2, or DeepMind's StarCraft II-playing model AlphaStar.

In the past year, more and more researchers have sounded the alarm on the exploding costs of deep learning. In June, an analysis from researchers at the University of Massachusetts, Amherst showed how these increasing computational costs translate directly into carbon emissions. In their paper, they also noted how the trend exacerbates the privatization of AI research, because it undermines the ability of academic labs to compete with much more resource-rich private ones.

In response to this growing concern, several industry groups have made recommendations. The Allen Institute for Artificial Intelligence, a nonprofit research firm in Seattle, has proposed, for example, that researchers always publish the financial and computational costs of training their models along with their performance results. In its own blog, OpenAI suggested policymakers increase funding to academic researchers to bridge the resource gap between academic and industry labs.
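The relationship between a growth factor and a doubling time is simple arithmetic, and it is worth checking how the figures above relate. This is just back-of-the-envelope math on the numbers quoted in the post, not new data:

```python
import math

# How many doublings produce a given growth factor?
def doublings(factor):
    return math.log2(factor)

# A 300,000-fold increase in compute corresponds to about 18 doublings.
n = doublings(300_000)
print(round(n, 1))  # 18.2

# At one doubling every 3.4 months, that takes roughly five years:
print(round(n * 3.4 / 12, 1))  # 5.2 (years)

# At the pre-2012 rate of one doubling every two years, the same growth
# would instead take well over three decades:
print(round(n * 2))  # 36 (years)
```

The contrast between ~5 years and ~36 years for the same growth factor is another way of seeing the roughly seven-fold change in doubling time.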
  • StarckGate is happy to be working with Asimov, which aims to radically advance humanity's ability to design living systems. Asimov strives to enable biotechnologies with global benefit by combining synthetic biology and computer science. With their help, we will be able to better grasp the following domains:

Synthetic biology: Nature has evolved billions of useful molecular nanotechnology devices in the form of genes, across the tree of life. Asimov catalogs, refines, and remixes these genetic components to engineer new biological systems.

Computational modeling: Biology is complex, and genetic engineering unlocks an unbounded design space. Computational tools are critical to design and model complex biophysical systems and to move synthetic biology beyond traditional brute-force screening.

Cellular measurement: Genome-scale, multi-omics measurement technologies provide deep views into the cell. These techniques permit pathway analysis at the scale of a whole cell and inspection down to single-nucleotide resolution.

Machine learning: Asimov is developing machine learning algorithms that bridge large-scale datasets with mechanistic models of biology. Artificial intelligence can augment human capabilities to design and understand biological complexity.
  • The use of AI (artificial intelligence) in agriculture is not new; it has been around for some time, with technology spanning a wide range of abilities, from discriminating between crop seedlings and weeds to greenhouse automation. Indeed, it is easy to think of this as new technology, given the way our culture has distanced so many facets of food production, keeping it far from urban spaces and our everyday reality. Yet, as our planet reaps the negative repercussions of technological and industrial growth, we must wonder whether there are ways our collective cultures might embrace AI's use in food production as part of a social response to climate change. Similarly, we might consider whether new technology could also be used to educate future generations about the importance of responsible food production and consumption.

While we know that AI can be a force for positive change, detecting failures in food growth and analyzing crops for disease, pests, and soil health, we must wonder why food growth has become so divorced from our culture and social reality. In recent years there has been great pushback within satellite communities, with many villages created around holistic methods of food production. RegenVillages is one of many examples where vertical farming, aquaponics, aeroponics, and permaculture are part of the community's everyday functioning. Moreover, across the UK there are many ecovillages and communities seeking to bring food production back to the core of social life.

Lammas, one such ecovillage in Wales, which I visited seven years ago, has as its core concept the notion of a "collective of eco-smallholdings working together to create and sustain a culture of land-based self-reliance." And there are thousands of such villages across the planet where communities are invested in reducing their carbon footprint while taking back control of their food production. Even Planet Impact's reforestation programs are interesting here, because the links between healthy forests and food production are well known, as are the benefits of forest gardening, which is widely considered a resilient agroecosystem. Oscar Dalvit, COO and founder of Planetimpact.com, reports that his company's programs are designed to educate as much as to innovate: "With knowledge, we can fight climate change. Within the for-profit sector, we can win this battle." Forest gardening is not only part of permaculture practice but also an ancient tradition still alive and well in places like Kerala, India, and Martin Crawford's forest garden in southwest England, where his Agroforestry Research Trust offers courses and serves as a model for such communities across the UK.

But how can AI help sustainable and local farming practices prevail over industrial agriculture? Is it possible for local communities to take control of their food production? And how can AI and other new tech interfaces bring together communities and food production methods into a sustainable hybrid of traditional methods and innovative technology? We know already that the IoT (internet of things) is fast becoming the virtual space where AI is implemented within the latest farming technology.

As businesses invested in robotics grapple with the ethical implementation of food technology, we must be mindful of how strategies are implemented that incorporate the best of new tech with the best of old. Where AI is helping smaller farms become more profitable, all sorts of digital interfaces are transmitting knowledge, education, and the expansion of local farming methods. This means, for instance, that garden maintenance can be continued by others in the community when some members are absent through vacation or illness. Together with AI, customer experience becomes as much a business model as a local community standard for communication and empowerment.

The reality is that industrial farming need not take over local food production, and there are myriad ways communities can respond directly to climate change and the encroachment of big agriculture. The health benefits of local farming practices are already well known, as are the many ways smartphone technology can create high-yield farms within small urban spaces. It is high time that communities reclaim their space within urban centers, and that urban dwellers consider their food purchasing and consumption habits while building a future sustainability that allows everyone to participate in local food production. As media attention has recently focussed upon AI and industrial farming, we should encourage such technology to be used for local solutions that are far more sustainable and realistic than pushing big agriculture.

Proudly created by Starckgate 

© 2020 by Starckgate
