
Why 2026 is a hot year for lithium

22 January 2026 at 06:00

In 2026, I’m going to be closely watching the price of lithium.

If you’re not in the habit of obsessively tracking commodity markets, I certainly don’t blame you. (Though the news lately definitely makes the case that minerals can have major implications for global politics and the economy.)

But lithium is worthy of a close look right now.

The metal is crucial for lithium-ion batteries used in phones and laptops, electric vehicles, and large-scale energy storage arrays on the grid. Prices have been on quite the roller coaster over the last few years, and they’re ticking up again after a low period. What happens next could have big implications for mining and battery technology.

Before we look ahead, let’s take a quick trip down memory lane. In 2020, global EV sales started to really take off, driving up demand for the lithium used in their batteries. Because of that growing demand and a limited supply, prices shot up dramatically, with lithium carbonate going from under $10 per kilogram to a high of roughly $70 per kilogram in just two years.

And the tech world took notice. During those high points, there was a ton of interest in developing alternative batteries that didn’t rely on lithium. I was writing about sodium-based batteries, iron-air batteries, and even experimental ones that were made with plastic.

Researchers and startups were also hunting for alternative ways to get lithium, including battery recycling and processing methods like direct lithium extraction (more on this in a moment).

But soon, prices crashed back down to earth. EV demand in the US came in lower than expected, even as producers ramped up mining and processing capacity. Through late 2024 and 2025, lithium carbonate was back around $10 a kilogram. Avoiding lithium or finding new ways to get it suddenly looked a lot less crucial.

That brings us to today: lithium prices are ticking up again. So far, it’s nowhere close to the dramatic rise we saw a few years ago, but analysts are watching closely. Strong EV growth in China is playing a major role—EVs still make up about 75% of battery demand today. But growth in stationary storage (batteries for the grid) is also contributing to rising lithium demand in both China and the US.

Higher prices could create new opportunities. The possibilities include alternative battery chemistries, specifically sodium-ion batteries, says Evelina Stoikou, head of battery technologies and supply chains at BloombergNEF. (I’ll note here that we recently named sodium-ion batteries to our 2026 list of 10 Breakthrough Technologies.)

It’s not just batteries, though. Another industry that could see big changes from a lithium price swing: extraction.

Today, most lithium is mined from rocks, largely in Australia, before being shipped to China for processing. There’s a growing effort to process the mineral in other places, though, as countries try to create their own lithium supply chains. Tesla recently confirmed that it’s started production at its lithium refinery in Texas, which broke ground in 2023. We could see more investment in processing plants outside China if prices continue to climb.

This could also be a key year for direct lithium extraction, as Katie Brigham wrote in a recent story for Heatmap. That technology uses chemical or electrochemical processes to extract lithium from brine (salty water that’s usually sourced from salt lakes or underground reservoirs), quickly and cheaply. Companies including Lilac Solutions, Standard Lithium, and Rio Tinto are all making plans or starting construction on commercial facilities this year in the US and Argentina. 

If there’s anything I’ve learned about following batteries and minerals over the past few years, it’s that predicting the future is impossible. But if you’re looking for tea leaves to read, lithium prices deserve a look. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Three climate technologies breaking through in 2026

15 January 2026 at 06:00

Happy New Year! I know it’s a bit late to say, but it never quite feels like the year has started until the new edition of our 10 Breakthrough Technologies list comes out. 

For 25 years, MIT Technology Review has put together this package, which highlights the technologies that we think are going to matter in the future. This year’s version has some stars, including gene resurrection (remember all the dire wolf hype last year?) and commercial space stations.

And of course, the world of climate and energy is represented with sodium-ion batteries, next-generation nuclear, and hyperscale AI data centers. Let’s take a look at what ended up on the list, and what it says about this moment for climate tech. 

Sodium-ion batteries

I’ve been covering sodium-ion batteries for years, but this moment feels like a breakout one for the technology. 

Today, lithium-ion cells power everything from EVs, phones, and computers to huge stationary storage arrays that help support the grid. But researchers and battery companies have been racing to develop an alternative, driven by the relative scarcity of lithium and the metal’s volatile price in recent years. 

Sodium-ion batteries could be that alternative. Sodium is much more abundant than lithium, and it could unlock cheaper batteries that pose a lower fire risk.

There are limitations here: Sodium-ion batteries won’t be able to pack as much energy into cells as their lithium counterparts. But it might not matter, especially for grid storage and smaller EVs. 

In recent years, we’ve seen a ton of interest in sodium-based batteries, particularly from major companies in China. Now the new technology is starting to make its way into the world—CATL says it started manufacturing these batteries at scale in 2025. 

Next-generation nuclear

Nuclear reactors are an important part of grids around the world today—massive workhorse reactors generate reliable, consistent electricity. But the countries with the oldest and most built-out fleets have struggled to add to them in recent years, since reactors are massive and cost billions. Recent high-profile projects have gone way over budget and faced serious delays. 

Next-generation reactor designs could help the industry break out of the old blueprint and get more nuclear power online more quickly, and they’re starting to get closer to becoming reality. 

There’s a huge variety of proposals when it comes to what’s next for nuclear. Some companies are building smaller reactors, which they say could make it easier to finance new projects, and get them done on time. 

Other companies are focusing on tweaking key technical bits of reactors, using alternative fuels or coolants that help ferry heat out of the reactor core. These changes could help reactors generate electricity more efficiently and safely. 

Kairos Power was the first US company to receive approval to begin construction on a next-generation reactor to produce electricity. China is emerging as a major center of nuclear development, with the country’s national nuclear company reportedly working on several next-gen reactors. 

Hyperscale data centers

This one isn’t quite what I would call a climate technology, but I spent most of last year reporting on the climate and environmental impacts of AI, and the AI boom is deeply intertwined with climate and energy. 

Data centers aren’t new, but we’re seeing a wave of larger centers being proposed and built to support the rise of AI. Some of these facilities require a gigawatt or more of power—that’s like the output of an entire conventional nuclear power plant, just for one data center. 

(This feels like a good time to mention that our Breakthrough Technologies list doesn’t just highlight tech that we think will have a straightforwardly positive influence on the world. I think back to our 2023 list, which included mass-market military drones.)

There’s no denying that new, supersize data centers are an important force driving electricity demand, sparking major public pushback, and emerging as a key bit of our new global infrastructure. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Amazon supersizes its Walmart rivalry with new big-box retail concept

13 January 2026 at 13:34
A rendering of the future Amazon superstore outside of Chicago, from an Orland Park, Ill., planning document.

Amazon has spent two decades trying to disrupt Walmart’s dominance. Now, it appears the e-commerce giant is taking those efforts to a whole new scale.

A new proposal for a massive, 229,000-square-foot Amazon facility in suburban Chicago looks and feels a lot like a classic Walmart superstore but with distinctive Amazon elements, including the ability to order items via app or kiosk for fulfillment from the back of the store.

The company describes the plans as part of its culture of experimentation — calling it “a new concept that we think customers will be excited about.” Amazon says the store will offer fresh groceries, household essentials, and general merchandise, making it convenient for customers to shop a broad selection of items in one trip.

“This could just be another experiment, but as experiments go, it reveals a degree of Walmart jealousy that we didn’t expect,” wrote analysts Mike Levin and Josh Lowitz of Consumer Intelligence Research Partners (CIRP), in a report to subscribers this morning.

CIRP notes that while Amazon dominates e-commerce, online shopping accounts for less than 20% of U.S. retail spending, leaving the vast majority of consumer dollars on the table. 

Amazon has tried a variety of physical retail formats over the years, with mixed results, alongside its $13.7 billion acquisition of Whole Foods in 2017. Whole Foods CEO Jason Buechel was named a year ago to oversee Amazon’s Worldwide Grocery Stores business, including its Amazon Fresh stores.

The company says it already serves more than 150 million grocery shoppers in the U.S., generating over $100 billion in grocery sales in 2024.

But with data showing that 93% of Amazon customers still shop at Walmart, CIRP suggests this new superstore concept is Amazon’s admission that capturing the remaining addressable market requires building a physical moat that rivals the scale and utility of its biggest competitor.

While the footprint screams “traditional big box,” the plans signal that Amazon is attempting to put its own spin on the superstore format.

Filings with the Village of Orland Park indicate that a large portion of the building’s floor plan is designated for “back of house” operations that support in-store and pickup orders. Part of the idea is to solve a headache that plagues modern grocery stores: the clash between in-store shoppers and gig-economy workers.

During an Orland Park planning commission hearing, an Amazon rep described a tech-enabled experience where the digital and physical worlds merge for general merchandise.

A customer might find a sweater on the rack in blue, but want it in red. Instead of searching through piles of inventory, they could use a dedicated app or in-store kiosk to request the item from the back room, picking it up at the front counter when they are finished shopping.

This is similar to an Amazon experiment at its Whole Foods locations — building a “store within a store” to bridge the gap between niche organic offerings and mass-market items.

Amazon last fall unveiled an automated micro-fulfillment center attached to a Whole Foods in Plymouth Meeting, Pa. The concept allows shoppers to browse organic produce in the aisles while simultaneously ordering non-Whole Foods items — like Tide Pods, Pepsi, or Doritos — via an app. Robots in the back pick the items, and the full order is ready for the customer on site.

The Orland Park superstore appears to be an industrial-sized evolution of that experiment.

“We like to explain it as: ‘It’s the best that Amazon has to offer under Whole Foods, Fresh and their online offerings,’ ” said Katie Jahnke Dale, a lawyer representing Amazon at the hearing.

The site plan includes dedicated queuing areas for delivery drivers and separate pickup lanes for customers, streamlining the flow of goods without disrupting the in-store experience.

The planning commission voted 6-1 to recommend approval of the project. The proposal now heads to the Orland Park Village Board of Trustees for a final vote, which is scheduled for Jan. 19. If approved, village officials estimate the store could open in late 2027.

Spark Explained Like You’re Five

13 January 2026 at 08:00

Bitcoin Magazine

Some of you may remember an article I published years ago, Understanding Lightning Network Using an Abacus, which I wrote after it became clear to me that many people didn’t fully understand how Lightning works. At the time, my goal wasn’t to explain Lightning’s cryptography or implementation details, but to demystify the core idea behind payment channels. I used the analogy of the abacus to focus on the concept rather than the mechanics. It worked extremely well and people later adopted the abacus analogy to explain Lightning to noobs.

Lately, I’ve been feeling a strong sense of déjà vu.

When discussing Spark, I notice a similar pattern. Some know to say “statechain”, but for most, that’s where the understanding ends. And as with Lightning back then, the problem isn’t a lack of intelligence or effort, it’s simply that the underlying mental model isn’t clear. So I’ll try the same approach again: explain how Spark works conceptually, without getting into cryptographic terminology.

The Two-Piece Puzzle

At its core, Spark allows users to send and receive bitcoin without broadcasting on-chain transactions. The bitcoin doesn’t move on-chain when ownership changes. Instead, what changes is who can jointly authorize their spend. This joint authorization is shared between the user and a group of operators called a Spark Entity (SE).

To explain how this works, imagine that spending a given set of bitcoin on Spark requires completing a simple two-piece puzzle: 

  • One piece of the puzzle is held by the user. 
  • The other piece is held by the SE.

Only when both matching pieces come together can the bitcoin be spent. A different set of bitcoin will require the completion of a different puzzle. 

Now let’s walk through what happens when ownership changes.

Initially, Alice holds a puzzle piece that matches the piece held by the SE. She can spend her bitcoins by combining the pieces and completing the puzzle. When Alice wants to send her bitcoins to Bob, she allows Bob to create a new puzzle together with the SE. Importantly, the puzzle itself doesn’t change: the old and new puzzle have the same shape, but the pieces that compose it change. The new puzzle is designated for Bob: one side is associated with Bob and the other with the SE.

From that point on, only Bob’s piece matches the SE’s piece. Alice may still retain her old puzzle piece, but it’s now useless. Since the SE destroyed its matching piece, Alice’s piece no longer fits any other piece and cannot be used to spend the bitcoin. Ownership has effectively moved to Bob, even though the bitcoin in question never moved on-chain.

Bob can later repeat the same process to send the same set of bitcoin to Carol and so on. Each transfer works by replacing the puzzle pieces, not by moving the funds on-chain.
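For readers who think in code, the puzzle mechanics above can be sketched as a toy model. This is a deliberate oversimplification: real Spark uses threshold cryptography shared among multiple operators, not simple value matching, and every name here is illustrative rather than taken from any actual Spark implementation.

```python
# Toy model of Spark's "two-piece puzzle" (illustrative only).
import secrets

class SparkEntity:
    """Stands in for the operator group. Holds one piece per active puzzle."""
    def __init__(self):
        self.pieces = {}  # puzzle_id -> the SE's current matching piece

    def new_puzzle(self, owner_piece):
        # The SE creates its half of a fresh puzzle for the current owner.
        puzzle_id = secrets.token_hex(4)
        self.pieces[puzzle_id] = owner_piece  # "matches" = equality, in this toy
        return puzzle_id

    def transfer(self, puzzle_id, new_owner_piece):
        # Replace the SE's half and destroy the old one. The puzzle's
        # "shape" (puzzle_id) stays the same; only the pieces change.
        self.pieces[puzzle_id] = new_owner_piece

    def can_spend(self, puzzle_id, owner_piece):
        return self.pieces.get(puzzle_id) == owner_piece

se = SparkEntity()
alice_piece = secrets.token_hex(8)
puzzle = se.new_puzzle(alice_piece)
assert se.can_spend(puzzle, alice_piece)      # Alice controls the funds

bob_piece = secrets.token_hex(8)
se.transfer(puzzle, bob_piece)                # ownership moves, nothing on-chain
assert se.can_spend(puzzle, bob_piece)        # Bob's piece now fits
assert not se.can_spend(puzzle, alice_piece)  # Alice's old piece is useless
```

The key property the toy captures is that Alice's piece stops working the moment the SE replaces its half, without any on-chain transaction.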

At this point, a question naturally arises: what if the SE simply doesn’t discard its old puzzle piece? In that case, the SE could collude with the previous owner, Alice, and spend Bob’s bitcoin. We need to trust the SE that, when ownership moved from Alice to Bob, it also destroyed its piece of the puzzle. However, it’s important to understand that an SE is not a single party. It consists of a group of operators, and the SE’s side of the puzzle is never held by one operator alone. Replacing the puzzle requires cooperation among multiple operators. No single party can secretly keep an old puzzle active or recreate it later. It’s enough for one operator to act honestly during a transfer to prevent an old puzzle from ever being reactivated.

The key idea is simple: Spark doesn’t move bitcoin on-chain between users. It replaces who holds the valid authorization to spend them. The on-chain bitcoin doesn’t move. What changes is which two puzzle pieces fit together.

To keep this explanation focused, I intentionally didn’t get into Spark’s unilateral exit mechanism. It’s an important part of Spark’s security model, but it would distract from the core idea I want to convey here. What matters is that Spark is not a system where users are permanently dependent on the SE. While everyday transfers rely on joint authorization, Spark also provides users with a way to spend their funds on-chain without requiring the cooperation of the SE. That escape hatch exists by design; it’s just outside the scope of this explanation.

This post Spark Explained Like You’re Five first appeared on Bitcoin Magazine and is written by Roy Sheinfeld.

What new legal challenges mean for the future of US offshore wind

8 January 2026 at 06:00

For offshore wind power in the US, the new year is bringing new legal battles.

On December 22, the Trump administration announced it would pause the leases of five wind farms currently under construction off the US East Coast. Developers were ordered to stop work immediately.

The cited reason? National security, specifically concerns that turbines can cause radar interference. But that’s a known issue, and developers have worked with the government to deal with it for years.

Companies have been quick to file lawsuits, and the court battles could begin as soon as this week. Here’s what the latest kerfuffle might mean for the struggling offshore wind industry in the US.

This pause affects $25 billion in investment in five wind farms: Vineyard Wind 1 off Massachusetts, Revolution Wind off Rhode Island, Sunrise Wind and Empire Wind off New York, and Coastal Virginia Offshore Wind off Virginia. Together, those projects had been expected to create 10,000 jobs and power more than 2.5 million homes and businesses.

In a statement announcing the move, the Department of the Interior said that “recently completed classified reports” revealed national security risks, and that the pause would give the government time to work through concerns with developers. The statement specifically says that turbines can create radar interference (more on the technical details here in a moment).

Three of the companies involved have already filed lawsuits, and they’re seeking preliminary injunctions that would allow construction to continue. Orsted and Equinor (the developers for Revolution Wind and Empire Wind, respectively) told the New York Times that their projects went through lengthy federal reviews, which did address concerns about national security.

This is just the latest salvo from the Trump administration against offshore wind. On Trump’s first day in office, he signed an executive order stopping all new lease approvals for offshore wind farms. (That order was struck down by a judge in December.)

The administration previously ordered Revolution Wind to stop work last year, also citing national security concerns. A federal judge lifted the stop-work order weeks later, after the developer showed that the financial stakes were high, and that government agencies had previously found no national security issues with the project.

There are real challenges that wind farms introduce for radar systems, which are used in everything from air traffic control to weather forecasting to national defense operations. A wind turbine’s spinning can create complex signatures on radar, resulting in so-called clutter.

Previous government reports, including a 2024 report from the Department of Energy and a 2025 report from the Government Accountability Office (an independent government watchdog), have flagged this issue.

“To date, no mitigation technology has been able to fully restore the technical performance of impacted radars,” as the DOE report puts it. However, there are techniques that can help, including software that acts to remove the signatures of wind turbines. (Think of this as similar to how noise-canceling headphones work, but more complicated, as one expert told TechCrunch.)
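The noise-canceling analogy can be illustrated with a toy calculation. This is purely a sketch of the general idea, not how actual radar mitigation software works: if spinning blades add a repeating signature with a known period, averaging the return over that period estimates the signature, and subtracting it lets other echoes stand out.

```python
# Toy periodic-clutter removal: estimate a repeating "blade" signature by
# averaging over its known period, then subtract it from the received signal.
import math

P = 8              # clutter period in samples (assumed known)
N = 10 * P         # total samples

clutter = [math.sin(2 * math.pi * n / P) for n in range(N)]  # blade signature
target = [1.0 if n == 37 else 0.0 for n in range(N)]         # one lone echo
received = [t + c for t, c in zip(target, clutter)]

# Average the received signal at each phase of the period to estimate clutter.
estimate = [sum(received[p::P]) / len(received[p::P]) for p in range(P)]
cleaned = [received[n] - estimate[n % P] for n in range(N)]

peak = max(range(N), key=lambda n: abs(cleaned[n]))
print(peak)  # the echo at sample 37 now dominates the cleaned signal
```

In the raw signal the echo is buried in a sinusoid of the same amplitude; after subtraction it is roughly nine times larger than any residual clutter.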

But the most widespread and helpful tactic, according to the DOE report, is collaboration between developers and the government. By working together to site and design wind farms strategically, the groups can ensure that the projects don’t interfere with government or military operations. The 2025 GAO report found that government officials, researchers, and offshore wind companies were collaborating effectively, and any concerns could be raised and addressed in the permitting process.

This and other challenges threaten an industry that could be a major boon for the grid. On the East Coast, where these projects are located, and in New England specifically, winter can bring tight fossil-fuel supplies and spiking prices because of high demand. It just so happens that offshore winds blow strongest in the winter, so new projects, including the five wrapped up in this fight, could be a major help during the grid’s greatest time of need.

One 2025 study found that if 3.5 gigawatts’ worth of offshore wind had been operational during the 2024-2025 winter, it would have lowered energy prices by 11%. (That’s the combined capacity of Revolution Wind and Vineyard Wind, two of the paused projects, plus two future projects in the pipeline.) Ratepayers would have saved $400 million.
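As a back-of-envelope check on those two numbers (my own inference, not a figure from the study): if $400 million in ratepayer savings corresponds to an 11% price reduction, the implied baseline winter energy spend is about $3.6 billion.

```python
# Implied baseline winter energy spend, inferred from the study's figures.
savings = 400e6      # $400 million in ratepayer savings
reduction = 0.11     # 11% price reduction
baseline = savings / reduction
print(round(baseline / 1e9, 1))  # ≈ 3.6 (billion dollars)
```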

Before Donald Trump was elected, the energy consultancy BloombergNEF projected that the US would build 39 gigawatts of offshore wind by 2035. Today, that expectation has dropped to just 6 gigawatts. These legal battles could push it lower still.

What’s hardest to wrap my head around is that some of the projects being challenged are nearly finished. The developers of Revolution Wind have installed all the foundations and 58 of 65 turbines, and they say the project is over 87% complete. Empire Wind is over 60% done and is slated to deliver electricity to the grid next year.

To hit the pause button so close to the finish line is chilling, not just for current projects but for future offshore wind efforts in the US. Even if these legal battles clear up and more developers can technically enter the queue, why would they want to? Billions of dollars are at stake, and if there’s one word to describe the current state of the offshore wind industry in the US, it’s “unpredictable.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Four bright spots in climate news in 2025

Climate news hasn’t been great in 2025. Global greenhouse-gas emissions hit record highs (again). This year is set to be either the second or third warmest on record. Climate-fueled disasters like wildfires in California and flooding in Indonesia and Pakistan devastated communities and caused billions in damage.

In addition to these worrying indicators of our continued contributions to climate change and their obvious effects, the world’s largest economy has made a sharp U-turn on climate policy this year. The US under the Trump administration withdrew from the Paris Agreement, cut funds for climate research, and scrapped billions of dollars in funding for climate tech projects.

We’re in a severe situation with climate change. But for those looking for bright spots, there was some good news in 2025. Here are a few of the positive stories our climate reporters noticed this year.

China’s flattening emissions


One of the most notable and encouraging signs of progress this year occurred in China. The world’s second-biggest economy and biggest climate polluter has managed to keep carbon dioxide emissions flat for the last year and a half, according to an analysis in Carbon Brief.

That’s happened before, but only when the nation’s economy was contracting, including in the midst of the covid-19 pandemic. But emissions are now falling even as China’s economy is on track to grow about 5% this year and electricity demand continues to rise.

So what’s changed? China has now installed so much solar and wind, and put so many EVs on the road, that its economy can continue to expand without increasing the amount of carbon dioxide it’s pumping into the atmosphere, decoupling emissions from economic growth.

Specifically, China added an astounding 240 gigawatts of solar power capacity and 61 gigawatts of wind power in the first nine months of the year, the Carbon Brief analysis noted. In just three quarters, that’s nearly as much solar as the US has installed in total.

It’s too early to say China’s emissions have peaked, but the country has said it will officially reach that benchmark before 2030.

To be clear, China still isn’t moving fast enough to keep the world on track for meeting relatively safe temperature targets. (Indeed, very few countries are.) But it’s now both producing most of the world’s clean energy technologies and curbing its emissions growth, providing a model for cleaning up industrial economies without sacrificing economic prosperity—and setting the stage for faster climate progress in the coming years.

Batteries on the grid


It’s hard to overstate just how quickly batteries for grid storage are coming online. These massive arrays of cells can soak up electricity when sources like solar are available and prices are low, then discharge power back to the grid when it’s needed most.

Back in 2015, the industry had installed only a fraction of a gigawatt of battery storage capacity across the US. That year, it set a seemingly bold target of adding 35 gigawatts by 2035. The sector passed that goal a decade early this year, then hit 40 gigawatts a couple of months later.

Costs are still falling, which could help maintain the momentum for the technology’s deployment. This year, battery prices for EVs and stationary storage fell yet again, reaching a record low, according to data from BloombergNEF. Battery packs specifically used for grid storage saw prices fall even faster than the average; they cost 45% less than last year.

We’re starting to see what happens on grids with lots of battery capacity, too: in California and Texas, batteries are already helping meet demand in the evenings, reducing the need to run natural-gas plants. The result: a cleaner, more stable grid.

AI’s energy funding influx


The AI boom is complicated for our energy system, as we covered at length this year. Electricity demand is ticking up: the amount of power utilities supplied to US data centers jumped 22% this year and is projected to more than double by 2030.

But at least one positive shift is coming out of AI’s influence on energy: It’s driving renewed interest and investment in next-generation energy technologies.

In the near term, much of the energy needed for data centers, including those that power AI, will likely come from fossil fuels, especially new natural-gas power plants. But tech giants like Google, Microsoft, and Meta all have goals on the books to reduce their greenhouse-gas emissions, so they’re looking for alternatives.

Meta signed a deal with XGS Energy in June to purchase up to 150 megawatts of electricity from a geothermal plant. In October, Google signed an agreement that will help reopen Duane Arnold Energy Center in Iowa, a previously shuttered nuclear power plant.

Geothermal and nuclear could be key pieces of the grid of the future, as they can provide constant power in a way that wind and solar don’t. There’s a long way to go for many of the new versions of the tech, but more money and interest from big, powerful players can’t hurt.

Good news, bad news


Perhaps the strongest evidence of collective climate progress so far: We’ve already avoided the gravest dangers that scientists feared just a decade ago.

The world is on track for about 2.6 °C of warming over preindustrial conditions by 2100, according to Climate Action Tracker, an independent scientific effort to track the policy progress that nations have made toward their goals under the Paris climate agreement.

That’s a lot warmer than we want the planet to ever get. But it’s also a whole degree better than the 3.6 °C path that we were on a decade ago, just before nearly 200 countries signed the Paris deal.

That progress occurred because more and more nations passed emissions mandates, funded subsidies, and invested in research and development—and private industry got busy cranking out vast amounts of solar panels, wind turbines, batteries, and EVs. 

The bad news is that progress has stalled. Climate Action Tracker notes that its warming projections have remained stubbornly fixed for the last four years, as nations have largely failed to take the additional action needed to bend that curve closer to the 2 °C goal set out in the international agreement.

But having shaved off a degree of danger is still demonstrable proof that we can pull together in the face of a global threat and address a very, very hard problem. And it means we’ve done the difficult work of laying down the technical foundation for a society that can largely run without spewing ever more greenhouse gas into the atmosphere.

Hopefully, as cleantech continues to improve and climate change steadily worsens, the world will find the collective will to pick up the pace again soon.

Can AI really help us discover new materials?

18 December 2025 at 06:00

Judging from headlines and social media posts in recent years, one might reasonably assume that AI is going to fix the power grid, cure the world’s diseases, and finish my holiday shopping for me. But maybe there’s just a whole lot of hype floating around out there.

This week, we published a new package called Hype Correction. The collection of stories takes a look at how the world is starting to reckon with the reality of what AI can do, and what’s just fluff.

One of my favorite stories in that package comes from my colleague David Rotman, who took a hard look at AI for materials research. AI could transform the process of discovering new materials—innovation that could be especially useful in the world of climate tech, which needs new batteries, semiconductors, magnets, and more. 

But the field still needs to prove it can make materials that are actually novel and useful. Can AI really supercharge materials research? What could that look like?

For researchers hoping to find new ways to power the world (or cure disease or achieve any number of other big, important goals), a new material could change everything.

The problem is, inventing materials is difficult and slow. Just look at plastic—the first totally synthetic plastic was invented in 1907, but it took until roughly the 1950s for companies to produce the wide range we’re familiar with today. (And of course, though it is incredibly useful, plastic also causes no shortage of complications for society.)

In recent decades, materials science has fallen a bit flat—David has been covering this field for nearly 40 years, and as he puts it, there have been just a few major commercial breakthroughs in that time. (Lithium-ion batteries are one.)

Could AI change everything? The prospect is a tantalizing one, and companies are racing to test it out.

Lila Sciences, based in Cambridge, Massachusetts, is working on using AI models to uncover new materials. The company can not only train an AI model on all the latest scientific literature, but also plug it into an automated lab, so it can learn from experimental data. The goal is to speed up the iterative process of inventing and testing new materials and look at research in ways that humans might miss.

At an MIT Technology Review event earlier this year, I got to listen to David interview Rafael Gómez-Bombarelli, one of Lila’s cofounders. As he described what the company is working on, Gómez-Bombarelli acknowledged that AI materials discovery hasn’t seen a big breakthrough moment. Yet.

Gómez-Bombarelli described how models Lila has trained are providing insights that are “as deep [as] or deeper than our domain scientists would have.” In the future, AI could “think” in ways that depart from how human scientists approach a problem, he added: “There will be a need to translate scientific reasoning by AI to the way we think about the world.”

It’s exciting to see this sort of optimism in materials research, but there’s still a long and winding road before we can credibly say that AI has transformed the field. One major difficulty is that it’s one thing to take suggestions from a model about new experimental methods or new potential structures. It’s quite another to actually make a material and show that it’s novel and useful.

You might remember that a couple of years ago, Google DeepMind announced it had used AI to predict the structures of “millions of new materials” and had made hundreds of them in the lab.

But as David notes in his story, after that announcement, some materials scientists pointed out that some of the supposedly novel materials were basically slightly different versions of known ones. Others couldn’t even physically exist in normal conditions (the simulations were done at ultra-low temperatures, where atoms don’t move around much).

It’s possible that AI could give materials discovery a much-needed jolt and usher in a new age that brings superconductors and batteries and magnets we’ve never seen before. But for now, I’m calling hype. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Solar geoengineering startups are getting serious

11 December 2025 at 06:00

Solar geoengineering aims to manipulate the climate by bouncing sunlight back into space. In theory, it could ease global warming. But as interest in the idea grows, so do concerns about potential consequences.

A startup called Stardust Solutions recently raised a $60 million funding round, the largest known to date for a geoengineering startup. My colleague James Temple has a new story out about the company, and how its emergence is making some researchers nervous.

So far, the field has been limited to debates, proposed academic research, and—sure—a few fringe actors to keep an eye on. Now things are getting more serious. What does it mean for geoengineering, and for the climate?

Researchers have considered the possibility of addressing planetary warming this way for decades. We already know that volcanic eruptions, which spew sulfur dioxide into the atmosphere, can reduce temperatures. The thought is that we could mimic that natural process by spraying particles up there ourselves.

The prospect is a controversial one, to put it lightly. Many have concerns about unintended consequences and uneven benefits. Even public research led by top institutions has faced barriers—one famous Harvard research program was officially canceled last year after years of debate.

One of the difficulties of geoengineering is that in theory a single entity, like a startup company, could make decisions that have a widespread effect on the planet. And in the last few years, we’ve seen more interest in geoengineering from the private sector. 

Three years ago, James broke the story that Make Sunsets, a California-based company, was already releasing particles into the atmosphere in an effort to tweak the climate.

The company’s CEO, Luke Iseman, went to Baja California in Mexico, stuck some sulfur dioxide into a weather balloon, and sent it skyward. The amount of material was tiny, and it’s not clear that it even made it into the right part of the atmosphere to reflect any sunlight.

But fears that this group or others could go rogue and do their own geoengineering led to widespread backlash. Mexico announced plans to restrict geoengineering experiments in the country a few weeks after that news broke.

You can still buy cooling credits from Make Sunsets, and the company was just granted a patent for its system. But the startup is seen as something of a fringe actor.

Enter Stardust Solutions. The company has been working under the radar for a few years, but it has started talking about its work more publicly this year. In October, it announced a significant funding round, led by some top names in climate investing. “Stardust is serious, and now it’s raised serious money from serious people,” as James puts it in his new story.

That’s making some experts nervous. Even those who believe we should be researching geoengineering are concerned about what it means for private companies to do so.

“Adding business interests, profit motives, and rich investors into this situation just creates more cause for concern, complicating the ability of responsible scientists and engineers to carry out the work needed to advance our understanding,” write David Keith and Daniele Visioni, two leading figures in geoengineering research, in a recent opinion piece for MIT Technology Review.

Stardust insists that it won’t move forward with any geoengineering until and unless it’s commissioned to do so by governments and there are rules and bodies in place to govern use of the technology.

But there’s no telling how financial pressure might change that, down the road. And we’re already seeing some of the challenges faced by a private company in this space: the need to keep trade secrets.

Stardust is currently not sharing information about the particles it intends to release into the sky, though it says it plans to do so once it secures a patent, which could happen as soon as next year. The company argues that its proprietary particles will be safe, cheap to manufacture, and easier to track than the already abundant sulfur dioxide. But at this point, there’s no way for external experts to evaluate those claims.

As Keith and Visioni put it: “Research won’t be useful unless it’s trusted, and trust depends on transparency.”

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Why the grid relies on nuclear reactors in the winter

4 December 2025 at 06:00

As many of us are ramping up with shopping, baking, and planning for the holiday season, nuclear power plants are also getting ready for one of their busiest seasons of the year.

Here in the US, nuclear reactors follow predictable seasonal trends. Summer and winter tend to see the highest electricity demand, so plant operators schedule maintenance and refueling for other parts of the year.

This scheduled regularity might seem mundane, but it’s quite the feat that operational reactors are as reliable and predictable as they are. It leaves some big shoes to fill for next-generation technology hoping to join the fleet in the next few years.

Generally, nuclear reactors operate at constant levels, as close to full capacity as possible. In 2024, for commercial reactors worldwide, the average capacity factor—the ratio of actual energy output to the theoretical maximum—was 83%. North America rang in at an average of about 90%.

(I’ll note here that it’s not always fair to just look at this number to compare different kinds of power plants—natural-gas plants can have lower capacity factors, but that’s mostly because they’re more likely to be intentionally turned on and off to help meet uneven demand.)
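To make that ratio concrete, here’s a toy capacity-factor calculation in Python. The reactor size and output figures are invented for illustration; they’re not from any real plant.

```python
# Capacity factor: actual energy output divided by the theoretical
# maximum if the plant ran at its full rated power all year.
rated_mw = 1000          # hypothetical 1,000 MW reactor
hours_per_year = 8760
actual_mwh = 7_884_000   # made-up annual output, e.g. after a refueling outage

theoretical_max_mwh = rated_mw * hours_per_year
capacity_factor = actual_mwh / theoretical_max_mwh
print(f"{capacity_factor:.0%}")  # → 90%
```

A reactor sitting idle for a 45-day refueling outage but otherwise at full power would land right around this 90% figure, which is why scheduled outages dominate the math.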

Those high capacity factors also undersell the fleet’s true reliability—a lot of the downtime is scheduled. Reactors need to refuel every 18 to 24 months, and operators tend to schedule those outages for the spring and fall, when electricity demand isn’t as high as when we’re all running our air conditioners or heaters at full tilt.

Take a look at this chart of nuclear outages from the US Energy Information Administration. There are some days, especially at the height of summer, when outages are low and nearly all commercial reactors in the US are operating at close to full capacity. On July 28 of this year, the fleet was operating at 99.6% of capacity. Compare that with October 18, when reactors taken offline for refueling and maintenance brought the fleet down to 77.6%. Now we’re heading into another busy season, when reactors are coming back online and shutdowns are entering another low point.

That’s not to say all outages are planned. At the Sequoyah nuclear power plant in Tennessee, a generator failure in July 2024 took one of two reactors offline, an outage that lasted nearly a year. (The utility also did some maintenance during that time to extend the life of the plant.) Then, just days after that reactor started back up, the entire plant had to shut down because of low water levels.

And who can forget the incident earlier this year when jellyfish wreaked havoc on not one but two nuclear power plants in France? In the second instance, the squishy creatures got into the filters of equipment that sucks water out of the English Channel for cooling at the Paluel nuclear plant. They forced the plant to cut output by nearly half, though it was restored within days.

Barring jellyfish disasters and occasional maintenance, the global nuclear fleet operates quite reliably. That wasn’t always the case, though. In the 1970s, reactors operated at an average capacity factor of just 60%. They were shut down nearly as often as they were running.

The fleet of reactors today has benefited from decades of experience. Now we’re seeing a growing pool of companies aiming to bring new technologies to the nuclear industry.

Next-generation reactors that use new materials for fuel or cooling will be able to borrow some lessons from the existing fleet, but they’ll also face novel challenges.

That could mean early demonstration reactors aren’t as reliable as the current commercial fleet at first. “First-of-a-kind nuclear, just like with any other first-of-a-kind technologies, is very challenging,” says Koroush Shirvan, a professor of nuclear science and engineering at MIT.

That means it will probably take time for molten-salt reactors, small modular reactors, or any of the other designs out there to overcome technical hurdles and settle into their own rhythm. It’s taken decades to get to a place where we take it for granted that the nuclear fleet can follow a neat seasonal curve based on electricity demand.

There will always be hurricanes and electrical failures and jellyfish invasions that cause some unexpected problems and force nuclear plants (or any power plants, for that matter) to shut down. But overall, the fleet today operates at an extremely high level of consistency. One of the major challenges ahead for next-generation technologies will be proving that they can do the same.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

This year’s UN climate talks avoided fossil fuels, again

27 November 2025 at 06:00

If we didn’t have pictures and videos, I almost wouldn’t believe the imagery that came out of this year’s UN climate talks.

Over the past few weeks in Belém, Brazil, attendees dealt with oppressive heat and flooding, and at one point a literal fire broke out, delaying negotiations. The symbolism was almost too much to bear.

While many, including the president of Brazil, framed this year’s conference as one of action, the talks ended with a watered-down agreement. The final draft doesn’t even include the phrase “fossil fuels.”

As emissions and global temperatures reach record highs again this year, I’m left wondering: Why is it so hard to formally acknowledge what’s causing the problem?

This is the 30th time that leaders have gathered for the Conference of the Parties, or COP, an annual UN conference focused on climate change. COP30 also marks 10 years since the gathering that produced the Paris Agreement, in which world powers committed to limiting global warming to “well below” 2.0 °C above preindustrial levels, with a goal of staying below the 1.5 °C mark. (That’s 3.6 °F and 2.7 °F, respectively, for my fellow Americans.)

Before the conference kicked off this year, host country Brazil’s president, Luiz Inácio Lula da Silva, cast this as the “implementation COP” and called for negotiators to focus on action, and specifically to deliver a road map for a global transition away from fossil fuels.

The science is clear—burning fossil fuels emits greenhouse gases and drives climate change. Reports have shown that meeting the goal of limiting warming to 1.5 °C would require stopping new fossil-fuel exploration and development.

The problem is, “fossil fuels” might as well be a curse word at global climate negotiations. Two years ago, fights over how to address fossil fuels brought talks at COP28 to a standstill. (It’s worth noting that the conference was hosted in Dubai, in the UAE, and its president was literally the head of the country’s national oil company.)

The agreement in Dubai ended up including a line that called on countries to transition away from fossil fuels in energy systems. It was short of what many advocates wanted, which was a more explicit call to phase out fossil fuels entirely. But it was still hailed as a win. As I wrote at the time: “The bar is truly on the floor.”

And yet this year, it seems we’ve dug into the basement.

At one point, about 80 countries, a little under half of those present, demanded a concrete plan to move away from fossil fuels.

But oil producers like Saudi Arabia were insistent that fossil fuels not be singled out. Other countries, including some in Africa and Asia, also made a very fair point: Western nations like the US have burned the most fossil fuels and benefited from it economically. This contingent maintains that legacy polluters have a unique responsibility to finance the transition for less wealthy and developing nations rather than simply barring them from taking the same development route. 

The US, by the way, didn’t send a formal delegation to the talks, for the first time in 30 years. But the absence spoke volumes. In a statement to the New York Times that sidestepped the COP talks, White House spokesperson Taylor Rogers said that President Trump had “set a strong example for the rest of the world” by pursuing new fossil-fuel development.

To sum up: Some countries are economically dependent on fossil fuels, some don’t want to stop depending on fossil fuels without incentives from other countries, and the current US administration would rather keep using fossil fuels than switch to other energy sources. 

All those factors combined help explain why, in its final form, COP30’s agreement doesn’t name fossil fuels at all. Instead, there’s a vague line saying that leaders should take into account the decisions made in Dubai, and an acknowledgment that the “global transition towards low greenhouse-gas emissions and climate-resilient development is irreversible and the trend of the future.”

Hopefully, that’s true. But it’s concerning that even on the world’s biggest stage, naming what we’re supposed to be transitioning away from and putting together any sort of plan to actually do it seems to be impossible.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Three things to know about the future of electricity

20 November 2025 at 04:00

One of the dominant storylines I’ve been following through 2025 is electricity—where and how demand is going up, how much it costs, and how this all intersects with that topic everyone is talking about: AI.

Last week, the International Energy Agency released the latest version of the World Energy Outlook, the annual report that takes stock of the current state of global energy and looks toward the future. It contains some interesting insights and a few surprising figures about electricity, grids, and the state of climate change. So let’s dig into some numbers, shall we?

We’re in the age of electricity

Energy demand in general is going up around the world as populations increase and economies grow. But electricity is the star of the show, with demand projected to grow by 40% in the next 10 years.

China has accounted for the bulk of electricity growth for the past 10 years, and that’s going to continue. But emerging economies outside China will be a much bigger piece of the pie going forward. And while advanced economies, including the US and Europe, have seen flat demand in the past decade, the rise of AI and data centers will cause demand to climb there as well.

Air-conditioning is a major source of rising demand. Growing economies will give more people access to air-conditioning; income-driven AC growth will add about 330 gigawatts to global peak demand by 2035. Rising temperatures will tack on another 170 GW in that time. Together, that’s an increase of over 10% from 2024 levels.  

AI is a local story

This year, AI has been the story that none of us can get away from. One number that jumped out at me from this report: In 2025, investment in data centers is expected to top $580 billion. That’s more than the $540 billion spent on the global oil supply. 

It’s no wonder, then, that the energy demands of AI are in the spotlight. One key takeaway is that these demands are vastly different in different parts of the world.

Data centers still make up less than 10% of the projected increase in total electricity demand between now and 2035. It’s not nothing, but it’s far outweighed by sectors like industry and appliances, including air conditioners. Even electric vehicles will add more demand to the grid than data centers.

But AI will be the defining factor for the grid in some parts of the world. In the US, data centers will account for half the growth in total electricity demand between now and 2030.

And as we’ve covered in this newsletter before, data centers present a unique challenge, because they tend to be clustered together, so the demand tends to be concentrated around specific communities and on specific grids. Half the data center capacity that’s in the pipeline is close to large cities.

Look out for a coal crossover

As we ask more from our grid, the key factor that’s going to determine what all this means for climate change is what’s supplying the electricity we’re using.

As it stands, the world’s grids still primarily run on fossil fuels, so every bit of electricity growth comes with planet-warming greenhouse-gas emissions attached. That’s slowly changing, though.

Together, solar and wind were the leading source of electricity in the first half of this year, overtaking coal for the first time. Coal use could peak and begin to fall by the end of this decade.

Nuclear could play a role in replacing fossil fuels: After two decades of stagnation, the global nuclear fleet could increase by a third in the next 10 years. Solar is set to continue its meteoric rise, too. Of all the electricity demand growth we’re expecting in the next decade, 80% is in places with high-quality solar irradiation—meaning they’re good spots for solar power.

Ultimately, there are a lot of ways in which the world is moving in the right direction on energy. But we’re far from moving fast enough. Global emissions are, once again, going to hit a record high this year. To limit warming and prevent the worst effects of climate change, we need to remake our energy system, including electricity, and we need to do it faster. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Google is still aiming for its “moonshot” 2030 energy goals

13 November 2025 at 06:00

Last week, we hosted EmTech MIT, MIT Technology Review’s annual flagship conference in Cambridge, Massachusetts. Over the course of three days of main-stage sessions, I learned about innovations in AI, biotech, and robotics. 

But as you might imagine, some of this climate reporter’s favorite moments came in the climate sessions. I was listening especially closely to my colleague James Temple’s discussion with Lucia Tian, head of advanced energy technologies at Google. 

They spoke about the tech giant’s growing energy demand and the sorts of technologies the company is looking to in order to meet it. In case you weren’t able to join us, let’s dig into that session and consider how the company is thinking about energy in the face of AI’s rapid rise.

I’ve been closely following Google’s work in energy this year. Like the rest of the tech industry, the company is seeing ballooning electricity demand in its data centers. That could get in the way of a major goal that Google has been talking about for years. 

See, back in 2020, the company announced an ambitious target: by 2030, it aimed to run on carbon-free energy 24-7. Basically, that means Google would purchase enough renewable energy on the grids where it operates to meet its entire electricity demand, and the purchases would match up so the electricity would have to be generated when the company was actually using energy. (For more on the nuances of Big Tech’s renewable-energy pledges, check out James’s piece from last year.)
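The hourly-matching part of that goal is what makes it so hard. Here’s a toy calculation, with entirely made-up numbers, showing why matching clean-energy purchases hour by hour is a stricter test than simply buying enough renewables over a whole year:

```python
# Hypothetical hourly figures for a four-hour window (MWh).
clean = [50, 80, 120, 60]   # carbon-free supply purchased each hour
load  = [70, 70, 70, 70]    # data-center demand each hour

# Annual-style matching: compare totals, ignoring timing.
annual_match = sum(clean) / sum(load)

# 24-7 matching: clean energy only counts in the hour it's generated,
# capped at that hour's demand. Surpluses can't cover shortfalls later.
hourly_match = sum(min(c, l) for c, l in zip(clean, load)) / sum(load)

print(f"annual: {annual_match:.0%}, hourly: {hourly_match:.0%}")
```

With these numbers, total clean purchases exceed total demand (over 100% on an annual basis), yet the hourly score comes in around 89%, because the midday surplus can’t paper over the shortfall in other hours. That gap is roughly the problem Google is trying to close.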

Google’s is an ambitious goal, and on stage, Tian said that the company is still aiming for it but acknowledged that it’s looking tough with the rise of AI. 

“It was always a moonshot,” she said. “It’s something very, very hard to achieve, and it’s only harder in the face of this growth. But our perspective is, if we don’t move in that direction, we’ll never get there.”

Google’s total electricity demand more than doubled from 2020 to 2024, according to its latest Environmental Report. As for that goal of 24-7 carbon-free energy? The company is basically treading water. While it was at 67% for its data centers in 2020, last year it came in at 66%. 

Not going backwards is something of an accomplishment, given the rapid growth in electricity demand. But it still leaves the company some distance away from its finish line.

To close the gap, Google has been signing what feels like constant deals in the energy space. Two recent announcements that Tian talked about on stage were a project involving carbon capture and storage at a natural-gas plant in Illinois and plans to reopen a shuttered nuclear power plant in Iowa. 

Let’s start with carbon capture. Google signed an agreement to purchase most of the electricity from a new natural-gas plant, which will capture and store about 90% of its carbon dioxide emissions. 

That announcement was controversial, with critics arguing that carbon capture keeps fossil-fuel infrastructure online longer and still releases greenhouse gases and other pollutants into the atmosphere. 

One question that James raised on stage: Why build a new natural-gas plant rather than add equipment to an already existing facility? Tacking on equipment to an operational plant would mean cutting emissions from the status quo, rather than adding entirely new fossil-fuel infrastructure. 

The company did consider many existing plants, Tian said. But, as she put it, “Retrofits aren’t going to make sense everywhere.” Space can be limited at existing plants, for example, and many may not have the right geology to store carbon dioxide underground. 

“We wanted to lead with a project that could prove this technology at scale,” Tian said. This site has an operational Class VI well, the type used for permanent sequestration, she added, and it also doesn’t require a big pipeline buildout. 

Tian also touched on the company’s recent announcement that it’s collaborating with NextEra Energy to reopen Duane Arnold Energy Center, a nuclear power plant in Iowa. The company will purchase electricity from that plant, which is scheduled to reopen in 2029. 

As I covered in a story earlier this year, Duane Arnold was basically the final option in the US for companies looking to reopen shuttered nuclear power plants. “Just a few years back, we were still closing down nuclear plants in this country,” Tian said on stage. 

While each reopening will look a little different, Tian highlighted the groups working to restart the Palisades plant in Michigan, which was the first reopening to be announced, last spring. “They’re the real heroes of the story,” she said.

I’m always interested to get a peek behind the curtain at how Big Tech is thinking about energy. I’m skeptical but certainly interested to see how Google’s, and the rest of the industry’s, goals shape up over the next few years. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Stop worrying about your AI footprint. Look at the big picture instead.

6 November 2025 at 06:00

Picture it: I’m minding my business at a party, parked by the snack table (of course). A friend of a friend wanders up, and we strike up a conversation. It quickly turns to work, and upon learning that I’m a climate technology reporter, my new acquaintance says something like: “Should I be using AI? I’ve heard it’s awful for the environment.” 

This actually happens pretty often now. Generally, I tell people not to worry—let a chatbot plan your vacation, suggest recipe ideas, or write you a poem if you want. 

That response might surprise some people, but I promise I’m not living under a rock, and I have seen all the concerning projections about how much electricity AI is using. Data centers could consume up to 945 terawatt-hours annually by 2030. (That’s roughly as much electricity as Japan uses in a year.)

But I feel strongly about not putting the onus on individuals, partly because AI concerns remind me so much of another question: “What should I do to reduce my carbon footprint?” 

That one gets under my skin because of the context: BP helped popularize the concept of a carbon footprint in a marketing campaign in the early 2000s. That framing effectively shifts the burden of worrying about the environment from fossil-fuel companies to individuals. 

The reality is, no one person can address climate change alone: Our entire society is built around burning fossil fuels. To address climate change, we need political action and public support for researching and scaling up climate technology. We need companies to innovate and take decisive action to reduce greenhouse-gas emissions. Focusing too much on individuals is a distraction from the real solutions on the table. 

I see something similar today with AI. People are asking climate reporters at barbecues whether they should feel guilty about using chatbots too frequently when we need to focus on the bigger picture. 

Big tech companies are playing into this narrative by providing energy-use estimates for their products at the user level. A couple of recent reports put the electricity used to query a chatbot at about 0.3 watt-hours, the same as powering a microwave for about a second. That’s so small as to be virtually insignificant.

But stopping with the energy use of a single query obscures the full truth, which is that this industry is growing quickly, building energy-hungry infrastructure at a nearly incomprehensible scale to satisfy the AI appetites of society as a whole. Meta is currently building a data center in Louisiana with five gigawatts of computational power—about the same demand as the entire state of Maine at the summer peak.  (To learn more, read our Power Hungry series online.)
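The per-query figure is easy to sanity-check yourself. A quick back-of-envelope calculation, assuming a typical 1,100 W microwave (my assumption; the 0.3 Wh figure is from the reports cited above):

```python
# Energy of one chatbot query, per recent reports.
query_wh = 0.3

# Power draw of a typical countertop microwave (assumed).
microwave_watts = 1100

# Convert watt-hours to watt-seconds (joules), then divide by power
# to get how long the microwave could run on one query's energy.
seconds = query_wh * 3600 / microwave_watts
print(f"{seconds:.1f} seconds")  # about one second
```

Which is exactly the “microwave for about a second” comparison: individually tiny, and only meaningful once multiplied across billions of queries.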

Increasingly, there’s no getting away from AI, and it’s not as simple as choosing to use or not use the technology. Your favorite search engine likely gives you an AI summary at the top of your search results. Your email provider’s suggested replies? Probably AI. Same for chatting with customer service while you’re shopping online. 

Just as with climate change, we need to look at this as a system rather than a series of individual choices. 

Massive tech companies using AI in their products should be disclosing their total energy and water use and going into detail about how they complete their calculations. Estimating the burden per query is a start, but we also deserve to see how these impacts add up for billions of users, and how that’s changing over time as companies (hopefully) make their products more efficient. Lawmakers should be mandating these disclosures, and we should be asking for them, too. 

That’s not to say there’s absolutely no individual action that you can take. Just as you could meaningfully reduce your individual greenhouse-gas emissions by taking fewer flights and eating less meat, there are some reasonable things that you can do to reduce your AI footprint. Generating videos tends to be especially energy-intensive, as does using reasoning models to engage with long prompts and produce long answers. Asking a chatbot to help plan your day, suggest fun activities to do with your family, or summarize a ridiculously long email has a relatively minor impact.

Ultimately, as long as you aren’t relentlessly churning out AI slop, you shouldn’t be too worried about your individual AI footprint. But we should all be keeping our eye on what this industry will mean for our grid, our society, and our planet. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Four thoughts from Bill Gates on climate tech

30 October 2025 at 07:00

Bill Gates doesn’t shy away or pretend modesty when it comes to his stature in the climate world today. “Well, who’s the biggest funder of climate innovation companies?” he asked a handful of journalists at a media roundtable event last week. “If there’s someone else, I’ve never met them.”

The former Microsoft CEO has spent the last decade investing in climate technology through Breakthrough Energy, which he founded in 2015. Ahead of the UN climate meetings kicking off next week, Gates published a memo outlining what he thinks activists and negotiators should focus on and how he’s thinking about the state of climate tech right now. Let’s get into it. 

Are we too focused on near-term climate goals?

One of the central points Gates made in his new memo is that he thinks the world is too focused on near-term emissions goals and national emissions reporting.

So in parallel with the national accounting structure for emissions, Gates argues, we should have high-level climate discussions at events like the UN climate conference. Those discussions should take a global view on how to reduce emissions in key sectors like energy and heavy industry.

“The way everybody makes steel, it’s the same. The way everybody makes cement, it’s the same. The way we make fertilizer, it’s all the same,” he says.

As he noted in one recent essay for MIT Technology Review, he sees innovation as the key to cutting the cost of clean versions of energy, cement, vehicles, and so on. And once products get cheaper, they can see wider adoption.

What’s most likely to power our grid in the future?

“In the long run, probably either fission or fusion will be the cheapest way to make electricity,” he says. (It should be noted that, as with most climate technologies, Gates has investments in both fission and fusion companies through Breakthrough Energy Ventures, so he has a vested interest here.)

He acknowledges, though, that reactors likely won’t come online quickly enough to meet rising electricity demand in the US: “I wish I could deliver nuclear fusion, like, three years earlier than I can.”

He also spoke to China’s leadership in both nuclear fission and fusion energy. “The amount of money they’re putting [into] fusion is more than the rest of the world put together times two. I mean, it’s not guaranteed to work. But name your favorite fusion approach here in the US—there’s a Chinese project.”

Can carbon removal be part of the solution?

I had my colleague James Temple’s recent story on what’s next for carbon removal at the top of my mind, so I asked Gates if he saw carbon credits or carbon removal as part of the problematic near-term thinking he wrote about in the memo.

Gates buys offsets to cancel out his own personal emissions, to the tune of about $9 million a year, he said at the roundtable, but doesn’t expect many of those offsets to make a significant dent in climate progress on a broader scale: “That stuff, most of those technologies, are a complete dead end. They don’t get you cheap enough to be meaningful.

“Carbon sequestration at $400, $200, $100, can never be a meaningful part of this game. If you have a technology that starts at $400 and can get to $4, then hallelujah, let’s go. I haven’t seen that one. There are some now that look like they can get to $40 or $50, and that can play somewhat of a role.”

Will AI be good news for innovation?

During the discussion, I started a tally in the corner of my notebook, adding a tick every time Gates mentioned AI. Over the course of about an hour, I got to six tally marks, and I definitely missed making a few.

Gates acknowledged that AI is going to add electricity demand, a challenge for a US grid that hasn’t seen net demand go up for decades. But so too will electric cars and heat pumps. 

I was surprised at just how positively he spoke about AI’s potential, though:

“AI will accelerate every innovation pipeline you can name: cancer, Alzheimer’s, catalysts in material science, you name it. And we’re all trying to figure out what that means. That is the biggest change agent in the world today, moving at a pace that is very, very rapid … every breakthrough energy company will be able to move faster because of using those tools, some very dramatically.”

I’ll add that, as I’ve noted here before, I’m skeptical of big claims about AI’s potential to be a silver bullet across industries, including climate tech. (If you missed it, check out this story about AI and the grid from earlier this year.) 
