
Size (and Units) Really Do Matter

23 January 2026 at 10:00

We miss the slide rule. It isn’t so much that we liked getting an inexact answer using a physical moving object. But to successfully use a slide rule, you need to be able to roughly estimate the order of magnitude of your result. The slide rule’s computation of 2.2 divided by 8 is the same as it is for 22/8 or 220/0.08. You have to interpret the answer based on your sense of where the true answer lies. If you’ve ever had some kid at a fast food place enter the wrong numbers into a register and then hand you a ridiculous amount of change, you know what we mean.

Recent press reports highlighted a paper from Nvidia that claimed a data center consuming a gigawatt of power could require half a million tons of copper. If you aren’t an expert on data center power distribution and copper, you could take that number at face value. But as [Adam Button] reports, you should probably be suspicious of this number. It is almost certainly a typo. We wouldn’t be surprised if you click on the link and find it fixed, but it caused a big news splash before anyone noticed.

Thought Process

Best estimates of the total copper on the entire planet are about 6.3 billion metric tons. We’ve actually only found a fraction of that and mined even less. Of the 700 million metric tons of copper we actually have in circulation, there is a demand for about 28 million tons a year (some of which is met with recycling, so even less new copper is produced annually).

Simple math tells us that a single data center could, in a year, consume nearly 1.8% of the global copper output. While that could be true, it seems suspicious on its face.

Digging further in, you’ll find the paper mentions 200 kg per megawatt. So a gigawatt should be 200,000 kg, which is actually only 200 metric tons. That’s a far cry from 500,000 tons. We suspect they rounded the roughly 441,000 pounds in 200 metric tons up to “half a million pounds,” and then flipped pounds to tons.
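For the skeptical, here is the arithmetic as a quick Python sanity check; the figures are the ones cited above, and the pound conversion is the standard one (1 metric ton is about 2,204.6 lb).

```python
# Back-of-the-envelope check of the Nvidia copper figure,
# using the numbers cited above.

KG_PER_MW = 200                 # copper per megawatt, from the paper
MW_PER_GW = 1_000
LB_PER_TONNE = 2_204.6          # pounds per metric ton

copper_kg = KG_PER_MW * MW_PER_GW       # 200,000 kg for a 1 GW data center
copper_tonnes = copper_kg / 1_000       # 200 metric tons
copper_pounds = copper_tonnes * LB_PER_TONNE

print(f"{copper_tonnes:,.0f} metric tons is about {copper_pounds:,.0f} pounds")
# -> 200 metric tons is about 440,920 pounds: "half a million POUNDS", not tons

# The claimed figure against annual global copper demand:
print(f"{500_000 / 28_000_000:.1%} of yearly output for one data center")
# -> 1.8%, which is exactly what made the number smell funny
```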

Glass Houses

We get it. We are infamous for making typos. It is inevitable with any sort of writing at scale and on a tight schedule. After all, the Lincoln Memorial has a typo set in stone, and Webster’s dictionary misprinted an editor’s note that “D or d” could stand for density, and coined a new word: dord.

So we aren’t here to shame Nvidia. People in glass houses, and all that. But it is amazing that so much of the press took the numbers without any critical thinking about whether they made sense.

Innumeracy

We’ve noticed that many people’s eyes glaze over at numbers, and they take them at face value. The same goes for charts. We once saw a chart that was basically a straight line except for one point, which was way out of line. For a long time, no one bothered to ask about it. When someone finally did, it turned out to be a major issue; nobody had wanted to be the one to ask “the dumb question.”

You don’t have to look far to find examples of innumeracy, a term coined by [Douglas Hofstadter] and made famous by [John Allen Paulos]. One of our favorites is when a hamburger chain rolled out a “1/3 pound hamburger,” which flopped because customers thought that since three is less than four, they were getting more meat from the “1/4 pound hamburger” at the competitor’s restaurant.
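If you want to settle the burger question without arguing, Python’s fractions module does exact rational arithmetic; this minimal check (ours, purely for illustration) makes the comparison unambiguous.

```python
from fractions import Fraction

third = Fraction(1, 3)    # the "1/3 pound" burger
quarter = Fraction(1, 4)  # the competitor's "1/4 pounder"

print(third > quarter)              # True: 1/3 lb is MORE meat, not less
print(third - quarter)              # 1/12 lb difference...
print(float(third - quarter) * 16)  # ...about 1.3 ounces of beef
```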

This is all part of the same issue. If you are an electronics or computer person, you probably have a good command of math. You may just not realize how much better your math is than the average person’s.

Gimli Glider

“Air Canada 143 after landing,” image from the FAA

Even so, people who should know better still make mistakes with units and scale. NASA has had at least one famous case of a unit mix-up losing an unmanned probe. In another famous incident, an Air Canada flight ran out of fuel in 1983. Why?

The plane’s fuel sensors were inoperative, so the ground crew manually checked the fuel load with a dipstick, which gave a reading in centimeters. That reading was converted to liters, and then to mass. But the density figure used for the conversion, taken from the fuel’s datasheet, was in pounds per liter, while the navigation computer expected kilograms. The same incorrect conversion was performed twice.
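To see how the error compounds, here is a sketch of the arithmetic using the figures commonly cited in accounts of the incident; treat the specific numbers as illustrative rather than authoritative.

```python
# The Gimli Glider fuel arithmetic, using figures commonly cited in
# accounts of the incident (illustrative, not authoritative).

litres_in_tanks = 7_682    # dipstick reading, converted to litres
required_kg = 22_300       # fuel the flight plan called for

WRONG_FACTOR = 1.77        # pounds per litre -- what was used
RIGHT_FACTOR = 0.803       # kilograms per litre -- what was needed

# What the crew computed; the units silently became pounds:
believed_kg = litres_in_tanks * WRONG_FACTOR               # ~13,597 "kg"
litres_added = (required_kg - believed_kg) / WRONG_FACTOR  # ~4,917 L

# What was actually on board after fueling:
actual_kg = (litres_in_tanks + litres_added) * RIGHT_FACTOR

print(f"Believed on board: {required_kg:,} kg")
print(f"Actually on board: {actual_kg:,.0f} kg")  # ~10,100 kg, under half
```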

Unsurprisingly, the plane ran out of fuel and had to glide to an emergency landing on a racetrack that had once been a Royal Canadian Air Force training base. Luckily, Captain Pearson was an experienced glider pilot. With reduced control and few instruments, he brought the 767 down as if it were a huge glider with 61 people onboard. Although the landing gear collapsed and caused some damage, no one on the plane or the ground was seriously hurt.

What’s the Answer?

Sadly, math answers are much easier to get than social answers. Kids routinely complain that they’ll never need math once they leave school. (OK, not kids like we were, but normal kids.) But we all know that is simply not true. Even if your job doesn’t directly involve math, understanding your own finances, making decisions about purchases, or even evaluating political positions often requires that you can see through math nonsense, both intentional and unintentional.

[Antoine de Saint-Exupéry] was a French author, and his 1948 book Citadelle has an interesting passage that may hold part of the answer. If you translate the French directly, it is a bit wordy, but the quote is commonly paraphrased: “If you want to build a ship, don’t herd people together to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.”

We learned math because we understood it was the key to building radios, or rockets, or computer games, or whatever it was we longed to build. We need to teach kids math in a way that makes them eager to learn whatever math will enable their dreams.

How do we do that? We don’t know. Great teachers help. Inspiring technology like moon landings helps. What do you think? Tell us in the comments. Now with 285% more comment goodness. Honest.

We still think slide rules made you better at math. Just like not having GPS made you better at navigation.

Epoch Ventures Predicts Bitcoin Hits $150K in 2026, Declares End of 4-Year Halving Cycle

By: Juan Galt
23 January 2026 at 08:13


Epoch, a venture firm specializing in Bitcoin infrastructure, issued its second annual ecosystem report on January 21, 2026, forecasting robust growth for the asset despite a subdued 2025 performance.

The 186-page document analyzes Bitcoin’s price dynamics, adoption trends, regulatory outlook, and technological risks, positioning the cryptocurrency as a maturing monetary system. Key highlights include a prediction that Bitcoin will reach at least $150,000 USD by year-end, driven by institutional inflows and decoupling from equities. The report also anticipates the Clarity Act failing to pass, though its substance on asset taxonomy and regulatory authority may advance through SEC guidance. Additional forecasts cover gold rotations boosting Bitcoin by 50 percent, major asset managers allocating 2 percent to model portfolios, and Bitcoin Core maintaining implementation dominance.

Eric Yakes, CFA charterholder and managing partner at Epoch Ventures, brings over a decade of finance expertise to the Bitcoin space, having started his career in corporate finance and restructuring at FTI Consulting before advancing to private equity at Lion Equity Partners, where he focused on buyouts. He left traditional finance in recent years to immerse himself in Bitcoin, authoring the influential book “The 7th Property: Bitcoin and the Monetary Revolution,” which explores Bitcoin’s role as a transformative monetary asset, and has since written extensively on its technologies and ecosystem. Yakes holds a double major in finance and economics from Creighton University, positioning him as a key voice in Bitcoin venture capital through Epoch, a firm dedicated to funding Bitcoin infrastructure.

The Death of the Four-Year Cycle

Bitcoin closed 2025 at $87,500, marking a 6 percent annual decline but an 84 percent four-year gain that ranks in the bottom 3 percent historically. The report states the death of the 4-year cycle in no uncertain terms: “We believe cycle theory is a relic of the past, and the cycles themselves probably never existed. The fact is that Bitcoin is boring and growing gradually now. We make the case for why gradual growth is precisely what will drive a ‘gradually, then suddenly’ moment.” 

The report goes on to discuss cycle theory in depth, presenting a view of the future that’s becoming the new market expectation: less volatility to the downside, slow and steady growth to the upside. 

Price action suggests a new bull market commenced in 2026. The 2025 drop from $126,000 to $81,000 may have been a self-fulfilling prophecy driven by cycle expectations; RSI has remained below overbought territory since late 2024, suggesting bitcoin already went through a bear market and is commencing a new kind of cycle.

Versus gold, Bitcoin is down 49 percent from its highs and has been in a bear market since December 2024. Gold’s meteoric rise presents a potential price catalyst for bitcoin: a reallocation of just 0.5% of gold holdings would induce greater inflows than the U.S. ETFs have, and at 5.5% it would equal bitcoin’s entire market capitalization. Gold’s rise makes bitcoin more attractive on a relative basis, and the higher gold goes, the more likely a rotation into bitcoin becomes. Timing analysis, which counts days from the local top, suggests Bitcoin might be nearing a bottom versus gold.
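For a sense of the scale involved, here is a rough sketch of that rotation math. The bitcoin market cap is our assumption (the $87,500 close times a circulating supply of roughly 19.9 million coins), and gold’s market cap is simply backed out of the report’s own 5.5% claim.

```python
# Rough sketch of the gold-rotation math. The bitcoin market cap is an
# assumption (2025 close times ~19.9M circulating coins); gold's cap is
# backed out of the report's claim that a 5.5% rotation equals it.

btc_price = 87_500
btc_supply = 19_900_000                  # assumed circulating supply
btc_cap = btc_price * btc_supply         # ~$1.74 trillion

gold_cap = btc_cap / 0.055               # implied: ~$31.7 trillion

for rotation in (0.005, 0.055):
    inflow = gold_cap * rotation
    print(f"{rotation:.1%} of gold -> ${inflow / 1e9:,.0f}B of inflows")

# 0.5% -> ~$158B, which the report says exceeds U.S. ETF inflows to date
# 5.5% -> ~$1,741B, bitcoin's entire market cap (true by construction)
```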

In terms of volatility, bitcoin has aligned with mega-caps like Tesla, with 2025 averages for Nasdaq 100 leaders exceeding Bitcoin’s, suggesting a risk-asset decoupling and limiting drawdowns. Long-term stock correlations persist, but maturing credit markets and safe-haven narratives may pivot Bitcoin toward gold-like behavior.

The report goes in-depth into other potential catalysts for 2026, defending its bullish thesis, such as:

  • Consistent ETF Inflows
  • Nation State Adoption
  • Mega-cap Companies Allocating to Bitcoin
  • Wealth Managers Allocating Clients
  • Inheritance Allocation

FUD, Sentiment and Media Analysis

Analysis of 356,423 datapoints from 653 sources reveals a fractured sentiment landscape, with “Bitcoin is dead” narratives having largely run their course. FUD is stable at 12-18 percent, but the topics rotate: crime and legal themes are up 277 percent, while environmental FUD is down 41 percent.

A 125-point perception gap exists between conference attendees (+90) and tech media (-35). UK outlets show 56-64 percent negativity, 2-3 times international averages.

Lightning Network coverage dominates podcasts at 33 percent but garners only 0.28 percent of mainstream coverage, a 119x disparity. Layer 2 solutions are not zero-sum, with Lightning at 58 percent of mentions and Ark up 154 percent.

Media framing has caused mining sentiment to swing 67 points: mainstream outlets cover the sector at 75.6 percent positive, while Bitcoin communities view it at only 8.4 percent positive, underscoring the importance of narrative and audience credibility for mining companies.

Bitcoin Treasury Companies

More companies added Bitcoin to their balance sheets in 2025 than in any previous year, marking a major step in corporate adoption. Established firms that already held Bitcoin—known as Bitcoin treasury companies, or BtcTCs—bought even larger amounts, while new entrants went public specifically to raise money and purchase Bitcoin. According to the report, public company bitcoin holdings increased 82% y/y to ₿1.08 million, and the number of public companies holding bitcoin grew from 69 to over 191 throughout 2025. Corporations own at least 6.4% of total Bitcoin supply – public companies 5.1% and private companies 1.3%. The sector followed a clear boom-and-bust pattern throughout the year.

Company valuations rose sharply through mid-2025 before pulling back when the broader Bitcoin price corrected. The report explains that these public treasury companies offer investors easier access through traditional brokers, the ability to borrow against holdings, and even dividend payments, though with dilution risks. In contrast, buying and holding Bitcoin directly remains simpler and preserves the asset’s full scarcity.

Looking ahead, Epoch expects Japan’s Metaplanet to post the highest multiple on net asset value (mNAV)—a key valuation metric—among all treasury companies with a market cap above $1 billion. The firm also predicts that an activist investor or rival company will force the liquidation of one underperforming treasury firm to capture the discount between its share price and the actual value of its Bitcoin holdings. 

Over time, these companies will stand out by offering competitive yields on their Bitcoin. In total, treasury companies acquired roughly 486,000 BTC during 2025, equal to 2.3 percent of the entire Bitcoin supply, drawing further corporate interest in Bitcoin. For business owners considering a Bitcoin treasury, the report highlights both the growth potential and the risks of public-market volatility.

The Bitcoin Treasury Companies section of the report explores: 

  • The fundamentals of a Bitcoin treasury allocation including the potential benefits and risks of Bitcoin treasury company investing. 
  • The 2025 timeline of Bitcoin Treasury companies. 
  • Current valuations of BtcTCs. 
  • Our opinion on BtcTCs broadly, and how we view them compared to owning Bitcoin directly. 
  • Commentary on specific BtcTCs. 
  • Predictions on Bitcoin treasury companies in the coming years. 

Regulation Expectations for 2026

Epoch predicts the Clarity Act—a proposed bill to clarify digital asset oversight by dividing authority between the SEC and CFTC—will not pass Congress in 2026. However, the report expects the bill’s main ideas, including clear definitions for asset categories and regulatory jurisdiction, to advance through SEC rulemaking or guidance instead. The firm also forecasts Republican losses in the midterm elections, which could trigger new regulatory pressure on crypto, most likely in the form of consumer protection measures aimed at perceived industry risks. On high-profile legal cases, Epoch does not expect pardons for the founders of Samurai Wallet or Tornado Cash this year, though future legal appeals or related proceedings may ultimately support their defenses. 

The report takes a critical view of recent legislative efforts, arguing that bills like the GENIUS Act (focused on stablecoins) and the Clarity Act prioritize industry lobbying over the concerns of everyday Bitcoin users, especially the ability to hold and control assets directly without third-party interference (self-custody). 

The report points out a discrepancy between what crypto-owning voters want and what the bills deliver: a majority of voters prefer, above all, the right to transact, while the Clarity and GENIUS Acts focus on less popular special interests that barely fall within the 50% support range. Epoch warns that “This deviation between the will of the voters and the will of the largest industry players is an early warning sign of the potential harm from regulatory capture (intentional or otherwise)”.

The report is particularly critical of the way the GENIUS Act set up the regulatory structure for stablecoins. The paragraph on the topic is pointed enough that it merits being printed in its entirety:

“Meet the new boss, same as the old boss:

Last year, in our Bitcoin Banking Report, we discussed the structure of the 2-tier banking system in the US (see figure below). In this system, the Central Bank pays a yield on the deposits it receives from the Tier II Commercial banks, who then go on to share a portion of that yield with their depositors. Sound familiar?

The compromise structure in the GENIUS Act essentially creates a parallel banking system where stablecoin issuers play the role of Tier I Central Banks and the crypto exchanges play the role of Tier II Commercial Banks. 

To make matters worse, stablecoin issuers are required to keep their reserves with regulated Tier II banks and are unlikely to have access to Fed Master accounts. The upshot of all this is that the GENIUS Act converts a peer-to-peer payment mechanism into a heavily intermediated payment network that sits on top of another heavily intermediated payment network.”

The report goes into further depth on topics of regulation and regulatory capture risk, closing the topic with an analysis of how the CLARITY Act might and, in their opinion, should take shape. 

Quantum Computing Risk

Concerns about quantum computing potentially breaking Bitcoin’s cryptography surfaced prominently in late 2025, in part contributing to institutional sell-offs as investors reacted to headlines about rapid advances in the field. The Epoch report attributes much of this reaction to behavioral biases, including loss aversion—where people fear losses more than they value equivalent gains—and herd mentality, in which market participants follow the crowd without independent assessment. The authors describe the perceived threat as significantly overhyped, noting that claims of exponential progress in quantum capabilities, often tied to “Neven’s Law,” lack solid observational evidence to date.

“Neven’s law states that the computational power of quantum computers increases at a double exponential rate relative to classical computers. If true, the timeline to break Bitcoin’s cryptography could be as short as 5 years.

However, Moore’s law was an observation. Neven’s law is not an observation because logical qubits are not increasing at such a rate. 

Neven’s law is an expectation of experts. Based on our understanding of expert opinion in the fields we are knowledgeable about, we are highly skeptical of expert projections,” the Epoch report explained.
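To make “double exponential” concrete, here is a toy comparison; it is purely illustrative and models no real roadmap, but it shows why such growth would be impossible to miss in the data.

```python
# Toy illustration of how strong a "double exponential" claim is.
# Purely illustrative -- this models no real quantum roadmap.

for n in range(1, 7):
    moore = 2 ** n            # ordinary exponential (Moore's-law style)
    neven = 2 ** (2 ** n)     # double exponential (Neven's-law style)
    print(f"n={n}:  2^n = {moore:>3}   2^(2^n) = {neven:,}")

# By n=6, 2^n is 64 while 2^(2^n) is ~1.8e19. Growth like that would
# leap out of any plot of logical-qubit counts, and it has not.
```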

They add that current quantum computers have not succeeded in factoring numbers larger than 15, and error rates increase exponentially with scale, making reliable large-scale computation far from practical. The report argues that progress in physical qubits has not yet translated into the logical qubits or error-corrected systems needed for factorization of the large numbers underpinning Bitcoin’s security.

Quantum-resistant signature schemes do exist, but implementing them prematurely would introduce inefficiencies, consuming more block space on the network, while the emerging schemes remain untested in real-world conditions. Until meaningful advances in factorization occur, Epoch concludes the quantum threat does not warrant immediate priority or network changes.

Mining Expectations

The report forecasts that no company among the top ten public Bitcoin miners will generate more than 30 percent of its revenue from AI computing services during the 2026 fiscal year. This outcome stems from significant delays in the development and deployment of the necessary infrastructure for large-scale AI workloads, preventing miners from pivoting as quickly as some market narratives suggested.

Media coverage of Bitcoin mining shows a stark divide depending on who is framing the discussion. Mainstream outlets tend to portray the industry positively—75.6 percent of coverage is favorable, often emphasizing energy innovation, job creation, or economic benefits—while conversations within Bitcoin communities remain far more skeptical, with only 8.4 percent positive sentiment. This 67-point swing in net positivity highlights how framing and audience shape perceptions of the same sector, with community credibility remaining a critical factor for mining companies seeking to maintain support among Bitcoin holders.

The report has a lot more to offer, including analysis of layer two systems and Bitcoin adoption data on multiple fronts; it can be read on Epoch’s website for free.


More than half of former UK employees still have access to company spreadsheets, study finds

23 January 2026 at 07:27

More than half of UK employees retain access to company spreadsheets they no longer need, leaving sensitive business data exposed long after people change roles or leave organisations, according to new research from privacy technology company Proton.

The study, based on a survey of 250 small and medium-sized businesses (SMBs) in the UK, found that 64% of respondents still had access to files that should no longer be available to them. In some cases, this includes documents containing financial information, client data, salary details, or internal planning material.

With around 16.9 million people working for SMBs across the UK, the findings suggest that millions of current and former employees could still have access to sensitive company data without their employers’ knowledge.

The research highlights a growing gap between the critical role spreadsheets play in daily business operations and the poor governance of their access. Spreadsheets are now widely used as informal systems of record, with 64% of respondents using them for project management, 47% for financial reporting, and 45% for managing client or customer data.

Despite this reliance, access controls remain weak. Nearly four in ten respondents (39%) said they had shared spreadsheets using “anyone with the link” permissions, while 20% said they only review who has access to their spreadsheets once a year. Manual offboarding processes remain common: 44% of access removals are handled manually, while just 36% are automated.

Proton says this combination of link-based sharing and manual offboarding helps explain why access often persists long after an employee leaves.

“Spreadsheets are often treasure troves of sensitive data, from financial and strategic planning information to HR and client data,” said Patricia Egger, head of security at Proton. “Yet they’re not handled like other high-risk data. When someone leaves a company, access to shared spreadsheets is often nobody’s problem. Links stay active, permissions aren’t reviewed, and data remains accessible without anyone noticing.”

Confusion over cloud security and data use

The study also found widespread misunderstanding about how secure cloud-based spreadsheets really are. Two-thirds of respondents (67%) believe their Google Sheets files are private and accessible only to intended viewers, while almost a quarter said they were unsure what information Google can or cannot access.

There is similar uncertainty around encryption and provider access, particularly with Microsoft. Almost a quarter of UK respondents said they were unsure whether Microsoft could view spreadsheet content.

Uncertainty also extends to data use. More than a third (34%) of respondents believe spreadsheet data could be used to train AI models, and 84% said they would find that concerning.

Personal and work accounts are being mixed

Nearly half of respondents (45%) admitted to opening work spreadsheets using personal cloud accounts, while 46% said they had accessed personal spreadsheets using work accounts. Security researchers warn that this blurring of personal and professional data increases the risk of accidental data leakage, unauthorised access, and compliance failures, particularly where sensitive financial or customer data is involved.

The UK is among the most spreadsheet-reliant countries

Proton compared its UK findings with results from other countries, including the US and France. While lingering access in the UK (64%) was slightly lower than in the US (67%), it was significantly higher than in France (40%).

The UK also showed the highest levels of uncertainty about provider access and encryption, particularly for Microsoft-hosted spreadsheets. Proton noted that these risks are amplified by European data sovereignty concerns, as data hosted by foreign cloud providers may fall under legal regimes outside a company’s control.

Everyday tools, enterprise-level risk

The findings point to a broader problem: spreadsheets are increasingly used to run core business processes, but without the governance, visibility, or controls normally applied to more formal business systems. Researchers say this creates a growing blind spot for SMBs, particularly as collaboration tools, consumer cloud accounts, and AI services become more deeply embedded in everyday work.

“Most of these risks don’t come from malicious behaviour,” Egger added. “They come from everyday process gaps: manual offboarding, weak defaults, and a lack of visibility into who can still access what.”


Obsidian Security Extends Reach to SaaS Application Integrations

22 January 2026 at 11:39

Obsidian Security today announced that it has extended the reach of its platform for protecting software-as-a-service (SaaS) applications to include any integrations. Additionally, the company is now making it possible to limit which specific end users of a SaaS application are allowed to grant and authorize new SaaS integrations by enforcing least privilege policies. Finally,..


We’ve Reached the “Customers Want Security” Stage, and AI Is Listening

22 January 2026 at 11:29

I’ve seen this movie before. That’s why a recent LinkedIn post by Ilya Kabanov stopped me mid-doomscroll. Kabanov described how frontier AI companies are quietly but decisively shifting into cybersecurity. They are not joining as partners or tacking on features. They are stepping up as product makers, targeting the core of the enterprise security budget...


Skimming Satellites: On the Edge of the Atmosphere

By: Tom Nardi
22 January 2026 at 10:00

There’s little about building spacecraft that anyone would call simple. But there’s at least one element of designing a vehicle that will operate outside the Earth’s atmosphere that’s comparatively easy to handle: aerodynamics. That’s because, at the altitudes where most satellites operate, drag can essentially be ignored. Which is why most satellites look like refrigerators with solar panels and high-gain antennas jutting out at odd angles.

But for all the advantages that the lack of meaningful drag on a vehicle has, there’s at least one big potential downside. If a spacecraft is orbiting high enough over the Earth that the impact of atmospheric drag is negligible, then the only way that vehicle is coming back down in a reasonable amount of time is if it has the means to reduce its own velocity. Otherwise, it could be stuck in orbit for decades. At a high enough orbit, it could essentially stay up forever.

Launched in 1958, Vanguard 1 is expected to remain in orbit until at least 2198

There was a time when that kind of thing wasn’t a problem. Getting into space at all was challenge enough, and little thought was given to what would happen five or ten years down the road. But today, low Earth orbit is getting crowded. As the cost of launching something into space continues to drop, multiple companies are either planning or actively building their own satellite constellations comprising thousands of individual spacecraft.

Fortunately, there may be a simple solution to this problem. By putting a satellite into what’s known as a very low Earth orbit (VLEO), a spacecraft will experience enough drag that maintaining its velocity requires constantly firing its thrusters. Naturally this presents its own technical challenges, but the upside is that such an orbit is essentially self-cleaning — should the craft’s propulsion fail, it would fall out of orbit and burn up in months or even weeks. As an added bonus, operating at a lower altitude has other practical advantages, such as allowing for lower-latency communication.

VLEO satellites hold considerable promise, but successfully operating in this unique environment requires certain design considerations. The result is a vehicle that looks less like the flying refrigerator we’re used to, with a hybrid design that features the sort of aerodynamic considerations more commonly found on aircraft.

ESA’s Pioneering Work

This might sound like science fiction, but such craft have already been developed and successfully operated in VLEO. The best example so far is the Gravity Field and Steady-State Ocean Circulation Explorer (GOCE), launched by the European Space Agency (ESA) back in 2009.

To make its observations, GOCE operated at an altitude of 255 kilometers (158 miles), and dropped as low as just 229 km (142 mi) in the final phases of the mission. For reference, the International Space Station flies at around 400 km (250 mi), and the innermost “shell” of SpaceX’s Starlink satellites is currently being moved to 480 km (298 mi).

Given the considerable drag experienced by GOCE at these altitudes, the spacecraft bore little resemblance to a traditional satellite. Rather than riding on outstretched “wings”, the solar panels were mounted to the surface of the dart-like vehicle. To keep its orientation relative to the Earth’s surface stable, the craft featured stubby tail fins that made it look like a futuristic torpedo.

Even with its streamlined design, maintaining such a low orbit required GOCE to continually fire its high-efficiency ion engine for the duration of its mission, which ended up being four and a half years.

In the case of GOCE, the end of the mission was dictated by how much propellant it carried. Once it had burned through the 40 kg (88 lb) of xenon onboard, the vehicle would begin to rapidly decelerate, and ground controllers estimated it would re-enter the atmosphere in a matter of weeks. Ultimately the engine officially shut down on October 21st, and by November 9th, its orbit had already decayed to 155 km (96 mi). Two days later, the craft burned up in the atmosphere.

JAXA Lowers the Bar

While GOCE may be the most significant VLEO mission so far from a scientific and engineering standpoint, the current record for the spacecraft with the lowest operational orbit is actually held by the Japan Aerospace Exploration Agency (JAXA).

In December 2017 JAXA launched the Super Low Altitude Test Satellite (SLATS) into an initial orbit of 630 km (390 mi), which was steadily lowered in phases over the next several weeks until it reached 167.4 km (104 mi). Like GOCE, SLATS used a continuously operating ion engine to maintain velocity, although at the lowest altitudes, it also used chemical reaction control system (RCS) thrusters to counteract the higher drag.

SLATS was a much smaller vehicle than GOCE, coming in at roughly half the mass, and it carried just 12 kg (26 lb) of xenon propellant, which limited its operational life. It also utilized a far more conventional design than GOCE, although its rectangular shape was somewhat streamlined compared to a traditional satellite. Its solar arrays were mounted parallel to the main body of the craft, giving it an airplane-like appearance.

The combination of lower altitude and higher frontal drag meant that SLATS had an even harder time maintaining velocity than GOCE. Once its propulsion system was finally switched off in October 2019, the craft re-entered the atmosphere and burned up within 24 hours. The mission has since been recognized by Guinness World Records for the lowest altitude maintained by an Earth observation satellite.

A New Breed of Satellite

As impressive as GOCE and SLATS were, their success was based more on careful planning than any particular technological breakthrough. After all, ion propulsion for satellites is not new, nor is the field of aerodynamics. The concepts were simply applied in a novel way.

But there exists the potential for a totally new type of vehicle that operates exclusively in VLEO. Such a craft would be a true hybrid, in the sense that it’s primarily a spacecraft, but uses an air-breathing electric propulsion (ABEP) system akin to an aircraft’s jet engine. Such a vehicle could, at least in theory, maintain an altitude as low as 90 km (56 mi) indefinitely — so long as its solar panels can produce enough power.

Both the Defense Advanced Research Projects Agency (DARPA) in the United States and the ESA are currently funding several studies of ABEP vehicles, such as Redwire’s SabreSat, which have numerous military and civilian applications. Test flights are still years away, but should VLEO satellites powered by ABEP become common platforms for constellation applications, they may help alleviate orbital congestion before it becomes a serious enough problem to impact our utilization of space.

Tech in Plain Sight: Finding a Flat Tire

21 January 2026 at 10:00

There was a time when wise older people warned you to check your tire pressure regularly. We never did, and would eventually wind up with a flat or, worse, a blowout. These days, your car will probably warn you when your tires are low. That’s because of a class of devices known as tire pressure monitoring systems (TPMS).

If you are like us, you see some piece of tech like this, and you immediately guess how it probably works. In this case, the obvious guess is sometimes, but not always, correct. There are two different styles that are common, and only one works in the most obvious way.

Obvious Guess

We’d guess that the tire would have a little pressure sensor attached to it that would then wirelessly transmit data. In fact, some do work this way, and that’s known as dTPMS, where the “d” stands for direct.

Of course, such a system needs power, and that’s usually in the form of batteries, although there are some that get power wirelessly using an RFID-like system. Anything wireless has to be able to penetrate the steel and rubber in the tire, of course.

But this isn’t how dTPMS systems always worked. In days of old, they used a finicky system involving a coil and a pressure-sensitive diaphragm — more on that later.

TPMS sensor (by [Lumu], CC BY-SA 3.0)
Many modern systems use iTPMS (indirect). These systems typically work on the idea that a properly inflated tire will have a characteristic rolling radius. Fusing data from the wheel speed sensor, the electronic steering control, and some fancy signal processing, they can deduce if a tire’s radius is off-nominal. Not all systems work exactly the same, but the key idea is that they use non-pressure data to infer the tire’s pressure.

This is cheap and requires no batteries in the tire. However, it isn’t without its problems. It is purely a relative measurement. In practice, you have to inflate your tires, tell the system to calibrate, and then drive around for half an hour or more to let it learn how your tires react to different roads, speeds, and driving styles.

Changes in temperature, like the first cold snap of winter, are notorious for causing these sensors to read flat. If the weather changes and you suddenly have four flat tires, that’s probably what happened. The tires really do lose some pressure as temperatures drop, but because all four change together, the indirect system can’t tell which one is at fault, if any.
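Here is a minimal sketch of the relative check an iTPMS might run, assuming we already have averaged per-wheel speed readings; real systems fuse steering angle, yaw rate, and far fancier filtering, but both the core idea and the all-four-change blind spot show up even in this toy version.

```python
# Toy version of an iTPMS relative check. Assumes averaged per-wheel
# speed readings at the same road speed; real systems fuse steering
# angle, yaw rate, and heavy filtering on top of this basic idea.

def suspect_wheels(wheel_speeds, threshold=0.02):
    """A soft tire has a smaller rolling radius, so it spins faster.
    Flag wheels more than `threshold` above the median-ish baseline."""
    baseline = sorted(wheel_speeds)[len(wheel_speeds) // 2]
    return [i for i, w in enumerate(wheel_speeds)
            if (w - baseline) / baseline > threshold]

# Left-front tire is soft and spins ~3% fast: flagged.
print(suspect_wheels([103.0, 100.1, 99.9, 100.0]))   # -> [0]

# The winter cold snap: all four tires lose pressure together, so all
# four speeds stay in ratio and the relative check can't localize it.
print(suspect_wheels([102.0, 102.1, 101.9, 102.0]))  # -> []
```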

History

When the diaphragm senses correct pressure, the sensor forms an LC circuit. Low air pressure causes the diaphragm to open the switch, breaking the circuit.

The first passenger vehicle to offer TPMS was the 1986 Porsche 959. Two sensors, each made from a diaphragm and a coil, were mounted between the wheel and the wheel’s hub, on opposite sides of the tire. With sufficient pressure on the diaphragm, an electrical contact was made, changing the coil value, and a stationary coil would detect the sensor as it passed. If the pressure dropped, the electrical contact opened, and the coil no longer saw the normal two pulses per rotation. The technique was similar to a grid dip meter measuring an LC resonant circuit: the diaphragm switch would change the LC circuit’s frequency, and the sensing coil could detect that.

If one or two pulses were absent despite the ABS system noting wheel rotation, the car would report low tire pressure. There were some cases of centrifugal force opening the diaphragms at high speed, causing false positives, but for the most part, the system worked. This isn’t exactly iTPMS, but it isn’t quite dTPMS either. The diaphragm does measure pressure in a binary way, but it doesn’t send pressure data in the way a normal dTPMS system does.
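The grid-dip comparison suggests a quick calculation. This sketch just evaluates the LC resonance formula f = 1/(2π√(LC)); the component values are invented for illustration, since the 959 sensor’s actual values aren’t given.

```python
import math

# Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C)).
# Component values below are invented for illustration; the 959
# sensor's actual values aren't given in the article.

def resonant_freq(L, C):
    """Resonant frequency in Hz for inductance L (H) and capacitance C (F)."""
    return 1 / (2 * math.pi * math.sqrt(L * C))

L = 100e-6       # 100 uH coil (hypothetical)
C_ok = 1.0e-9    # diaphragm switch closed: pressure OK
C_low = 0.5e-9   # switch open: tank capacitance changes

print(f"Pressure OK:  {resonant_freq(L, C_ok) / 1e3:.0f} kHz")   # ~503 kHz
print(f"Pressure low: {resonant_freq(L, C_low) / 1e3:.0f} kHz")  # ~712 kHz

# The stationary pickup coil listens for two pulses per revolution at
# the "pressure OK" frequency; a sensor that shifts off-frequency (or
# stops resonating) simply goes missing, flagging a low tire.
```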

Of course, as you can see in the video, the 959 was decidedly a luxury car. It would be 1991 before the US-made Corvette acquired TPMS. The Renault Laguna II in 2000 was the first high-volume car to have similar sensors.

Now They’re Everywhere

In many places, laws were put in place to require TPMS in vehicles. It was also critical for cars that use “run flat” tires. The theory is that you might not notice a run-flat tire has actually gone flat, and while such tires are, as their name implies, made to run flat, you still have to limit speed and distance while they are.

Old cars or other vehicles that don’t have TPMS can still add it. There are systems that can measure tire pressure and report to a smartphone app. These are, of course, a type of dTPMS.

Problems

Of course, there are always problems. An iTPMS system isn’t really reading the tire pressure, so it can easily get out of calibration. Direct systems need battery changing, which usually means removing the tire, and a good bit of work — watch the video below. That means there is a big tradeoff between sending data with enough power to go through the tire and burning through batteries too fast.

Another issue with dTPMS is that you are broadcasting. That means you have to reject interference from other cars that may also transmit. Because of this, most sensors have a unique ID. This raises privacy concerns, too, since you are sending a uniquely identifiable code.

Of course, your car is probably also beaming Bluetooth signals and who knows what else. Not to even mention what the phone in your car is screaming to the ether. So, in practice, TPMS attacks are probably not a big problem for anyone with normal levels of paranoia.

An iTPMS sensor won’t work on a tire that isn’t moving, so monitoring your spare tire is out. Even dTPMS sensors often stop transmitting when they are not moving to save battery, and that also makes it difficult to monitor the spare tire.

The (Half Right) Obvious Answer

Sometimes, when you think of the “obvious” way something works, you are wrong. In this case, you are half right. TPMS reduces tire wear, prevents accidents that might happen during tire failure, and even saves fuel.

Thanks to this technology, you don’t have to remember to check your tire pressure before a trip. You should, however, probably check the tread.

You can roll your own TPMS. Or just listen in with an SDR. If biking is more your style, no problem.

Marion Stokes Fought Disinformation with VCRs

20 January 2026 at 10:00

You’ve likely at least heard of Marion Stokes, the woman who constantly recorded television for over 30 years. She comes up on Reddit and other places every so often as a hero archivist who fought against disinformation and disappearing history. But who was Marion Stokes, and why did she undertake this project? And more importantly, what happened to all of those tapes? Let’s take a look.

Marion the Librarian

Marion was born November 25, 1929 in Germantown, Philadelphia, Pennsylvania. Noted for her left-wing beliefs as a young woman, she became quite politically active, and was even courted by the Communist Party USA to potentially become a leader. Marion was also involved in the civil rights movement.

Marion Stokes on the set of her public-access program Input. Image via DC Video

For nearly 20 years, Marion worked as a librarian at the Free Library of Philadelphia until she was fired in the 1960s, which was likely a direct result of her political life. She married Melvin Metelits, a teacher and member of the Communist Party, and had a son named Michael with him.

Throughout this time, Marion was spied on by the FBI, to the point that she and her husband attempted to defect to Cuba. They were unsuccessful in securing Cuban visas, and separated in the mid-1960s when Michael was four.

Marion began co-producing a Sunday morning public-access talk show in Philadelphia called Input with her future husband John Stokes, Jr. The show focused on social justice, and its point was to get different types of people together to discuss things peaceably.

Outings Under Six Hours

Marion’s taping began in 1979 with the Iranian Hostage Crisis, which coincided with the dawn of the twenty-four-hour news cycle. Her final tape is from December 14, 2012 — she recorded coverage of the Sandy Hook massacre as she passed away.

In 33 years of taping, Marion amassed 70,000 VHS and Betamax tapes. She mostly taped various news outlets, fearing that the information would disappear forever. Her time in the television industry taught her that networks typically considered preservation too expensive, and therefore often reused tapes.

But Marion didn’t just tape the news. She also taped various programs such as The Cosby Show, Divorce Court, Nightline, Star Trek, The Oprah Winfrey Show, and The Today Show. Some of her collection includes 24/7 coverage of news networks, all of which was recorded on up to eight VCRs: three to five were going all day every day, and up to eight would be taping if something special was happening. All family outings were planned around the six-hour VHS tape, and Marion would sometimes cut dinner short to go home and change the tapes.

People can’t take knowledge from you.  — Marion Stokes

You might be wondering where she kept all the tapes, or how she could afford to do this, both financially and time-wise. For one thing, her second husband John Stokes, Jr. was already well off. For another, she was an early investor in Apple stock, using capital from her in-laws. To say she bought a lot of Macs is an understatement. According to the excellent documentary Recorder, Marion owned multiples of every Apple product ever produced. Marion was a huge fan of technology and viewed it as a way of unlocking people’s potential. By the end of her life, she had nine apartments filled with books, newspapers, furniture, and multiples of any item she ever became obsessed with.

In addition to creating this vast video archive, Marion took half a dozen daily newspapers and over 100 monthly periodicals, which she collected for 50 years. This is not to mention the 40-50,000 books in her possession. In one interview, Marion’s first husband Melvin Metelits said that in the mid-1970s, the family would go to a bookstore and drop $800 on new books. That’s nearly $5,000 in today’s money.

Why Tapes? Why Anything?

It’s easy to understand why she started with VHS tapes — it was the late 1970s, and they were still the best option. When TiVo came along, Marion was not impressed, preferring not to expose her recording habits to any government that might be watching. And given her past, she had every right to be wary.

Those in power are able to write their own history.  — Marion Stokes

As for the why, there were several reasons. It was a form of activism, which partially defined Marion’s life. The rest, I would argue, was defined by this archive she amassed.

Marion started taping when the Iranian Hostage Crisis began. Shortly thereafter, the 24/7 news cycle was born, and networks reached into small towns in order to fill space. And that’s what she was concerned with — the effect that filling space would have on the average viewer.

Marion was obsessed with the way that media reflects society back upon itself. With regard to the hostage crisis, her goal was to reveal a set of agendas on the part of governments. Her first husband Melvin Metelits said that Marion was extremely fearful that America would replicate Nazi Germany.

The show Nightline was born from nightly coverage of the crisis. It aired at 11:30PM, which meant it had to compete with the late-night talk show hosts. And it did just fine, rising on the wings of the evening soap opera it was creating.

To the Internet Archive

When Marion passed on December 14, 2012, news of the Sandy Hook massacre began to unfold. It was only after she took her last breath that her VCRs were switched off. Marion bequeathed the archive to her son Michael, who spent a year and a half dealing with her things. He gave her books to a charity that teaches at-risk youth using secondhand materials, and he says he got rid of all the remaining Apples.

The Marion Stokes video collection on the Internet Archive. Image via the Internet Archive

But no one would take the tapes. That is, until the Internet Archive heard about them. The tapes were hauled from Philadelphia to San Francisco, packed in banker’s boxes and stacked in four shipping containers.

So that’s 70,000 tapes at, let’s assume, six hours per tape, which totals 420,000 hours. No wonder the Internet Archive wasn’t finished digitizing the footage as of October 2025. That, and a lack of funding for the massive amount of manpower this must require.

If you want to see what they’ve uploaded so far, it’s definitely worth a look. And as long as you’re taking my advice, go watch the excellent documentary Recorder on YouTube. Check out the trailer embedded below.

Main and thumbnail images via All That’s Interesting
