Today — 25 January 2026

Commodore 64 Helps Revive the BBS Days

25 January 2026 at 04:00

Before the modern Internet existed, there were still plenty of ways of connecting with other computer users “online”, although many of them might seem completely foreign to those of us in the modern era. One of those systems was the Bulletin Board System, or BBS, which would have been a single computer, often in someone’s home, connected to a single phone line. People accessing the BBS would log in if the line wasn’t busy, leave messages, and quickly log out since the system could only support one user at a time. While perhaps a rose-tinted view, this was a more wholesome and less angsty time than the modern algorithm-driven Internet, and it turns out these systems are making a bit of a comeback as a result.

The video by [The Retro Shack] sets up a lot of this history for context, then, towards the end, uses a modern FPGA-based recreation called the Commodore 64 Ultimate to access a BBS called The Old Net, a modern recreation of what these 80s-era BBS systems were like. This involves using a modern networking card that allows the C64 to connect to Wi-Fi access points to get online instead of an old phone modem, and then using a terminal program called CCGMS to connect to the BBS itself. Once there, users can access mail, share files, and even play a few games.

While the video is a very basic illustration of how these BBS systems worked and how to access one, it is notable in that it’s part of a trend of rejecting more modern technology and systems in favor of older ones, where the users had more control. A retro machine like a C64 or Atari is not required either; modern operating systems can access these with the right terminal program, too. A more in-depth guide to the BBS can be found here for those looking to explore, and we’ve also seen other modern BBS systems recently.
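
For anyone who wants to try this from a modern machine rather than a C64, the connection is ultimately just a raw telnet-style session. Below is a minimal, hypothetical Python sketch; the host and port are placeholders (the article doesn’t give The Old Net’s connection details), and a real terminal program such as CCGMS or SyncTerm additionally handles telnet option negotiation and PETSCII, which this toy client skips.

```python
# Minimal telnet-style BBS client sketch. HOST/PORT are placeholders, not
# real connection details; substitute the BBS you actually want to reach.
import socket
import sys
import threading

HOST = "bbs.example.org"   # placeholder address
PORT = 23                  # classic telnet port; many BBSes use others

def pump_remote(sock):
    """Print whatever the BBS sends until the connection closes."""
    while True:
        data = sock.recv(1024)
        if not data:
            break
        sys.stdout.write(data.decode("latin-1", errors="replace"))
        sys.stdout.flush()

def main():
    with socket.create_connection((HOST, PORT)) as sock:
        threading.Thread(target=pump_remote, args=(sock,), daemon=True).start()
        # Forward each line we type, CR/LF terminated, as old systems expect.
        # (No telnet negotiation or PETSCII handling -- this is a toy.)
        for line in sys.stdin:
            sock.sendall(line.rstrip("\n").encode("latin-1") + b"\r\n")

if __name__ == "__main__":
    main()
```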

Thanks to [Charlie] for the tip!

Yesterday — 24 January 2026

Crazy Old Machines

24 January 2026 at 10:00

Al and I were talking about the IBM 9020 FAA Air Traffic Control computer system on the podcast. It’s a strange machine, made up of a bunch of IBM System 360 mainframes connected together to a common memory unit, with all sorts of custom peripherals to support keeping track of airplanes in the sky. Absolutely go read the in-depth article on that machine if it sparks your curiosity.

It got me thinking about how strange computers were in the early days, and how boringly similar they’ve all become. Just looking at the word sizes of old machines is a great example. Over the last, say, 40 years, things that do computing have had 4, 8, 16, 32, or even 64-bit words. You noticed the powers-of-two trend going on here, right? Basically starting with the lowly Intel 4004, it’s been round numbers ever since.

Harvard Mark I, by [Topory]
On the other side of the timeline, though, you get strange beasts. The classic PDP-8 had 12-bit words, while its predecessors the PDP-6 and PDP-1 had 36 bits and 18 bits respectively. (Multiples of six?) There’s a string of military guidance computers that had 27-bit words, while the Apollo Guidance Computer ran 15-bit words. UNIVAC III had 25-bit words, putting the 23-bit Harvard Mark I to shame.
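
To get a feel for what those word sizes mean in practice, here’s a small illustrative Python sketch (not from the article) showing how arithmetic wraps at different word widths; emulators of these machines typically mask results in exactly this way.

```python
# Illustrative only: arithmetic on an n-bit machine wraps modulo 2**n, so the
# same "add one" gives different results on 12-, 16-, 18-, and 36-bit words.
def wrap(value, bits):
    """Truncate an integer to an unsigned n-bit word."""
    return value & ((1 << bits) - 1)

for bits in (12, 16, 18, 36):
    top = (1 << bits) - 1   # largest unsigned value for this word size
    print(f"{bits:2d}-bit word: max {top}, {top} + 1 wraps to {wrap(top + 1, bits)}")
```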

I wasn’t there, but it gives you the feeling that each computer is a unique, almost hand-crafted machine. Some must have made their odd architectural choices to suit particular functions, others because some designer had a clever idea. I’m not a computer historian, but I’m sure that the word lengths must tell a number of interesting stories.

On the whole, though, it gives the impression of a time when each computer was its own unique machine, before the convergence of everything to roughly the same architectural ideas. A much more hackery time, for lack of a better word. We still see echoes of this in the people who make their own “retro” computers these days, either virtually, on a breadboard, or emulated in the fabric of an FPGA. It’s not just nostalgia, though, but a return to a time when there was more creative freedom: a time before 64 bits took over.

This article is part of the Hackaday.com newsletter, delivered every seven days for each of the last 200+ weeks. It also includes our favorite articles from the last seven days that you can see on the web version of the newsletter. Want this type of article to hit your inbox every Friday morning? You should sign up!

Ethereum Launches $2M Quantum Defense Team as Threat Timeline Accelerates

24 January 2026 at 09:01

The Ethereum Foundation has officially elevated quantum resistance to a top strategic priority with the formation of a dedicated Post Quantum team backed by $2 million in funding.

The new initiative comes as blockchain networks face mounting pressure to defend against quantum computing threats that industry experts increasingly warn could materialize within years rather than decades.

Ethereum researcher Justin Drake announced the team formation on Friday, revealing that Thomas Coratger will lead the effort alongside Emile, a core contributor to leanVM.

“After years of quiet R&D, EF management has officially declared PQ security a top strategic priority,” Drake said, adding that the foundation has been developing its quantum strategy since a 2019 presentation at StarkWare Sessions.

Today marks an inflection in the Ethereum Foundation's long-term quantum strategy.

We've formed a new Post Quantum (PQ) team, led by the brilliant Thomas Coratger (@tcoratger). Joining him is Emile, one of the world-class talents behind leanVM. leanVM is the cryptographic…

— Justin Drake (@drakefjustin) January 23, 2026

Foundation Commits Resources Across Multiple Fronts

The Ethereum Foundation is launching comprehensive defensive measures spanning research, development, and infrastructure testing.

Antonio Sanso will kick off bi-weekly All Core Devs Post Quantum breakout calls next month, focusing on user-facing security, including dedicated precompiles, account abstraction, and transaction signature aggregation with leanVM.

The foundation announced two $1 million prize competitions to strengthen cryptographic foundations.

The newly launched Poseidon Prize targets the hardening of the Poseidon hash function, while the existing Proximity Prize continues to drive hash-based cryptography research.

“We are betting big on hash-based cryptography to enjoy the strongest and leanest cryptographic foundations,” Drake stated.
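
As rough intuition for why hash-based schemes are attractive here, the sketch below implements the textbook Lamport one-time signature in Python. This is not the foundation’s leanVM or Poseidon-based design, just the simplest possible hash-based construction: its security rests only on the preimage resistance of the hash, which known quantum attacks (Grover’s algorithm) weaken only quadratically, and each key pair must sign exactly one message.

```python
# Textbook Lamport one-time signature over SHA-256, for intuition only.
# NOT the Ethereum Foundation's scheme; each key pair may sign ONE message.
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # Row 0 holds secrets for message bits equal to 0, row 1 for bits equal to 1.
    sk = [[os.urandom(32) for _ in range(256)] for _ in range(2)]
    pk = [[H(x) for x in row] for row in sk]
    return sk, pk

def bits(digest):
    # The 256 bits of a SHA-256 digest, most significant bit first.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret per bit of H(msg).
    return [sk[b][i] for i, b in enumerate(bits(H(msg)))]

def verify(pk, msg, sig):
    return all(H(s) == pk[b][i] for i, (b, s) in enumerate(zip(bits(H(msg)), sig)))

sk, pk = keygen()
sig = sign(sk, b"hello post-quantum world")
assert verify(pk, b"hello post-quantum world", sig)
assert not verify(pk, b"tampered message", sig)
```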

Multi-client post-quantum consensus development networks are already operational, with pioneer teams including Zeam, Ream Labs, PierTwo, Gean client, and Ethlambda working alongside established consensus clients Lighthouse, Grandine, and Prysm.

Weekly post-quantum interop calls, coordinated by Will Corcoran, are managing collaborative technical development across these diverse implementation teams.

The foundation will host a three-day expert workshop in October, bringing together top specialists from around the world, building on last year’s post-quantum workshop in Cambridge.

An additional dedicated post-quantum day is scheduled for March 29 in Cannes, ahead of EthCC, to create multiple forums for advancing research and coordination across the global Ethereum development community.

Industry Voices Split on Timeline Urgency

The quantum threat has divided blockchain leaders on both timeline predictions and strategic priorities.

Independent Ethereum educator sassal.eth called quantum computing “a very real threat for blockchains” that is “coming sooner than most people think,” praising the foundation’s defensive preparations.

Pantera Capital General Partner Franklin Bi predicted that traditional financial institutions will struggle with the transition to post-quantum cryptography.

“People are over-estimating how quickly Wall Street will adapt to post-quantum cryptography,” Bi said, adding that blockchain networks possess unique capabilities for system-wide upgrades at a global scale.

The post-quantum race begins.

My prediction:

People are over-estimating how quickly Wall Street will adapt to post-quantum cryptography. Like any systemic software upgrade, it'll be slow & chaotic with single points of failure for years. Traditional systems are only as strong… https://t.co/6mEdFKcXrm

— Franklin Bi (@FranklinBi) January 23, 2026

He argued that successful quantum resistance could transform select blockchains into “post-quantum safe havens for data and assets,” particularly as traditional systems face prolonged periods of vulnerability due to single points of failure.

Bitcoin community assessments remain sharply contested. Vitalik Buterin previously shared Metaculus data showing a median 2040 timeline for quantum computers breaking modern cryptography, with roughly a 20% probability before the end of 2030.

Metaculus's median date for when quantum computers will break modern cryptography is 2040:https://t.co/Li8ni8A9Ox

Seemingly about a 20% chance it will be before end of 2030.

— vitalik.eth (@VitalikButerin) August 27, 2025

Blockstream CEO Adam Back has dismissed near-term concerns, claiming practical threats remain decades away and accusing critics of creating unnecessary market alarm.

Project ZKM contributor Stephen Duan acknowledged transition challenges while calling quantum resistance “inevitable,” noting that his team will soon upgrade multiset hashing to a lattice-based construction.

ZKsync inventor Alex Gluk also said the network’s Airbender prover is already “100% PQ-proof,” highlighting Ethereum’s unmatched ability to adapt to emerging threats while maintaining its position as the global financial settlement layer.

Foundation Plans Comprehensive Roadmap Release

The Ethereum Foundation will publish detailed strategic guidance on pq.ethereum.org covering full transition planning to achieve zero loss of funds and zero downtime over the coming years.

Drake highlighted recent artificial intelligence breakthroughs in formal proof generation, noting that specialized mathematics AI recently completed one of the hardest lemmas in hash-based SNARK foundations in a single eight-hour run costing $200.

The foundation is developing educational materials, including a six-part video series with ZKPodcast and enterprise-focused resources through EF Enterprise Acceleration.

Quantum Threatens $600B of Bitcoin 🎧🤖@nic_carter joins me for an in-person @PodcastDelphi to cover his 6 months of research on Quantum's effect on $BTC

Nic's first and only podcast on Quantum

Listen directly here, or on any of the links below pic.twitter.com/CSnv7xekqn

— Tommy (@Shaughnessy119) January 9, 2026

Ethereum also has representation on the post-quantum advisory board that Coinbase announced this week, which brings together leading cryptography researchers to assess long-term blockchain security risks as quantum computing capabilities advance across government and private-sector development programs.

The post Ethereum Launches $2M Quantum Defense Team as Threat Timeline Accelerates appeared first on Cryptonews.

Google Research suggests AI models like DeepSeek exhibit collective intelligence patterns

24 January 2026 at 00:50

A Google study finds advanced AI models mimic collective human intelligence by using internal debates and diverse reasoning paths, reshaping how future AI systems may be designed.

The post Google Research suggests AI models like DeepSeek exhibit collective intelligence patterns appeared first on Digital Trends.

Before yesterday

FedRAMP is getting faster, new automation and pilots promise approvals in months, not years

23 January 2026 at 15:34

Interview transcript

Terry Gerton We’re going to talk about one of everybody’s favorite topics, FedRAMP. It’s been around for years, but agencies are still struggling to get modern tools. So from your perspective, why is the process so hard for software and service companies to get through?

Irina Denisenko  It’s a great question. Why is it so hard to get through FedRAMP? It is so hard because, at the end of the day, what is FedRAMP really here to do? It’s here to secure cloud software, to secure government data sitting in cloud software. You have to remember this all came together almost 15 years ago, which, if you think back 15 or 20 years, was kind of the early days of all of us interacting with the internet. We were still, in some cases, scared to enter our credit card details onto a website. Fast forward to today, and we pay with our face when we pick up our phone. We’ve come a long way. But the reality is that cloud security hasn’t always been a matter of “of course it’s secure.” In fact, it has been the opposite: “of course it’s insecure, it’s the internet, and that’s where you go to lose all your data and all your information.” So, long story short, you have to understand that’s where the government is coming from. We need to lock everything down to make sure that whether it’s VA patient data, IRS data on taxpayers, or obviously anything in the DoW, all of that information stays secure. And that’s why there are hundreds of controls applied to cloud environments, to make sure and double sure and triple sure that that data is secure.

Terry Gerton You lived the challenge first-hand with your own company. What most surprised you about the certification process when you tackled it yourself?

Irina Denisenko  What most surprised me when we tackled FedRAMP ourselves for the first time was that even if you have the resources, and specifically if you have $3 million to spend (and $3 million burning a hole in your pocket doesn’t happen often), and you have staff on U.S. soil, and you have the willingness to invest all of that in a three-year process to get certified, that is still not enough. What you need on top of that is an agency to say yes to sponsoring you. And when they say yes to sponsoring you, what they are saying yes to is taking on your cyber risk. Specifically, they’re saying yes to spending half a million dollars of taxpayer money, of agency budget, typically using contractors, to do an initial security review of your application, and then to basically get married to you and do something called continuous monitoring, which is a monthly meeting that they’re going to have with you forever. That agency is going to be your accountability partner and ultimately the bearer of the risk of you, the software provider, to make sure you are burning down all of the vulnerabilities, all of these CVEs, every finding in your cloud environment, on the timeline you’re supposed to. And that ends up costing an agency about $250,000 a year, again in the form of contractors, tooling, etc. That was the most surprising to me: even as a cloud service provider who’s already doing business with JPMorgan Chase, healthcare systems, you name it, even that’s not enough. You need an agency sponsor, because at the end of the day it’s the agency’s data and they have to protect it. So they have to do that triple assurance of: yes, you said you’re doing the security stuff, but let us confirm that you’re doing the security stuff. That was the most surprising to me, and it’s really why we started Knox Systems, because what we do at Knox is enable the inheritance model. We are doing all of that with our sponsoring agencies, of which we have 15. Knox runs the largest FedRAMP managed cloud. What that means is we host the production environment of our customers inside of our FedRAMP environment across AWS, Azure, and GCP, and our customers inherit our sponsors. They inherit the authorization from the Treasury, from the VA, from the Marines, etc., which means that the Marines, the Treasury, and the VA didn’t have to spend an extra half a million upfront and $250k ongoing for every new application that was authorized. They are able to get huge bang for their buck by investing that authorization, that sponsorship, into the Knox boundary. And then Knox does the hard work to ensure the security and ongoing authorization and compliance of all of the applications that we bring into our environment.

Terry Gerton I’m speaking with Irina Denisenko. She’s the CEO of Knox Systems. So it sounds like you found a way through the maze that was shorter, simpler, less expensive. Is FedRAMP 20X helping to normalize that kind of approach? How do you see it playing out?

Irina Denisenko  Great question. FedRAMP 20X is a phenomenal initiative coming out of OMB and GSA, and really the crux of it is machine-readable and continuous authorization. Today, when I talk about continuous monitoring, that’s a monthly meeting that happens. And I kid you not, we, as a cloud service provider (again, we secure Adobe’s environment and many others), come with a spreadsheet, an actual spreadsheet, that has all of the vulnerabilities listed from all the scans we’ve done over the last month, plus anything that is still open from prior months. We review that spreadsheet, that actual Excel document, with our agencies, and then, after that meeting, we upload it into a system called USDA on the FedCiv side, or into eMASS, DISA’s system, on the DoW side. And then they, on their side, download that spreadsheet and put it into other systems. And I mean, that’s the process. I think no one is confused; no one would argue that there isn’t surely a better way. A better way would be a machine-readable way, whether that’s over an API or using a standard language like OSCAL. There are lots of ways to standardize, but it doesn’t have to be basically the equivalent of a clipboard and a pencil. That’s what FedRAMP 20X is doing. It’s automating that information flow, so not only does it bring down the amount of human labor that needs to be done to do all this tracking, but more importantly, this is cloud security. Just because you’re secure one second doesn’t mean you’re secure five seconds from now, right? You need to be actively monitoring this, actively reporting this. And if it’s taking you 30 days to let an agency know that you have a critical vulnerability, that’s crazy. You’ve got to tell them, you know, five minutes after you find out, or, to allow a respectable, responsible buffer so you can mitigate and remediate before you notify more parties, maybe four days, but it’s certainly not 30 days. That’s what FedRAMP 20X is doing. We’re super excited about it. We are very supportive of it and have been actively involved in phase I and all subsequent phases.
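
As a concrete, hypothetical illustration of the difference Denisenko describes, the sketch below emits a single vulnerability finding as structured JSON instead of a row in a monthly spreadsheet. The field names are made up for illustration; they are not the OSCAL POA&M schema or any official FedRAMP 20X format.

```python
# Hedged illustration of "machine-readable" continuous monitoring: each
# finding becomes structured data the moment the scanner reports it, rather
# than a row in a monthly Excel upload. Field names are hypothetical.
import json
from datetime import datetime, timezone

finding = {
    "id": "CVE-2026-0001",                # placeholder identifier
    "severity": "critical",
    "asset": "app-server-01",             # placeholder asset name
    "detected_at": datetime.now(timezone.utc).isoformat(),
    "remediation_due_days": 30,           # illustrative, not a policy value
    "status": "open",
}

# In a 20X-style flow this record would be pushed to the agency over an
# agreed API or standard format (e.g. OSCAL) instead of emailed monthly.
print(json.dumps(finding, indent=2))
```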

Terry Gerton Right, so phase II is scheduled to start shortly in 2026. What are you expecting to see as a result?

Irina Denisenko  Well, phase I was all about FedRAMP Low, and phase II is all about FedRAMP Moderate. FedRAMP Moderate is realistically where most cloud service offerings sit, Moderate and High, so that’s really the one FedRAMP needs to get right. What we expect to see, and hope to see, is agencies actually authorizing off of these new frameworks. The key is really going to be what shape FedRAMP 20X takes in terms of machine-readable reporting on the security posture of any cloud environment, and then, of course, the industry will standardize around that. So we’re excited to see what that looks like, and also how much AI GSA, OMB, and ultimately FedRAMP leverage, because there is a tremendous amount of productivity, but also security, that AI can provide. It can also introduce a lot of risk. So we’re all collaborating with the agency, and we’re excited to see where they draw the bright red lines and where they embrace AI.

Terry Gerton So phase II is only gonna incorporate 10 companies, right? So for the rest of the world who’s waiting on these results, what advice do you have for them in the meantime? How can companies prepare better or how can companies who want to get FedRAMP certified now best proceed?

Irina Denisenko  I think at the end of the day it comes down to the inheritance model that Knox provides. And, you know, we’re not the only ones; there are really two key players, ourselves and Palantir. There’s a reason that large companies like Celonis, OutSystems, BigID, and Armis, which was just bought by ServiceNow for almost $8 billion, choose Knox, and there’s a reason Anthropic chose Palantir and Grafana chose Palantir. Because regardless of FedRAMP 20X or Rev 5, it doesn’t matter, there is a massive, massive premium on getting innovative technology into the hands of our government faster. We have a window right now, with the current administration prioritizing innovative technology and commercial off-the-shelf: take the best out of Silicon Valley, or out of Europe, out of Israel, you name it, and use it in the government, rather than build it yourself, customize it until you’re blue in the face, and still get an inferior product. Just use the best of breed, right? But you need it to be secure. And we have this window as a country for the next few years to get these technologies in. It takes a while to adopt new technologies; it takes a while to make a quantum leap. But I’ll give you a perfect example. Celonis had been trying to get FedRAMPed for five years; since becoming FedRAMPed on August 19th with Knox, they have implemented three agencies. And what do they do? They do process mining and intelligence. They’re an $800 million company that’s 20 years old and competes, by the way, head-on with Palantir’s core products, Foundry and Gotham and so on. They’ve implemented three agencies already to drive efficiency, to drive visibility, to drive process mining, to drive intelligence, to drive AI-powered decision-making. And that was during the holidays, during a government shutdown; it’s a speed we’ve never seen before. If you want outcomes, you need to get these technologies into the hands of our agencies today. And so that’s why we’re such big proponents of this model, and also why our agencies and our federal advisory board, which includes the DHS CISO, the DoW CIO, and the VA CIO, are supportive of it: because ultimately it’s about serving the mission and doing it now, rather than waiting for some time in the future.

The post FedRAMP is getting faster, new automation and pilots promise approvals in months, not years first appeared on Federal News Network.


5 weird ways the Raspberry Pi has revived retro computer hardware

23 January 2026 at 13:00

Raspberry Pi devices are popular among retro enthusiasts looking to emulate old computers and consoles, but this usually only goes as far as software. What you might not have considered is that the Raspberry Pi can also play a role in reviving old hardware.

AI coding work is shifting fast, and your career path may split

23 January 2026 at 05:38

AI coding work is rising fast, but the biggest payoff isn’t evenly shared. A Science analysis suggests seasoned developers get stronger gains than newcomers, which could reshape how you learn, interview, and prove value.

The post AI coding work is shifting fast, and your career path may split appeared first on Digital Trends.

Your AI could copy our worst instincts, but there’s a fix for AI social bias

23 January 2026 at 04:38

AI models including GPT-4.1 and DeepSeek-3.1 can mirror ingroup versus outgroup bias in everyday language, a study finds. Researchers also report an ION training method that reduced the gap.

The post Your AI could copy our worst instincts, but there’s a fix for AI social bias appeared first on Digital Trends.

A 1970s Electronic Game

23 January 2026 at 01:00

What happens when a traditional board game company decides to break into electronic gaming? Well, if it were a UK gaming company in 1978, the result would be a Waddingtons 2001 The Game Machine that you can see in the video from [Re:Enthused] below.

The “deluxe console model” had four complete games: a shooting gallery, blackjack, Code Hunter, and Grand Prix. But when you were done having fun, no worries. The machine was also a basic calculator with a very strange keyboard. We couldn’t find an original retail price on these, but we’ve read it probably sold for £20 to £40, which, in 1978, was more than it sounds like today.

Like a board game, there were paper score sheets. The main console had die-cut panels to decorate the very tiny screen (which looks like a very simple vacuum fluorescent display) and provide labels for the buttons. While it isn’t very impressive today, it was quite the thing in 1978.

This would be a fun machine to clone and quite easy, given the current state of the art in most hacker labs. A 3D-printed case, color laser-printed overlays, and just about any processor you have lying around would make this a weekend project.

It is easy to forget how wowed people were by games like this when they were new. Then again, we don’t remember any of those games having a calculator.

As a side note, Waddingtons was most famous for their special production of Monopoly games at the request of MI9 during World War II. The games contained silk maps, money, and other aids to help prisoners of war escape.

Alder Lake is ending and here’s what it means for your current PC

23 January 2026 at 00:33

Intel has initiated the end-of-life process for its 12th Gen Alder Lake CPUs and 600-series chipsets, setting final order and shipment dates as the platform begins phasing out.

The post Alder Lake is ending and here’s what it means for your current PC appeared first on Digital Trends.

Coinbase Announces New Board Of Experts To Combat Rising Quantum Computing Risks

23 January 2026 at 01:00

The crypto industry is preparing for a potential security challenge with the anticipated arrival of quantum computing. In response to this potential threat, Coinbase (COIN) has announced the formation of an advisory board composed of external experts. 

Coinbase Chief Security Officer’s Warning 

According to a report from Fortune, the newly established board includes academics from Stanford, Harvard, and the University of California, specializing in fields like computer science, cryptography, and fintech. 

Officially titled the Coinbase Independent Advisory Board on Quantum Computing and Blockchain, the group also features experts from the Ethereum Foundation, the decentralized finance (DeFi) platform EigenLayer, and Coinbase itself.

Jeff Lunglhofer, Coinbase’s Chief Information Security Officer, elaborated on the potential impact of quantum computing on current encryption methods. 

He explained that the encryption protecting wallets and private keys of Bitcoin (BTC) holders relies on complex mathematical problems that would take conventional computers thousands of years to solve. 

However, with the computational power that quantum computers promise—potentially a million times greater—these problems could be solved much more swiftly, Lunglhofer asserted.

Although the security implications of quantum computing are genuine, Lunglhofer reassured that they are not expected to become an immediate concern for at least a decade. The purpose of the new advisory board is to examine the upcoming challenges posed by quantum computing in a measured manner. 

This involves fostering initiatives within the blockchain industry that are reportedly already underway to enhance the resilience of Bitcoin and other networks against quantum attacks.

Blockchain Networks Expected To Implement Larger Keys

At present, Bitcoin secures its wallets through private keys, which consist of long strings of random characters. These keys are accessible to their owners but can only be estimated through extensive trial-and-error computations. 

The advent of quantum computing, however, would make it feasible to deduce private keys using trial-and-error methods in a fraction of the time. 

In response to this looming threat, Fortune disclosed that blockchain experts speculate that networks will implement larger keys and add “noise” to obscure their locations, making them more difficult to detect. Implementing these defensive upgrades across blockchain networks is expected to take several years.

In the meantime, the newly formed Coinbase Advisory Board is gearing up to publish research papers and issue position statements aimed at helping the cryptocurrency industry brace for the impacts of quantum computing. 

Their first paper, which will address quantum’s influence on the consensus and transaction layers of blockchain, is expected to be released within the next couple of months.


At the time of writing, Coinbase’s stock, which trades under the ticker symbol COIN on the Nasdaq, is trading at $225.10. This represents a slight drop of 1.2% over the last 24 hours. 


Google Search can now answer questions using your Gmail and Photos in AI mode

22 January 2026 at 17:29

Google Search is adding new features to Personal Intelligence in AI mode, allowing it to pull context from Gmail and Photos so it can answer questions that depend on your own history.

The post Google Search can now answer questions using your Gmail and Photos in AI mode appeared first on Digital Trends.

You can now connect Claude with Apple Health to get insights from your fitness data

22 January 2026 at 14:53

Claude AI now connects with Apple Health, letting users talk through their fitness and health data to spot trends, understand metrics, and get plain-language insights instead of raw numbers.

The post You can now connect Claude with Apple Health to get insights from your fitness data appeared first on Digital Trends.

Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow

22 January 2026 at 13:58


Earlier this week, Coinbase announced the creation of an Independent Advisory Board on Quantum Computing and Blockchain, aiming to safeguard the crypto ecosystem against emerging quantum threats.

The board will bring together leading experts in quantum computing, cryptography, and blockchain to assess risks and provide guidance to the broader industry.

Quantum computers, if scaled successfully, could compromise the cryptography that underpins major blockchains like Bitcoin and Ethereum. Coinbase, in their announcement, stressed that preparing for these future challenges is crucial to maintaining the security of digital assets.

The advisory board includes notable figures such as quantum computing pioneer Scott Aaronson, Stanford cryptography expert Dan Boneh, Ethereum researcher Justin Drake, and Coinbase’s own Head of Cryptography, Yehuda Lindell. 

The group says they will publish position papers, recommend best practices for long-term security, and respond to significant advances in quantum computing.

This initiative is part of Coinbase’s larger post-quantum security strategy, which also includes updating Bitcoin address handling, enhancing internal key management, and advancing research on post-quantum signature schemes. The board’s first position paper is expected early next year, laying out a roadmap for quantum resilience in blockchain systems.

Coinbase said the move underscores the importance of proactive planning, ensuring the crypto industry remains prepared, not reactive, as quantum technology evolves.

Is Bitcoin at Risk From Quantum Computing?

Over the last several months, concerns over quantum computing’s potential impact on Bitcoin have begun to ripple through traditional finance, prompting some investors to radically rethink their exposure to the cryptocurrency. 

Jefferies strategist Christopher Wood recently removed Bitcoin from his Greed & Fear model portfolio, citing the existential risk that large-scale quantum computers could undermine the cryptographic foundations securing digital assets. 

While the threat is not imminent, Wood and other institutional voices — including BlackRock and UBS CEO Sergio Ermotti — warn that quantum advances could eventually allow attackers to derive private keys from public ones, putting millions of BTC at risk. 

As a result, Wood replaced Bitcoin with gold and gold-mining equities, emphasizing that long-term store-of-value claims for digital assets may be less reliable in the face of accelerating technological change.

The debate over quantum computing in the Bitcoin ecosystem is intensifying. Coinbase research indicates that roughly 20% to 50% of Bitcoin’s supply, particularly coins in older wallet formats, could be vulnerable to so-called long-range quantum attacks. 
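
One reason older formats are singled out: early pay-to-pubkey outputs (and any reused address) leave the public key itself on-chain, which is what a future quantum attacker running Shor’s algorithm would need, whereas a standard pay-to-pubkey-hash address only commits to a hash of the key until the coins are spent. The Python sketch below shows that hash commitment; the public key bytes are placeholders, and the ripemd160 algorithm may be unavailable in some OpenSSL builds.

```python
# Sketch of the P2PKH commitment: an address encodes HASH160(pubkey), so the
# public key stays hidden until spend time. Pubkey bytes below are made up.
import hashlib

def hash160(pubkey_bytes: bytes) -> bytes:
    sha = hashlib.sha256(pubkey_bytes).digest()
    return hashlib.new("ripemd160", sha).digest()   # may raise if unsupported

fake_pubkey = bytes.fromhex("02" + "11" * 32)       # placeholder compressed key
print("pubkey hash committed in a P2PKH address:", hash160(fake_pubkey).hex())
```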

Crypto developers and researchers are divided over the urgency of implementing quantum-resistant solutions, with some advocating proactive upgrades and others arguing the risk remains distant. 

Strategy Chairman Michael Saylor believes that quantum computing will actually strengthen Bitcoin rather than threaten it. Network upgrades and coin migrations will boost security, while lost coins remain frozen, Saylor posted.

This post Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow first appeared on Bitcoin Magazine and is written by Micah Zimmerman.
