
Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow

Bitcoin Magazine

Earlier this week, Coinbase announced the creation of an Independent Advisory Board on Quantum Computing and Blockchain, aiming to safeguard the crypto ecosystem against emerging quantum threats.

The board will bring together leading experts in quantum computing, cryptography, and blockchain to assess risks and provide guidance to the broader industry.

Quantum computers, if scaled successfully, could compromise the cryptography that underpins major blockchains like Bitcoin and Ethereum. Coinbase, in its announcement, stressed that preparing for these future challenges is crucial to maintaining the security of digital assets.

The advisory board includes notable figures such as quantum computing pioneer Scott Aaronson, Stanford cryptography expert Dan Boneh, Ethereum researcher Justin Drake, and Coinbase’s own Head of Cryptography, Yehuda Lindell. 

The group says it will publish position papers, recommend best practices for long-term security, and respond to significant advances in quantum computing.

This initiative is part of Coinbase’s larger post-quantum security strategy, which also includes updating Bitcoin address handling, enhancing internal key management, and advancing research on post-quantum signature schemes. The board’s first position paper is expected early next year, laying out a roadmap for quantum resilience in blockchain systems.
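Of the post-quantum signature families under study, hash-based schemes are the easiest to illustrate: their security rests only on hash preimage resistance, which quantum computers weaken but are not known to break. Below is a minimal sketch of Lamport's one-time signature (1979), purely illustrative and not the scheme Coinbase has committed to; note that each key pair must sign only one message:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # Secret key: one random pair of 32-byte values per digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    # Public key: the hash of each secret value.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def message_bits(message: bytes, n: int):
    digest = H(message)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(n)]

def sign(message: bytes, sk):
    # Reveal one secret value per bit of the message digest.
    return [sk[i][bit] for i, bit in enumerate(message_bits(message, len(sk)))]

def verify(message: bytes, sig, pk) -> bool:
    bits = message_bits(message, len(pk))
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"quantum-resistant", sk)
assert verify(b"quantum-resistant", sig, pk)
assert not verify(b"tampered", sig, pk)
```

Production hash-based schemes such as SPHINCS+ build many-time signatures out of structures like this, at the cost of signatures measured in kilobytes.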

Coinbase said the move underscores the importance of proactive planning, ensuring the crypto industry remains prepared, not reactive, as quantum technology evolves.

Is Bitcoin at Risk From Quantum Computing?

Over the last several months, concerns over quantum computing’s potential impact on Bitcoin have begun to ripple through traditional finance, prompting some investors to radically rethink their exposure to the cryptocurrency. 

Jefferies strategist Christopher Wood recently removed Bitcoin from his Greed & Fear model portfolio, citing the existential risk that large-scale quantum computers could undermine the cryptographic foundations securing digital assets. 

While the threat is not imminent, Wood and other institutional voices, including asset manager BlackRock and UBS CEO Sergio Ermotti, warn that quantum advances could eventually allow attackers to derive private keys from public ones, putting millions of BTC at risk.

As a result, Wood replaced Bitcoin with gold and gold-mining equities, emphasizing that long-term store-of-value claims for digital assets may be less reliable in the face of accelerating technological change.

The debate over quantum computing in the Bitcoin ecosystem is intensifying. Coinbase research indicates that roughly 20% to 50% of Bitcoin’s supply, particularly coins in older wallet formats, could be vulnerable to so-called long-range quantum attacks. 

Crypto developers and researchers are divided over the urgency of implementing quantum-resistant solutions, with some advocating proactive upgrades and others arguing the risk remains distant. 

Strategy Chairman Michael Saylor believes that quantum computing will actually strengthen Bitcoin rather than threaten it. Network upgrades and coin migrations will boost security, while lost coins remain frozen, Saylor posted.

This post Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow first appeared on Bitcoin Magazine and is written by Micah Zimmerman.

Jefferies’ Analyst Dumps Bitcoin Over Quantum Computing Fears, Buys Gold

Bitcoin Magazine

Christopher Wood, global head of equity strategy at Jefferies, has eliminated Bitcoin from his flagship Greed & Fear model portfolio, citing concerns that developments in quantum computing could pose an existential threat to the cryptocurrency’s cryptographic foundations.

In the latest edition of the widely followed newsletter, Wood confirmed that Jefferies has removed its entire 10% Bitcoin allocation, replacing it with a split allocation of 5% to physical gold and 5% to gold-mining equities, according to Bloomberg. 

The strategist said the move reflects rising uncertainty over whether Bitcoin can maintain its role as a long-term store of value in the face of accelerating technological change.

“While Greed & Fear does not believe that the quantum issue is about to hit the Bitcoin price dramatically in the near term, the store-of-value concept is clearly on less solid foundation from the standpoint of a long-term pension portfolio,” Wood wrote.

Wood was an early institutional supporter of Bitcoin, first adding it to the model portfolio in December 2020 amid pandemic-era stimulus and fears of fiat currency debasement. He later increased the allocation to 10% in 2021.

Since that initial inclusion, Bitcoin has risen approximately 325%, compared with a 145% gain in gold over the same period.

Quantum computing presents structural risks to Bitcoin 

Despite the strong performance, Wood argues that quantum computing presents a structural risk that cannot be ignored. Bitcoin’s security relies on cryptographic algorithms that are effectively unbreakable using classical computers. 

However, sufficiently powerful quantum machines could theoretically derive private keys from public keys, enabling unauthorized transfers and undermining confidence in the network.

Security researchers estimate that roughly 20% to 50% of Bitcoin’s total supply — between 4 million and 10 million BTC — could be vulnerable under certain conditions. 

Coinbase researchers have identified approximately 6.5 million BTC held in older wallet formats where public keys are already exposed on-chain, making them susceptible to so-called long-range quantum attacks.
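The distinction between wallet formats comes down to what the blockchain records. Early pay-to-public-key (P2PK) outputs embed the raw public key in the output script, while later pay-to-public-key-hash (P2PKH) style outputs publish only a hash of it, so the key is not revealed until the coins are first spent. A simplified sketch (Bitcoin's real hash160 is RIPEMD-160 over SHA-256; plain SHA-256 is used here for portability, and the key bytes are made up):

```python
import hashlib

def address_commitment(pubkey: bytes) -> bytes:
    # Simplified stand-in for Bitcoin's hash160 (RIPEMD160 over SHA256);
    # SHA-256 alone is used here for portability.
    return hashlib.sha256(pubkey).digest()

# A hypothetical 33-byte compressed public key.
pubkey = bytes.fromhex("02" + "11" * 32)

# P2PK-era output: the raw public key sits on-chain, so a future
# quantum attacker could target it directly (a "long-range" attack).
p2pk_script = pubkey

# P2PKH-style output: only a digest appears on-chain; the public key
# is revealed only when the coins are spent.
p2pkh_script = address_commitment(pubkey)

assert pubkey in p2pk_script       # key exposed on-chain
assert pubkey not in p2pkh_script  # only a 32-byte hash exposed
```

Reused addresses lose this protection too: once an address has spent coins, its public key is on-chain permanently.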

The issue has sparked a growing divide within the Bitcoin ecosystem. Some researchers argue that developers are underestimating the risk. Others, including Blockstream CEO Adam Back, maintain that the threat remains distant and that quiet preparatory work toward quantum-resistant signatures is preferable to alarming investors.

The debate has also begun to reach mainstream finance. BlackRock has listed quantum computing as a potential long-term risk in its spot Bitcoin ETF disclosures, while Solana co-founder Anatoly Yakovenko recently suggested there is a 50% chance of a meaningful quantum breakthrough within five years.

For Wood, the uncertainty itself strengthens the case for gold.

He described the metal as a historically tested hedge in an increasingly volatile geopolitical and technological landscape, concluding that the long-term questions raised by quantum computing are “only positive for gold.”

Gold climbed to record highs this month, topping $4,600 per ounce, as investors piled into the safe-haven asset amid escalating geopolitical tensions involving Iran and growing expectations that the Federal Reserve will cut interest rates following softer U.S. inflation and labor market data.

This post Jefferies’ Analyst Dumps Bitcoin Over Quantum Computing Fears, Buys Gold first appeared on Bitcoin Magazine and is written by Micah Zimmerman.

Bitcoin Decouples From Global Liquidity: Analyst Says Quantum Threat Behind It

Bitcoin has decoupled from the global M2 supply for the first time. Here’s what could be the reason for it, according to the founder of Capriole Investments.

Bitcoin Has Diverged From The Global M2 Supply Trend

In a new post on X, Capriole Investments founder Charles Edwards has talked about how Bitcoin has decoupled from the global liquidity flows recently. Below is the chart cited by Edwards, which compares the year-over-year (YoY) percentage change in BTC to that in the global M2 supply.

Bitcoin Vs Global M2

As displayed in the graph, Bitcoin’s YoY change flatlined over 2025 while the total money supply of the world’s major economies witnessed growth, indicating BTC diverged from traditional liquidity flows.

In the past, the cryptocurrency’s YoY percentage change has generally showcased a similar trajectory to the global M2 supply. “This is the first time Bitcoin has decoupled from money supply and global liquidity flows,” noted the analyst.
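The comparison behind the chart is a simple transform: for a monthly series, the year-over-year change at month t is the percent difference from month t-12. A minimal sketch with made-up numbers:

```python
def yoy_change(series):
    # Year-over-year percent change for a monthly series;
    # needs at least 13 data points to produce any output.
    return [
        (series[i] / series[i - 12] - 1.0) * 100.0
        for i in range(12, len(series))
    ]

# Hypothetical monthly index: flat at 100, then 5% higher a year on.
m2 = [100.0] * 12 + [105.0]
assert abs(yoy_change(m2)[0] - 5.0) < 1e-9
```

A decoupling like the one Edwards describes shows up when this transform, applied to BTC and to global M2, stops tracking the same direction.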

What’s the reason behind this new trend? According to Edwards, it’s the threat posed by quantum computing to the network. Quantum computers are hypothesized to have the capability to break the cryptocurrency’s cryptography, with wallets from the blockchain’s early days being especially vulnerable.

It’s uncertain when quantum machines will find a breakthrough, but the Capriole founder believes BTC passed into a “Quantum Event Horizon” in 2025. “The timeframe to a non-zero probability of a quantum machine breaking Bitcoin’s cryptography is now less than the estimated time it will take to upgrade Bitcoin,” said Edwards.

In theory, a party with a sufficiently advanced quantum computer could break into old dormant wallets and dump the coins on the market. This would not only directly impact BTC’s price but could also undermine broader trust in the cryptocurrency itself.

“Money is repositioning to account for this risk accordingly,” explained the analyst. One X user countered that most investors don’t seem to agree with Edwards’ quantum timeline, suggesting that the market would be unlikely to decouple based on a view not widely shared.

“If you listen to all in bitcoin maxis on X you would think that,” Edwards replied to the user. “If you talk to real capital allocators and Bitcoin OGs in the space 7+ years in private – they are all considering this risk.”

In some other news, Bitcoin spot exchange-traded funds (ETFs) have continued to face weak demand recently, as data from SoSoValue shows.

Bitcoin Spot ETFs

From the above chart, it’s visible that last week saw $681 million exit from the US Bitcoin spot ETFs. The new week has started with inflows so far, but it remains to be seen whether they will continue in the coming days.

BTC Price

At the time of writing, Bitcoin is floating around $92,100, up nearly 2% in the last 24 hours.

Bitcoin Price Chart

Interoperability and standardization: Cornerstones of coalition readiness

In an era increasingly defined by rapid technological change, the ability of the United States and its allies to communicate and operate as a unified force has never been more vital. Modern conflict now moves at the pace of data, and success depends on how quickly information can be shared, analyzed and acted upon across Defense Department and coalition networks. Today, interoperability is critical to maintaining a strategic advantage across all domains.

The DoD has made progress toward interoperability goals through initiatives such as Combined Joint All-Domain Command and Control (CJADC2), the Modular Open Systems Approach (MOSA) and the Sensor Open Systems Architecture (SOSA). Each underscores a clear recognition that victory in future conflicts will hinge on the ability to connect every sensor, platform and decision-maker in real time. Yet as adversaries work to jam communications and weaken alliances, continued collaboration between government and industry remains essential.

The strategic imperative

Interoperability allows the Army, Navy, Marine Corps, Air Force and Space Force to function as one integrated team. It ensures that data gathered by an Army sensor can inform a naval strike or that an Air Force feed can guide a Space Force operation, all in seconds. Among NATO and allied partners, this same connectivity ensures that an attack on one member can trigger a fast, coordinated, data-driven response by all. That unity of action forms the backbone of deterrence.

Without true interoperability, even the most advanced technology can end up isolated. The challenge is compounded by aging systems, proprietary platforms and differing national standards. Sustained commitment to open architectures and shared standards is the only way to guarantee compatibility while still encouraging innovation.

The role of open standards

Open standards make real interoperability possible. Common interfaces like Ethernet or IP networking allow systems built by different nations or vendors to talk to one another. When governments and companies collaborate on open frameworks instead of rigid specifications, innovation can thrive without sacrificing integration.

History has demonstrated that rigid design rules can slow progress and limit creativity, and it’s critical we now find the right balance. That means defining what interoperability requires while giving end users the freedom to achieve it in flexible ways. The DoD’s emphasis on modular, open architectures allows industry to innovate within shared boundaries, keeping future systems adaptable, affordable and compatible across domains and partners.

Security at the core

Interoperability depends on trust, and trust relies on security. Seamless data sharing among allies must be matched with strong protection for classified and mission-critical information, whether that data is moving across networks or stored locally.

Information stored on devices, vehicles or sensors, also known as data at rest, must be encrypted to prevent exploitation if it is captured or lost. Strong encryption ensures that even if adversaries access the hardware, the information remains unreadable. The loss of unprotected systems has repeatedly exposed vulnerabilities, reinforcing the need for consistent data at rest safeguards across all platforms.
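One small building block of such safeguards is never storing the data-encryption key beside the data; instead it is derived on demand from a credential. A minimal stdlib sketch of that step using PBKDF2 (the iteration count is illustrative, and the cipher itself, e.g. AES-256-GCM, would come from a vetted library):

```python
import hashlib
import secrets

def derive_storage_key(passphrase, salt=None):
    # Derive a 256-bit key for at-rest encryption from a passphrase.
    # A fresh random salt ensures two devices never share a key.
    salt = salt if salt is not None else secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    return key, salt

key, salt = derive_storage_key("correct horse battery staple")
assert len(key) == 32
# Deterministic given the same passphrase and salt.
assert derive_storage_key("correct horse battery staple", salt)[0] == key
```

Because only the salt and ciphertext are stored, captured hardware yields nothing readable without the credential.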

The rise of quantum computing only heightens this concern. As processing power increases, current encryption methods will become outdated. Shifting to quantum-resistant encryption must be treated as a defense priority to secure joint and coalition data for decades to come.

Lessons from past operations

Past crises have highlighted how incompatible systems can cripple coordination. During Hurricane Katrina, for example, first responders struggled to communicate because their radios could not connect. The same issue has surfaced in combat, where differing waveforms or encryption standards limited coordination among U.S. services and allies.

The defense community has since made major strides, developing interoperable waveforms, software-defined radios and shared communications frameworks. But designing systems to be interoperable from the outset, rather than retrofitting them later, remains crucial. Building interoperability in from day one saves time, lowers cost and enhances readiness.

The rise of machine-to-machine communication

As the tempo of warfare increases, human decision-making alone cannot keep up with the speed of threats. Machine-to-machine communication, powered by artificial intelligence and machine learning, is becoming a decisive edge. AI-driven systems can identify, classify and respond to threats such as hypersonic missiles within milliseconds, long before a human could react.

These capabilities depend on smooth, standardized data flow across domains and nations. For AI systems to function effectively, they must exchange structured, machine-readable data through shared architectures. Distributed intelligence lets each platform make informed local decisions even if communications are jammed, preserving operational effectiveness in contested environments.

Cloud and hybrid architectures

Cloud and hybrid computing models are reshaping how militaries handle information. The Space Development Agency’s growing network of low Earth orbit satellites is enabling high bandwidth, global connectivity. Yet sending vast amounts of raw data from the field to distant cloud servers is not always practical or secure.

Processing data closer to its source, at the tactical edge, strikes the right balance. By combining local processing with cloud-based analytics, warfighters gain the agility, security and resilience required for modern operations. This approach also minimizes latency, ensuring decisions can be made in real time when every second matters.

A call to action

To maintain an edge over near-peer rivals, the United States and its allies must double down on open, secure and interoperable systems. Interoperability should be built into every new platform’s design, not treated as an afterthought. The DoD can further this goal by enforcing standards that require seamless communication across services and allied networks, including baseline requirements for data encryption at rest.

Adopting quantum-safe encryption should also remain a top priority to safeguard coalition systems against emerging threats. Ongoing collaboration between allies is equally critical, not only to harmonize technical standards, but to align operational procedures and shared security practices.

Government and industry must continue working side by side. The speed of technological change demands partnerships that can turn innovation into fielded capability quickly. Open, modular architectures will ensure defense systems evolve with advances in AI, networking and computing, while staying interoperable across generations of hardware and software.

Most importantly, interoperability should be viewed as a lasting strategic advantage, not just a technical goal. The nations that can connect, coordinate and act faster than their adversaries will maintain a strategic advantage. The continued leadership of the DoD and allied defense organizations in advancing secure, interoperable and adaptable systems will keep the United States and its partners ahead of near-peer competitors for decades to come.

 

Ray Munoz is the chief executive officer of Spectra Defense Technologies and a veteran of the United States Navy.

Cory Grosklags is the chief commercial officer of Spectra Defense Technologies.

The post Interoperability and standardization: Cornerstones of coalition readiness first appeared on Federal News Network.

Forget Guerrillas and IEDs - The Next Asymmetric War Will Be Engineered

OPINION — For most of modern history, asymmetric conflict conjured a familiar image: guerrillas in the hills, insurgents planting roadside bombs, or terrorists striking with crude weapons. The weak have traditionally offset the strong with mobility, surprise, and a willingness to take punishment.

That world is vanishing. A new age of synthetic asymmetry is emerging, one defined not by geography or ingenuity but by the convergence of technologies that enable small actors to wreak large-scale disruption. Unlike past asymmetry, which grew organically out of circumstance, this new form is engineered. It is synthetic, built from code, data, algorithms, satellites, and biotech labs. Here, “synthetic” carries a double meaning: it is both man-made and the product of synthesis, where disparate technologies combine to produce effects greater than the sum of their parts.

The implications for global security are profound. Power isn’t just about the size of an army or the depth of a treasury. It’s increasingly about who can combine technologies faster and more effectively.

A Brief History of Asymmetry

The weak finding ways to resist the strong is as old as conflict itself, but each era has defined asymmetry differently – shaped by the tools available and the political conditions of the time.

Nineteenth and 20th century resistance fighters, from Spain’s guerrilleros against Napoleon to Mao’s partisans in China, pioneered strategies that leveraged terrain, mobility, and popular support to frustrate superior armies. These methods set the template for Vietnam, where North Vietnamese and Viet Cong forces offset American firepower by blending into the population and stretching the war into a contest of political will.

The late 20th century brought new asymmetric forms. In Afghanistan, the mujahideen used Stinger missiles to neutralize Soviet air power. In Iraq, improvised explosive devices (IEDs) became the great equalizer, allowing insurgents to impose costs on heavily armored U.S. forces. Al-Qaeda and later ISIS demonstrated how transnational terrorist networks could project power globally with minimal resources, using ideology and spectacular violence to substitute for armies.

By the early 2000s, the cyber domain opened an entirely new front. The 2007 attacks on Estonia, widely attributed to Russian actors, showed that digital disruption could cripple a modern state without conventional force. Just three years later, the Stuxnet worm revealed how code could achieve effects once reserved for kinetic strikes, sabotaging Iranian nuclear centrifuges. These incidents marked the beginning of cyber as a core tool of asymmetric power.

The Arab Spring of 2011 revealed another evolution. Social media allowed activists to outmaneuver state censorship, coordinate mass mobilizations, and project their struggles globally. Authoritarian regimes learned just as quickly, harnessing the same tools for surveillance, propaganda, and repression. Asymmetric power was no longer only about insurgents with rifles; it could be exercised through smartphones and hashtags.

What began as the playbook of the weak has now been eagerly adapted by the strong. Russia weaponized social media to influence elections and deployed “little green men” in Crimea, deniable forces designed to blur the line between war and peace. Its use of mercenary groups like Wagner added a layer of plausible deniability, allowing Moscow to project power in Africa and the Middle East without formal commitments. China has fused state and private industry to pursue “civil-military fusion” in cyberspace, using intellectual property theft and digital influence campaigns to achieve strategic goals without firing a shot. Even the United States, though historically the target of asymmetric tactics, has employed them, using cyber operations like Stuxnet and financial sanctions as tools of coercion.

This adaptation by great powers underscores the shift: asymmetry is no longer just the recourse of the weak. It has become a strategic option for all actors, strong and weak alike. These episodes trace an arc: from guerrilla tactics shaped by terrain to a world where asymmetry is engineered by design.

Convergence as a Weapon

Synthetic asymmetry is not the product of a single breakthrough. It’s a result of technologies intertwining, with emergent results exceeding the sum of the parts.

Artificial intelligence and autonomy turn cheap drones into swarming strike platforms and enable generative AI-fueled propaganda that is instantly localized, highly scalable, and adapts in real time.

Biotechnology, leveraged by the democratization of tools like CRISPR and gene synthesis, opens doors to agricultural sabotage, engineered pathogens, or personalized biotargeting once confined to elite labs.

Cyber and quantum computing erode modern infrastructure–today through leaked state tools in criminal hands, tomorrow through quantum’s threat to encryption.

Commercial space assets put reconnaissance and global communications in reach of militias and small states.

Cryptocurrencies and decentralized finance fund rogue actors and blunt the power of sanctions.

Undersea infrastructure opens a highly asymmetric chokepoint, where low-cost submersibles or sabotage can sever global fiber-optic cables and energy pipelines, inflicting massive economic damage.

This is less about any one killer app than about convergence itself becoming a weapon.

Asymmetric warfare has always been about imbalance, but the shift to synthetic asymmetry is an exponential leap. A single phishing email can cripple a city’s infrastructure. Off-the-shelf drones can threaten billion-dollar ships. AI-powered disinformation efforts can destabilize national elections. This new ratio of effort to impact is more disproportionate than anything we’ve seen before.

Where Synthetic Asymmetry Is Already Here

Ukraine's defense shows what convergence looks like in practice. Commercial drones retrofitted for combat, AI-assisted targeting, crypto-based crowdfunding, and open-source satellite intelligence have allowed a middle-sized country to hold its own against one of the world’s largest militaries. The drone is to the 21st century what the AK-47 was to the 20th: cheap, accessible, and transformative.

In Gaza, reports suggest AI-driven targeting systems have accelerated lethal decision-making. Proponents say they improved efficiency; critics warn they lowered thresholds for force and reduced accountability. Either way, the software changed the calculus of war. When algorithms operate at machine speed, traditional political checks on violence weaken.

Iran has demonstrated how low-cost drone technology can harass U.S. naval forces and regional shipping. These platforms cost a fraction of the vessels and missile defenses required to counter them. Combined with cyber probes against Gulf energy infrastructure, Iran illustrates how synthetic asymmetry allows a mid-tier state to impose global strategic costs.

China’s campaigns against Taiwan go beyond military intimidation. They include AI-generated disinformation, synthetic social media accounts, and coordinated influence operations designed to erode trust in democratic institutions. This is synthetic asymmetry in the cognitive domain, an attempt to shift political outcomes before shots are ever fired.

In parts of Africa, mercenary groups operate with funding streams routed through cryptocurrency wallets, supported by commercial satellite communications. These mercenaries operate in gray zones, blurring the line between private enterprise and state proxy. Accountability vanishes in a haze of digital anonymity. Ransomware gangs, meanwhile, already display near-peer disruptive power. They freeze hospitals and pipelines, extract ransoms, and launder funds through crypto markets. Add generative AI for phishing and deepfake voices for fraud, and these groups begin to resemble stateless proto-powers in the digital realm.

The Private Sector as a Geopolitical Actor

Synthetic asymmetry also elevates the role of private companies. Commercial satellite firms provided Ukraine with near-real-time battlefield imagery. SpaceX’s Starlink network became essential to Kyiv’s communications, until its corporate leadership balked at enabling certain military uses. Crypto exchanges, meanwhile, have been both conduits for sanctions evasion and partners in enforcement.

These examples reveal a new reality: private entities now hold levers of power once reserved for states. But their interests are not always aligned with national strategies. A tech CEO may prioritize shareholder value or brand reputation over geopolitical objectives. This creates a new layer of vulnerability—governments dependent on private infrastructure must negotiate, persuade, or regulate their own corporate champions to ensure strategic alignment. The private sector is becoming a semi-independent actor in world politics.

The Cognitive and Economic Fronts

Perhaps the most destabilizing form of synthetic asymmetry lies in the cognitive domain. Deepfakes that impersonate leaders, AI-generated news outlets, and precision microtargeting of narratives can shape perceptions at scale. The cost of attack is negligible; the cost of defense is nothing less than the integrity of public discourse. For democracies, the danger is acute because open debate is their lifeblood.

Synthetic asymmetry also reshapes geopolitics through finance. North Korea has bankrolled its weapons programs through crypto theft. Russian oligarchs have sheltered assets in opaque digital networks. Decentralized finance platforms move billions across borders beyond the reach of traditional oversight. This financial shadow world undermines sanctions, once a cornerstone of Western statecraft, and allows actors to sustain pressure that would once have been crippling.

Why Democracies are Both Vulnerable and Strong

Herein lies the paradox: democracies are more exposed to synthetic asymmetry precisely because of their openness. Their media, economies, and political systems are target-rich. Legal and ethical constraints also slow the adoption of equivalent offensive tools.

Yet democracies hold underappreciated strengths: decentralized command cultures that empower rapid adaptation, innovation ecosystems that thrive on openness and collaboration, and alliances that allow for collective defense. The task is to recognize culture itself as a strategic asset and to organize defense not around any single domain, but across all of them.

Ethical and Legal Frameworks in Flux

The rise of synthetic asymmetry is colliding with international law and norms written for an earlier era. The legal status of cyber operations remains contested: is a crippling ransomware attack on a hospital an act of war, or a crime? The Tallinn Manual, NATO’s best attempt at clarifying how international law applies in cyberspace, remains largely aspirational.

AI-driven weapons systems pose even sharper dilemmas. Who is accountable when an algorithm selects a target in error? Should lethal decision-making be delegated to machines at all? The pace of technological change is outstripping the slow processes of treaty-making, leaving a widening gap between capability and governance, a gap where much of the risk resides.

Beyond Cold War Deterrence

Traditional deterrence, threatening massive retaliation, works poorly in a world of synthetic asymmetry. Many attackers are diffuse, deniable, or stateless. They thrive in gray zones where attribution is murky and escalation is uncertain.

What’s required is not just more technology, but a new doctrine for resilience: one that integrates cyber, cognitive, biological, economic, and space defenses as a single system. That doctrine has not yet been written, but its absence is already being exploited. At ISRS, we see this convergence daily, working with governments and institutions to adapt strategies for engineered asymmetric disruption.

We are at a hinge moment in strategic affairs. Just as the machine gun upended 19th-century doctrine and nuclear weapons reordered 20th-century geopolitics, the convergence of today’s technologies is reshaping the distribution of power. The future won’t be decided by who fields the biggest army. It will be decided by who can synthesize technologies into a disruptive force faster. That is the coming age of synthetic asymmetry. The question is whether democracies will recognize it and prepare before it fully arrives.

The Cipher Brief is committed to publishing a range of perspectives on national security issues submitted by deeply experienced national security professionals.

Opinions expressed are those of the author and do not represent the views or opinions of The Cipher Brief.

Have a perspective to share based on your experience in the national security field? Send it to Editor@thecipherbrief.com for publication consideration.

Cisco Quantum – Simply Network All the Quantum Computers

S. Schuchart

Cisco’s Quantum Labs research team, part of Outshift by Cisco, has announced that it has completed a full software solution prototype. The latest piece is the Cisco Quantum Compiler prototype, designed for distributed quantum computing across networked processors. In short, it allows a network of quantum computers, of all types, to participate in solving a single problem. Even better, this new compiler supports distributed quantum error correction. Instead of a single quantum computer needing a huge number of qubits itself, the load can be spread out among multiple quantum computers. This coordination is handled across a quantum network, powered by Cisco’s Quantum Network entanglement chip, which was announced in May 2025. The same network could also be used to secure communications for traditional servers.

For some quick background: one of the factors holding quantum computers back is a lack of both quantity and quality when it comes to qubits. Most of the things quantum computers can in theory do require thousands or millions of qubits, while today’s systems have around a thousand. Those qubits also need to be high quality, because qubits are extremely susceptible to outside interference. To fix the quality problem, a considerable amount of work has gone into quantum error correction, but most error-correction routines require even more physical qubits to create stable ‘logical’ qubits. Research has been ongoing across the industry as everyone looks for a way to create large numbers of stable qubits.
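The trade-off described above, spending many physical qubits to get one reliable logical qubit, can be illustrated classically with the simplest error-correcting code: a 3-way repetition code with majority-vote decoding. Real quantum codes such as the surface code are far more involved, but the arithmetic is similar in spirit, since an uncorrelated per-bit error rate p becomes roughly 3p² after correction:

```python
import random

def encode(bit):
    # One logical bit -> three physical copies.
    return [bit] * 3

def noisy_channel(codeword, flip_prob, rng):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in codeword]

def decode(codeword):
    # Majority vote recovers the logical bit unless 2+ copies flipped.
    return int(sum(codeword) >= 2)

rng = random.Random(42)
p, trials = 0.05, 10_000
raw_errors = sum(rng.random() < p for _ in range(trials))
coded_errors = sum(
    decode(noisy_channel(encode(0), p, rng)) != 0 for _ in range(trials)
)
# With p = 0.05, the logical error rate is ~3p^2 = 0.0075, well below p.
assert coded_errors < raw_errors
```

The cost is visible too: three physical bits per logical bit, which is exactly the overhead problem that distributing error correction across networked processors aims to ease.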

What Cisco is proposing is that instead of making a single quantum processor bigger to hold more qubits, multiple quantum processors can be strung together with its quantum networking technology, with distributed error correction ensuring the quality of the transmitted qubits. It’s an intriguing idea: as Cisco more or less points out, we didn’t achieve scale in traditional computing by making a single CPU bigger and bigger until it could handle all tasks. Instead, multiple CPUs were integrated into a server, and those servers were networked together to share the load. That makes good sense, and it’s an interesting approach. Just like traditional CPUs, quantum processors will not suddenly stop growing, but if this works it will allow those processors to scale at a smaller size per node, possibly ushering in useful, practical quantum computing sooner.

Is this the breakthrough needed to bring about the quantum computing revolution? At this point it’s a prototype, not an extensively tested method. Quantum computing requires so much fundamental physics research and is so complicated that it’s extremely hard to say whether what Cisco is suggesting can usher in that new quantum age. But it is extremely interesting, and it will be worth watching this approach as Cisco ramps up its efforts in quantum technologies.
