
Ethereum Launches $2M Quantum Defense Team as Threat Timeline Accelerates

24 January 2026 at 09:01

The Ethereum Foundation has officially elevated quantum resistance to a top strategic priority with the formation of a dedicated Post Quantum team backed by $2 million in funding.

The new initiative comes as blockchain networks face mounting pressure to defend against quantum computing threats that industry experts increasingly warn could materialize within years rather than decades.

Ethereum researcher Justin Drake announced the team formation on Friday, revealing that Thomas Coratger will lead the effort alongside Emile, a core contributor to leanVM.

“After years of quiet R&D, EF management has officially declared PQ security a top strategic priority,” Drake said, adding that the foundation has been developing its quantum strategy since a 2019 presentation at StarkWare Sessions.

Today marks an inflection in the Ethereum Foundation's long-term quantum strategy.

We've formed a new Post Quantum (PQ) team, led by the brilliant Thomas Coratger (@tcoratger). Joining him is Emile, one of the world-class talents behind leanVM. leanVM is the cryptographic…

— Justin Drake (@drakefjustin) January 23, 2026

Foundation Commits Resources Across Multiple Fronts

The Ethereum Foundation is launching comprehensive defensive measures spanning research, development, and infrastructure testing.

Antonio Sanso will kick off bi-weekly All Core Devs Post Quantum breakout calls next month, focusing on user-facing security, including dedicated precompiles, account abstraction, and transaction signature aggregation with leanVM.

The foundation announced two $1 million prize competitions to strengthen cryptographic foundations.

The newly launched Poseidon Prize targets the hardening of the Poseidon hash function, while the existing Proximity Prize continues to drive hash-based cryptography research.

“We are betting big on hash-based cryptography to enjoy the strongest and leanest cryptographic foundations,” Drake stated.
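Hash-based designs are attractive for post-quantum security because they lean only on the preimage resistance of a hash function, not on the number-theoretic assumptions that Shor’s algorithm breaks. As a rough illustration of the primitive (a toy sketch, not the foundation’s actual construction), here is a Lamport one-time signature in Python built on SHA-256:

```python
import os
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, message: bytes):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    # Reveal one secret per message bit; a key pair must never sign twice.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, message: bytes, sig) -> bool:
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(sk, b"post-quantum hello")
assert verify(pk, b"post-quantum hello", sig)
assert not verify(pk, b"tampered message", sig)
```

Each key pair can sign only once, and practical schemes stack many such one-time keys into many-time constructions at the cost of much larger signatures, which is one reason signature aggregation work like leanVM’s matters.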

Multi-client post-quantum consensus development networks are already operational, with pioneer teams including Zeam, Ream Labs, PierTwo, Gean client, and Ethlambda working alongside established consensus clients Lighthouse, Grandine, and Prysm.

Weekly post-quantum interop calls, coordinated by Will Corcoran, are managing collaborative technical development across these diverse implementation teams.

The foundation will host a three-day expert workshop in October, bringing together top specialists from around the world, building on last year’s post-quantum workshop in Cambridge.

An additional dedicated post-quantum day is scheduled for March 29 in Cannes, ahead of EthCC, to create multiple forums for advancing research and coordination across the global Ethereum development community.

Industry Voices Split on Timeline Urgency

The quantum threat has divided blockchain leaders on both timeline predictions and strategic priorities.

Independent Ethereum educator sassal.eth called quantum computing “a very real threat for blockchains” that is “coming sooner than most people think,” praising the foundation’s defensive preparations.

Pantera Capital General Partner Franklin Bi predicted that traditional financial institutions will struggle with the transition to post-quantum cryptography.

“People are over-estimating how quickly Wall Street will adapt to post-quantum cryptography,” Bi said, adding that blockchain networks possess unique capabilities for system-wide upgrades at a global scale.

The post-quantum race begins.

My prediction:

People are over-estimating how quickly Wall Street will adapt to post-quantum cryptography. Like any systemic software upgrade, it'll be slow & chaotic with single points of failure for years. Traditional systems are only as strong… https://t.co/6mEdFKcXrm

— Franklin Bi (@FranklinBi) January 23, 2026

He argued that successful quantum resistance could transform select blockchains into “post-quantum safe havens for data and assets,” particularly as traditional systems face prolonged periods of vulnerability due to single points of failure.

Bitcoin community assessments remain sharply contested. Vitalik Buterin previously shared Metaculus data showing a median 2040 timeline for quantum computers breaking modern cryptography, with roughly a 20% probability before the end of 2030.

Metaculus's median date for when quantum computers will break modern cryptography is 2040:https://t.co/Li8ni8A9Ox

Seemingly about a 20% chance it will be before end of 2030.

— vitalik.eth (@VitalikButerin) August 27, 2025

Blockstream CEO Adam Back has dismissed near-term concerns, claiming practical threats remain decades away and accusing critics of creating unnecessary market alarm.

Project ZKM contributor Stephen Duan acknowledged transition challenges while calling quantum resistance “inevitable,” noting that his team will soon upgrade multiset hashing to a lattice-based construction.

ZKsync inventor Alex Gluk also said the network’s Airbender prover is already “100% PQ-proof,” highlighting Ethereum’s unmatched ability to adapt to emerging threats while maintaining its position as the global financial settlement layer.

Foundation Plans Comprehensive Roadmap Release

The Ethereum Foundation will publish detailed strategic guidance on pq.ethereum.org covering full transition planning to achieve zero loss of funds and zero downtime over the coming years.

Drake highlighted recent artificial intelligence breakthroughs in formal proof generation, noting that specialized mathematics AI recently completed one of the hardest lemmas in hash-based SNARK foundations in a single eight-hour run costing $200.

The foundation is developing educational materials, including a six-part video series with ZKPodcast and enterprise-focused resources through EF Enterprise Acceleration.

Quantum Threatens $600B of Bitcoin 🎧🤖@nic_carter joins me for an in-person @PodcastDelphi to cover his 6 months of research on Quantum's effect on $BTC

Nic's first and only podcast on Quantum

Listen directly here, or on any of the links below pic.twitter.com/CSnv7xekqn

— Tommy (@Shaughnessy119) January 9, 2026

Ethereum also has representation on the quantum advisory board Coinbase announced this week, which brings together leading cryptography researchers to assess long-term blockchain security risks as quantum computing capabilities advance across government and private-sector development programs.

The post Ethereum Launches $2M Quantum Defense Team as Threat Timeline Accelerates appeared first on Cryptonews.


Coinbase Announces New Board Of Experts To Combat Rising Quantum Computing Risks

23 January 2026 at 01:00

The crypto industry is preparing for a potential security challenge with the anticipated arrival of quantum computing. In response to this potential threat, Coinbase (COIN) has announced the formation of an advisory board composed of external experts. 

Coinbase Chief Security Officer’s Warning 

According to a report from Fortune, the newly established board includes academics from Stanford, Harvard, and the University of California, specializing in fields like computer science, cryptography, and fintech. 

Officially titled the Coinbase Independent Advisory Board on Quantum Computing and Blockchain, the group also features experts from the Ethereum Foundation, the decentralized finance (DeFi) platform EigenLayer, and Coinbase itself.

Jeff Lunglhofer, Coinbase’s Chief Information Security Officer, elaborated on the potential impact of quantum computing on current encryption methods. 

He explained that the encryption protecting wallets and private keys of Bitcoin (BTC) holders relies on complex mathematical problems that would take conventional computers thousands of years to solve. 

However, with the computational power that quantum computers promise—potentially a million times greater—these problems could be solved much more swiftly, Lunglhofer asserted.
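To make the asymmetry concrete: a wallet’s public key is its private key multiplied by a fixed generator point on the secp256k1 curve, so the forward computation is cheap, while the reverse, the elliptic-curve discrete logarithm, is what would take classical machines those thousands of years. It is also exactly the problem Shor’s algorithm solves efficiently. A minimal sketch of the forward direction in pure Python (toy key, illustration only):

```python
# secp256k1 domain parameters (the curve used by Bitcoin and Ethereum)
p  = 2**256 - 2**32 - 977
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def point_add(P, Q):
    """Affine point addition on y^2 = x^3 + 7 over F_p (None = infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = 3 * x1 * x1 * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p   # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Double-and-add: computes k * P in O(log k) point operations."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

priv = 0xC0FFEE                    # toy value; real keys are ~256 random bits
pub = scalar_mult(priv, (Gx, Gy))  # fast in the forward direction
# Recovering `priv` from `pub` is the hard problem quantum machines threaten.
```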

Although the security implications of quantum computing are genuine, Lunglhofer offered reassurance that they are not expected to become an immediate concern for at least a decade. The purpose of the new advisory board is to examine the upcoming challenges posed by quantum computing in a measured manner. 

This involves fostering initiatives within the blockchain industry that are reportedly already underway to enhance the resilience of Bitcoin and other networks against quantum attacks.

Blockchain Networks Expected To Implement Larger Keys

At present, Bitcoin secures its wallets through private keys, which consist of long strings of random characters. These keys are known to their owners but can otherwise only be guessed through extensive trial-and-error computation. 

The advent of quantum computing, however, would make it feasible to deduce private keys in a fraction of that time. 

In response to this looming threat, Fortune disclosed that blockchain experts speculate that networks will implement larger keys and add “noise” to obscure their locations, making them more difficult to detect. Implementing these defensive upgrades across blockchain networks is said to take several years. 

In the meantime, the newly formed Coinbase Advisory Board is gearing up to publish research papers and issue position statements aimed at helping the cryptocurrency industry brace for the impacts of quantum computing. 

Their first paper, which will address quantum’s influence on the consensus and transaction layers of blockchain, is expected to be released within the next couple of months.


At the time of writing, Coinbase’s stock, which trades under the ticker symbol COIN on the Nasdaq, is trading at $225.10. This represents a slight drop of 1.2% over the last 24 hours. 

Featured image from OpenArt, chart from TradingView.com 

Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow

22 January 2026 at 13:58

Bitcoin Magazine

Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow

Earlier this week, Coinbase announced the creation of an Independent Advisory Board on Quantum Computing and Blockchain, aiming to safeguard the crypto ecosystem against emerging quantum threats.

The board will bring together leading experts in quantum computing, cryptography, and blockchain to assess risks and provide guidance to the broader industry.

Quantum computers, if scaled successfully, could compromise the cryptography that underpins major blockchains like Bitcoin and Ethereum. Coinbase, in their announcement, stressed that preparing for these future challenges is crucial to maintaining the security of digital assets.

The advisory board includes notable figures such as quantum computing pioneer Scott Aaronson, Stanford cryptography expert Dan Boneh, Ethereum researcher Justin Drake, and Coinbase’s own Head of Cryptography, Yehuda Lindell. 

The group says they will publish position papers, recommend best practices for long-term security, and respond to significant advances in quantum computing.

This initiative is part of Coinbase’s larger post-quantum security strategy, which also includes updating Bitcoin address handling, enhancing internal key management, and advancing research on post-quantum signature schemes. The board’s first position paper is expected early next year, laying out a roadmap for quantum resilience in blockchain systems.

Coinbase said the move underscores the importance of proactive planning, ensuring the crypto industry remains prepared, not reactive, as quantum technology evolves.

Is Bitcoin at Risk From Quantum Computing?

Over the last several months, concerns over quantum computing’s potential impact on Bitcoin have begun to ripple through traditional finance, prompting some investors to radically rethink their exposure to the cryptocurrency. 

Jefferies strategist Christopher Wood recently removed Bitcoin from his Greed & Fear model portfolio, citing the existential risk that large-scale quantum computers could undermine the cryptographic foundations securing digital assets. 

While the threat is not imminent, Wood and other institutional voices — including BlackRock and UBS CEO Sergio Ermotti — warn that quantum advances could eventually allow attackers to derive private keys from public ones, putting millions of BTC at risk. 

As a result, Wood replaced Bitcoin with gold and gold-mining equities, emphasizing that long-term store-of-value claims for digital assets may be less reliable in the face of accelerating technological change.

The debate over quantum computing in the Bitcoin ecosystem is intensifying. Coinbase research indicates that roughly 20% to 50% of Bitcoin’s supply, particularly coins in older wallet formats, could be vulnerable to so-called long-range quantum attacks. 

Crypto developers and researchers are divided over the urgency of implementing quantum-resistant solutions, with some advocating proactive upgrades and others arguing the risk remains distant. 

Strategy Chairman Michael Saylor believes that quantum computing will actually strengthen Bitcoin rather than threaten it. Network upgrades and coin migrations will boost security, while lost coins remain frozen, Saylor posted.

This post Coinbase Forms Quantum Computing Advisory Board as Bitcoin Security Concerns Grow first appeared on Bitcoin Magazine and is written by Micah Zimmerman.

Coinbase Forms Expert Board to Prepare Bitcoin for Quantum Computing Risks

By: Amin Ayan
22 January 2026 at 06:54

Coinbase has launched an independent advisory board aimed at preparing Bitcoin and the broader blockchain ecosystem for the long-term risks posed by quantum computing, as advances in the field raise questions about the durability of today’s cryptographic standards.

Key Takeaways:

  • Coinbase is taking early steps to address potential quantum threats to blockchain security.
  • An independent expert board will assess risks and publish guidance for the crypto industry.
  • The goal is to prepare years in advance before quantum computing becomes a real threat.

Quantum computers, once developed at scale, could disrupt industries ranging from healthcare and finance to national security, the exchange said in a recent blog post.

For blockchain networks, the implications are particularly serious. Most major chains, including Bitcoin and Ethereum, rely on elliptic-curve cryptography, a system considered secure today but potentially vulnerable to sufficiently powerful quantum machines in the future.

Coinbase Launches Independent Advisory Board to Address Quantum Computing Risks

To address that possibility, Coinbase is forming the Coinbase Independent Advisory Board on Quantum Computing and Blockchain, bringing together leading researchers to assess emerging risks and offer guidance to developers, institutions, and users.

According to Coinbase, the board will operate independently and publish position papers evaluating the state of quantum computing and its implications for blockchain security.

It will also issue practical recommendations on how individuals and organizations can prepare for long-term quantum threats, and provide timely analysis when major breakthroughs in quantum research occur.

The advisory board includes several prominent figures from cryptography, quantum computing, and blockchain research.

Quantum Threatens $600B of Bitcoin 🎧🤖@nic_carter joins me for an in-person @PodcastDelphi to cover his 6 months of research on Quantum's effect on $BTC

Nic's first and only podcast on Quantum

Listen directly here, or on any of the links below pic.twitter.com/CSnv7xekqn

— Tommy (@Shaughnessy119) January 9, 2026

Members include Scott Aaronson, a leading quantum computing researcher and director of the Quantum Information Center at the University of Texas at Austin, Stanford cryptography professor Dan Boneh, Ethereum Foundation researcher Justin Drake, EigenLayer founder Sreeram Kannan, Coinbase head of cryptography Yehuda Lindell, and Dahlia Malkhi, a specialist in secure distributed systems and head of the UCSB Foundations of Fintech Research Lab.

Coinbase says the group’s collective expertise is intended to help the industry move beyond theoretical discussions and toward concrete planning.

While large-scale quantum computers capable of breaking current cryptography do not yet exist, the company argues that preparation must begin years in advance.

Coinbase plans to publish the board’s first position paper early next year, outlining a baseline assessment of quantum-related risks and potential paths toward resilience.

Coinbase Says Tokenization Can Open Global Capital Markets to Billions Left Out

As reported, Coinbase CEO Brian Armstrong has outlined a plan to expand access to global capital markets through blockchain-based tokenization, arguing that billions of adults remain locked out of equity and bond investing.

In a new policy paper, Coinbase says structural barriers have excluded nearly two-thirds of the world’s adult population from wealth creation as returns on capital continue to outpace wages.

The paper highlights sharp geographic and economic divides in market participation. While more than half of adults in the US invest in equities or bonds, participation falls below 10% in countries such as China and India.

Armstrong argues that access is largely determined by where someone is born, not their talent, pointing to extreme home bias that keeps investors concentrated in local markets despite limited exposure to global growth.

The post Coinbase Forms Expert Board to Prepare Bitcoin for Quantum Computing Risks appeared first on Cryptonews.

Jefferies’ Analyst Dumps Bitcoin Over Quantum Computing Fears, Buys Gold

16 January 2026 at 15:19

Bitcoin Magazine

Jefferies’ Analyst Dumps Bitcoin Over Quantum Computing Fears, Buys Gold

Christopher Wood, global head of equity strategy at Jefferies, has eliminated Bitcoin from his flagship Greed & Fear model portfolio, citing concerns that developments in quantum computing could pose an existential threat to the cryptocurrency’s cryptographic foundations.

In the latest edition of the widely followed newsletter, Wood confirmed that Jefferies has removed its entire 10% Bitcoin allocation, replacing it with a split allocation of 5% to physical gold and 5% to gold-mining equities, according to Bloomberg. 

The strategist said the move reflects rising uncertainty over whether Bitcoin can maintain its role as a long-term store of value in the face of accelerating technological change.

“While Greed & Fear does not believe that the quantum issue is about to hit the Bitcoin price dramatically in the near term, the store-of-value concept is clearly on less solid foundation from the standpoint of a long-term pension portfolio,” Wood wrote.

Wood was an early institutional supporter of Bitcoin, first adding it to the model portfolio in December 2020 amid pandemic-era stimulus and fears of fiat currency debasement. He later increased the allocation to 10% in 2021.

Since that initial inclusion, Bitcoin has risen approximately 325%, compared with a 145% gain in gold over the same period.

Quantum computing presents structural risks to Bitcoin 

Despite the strong performance, Wood argues that quantum computing presents a structural risk that cannot be ignored. Bitcoin’s security relies on cryptographic algorithms that are effectively unbreakable using classical computers. 

However, sufficiently powerful quantum machines could theoretically derive private keys from public keys, enabling unauthorized transfers and undermining confidence in the network.

Security researchers estimate that roughly 20% to 50% of Bitcoin’s total supply — between 4 million and 10 million BTC — could be vulnerable under certain conditions. 

Coinbase researchers have identified approximately 6.5 million BTC held in older wallet formats where public keys are already exposed on-chain, making them susceptible to so-called long-range quantum attacks.
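The “older wallet formats” point comes down to what an output publishes. Early pay-to-public-key outputs put the public key itself on-chain, while pay-to-public-key-hash addresses publish only a hash of it, so the key stays hidden from a would-be quantum attacker until its owner first spends and reveals it. A small sketch of that hashing step (hypothetical key; note that RIPEMD-160 support in hashlib depends on the local OpenSSL build):

```python
import hashlib

def hash160(pubkey: bytes) -> bytes:
    # SHA-256 then RIPEMD-160: the 20-byte digest behind P2PKH addresses.
    # RIPEMD-160 may be unavailable if the local OpenSSL omits legacy hashes.
    sha = hashlib.sha256(pubkey).digest()
    return hashlib.new("ripemd160", sha).digest()

# A compressed secp256k1 public key: 33 bytes (02/03 prefix + x-coordinate).
pubkey = bytes.fromhex(
    "0279be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798"
)

exposed = pubkey            # P2PK: the key itself sits on-chain from day one
shielded = hash160(pubkey)  # P2PKH: only the hash is public until first spend
print(exposed.hex(), shielded.hex(), sep="\n")
```

Under this model, coins on never-reused hashed addresses are exposed only in the short window between broadcast and confirmation, whereas P2PK-era coins, including many presumed lost, remain exposed indefinitely.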

The issue has sparked a growing divide within the Bitcoin ecosystem. Some think that developers are underestimating the risk. Others, including Blockstream CEO Adam Back, maintain that the threat remains distant and that quiet preparatory work toward quantum-resistant signatures is preferable to alarming investors.

The debate has also begun to reach mainstream finance. BlackRock has listed quantum computing as a potential long-term risk in its spot Bitcoin ETF disclosures, while Solana co-founder Anatoly Yakovenko recently suggested there is a 50% chance of a meaningful quantum breakthrough within five years.

For Wood, the uncertainty itself strengthens the case for gold.

He described the metal as a historically tested hedge in an increasingly volatile geopolitical and technological landscape, concluding that the long-term questions raised by quantum computing are “only positive for gold.”

Gold climbed to record highs this month, topping $4,600 per ounce, as investors piled into the safe-haven asset amid escalating geopolitical tensions involving Iran and growing expectations that the Federal Reserve will cut interest rates following softer U.S. inflation and labor market data.

This post Jefferies’ Analyst Dumps Bitcoin Over Quantum Computing Fears, Buys Gold first appeared on Bitcoin Magazine and is written by Micah Zimmerman.

Fujitsu, SC Ventures Unveil Roadmap for Quantum Joint Venture

16 January 2026 at 07:13

With Qubitra, they intend to drive practical value by applying quantum approaches to some of the finance sector’s most complex problems.

The post Fujitsu, SC Ventures Unveil Roadmap for Quantum Joint Venture appeared first on TechRepublic.


AI may not be the federal buzzword for 2026

Let’s start with the good news: artificial intelligence may NOT be the buzzword for 2026.

What will be the most talked about federal IT and/or acquisition topic for this year remains up for debate. While AI will definitely be part of the conversation, at least some experts believe other topics will emerge over the next 12 months. These range from the Defense Department’s push for “speed to capability” to resilient innovation to workforce transformation.

Federal News Network asked a panel of former federal technology and procurement executives for their opinions on which federal IT and acquisition storylines they are following over the next 12 months. If you’re interested in previous years’ predictions, here is what experts said about 2023, 2024 and 2025.

The panelists are:

  • Jonathan Alboum, federal chief technology officer for ServiceNow and former Agriculture Department CIO.
  • Melvin Brown, vice president and chief growth officer at CANI and a former deputy CIO at the Office of Personnel Management.
  • Matthew Cornelius, managing director of federal industry at Workday and former OMB and Senate staff member.
  • Kevin Cummins, a partner with the Franklin Square Group and former Senate staff member.
  • Michael Derrios, the new executive director of the Greg and Camille Baroni Center for Government Contracting at George Mason University and former State Department senior procurement executive.
  • Julie Dunne, a principal with Monument Advocacy and former commissioner of GSA’s Federal Acquisition Service.
  • Mike Hettinger, founding principal of Hettinger Strategy Group and former House staff member.
  • Nancy Sieger, a partner at Guidehouse’s Financial Services Sector and a former IRS CIO.

What are two IT or acquisition programs/initiatives that you are watching closely for signs of progress and why?

Brown: Whether AI acquisition governance (templates, clauses, evaluation norms) becomes standard. 2026 is where agencies turn OMB AI memos into repeatable acquisition artifacts, through solicitation language, assurance evidence, testing/monitoring expectations and privacy and security gates. The 2025 memos are the anchor texts. I’m watching for signals such as common clause libraries, governmentwide “minimum vendor evidence” and how agencies operationalize “responsible AI” in source selections.

The Cybersecurity Maturity Model Certification (CMMC) phased rollout and how quickly it becomes a de facto barrier to entry. Because the rollout is phased over multiple years starting in November 2025, 2026 is the first full year where you can observe how often contracting officers insert the clause and how primes enforce flow-downs. The watch signals include protest activity, supply-chain impacts and whether smaller firms get crowded out or supported.

Hettinger: Related to the GSA OneGov initiative, there’s continuing pressure on the middleman, that is to say resellers and systems integrators, to deliver more value for less. This theme emerged in early 2025, but it will continue to be front and center throughout 2026. How those facing the pressure respond to the government’s interests will tell us a lot about how IT acquisition is going to change in the coming years. I’ll be watching that closely.

Mike Hettinger is president and founding principal of Hettinger Strategy Group and former staff director of the House Oversight and Government Reform Subcommittee on Government Management.

The other place to watch more broadly is how the government is going to leverage AI. If 2025 was about putting the pieces in place to buy AI tools, 2026 is going to be about how agencies are able to leverage those tools to bring efficiency and effectiveness in a host of new areas.

Cornelius: The first is watching the Hill to see if the Senate can finally get the Strengthening Agency Management and Oversight of Software Assets (SAMOSA) Act passed and to the President’s desk. While a lot of great work has already happened — and will continue to happen — at GSA around OneGov, there is only so much they can do on their own. If Congress forces agencies to do the in-depth analysis and reporting required under SAMOSA, it will empower GSA, as well as OMB and Congress, to have the type of data and insights needed to drive OneGov beyond just cost savings to more enterprise transformation outcomes for their agency customers. This would generate value at an order of magnitude beyond what they have achieved thus far.

The second is the implementation of the recent executive order that created the Genesis Mission initiative. The mission is focused on ensuring that the Energy Department and the national labs can hire the right talent and marshal the right resources to help develop the next generation of biotechnology, quantum information science, advanced manufacturing and other critical capabilities that will empower America’s global leadership for the next few generations. Seeing how DOE and the Office of Science and Technology Policy (OSTP) partner with industry to execute this aspirational, but necessary, nationwide effort will be revelatory and insightful.

Cummins: Will Congress reverse its recent failure to reauthorize the Technology Modernization Fund (TMF)? President Donald Trump stood up the TMF during his first term and it saw a significant funding infusion by President Joe Biden. Watching the TMF just die with a whimper will make me pessimistic about reviving the longstanding bipartisan cooperation on modernizing federal IT that existed before the Department of Government Efficiency (DOGE).

I will be closely watching how well the recently-announced Tech Force comes together. Its goal of recruiting top engineers to serve in non-partisan roles focused on technology implementation sounds a lot like the U.S. Digital Service started by President Barack Obama, which then became the U.S. DOGE Service. I would like to see Tech Force building a better government with some of the enthusiasm that DOGE showed for cutting it.

Sieger: I’m watching intensely how agencies manage the IT talent exodus triggered by DOGE-mandated workforce reductions and return-to-office requirements. The unintended consequence we’re already observing is the disproportionate loss of mid-career technologists, the people who bridge legacy systems knowledge with modern cloud and AI capabilities.

Agencies are losing their most marketable IT talent first, while retention of personnel managing critical legacy infrastructure creates technical debt time bombs. At Guidehouse, we’re fielding unprecedented requests for cybersecurity, cloud architecture and data engineering services. The question heading into 2026 is whether agencies can rebuild sustainable IT operating models or whether they become permanently dependent on contractor support, fundamentally altering the government’s long-term technology capacity.

The real risk, I predict, is that mission-critical systems are losing institutional knowledge faster than documentation or modernization can compensate. Agencies need to watch for and mitigate increased system outages, security incidents, and failed modernization projects as this workforce disruption cascades through 2026.

Sticking with the above theme, it does bear watching how the new federal Tech Force hiring initiative succeeds. The federal Tech Force initiative signals a major shift in how the federal government sources and deploys modern technology talent. As agencies bring in highly skilled technologists focused on AI, cloud, cybersecurity and agile delivery, the expectations for speed, engineering rigor and product-centric outcomes will rise. This will reshape how agencies engage industry partners, favoring firms that can operate at comparable technical and cultural velocity.

The initiative also introduces private sector thinking into government programs, influencing requirements, architectures and vendor evaluations. This creates both opportunity and pressure. Organizations aligned to modern delivery models will gain advantage, while legacy approaches may struggle to adapt. Federal Tech Force serves as an early indicator of how workforce decisions are beginning to influence acquisition approaches and modernization priorities across government.

Dunne: Title 41 acquisition reform. The House Armed Services Committee and House Oversight Committee worked together to pass a 2026 defense authorization bill out of the House with civilian or governmentwide (Title 41) acquisition reform proposals. These reform proposals in the House NDAA bill included increasing various acquisition thresholds (micro-purchase and simplified acquisition thresholds and cost accounting standards) and language on advance payments to improve buying of cloud solutions. Unfortunately, these governmentwide provisions were left out of the final NDAA agreement, leaving, in some cases, different rules for the civilian and defense sectors. I’m hopeful that Congress will try again on governmentwide acquisition reform.

Office of Centralized Acquisition Services (OCAS). GSA launched OCAS late this year to consolidate and streamline contracting for common goods and services in accordance with the March 2025 executive order (14240). Always a good exercise to think about how to best consolidate and streamline contracting vehicles. We’ve been here before and I think OCAS has a tough mission as agencies often want to do their own thing.  If given sufficient resources and leadership attention, perhaps it will be different this time.

FedRAMP 20x. Earlier this year, GSA’s FedRAMP program management office launched FedRAMP 20x to reform the process and bring efficiencies through automation and expand the availability of cloud service provider products for agencies. All great intentions, but as we move into the next phase of the effort and into FedRAMP moderate type solutions, I hope the focus remains on the security mission and the original intent to measure once, use many times for the benefit of agencies. Also, FedRAMP authorization expires in December 2027 – which is not that far away in congressional time.

Alboum: In the coming year, I’m paying close attention to how agencies manage AI efficiency and value as they move from pilots to production. As budgets tighten, agencies need a clearer picture of which models are delivering results, which aren’t, and where investments are being duplicated.

I’m also watching enterprise acquisition and software asset management efforts. The Strengthening Agency Management and Oversight of Software Assets (SAMOSA) Act has been floating around Congress for the last few years. I’m curious to see whether it will ultimately become law. Its provisions reflect widely acknowledged best practices for controlling software spending and align with the administration’s PMA objective to “consolidate and standardize systems, while eliminating duplicative ones.” How agencies manage their software portfolios will be a crucial test of whether efficiency goals are turning into lasting structural change, or just short-term fixes.

Derrios: How GSA’s OneGov initiative shapes up will be important, because contract consolidation without an equal focus on demand forecasting, standardization and potential requirements aggregation may not yield the intended results. There needs to be a strong focus on acquisition planning between GSA and its federal agency customers in addition to any movement of contracts.

In 2025, the administration revamped the FAR, which hadn’t been reviewed holistically in 40 years. So in 2026, what IT/acquisition topic(s) would you like to see the administration take on that has long been overlooked and/or underappreciated for the impact change and improvements could have, and why?

Cummins: Despite the recent Trump administration emphasis on commercialization, it is still too hard for innovative companies to break into the federal market. Sometimes agencies will move mountains to urgently acquire a new technology, like we have seen recently with some artificial intelligence and drones initiatives. But a commercial IT company generally has to partner with a reseller and get third-party accreditation (CMMC, FedRAMP, etc.) just to get access to a federal customer. Moving beyond the FAR rewrite, could the government give up some of the intellectual property and other requirements that make it difficult for commercial companies to bid as a prime or sell directly to an agency outside of an other transaction agreement (OTA)? It would also be helpful to see more FedRAMP waivers for low-risk cloud services.

Cornelius: It’s been almost 50 years since foundational law and policy set the parameters we still follow today around IT accessibility. During my time in the Senate, I drafted the provision in the 2023 omnibus appropriations bill that required GSA and federal agencies to perform comprehensive assessments of accessibility compliance across all IT and digital assets throughout the government. Now, with a couple years of analysis and with many thoughtful recommendations from GSA and OMB, it is time for Congress to make critical updates in law to improve the accessibility of any capabilities the government acquires or deploys. 2026 could be a year of rare bipartisan, bicameral collaboration on digital accessibility, which could then underpin the administration’s American by Design initiative and ensure important accessibility outcomes from all vendors serving government customers are delivered and maintained effectively.

Derrios: The federal budgeting process really needs a reboot. Static budgets do not align with multi-year missions where risks are continuous, technology changes at lightning speed, and world events impact aging cost estimates. And without a real “return on investment” mentality incorporated into the budgeting process, under-performing programs with high sunk-costs will continue to be supported. But taxpayers shouldn’t have to sit through a bad movie just because they already paid for the ticket.

Brown: I’m watching how agencies continue to move toward the implementation of zero trust and how the data layer becomes the budget fight. With federal guides emphasizing data security, the 2026 question becomes, do programs converge on fewer, interoperable controls, or do they keep buying overlapping tools? My watch signals include requirements that prioritize data tagging/classification, attribute-based access, encryption/key management and auditability as “must haves” in acquisitions.

Alboum: Over the past few years, the federal government has made significant investments in customer experience and service delivery. The question now is whether those gains can be sustained amid federal staffing reductions.

Jonathan Alboum is a former chief information officer at the Agriculture Department and now federal chief technology officer for ServiceNow.

This challenge is closely tied to the “America by Design” executive order, which calls for redesigned websites where people interact with the government. A beautiful, easy-to-use website is an excellent start. However, the public expects a great end-to-end experience across all channels, which aligns directly with the administration’s PMA objective to build digital services for “real people, not bureaucracy.”

So, I’ll be watching to see if we meet these expectations by investing in AI and other technologies to lock in previous gains and improve the way we serve the public. With the proper focus, I’m confident that we can positively impact the public’s perception and trust in government.

Hettinger: Setting aside the known and historic challenges with the TMF, we really do need to figure out how to buy IT more effectively, at a pace consistent with the needs of agencies. Maybe some of that is addressed in the FAR changes, but those are only going to take us so far (no pun intended). If we think outside the box, maybe we can find a way to make real progress in IT funding and acquisition in a way that gets the right technology tools in the hands of the right people more quickly.

Dunne: I think follow-through on the initiatives launched in 2025 will be important to focus on in 2026. The formal rulemaking process for the RFO will launch in 2026 and will be an important part of that follow-through. And now that we have a confirmed Office of Federal Procurement Policy administrator, I think 2026 will be an important year for industry engagement on topics like the RFO.

Sieger: If the administration could tackle one long-overlooked issue with transformative impact, it should be modernizing how security clearances are granted, maintained and reciprocally recognized for contractor personnel supporting federal IT initiatives.

The current clearance system regularly creates 6-to-12 month delays in staffing critical IT programs, particularly in cybersecurity and AI. Agencies lose qualified contractors to private sector opportunities during lengthy adjudication periods. The lack of true clearance reciprocity means contractors moving between agency projects often restart the process, wasting resources and creating knowledge gaps on programs.

This is a strategic vulnerability. Federal IT modernization depends on contractor expertise for specialized skills government cannot hire directly. When clearance processes take longer than typical IT project phases, agencies either compromise on talent quality or delay mission-critical initiatives. The opportunity cost is measured in delayed outcomes and increased cyber risk.

Implementing continuous vetting for contractor populations, establishing true cross-agency clearance reciprocity, and creating “clearance portability” would benefit emerging technology areas such as AI, quantum and advanced cybersecurity, where talent competition is fiercest. From Guidehouse’s perspective, we see clients repeatedly unable to staff approved projects because cleared personnel aren’t available, not because talent doesn’t exist.

This reform would have cascading benefits: faster modernization, better talent retention, reduced costs and improved security through continuous monitoring rather than point-in-time investigations.

If 2025 has been all about cost savings and efficiencies, what do you think will emerge as the buzzword of 2026?

Brown: “Speed to capability” acquisition models spreading beyond DoD. The drone scaling example is a concrete indicator of a broader push. The watch signals for me are increased use of rapid pathways, shorter contract terms, modular contracting and more frequent recompetes to keep pace with technology change.

Cornelius: Governmentwide human resource transformation.

Julie Dunne, a former House Oversight and Reform Committee staff member for the Republicans, a former commissioner of the Federal Acquisition Service at the General Services Administration, and now a principal at Monument Advocacy.

Dunne: AI again. How the government uses it to facilitate delivery of citizen services and how AI tools will assist with the acquisition process, and AI-enabled cybersecurity attacks. I know that’s not one word, but it’s a huge risk to watch and only a matter of time before our adversaries find success in attacking federal systems with an AI-enabled cyberattack, and federal contractors will be on the hook to mitigate such risks.

Cummins: Fraud prevention. While combating waste, fraud and abuse is a perennial issue, the industrial scale fraud revealed in Minnesota highlights a danger from how Congress passed COVID pandemic-era spending packages without the same level of checks and balances that were put in place for earlier Obama-era stimulus spending. Federal government programs generally still have a lot of room for improvement when it comes to preventing improper payments, such as by using better identity and access management and other security tools. Stopping fraud is also one of the few remaining areas of bipartisan agreement among policymakers.

Hettinger: DOGE may be gone, or maybe it’s not really gone, but I don’t know that cost savings and efficiencies are going to be pushed to the backburner. This administration comes at everything — at least from an IT perspective — as believing it can be done better, faster and cheaper. I expect that to continue not just into 2026 but for the rest of this administration.

Derrios: I think there will have to be a focus on how government needs and requirements are defined and how the remaining workforce can upskill to use technology as a force multiplier. If you don’t focus on what you’re buying and whether it constitutes a legitimate mission support need, any cost savings gained in 2025 will not be sustainable long-term. Balancing speed-to-contract and innovative buying methodologies with real requirements rigor is critical. And how your federal workforce uses the tools in the toolbox to yield maximum outcomes while trying to do more with less is going to take focused leadership. To me, all of this culminates in one word for 2026, and that’s producing “value” for federal missions.

Sieger: Resilient innovation. While 2025 focused intensely on cost savings and efficiencies, particularly through DOGE-mandated cuts, 2026’s emerging buzzword will be “resilient innovation.” Agencies are recognizing the need to continue advancing technological capabilities while maintaining operational continuity under constrained resources and heightened uncertainty.

The efficiency drives of 2025 exposed real vulnerabilities. Agencies lost institutional knowledge, critical systems became more fragile, and the pace of modernization actually slowed in many cases as talent departed and budgets tightened. Leaders now recognize that efficiency without resilience creates brittleness—systems that work well under ideal conditions but fail catastrophically when stressed.

Resilient innovation captures the dual mandate facing federal IT in 2026: Continue modernizing and adopting transformative technologies like AI, but do so in ways that don’t create new single points of failure, vendor dependencies or operational risks. It’s about building systems and capabilities that can absorb shocks — whether from workforce turnover, budget cuts, cyber incidents or geopolitical disruption — while still moving forward.

Alboum: Looking ahead, governance will take center stage across government. As AI, data and cybersecurity continue to scale, agencies will need stronger oversight, greater transparency and better coordination to manage complexity and maintain public trust. Governance won’t be a side conversation — it will be the foundation for everything that comes next.

Success will no longer be measured by how much AI is deployed, but by whether it is secure, compliant and delivering tangible mission value. The conversation will shift from “Do we have AI?” to “Is our AI safe, accurate and worth the investment?”

The post AI may not be the federal buzzword for 2026 first appeared on Federal News Network.

© Getty Images/Greggory DiSalvo

Cisco Quantum – Simply Network All the Quantum Computers

26 September 2025 at 12:29
S. Schuchart

Cisco’s Quantum Labs research team, part of Outshift by Cisco, has announced the completion of a full software solution prototype. The latest piece is the Cisco Quantum Compiler prototype, designed for distributed quantum computing across networked processors. In short, it allows a network of quantum computers, of all types, to participate in solving a single problem. Even better, the new compiler supports distributed quantum error correction: instead of a single quantum computer needing a huge number of qubits itself, the load can be spread out among multiple quantum computers. This coordination is handled across a quantum network, powered by Cisco’s Quantum Network entanglement chip, which was announced in May 2025. The same network could also be used to secure communications for traditional servers.

For some quick background: one of the factors holding quantum computers back is the lack of both quantity and quality in qubits. Most of the remarkable things quantum computers can in theory do require thousands or even millions of qubits, while today’s systems offer around a thousand. Quality matters just as much, because qubits are extremely susceptible to outside interference. To address the quality problem, a considerable amount of work has gone into error correction for qubits, but most quantum error correction routines require even more physical qubits to create stable logical qubits. Research has been ongoing across the industry; everyone is looking for a way to create large numbers of stable qubits.
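To see why error correction drives up qubit counts, consider the classical analogue: store one logical bit as three physical bits and decode by majority vote, so the logical bit survives any single flip. Quantum bit-flip codes rest on the same redundancy idea, with extra machinery to avoid directly measuring the data. A quick Python simulation of the classical version (an analogy only, not a quantum code):

```python
import random

def simulate(p_flip: float, trials: int = 100_000):
    """Compare a raw bit's error rate with a 3-bit repetition code."""
    raw_errors = code_errors = 0
    for _ in range(trials):
        # Unprotected: the single bit flips with probability p_flip.
        raw_errors += random.random() < p_flip
        # Protected: three copies flip independently; majority vote decodes.
        flips = sum(random.random() < p_flip for _ in range(3))
        code_errors += flips >= 2  # decoding fails only if 2+ copies flip
    return raw_errors / trials, code_errors / trials

raw, coded = simulate(0.01)
print(f"physical error rate ~{raw:.4f}, logical error rate ~{coded:.5f}")
# Analytically the logical rate is 3p^2(1-p) + p^3, about 0.0003 at p = 0.01.
# The price is 3x the physical bits; quantum codes pay a far larger overhead
# per stable logical qubit, which is the scaling wall Cisco is targeting.
```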

What Cisco is proposing is that instead of making a single quantum processor bigger to hold more qubits, multiple quantum processors can be strung together with its quantum networking technology, with distributed error correction ensuring the quality of the transmitted qubits. It’s an intriguing idea. As Cisco more or less points out, we didn’t achieve scale in traditional computing by making a single CPU bigger and bigger until it could handle all tasks; instead, multiple CPUs were integrated into a server, and those servers were networked together to share the load. That makes good sense, and it’s an interesting approach. Just as with traditional CPUs, quantum processors will not suddenly stop growing, but if this works it will allow those processors to scale at more modest sizes, possibly ushering in useful, practical quantum computing sooner.

Is this the breakthrough needed to bring about the quantum computing revolution? At this point it’s a prototype, not an extensively tested method. Quantum computing requires so much fundamental physics research and is so complicated that it’s extremely hard to say whether what Cisco is suggesting can usher in that new quantum age. But it is extremely interesting, and this approach will certainly be worth watching as Cisco ramps up its efforts in quantum technologies.
