Brazilian researchers developed an AI system that analyzes WhatsApp audio messages to identify depression, showing high accuracy and potential for low-cost, real-world mental health screening.
Waze has started a wider rollout of previously announced road alerts, navigation improvements, and personalized routing, enhancing driver awareness and safety after delays since 2024.
Ars readers of a certain age no doubt remember the 1980s He-Man and the Masters of the Universe series (and its spinoff, She-Ra: Princess of Power) and the many, many offshoots of this hugely popular Mattel franchise, including an extensive line of action figures. Amazon MGM Studios no doubt hopes to cash in on any lingering nostalgia with its forthcoming film, Masters of the Universe. Judging by the extended teaser trailer, we're getting an origin story for He-Man.
It's not the first time someone has turned He-Man into a feature film: Dolph Lundgren starred in 1987's Masters of the Universe, a critical and box office bomb that also featured Frank Langella as arch-villain Skeletor. Its poor reception might have stemmed from the film's significant deviations from the original cartoon, which angered fans. But frankly, it was just a bad, cheesy movie, though it still has its share of cult fans today.
This latest big-screen live-action adaptation has been languishing in development hell for nearly two decades. There were rumors in 2007 that John Woo would direct a He-Man feature for Warner Bros., but the project never got the green light. Sony Pictures gained the rights in 2009, and there were multiple script rewrites and much shuffling of possible directors (with Jon M. Chu, McG, and David S. Goyer among the candidates).
Ever unlocked your iPhone and wondered where a specific app went? Seeing apps disappear can be alarming and leave you feeling suspicious. The truth is, your iPhone removes these apps intentionally. But why does this happen? Let’s explore the common reasons behind it and what you can do to prevent it.
For a long time, I thought I was losing my mind. Sometimes, when I open the Duolingo app, the number of Quest Points is lower than I remember having the day before. I was convinced I must be imagining things, but it turns out that this is happening to other people, too.
Replit’s new feature generates iOS apps from text prompts, integrates monetization, and streamlines App Store publishing, marking a major step in AI-driven software creation.
The Shortcuts app is a powerful, underrated tool, but creating automations is frustrating due to its confusing scripting system. Apple must fix it now that it has Gemini firepower at its disposal.
Whenever you buy used computers, there is a risk that they come with unpleasant surprises that are not of the insect variety. From Apple hardware that is iCloud-locked with the original owner MIA to PCs that have BIOS passwords, some of these are more severe than others. BIOS passwords tend to be more of an annoyance that’s easily fixed by clearing the CMOS memory, but this isn’t always the case, as [Casey Bralla] found with a formerly student-issued HP ProBook laptop purchased off Facebook Marketplace.
Perhaps because HP figured that locking down access to the BIOS is essential on systems that find their way into the hands of bored and enterprising students, these laptops write the encrypted password and associated settings to a separate flash memory chip. Although a master key purportedly exists, HP’s policy here is to replace the system board. Further, while there are some recovery options that do not involve reflashing this flash memory, they require answers to recovery questions.
This led [Casey] to try brute-force cracking, starting with a Rust-based project on GitHub that promised much but failed to even build. Undeterred, he tasked the Claude AI with writing a Python script to do the brute-forcing via the Windows-based HP BIOS utility. The chatbot was also asked to generate multiple lists of likely password candidates based on some human guesses.
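For the curious, here is a minimal sketch of what such a brute-force loop might look like. It assumes a hypothetical utility name and flag (the real HP tool's name and arguments will differ) and that an exit code of 0 signals an accepted password:

```python
# Hypothetical sketch only: "hp_bios_util.exe" and the "/cspwd" flag are
# placeholders, not the real HP utility's interface.
import subprocess

UTIL = r"C:\tools\hp_bios_util.exe"   # placeholder path to the Windows BIOS utility
WORDLIST = "candidates.txt"           # password candidates, e.g. the AI-generated lists

def try_password(pw: str) -> bool:
    """Attempt one password; assume the utility exits with 0 when it is accepted."""
    result = subprocess.run([UTIL, "/cspwd", pw], capture_output=True, text=True)
    return result.returncode == 0

def main() -> None:
    with open(WORDLIST, encoding="utf-8") as fh:
        for line in fh:
            pw = line.strip()
            if not pw:
                continue
            # In the setup described above, each attempt took roughly nine
            # seconds, dominated by the utility call itself.
            if try_password(pw):
                print(f"Password found: {pw}")
                return
    print("Exhausted the wordlist without a hit.")

if __name__ == "__main__":
    main()
```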
After six months of near-continuous attempts at nine seconds per try, this method failed to produce a hit, but at least the laptop can still be used, just without BIOS access. Getting rid of that pesky UEFI BIOS administrator password may require [Casey] to work up the courage to do some hardware hacking, which at least proves that HP’s BIOS security is fairly good.
A top House lawmaker is developing legislation to codify the National Institute of Standards and Technology’s Center for AI Standards and Innovation into law.
The move to codify CAISI comes as lawmakers and the Trump administration debate the federal government’s role in overseeing AI technology.
Rep. Jay Obernolte (R-Calif.), chairman of the House Science, Space and Technology Committee’s research and technology subcommittee, said he has a “forthcoming” bill dubbed the “Great American AI Act.”
During a Wednesday hearing, Obernolte said the bill will formalize CAISI’s role to “advance AI evaluation and standard setting.”
“The work it does in doing AI model evaluation is essential in creating a regulatory toolbox for our sectoral regulators, so everyone doesn’t have to reinvent the wheel,” Obernolte said.
Last September, the center released an evaluation of the Chinese “DeepSeek” AI model that found it lagged behind U.S. models on cost, security and performance. More recently, CAISI released a request for information on securing AI agent systems.
Despite the Trump administration’s rebranding, however, Obernolte noted the NIST center’s functions have largely stayed consistent. He argued codifying the center would provide stability.
“I think everyone would agree, it’s unhealthy for us to have every successive administration spin up a brand new agency that, essentially, is doing something with a long-term mission that needs continuity,” he said.
Obernolte asked Michael Kratsios, the director of the White House Office of Science and Technology Policy, what he thought about codifying the center into law.
Kratsios said CAISI is a “very important part of the larger AI agenda.” He also said it was important for the administration to reframe the center’s work around innovation and standards, rather than safety.
“It’s absolutely important that the legacy work around standards relating to AI are undertaken by CAISI, and that’s what they’re challenged to do,” Kratsios said. “And that’s the focus that they should have, because the great standards that are put out by CAISI and by NIST are the ones that, ultimately, will empower the proliferation of this technology across many industries.”
Later on in the hearing, Kratsios said the NIST center would play a key role in setting standards for “advanced metrology of model evaluation.”
“That is something that can be used across all industries when they want to deploy these models,” he said. “You want to have trust in them so that when everyday Americans are using, whether it be medical models or anything else, they are comfortable with the fact that it has been tested and evaluated.”
Obernolte and Rep. Sarah McBride (D-Del.), meanwhile, have also introduced the “READ AI Act.” The bill would direct NIST to develop guidelines for how AI models should be evaluated, including standard documentation.
Asked about the bill, Kratsios said it was worthy of consideration, but added that any such efforts should avoid just focusing on frontier AI model evaluation.
“The reality is that the most implementation that’s going to happen across industry is going to happen through fine-tuned models for specific use cases, and it’s going to be trained on specific data that the large frontier models never had access to,” he added.
“In my opinion, the greatest work that NIST could do is to create the science behind how you measure models, such that any time that you have a specific model – for finance, for health, for agriculture – whoever’s attempting to implement it has a framework and a standard around how they can evaluate that model,” Kratsios continued. “At the end of the day, the massive proliferation is going to be through these smaller, fine-tuned models for specific use cases.”
Discussion around the role of the NIST center comes amid a larger debate over the role of the federal government in setting AI standards. In a December executive order, President Donald Trump called for legislative recommendations to create a national framework that would preempt state AI laws.
But during the hearing, Kratsios offered few specifics on what he and Special Adviser for AI and Crypto David Sacks have been considering.
“That’s something that I very much look forward to working with everyone on this committee on,” Kratsios said. “What was clear in the executive order, specifically, was that, any proposed legislation should not preempt otherwise lawful state actions relating to child safety protections, AI compute and data infrastructure, and also state government procurement and use of AI.”
“But, we look forward over the next weeks and months to be working with Congress on a viable solution,” he added.
If you search the Internet for “Clone Wars,” you’ll get a lot of Star Wars-related pages. But the original Clone Wars took place a long time ago in a galaxy much nearer to ours, and it has a lot to do with the computer you are probably using right now to read this. (Well, unless it is a Mac, something ARM-based, or an old retro-rig. I did say probably!)
IBM is a name that, for many years, was synonymous with computers, especially big mainframe computers. However, it didn’t start out that way. IBM originally made mechanical calculators and tabulating machines. That changed in 1952 with the IBM 701, IBM’s first computer that you’d recognize as a computer.
If you weren’t there, it is hard to understand how IBM dominated the computer market in the 1960s and 1970s. Sure, there were others like Univac, Honeywell, and Burroughs. But especially in the United States, IBM was the biggest fish in the pond. At one point, the computer market’s estimated worth was a bit more than $11 billion, and IBM’s five biggest competitors accounted for about $2 billion, with almost all of the rest going to IBM.
So it was somewhat surprising that IBM didn’t roll out the personal computer first, or at least very early. Even companies that made “small” computers for the day, like Digital Equipment Corporation or Data General, weren’t really expecting the truly personal computer. That push came from companies no one had heard of at the time, like MITS, SWTP, IMSAI, and Commodore.
The IBM PC
The story — and this is another story — goes that IBM spun up a team to make the IBM PC, expecting it to sell very little and use up some old keyboards previously earmarked for a failed word processor project. Instead, when the IBM PC showed up in 1981, it was a surprise hit. By 1983, there was the “XT” which was a PC with some extras, including a hard drive. In 1984, the “AT” showed up with a (gasp!) 16-bit 80286.
The personal computer market had been healthy but small. Now the PC was selling huge volumes, perhaps thanks to commercials like the one below, and decimating other companies in the market. Naturally, others wanted a piece of the pie.
Send in the Clones
Anyone could make a PC-like computer, because IBM had used off-the-shelf parts for nearly everything. There were two things that really set the PC/XT/AT family apart. First, there was a bus for plugging in cards with video outputs, serial ports, memory, and other peripherals. You could start a fine business just making add-on cards, and IBM gave you all the details. This wasn’t unlike the S-100 bus created by the Altair, but the volume of PC-class machines far outstripped the S-100 market very quickly.
In reality, there were two buses. The PC/XT had an 8-bit bus, later named the ISA bus. The AT added an extra connector for the extra bits. You could plug an 8-bit card into part of a 16-bit slot. You probably couldn’t plug a 16-bit card into an 8-bit slot, though, unless it was made to work that way.
The other thing you needed to create a working PC was the BIOS — a ROM chip that handled starting the system with all the I/O devices set up and loading an operating system: MS-DOS, CP/M-86, or, later, OS/2.
Protection
An ad for a Columbia PC clone.
IBM didn’t think the PC would amount to much, so they didn’t do anything to hide or protect the bus, in contrast to Apple, which had patents on key parts of its computer. They did, however, have a copyright on the BIOS. In theory, creating a clone IBM PC would require the design of an Intel-CPU motherboard with memory and I/O devices at the right addresses, a compatible bus, and a compatible BIOS chip.
But IBM gave the world enough documentation to write software for the machine and to make plug-in cards. So, figuring out the other side of it wasn’t particularly difficult. Probably the first clone maker was Columbia Data Products in 1982, although they were perceived to have compatibility and quality issues. (They are still around as a software company.)
Eagle Computer was another early player that originally made CP/M computers. Their computers were not exact clones, but they were the first to use a true 16-bit CPU and the first to have hard drives. There were some compatibility issues with Eagle versus a “true” PC. You can hear their unusual story in the video below.
The PC Reference manual had schematics and helpfully commented BIOS source code
One of the first companies to find real success cloning the PC was Compaq Computers, formed by some former Texas Instruments employees who were, at first, going to open Mexican restaurants, but decided computers would be better. Unlike some future clone makers, Compaq was dedicated to building better computers, not cheaper.
Compaq’s first entry into the market was a “luggable” (think of a laptop with a real CRT in a suitcase that only ran when plugged into the wall; see the video below). They reportedly spent $1,000,000 to duplicate the IBM BIOS without peeking inside (which would have caused legal problems). However, it is possible that some clone makers simply copied the IBM BIOS directly or indirectly. This was particularly easy because IBM included the BIOS source code in an appendix of the PC’s technical reference manual.
Between 1982 and 1983, Compaq, Columbia Data Products, Eagle Computer, Leading Edge, and Kaypro all threw their hats into the ring. Part of what made this sustainable over the long term was Phoenix Technologies.
Rise of the Phoenix
Phoenix was a software producer that realized the value of having a non-IBM BIOS. They put together a team to study the BIOS using only public documentation. They produced a specification and handed it to another programmer. That programmer then produced a “clean room” piece of code that did the same things as the BIOS.
An Eagle ad from 1983
This was important because, inevitably, IBM sued Phoenix but lost, as they were able to provide credible documentation that they didn’t copy IBM’s code. They were ready to license their BIOS in 1984, and companies like Hewlett-Packard, Tandy, and AT&T were happy to pay the $290,000 license fee. That fee also included insurance from The Hartford to indemnify against any copyright-infringement lawsuits.
Clones were attractive because they were often far cheaper than a “real” PC. They would also often feature innovations. For example, almost all clones had a “turbo” mode to increase the clock speed a little. Many had ports or other features as standard that PC buyers had to pay extra for (and give up card slots to add). Compaq, Columbia, and Kaypro made luggable PCs. In addition, supply didn’t always match demand. Dealers often could sell more PCs than they could get in stock, and the clones offered them a way to close more business.
Issues
Not all clone makers got everything right. It wasn’t unusual for a clone to handle interrupts or timers differently than a genuine IBM machine. Another favorite place to err involved AT/PC compatibility.
In a base-model IBM PC, the address bus only went from A0 to A19. So if you hit address (hex) FFFFF+1, it would wrap around to 00000. Memory being at a premium, apparently, some programs depended on that behavior.
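To make that wrap-around concrete, here is a small illustrative snippet (not from the original article) that mimics the 20-bit masking in Python:

```python
# The original PC's address bus was 20 bits wide (A0-A19), so any address
# past FFFFF hex silently wrapped back around to 00000.
ADDRESS_MASK = 0xFFFFF            # 2**20 - 1, i.e. a 1 MB address space

def wrap_8088(addr: int) -> int:
    """Address as seen on a PC/XT bus: bits above A19 are simply dropped."""
    return addr & ADDRESS_MASK

print(hex(wrap_8088(0xFFFFF + 1)))       # 0x0 -- one past the top wraps to zero
print(hex(wrap_8088(0xFFFF0 + 0x10)))    # 0x0 -- segment FFFF, offset 0010 also lands at zero
```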
With the AT, there were more address lines. Rather than breaking backward compatibility, those machines have an “A20 gate.” By default, the A20 line is disabled; you must enable it to use it. However, there were several variations in how that worked.
Intel, for example, had the InBoard/386 that let you plug a 386 into a PC or AT to upgrade it. However, the InBoard A20 gating differed from that of a real AT. Most people never noticed. Software that used the BIOS still worked because the InBoard’s BIOS knew the correct procedure. Most software didn’t care either way. But there was always that one program that would need a fix.
The point is that there were many subtle features on a real IBM computer, and the clone makers didn’t always get it right. If you read ads from those days, they often touted how compatible the machines were.
Total War!
IBM started a series of legal battles against… well… everybody: Compaq, Corona Data Systems, Handwell, Phoenix, AMD, and anyone who managed to put anything on the market that competed with “Big Blue” (one of IBM’s nicknames).
IBM didn’t win anything significant, although most companies settled out of court. The clone makers then just used the Phoenix BIOS, which was provably “clean.” So IBM decided to take a different approach.
In 1987, IBM decided they should have paid more attention to the PC design, so they redid it as the PS/2. IBM spent a lot of money telling people how much better the PS/2 was. They had really thought about it this time. So scrap those awful PCs and buy a PS/2 instead.
Of course, the PS/2 wasn’t compatible with anything. It was made to run OS/2. It used the MCA bus, which was incompatible with the ISA bus, and didn’t have many cards available. All of it, of course, was expensive. This time, clone makers had to pay a license fee to IBM to use the new bus, so no more cheap cards, either.
You probably don’t need a business degree to predict how that turned out. The market yawned and continued buying PC “clones” which were now the only game in town if you wanted a PC/XT/AT-style machine, especially since Compaq beat IBM to market with an 80386 PC by about a year.
Not all software was compatible with all clones. But most software would run on anything and, as clones got more prevalent, software got smarter about what to expect. At about the same time, people were thinking more about buying applications and less about the computer they ran on, a trend that had started even earlier, but was continuing to grow. Ordinary people didn’t care what was in the computer as long as it ran their spreadsheet, or accounting program, or whatever it was they were using.
Dozens of companies made something that resembled a PC, including big names like Olivetti, Zenith, Hewlett-Packard, Texas Instruments, Digital Equipment Corporation, and Tandy. Then there were the companies you might remember for other reasons, like Sanyo or TeleVideo. There were also many that simply came and went with little name recognition. Michael Dell started PC Limited in 1984 in his college dorm room, and by 1985, he was selling an $800 turbo PC. A few years later, the name changed to Dell, and now it is a giant in the industry.
Looking Back
It is interesting to play “what if” with this time in history. If IBM had not opened their architecture, they might have made more money. Or, they might have sold 1,000 PCs and lost interest. Then we’d all be using something different. Microsoft retaining the right to sell MS-DOS to other people was also a key enabler.
IBM stayed in the laptop business (ThinkPad) until selling its PC business to Lenovo in 2005. It sold Lenovo its x86 server business in 2014 as well.
Things have changed, of course. There hasn’t been an ISA card slot on a motherboard in ages. Boot processes are more complex, and there are many BIOS options. Don’t even get us started on EMS and XMS. But at the core, your PC-compatible computer still wakes up and follows the same steps as an old-school PC to get started. Like the Ship of Theseus, is it still an “IBM-compatible PC”? If it matters, we think the answer is yes.
If you want to relive those days, we recently saw some new machines sporting 8088s and 80386s. Or, there’s always emulation.
Marvel Studios continues to dribble out brief teasers promoting Avengers: Doomsday, which is slated for a December 2026 release. Each teaser plays first in cinemas, ahead of Avatar: Fire and Ash screenings, before becoming publicly available.
We reported previously on the first, which featured Steve Rogers (Chris Evans), the former Captain America. Over the holidays, a second teaser highlighting Chris Hemsworth's Thor was released. Both are familiar faces in the MCU, but we now have a third teaser that brings in some new players. No, not Robert Downey Jr.'s Doctor Doom as rumored. Instead, we've got Magneto (Ian McKellen), Charles Xavier (Patrick Stewart), and Cyclops (James Marsden) from the X-Men franchise.
The film takes place 14 months after the events of this year’s Thunderbolts*. In addition to Thor, we have the new Captain America (Anthony Mackie), Bucky Barnes (Sebastian Stan), Ant-Man (Paul Rudd), Falcon (Danny Ramirez), and Loki (Tom Hiddleston). Then there’s the Wakandan contingent: Shuri as the new Black Panther (Letitia Wright), M’Baku (Winston Duke), and Namor (Tenoch Huerta Mejia).
Marvel Studios decided to ring in the new year with a fresh trailer for Wonder Man, its eight-episode miniseries premiering later this month on Disney+. Part of the MCU’s Phase Six, the miniseries was created by Destin Daniel Cretton (Shang-Chi and the Legend of the Ten Rings) and Andrew Guest (Hawkeye), with Guest serving as showrunner.
As previously reported, Yahya Abdul-Mateen II stars as Simon Williams, aka Wonder Man, an actor and stunt person with actual superpowers who decides to audition for the lead role in a superhero TV series—a reboot of an earlier Wonder Man incarnation. Demetrius Grosse plays Simon’s brother, Eric, aka Grim Reaper; Ed Harris plays Simon’s agent, Neal Saroyan; and Arian Moayed plays P. Cleary, an agent with the Department of Damage Control. Lauren Glazier, Josh Gad, Byron Bowers, Bechir Sylvain, and Manny McCord will also appear in as-yet-undisclosed roles.
Rounding out the cast is Ben Kingsley, reprising his MCU role as failed actor Trevor Slattery. You may recall Slattery from 2013’s Iron Man 3, hired by the villain of that film to pretend to be the leader of an international terrorist organization called the Ten Rings. Slattery showed up again in 2021’s Shang-Chi and the Legend of the Ten Rings, rehabilitated after a stint in prison; he helped the titular Shang-Chi (Simu Liu) on his journey to the mythical village of Ta Lo.
Nearly 85% of the CFO Act agency chief information officers left over the last 12 months. The turnover across the community is unprecedented.
But, generally speaking, federal technology and cybersecurity policy coming from the Trump administration has been relatively modest in calendar year 2025.
For a change, federal acquisition dominated the news cycle, from the overhaul of the Federal Acquisition Regulation, to the Senate confirmation of Kevin Rhodes to be the administrator of the Office of Federal Procurement Policy, to the General Services Administration’s OneGov enterprise contract initiative and increased scrutiny of consulting contractors and value-added resellers.
With so much going on across the federal sector, Federal News Network asked a panel of former federal executives for their opinions about 2025 and what federal IT and acquisition storylines stood out over the last 12 months.
The panelists are:
Jonathan Alboum, federal chief technology officer for ServiceNow and former Agriculture Department CIO.
Melvin Brown, vice president and chief growth officer at CANI and a former deputy CIO at the Office of Personnel Management.
Matthew Cornelius, managing director at Workday and former OMB and Senate staff member.
Kevin Cummins, a partner with the Franklin Square Group and former Senate staff member.
Michael Derrios, the new executive director of the Greg and Camille Baroni Center for Government Contracting at George Mason University and former State Department senior procurement executive.
Julie Dunne, a principal with Monument Advocacy and former commissioner of GSA’s Federal Acquisition Service.
Mike Hettinger, founding principal of Hettinger Strategy Group and former House staff member.
Nancy Sieger, a partner at Guidehouse Financial Services Sector and a former IRS CIO.
Here are the 2024, 2023 and 2022 years in review as well, in case you are interested in comparing previous responses.
What are two specific accomplishments in 2025 within the federal IT and/or acquisition community? Please offer details about those accomplishments and why you thought they had an impact and what changes they brought.
MC: The administration’s concerted push to work more directly with commercial-off-the-shelf software leaders is one of the most significant changes in the federal acquisition landscape in a long time. Not only have these steps reduced costs, but direct relationships between enterprise software leaders and government customers have led to less confusion about product roadmaps and capability assessments, while providing opportunities for the government and America’s leading tech companies to solve problems in a collaborative way that improves both mission readiness and global competitiveness.
The Department of Energy became the first cabinet-level agency in the history of the U.S. government to go live on a true human capital management software-as-a-service (SaaS) solution. This is an historic step forward for human resources transformation and showcases the ability of leading commercial SaaS solutions to meet stringent federal security and functional requirements at scale that will transform mission readiness for DoE and its agency peers.
MB: AI moved from “policy talk” to governed buying. OMB issued two major April memos that together pushed agencies from experimentation toward repeatable governance and acquisition patterns — what must be documented, who must be involved, and what vendors must provide. Why it mattered for acquisition is because it’s a forcing function for standard solicitation language, evaluation factors, data rights/lock-in protections, privacy involvement and risk controls in AI buys.
The late-2025 “AI procurement guardrails” conversation got louder, especially for large language model (LLM) providers. By December 2025, reporting highlighted OMB procurement guardrails focused on what agencies should demand when buying AI tools, including large language models, and set near-term timelines for agencies to update acquisition policies. Why it mattered is it signaled that LLM procurement is being treated as a special class of risk/assurance problem — not just another software buy.
KC: The FAR rewrite and FedRAMP 20x initiatives made a lot of progress. While the impact of the FAR overhaul and FedRAMP changes may not be felt immediately, these changes should make it easier for agencies to acquire technologies to better meet their missions. FedRAMP’s purpose is to accelerate cloud adoption, but it has become a barrier for commercial cloud companies that want to work with agencies. Even when agencies do have access to FedRAMP’ed cloud solutions, they tend to lag behind the latest versions sold to commercial customers due to the cost and time it takes to get authorizations to operate (ATOs).
MH: The GSA OneGov initiative stands out as one of the more significant things to have happened in federal IT and procurement this year, with more to come as we go into 2026. What started out as just a handful of companies participating has grown into something more significant, with 15 OneGov deals having been announced. While we maybe haven’t yet seen the full extent of what it can do in terms of changing buying and selling habits, I suspect we will see those changes as we go into next year. The FAR overhaul is another significant and related piece of this puzzle, which we will again begin to see more from in the next year.
JD: Revolutionary FAR overhaul (RFO) and OneGov activities
NS: I’m watching closely how agencies move from basic zero trust architecture (ZTA) compliance to operationalizing mature, integrated zero trust capabilities across all five pillars: identity, devices, networks, applications and data. The 2025 accomplishments in zero trust adoption created a foundation. In 2026, it will become clearer which agencies can achieve the cultural transformation and cross-domain integration that true zero trust requires.
The real change this brought was cultural. IT professionals moved from viewing zero trust as a security mandate to recognizing it as an enabler of hybrid work and cloud adoption. This shift helped agencies reduce attack surface across government networks and establish replicable patterns that smaller agencies could follow, expanding access to advanced security capabilities across the federal enterprise.
In 2025, the federal government moved beyond AI policy development to actual governance implementation. OMB’s updated guidance, combined with agency-level chief AI officers and cross-functional AI governance boards, created accountability structures that didn’t exist before. What impressed me most was how Treasury and IRS established AI testing and validation protocols that balanced innovation with responsible use.
This brought tangible changes; agencies now have repeatable processes for AI risk assessment, bias testing and human oversight integration. It shifted the conversation from “should we use AI?” to “how do we use AI responsibly?” enabling mission delivery while maintaining public trust.
JA: 2025 was the year of agencies moving beyond AI pilot programs and onto large-scale deployment. As AI became embedded in day-to-day operations, it quickly became clear that success hinges on strong foundations — like high data quality, governance and scalable infrastructure. Agencies that invested in these core building blocks moved toward more sustainable and responsible AI implementations. The result was greater confidence in AI outcomes, improved interoperability and a clearer path for long-term innovation across government.
This shift in priorities is already delivering tangible results. One agency I worked with this past year consolidated 47 intake channels and five legacy platforms into a single system of record, improving data collection efficiency by 80%. By unifying data and workflows, the agency created a strong foundation for scaling AI across the mission and driving measurable outcomes.
This year also brought renewed momentum to enterprise acquisitions. Initiatives like GSA’s OneGov enabled agencies to move away from fragmented purchasing and toward coordinated, enterprisewide agreements. These agreements reduced friction, improved visibility and delivered better value for taxpayers, reflecting the growing demand for simpler access to modern IT solutions. Together, these changes signaled a cultural shift in federal AI adoption — one that prioritizes speed, collaboration and measurable outcomes over complexity.
MD: I think the most significant accomplishment is DoD’s launch of CMMC 2.0 because of how it will shape acquisition strategy, contracting practice and supply-chain resilience across the federal enterprise. As I said in a recent white paper on the subject, the acquisition impact of CMMC is systemic because it will influence how agencies define capable sources, how solicitations are written, how proposals are evaluated and how performance is monitored. Certification is now a qualification threshold for industry and a practical tool for risk reduction in government agencies. But it will also be a costly investment, especially for small businesses. However, I also think civilian federal agencies will eventually look to adopt portions of CMMC at some point, so it behooves any contractor looking to do business with the federal government to explore getting certified at the right time depending on where they’re at in their life cycle.
What technology, acquisition initiative or program surprised you based on how much progress it made or how the pieces and parts came together and why?
MB: FedRAMP tried to become faster and more outcome-oriented through its 20x pilot. GSA launched FedRAMP 20x in March 2025 and continued publishing implementation updates and pilot details through 2025. Separately, GSA reported record authorization pace in 2025 and linked progress to the shift toward modernization, including the 20x pilot. Why it mattered for acquisition is agencies and vendors saw real pressure to reduce authorization friction and move toward automation-based validation and a “security over paperwork” posture, as described in FedRAMP’s own updates.
DoD cybersecurity requirements for contractors hit a concrete implementation runway through the Cybersecurity Maturity Model Certification (CMMC) program. DoD’s CMMC implementation began Phase 1 in November with a multi-phase rollout plan over three years, as described by the DoD CIO and reflected in associated rulemaking discussion. Why it mattered is it moved CMMC from “coming soon” into real solicitation/award gating, changing competitive dynamics for federal suppliers supporting defense programs.
NS: What genuinely surprised me in 2025 was the bold reimagining of FedRAMP through the “FedRAMP 20x” initiative. After more than a decade of incremental changes, GSA’s new leadership assembled a federal technical team of security experts, platform engineers, and data scientists who fundamentally redesigned the authorization framework to be cloud-native and automation-driven with continuous security validation.
In my federal agency CIO role, I thought for years the FedRAMP authorization processes were bureaucratic and slow-moving, yet in 2025 the program demonstrated that radical transformation was possible. From Guidehouse’s perspective, what made this remarkable was the cultural shift toward transparency and genuine stakeholder collaboration. This demonstrated that even deeply entrenched federal compliance programs can evolve rapidly when there’s bold leadership, technical expertise and willingness to rethink established processes rather than just optimize them.
KC: The Department of Government Efficiency (DOGE) was surprising in almost every way and was far more impactful than I expected, even if some of its initial claims about government savings were overstated.
MC: I have been incredibly impressed with the reorganization across GSA’s key federal acquisition programs. Elevating the importance, competence, criticality and talent within GSA to drive true consolidation, efficiency, cost savings and standardization across the governmentwide technology procurement landscape has been a long overdue effort that has already delivered enormous outcomes. I’m not surprised that this has been successful, more so just heartened to see the pivot back to bolstering and strengthening GSA’s ability to be the true innovator and key negotiator in the federal technology acquisition landscape as a worthy and worthwhile sign of confidence in this vital agency.
MH: I was pleased to see progress made related to implementation of the Government Service Delivery Improvement Act, which was signed into law in January 2025. While there’s still a way to go toward full implementation, the federal CIO has been designated as the federal service delivery lead as required by the law, and the requirements of GSDIA were incorporated into the annual Circular A-11, Section 280 update, meaning agencies should account for the requirements of GSDIA in their fiscal 2027 budgets. Once we get the agency high impact service providers (HISP) service delivery leads in place, which should happen early next year, GSDIA, working together with 21st Century IDEA and a host of administration policies, should serve to accelerate the path to better, more efficient customer experience.
JD: The revolutionary FAR overhaul (RFO) was a huge effort to publish all the FAR model deviation text by the end of the fiscal year (Sept 30). The FAR Council and the GSA team deserve a lot of credit for getting that done.
The OneGov strategy was announced in April 2025. By the end of the year, GSA had announced 15 agreements. It’s unclear at this point how much agencies are able to leverage these agreements, but it’s impressive that GSA put together that group of agreements over the course of eight months. I’m sure there are more announcements to come.
JA: This year, it became clear that AI cannot scale securely without zero trust. I was struck by how quickly AI governance converged into a shared, nonnegotiable priority. As more agencies deployed AI, cybersecurity risks became impossible to ignore. Zero trust shifted from policy guidance to an operational must, forcing agencies to rethink both their architecture and procurement strategies as they work toward the 2027 mandate.
MD: I’m a bit biased on this one, but I’m going to have to say State’s Evolve program. The request for proposals was issued three years ago, in December 2022. Given the sheer complexity of the technical program and contract structure, a two-step advisory down-select process involving the highest number of proposals State has ever managed at one time, and the ambitious size of the award pools, the fact that the department was able to start making contract awards this summer was a tremendous accomplishment.
What emerged as the biggest technology/acquisition challenge of 2025 that will have an impact into 2026 and beyond?
KC: Secretary of Defense Pete Hegseth has acknowledged that “our acquisition system is only as good as our workforce.” Yet we saw many experienced contracting officers leave the federal government in 2025 through the Deferred Resignation Program (DRP), Voluntary Early Retirement Authority (VERA) and other attrition. We also lost many of the newer, more tech-savvy feds who had been hired into places like the Cybersecurity and Infrastructure Security Agency (CISA) and GSA’s Technology Transformation Service (TTS). That will make it harder to successfully modernize government in 2026 and beyond. While some flashy, high-priority procurements may still speed along, more mundane federal IT upgrades will likely suffer.
MC: One of the key provisions of the FedRAMP Authorization Act was around collapsing and consolidating various security assessment frameworks to achieve greater reciprocity between agencies and create scale for critical technologies that can truly serve foundational missions in any agency. While much of the effort (rightly so) has been on automation and streamlining the authorization process so more innovative solutions can enter the federal market, for the “big bets” the administration is making on foundational infrastructure and platforms across both civilian and defense sectors, seeing how OMB (and GSA, DOD, etc.) better collaborate and consolidate on accreditation priorities and processes to speed reciprocity and time to value for these key investments should be a paramount priority.
JA: Growing complexity is a technology and acquisition trend that shows no signs of slowing. Agencies are trying to navigate AI adoption, massive amounts of data, cybersecurity mandates, procurement reform and workforce changes, all while delivering mission-critical outcomes.
Without strong governance, we risk repeating past mistakes like technology sprawl, duplication, and unmanaged threats — only at a much larger scale and with greater negative consequences. Moving into 2026, success will be defined less by the launch of new initiatives, and more by the ability to govern technology investments to deliver sustained value. The administration’s PMA objective to “eliminate data silos and duplicative data collection” will help.
MB: Late 2025 reporting described Pentagon efforts aimed at rapidly scaling small drone procurement and using competitive approaches to accelerate production—explicitly framed as overcoming traditional procurement friction. Why it mattered is it’s a visible example of the broader push to shorten cycles, broaden vendor bases, and buy more like the commercial market — especially for fast-evolving tech.
MH: The personnel and related budget cuts that happened as a result of DOGE have been, and will continue to be, the greatest challenge as agencies look to prioritize IT modernization without a full staff and, in many cases, with smaller budgets. While I feel we are on the backside of the cuts, the challenges associated with this will carry forward into 2026 as we look to rebuild our IT personnel and budgets.
JD: The acquisition workforce has been working through a lot of change this year, from the RFO to reductions in force and retirements. We ask a lot of these folks, so as we move into the new year, I hope they are given the tools and leadership support to drive forward with important initiatives like the RFO, buying commercial and expanding the industrial base. There will be a lot of uncertainty ahead, especially as agencies issue their supplements under the RFO process and work through another uncertain appropriations process.
NS: DOGE’s push to consolidate IT infrastructure, eliminate redundant systems and mandate shared services will reach critical implementation phases in 2026. I’m watching whether the one-size-fits-all efficiency model can accommodate mission-specific requirements, particularly in national security, law enforcement and regulatory agencies.
The challenge becomes more acute as consolidation efforts move beyond transactional systems and into complex operational environments such as cybersecurity operations centers, cloud platforms and data centers. These environments are tightly coupled with mission delivery. Bureaus such as the IRS have legitimate mission-specific technology requirements that commodity shared services may not address.
The shared-services centralization carries trade-offs that will need to be well designed for. First, there are impacts to agency/bureau agility, both in timelines and in innovation, since one-size-fits-all may not work with unique mission needs. Another trade-off is the concentration of risk in a single point; resiliency will be key! Lastly, I’ll say the distance from the customer and the potential for additional bureaucracy in governance with cross-agency coordination will need to be carefully managed to not suppress time to market on changes and innovation.
MD: In my opinion, the biggest technology/acquisition challenge has (and will be) the rush to adopt and use AI to support federal missions. While there is significant upside to leveraging AI in the government space, there still seems to be a readiness gap in terms of appropriate governance, well-defined use cases, proper training and workforce preparedness, the availability of clean data and policy ambiguity. These issues need to be addressed as agencies are testing out AI to ensure the adoption of new tools does not exacerbate existing friction or result in throwing money at problems by addressing symptoms versus the root causes.
Apple has just released for download iOS 26.3 beta 1 alongside beta 1 of iPadOS 26.3, macOS 26.3, tvOS 26.3, watchOS 26.3 and visionOS 26.3 for compatible devices.