
TMF makes one more award before authorization expires

The 43-day partial government shutdown left us all bereft of information technology news and updates on program progress. Now that agencies are funded at least through Jan. 31, the conference circuit is in its end-of-the-year sprint, and lawmakers are working on the final touches of the defense authorization bill, some interesting IT news items have emerged over the last week or so. Here are three that deserve highlighting.

TMF Board beats deadline

The Technology Modernization Fund's authorization officially expires on Friday, meaning the board can only continue to oversee and fund existing awards.

While the House was attempting to add a provision to the fiscal 2026 defense authorization bill to extend the TMF, the board quietly made at least one new award before the impending deadline.

The National Nuclear Security Administration received a $28.3 million investment for three separate but interrelated modernization projects.

The agency will use the funding “to modernize the foundational technology infrastructure supporting its nuclear security missions through an integrated cloud and artificial intelligence transformation. Current systems have critical vulnerabilities: the FireGuard wildfire monitoring activity requires intensive manual analysis across classification barriers; the Turbo Federal Radiological Monitoring and Assessment Center (FRMAC) radiological assessment tool cannot communicate with other emergency response applications; and limited AI development infrastructure prevents rapid evaluation of models for nuclear security threats. These vulnerabilities hamper data aggregation, slow emergency response, and leave gaps in the agency’s ability to assess evolving technological risks.”

A government source with knowledge of the TMF award said most of the funding — about $23 million — will go to modernizing the AI infrastructure for NNSA’s classified environment.

NNSA has been building out its classified environment in the cloud for much of the last two years and the influx of funding will accelerate that effort.

For example, NNSA recently implemented AskSage’s platform to access secure AI tools through the Army’s tenant.

The source said through the TMF investment, NNSA can more quickly implement its own version of AWS, Microsoft and Google clouds with AI tools built in.

NNSA expects to create a centralized AI infrastructure that makes it easier for its classified users to access these tools quickly.

This was NNSA’s second TMF award. It won $3.8 million in July 2024 to modernize its Radiological Response Data Portal.

The source said the other two projects, about $3 million for FRMAC and $2 million for FireGuard, were grouped together as part of the overall modernization initiative because both will rely on AI and machine learning tools.

“Through the TMF, FireGuard aims to use machine learning to automatically track fire boundaries and move data between computing systems. This automation would allow analysts to spend their time verifying maps instead of drawing them from scratch, which is particularly critical as wildfires pose growing risks to nuclear sites,” the TMF states on its website. “Through the TMF, NNSA plans to shift the Turbo FRMAC radiological assessment tool from its legacy desktop software to a cloud‑based platform, providing emergency teams with the speed, accuracy, and collaborative capability they need to protect the public and critical infrastructure during radiological crises.”

The source said while much of the work to modernize the systems and to develop the AI infrastructure was already underway, the TMF money certainly will help NNSA move faster.

And that idea of moving faster brings us back to Congress failing to extend the program’s authorization. While plenty of programs operate without authorization, it seems like this type of investment, at least for the short term, is on hold.

There still is some hope, as the Senate did allocate $5 million for the TMF in fiscal 2026 in its version of the Financial Services and General Government appropriations bill. This is the first time in three years lawmakers didn’t zero out new funding for the TMF.

Commerce, USDA race for AI

The departments of Commerce and Agriculture are in a friendly race to see who can implement the new USAi platform first.

Brian Epley, Commerce’s chief information officer, said at ACT-IAC’s Executive Leadership Conference last week that, hopefully within the next few weeks, employees at all bureaus will be able to begin using the different large language models for their use cases.

Meanwhile, Tony Brannum, the chief information security officer at Agriculture, said deployment of the AI shared service across the agency is imminent as well.

These would be the first agencies to take advantage of the USAi platform, which GSA launched in March and expanded this summer to provide generative AI chatbot tools for other agencies.

“The code base is the same as GSAi. It is just a copy for our agency partners to use so that they can do that deep exploration across a broad portfolio of generative AI technologies and large language models, but most importantly, capture the observability telemetry data so that they can make informed decisions about how they’re going to buy well into this disruptive space in the future,” GSA CIO Dave Shive said at the ELC conference.

GSA says the USAi platform is part of how it’s giving agencies the opportunity to try out AI tools that previously would have been either cost prohibitive to roll out across an agency or too risky.

Zach Whitman, GSA’s chief AI officer, said USAi has become an interagency project in terms of developing and learning from its use.

“With a tool like USAi, you can really get practical, hands-on data to suggest these are the ways in which we’re seeing this adoption curve today. And the telemetry that we’re gathering in terms of how frequently do people use the thing is an important step for the program development,” Whitman said. “But then also, what are they asking of this tool? What kinds of mission functions are they using this tool for? That true fidelity on how the thing is being used is, I would argue that the real value proposition of a tool like USAi is to be able to see inside how the workforce is truly adopting this new technology to such granularity, and then shape policy and workflows to maximize that utility has been an eye opener for us, as well as our partners.”

As for Commerce, Epley said it will use the chat prompt tools to answer specific questions that arise in each bureau based on their individual use cases.

“We have an AI use case inventory of more than 300. USAi will bring Commerce to life when it comes to AI tools,” he said. “We looked at what we were doing on our own when it came to AI and decided it was best to partner with GSA. The majority of our workforce doesn’t have a working knowledge of AI, so now they will have tools at their fingertips. We expect USAi to increase our productivity for our employees and for the mission.”

Epley added that Commerce will use its lessons from implementing USAi to create a playbook for other agencies to ease their move to the shared service.

This is a multi-year effort for Commerce, as Epley expects the department will have to optimize the value of the LLMs and figure out which tools are best for which mission area.

As for USDA, the implementation of USAi is the culmination of a decade-long journey to improve its data tagging.

Brannum said knowing what data the agency has, where it’s located, and who has access to it will make the application of AI tools easier.

“There is a lot of interest in AI across the agency. We are asked why something is blocked, or once they see something out there from GSA, they ask to use it,” he said.

JWCC-Next, JOE and more from DISA

The Defense Information Systems Agency will continue to run its own multiple award contracts for the foreseeable future. DISA’s Procurement Services Executive Doug Packard, who will retire in January after more than 35 years in government, said the agency has no immediate plans to consolidate acquisition programs into GSA.

That means the Joint Warfighting Cloud Computing Contract-Next (JWCC-Next) and several other high-profile programs will remain in play for 2026.

At DISA’s annual industry day on Monday, executives said JWCC-Next, Olympus, the Joint Operational Environment (JOE) and several other major IT programs are positioned for significant progress over the next year.

Byron Stephenson, vice director of DISA’s J-9 Hosting and Compute Center, said the agency has been working with the DoD CIO’s office to collect requirements for the new contract.

“We are still looking at all the requirements that were collected across the department to ensure we bring the best cloud capable contract for the future,” Stephenson said.

He expects DISA to issue the JWCC-Next solicitation during the third quarter of fiscal 2026 and make an award about a year later.

Additionally, DISA will issue a solicitation in the second quarter of fiscal 2026 for technical support for the JWCC engineering program office.

As for the current iteration of the contract, which DoD awarded in November 2021, Stephenson said the use by the services continues to grow.

“In 2025, [the contract] had over $3.9 billion in total orders. We are actively seeing increases as the Air Force onboards to the contract now,” he said. “We had large Army orders last year, so it is continuing to see progress.”

Aside from JWCC-Next, DISA J-9 is busy implementing several large-scale technology programs. Stephenson said the Joint Operational Environment is operational in multiple theaters with active workloads, including in the Indo-Pacific Command. Through JOE, DISA is bringing commercial cloud services to commands located outside the continental United States.

Meanwhile, DISA’s Olympus, which is its infrastructure-as-code initiative, is active in two environments, AWS and Microsoft, and military services and agencies can take advantage of the tools through DISA’s working capital fund.

The post TMF makes one more award before authorization expires first appeared on Federal News Network.


Look forward, not back, in response to cyber threats

The U.S. government is seeing increasing cyber and physical threats targeting the critical infrastructure Americans rely on every day, including federal systems. Just last month, U.S. officials issued an emergency directive ordering federal agencies to defend their networks against hackers exploiting flaws in Cisco’s software. In July, an investigation revealed that engineers based in China were given indirect access to the cloud platform used by multiple federal agencies. And last year, a Chinese espionage group targeted at least 200 U.S. organizations, including the Army National Guard, in an operation known as Salt Typhoon.

Cyberthreats to U.S. government systems are a significant national security risk that exposes the country to spying, sabotage and more. In light of these threats, it might be tempting for federal agencies and other recently hacked government organizations to hunker down or seek refuge in a traditional GovCloud. Doing so may give the illusion of a quick fix, but it will ultimately undermine President Trump’s priorities and U.S. national security, and it is a step backward technologically.

To understand why a return to GovClouds is such a bad idea, it is worth revisiting why the government shifted from GovClouds to commercial clouds in the first place. The emergence of modern cloud technology in the early 2000s fundamentally changed enterprise IT, including how the U.S. government operates. For decades prior, federal agencies relied on government-owned servers to store data and run their networks. Taking advantage of the cloud, however, meant relocating those functions to off-site data centers managed by private companies and, as a result, relinquishing some control over who managed and accessed their information.

In 2011, the U.S. government started the Federal Risk and Authorization Management Program (FedRAMP) to safely accelerate federal agencies’ adoption and secure use of cloud services. To sell cloud products to the government, companies had to prove that any personnel handling federal data had the proper authorizations and background screenings. The approach made sense at the time, because the goal was to provide a standardized approach to security assessments and authorizations as this new technology was adopted.

However, FedRAMP — and the Defense Information Systems Agency, which manages the evaluation and authorization of cloud services for the Defense Department — led many cloud providers to create separate cloud environments dedicated to serving federal agencies (GovClouds), rather than running government workloads in their commercial cloud environments. While this made compliance with the assessments and authorizations easier, it came at a cost: GovClouds badly lagged behind commercial clouds in capacity, performance and security, with less than 5% of government cloud environments possessing the full characteristics of current cloud computing.

GovClouds are isolated environments and are slow to roll out security updates and new features, including AI, therefore limiting the public sector’s access to rapidly advancing technologies. Isolation may seem like a good idea that reduces external threats, but it does little to eliminate insider threats — which account for the majority of data breaches — and it deprives government customers of the constant learning and adaptation of commercial technology.

Over time, these problems pushed federal agencies away from GovClouds in favor of the evident security and performance advantages of commercial cloud: continuous rollout of new features and updates, consolidated security operations that take advantage of economies of scale and scope, and lower cost.

GovClouds are also very expensive to build and operate, and those costs are passed on to American taxpayers. These are among the reasons why President Donald Trump issued an executive order tasking federal agencies to adopt commercial solutions for federal contracts wherever possible.

In light of recent events, it would be a colossal mistake for the government to retreat to GovClouds now — especially as it increasingly looks to adopt AI. President Trump’s AI Action Plan put forward an ambitious strategy to ensure American AI leadership abroad, but also to use AI to transform the federal government. The computing power necessary for that transformation will be immense and only increase year after year — and a GovCloud will not be able to keep up. Following through on the President’s AI Action Plan will simply require more than what GovClouds can offer.

As the U.S. government faces increasingly severe cyber threats, let’s not forget the lessons of the past. GovClouds are the technological equivalent of the Maginot Line: expensive, easy to quantify and seemingly impregnable, but unable to defend against modern attacks. The federal government needs to look forward, not back, to secure its systems and push toward President Trump’s vision for the future.

Andrew Grotto founded and directs the program on geopolitics, technology and governance at Stanford University’s Center for International Security and Cooperation. He serves as the faculty lead for the cyber policy and security specialization in Stanford’s master’s in international policy program, where he teaches courses on cyber policy and economic statecraft. He is also a visiting fellow at the Hoover Institution. He was the senior director for cyber policy on the National Security Council in the Obama and Trump administrations.



OPM’s plan to unify disparate HR systems taking shape

The government’s dispersed systems for managing human resources for the federal workforce are on the verge of a major transformation, according to the Office of Personnel Management.

By January, OPM said it expects to award a federal contract that will eventually result in a cohesive HR system for all agencies to use. But in the short term, creating a governmentwide HR entity will mean working to consolidate the more than 100 systems agencies currently use for workforce management.

It’s not the first time OPM has attempted to merge the disparate HR IT systems across government. But Dianna Saxman, OPM’s associate director of HR Solutions, said the effort currently underway is “different.”

“We’re really leveraging the collective wisdom of the entire federal community,” Saxman said Tuesday during a public meeting of the Chief Human Capital Officers (CHCO) Council. “We’ve already brought experts from many different agencies into a steering committee that is helping us to set the strategy up front.”

Along with employing a steering committee at OPM, Saxman said many federal human capital leaders have started collaborating internally with other agency executives, like chief information officers and chief data officers, to plan the integration of the upcoming HR system.

“What we’re seeing in these agency engagements is a lot of enthusiasm and support for the overall effort,” Saxman said.

Once implemented, the new system will be the source for agencies, HR offices and federal employees to manage personnel records, payroll systems and performance data, while also having the capability to provide federal workforce analytics.

After OPM awards the HR IT contract in January, the agency will spend the next several months standing up the core HR system, with a goal of having it in place by the end of April. From there, OPM plans to work one-on-one with agencies to configure the platform to meet their unique needs.

All told, OPM’s end goal is to have the HR IT system fully adopted governmentwide by September 2027.

The value of the anticipated contract award in January is not yet clear. But it comes after OPM released a request for proposals (RFP) in October, detailing a specific plan of action to modernize and centralize the more than 100 current federal HR systems.

OPM previously released its initial RFP in May through the General Services Administration schedules program, emphasizing the need for interoperability in governmentwide human capital systems. That RFP came just weeks after OPM announced, then quickly canceled, a sole-source award to Workday.

The expected timing for awarding the HR IT contract also “aligns beautifully” with the Trump administration’s newly released President’s Management Agenda, Saxman said. She highlighted three key goals in the new agenda that dovetail with the HR IT consolidation effort.

For one, a centralized HR IT platform would underlie the administration’s goal of fostering a “merit-based” federal workforce, Saxman explained.

“As we look to foster greater merit, we’re able to do that by having an end-to-end HR IT capability that allows us really to see what’s happening with the federal workforce, with skillsets we have available [and] how people are being promoted and evaluated,” Saxman said. “This gives us that visibility.”

The Trump administration’s goal of making “buying power” more efficient calls for consolidated contract opportunities that are “smarter, faster, cheaper,” according to another component of the new PMA.

“A lot of the contracting processes in government are really decentralized, and there’s a lot of repetitive action there,” Saxman said. “This effort seeks to centralize the purchasing of a private sector core human capital capability at OPM — we would have one entity buying it on behalf of the entire federal government.”

On top of that, Saxman noted that OPM’s effort aligns with the goal of leveraging technology, as the PMA seeks to “consolidate and standardize systems,” while also incorporating “digital-first” government services and eliminating data silos.

“We’re going to be consolidating over 100 systems into one, reducing the number of system integrations that are required and the complexity of managing all of these systems,” she said.

Saxman outlined what she said will eventually be an array of benefits for agencies, HR offices, federal employees and external stakeholders, once legacy HR systems are decommissioned and the new system is fully implemented.

“There are many manual data requests that come out from OPM [to] many different stakeholder groups,” she said. “But we’ll have an opportunity where the data will be readily available in dashboards, so we can have a real view of what’s happening with the federal workforce at any point in time.”

OPM also hopes to ease the workload for HR employees by eventually consolidating all personnel records into single files, even when employees transition between agencies over their careers. Currently, many federal employees have personnel records spread across multiple HR systems.

“Our goal here is to have one system [in which] they can manage their employee record,” Saxman said.

For signs of success, Saxman said OPM will be measuring and looking for improvements in employee experience, cost savings and better data overall.

“A lot of our HR professionals are working with outdated, disparate technologies that are not serving them well, that are not serving our employees well,” Saxman said. “As a federal community, this is something that we have wanted to do for a long time.”



GSA’s next-generation contract vehicle is expanding and small businesses need to pay attention

Interview transcript:

Terry Gerton OASIS+ enters Phase II with five new service domains and draft scorecards expected December 16, ahead of a January 12 solicitation date. The expansion could reshape competition and compliance for federal contractors, especially small businesses. Stephanie Kostro, president of the Professional Services Council, is here to share her insights. We’re going to focus a little bit more than we usually do today on things that are happening in the small business world. So let’s start with December 4th. GSA announced the launch of the OASIS+ Phase II expansion. First of all, tell us what’s noteworthy there.

Stephanie Kostro This is long anticipated, Terry, and I will say thank you very much for focusing on small business today. It has been an area where a lot of our contracting friends have looked for guidance and information from the executive branch and from the legislative branch, to be honest, about where small business policy is going. And so thanks so much for raising this important issue area. OASIS+ has been in the works for so many years now, and there are hundreds, if not thousands, of vendors very interested in this expansion. And what I’ll tell you very quickly is OASIS+ had been eight domains, or eight categories of services. It is now 13. And the five that they’ve added in this new tranche are very interesting: things like business administration, financial services, human capital, marketing and public relations, and social services. So this is a dramatic increase in the scope of OASIS+, expanding from eight service domains to 13. And we have a lot of interest here in the contracting community about how they can support the executive branch through these new domains.

Terry Gerton Those new domains seem tailor made for small businesses. What are you hearing about how small business might be able to participate now?

Stephanie Kostro Again, it’s very exciting. It looks like the solicitations will be open here in mid-January of 2026. We’ll have to see what the actual words on the paper, if I can be that mundane about it, say about small business participation. But these are exactly the kinds of domains in which small businesses excel: the marketing and PR, the human capital, financial services, etc., where they can partner with large companies in either a joint venture or a mentor-protege arrangement. So we’ll have to see what GSA decides will be the allowable partnering arrangements going forward. I would also note that this is a reflection of an executive order the president signed early on, back in March: Executive Order 14240, Eliminating Waste and Saving Taxpayer Dollars by Consolidating Procurement. So really this is the migration of some of the domains from other vehicles over to OASIS+, which really makes OASIS+ a must-have vehicle for contractors.

Terry Gerton What should small-business owners and leaders be looking at between now and January to help them prepare?

Stephanie Kostro They really should check out OASIS+ Phase I and see what came out in the solicitation documents for that. They should monitor the GSA websites very, very closely to see if any blog posts there will give them insight into what will be allowable. You know, a lot of times PSC has voiced concern about final requests for proposals not hewing very closely to the draft RFP that had been released. And so sometimes you have to scramble as a small business to figure out whom you can partner with, because the final RFP does not really look like the draft RFP. I’m hoping that GSA decides to move forward with a final RFP that looks very similar to the draft RFP so that our small businesses can plan accordingly. It has been a rough year in 2025 for small businesses. Some of them have seen contract terminations or de-scoping or rescoping. Some of them have been asked to offer up discounts that really cut into the muscle, not just the fat, if there was any fat, for a small business. But we need the innovation that comes from small businesses. And I think this is a great opportunity for them to provide an offer that is really beneficial to the government and to the small business community.

Terry Gerton I’m speaking with Stephanie Kostro. She’s the president of the Professional Services Council. Stephanie, speaking of small businesses, there was a bill that passed the House last week, the SBA IT Modernization Reporting Act. What are you watching here?

Stephanie Kostro Oh, now we can really dork out, Terry, on all of this stuff. As I mentioned in our previous conversation here, we’re talking about human capital and financial services, etc. The SBA IT Modernization Reporting Act is a really interesting piece of legislation that looks at recommendations the Government Accountability Office offered back in 2024 about reporting on agencies’ IT systems. It would have the Small Business Administration address risks tied to its small business certification platform: improving project risk management, establishing a risk mitigation plan and resolving cybersecurity vulnerabilities. Now, Terry, as you know, we are seeing a host of cybersecurity requirements come out from the Department of War and its Cybersecurity Maturity Model Certification program, but also elsewhere. And it hits small businesses hard. You know, some of this is basic cyber hygiene, but some of it is really, really burdensome on small businesses that don’t have the resources and can’t spread resources out between, say, a commercial side and a government side. And so as we look at the implementation of this legislation, it’ll be interesting to see how small businesses can reduce their risk and their vulnerabilities, and what SBA can do to support them in that.

Terry Gerton This is obviously the beginning of its legislative process. It still has to pass the Senate. It still has to get signed. But are there things that you would want small businesses to be looking at now with the expectation that this bill will eventually be passed?

Stephanie Kostro That is a great question, Terry, and it actually leads me to something that your listeners probably just learned about recently, which is that the House and Senate released their National Defense Authorization Act for fiscal year 2026. That act is in its final stages; this is the conference bill, so now it just has to pass the House and Senate and get signed by the president. I say “just,” but it takes a few days for that to happen. That bill was released on December 7, and I would note that it actually more firmly establishes Project Spectrum, which is a Department of War effort to help small businesses with their cybersecurity. I encourage small businesses to look at Project Spectrum if they are a Department of War contractor. But even if you’re not, take a look at what those offerings are. See what you can get the government to support you on, and to help pay for, in terms of cyber hygiene and cybersecurity. I’m encouraging companies to do that even in advance of this SBA IT Modernization act. As we move forward, this is going to be a huge cost for companies, and anything the executive branch puts in place to mitigate or minimize those costs is going to be a good thing.

Terry Gerton Stephanie, you’ve already alluded to a couple of big changes that small businesses are facing as a result of so many of the new policies and programs coming out of the Trump administration this year; 2026 is going to look very different for small businesses than 2025. Give us a feel for the change in the business market, the government contracting market for small business, and what do you think the year ahead brings?

Stephanie Kostro Small businesses have, again, seen a lot of changes here in 2025, not least of which has been calling into question the socioeconomic set-asides that we have in place. There’s the 8(a) program, which is for disadvantaged businesses, but there are also women-owned small business, veteran-owned small business, service-disabled veteran-owned small business, hub zone, etc. So we have ratcheted back, as a nation, back to the statutory requirements. The Biden administration and others had grown the set-aside amounts and thresholds for these kinds of small businesses. Those are back down to the statutory requirements. In addition, we have heard about this audit of the 8(a) program, which was launched months ago, but more recently, contractors have been receiving documentation requests from their customers to help justify 8(a) program awards, etc. So they’ve seen that as well. As we go into 2026, I imagine we will see more of this audit-like activity to make sure the companies that certify themselves as small are in fact small and qualify for these set-asides. I would also say under the revolutionary FAR overhaul, which is this FAR rewrite we’ve been undergoing for a few months, all of the class deviations, part by part of the FAR, are out there. The agency supplements are being changed. We are awaiting formal rulemaking for some of these things. But it does appear that the “rule of two” upon which a lot of small businesses base their business strategies is also changing. I’ll just succinctly summarize it by saying right now, if the revolutionary FAR overhaul goes through the rulemaking process and nothing else is changed, that “rule of two” applies only to the contract level, not to the task-order level, which is a significant change. It also allows the contracting officers to have a little bit of flexibility in terms of what can be deemed for a set aside, and then also not necessarily requiring companies to recertify their status. 
And so a lot of these changes are going to be executed in 2026, and it is again going to be a year of some upheaval for small businesses.

Terry Gerton So if you’re a small business owner or leader and you’re thinking about your strategy or your business plan for 2026, what are the key things that should be at the top of your consideration list?

Stephanie Kostro The first one is obviously to monitor everything that the government, the executive branch is saying in terms of what the requirements are for a small business. I would also make sure that if you have an opportunity to get on a vehicle yourself as an organic small business or as a joint venture, go ahead and get on, because I’m not sure what on-ramping opportunities are going to look like in the future for some of these larger vehicles. And also make sure that you have all of your ducks in a row on documentation. That includes not just, “hey, you’ve got an 8(a) contract award and you may be required to submit some documentation,” but also making sure the platforms are in place where you can certify yourself quickly with the SBA and that you have all of that documentation in line. This is also an interesting dynamic for any new entrants to the market who have not experienced what existing small businesses experienced here in 2025. They may look at this and go, the juice is not worth the squeeze. It’s too hard to do work with the federal government. I think it is a business decision, and if folks want to come and talk to those of us at the Professional Services Council, we can give them a little bit of a taste of what the dynamics are; we’re happy to talk to them about, is this a good market for you to explore? I think it is. Particularly, for example, we talked about at the top of this discussion the Oasis+ expansion. The five new domains for services are great for small businesses. How do you compete for that? Come talk to us and we can help you out.

The post GSA’s next-generation contract vehicle is expanding and small businesses need to pay attention first appeared on Federal News Network.

© Federal News Network


Trump’s government management vision centers on elimination, accountability

The Trump administration has laid out its President’s Management Agenda, providing a framework for the administration’s overarching priorities to drive change in the federal government for the next few years.

The new PMA, which the Office of Management and Budget published Monday, includes three key priority areas, each of which contains several underlying goals the administration wants to meet, such as eliminating “woke” government, ending “over-classification” and “buying American.”

Many of the goals contained in the management agenda are already taking shape through a number of President Donald Trump’s executive orders and other changes to government the president has initiated since taking office.

“In his first months in office, President Trump already took bold and decisive actions to begin to reshape the federal government and end its weaponization against American citizens,” OMB Deputy Director for Management Eric Ueland wrote Monday in a memo to agencies.

A senior OMB official, speaking on background, said the PMA takes the president’s promises, as well as the administration’s work already underway, and creates a framework to “institutionalize” those end goals.

“Some of the previous PMAs have been all-encompassing and trying to be everything to everybody, whereas this PMA is very clearly tied to what President Trump promised the American people he would do when he got elected,” the official said in an interview with Federal News Network. “These are going to be priorities every agency focuses on for the full Trump administration.”

The Trump administration’s three PMA priorities are:

  • Shrink the government and eliminate waste
  • Ensure accountability for Americans
  • Deliver results, buy American

The PMA has been a staple of presidential administrations for more than 20 years. Generally, each PMA aims to address systemic challenges in government management by setting goals and holding agency executives accountable. It’s a way for the White House to work with agencies to establish top priorities, then monitor progress toward priority-based goals.

Performance.gov, the website that hosts the administration’s PMA, so far contains only an outline of Trump’s management agenda. Details are missing on which federal leaders will be tasked with delivering on the goals, where there has already been progress, and how the administration will measure results for each priority.

An OMB official blamed the 43-day government shutdown for the limited details on the PMA website. While confirming that more information would eventually be available, the official did not provide a specific timeline.

“We’ll work with the agencies and identify where they’re already making progress and start putting out — as PMAs in the past have — updates on the success that’s been had, what metrics we’re going to be looking at measuring and what agencies are going to be part of different leads for the individual goals,” the official said.

Shrinking the government

For the Trump administration, the first priority in the PMA focuses on shrinking the government and eliminating waste, particularly in programs that Trump has described as “woke” or “weaponized.”

To meet that end, the PMA’s first priority defines three key goals:

  • Eliminate woke, weaponization and waste
  • Downsize the federal workforce
  • Optimize federal real estate

Already, the Trump administration has taken significant steps toward those goals. Agencies spent much of this year under a hiring freeze, while the administration simultaneously reduced the size of the federal workforce by more than 300,000 employees.

Going forward, the OMB official pointed to Trump’s latest executive order on federal hiring as a way to measure progress toward the PMA’s first priority. The Oct. 15 order called on agencies to form strategic hiring committees composed mainly of political appointees, as well as create staffing plans for the coming year.

“A key part of that will be making sure agencies are putting in place those hiring committees,” the official said. “They’re making very strategic decisions around who they’re hiring and what positions they’re hiring for, so we don’t just inflate the federal government again and overwhelm all the success we’ve had in reductions to date.”

Trump’s first priority area in the PMA is a clear departure from the Biden administration’s agenda, which had centered on strengthening the federal workforce and included efforts to increase federal hiring and workforce development.

On top of reducing the federal workforce, the Trump administration’s first PMA priority additionally focuses on removing programs related to diversity, equity, inclusion and accessibility (DEIA), as well as ending a number of federal programs the administration described as “wasteful.”

That goal already began taking shape earlier this year, as the Trump administration directed agencies to end DEIA programs, and remove federal employees who worked on DEIA-related projects. The administration has also sought to shrink certain agencies, including USAID and the Education Department.

As a final piece of its first PMA priority, the administration said it plans to shrink the government’s real estate holdings by offloading “unnecessary” leases and federal buildings, as well as moving agency facilities to more “cost-effective” locations.

Trump has signed a number of executive orders this year focused on making federal architecture “beautiful,” and changing the way agencies prioritize federal building locations, while also requiring all federal employees to work on-site full time.

A focus on accountability

In addition to shrinking government, the administration will also be focused on driving “accountability” as the second PMA priority. The effort will impact federal employees, agency programs and government contractors, according to the agenda’s outline.

The underlying goals for achieving Trump’s second priority area are:

  • Foster merit-based federal workforce
  • End censorship and over-classification
  • Demand partners who deliver

Many of the goals under the second PMA priority are familiar, as the administration has already attempted to reach those ends. For instance, the administration has created a new “Schedule Policy/Career” classification for federal employment, and altered performance management standards for federal employees.

“One benefit of the way that PMA is structured for this administration is it’s going to be easy to integrate this PMA into performance reviews for individual employees across the government and hold them accountable for delivering on the president’s priorities,” the OMB official said.

The Office of Personnel Management in May also issued a “merit hiring plan,” which in part called on agencies to question job applicants on how they will adhere to the president’s priorities.

“A lot of this is following up on executive orders and policy decisions made by the president early on,” the OMB official said. “We’re going to be having agencies strategically hiring [and] they need to do so following the merit hiring plan.”

The second priority area also includes a focus on implementing Trump’s orders related to collective bargaining and labor-management relations at agencies. On top of that, the administration also detailed goals of promoting transparency in the federal government, such as through “find[ing] and annihilat[ing] government censorship of speech.”

Additionally, the second PMA priority includes goals of changing government contracting by working with “the best businesses,” and tasking political appointees, rather than career employees, with leading grant processing work.

“It’s making sure that those receiving federal dollars were chosen based on merit, because they’re going to deliver the outcomes that are expected,” the OMB official said.

Modernize technology, “deliver results”

The third and final priority area in the Trump administration’s PMA focuses on consolidating federal procurement, as well as adopting more modern technology into government services.

The priority contains two key goals:

  • Efficiently deploy the buying power of the federal government and buy American
  • Leverage technology to deliver faster, more secure services

Attempting to advance technology in government has been a long-standing goal across multiple administrations and throughout many agencies. But the OMB official said for the Trump administration, the goal will be to focus on finding modernization initiatives that can be turned around in shorter timeframes, and “moving out of 10-year, 15-year efforts.”

“We are being more specific in where we’re focused and making sure that we’re tackling projects that we can get done, so that we get the results and the benefits of that,” the official said.

The third priority, once again, mirrors many steps that the administration has already taken, such as attempting to reshape the federal acquisition process.

The priority area also focuses on a familiar throughline from the Trump administration and the Department of Government Efficiency of eliminating “waste.”

Some underlying goals in the PMA’s third priority area, for instance, focus on reducing the number of “confusing” government websites. Another focuses on removing “duplicative” data collections and eliminating data siloes.

“Instead of having dozens or hundreds of siloed IT systems,” the OMB official said, “we’re going to be able to work off of consolidated IT systems that can operate in an integrated fashion.”

The post Trump’s government management vision centers on elimination, accountability first appeared on Federal News Network.


President Donald Trump speaks during a Cabinet meeting at the White House, Tuesday, Dec. 2, 2025, in Washington. (AP Photo/Julia Demaree Nikhinson)

At VA, cyber dominance is in, cyber compliance is out

The Department of Veterans Affairs is moving toward a more operational approach to cybersecurity.

This means VA is applying a deeper focus on protecting the attack surfaces and closing off threat vectors that put veterans’ data at risk.

Eddie Pool, the acting principal deputy assistant secretary for the Office of Information and Technology at VA, said the agency is changing its cybersecurity posture to reflect a cyber dominance approach.


“That’s a move away from the traditional, exclusively compliance-based approach to cybersecurity, where we put a lot of our time, resources and investments in compliance-based activities,” Pool said on Ask the CIO. “For example, did someone check the box on a form? Did someone file something in the right place? We’re really moving a lot of our focus over to the risk-based approach to security, pushing things like zero trust architecture, micro segmentation of our networks and really doing things that are more focused on the operational landscape. We are more focused on protecting those attack surfaces and closing off those threat vectors in the cyber space.”

A big part of this move to cyber dominance is applying the concepts that make up a zero trust architecture like micro segmentation and identity and access management.
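The deny-by-default idea behind zero trust and micro segmentation can be sketched in a few lines. This is an illustration only: the roles, segment names and policy entries below are invented, not VA's actual policy model.

```python
# Minimal sketch of a zero-trust style access decision: deny by default,
# and allow only when identity, network segment and resource all match an
# explicit policy entry AND the request comes from a compliant device.
# All names below are hypothetical examples.

POLICIES = [
    # (role, segment, resource) tuples that are explicitly allowed
    ("claims-processor", "benefits-segment", "claims-db"),
    ("clinician", "health-segment", "ehr-app"),
]

def allow(role, segment, resource, device_compliant):
    """Default-deny: every request must match an explicit policy entry
    and originate from a compliant device."""
    if not device_compliant:
        return False
    return (role, segment, resource) in POLICIES

print(allow("claims-processor", "benefits-segment", "claims-db", True))   # True
print(allow("claims-processor", "health-segment", "ehr-app", True))       # False
print(allow("clinician", "health-segment", "ehr-app", False))             # False
```

The point of the sketch is the shape of the decision: nothing is reachable unless a policy says so, which is what distinguishes micro segmentation from a flat network where anything inside the perimeter is trusted.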

Pool said as VA modernizes its underlying technology infrastructure, it will “bake in” these zero trust capabilities.

“Over the next several years, you’re going to see that naturally evolve in terms of where we are in the maturity model path. Our approach here is not necessarily to try to map to a model. It’s really to rationalize what are the highest value opportunities that those models bring, and then we prioritize on those activities first,” he said. “We’re not pursuing it in a linear fashion. We are taking parts and pieces and what makes the most sense for the biggest thing for our buck right now, that’s where we’re putting our energy and effort.”

One of those areas that VA is focused on is rationalizing the number of tools and technologies it’s using across the department. Pool said the goal is to get down to a specific set instead of having the “31 flavors” approach.

“We’re going to try to make it where you can have any flavor you want so long as it’s chocolate. We are trying to get that standardized across the department,” he said. “That gives us the opportunity from a sustainment perspective that we can focus the majority of our resources on those enterprise standardized capabilities. From a security perspective, it’s a far less threat landscape to have to worry about having 100 things versus having two or three things.”

The business process reengineering priority

Pool added that redundancy remains a key factor in the security and tool rationalization effort. He said VA will continue to have a diversity of products in its IT investment portfolios.

“Where we are at is we are looking at how do we build that future state architecture as elegantly and simplistically as possible, so that we can manage it more effectively and we can protect it more securely,” he said.

In addition to standardizing cyber tools and technologies, Pool said VA is bringing the same approach to business processes for enterprisewide services.

He said over the years, VA has built up a laundry list of legacy technologies, all with different versions and requirements to maintain.

“We’ve done a lot over the years in the Office of Information and Technology to really standardize on our technology platforms. Now it’s time to leverage that, to really bring standard processes to the business,” he said. “What that does is that really does help us continue to put the veteran at the center of everything that we do, and it gives a very predictable, very repeatable process and expectation for veterans across the country, so that you don’t have different experiences based on where you live or where you’re getting your health care and from what part of the organization.”

Part of the standardization effort is that VA will expand its use of automation, particularly in the processing of veterans’ claims.

Pool said the goal is to take more advantage of the agency’s data and use artificial intelligence to accelerate claims processing.

“The richness of the data and the standardization of our data that we’re looking at and how we can eliminate as many steps in these processes as we can, where we have data to make decisions, or we can automate a lot of things that would completely eliminate what would be a paper process that is our focus,” Pool said. “We’re trying to streamline IT to the point that it’s as fast and as efficient, secure and accurate as possible from a VA processing perspective, and in turn, it’s going to bring a decision back to the veteran a lot faster, and a decision that’s ready to go on to the next step in the process.”

Many of these updates already are having an impact on VA’s business processes. The agency said that it set a new record for the number of disability and pension claims processed in a single year, more than 3 million. That beat its record set in 2024 by more than 500,000.

“We’re driving benefit outcomes. We’re driving technology outcomes. From my perspective, everything that we do here, every product, service capability that the department provides the veteran community, it’s all enabled through technology. So technology is the underpinning infrastructure, backbone to make all things happen, or where all things can fail,” Pool said. “First, on the internal side, it’s about making sure that those infrastructure components are modernized. Everything’s hardened. We have a reliable, highly available infrastructure to deliver those services. Then at the application level, at the actual point of delivery, IT is involved in every aspect of every challenge in the department, to again, bring the best technology experts to the table and look at how can we leverage the best technologies to simplify the business processes, whether that’s claims automation, getting veterans their mileage reimbursement earlier or by automating processes to increase the efficacy of the outcomes that we deliver, and just simplify how the veterans consume the services of VA. That’s the only reason why we exist here, is to be that enabling partner to the business to make these things happen.”

The post At VA, cyber dominance is in, cyber compliance is out first appeared on Federal News Network.


Outdated SEC communications rules are putting compliance and competitiveness at risk

Interview transcript

Terry Gerton The Securities Industry and Financial Markets Association has recently written to the SEC asking to modernize its communication and record keeping rules. Help us understand what the big issue is here.

Robert Cruz Well, I think the fundamental issue that SIFMA is calling out is a mismatch between the technology that firms use today and the rules, which were written a long time ago — and in some cases, you know, the Securities and Exchange Act from 1940. So essentially we’re all struggling trying to find a way to fit the way that we interact today into rules that are very old, written when we were doing things with typewriters and, you know, over written communication. So it’s trying to minimize the gap between those two things, between the technology and what the rule requires firms to do.

Terry Gerton So instead of all of those hard copy letters that we get from investment firms and those sorts of things, we also get emails, text messages. That’s where the disconnect is happening?

Robert Cruz Yes. It’s the fact that individuals can collaborate and communicate with their customers over a variety of mechanisms. And some of these may be casual. They may not be related to business. And that’s the fundamental problem: SIFMA is looking for the rules to be clarified so they pertain only to the things that matter to the firm, that create value or risk to their business or to the investor.

Terry Gerton And what would those kinds of communications look like?

Robert Cruz I think what they’ll look like is external communication. So, right now the rule doesn’t distinguish between internal — you and I as colleagues talking versus things that pertain to, you know, communications with the public or with a potential investor. So it’s trying to carve out those things that really do relate to the business’s products or services and exclude some of the things that may be more just conversational, as you and I might pass each other in the hallway, we can chat on a chat board someplace. It’s trying to remove those kinds of transitory communications from the record keeping obligations.

Terry Gerton Right. The letter even mentions things like emojis and messages like “I’m running late.”

Robert Cruz Exactly. And you know, the fundamental problem that firms have is the fact that if you say you’re going to be able to use a tool, even if it’s as simple as email, that means that our firm has an obligation to capture it. And when it captures it, it captures everything, everything that is delivered through that communication channel. So that creates some of that problem of, like, somebody left their lunch in the refrigerator, we need to clean it up. It’s trying to remove all of that noise from the things that really do matter to the business.

Terry Gerton Not only does that kind of record keeping impose a cost on the organization, the reporting organization, but it also would create quite a burden on the regulators trying to sort out the meaningful communication in that electronic file cabinet, so to speak.

Robert Cruz Absolutely. Well, the firm clearly has the obligation to sift through all of this data to find the things that matter. If you have a regulatory inquiry, you’ve got to find everything that relates to it. Even if it’s, you know, I talked to an investor and there was an emoji in that conversation. I still need to account for that. So the burden is both on the firm as well as on the regulator to try to parse through these very large sets of data that are very, you know, heterogeneous with a lot of different activities that are captured in today’s tools.

Terry Gerton Relative to the question about the tools, you’ve said that SEC rules should be agnostic to technology. Unpack that for me. What exactly does that mean?

Robert Cruz Sure. This kind of goes back a few years, when there was a revision to Rule 17a-4 from the SEC, which is the fundamental record keeping obligation. It says you need to have complete and accurate records. What they tried to do at that time was remove references to old technologies and spinning disks and things we used to do long ago. And so the objective was to be more independent of technology. Your obligation is your obligation. If it matters to the business, that’s the principle that should govern, not the particular tool that you use. So rules being technology-agnostic means it doesn’t matter whether it’s delivered via email, via text, via emojis, carrier pigeons or anything else. If it matters to the business, it matters to the business.

Terry Gerton How do today’s variety of technologies complicate a business’ compliance requirements?

Robert Cruz The challenge is very complex, period. It’s always going to be with us because there’s always going to be a new way that your client wants to engage. There may be a new tool that you’re not familiar with that they want to interact on. Or you may get pull from your employees internally because they’re familiar with tools from their personal lives. So that encroachment of new tools, it doesn’t go away. It’s always been with us. And so it’s things that we have to anticipate. Again, be agnostic, because there’s going to be something that comes right along behind it that potentially makes an explicit regulation irrelevant from the outset.

Terry Gerton I’m speaking with Robert Cruz. He’s the vice president for regulatory and information governance at Smarsh. All right, let’s follow along with that, because you’ve got a proposal that includes a compliance safe harbor. So along with these compliance questions, what would that change for firms and how does it address the challenges of enforcement?

Robert Cruz Well, it’s an interesting concept because the rules today are meant to be principles-based. They’re not prescriptive. In other words, they don’t tell you, you must do the following. And that’s one of the challenges the industry has is that, what is good enough? What is the SEC specifically looking for? So this is like trying to give people a safe spot to which then you can say, well, SEC, if you really care about, you know, particular areas of these communications, they can tune their programs to do that. So it feels like it’s just giving some latitude so that we can define best practices. We can get a clearer sense of what the regulators are looking for. It’ll guide our governance processes by just having a clearer picture of where enforcement’s going to be focused.

Terry Gerton The regulatory process that would apply here is notoriously slow and complicated. What’s at stake for firms and investors if we don’t get this modernized?

Robert Cruz Well, I think you’re going to continue to see just a lot of individual practices that will vary. Some firms will interpret things differently, and we’ll need to wait for enforcement to determine which is the best way. So, case in point, generative AI: if you’re using these technologies inside of the tools that you currently support, are these going to be considered issues for the SEC or not? We have to wait until we get some interpretation from the regulators to say, yes, we need to have stronger controls around this, or yes, we need to block these tools. You know, you need to make that adjustment based upon the way that the SEC responds to it.

Terry Gerton And what is your sense of how the SEC might respond to this?

Robert Cruz My gut tells me that just given where we are right now, you know, the SEC has a reduction in headcount it’s dealing with. It’s stating its mission very clearly, and its focus is on crypto, on capital formation, on reducing regulatory burden. I just don’t know if this makes the list. So it clearly is being advocated strongly by SIFMA, but whether this makes page one of the SEC priorities list with the 20% reduction in headcount, it really seems like an outside chance that it gets onto their agenda.

Terry Gerton Could it inform some of the other regulation issues that they’re addressing, such as crypto and capital formation?

Robert Cruz Absolutely. And that’s a great comment — the notion of using an unapproved communication tool, it didn’t go away. We may not see the big fines anymore, but I think the regulators are going to be saying if there’s an issue related to crypto, related to investor harm or what have you, if you’re using a tool that is not approved for use, you don’t have the artifact, you don’t have the historical record. They’re not going to view that you know favorably if you’re not able to defend your business. And so it’ll come up in context of other examinations that they’re carrying out. So maybe not a means to an end as it’s been for the last two years, but it will impact their ability to do their jobs ultimately.

The post Outdated SEC communications rules are putting compliance and competitiveness at risk first appeared on Federal News Network.


Risk and Compliance 2025 Exchange: Diligent’s Jason Venner on moving beyond manual cyber compliance

The Pentagon is taking a major step forward in modernizing how it addresses cybersecurity risks.

Defense Department officials have emphasized the need to move beyond “legacy shortcomings” to deliver technology to warfighters more rapidly. In September, DoD announced a new cybersecurity risk management construct to address those challenges.

“The previous Risk Management Framework was overly reliant on static checklists and manual processes that failed to account for operational needs and cyber survivability requirements,” DoD wrote at the time. “These limitations left defense systems vulnerable to sophisticated adversaries and slowed the delivery of secure capabilities to the field.”

Weeding through legacy manual processes

The legacy of manual processes has built up over decades. Jason Venner, a solutions sales director at Diligent, said agencies have traditionally relied on people and paperwork to ensure compliance.

“It’s no one’s fault,” Venner said during Federal News Network’s Risk & Compliance Exchange 2025. “It just sort of evolved that way, and now it’s time to stop and reassess where we’re at. I think the administration is doing a pretty good job in looking at all the different regs that they’re promulgating and revising them.”

Venner said IT leaders are interested in ways to help streamline the governance, risk and compliance process while ensuring security.

“Software should help make my life easier,” he said. “If I’m a CIO or a CISO, it should help make my life easier, and not just for doing security scans or vulnerability scans, but actually doing IT governance, risk and compliance.”

Katie Arrington, who is performing the duties of the DoD chief information officer, has talked about the need to “blow up” the current RMF. The department moved to the framework in 2018 when it transitioned away from the DoD Information Assurance Certification and Accreditation Process (DIACAP).

“I remember when we were going from DIACAP to RMF, I wanted to pull my hair out,” Arrington said earlier this year. “It’s still paper. Who reads it? What we do is a program protection plan. We write it, we put it inside the program. We say, ‘This is what we’ll be looking to protect the program.’ We put it in a file, and we don’t look at it for three years. We have to get away from paperwork. We have to get away from the way we’ve done business to the way we need to do business, and it’s going to be painful, and there are going to be a lot of things that we do, and mistakes will be made. I really hope that industry doesn’t do what industry tends to do, [which] is want to sue the federal government instead of working with us to fix the problems. I would really love that.”

Arrington launched the Software Fast Track initiative to once again tackle the challenge of quickly adopting secure software.

Evolving risk management through better automation, analytics

DoD’s new risk management construct includes a five-phase lifecycle and a set of core principles, including automation, continuous monitoring and DevSecOps.

Arrington talked about the future vision for cyber risk management within DoD earlier this year.

“I’m going to ask you, if you’re a software provider, to provide me your software bill of materials in both your sandbox and production, along with a third-party SBOM. You’re going to populate those artifacts into our Enterprise Mission Assurance Support Service,” she said. “I will have AI tools on the back end to review the data instead of waiting for a human and if all of it passes the right requirements, provisional authority to operate.”
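As an illustration of the kind of automated check Arrington describes, here is a minimal sketch that diffs two simplified SBOM component lists, sandbox versus production, to flag drift before any review. The component names, versions and dictionary format are invented for illustration; real SBOMs use standards such as SPDX or CycloneDX with far more detail.

```python
# Illustrative sketch only: compare two simplified SBOM component lists
# (sandbox vs. production) and report components added, removed, or
# version-changed. All component names and versions are hypothetical.

def component_index(sbom):
    """Map component name -> version for a list of {name, version} dicts."""
    return {c["name"]: c["version"] for c in sbom}

def sbom_drift(sandbox, production):
    """Return components added, removed, or version-changed in production."""
    s, p = component_index(sandbox), component_index(production)
    return {
        "added": sorted(set(p) - set(s)),
        "removed": sorted(set(s) - set(p)),
        "changed": sorted(n for n in set(s) & set(p) if s[n] != p[n]),
    }

sandbox = [
    {"name": "openssl", "version": "3.0.13"},
    {"name": "log4j-core", "version": "2.17.1"},
]
production = [
    {"name": "openssl", "version": "3.0.14"},
    {"name": "requests", "version": "2.32.0"},
]

print(sbom_drift(sandbox, production))
# {'added': ['requests'], 'removed': ['log4j-core'], 'changed': ['openssl']}
```

In the workflow Arrington outlines, a report like this would be one of the artifacts an automated pipeline evaluates against policy before granting a provisional authority to operate.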

Venner said the use of automation and AI rests on a foundation of data analytics. He argued the successful use of AI for risk management will require purpose-built models.

“Can you identify, suggest, benchmark things for me and then identify controls to mitigate these risks, and then let me know what data I need to monitor to ensure those controls are working? That’s where AI can really accelerate the conversation,” Venner said.

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.

The post Risk and Compliance 2025 Exchange: Diligent’s Jason Venner on moving beyond manual cyber compliance first appeared on Federal News Network.

© Federal News Network


Gen AI adoption is reshaping roles and raising tough questions about workforce strategy

 

Interview transcript:

 

Terry Gerton I know you have studied how workers of different skill levels choose to use generative AI and the concept of AI exposure. Can you talk to us a little bit about what you’re finding there? Are there certain roles more likely to embrace AI, or certain roles that are more likely to be replaced?

Ramayya Krishnan AI exposure, to understand that, I think we have to think about how occupations are structured. So the Bureau of Labor Statistics has something, a taxonomy called O*NET. And O*NET describes all the occupations in the U.S. economy, there are 873 or so. And each of those occupations is viewed as consisting of tasks and tasks requiring certain sets of skills. AI exposure is a measure of how many of those tasks are potentially doable by AI. And thereby that becomes, then, a measure of ways in which AI could have an impact on people who are in that particular occupation. So, however, AI exposure should not be assumed to mean that that’s tantamount to AI substitution, because I think we should be thinking about how AI is deployed. And so there are capabilities that AI has. For instance, this conversation that we’re having could be automatically transcribed by AI. This conversation we are having could be automatically translated from English to Spanish by AI, for instance. Those are capabilities, right? So when you take capabilities and actually deploy them in organizational contexts, the question of how it’s deployed will determine whether AI is going to augment the human worker, or is it going to automate and replace a particular task that a human worker does? Remember, this happens at the task level, not at the occupation level. So some tasks within an occupation may get modified or adapted. So if you look at how software developers today use co-pilots to build software, that’s augmentation, where it’s been demonstrated that software developers with lower skills usually get between 20% to 25% productivity improvement. Call center employees, again, a similar type of augmentation is happening. In other cases, you could imagine, for instance, if you were my physician and I was speaking to you, today we have things called ambient AIs that will automatically transcribe the conversation that I’m having with you, the physician. 
That’s an example of an AI that could potentially substitute for a human transcriber. So I gave you two examples: software developer and customer service where you’re seeing augmentation; the transcription task, I’m giving you an example of substitution. So depending on how AI is deployed, you might have some tasks being augmented, some being substituted. When you take a step back, you have to take AI exposure as a measure of capability and then ask the question, how does that then get deployed? Which then has impact on how workers are going to actually have to think about, what does this then mean for them? And if it’s complementing, how do they become fluent in AI and be able to use AI well? And if there’s a particular task where it’s being used in a substitutive manner, what does that then mean longer term for them, in terms of having to acquire new skills to maybe transition to other occupations where there might be even more demand? So I think we have to unpack what AI exposure then means for workers by thinking about augmentation versus automation.
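Krishnan's task-level framing can be sketched in a few lines of code. This is a toy illustration only: the occupation, its tasks and the "doable by AI" flags are invented, while real analyses use the O*NET task database he mentions.

```python
# Toy sketch of task-level AI exposure. All task data is invented.
from typing import NamedTuple

class Task(NamedTuple):
    name: str
    ai_capable: bool   # could current AI perform this task?
    deployed_as: str   # "augment", "automate", or "none"

def exposure(tasks: list[Task]) -> float:
    """Share of an occupation's tasks that AI could potentially perform."""
    return sum(t.ai_capable for t in tasks) / len(tasks)

software_dev = [
    Task("write boilerplate code", True, "augment"),
    Task("transcribe meeting notes", True, "automate"),
    Task("negotiate requirements with users", False, "none"),
    Task("review security design", False, "none"),
]

print(f"exposure: {exposure(software_dev):.0%}")  # exposure: 50%
# Exposure is not substitution: only some exposed tasks get automated.
print([t.name for t in software_dev if t.deployed_as == "automate"])
```

The distinction in the code mirrors the interview: `exposure` measures capability across tasks, while `deployed_as` captures the separate deployment decision that determines augmentation versus substitution.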

Terry Gerton There’s a lot of nuance in that. And your writings also make the point that Gen AI adoption narrows when the cost of failure is high. So how do organizations think both about augmentation versus replacement and the risk of failure as they deploy AI?

Ramayya Krishnan If you take the example of using AI in an automated fashion, its error rate has to be so low because you don’t have human oversight. And therefore, if the error rates are not sufficiently appropriate, then you need to pair the human with the AI. In some cases you might say the AI is just not ready. So we’re not going to use the AI at all. We’ll just keep human as is. In other cases, if AI can be used with the human, where there are benefits to productivity but the error rates are such you still need the human to ensure and sign off, either because the error rates are high or from an ethical standpoint or from a governance standpoint, you need the human in the loop to sign off, you’re going to see complementing the human with the AI. And then there are going to be tasks for which the AI quality is so high, that its error rates are so low, that you could actually deploy it. So when we talk about the cost of failure, you want to think about consequential tasks where failure is not an option. And so either the error rates have to be really low, and therefore I can deploy the AI in an automated fashion, or you have to ensure there is a human in the loop. And this is why I think AI measurement and evaluation prior to deployment is so essential because things like error rates, costs, all of these have to be measured and inform the decisions to deploy AI and deploy AI in what fashion? Is it in augmentation fashion or not, or is it going to be used independently?
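The decision logic Krishnan lays out — automate when error rates are very low, keep a human in the loop when they are moderate or the task is consequential, hold off entirely when the AI is not ready — can be written as a simple rule. The thresholds below are invented placeholders; real deployments would calibrate them per task through the measurement and evaluation he describes.

```python
# Toy deployment rule for the augment/automate/hold-off decision.
# Thresholds are illustrative assumptions, not recommendations.
def deployment_mode(error_rate: float, high_consequence: bool) -> str:
    """Return how AI should be used for a task, given its measured error rate."""
    if error_rate <= 0.001 and not high_consequence:
        return "automate"    # error rate low enough to run unattended
    if error_rate <= 0.05 or high_consequence:
        return "augment"     # keep a human in the loop to sign off
    return "human-only"      # AI not ready for this task

print(deployment_mode(0.0005, high_consequence=False))  # automate
print(deployment_mode(0.02, high_consequence=True))     # augment
print(deployment_mode(0.20, high_consequence=False))    # human-only
```

Note that a high-consequence task never reaches "automate" regardless of error rate, matching the point that consequential tasks require a human in the loop.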

Terry Gerton I’m speaking with Dr. Ramayya Krishnan. He’s the director of the Center for AI Measurement Science and Engineering at Carnegie Mellon University. So we’re talking there about how AI gets deployed in different organizations. How do you see this applying in the public sector? Are there certain kinds of government work where AI is more suitable for augmentation versus automation and that error rate then becomes a really important consideration?

Ramayya Krishnan I think there are going to be a number of opportunities for AI to be deployed. So you remember we talked about call centers and customer service types of centers. I mean, public sector, one aspect of what they do is they engage with citizens in a variety of ways, where they have to deliver and provide good information. Some of those are time sensitive and very consequential, like 911 emergency calls. Now, there you absolutely want the human in the loop because we want to make sure that those are dealt with in a way that we believe we need humans in the loop, which could be augmented by AI, but you know, you want humans in the loop. On the other hand, you could imagine questions about, you know, what kind of permit or what kind of form, you know, administrative kinds of questions, where there’s triage, if you will, of having better response time to those kinds of questions. The alternative to calling and speaking to somebody might be just like you could go to a website and look it up. Imagine a question-answering system that actually allows for you to ask and get these questions answered. I expect that, and in fact you’re already seeing this in local government and in state government, the deployment of these kinds of administrative kinds of question-answering systems. I’d say that’s one example. Within the organizations, there is the use of AI, not customer-facing or citizen-facing, but within the organizations, the use of these kinds of co-pilots that are being used within the organization to try and improve productivity. I think as AI gets more robust and more reliable, I expect that you will see greater use of AI in both trying to improve efficiency and effectiveness, but to do so in a responsible way, in such a way that you take into account the importance of providing service to citizens of all different abilities. 
One of the important things with the public sector is … maybe there’s multilingual support that is needed, you might need to help citizens who are disabled. How might we support different kinds of citizens with different ability levels? I think these are things where AI could potentially play an important role.

Terry Gerton AI is certainly already having a disruptive impact on the American workforce, particularly. What recommendations do you have for policymakers and employers to mitigate the disruption and think long-term about upskilling and reskilling so that folks can be successful in this new space?

Ramayya Krishnan I think this is actually one of the most important questions that we need to address. And you know, I served on the National AI Advisory Committee to the President and the White House Office of AI Initiatives, and this was very much a key question that was addressed by colleagues. And I think a recent op-ed that we have written with Patrick Harker at the University of Pennsylvania and Mark Hagerott at the University of South Dakota, really we make the case that this is an inflection point which requires a response pretty much on the scale of what President Lincoln did in 1862 with the Morrill Act in establishing land grant universities. Much like land grant universities were designed to democratize access to agricultural technology, really it enabled Americans from everywhere in the nation to harness this technology for economic prosperity both for themselves and for the nation. I think if you’re going to see AI be deployed and not have the kind of inequality that might arise from people having access to the technology and not having access to the technology, we need something like this. And we call this the Digital Land Grant Initiative that would connect our universities, the community colleges, with various ways of providing citizens, both in rural areas and urban areas, everywhere in the country, access to AI education and skilling appropriate to their context. So if I’m a farmer, how can I do precision agriculture? If I’m a mine worker, or if I’m somebody who wants to work in banking — from the whole range of occupations and professions, you could imagine AI having a transformative effect on these different occupations. And there may be new occupations that are going to emerge that you and I are not thinking about right now. So, how do we best position our citizens so that they can equip themselves with the right sets of skills that are going to be required and demanded? 
I think that’s the big public policy question with regard to workforce upskilling and reskilling.

The post Gen AI adoption is reshaping roles and raising tough questions about workforce strategy first appeared on Federal News Network.

© Getty Images/iStockphoto/ipopba


House lawmakers to try again to extend TMF through NDAA

The Technology Modernization Fund is running out of time. In 10 days, the reauthorization will expire for the 8-year-old governmentwide account to help agencies update IT systems.

If Congress doesn’t act before Dec. 12, the TMF will not be able to make any new investments, freezing more than $150 million.

“The Technology Modernization Fund remains one of the federal government’s most effective tools for rapidly strengthening cybersecurity and improving high-impact systems. Reauthorizing the TMF is essential to ensuring stable, flexible funding that helps agencies deliver secure, modern services for the American people,” said a GSA spokesperson in an email to Federal News Network. “We look forward to working with Congress on the reauthorization effort.”

There is support in the House for reauthorizing the TMF. Rep. Nancy Mace (R-S.C.) and former Congressman Gerry Connolly (D-Va.) introduced the Modernizing Government Technology (MGT) Reform Act in April that included an extension of the fund to Dec. 31, 2031.

The bill hasn’t moved out of the House Oversight and Government Reform Committee and there is no Senate companion.

The House did pass a version of this bill in May 2024, but, again, the Senate never moved on the bill.

The Senate, however, did allocate $5 million for the TMF in its version of the fiscal 2026 Financial Services and General Government appropriations bill, released last week. This comes after Congress zeroed out new funding for the program over the last three years. The House version of the FSGG bill didn’t include any new money for the TMF.

Mace tried to include her TMF bill as a provision in the House’s version of the National Defense Authorization bill, but language didn’t make it in the version passed by the lower chamber. The Senate version of the NDAA also didn’t include the TMF extension, but there is still hope to get it in during the upcoming conference committee negotiations.

“Extending and reauthorizing the Technology Modernization Fund, which expires on Dec. 12, is a high priority for the committee and we have requested in a bipartisan manner that it be included in the final Fiscal Year 2026 National Defense Authorization Act,” said an Oversight and Government Reform Committee spokesperson. “This is a shared policy priority with the administration and the Office of Management and Budget. Extending the fund also has broad industry support, specifically the Committee has support letters from the Information Technology Industry Council (ITI), the Center for Procurement Advocacy (CPA), the Professional Services Council (PSC) and the Alliance for Digital Innovation (ADI).”

TMF: 69 investments, $1 billion

ADI wrote lawmakers a letter on Nov. 24 advocating for the TMF extension.

“To date, the TMF has catalyzed transformation across government, from strengthening cybersecurity defenses to improving citizen-facing digital services. By providing flexible capital through a merit-based process overseen by federal technology leaders, the Fund enables agencies to undertake complex modernization initiatives that would otherwise remain trapped in multi-year budget cycles. This structure ensures accountability while giving agencies the agility to respond to rapidly evolving technology landscapes and emerging threats,” the industry association said in its letter to House and Senate leadership. “The MGT Reform Act provides the right framework for the TMF’s next chapter. By extending authorization for seven years, Congress would provide agencies the long-term certainty needed to plan and execute substantial and transformational modernization programs. The legislation’s transparency provisions, including the establishment of a federal legacy IT inventory, will give policymakers greater visibility into modernization progress and priorities. These reforms strengthen oversight while preserving the operational flexibility that makes the TMF effective.”

GSA says in its fiscal 2026 budget justification that the TMF currently manages more than $1.07 billion worth of systems upgrades and modernization projects totaling 69 investments across 34 federal agencies. The TMF board has received and reviewed more than 290 proposals totaling about $4.5 billion in funding demand.

The TMF board made only one new investment in calendar year 2025. It awarded $14.6 million to the Federal Trade Commission in June to develop a cloud-based analytics platform that uses artificial intelligence tools and to train staff to handle data analysis in-house.

GSA says it had more than $231 million in available funding for 2025 and it expected to have more than $158 million for the TMF in 2026.

“The government needs updated technology, and those updates need to be done efficiently. I’m proud to co-sponsor the bipartisan Modernizing Government Technology Reform Act introduced by Cybersecurity Subcommittee Chairwoman Mace,” said Rep. Shontel Brown (D-Ohio), ranking member of the Cybersecurity, IT and Government Innovation subcommittee, in an email to Federal News Network. “The best course of action would be the Oversight Committee and Congress advancing this legislation before the authorization ends.”

Technical debt would increase faster

Former federal technology executives say letting the TMF expire would set back agency modernization efforts.

Larry Bafundo, the former executive director of the TMF program office, said without the TMF, agencies will have a more difficult time finding funding to modernize legacy systems.

“We spend a vast majority of our funding on maintaining existing and outdated systems instead of adapting systems to meet changing needs. I think something is broken in the way we fund modernization of IT systems. Congress is incentivized to think in terms of projects instead of services that evolve over time. There is a huge disconnect between how the government works and how IT projects are funded,” said Bafundo, who is now president of Mo Studio, a digital services company. “There isn’t a clear, governmentwide IT modernization strategy, with a clear inventory of systems, to align programs like TMF against. As a result, we approach the problem piecemeal, rather than as part of a deliberate, or coordinated, plan. Similarly, agencies can sometimes lack incentives to modernize effectively. In many cases, they not only lack performance baselines to measure change against, but there are also very few senior executives in government today who are evaluated based on the value of the services they provide the public. Instead, they are incentivized to preserve the status quo. All of this makes showing ‘return on investment’ difficult, along with the fact that Congress is not united in its understanding of what the return on investment looks like — is it cheaper, more secure, faster, etc.? We don’t have a common definition for success when it comes to programs like TMF.”

Bafundo said the TMF works because it provides agencies with guardrails or characteristics for the types of projects the board would invest in.

“We relied on good ideas or good proposals and someone who could defend their ideas, as opposed to a set of focal areas and show us what you can do with seed funding. You can use that experience to unlock further funding,” he said. “That is how it should work instead of a 3-to-5 year plan that many programs have. In some ways the TMF, because it relies on lengthy proposals instead of working software, is more like a grant program than a seed fund.”

Gundeep Ahluwalia, a former Labor Department chief information officer, helped the agency win TMF funding for six different projects between 2018 and 2024.

Ahluwalia, who is now an executive vice president and chief innovation officer for NuAxis Innovations, said the TMF helped Labor pay down its technical debt.

“Whether it’s improving services to Americans or protecting against foreign adversaries, the cost of not doing anything here is just too large, especially considering the investment is paltry,” he said. “The TMF used an approach very similar to the private sector where you would make your business case, tell the board how much the company would get back from the investment. This business case is a no-brainer. For $500 million or even $250 million, it could give agencies the opportunity to improve services, reduce risks and become cyber strong.”

OMB seeks change to TMF

It’s unclear why support on Capitol Hill has been tepid at best for the TMF.

Ahluwalia said lawmakers still have trouble understanding why something like the TMF is needed and there isn’t an outspoken supporter like Connolly, who passed away in May, was for IT modernization funding.

“If you don’t understand something and there is a significant resistance to spending, this becomes yet another government program. But this isn’t just another one, the TMF is a way out of our technical debt conundrums. It’s modeled after the private sector, and I think people may not understand that,” he said.

OMB, which didn’t respond to two requests for comments on the TMF expiring, proposed through GSA’s 2026 budget request a new funding model for the program. The White House wants to make it a revolving or working capital fund of sorts that would be authorized to collect up to $100 million a year in otherwise expired funding.

The legislative proposal would let “GSA, with the approval of OMB, to collect funding from other agencies and bring that funding into the TMF,” GSA wrote in its budget justification document. “This would allow agencies to transfer resources to the TMF using funds that are otherwise no longer available to them for obligation. This provision is essential to providing the TMF with the necessary funds to help the federal government address critical technology challenges by modernizing high-priority systems, improving AI adoption and supporting cross-government collaboration and scalable services.”

If the TMF authority expires, GSA would still be able to support existing investments with already approved funding and other program support services.

The post House lawmakers to try again to extend TMF through NDAA first appeared on Federal News Network.



How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress

Federal agencies face an ever-evolving threat landscape, with cyberattacks escalating in both frequency and sophistication. To keep pace, advancing digital modernization isn’t just an aspiration; it’s a necessity. Central to this effort is the Trusted Internet Connections (TIC) 3.0 initiative, which offers agencies a transformative approach to secure and modernize their IT infrastructure.

TIC 3.0 empowers agencies with the flexibility to securely access applications, data and the internet, providing them with the tools they need to enhance their cyber posture and meet the evolving security guidance from the Office of Management and Budget and the Cybersecurity and Infrastructure Security Agency. Yet, despite these advantages, many agencies are still operating under the outdated TIC 2.0 model, which creates persistent security gaps, slows user experience, and drives higher operating costs, ultimately hindering progress toward today’s modernization and adaptive security goals.

Why agencies must move beyond TIC 2.0

TIC 2.0, introduced over a decade ago, aimed to consolidate federal agencies’ internet connections through a limited number of TIC access points. These access points were equipped with legacy, inflexible and costly perimeter defenses, including firewalls, web proxies, traffic inspection tools and intrusion detection systems, designed to keep threats out. While effective for their time, these static controls weren’t designed for today’s cloud-first, mobile workforce. Often referred to as a “castle and moat” architecture, this perimeter-based security model was effective when TIC 2.0 first came out, but is now outdated and insufficient against today’s dynamic threat landscape.

Recognizing these limitations, OMB introduced TIC 3.0 in 2019 to better support the cybersecurity needs of a mobile, cloud-connected workforce. TIC 3.0 facilitates agencies’ transition from traditional perimeter-based solutions, such as Managed Trusted Internet Protocol Service (MTIPS) and legacy VPNs, to modern Secure Access Service Edge (SASE) and Security Service Edge (SSE) frameworks. This new model brings security closer to the user and the data, improving performance, scalability and visibility across hybrid environments.

The inefficiencies of TIC 2.0

In addition to the inefficiencies of a “castle and moat” architecture, TIC 2.0 presents significant trade-offs for agencies operating in hybrid and multi-cloud environments:

  • Latency for end users: TIC 2.0 moves data to where the security is located, rather than positioning security closer to where the data resides. This slows performance, hampers visibility, and frustrates end users.
  • Legacy system challenges: Outdated hardware and rigid network paths prevent IT teams from managing access dynamically. While modern technologies deliver richer visibility and stronger data protection, legacy architectures hold agencies back from adopting them at scale.
  • Outages and disruptions: Past TIC iterations often struggle to integrate cloud services with modern security tools. This can create bottlenecks and downtime that disrupt operations and delay modernization efforts.

TIC 3.0 was designed specifically to overcome these challenges, offering a more flexible, distributed framework that aligns with modern security and mission requirements.

“TIC tax” on agencies — and users

TIC 2.0 also results in higher operational and performance costs. Since TIC 2.0 relies on traditional perimeter-based solutions — such as legacy VPNs, expensive private circuits and inflexible, vulnerable firewall stacks — agencies often face additional investments to maintain these outdated systems, a burden commonly referred to as the “TIC Tax.”

But the TIC Tax isn’t just financial. It also shows up in hidden costs to the end user. Under TIC 2.0, network traffic must be routed through a small number of approved TIC Access Points, most of which are concentrated around Washington, D.C. As a result, a user on the West Coast or at an embassy overseas may find their traffic backhauled thousands of miles before reaching its destination.

In an era where modern applications are measured in milliseconds, those delays translate into lost productivity, degraded user experience, and architectural inefficiency. What many users don’t realize is that a single web session isn’t just one exchange; it’s often thousands of tiny connections constantly flowing between the user’s device and the application server. Each of those interactions takes time, and when traffic must travel back and forth across the country — or around the world — the cumulative delay becomes a real, felt cost for the end user.
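The cumulative cost of backhauling described above is easy to quantify in rough terms: a small per-round-trip delay, multiplied by the thousands of exchanges in a session, becomes seconds of felt wait time. The numbers below are illustrative assumptions, not measurements of any agency's network.

```python
# Hypothetical illustration of cumulative backhaul latency. All numbers
# are assumptions for illustration, not measurements.
DIRECT_RTT_MS = 20      # assumed round-trip time, user to a nearby cloud edge
BACKHAUL_RTT_MS = 90    # assumed round-trip time via a distant TIC access point
ROUND_TRIPS = 1500      # assumed round trips in one busy web session

def session_delay_seconds(rtt_ms: float, round_trips: int) -> float:
    """Total network wait time for a session, in seconds."""
    return rtt_ms * round_trips / 1000.0

direct = session_delay_seconds(DIRECT_RTT_MS, ROUND_TRIPS)
backhauled = session_delay_seconds(BACKHAUL_RTT_MS, ROUND_TRIPS)
print(f"direct: {direct:.1f}s, backhauled: {backhauled:.1f}s, "
      f"extra wait: {backhauled - direct:.1f}s")
```

Even with these modest assumed figures, the detour adds well over a minute of cumulative wait to a single session, which is the "felt cost" the article describes.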

Every detour adds friction, not only for users trying to access applications, but also for security teams struggling to manage complex routing paths that no longer align with how distributed work and cloud-based systems operate. That’s why OMB, CISA and the General Services Administration have worked together under TIC 3.0 to modernize connectivity, eliminating the need for backhauling and enabling secure, direct-to-cloud options that prioritize both performance and protection.

For example, agencies adopting TIC 3.0 can leverage broadband internet services (BIS), a lower-cost, more flexible transport option that connects users directly to agency networks and cloud services through software-defined wide area network (SD-WAN) and SASE solutions.

With BIS, agencies are no longer constrained to rely on costly, fixed point-to-point or MPLS circuits to connect branch offices, data centers, headquarters and cloud environments. Instead, they can securely leverage commercial internet services to simplify connectivity, improve resiliency, and accelerate access to applications. This approach not only reduces operational expenses but also minimizes latency, supports zero trust principles, and enables agencies to build a safe, flexible and repeatable solution that meets TIC security objectives without taxing the user experience.

How TIC 2.0 hinders zero trust progress

Another inefficiency — and perhaps one of the most significant — of TIC 2.0 is its incompatibility with zero trust principles. As federal leaders move into the next phase of zero trust, focused on efficiency, automation and rationalizing cyber investments, TIC 2.0’s limitations are even more apparent.

Under TIC 2.0’s “castle and moat” model, all traffic, whether for email, web services or domain name systems, must be routed through a small number of geographically constrained access points. TIC 3.0, in contrast, adopts a decentralized model that leverages SASE and SSE platforms to enforce policy closer to the user and data source, improving both security and performance.

To visualize the difference, think of entering a baseball stadium. Under TIC 2.0’s “castle and moat” approach, once you show your ticket at the entrance, you can move freely throughout the stadium. TIC 3.0’s decentralized approach still checks your ticket, but ushers and staff ensure you stay in the right section, verifying continuously rather than once.

At its core, TIC 3.0 is about moving trust decisions closer to the resource. Unlike TIC 2.0, where data must travel to centralized security stacks, TIC 3.0 brings enforcement to the edge, closer to where users, devices and workloads actually reside. This aligns directly with zero trust principles of continuous verification, least privilege access and minimized attack surface.
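The stadium analogy maps cleanly onto code: a perimeter model checks credentials once at the gate, while a zero trust model re-evaluates user, device and resource on every request. The policy fields and rules below are invented for illustration and do not represent any particular SASE or SSE product.

```python
# Minimal sketch contrasting a one-time perimeter check (TIC 2.0 style)
# with per-request policy evaluation (TIC 3.0 / zero trust style).
# All identities, devices and resources are invented.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_compliant: bool
    resource: str

# Least-privilege policy: which (user, resource) pairs are permitted.
ALLOWED = {("alice", "hr-portal"), ("alice", "email")}

def perimeter_check(user_authenticated: bool) -> bool:
    # TIC 2.0 analogy: show your ticket once at the gate, then roam freely.
    return user_authenticated

def zero_trust_check(req: Request) -> bool:
    # TIC 3.0 analogy: every request re-verifies user, device and resource.
    return req.device_compliant and (req.user, req.resource) in ALLOWED

print(zero_trust_check(Request("alice", True, "email")))       # allowed
print(zero_trust_check(Request("alice", False, "email")))      # device fails
print(zero_trust_check(Request("alice", True, "payroll-db")))  # not permitted
```

The contrast is the point: `perimeter_check` runs once per entry, while `zero_trust_check` runs per request and can deny access mid-session the moment device posture or entitlements change.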

How TIC 3.0 addresses TIC 2.0 inefficiencies

By decentralizing security and embracing SASE-based architectures, TIC 3.0 reduces latency, increases efficiency and enables agencies to apply modern cybersecurity practices more effectively. It gives system owners better visibility and control over network operations while allowing IT teams to manage threats in real time. The result is smoother, faster and more resilient user experiences.

With TIC 3.0, agencies can finally break free from the limitations of earlier TIC iterations. This modern framework not only resolves past inefficiencies, it creates a scalable, cloud-first foundation that evolves with emerging threats and technologies. TIC 3.0 supports zero trust priorities around integration, efficiency and rationalized investment, helping agencies shift from maintaining legacy infrastructure to enabling secure digital transformation.

Federal IT modernization isn’t just about replacing technology; it’s about redefining trust, performance and resilience for a cloud-first world. TIC 3.0 provides the framework, but true transformation comes from operationalizing that framework through platforms that are global, scalable, and adaptive to mission needs.

By extending security to where users and data truly live — at the edge — agencies can modernize without compromise: improving performance while advancing zero trust maturity. In that vision, TIC 3.0 isn’t simply an evolution of policy; it’s the foundation for how the federal enterprise securely connects to the future.

Sean Connelly is executive director for global zero trust strategy and policy at Zscaler and former zero trust initiative director and TIC program manager at CISA.

The post How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress first appeared on Federal News Network.

© Getty Images/iStockphoto/go-un lee


Risk & Compliance Exchange 2025: Former DOJ lawyer Sara McLean on ensuring cyber compliance under the False Claims Act

Since January 2025, the Justice Department has been aggressively holding federal contractors accountable for cybersecurity violations under the False Claims Act.

Over the last 11 months, the Trump administration has announced six settlements out of the 14 since the initiative began in 2021.

Sara McLean, a former assistant director of the DOJ Commercial Litigation Branch’s Fraud Section and now a partner with Akin, said the Trump administration has made a much more significant push to hold companies, especially those that work for the Defense Department, accountable for meeting the cyber provisions of their contracts.


“I think there are going to be a lot more of these announcements. There’s been a huge uptick just since the beginning of the administration. That is just absolutely going to continue,” McLean said during Federal News Network’s Risk & Compliance Exchange 2025.

“The cases take a long time. The investigations are complex. They take time to develop. So I think there are going to be many, many, many more announcements, and there’s a lot of support for them. Cyber enforcement is now embedded in what the Justice Department does every day. It’s described as the bread and butter by leadership.”

A range of high-profile cases

A few of the high-profile cases this year so far include an $875,000 settlement with Georgia Tech Research Corp. in September and a $1.75 million settlement in August with Aero Turbine Inc. (ATI), an aerospace maintenance provider, and Gallant Capital Partners, a private equity firm that owned a controlling stake in ATI during the time period covered by the settlement.

McLean, who wouldn’t comment on any one specific case, said in most instances, False Claims Act allegations focus on reckless disregard for the rules, not simple mistakes.

“We’ve seen in some of the more recent announcements new types of fact patterns. What happens is when announcements are made that DOJ has pursued a matter and has resolved a matter, that often leads to the qui tam relators and their attorneys finding more matters like that and filing them,” said McLean, who left federal service in October after almost 27 years. “It’ll be interesting to see if these newer fact patterns yield more cases that are similar.”

Recent cases that involve the security of medical devices or the qualifications of cyber workers performing on government contracts are two newer fact patterns that have emerged over the last year or so.

Launched in 2021, the Justice Department’s Civil Cyber-Fraud Initiative uses the False Claims Act to ensure contractors and grantees meet the government’s cybersecurity requirements.

President Joe Biden signed an executive order in May 2021 that directed all agencies to improve “efforts to identify, deter, protect against, detect and respond to” malicious cyberthreats.

130 DOJ lawyers focused on cyber

Justice conducted a 360-degree review of cyber matters and related efforts, and one idea that emerged was using the False Claims Act to hold contractors and grantees accountable and drive a change in behavior.

“The motivation was largely to improve cybersecurity and also to protect sensitive information, personal information, national security information, and to ensure a level playing field, so that you didn’t have some folks who were meeting the requirements and others who were not,” McLean said.

“It was to ensure that incidents were being reported to the extent the False Claims Act could be used around that particular issue. Because the thought was that would enable the government to respond to cybersecurity problems and that still is really the impetus now behind the enforcement.”

McLean said the Civil Cyber-Fraud Initiative is now embedded as part of the DOJ’s broader False Claims Act practice. It has about 130 lawyers, who work with U.S. attorney’s offices as well as agency inspector general offices.

Typically, an IG begins an investigation either based on a qui tam or whistleblower filing, or a more traditional review of contracts and grants.

The IG will assign agents and DOJ lawyers will join as part of the investigative team.

McLean said the agents are on the ground, interviewing witnesses and applying all the resources that come from the IGs. DOJ then decides, based on the information the IGs bring back, to either take some sort of action, such as intervening in a qui tam lawsuit and taking it over, or to decline or settle with a company.

“They go back to the agency for a recommendation on how to proceed. So it’s really the agencies and DOJ who are really in lockstep in these matters,” she said. “DOJ is making the decision, but it’s based on the recommendation of the agencies and with the total support of the agencies.”

Many times, Justice decides to intervene in a case or seek a settlement depending on whether the company in question has demonstrated reckless disregard for federal cyber rules and regulations.

McLean said a violation of the False Claims Act requires only reckless disregard, not intentional fraud.

“It’s critically important for anyone doing business with the government, especially those who are signing a contract and agreeing to do something, to make sure that they understand what that is, especially in the cybersecurity area,” she said. “What they’ve signed on to can be quite complicated. It can be legally complicated. It can be technically complicated. But signing on the dotted line without that understanding is just a recipe for getting into trouble.”

When a whistleblower files a qui tam lawsuit, McLean said that ratchets up the entire investigation. A whistleblower can be entitled to up to 30% of the government’s recovery, whether through a decision or a settlement.
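To make those percentages concrete: under the False Claims Act (31 U.S.C. § 3730(d)), a relator’s share generally runs 15% to 25% of the recovery when the government intervenes, and 25% to 30% when the relator litigates alone. A minimal sketch of that arithmetic, using a hypothetical settlement amount not drawn from any case discussed here:

```python
def relator_share_range(recovery, intervened=True):
    """Statutory qui tam relator share range under the False Claims Act.

    When the government intervenes, the relator is generally entitled to
    15-25% of the recovery; when the government declines and the relator
    proceeds alone, the range is 25-30% (31 U.S.C. 3730(d)).
    """
    lo, hi = (0.15, 0.25) if intervened else (0.25, 0.30)
    return recovery * lo, recovery * hi

# Hypothetical $1.75 million settlement where the government intervened:
low, high = relator_share_range(1_750_000, intervened=True)
print(f"${low:,.0f} to ${high:,.0f}")  # → $262,500 to $437,500
```

On that hypothetical intervened settlement, the statutory share works out to roughly $262,500 to $437,500 — a strong financial incentive for whistleblowers to come forward.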

Self-disclosures encouraged

If a company doesn’t understand the requirements and doesn’t put any resources into trying to understand and comply with them, that can lead to a charge of reckless disregard.

“When it comes to employee qualifications, it’s the same thing. If a contract says that there needs to be this level of education or there needs to be this level of experience, that is what needs to be provided. Or a company can get into trouble,” McLean said.

“The False Claims Act applies to making false claims and causing false claims. It’s not just the company that’s actually directly doing business with the government that needs to worry about the risk of False Claims Act liability, because a company that’s downstream, like a subcontractor who’s not submitting the claims to the government, could be found liable for causing a false claim, or, say, an assessor could be found liable for causing a false claim, or a private equity company could be found liable for causing a false claim. There are individuals who can be found liable for causing and submitting false claims.”

She added that False Claims Act allegations can apply not only to just the one company that has the direct relationship with the government but also to their partners if they are not making a good faith effort to comply.

But when the issue is a simple mistake, such as an overpayment, the company can usually take responsibility and address the problem quickly.

“DOJ has policies of giving credit in False Claims Act settlements for self-disclosure, cooperation and remediation. That is definitely something that is available and that companies have been definitely taking advantage of in this space,” McLean said. “DOJ understands that there’s more focus on cybersecurity than there used to be, and so there are companies that maybe didn’t attend to this as much as they now wish they had in the past. The companies discover that they’ve got some kind of a problem and want to fix it going forward, but then also figure out, ‘How do I make it right and in the past?’ ”

McLean said this is why vendors need to pay close attention to how they comply with the DoD’s new Cybersecurity Maturity Model Certification.

She said when vendors sign certifications that they are complying with CMMC standards without fully understanding what that means, that could be considered deliberate ignorance.

“Some courts have described it as gross negligence. Negligence would be a mistake. I don’t know if that helps for the nonlawyers, but corporations which do not inform themselves about the requirements or not taking the steps that are necessary, even if it’s not through necessarily ill intent, but it’s not what the government bargained for, and it’s not just an accident. It’s a little bit more than that, quite a bit more than that,” she said.

“The one thing that’s important about that development is it does involve more robust certifications, and that is something that can be a factor in a case being a False Claims Act case and being more or less likely to be one that the government would take over. Because signing a certification when the information is not true starts to look like a lie, which starts to look like the more intentional type of fraud … rather than a mistake. It looks reckless to be signing certifications without doing this review to know that the information that’s in there is right.”

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.

The post Risk & Compliance Exchange 2025: Former DOJ lawyer Sara McLean on ensuring cyber compliance under the False Claims Act first appeared on Federal News Network.

© Federal News Network


Risk & Compliance Exchange: Cyber AB’s Matt Travis on scaling the CMMC ecosystem

The Cybersecurity Maturity Model Certification program is officially off the ground.

CMMC is the Pentagon’s program to evaluate whether defense contractors are following requirements for protecting controlled unclassified information. The cybersecurity requirements, based on National Institute of Standards and Technology controls, have been in Defense Department contracts since 2016.

It took years for CMMC to become a reality. But the final rule to implement CMMC into contractual requirements took effect Nov. 10. The rule establishing CMMC as a program had already gone into effect last year.

DoD has a phased implementation plan for the program. During Phase 1, over the next year, the department will largely require CMMC self-assessments from contractors. But DoD programs have the discretion to require Level 2 CMMC third-party assessments as needed.

Tackling third-party CMMC assessments

During Phase 2, starting next November, those third-party assessments will become standard in applicable contracts.

Those third-party assessments are a key facet of the CMMC program and its goal to ensure defense contractors follow cybersecurity requirements.

The Cyber Accreditation Body is responsible for authorizing the CMMC third-party assessment organizations (C3PAOs) that will carry out those independent assessments. And Matthew Travis, CEO of The Cyber AB, said work is well underway to build out the scaffolding that will support the CMMC program.

“If there’s any remaining skepticism of whether or not the department was serious about this conformity regime, you can now just look at the Code of Federal Regulations and see both rules there,” Travis said during Federal News Network’s Risk & Compliance Exchange 2025. “Now, the real challenge is to scale the ecosystem.”

‘Impending bow wave’

So far, just under 500 defense contractors have voluntarily achieved a Level 2 CMMC certification, Travis shared.

But the Pentagon has estimated that the requirement for a Level 2 third-party assessment could apply to as many as 80,000 companies as CMMC is phased in.

“I am concerned about the impending bow wave that I think we’ll see in demand,” Travis said.

Some C3PAOs already have a backlog of assessments that stretch into next year.

“Now is the time to move if you’re ready,” Travis added. “People are going to start racing to the checkout line, and it’s going to be a wait. So move now if you’re ready, and if you’re not ready, get ready, because the sooner you do it, the sooner you’ll be able to get a slot.”

Among the voluntary Level 2 assessments that have occurred to date, Travis said “false starts” have been an issue for some organizations.

“We heard frequently from the C3PAOs that they had to call it off mutually once the organization seeking certification realized all the things that they hadn’t fully done,” Travis said. “And the C3PAO said, ‘We might want to pause here. Go back to work and call us when you’re ready.’ ”

Travis said the 110 security requirements under Level 2 go beyond technical controls.

“It does require an organizational commitment,” he said. “There are physical security requirements, there are training requirements that human resources has to be involved in. There are leadership requirements in terms of resourcing.”

Another key lesson gleaned from early assessments is the need for companies to understand their external service providers. Travis said most organizations rely on cloud service providers or managed service providers for many IT and cybersecurity needs.

But whether they’re a CSP or an MSP — and to what extent they are involved in an organization’s handling of controlled unclassified information — are crucial questions in a CMMC assessment.

“Knowing who’s helping you and knowing your organization is fully committed are probably the two biggest takeaways that we’re hearing from industry,” Travis said.

CMMC’s ‘long pole in the tent’

The Cyber AB, through its no-cost contract with the Pentagon, is responsible for authorizing C3PAOs and certifying the people who conduct CMMC assessments.

Travis said there are just under 600 certified CMMC assessors today. Half of them are eligible to lead assessment teams.

But to meet the envisioned scale of the CMMC program — evaluating tens of thousands of defense contractors annually — Travis estimates there’s a need for between 2,000 and 3,000 assessors.

“That’s the most important part of the ecosystem that has to be grown. … That’s a long pole in the tent,” Travis said.
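A rough capacity sketch shows why estimates land in that range. Assume roughly 80,000 companies needing Level 2 third-party assessments and a three-year certification cycle (CMMC certifications are valid for three years); the throughput figure of about 10 assessments per assessor per year is an illustrative assumption, not a Cyber AB number:

```python
def assessors_needed(companies, cycle_years=3, assessments_per_assessor_year=10):
    """Back-of-envelope estimate of the assessor pool needed to sustain a
    recurring third-party assessment workload.

    companies: organizations requiring Level 2 third-party assessments
    cycle_years: reassessment interval (CMMC certifications last 3 years)
    assessments_per_assessor_year: assumed per-assessor throughput
    """
    assessments_per_year = companies / cycle_years
    return assessments_per_year / assessments_per_assessor_year

# ~80,000 companies on a 3-year cycle at ~10 assessments per assessor per year:
print(round(assessors_needed(80_000)))  # → 2667
```

Under those assumptions the math yields roughly 2,700 assessors — squarely inside the 2,000-to-3,000 range Travis cited, and more than four times the current pool of certified assessors.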

Initially, the challenge to building a pool of assessors was DoD’s drawn-out rulemaking process: There was no financial incentive to become an assessor with no CMMC requirements on the horizon.

But Travis said the challenge now is getting CMMC assessors through the process quickly enough as DoD phases in the requirements. The process of becoming an assessor involves training, exams and passing a Tier 3 DoD background investigation, which is equivalent to being investigated for a secret-level security clearance. Those investigations can often take months.

Travis said assessors don’t necessarily need to start with a technical background. He pitched it as a “great way for folks to get engaged in cybersecurity.”

“Whether it’s a full time job or a side hustle, these assessors are going to be in demand,” Travis said. “And so the compensation that goes with it, I think, is compelling. We are encouraging folks, if they haven’t considered entering into the CMMC program, think about becoming an assessor.”

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.

The post Risk & Compliance Exchange: Cyber AB’s Matt Travis on scaling the CMMC ecosystem first appeared on Federal News Network.


DOGE and its long-term counterpart remain, with a full slate of modernization projects underway

The Department of Government Efficiency, the driving force behind the Trump administration’s cuts to the federal workforce and executive branch spending, isn’t wrapping up operations sooner than expected, according to several administration officials.

Reuters published a story on Sunday claiming that DOGE no longer exists, about eight months ahead of the deadline set by President Donald Trump. The story drew strong reactions from Trump administration officials, who rejected claims that DOGE is ending before its final day on July 4, 2026.

A DOGE spokesperson told Federal News Network on Tuesday that DOGE and its longer-term, tech-aligned counterpart, the U.S. DOGE Service, both remain — and that the latter organization is moving forward with a full slate of modernization projects.

The spokesperson, in response to written questions, confirmed DOGE still exists as a temporary organization within the U.S. DOGE Service, and that Amy Gleason remains the acting administrator of USDS.

In addition, the spokesperson said the U.S. DOGE Service — a Trump-era rebranding of the U.S. Digital Service — is working on several cross-agency projects. The spokesperson said USDS is actively involved in these projects, but the agencies in charge of these projects oversee staffing and hiring. The list of projects shared with Federal News Network closely resembles the type of work that USDS was involved in before the Trump administration.

“The U.S. DOGE Service remains deeply engaged across government: modernizing critical systems, improving public services, and delivering fast, practical solutions where the country needs them most,” the spokesperson said.

Office of Personnel Management Director Scott Kupor wrote on X that “DOGE may not have centralized leadership under USDS,” but the “principles of DOGE remain alive and well.”

Those principles, he added, include deregulation; eliminating fraud, waste and abuse; and reshaping the federal workforce.

Kupor wrote that DOGE “catalyzed these changes,” and that OPM and the Office of Management and Budget “will institutionalize them.”

It’s not clear that DOGE leadership ever set exact demands for its representatives scattered across multiple federal agencies. Current and former DOGE representatives publicly stated that DOGE leadership played a hands-off role in their day-to-day work, and that they identified primarily as employees of their agencies. Former DOGE employees said they rarely heard from Elon Musk, DOGE’s former de facto leader, once they completed their onboarding to join the Trump administration.

DOGE wrote on X that “President Trump was given a mandate by the American people to modernize the federal government and reduce waste, fraud and abuse,” and that it terminated 78 contracts worth $335 million last week.

The DOGE spokesperson said the U.S. DOGE Service is working on a project to use AI to process over 600,000 pieces of federal correspondence each month, and is working with the General Services Administration to advance “responsible AI governmentwide.”

Current U.S. DOGE Service projects include:

  • Supporting 18 million students by modernizing the FAFSA system and implementing major student loan and Pell Grant changes.
  • Improving access to benefits with a streamlined, public-option verification tool that helps states accelerate community engagement requirements for Medicaid and SNAP approvals.
  • Transforming the non-immigrant visa process to support Olympic and World Cup travel with a more reliable, adaptable digital platform.
  • Reducing delays for over 600,000 veterans each month through a modernized VA disability compensation application.
  • Building a modern National Provider Directory to speed Medicare provider enrollment and enable nationwide interoperability.
  • Launching new patient-facing apps and data access tools, first announced at the White House and rolling out beginning January 2026.
  • Digitizing the National Firearms Act process, replacing outdated paper systems.
  • Using AI responsibly to process over 600,000 pieces of federal correspondence monthly.
  • Strengthening Medicare’s digital experience with better security, fraud reporting, caregiver access and reduced paper burden.
  • Improving VA appointment management with integrated scheduling, check-ins, notifications and after-visit support.
  • Advancing responsible AI government-wide through partnership with GSA.
  • Rapid-response deployments for Customs and Border Protection, FEMA, Medicare claims modernization, FDA data consolidation.

Gleason said in September that agencies don’t have enough tech talent to deliver on the administration’s policy goals, and they would need to boost hiring.

“We need to hire and empower great talent in government,” Gleason said on Sept. 4. “There’s not enough tech talent here. We need more of it.”

Under the Trump administration, federal employees have faced mass layoffs and incentives to leave government service. The Partnership for Public Service estimates that, as of October, more than 211,000 employees left the federal workforce this year — either voluntarily or involuntarily.

Gleason, who also serves as a strategic advisor for the Centers for Medicare and Medicaid Services, said tech hiring is essential to help CMS “build modern services for the American people.” She said the agency, at the beginning of this year, had about 13 engineers managing thousands of contractors.

“If we could hire great talent for tech in the government, I think in five years, we can really transform a lot of these systems to be much more modern and user-friendly, and easy for citizens to engage with what they need,” Gleason said. “But we have to take advantage of hiring.”

The post DOGE and its long-term counterpart remain, with a full slate of modernization projects underway first appeared on Federal News Network.

© AP Photo/Jose Luis Magana

FILE - Elon Musk flashes his T-shirt that reads "DOGE" to the media as he walks on South Lawn of the White House, in Washington, March 9, 2025. (AP Photo/Jose Luis Magana, File)

A new center aims to modernize federal lending at a scale few realize exists

 

Interview transcript:

 

Doug Criscitello Very excited to get underway at the Center for USA Lending. The idea has been building really in my mind, and on the part of others from this community, the federal lending community, for several decades really. The U.S. government runs more than 125 federal loan and loan guarantee programs, and that’s at agencies like the Federal Housing Administration, the Small Business Administration, the Department of Agriculture has a variety of loan programs, and various others. There’s about a dozen federal agencies that have loan programs. And today, the U.S. government has evolved to a point where it’s really the world’s largest financial institution. Its credit portfolio alone now totals about $5 trillion, a huge number. So given the relative complexity of making and servicing loans — and these instruments have terms that can last for decades — managing the government’s huge credit portfolio has always been a tremendous challenge. You know, particularly when you compare it with simply providing a one-time cash grant to an intended beneficiary, that’s pretty simple. You’re just cashing once. When we loan money, we’re entering into a long-term relationship with the borrower, technically, so the complexity is very significant.

Terry Gerton When you think about that massive portfolio, you’d said 125 different programs, 12 agencies, $5 trillion. Are there any specific programs that rise to the top of your visibility list in terms of desperately needing attention?

Doug Criscitello Let me answer that by talking about some of the good news, because huge strides have been made in recent decades. We’ve come a long way from the days when loan repayments were recorded on three-by-five index cards in pencil, right? So many of the systems that have been developed over the past few decades are huge advances relative to what we had prior to the sort of general use of computational power across the government. But notwithstanding those advancements, the systems that we have today are fragmented and outdated, and they don’t communicate with each other. So this creates a whole lot of administrative complexity and borrower confusion. It drives up costs at the end of the day and it makes it difficult to manage risk or detect fraud. And it generally frustrates borrowers. I think if you did a man-on-the-street interview, it wouldn’t be hard to find folks that have been frustrated in repaying a loan to the government.

Terry Gerton Well, your press release for the Center for USA Lending mentions modernization, technology, and integrity as core priorities. You just sort of glossed over them. But when I think about the financial industry, banking, and major corporations, they’re really at the front edge of technology, cybersecurity, identity management. How are you seeing the possibilities for bringing that kind of technology into how the government operates its loan portfolio?

Doug Criscitello Exactly right. So there are a lot of financial institutions that embrace modern technologies and are continuing to advance their use of cutting-edge tools. I think artificial intelligence is a terrific application here, right, to tailor the experience of borrowers, depending on their background, both in the application process and when it comes to servicing. Our hope is to really facilitate a dialogue, not only across the government, but to bridge the gaps that exist between technology, private financial institutions and what they’re doing, and the U.S. government credit apparatus. Right now, there are huge opportunities to have really seamless systems from the time a borrower applies for a loan till the day they make the final payment. One agency that I’ve worked at and around for much of my career, the Small Business Administration, has made some amazing strides since the COVID pandemic, when it was forced to disburse nearly $1 trillion in Paycheck Protection Program loans and economic injury disaster loans. They’re in the midst of just an incredible improvement in the borrower experience, the disaster loan program being a great example. And we want to encourage that type of improvement to occur at other agencies as well.

Terry Gerton I’m speaking with Doug Criscitello. He’s the new executive director at the Center for USA Lending. Doug, coordinated technology investment is a perennial problem for the federal government. But setting that aside, you just described a situation that calls out for centralized governance, that calls out for data standardization. Beyond tech investment, what are your policy priorities for the center?

Doug Criscitello You’ve touched on some of them, for sure. The notion of trying to at least have a coherent approach across agencies, where we have common data definitions and agreement in principle that having these end-to-end systems are the way forward here. We really need to automate workflows and integrate systems. I mean, that’s priority one, to ensure that can be done. So look, there’s a lot that the center can do. One thing we’re planning to do is to convene the community. Let’s get folks — we plan to have frequent gatherings of both folks in government, folks in industry — to come together to explore how best to move forward and to continually evolve. It’s not a one-time fix, you know. These systems can continually be strengthened. The government has shown no signs of reducing the size of its footprint here in the lending world. So, you know, we want to be a convener. We want to develop thought leadership. We want to pull together data from across the federal lending enterprise into a common shared platform to help all of the participants in this realm better understand how these programs are performing and what we might do differently going forward.

Terry Gerton You’ve laid out a pretty bold and expansive vision there. If you’re successful, five years from now, what looks different about federal lending?

Doug Criscitello The stakes are really high with a $5 trillion portfolio. I think if we’re successful, our work will help enhance taxpayer value, importantly, by reducing wasteful spending on duplicated systems. We hope to enhance program integrity, detect fraud faster, and streamline access to loans. Particularly when they’re needed most, right? There are times when the federal government — and the pandemic was a great example — times when funds need to be put on the street quickly, effectively and efficiently, while avoiding fraud. So our goal is really to make government lending more efficient. So whether you’re a borrower seeking faster service, a private lender who wants to have a harmonized relationship across all of their various federal loan guarantee programs in which they participate, or even just a taxpayer … importantly, a taxpayer who absolutely deserves efficient government operations. The center’s modernization efforts, I think, are poised to benefit you directly. So we’re really excited to get underway.

The post A new center aims to modernize federal lending at a scale few realize exists first appeared on Federal News Network.

© The Associated Press

FILE - Dallas Koehn plants milo in his field as wind turbines rise in the distance on May 19, 2020, near Cimarron, Kan. The federal government announced Tuesday, Oct. 18, 2022, a program that will provide $1.3 billion in debt relief for about 36,000 farmers who have fallen behind on loan payments or face foreclosure. (AP Photo/Charlie Riedel, File)

IRS tech chief directs staff to take ‘skills assessment’ ahead of IT reorganization

The IRS, ahead of an upcoming reorganization of its tech office, is putting its IT staff to the test.

The agency, in an email sent Monday, directed its IT workforce to complete a “technical skills assessment.”

IRS Chief Information Officer Kaschit Pandya told employees that the assessment is part of a broader effort to gauge the team’s technical proficiency, ahead of an “IRS IT organizational realignment.”

“Over time, hiring practices and role assignments have evolved, and we want to ensure our technical workforce is accurately aligned with the work ahead. The assessment will help establish a baseline understanding of our collective strengths and areas for development,” Pandya told staff in an email sent Monday.

Pandya’s office is leading the technical skills assessment, in coordination with the Treasury Department, the IRS human capital office and the Office of Personnel Management.

“I want to emphasize that this is a baseline assessment, not a performance rating. Your individual-level results will not affect your pay or grade,” he told staff. “I know this comes during a very busy and uncertain time, and I deeply appreciate your partnership.”

Pandya told staff that a “limited group” of IRS IT employees in technical roles — including developers, testers and artificial intelligence/machine learning engineers — have been invited to complete the test. He told staff that, as of Monday, about 100 employees were directed to complete the assessment.

On Friday, an IRS IT employee told Federal News Network that several hundred employees have now completed the assessment, and that it took employees about 90 minutes to complete it.

According to the employee, Pandya told staff in an all-hands meeting on Friday that one of the agency’s goals is to rely more on full-time IT employees, and less on outside contractors. He said during that meeting that the IRS currently has about 6,000 IT employees and about 4,500 contractors.

“It doesn’t make sense, considering all the RIFs, firings and decisions that ignored expertise,” the IRS IT employee said.

The IRS has lost more than 25% of its workforce so far this year, largely through voluntary separation incentives. Pandya told staff in an email this summer that the agency needs to “reset and reassess,” in part because more than 2,000 IT employees have separated from the IRS since January. The IRS had about 8,500 IT employees at the start of fiscal 2025.

The agency also sent mass layoff notices to its employees during the government shutdown, but has rescinded those notices as required by Congress in its spending deal that ended the shutdown.

The Treasury Department sent reduction-in-force notices to 1,377 employees during the recent government shutdown — as part of a broader RIF that targeted about 4,000 federal workers. Court documents show the IRS employees received the vast majority of those RIF notices, and that they disproportionately impacted human resources and IT personnel at the IRS.

The technical assessment is also in line with goals set by Treasury CIO and Department of Government Efficiency representative Sam Corcos, who recently said IRS IT layoffs were “painful,” but necessary for the agency’s upcoming tech reorganization.

In a recent podcast interview, Corcos said much of his time as Treasury CIO has been focused on projects at the IRS, and that the agency’s IT workforce doesn’t have the necessary skills to deliver on its long-term modernization goals.

“We’re in the process of recomposing the engineering org in the IRS, which is we have too many people within the engineering function who are not engineers,” he said. “The goal is, let’s find who our engineers are. Let’s move the people who are not into some other function, and then we’re going to bring in more engineers.”

Corcos estimated that there are about 100 to 200 IRS IT employees currently at the organization whom he trusts to carry out his reorganization plans.

“When you go in and you talk to people, a lot of the people, especially an engineer, the engineers on the team, they want to solve this problem. They don’t feel good about the fact that this thing has been ongoing for 35 years and will probably never get done. They actually want to solve these problems.”

IT employees at several agencies have gone through evaluations and assessments during the Trump administration. Tech employees at the General Services Administration were also interviewed and questioned about their skills and expertise by GSA and DOGE leadership. GSA later downsized its Technology Transformation Services office and shuttered its 18F tech shop.

In March, the IRS removed 50 of its IT leaders from their jobs and put them on paid administrative leave. Corcos defended that decision, saying the IRS “has had poor technical leadership for roughly 40 years.”

Corcos said those former IRS IT leaders pushed back on DOGE’s audit of government contracts. The agency, he added, spent an “astounding” amount on cybersecurity contracts, but former leaders resisted cutting and scaling back any of those contracts.

“The initial leadership team just said, ‘Everything is critical, you can’t cut anything. In fact, we need more,’” Corcos said. “And when we swapped them out for people who were more in the weeds, who knew what these things were, we found actually quite a lot that we could cut.”

The post IRS tech chief directs staff to take ‘skills assessment’ ahead of IT reorganization first appeared on Federal News Network.

© AP Photo/Patrick Semansky

Turning the government’s contact centers into engines of intelligence to power federal modernization efforts

Evan Davis views the federal government’s modernization efforts as a strategic opportunity to rebuild trust and achieve mission success through smarter, more human-centered service design.

The recent executive order on digital design makes the timing ideal, said Davis, executive managing director for federal growth at Maximus.

The key? Agencies need to use every citizen interaction as a data point to improve systems, predict needs and personalize service, he recommended during an interview for Federal News Network’s Forward-Thinking Government series.

Lean into the better design executive order

The Improving Our Nation Through Better Design EO is a natural extension of the push to improve experience and the relationship between the government and its constituents, Davis said. “That relationship needs to get built one encounter at a time.”

That matters because the public’s digital service expectations have expanded as commercial interactions have rapidly surpassed those offered in the public sector.

But right now, “there’s a recognition that these government experiences, when looked at carefully with new technology, can meet not only those new expectations but bring federal government encounters to a place where constituents feel appreciated and feel considered in those engagements,” Davis said.

Davis has spent more than two decades helping agencies connect with their constituents, much of that time partnering within federal contact centers. We asked him to share his perspective on the most effective strategies and tactics for advancing digital maturity across government.

Position contact centers as strategic intelligence hubs

For starters, it’s critical to reenvision government contact centers as far more than transactional endpoints. Davis argues that they are rich, underutilized sources of qualitative data that reveal citizen intent, frustration and unmet needs.

With artificial intelligence and analytics, agencies can mine center interactions to inform policy, improve service design and respond in real time, he said.

“I’m constantly amazed by the wealth of untapped data insights hidden within federal agency call centers,” Davis noted and added that center staff members also have a “real-time understanding of the incredible complexity of what it means to engage with the government.”

  • 3 tactics: To take advantage of contact center data, Davis suggests that agencies should:
    • Analyze call transcripts for patterns in citizen needs.
    • Use insights to refine FAQs, digital flows and policy language.
    • Feed findings into broader customer experience and service improvement efforts.
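The first tactic above can be sketched in a few lines. This is a hypothetical illustration only, not any agency's actual pipeline: real contact centers would use NLP or AI services rather than keyword matching, and the `TOPICS` map and `tally_topics` function here are invented for the example.

```python
from collections import Counter
import re

# Hypothetical topic map for the sketch; a real system would learn
# topics from the transcripts rather than hard-coding keywords.
TOPICS = {
    "password": "account access",
    "status": "case status",
    "form": "forms and paperwork",
}

def tally_topics(transcripts):
    """Count how often each topic surfaces across call transcripts."""
    counts = Counter()
    for text in transcripts:
        for keyword, topic in TOPICS.items():
            # whole-word, case-insensitive match on the keyword
            if re.search(r"\b" + keyword + r"\b", text.lower()):
                counts[topic] += 1
    return counts

calls = [
    "I forgot my password and cannot log in.",
    "What is the status of my claim?",
    "I need the status of form 1040.",
]
print(tally_topics(calls).most_common())
```

Ranked counts like these are the kind of raw signal that could feed the second and third tactics: recurring "case status" calls, for instance, might argue for a proactive status-notification channel.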

With this approach, agencies — for the first time — “can truly use data to influence policy, to influence an understanding of what’s important to citizens,” he said.

Build a digital-first, omnichannel foundation

Davis stressed that digital first doesn’t mean digital only. Agencies must unify systems and channels to guide citizens to the right help, whether that’s a chatbot, a human agent or a proactive SMS update. An omnichannel foundation will enable cost savings, faster service and trust-building through transparency, he said.

“Digital first is not digital separate. … How do I use that first point of contact to get people to the right place based on where they are at the moment?”

The goal, Davis explained, is to reduce the total amount of time and individual actions that citizens must take to address a need.

  • 3 tactics: He suggested that to establish that omnichannel foundation, agencies should:
    • Consolidate legacy contact center systems into a scalable, modular platform.
    • Standardize agent interfaces and data flows.
    • Enable proactive outreach across channels.

“It will also give agents, regardless of the exact content that they’re responsible for, the same user interface, the same pane of glass to look at every day,” Davis said. “It will also allow them to start pulling in that huge amount of data and doing something with it to inform what next steps they should take.”

Use AI to decode intent and predict needs

Understanding why a citizen contacts an agency is often more complex than a dropdown menu can capture. Davis explained that AI can uncover true intent, match it to policy requirements and guide citizens to resolutions faster. It can also help leaders spot emerging issues before they escalate.

“AI has already proven incredibly adept at understanding true intent of the citizen’s needs” at the micro level and gives agencies more options to quickly respond appropriately, he said. And at the macro level, “you can rely on AI to answer things like: What’s changed today? What do I need to know when I wake up this morning as the leader of citizen engagement?”

  • 3 tactics: To speed response times by integrating AI capabilities, Davis recommended that agencies:
    • Deploy AI-powered intelligent virtual assistants and agent-assist tools.
    • Use AI to analyze qualitative data and surface trends.
    • Train models using up-to-date knowledge management systems.

Long term, by integrating AI in these ways and moving to modernized data infrastructures, Davis expects agencies will achieve a state of ongoing transformation and be able to incrementally improve and scale services.

Why tackling service systems matters now

Davis tied these tactics directly to the urgency of the moment: aging systems, rising citizen expectations and the availability of transformative technologies. Agencies must act now, not just to modernize, but to deliver on their missions more effectively, he said.

The beauty of integrating contact center data sources and analyzing that data in real time, Davis pointed out, is that agencies can begin correlating the circumstances of an interaction that tend to drive up costs and that also tend to erode trust.

“We can begin looking at incredible positive change — to both provide cleaner, simpler, more cost-effective solutions but also to rebuild trust.”

Discover more ways to use technology to reimagine how your agency meets its mission in our Forward-Thinking Government series.

The post Turning the government’s contact centers into engines of intelligence to power federal modernization efforts first appeared on Federal News Network.

© Federal News Network


When the hotspots go dark, who connects the unconnected?

Interview transcript:

Sam Helmick The E-Rate Hotspot Lending Program is built on about three decades of the FCC’s E-Rate program, which has enabled libraries and schools to have discounts for broadband connectivity as we continue to develop 21st-century readers, learners and skills. And so traditionally that E-Rate funding could only be used for connections within libraries and school buildings. But then in 2024, the then-FCC chairwoman really launched this beautiful program called Learn Without Limits. And that expanded eligibility to Wi-Fi hotspot devices that libraries could circulate much like books, particularly to households without reliable or affordable broadband. And the American Library Association deeply supported this. And it was executed in more than 800 libraries across the nation; school and public libraries have utilized this service. It’s about $34 million worth of hotspot funding in 2025 to make meaningful connectivity change for Americans.

Eric White Okay, got it. So the FCC voted to virtually end the program back on September 30th. What happened there? What was their reasoning for giving that and does that truly mean the end of the program, or are there other avenues that the program could take to stay alive?

Sam Helmick You’re absolutely right. On September 30th of this year, the FCC voted 2-1 to rescind the hotspot lending program and the school bus Wi-Fi initiative. The majority argued that the E-Rate statute didn’t authorize funding for services used beyond library and school property. But the American Library Association, along with many of our partner organizations, disagrees with that interpretation and has really urged the FCC to reconsider and maintain the program. This decision reverses rules adopted in 2024 that had just begun to take effect, and we’re already sort of seeing the 2025 E-Rate cycle being denied. And we understand that a reader denied is literacy denied, and a connectivity divide is almost like participation in civic and educational life denied.

Eric White Yeah, particularly in those rural areas where you may not have a steady connection. You can still obviously access the internet in the library, but you know, when you’re in a teaching scenario and you don’t want to take up the computer for too long because then you start to feel guilty, right? So what other options do folks have who are out in those rural areas that relied on this program?

Sam Helmick If the federal government isn’t prepared to create a robust infrastructure for broadband for our national security, entrepreneurial and economic development, and pursuit of educational wellness and happiness, then I think that we have to think about those students that are on bus rides for up to like three hours a day, back and forth, trying to accomplish their homework. Or folks who are applying for jobs on Sundays because it’s the only day they have off, but the library isn’t supported or resourced enough to be open to them for their public access computers. Also, folks who are trying to attend telehealth appointments, access government services, or even connect with loved ones. Often I think folks forget that libraries are spaces where during both triumph and trials in a community, this is where folks need to go to access internet to tell the broader world and their loved ones that they’re safe and they’re fine. And so we’re really thinking about the broad spectrum of American life and how the lack of connectivity infrastructurally has been devastating. And this was an effort to mitigate that devastation. Now to lose this really leaves a lot of Americans in the lurch.

Eric White We’re speaking with Sam Helmick, president of the American Library Association. Let’s talk about federal support for public libraries in general. I’ve spoken to your organization in the past. There were some concerns about dwindling support, and obviously cuts have come across the board for a lot of federal programs, and I’m sure that libraries are not immune to that. Do I have that correct? And you know, where do things currently stand?

Sam Helmick Oh, you’re absolutely right. In 2024, the Institute of Museum and Library Services awarded $266.7 million through grant-making, research and policy development that particularly supported not only our state libraries across the nation, but then our small and rural libraries that rely on those matching state dollar funds to make sure that our tax dollars are working twice and three times over. So with the executive order seeking to dismantle that institute, as well as the lack of robust or comprehensive release of the congressionally mandated funds that fund that institute and support libraries around the country, and therefore communities around the country, libraries are experiencing resource scarcity at the federal and then the state and then at the local level. Because despite the fact that those federal dollars have been paid by the taxpayers, they’re not getting returned back. And then if you have contracts through those state consortiums or state libraries, those contracts didn’t end just because the congressionally mandated dollars were not provided to the states. And so this is creating an undue burden on state taxes and taxpayers, and then that trickles down to hurting rural communities that are the least-resourced, but probably the most in need, when it comes to their community anchor institutions, which are a public or a school or an academic library.

Eric White Yeah, I was going to say I’m in no way living in a rural area, but going to any of the libraries in my vicinity, they’re as crowded as ever. So it seems as if the need for resources is almost at an all-time high at a time when they may not have all the support they need.

Sam Helmick Increasingly you and I understand that having digital connection is going to allow us to not only thrive civically but economically, educationally, and then just socially. And so to bar that access to any American, particularly in a country that is so well-resourced and rich, feels counterintuitive to ensuring that we continue to be a nation that thrives 250 years into our story.

Eric White All right, so the situation is what it is. What steps are organizations like yourselves taking, and are there other options on the table, you know, nonprofits, things of that nature? Or is it really just going to come down to more states and more local governments are going to have to step in if they want to save these libraries?

Sam Helmick I think it’s holding anybody, regardless of where they sit on the aisle, accountable to understanding that more Americans visit libraries than they do baseball games, which is our national pastime. And that 70% of us are not interested in abridging or censoring information for any reason — not for economic reasons, not for ideological reasons. That’s a large spectrum of American life, through third-party surveys, that show us how much we value access to information. So how do we support those values? Well, first we recognize that we’re about to be 250 years old as a nation, and that this unique form of government had an essential mechanism called libraries, which is why a lot of our founders invested in them, because they wanted a robust constituency and society that was educated so that it could progress and have informed decisions when it came to civic life. And if we’re going to continue to value that, that means we need to use our libraries. We need to dust off our library cards and make sure that they’re active. Increasingly and regularly, as folks who want to get into the advocacy piece, it’s visiting ALA.org/advocacy to learn how you can write an email, invite your Congress member to come visit their local libraries in their areas of representation, join a city council, join a library board of trustees, join a school board so that your voice and fingerprints are part of the conversation. It’s writing to your legislators and reminding them that you wanted to robustly support your libraries, and so you’re asking them to write policy and create funding that will make that manifest. And then lastly, you can also visit ILoveLibraries.org, so that if you’re wanting to support the American Library Association and library practitioners that are doing this work, you can donate your store, you can donate funds to support moving this national value 250 years into the future.

Eric White You bring up the 250 years portion and that provides me a nice segue. Your organization is almost 150 years old. From a historical standpoint, have the nation’s libraries ever really gone through anything like this before? I’m just curious if you have any historical perspective on whether we’ve been here before, you know, through tumultuous times throughout American history.

Sam Helmick Great opportunity to tell a story. I love telling stories, Eric. In 1938, Des Moines Public Library director Forrest Spaulding wrote the Library Bill of Rights. And I think he did it for a few reasons. We had just gone through a Great Depression and recognized how instrumental our libraries were to supporting their communities during economic strife, but also lifting them up to build entrepreneurial and economic development. But it was also written between the world wars, recognizing that we were a melting pot. And sometimes the ideas and values of a very vibrant culture, they blend and harmonize, but sometimes they also brush and create friction. And so creating a set of values where it talks about the right to use reading rooms, the right to find books that both counter and support your own ideology, the right to assemble, the right to speak and to read were essential. And in 1939, the American Library Association adopted it as an affirmation that free people read freely. And so when I think about our history, I think libraries have been very good at growing at the pace of their societies, turning inwardly to think about how they can do the work better, and then relying on their communities to do the work best. And so while I would argue that we probably are seeing a difficult time, probably something that even counters McCarthyism in the United States, we have always turned in and relied on our communities and our values to push through. And so using your library, visiting ALA.org/advocacy, using your voice to speak to those that you’ve elected into power — this has always been the recipe. And if we all stay in character, I think we can continue to thrive.

The post When the hotspots go dark, who connects the unconnected? first appeared on Federal News Network.

© The Associated Press

St. Stephen Middle School student Lakaysha Governor works on her Chromebook on Monday, March 20, 2017, on a school bus recently outfitted with WiFi by tech giant Google, as College of Charleston professor RoxAnn Stalvey looks on in St. Stephen, S.C. Lakaysha is one of nearly 2,000 students in South Carolina's rural Berkeley County benefiting from a grant from Google, which on Monday unveiled one of its WiFi-equipped school buses in the area. (AP Photo/Meg Kinnard)

OPM’s HR modernization strategy sets next sight on USA Hire

While much attention across the federal community has been focused on the Office of Personnel Management’s strategy to consolidate 119 different human capital systems across government, the agency, at the same time and with little fanfare, kicked off another major human resources modernization effort.

OPM is planning to revamp the USA Hire platform, which provides candidate-assessment tools for agency hiring managers, with the goal of making evaluations more efficient and leading to higher-quality applicants.

OPM, working with the General Services Administration, issued a request for information on Oct. 7 and has been meeting with vendors over the last few weeks to determine what commercial technologies and systems are available. The RFI closed on Oct. 21.

“This RFI is part of OPM’s ongoing effort to ensure agencies have access to cutting-edge, high-quality assessment tools that help identify and hire the best talent across the federal government—advancing a truly merit-based hiring system in line with the president’s Merit Hiring Plan and Executive Order 14170, Reforming the Federal Hiring Process and Restoring Merit to Government Service,” said an OPM spokesperson in an email to Federal News Network. “OPM also anticipates making additional improvements to USAJOBS and USA Staffing to enhance the applicant experience and better integrate assessments into job announcements.”

OPM says in fiscal 2024, USA Hire customer agencies used the program to assess approximately 1 million applicants for over 20,000 job opportunity announcements. It provides off-the-shelf standard assessment tests covering more than 140 federal job series, access to test center locations worldwide and a broad array of assessment and IT expertise.

“USA Hire currently offers off-the-shelf assessment batteries covering over 800 individual job series/grade combinations, off-the-shelf assessment batteries covering skills and competencies shared across jobs (e.g., project management, writing, data skills, supervisory skills), and custom assessment batteries targeting the needs of individual agencies, access to test center locations worldwide, and a broad array of assessment and IT expertise,” OPM stated in the RFI.

In the RFI, OPM asked industry for details on the capabilities of their assessment systems, including:

  • Delivering assessments in a secure, unproctored asynchronous environment
  • Delivering online video-based interviews
  • Using artificial intelligence/machine learning in assessment development and scoring
  • Minimizing and/or mitigating applicant use of AI (e.g., AI chatbots) to improve assessment performance
  • Integrating and delivering assessments across multiple assessment platforms

“OPM seeks an assessment delivery system that can automatically score closed-end and open-ended responses, including writing samples. The online assessment platform shall be able to handle any mathematical formula for scoring purposes,” the RFI stated. “Based on the needs of USA Hire’s customers, OPM requires an assessment platform that supports static, multi-form, computer-adaptive (CAT), and linear-on-the-fly (LOFT) assessments delivered in un-proctored, in-person, and remote proctored settings.”
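For readers unfamiliar with the formats the RFI names, a computer-adaptive test (CAT) chooses each question based on how the applicant has performed so far, rather than serving a fixed form. The sketch below is a simplified, hypothetical illustration of that idea only; production CAT engines are built on item response theory, and `run_cat` and its parameters are invented for this example.

```python
# Hypothetical CAT sketch: pick the unanswered item whose difficulty is
# closest to the current ability estimate, then nudge the estimate up or
# down by a shrinking step depending on whether the answer was correct.
def run_cat(item_difficulties, answer_fn, n_items=5):
    """item_difficulties: dict of item id -> difficulty on an arbitrary scale.
    answer_fn: callable(item_id) -> bool, True if answered correctly."""
    ability, step = 0.0, 1.0
    remaining = dict(item_difficulties)
    for _ in range(min(n_items, len(remaining))):
        # Choose the item best matched to the current ability estimate.
        item = min(remaining, key=lambda i: abs(remaining[i] - ability))
        correct = answer_fn(item)
        ability += step if correct else -step
        step /= 2  # converge like a binary search over ability
        del remaining[item]
    return ability

# A candidate who answers everything correctly climbs the difficulty scale.
items = {"q1": 0.0, "q2": 1.0, "q3": 2.0}
print(run_cat(items, lambda item: True, n_items=3))
```

The appeal of adaptive delivery, which the RFI pairs with linear-on-the-fly (LOFT) assembly, is that each applicant sees fewer, better-targeted items while the platform still produces comparable scores.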

An industry executive familiar with USA Hire said OPM, through the RFI, seems to want to fix some long-standing challenges with the platform.

“RFI suggests OPM will allow third parties to integrate into USA Staffing, which has been a big problem for agencies who weren’t using USA Hire. But I’ll believe it when I see it,” said the executive, who requested anonymity in order to talk about a program they are involved with. “Agencies are not mandated to use USA Hire, but if they don’t use it, they can’t use USA Staffing because of a lack of integration.”

USA Staffing, like USA Hire, is run by OPM’s HR Solutions Office on a fee-for-service basis. The agency says it provides tools to help agencies recruit, evaluate, assess, certify, select and onboard more efficiently.

RFI is a good starting point

The executive said this lack of integration has, for some agencies, been a problem if they are using other assessment platforms.

For example, the Transportation Security Administration issued an RFI back in 2024 for an assessment capability, only to decide to use USA Hire after doing some market research.

“USA Hire is adequate for most things the government does. It’s fine for certain types of programs, but if you get out of their swim lanes, they have trouble, especially with customization or configurations. I think getting HR Solutions to do any configurations or customization is a yeoman’s effort,” the executive said. “My concern about USA Hire is it’s a monopoly, and when that happens any organization gets fat and lazy. Maybe the Department of Government Efficiency folks kicked them in the butt a little and that’s maybe why we are seeing the RFI.”

The executive said the RFI is a positive step forward.

“It could be good for some companies if it comes to fruition and OPM brings in a legitimate way for other providers with some unique competencies or services to expand the offering from USA Hire,” the executive said. “It’s too early to tell if there will be an RFP, but if one does come out, what are they buying? Are they trying to bring on new assessment providers? I think a lot of us would like to know what OPM is looking for or what holes they are seeking to fill in these new solutions.”

Other industry sources say OPM has laid out a tentative schedule for a new USA Hire support services solicitation. Sources say OPM is planning to release a draft request for proposals in January with a final solicitation out in October.

This means an award will not happen before 2027.

“Due to the complexity of requirements and the amount of market research that needs to be conducted, the USA Hire PMO expects the competition timeline to be more than a year long,” OPM said in a justification and approval increasing the ceiling of the current USA Hire contract. “The government estimates that transition could take up to two years depending on the awardee’s solution.”

OPM adds $182M to current contract

OPM released the J&A at the same time it issued the RFI. In it, OPM increased the ceiling of its current USA Hire support contract with PDRI, adding $182.7 million for a total contract value of $395 million.

OPM says it needed to increase the ceiling because of the Transportation Security Administration’s (TSA) adoption of USA Hire and the agency’s need to fill thousands of vacant positions after the COVID-19 pandemic.

“Because of the EO, the need for USA Hire assessments has far exceeded the initial estimated amount, which has grown at a pace far faster than anticipated when the contract requirements and needs were first drafted and awarded,” OPM stated in the J&A. “OPM planned for the steady growth of USA Hire throughout all options of the contract; however, TSA alone has consumed 95% of the requirement in option year 2 and option year 3. The government issued a modification to realign ceiling value to support the additional assessments; however, the delivery of the assessments has increased significantly.”

An email to PDRI seeking comment on the increased ceiling and the RFI was not returned.

The OPM spokesperson said the agency expects the use of USA Hire to continue to grow over the next few years as agencies implement skills-based assessments as required under the Merit Hiring Plan and Chance to Compete Act.

OPM said in its J&A that it expects USA Hire to provide assessment services to 300,000 applicants for TSA and 10,000 entry-level investigators for U.S. Immigration and Customs Enforcement, along with smaller customer agencies spanning cybersecurity positions, tax fraud investigations, entry-level credit union examiners and HR specialists.

The post OPM’s HR modernization strategy sets next sight on USA Hire first appeared on Federal News Network.

© Getty Images/iStockphoto/ArtemisDiana

From small business roots to mid-tier powerhouse, this firm is using employee ownership and AI to stay ahead in federal contracting

Interview transcript:

 

Travis Mack Over the years, you know, growing a small business is kind of an iterative process. You learn a lot of things along the way. And we had done very well in the small business vertical, but when we got to that point where we were trying to make that inflection, that turn to trying to be a large business, there were a couple things that we were considering. Were we going to remain a small business or were we just going to blow right through it? And we decided to kind of blow right through the small business threshold. And with that, we had to do a few things differently. We certainly had to upgrade our talent, which was really important, right? We had to also look at trying to drive additional revenue streams, trying to create additional value for the federal government. And so we decided on, not only were we going to grow organically, we were going to grow inorganically as well, which kind of led to our strategy of mergers and acquisitions and incorporating that into our organic growth.

Terry Gerton Well as you say, growing past that small business to large business zone can be really, really challenging. But you’ve kept Saalex as an employee-owned company. How did that decision factor into your growth strategy?

Travis Mack It factored in because as we were making the transition, we had to figure out how we were going to attract the best and the brightest. And it was actually one of our core strategic decisions on us trying to go and become a large business. It has been the kind of the pillar of us trying to grow. So us becoming and transitioning into an employee-owned organization was really something that I thought of and I said to myself, “if you were going to be asked to work 80, 90 hours a week, what would you want, Travis?” I said I’d probably want equity. And hence, you know, the employee-owned building blocks that we utilize today in order to attract the best and the brightest for Saalex.

Terry Gerton Is that a strategy that you think is sustainable as you continue to grow the company?

Travis Mack Absolutely. We’ve seen it demonstrated before. We think it’s an excellent strategy for us to continue to scale and for those who are willing to put in that work, put in that extra effort. We think it’s something that … because it’s not only the top of the spectrum that’s gaining, it’s the entire organization, because everyone at Saalex has equity and we want that community.

Terry Gerton So you mentioned a little bit about your growth through acquisition strategy. You’re clearly not trying to blend in, you really want to set yourself apart. How do you set yourself apart from the other big primes in the defense and federal space?

Travis Mack We think it’s part of certainly, you know, being an ESOP, having that equity component. We also think it’s from us being unique. We’ve really embraced automation, we’ve really embraced AI, we’ve really embraced security in order to give ourselves a differentiating feel to the organization. And so we think, at our size, being more agile than maybe some of the larger primes, being more efficient than maybe some of the larger primes, and really just trying to understand what the core problem is and then solving for that, we think that is a differentiating vertical for us, and we’ve leaned into that. So, you know, we’re an AI-first organization building in automation and AI through every single business system, every single component, and then that efficiency, that effectiveness really translates very seamlessly to the federal customer.

Terry Gerton So that strategy through mergers and acquisitions can really shake up company culture as you’re bringing in different organizations. How have you managed to build an organic Saalex culture and hold on to that through that growth cycle?

Travis Mack It’s a process. And you know, it takes time. It really does, especially now with all the new changes, with how you implement artificial intelligence efficiently, bringing in different organizations within one culture. We’ve launched an initiative called One Saalex, really just trying to focus everyone on — it’s one infrastructure, which is backed by an AI-first mindset, and bringing everybody in and just trying to demonstrate the efficiencies of the platform and how we are supporting our end customers. So we take it day by day. We try to talk about what the benefits are; and it’s a lot of training, Terry. It is truly a lot of training and a lot of — I kid you not, every single day, half of my battle is changing hearts and minds. And I’ve got to show up every day changing hearts and minds and showing the innovation and showing how, at the end of the day, it’s actually better.

Terry Gerton And you’re bringing in folks with some really amazing technical talent, clearance capability, high-tech roles. How are you finding the job market, and then how do you find the integration once you get them on board?

Travis Mack The job market right now is something that we focus a lot on, right? I mean, the lifeblood of what we do is with individuals, with people. And true enough, we’re trying to scale that with AI and things of that nature. But really it’s about us being out there in the community. It’s about us being active. It’s about us defining and identifying roles that, you know, we can fit individuals into very, very seamlessly. I think we’ve been certainly very forward-leaning with the mechanisms by which we hire. Traditional ways of hiring aren’t necessarily top of mind these days. So we try to be flexible, we try to be nimble, we try to be innovative, we try to do all those things that we think will entice individuals to come and work with Saalex.

Terry Gerton And one of those things, as you already mentioned, is being an AI-first company. So how do you deploy that kind of fast-moving technology, both in Saalex and then for your customers to keep them on the cutting edge?

Travis Mack Well, we’re not going out building large language models for the federal government. That’s not what we’re doing. We’re going to let them handle that. You know, ours had to be from a services perspective, right? And so we had to figure out, how do we engage and utilize AI from a services perspective? First we thought about, hey, okay, what does that look like? Our journey with AI actually started about two years ago, and we really started to focus on AI functionality within all of our business systems. We took that and then we put in the digital connectors with RPA, with robotic process automation — you’ve got to have that digital connection — and then at the end of the day, trying to deploy that from a federal perspective and integrating that with the customers and the uses and creating digital workforce agents and the whole nine yards. And so we’ve tried to be innovative. We think that utilizing AI gives us an agile advantage, you know, over some of the larger competitors that we have. We’re able to move a little bit quicker as a mid-market federal contractor, and so we’re excited about, what are those new use cases, what are those new concepts that we’re delivering? We’re thinking about the work differently, Terry, every single day, and that requires a total mind shift.

Terry Gerton Well speaking about thinking about the work differently, we’ve talked about your growth strategy, we’ve talked about your workforce culture and training, we’ve talked about your tech approach. But the world of federal contracting and defense contracting is changing very, very rapidly. So as you look forward, say five years, what do you see for Saalex, and how are you positioning them to take advantage of the opportunities you see?

Travis Mack I’m going to try to pull out my crystal ball here, put my Nostradamus hat on. Difficult question because of how fast things are changing. And what we’re trying to do is just be iterative. What we don’t want to be, Terry, is late. That is the thing. And we know we’re going to have some false starts. We know we’re not going to get it right every time as we implement automation and AI and efficiency throughout the organization. Government agencies right now want speed, they want agility, they want efficiency, they want security, the whole nine yards, as they are trying to change how they do the work as well. Five years out, we really think that it is about the iterative process, it is about changing how we do the work, it’s about identifying where we can drive efficiencies, and it’s about how we can, in my thoughts, do more with less, honestly. Because that’s where we’re headed. So we’re excited about building an infrastructure, building a capability that the federal government and government agencies can utilize with some of our technical services, right? We’re supplying software development, we’re doing test range management, a whole bunch of technical stuff with the Department of War. So we’re excited about, how do we deliver those services differently? And what does that look like? Because I think that’s what everyone is struggling with. What does that look like? We’re trying to help get some visibility, and we know it’s iterative. We know it’s going to innovate, we know it’s going to continue to expand, but we just didn’t want to be late.

The post From small business roots to mid-tier powerhouse, this firm is using employee ownership and AI to stay ahead in federal contracting first appeared on Federal News Network.
