
At VA, cyber dominance is in, cyber compliance is out

5 December 2025 at 15:25

The Department of Veterans Affairs is moving toward a more operational approach to cybersecurity.

In practice, that means focusing more on protecting attack surfaces and closing off the threat vectors that put veterans’ data at risk.

Eddie Pool, the acting principal assistant secretary for information and technology and acting principal deputy chief information officer at VA, said the agency is changing its cybersecurity posture to reflect a cyber dominance approach.


“That’s a move away from the traditional and an exclusively compliance based approach to cybersecurity, where we put a lot of our time resources investments in compliance based activities,” Pool said on Ask the CIO. “For example, did someone check the box on a form? Did someone file something in the right place? We’re really moving a lot of our focus over to the risk-based approach to security, pushing things like zero trust architecture, micro segmentation of our networks and really doing things that are more focused on the operational landscape. We are more focused on protecting those attack surfaces and closing off those threat vectors in the cyber space.”

A big part of this move to cyber dominance is applying the concepts that make up a zero trust architecture like micro segmentation and identity and access management.

Pool said as VA modernizes its underlying technology infrastructure, it will “bake in” these zero trust capabilities.

“Over the next several years, you’re going to see that naturally evolve in terms of where we are in the maturity model path. Our approach here is not necessarily to try to map to a model. It’s really to rationalize what are the highest value opportunities that those models bring, and then we prioritize on those activities first,” he said. “We’re not pursuing it in a linear fashion. We are taking parts and pieces and what makes the most sense for the biggest thing for our buck right now, that’s where we’re putting our energy and effort.”

One of those areas that VA is focused on is rationalizing the number of tools and technologies it’s using across the department. Pool said the goal is to get down to a specific set instead of having the “31 flavors” approach.

“We’re going to try to make it where you can have any flavor you want so long as it’s chocolate. We are trying to get that standardized across the department,” he said. “That gives us the opportunity from a sustainment perspective that we can focus the majority of our resources on those enterprise standardized capabilities. From a security perspective, it’s a far less threat landscape to have to worry about having 100 things versus having two or three things.”

The business process reengineering priority

Pool added that redundancy remains a key factor in the security and tool rationalization effort. He said VA will continue to have a diversity of products in its IT investment portfolios.

“Where we are at is we are looking at how do we build that future state architecture, as elegantly and simplistically as possible so that we can manage it more effectively, we can protect it more securely,” he said.

In addition to standardizing cyber tools and technologies, Pool said VA is bringing the same approach to business processes for enterprisewide services.

He said that over the years, VA has built up a laundry list of legacy technologies, all with different versions and requirements to maintain.

“We’ve done a lot over the years in the Office of Information and Technology to really standardize on our technology platforms. Now it’s time to leverage that, to really bring standard processes to the business,” he said. “What that does is that really does help us continue to put the veteran at the center of everything that we do, and it gives a very predictable, very repeatable process and expectation for veterans across the country, so that you don’t have different experiences based on where you live or where you’re getting your health care and from what part of the organization.”

Part of the standardization effort is that VA will expand its use of automation, particularly in the processing of veterans’ claims.

Pool said the goal is to take more advantage of the agency’s data and use artificial intelligence to accelerate claims processing.

“The richness of the data and the standardization of our data that we’re looking at and how we can eliminate as many steps in these processes as we can, where we have data to make decisions, or we can automate a lot of things that would completely eliminate what would be a paper process that is our focus,” Pool said. “We’re trying to streamline IT to the point that it’s as fast and as efficient, secure and accurate as possible from a VA processing perspective, and in turn, it’s going to bring a decision back to the veteran a lot faster, and a decision that’s ready to go on to the next step in the process.”

Many of these updates already are having an impact on VA’s business processes. The agency said that it set a new record for the number of disability and pension claims processed in a single year, more than 3 million. That beat its record set in 2024 by more than 500,000.

“We’re driving benefit outcomes. We’re driving technology outcomes. From my perspective, everything that we do here, every product, service capability that the department provides the veteran community, it’s all enabled through technology. So technology is the underpinning infrastructure, backbone to make all things happen, or where all things can fail,” Pool said. “First, on the internal side, it’s about making sure that those infrastructure components are modernized. Everything’s hardened. We have a reliable, highly available infrastructure to deliver those services. Then at the application level, at the actual point of delivery, IT is involved in every aspect of every challenge in the department, to again, bring the best technology experts to the table and look at how can we leverage the best technologies to simplify the business processes, whether that’s claims automation, getting veterans their mileage reimbursement earlier or by automating processes to increase the efficacy of the outcomes that we deliver, and just simplify how the veterans consume the services of VA. That’s the only reason why we exist here, is to be that enabling partner to the business to make these things happen.”

The post At VA, cyber dominance is in, cyber compliance is out first appeared on Federal News Network.


Outdated SEC communications rules are putting compliance and competitiveness at risk

4 December 2025 at 12:52

Interview transcript

Terry Gerton The Securities Industry and Financial Markets Association has recently written to the SEC asking to modernize its communication and record keeping rules. Help us understand what the big issue is here.

Robert Cruz Well, I think the fundamental issue that SIFMA is calling out is a mismatch between the technology that firms use today and the rules, which were written a long time ago — and in some cases, you know, the Securities and Exchange Act from 1940. So essentially we’re all struggling trying to find a way to fit the way that we interact today into rules that are very old, written when we were doing things with typewriters and, you know, over written communication. So it’s trying to minimize the gap between those two things, between the technology and what the rule requires firms to do.

Terry Gerton So instead of all of those hard copy letters that we get from investment firms and those sorts of things, we also get emails, text messages. That’s where the disconnect is happening?

Robert Cruz Yes. It’s the fact that individuals can collaborate and communicate with their customers over a variety of mechanisms. And some of these may be casual. They may be not related to business. And that’s the fundamental problem is that SIFMA is looking for the rules to be clarified so it pertains only to the things that matter to the firm, that create value or risk to their business or to the investor.

Terry Gerton And what would those kinds of communications look like?

Robert Cruz I think what they’ll look like is external communication. So, right now the rule doesn’t distinguish between internal — you and I as colleagues talking versus things that pertain to, you know, communications with the public or with a potential investor. So it’s trying to carve out those things that really do relate to the business’s products or services and exclude some of the things that may be more just conversational, as you and I might pass each other in the hallway, we can chat on a chat board someplace. It’s trying to remove those kind of just transitory communications from the record keeping obligations.

Terry Gerton Right. The letter even mentions things like emojis and messages like “I’m running late.”

Robert Cruz Exactly. And you know, it’s a fundamental problem that firms have is the fact that if you say you’re going to be able to use a tool, even if it’s as simple as email, that means that our firm has an obligation to capture it. And when it captures it, it captures everything, everything that is delivered through that communication channel. So that creates some of that problem of like, somebody left their lunch in the refrigerator. We need to clean it up. It’s trying to remove all of that noise from the things that really do matter to the business.
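
Cruz’s point about noise can be made concrete. The sketch below illustrates the kind of triage SIFMA is asking the rules to permit: retain messages that plausibly relate to the business, and treat chatter like “I’m running late” as transitory. The keyword list is invented for illustration and makes no claim to match any firm’s actual supervision logic, which would be far more sophisticated.

```python
# Hypothetical sketch of transitory-message triage. The keyword list is
# invented; real supervision systems use much richer classification.

BUSINESS_TERMS = {"trade", "order", "portfolio", "investor", "fund", "price"}

def is_business_record(message: str) -> bool:
    """Flag a message for retention if it plausibly relates to the business."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return bool(words & BUSINESS_TERMS)

messages = [
    "I'm running late, start without me",
    "Client wants to increase the order to 500 shares",
    "Someone left their lunch in the refrigerator",
]
# Only the second message touches the firm's business.
to_retain = [m for m in messages if is_business_record(m)]
```

Under a rule carved out along these lines, only the flagged messages would carry a record-keeping obligation; the rest would be treated as the conversational noise Cruz describes.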

Terry Gerton Not only does that kind of record keeping impose a cost on the organization, the reporting organization, but it also would create quite a burden on the regulators trying to sort out the meaningful communication in that electronic file cabinet, so to speak.

Robert Cruz Absolutely. Well, the firm clearly has the obligation to sift through all of this data to find the things that matter. If you have a regulatory inquiry, you’ve got to find everything that relates to it. Even if it’s, you know, I talked to an investor and there was an emoji in that conversation. I still need to account for that. So the burden is both on the firm as well as on the regulator to try to parse through these very large sets of data that are very, you know, heterogeneous with a lot of different activities that are captured in today’s tools.

Terry Gerton Relative to the question about the tools, you’ve said that SEC rules should be agnostic to technology. Unpack that for me. What exactly does that mean?

Robert Cruz Sure. This kind of goes back a few years where there was a revision to the rule 17A-4 from the SEC, which is the fundamental record keeping obligation. It says you need to have complete and accurate records. What they tried to do at that time was remove references to old technologies and spinning disks and things we used to do long ago. And so the objective was to be more independent of technology. Your obligation is your obligation. If it matters to the business, that’s the principle that should govern, not the particular tool that you use. So technology being agnostic — or rules being agnostic; technology means it doesn’t matter whether it’s delivered via email, via text, via emojis, carrier pigeons or anything else. If it matters to the business, it matters to the business.

Terry Gerton How do today’s variety of technologies complicate a business’ compliance requirements?

Robert Cruz The challenge is very complex, period. It’s always going to be with us because there’s always going to be a new way that your client wants to engage. There may be a new tool that you’re not familiar with that they want to interact on. Or you may get pull from your employees internally because they’re familiar with tools from their personal lives. So that encroachment of new tools, it doesn’t go away. It’s always been with us. And so it’s things that we have to anticipate. Again, be agnostic because there’s going to be something that comes right along behind it that potentially makes, you know, an explicit regulation irrelevant from the outset.

Terry Gerton I’m speaking with Robert Cruz. He’s the vice president for regulatory and information governance at Smarsh. All right, let’s follow along with that because you’ve got a proposal that includes a compliance safe harbor. So along with these compliance questions, what would that change for firms and how does it address the challenges of enforcement?

Robert Cruz Well, it’s an interesting concept because the rules today are meant to be principles-based. They’re not prescriptive. In other words, they don’t tell you, you must do the following. And that’s one of the challenges the industry has is that, what is good enough? What is the SEC specifically looking for? So this is like trying to give people a safe spot to which then you can say, well, SEC, if you really care about, you know, particular areas of these communications, they can tune their programs to do that. So it feels like it’s just giving some latitude so that we can define best practices. We can get a clearer sense of what the regulators are looking for. It’ll guide our governance processes by just having a clearer picture of where enforcement’s going to be focused.

Terry Gerton The regulatory process that would apply here is notoriously slow and complicated. What’s at stake for firms and investors if we don’t get this modernized?

Robert Cruz Well, I think you’re going to continue to see just a lot of individual practices that will vary. Some firms will interpret things differently and we’ll need to wait for enforcement to determine which is the best way. So, case in point, generative AI — if you’re using these technologies inside of the tools that you currently support, are these going to be considered issues for the SEC or not? We have to wait until we get some interpretation from the regulators to say, yes, we need to have stronger controls around this, or yes, we need to block these tools. You know, you need to make that adjustment based upon the way that the SEC responds to it.

Terry Gerton And what is your sense of how the SEC might respond to this?

Robert Cruz My gut tells me that just given where we are right now, you know, the SEC has a reduction in headcount it’s dealing with. It’s stating its mission very clearly and its focus is on crypto, is on capital formation, is on reducing regulatory burden. I just don’t know if this makes the list. So it clearly is being advocated strongly by SIFMA, but whether this makes page one of the SEC priorities list with the 20% reduction in headcount, it really seems like an outside chance that it gets onto their agenda.

Terry Gerton Could it inform some of the other regulatory issues that they’re addressing, such as crypto and capital formation?

Robert Cruz Absolutely. And that’s a great comment — the notion of using an unapproved communication tool, it didn’t go away. We may not see the big fines anymore, but I think the regulators are going to be saying if there’s an issue related to crypto, related to investor harm or what have you, if you’re using a tool that is not approved for use, you don’t have the artifact, you don’t have the historical record. They’re not going to view that you know favorably if you’re not able to defend your business. And so it’ll come up in context of other examinations that they’re carrying out. So maybe not a means to an end as it’s been for the last two years, but it will impact their ability to do their jobs ultimately.


Risk and Compliance 2025 Exchange: Diligent’s Jason Venner on moving beyond manual cyber compliance

The Pentagon is taking a major step forward in modernizing how it addresses cybersecurity risks.

Defense Department officials have emphasized the need to move beyond “legacy shortcomings” to deliver technology to warfighters more rapidly. In September, DoD announced a new cybersecurity risk management construct to address those challenges.

“The previous Risk Management Framework was overly reliant on static checklists and manual processes that failed to account for operational needs and cyber survivability requirements,” DoD wrote at the time. “These limitations left defense systems vulnerable to sophisticated adversaries and slowed the delivery of secure capabilities to the field.”

Weeding through legacy manual processes

The legacy of manual processes has built up over decades. Jason Venner, a solutions sales director at Diligent, said agencies have traditionally relied on people and paperwork to ensure compliance.

“It’s no one’s fault,” Venner said during Federal News Network’s Risk & Compliance Exchange 2025. “It just sort of evolved that way, and now it’s time to stop and reassess where we’re at. I think the administration is doing a pretty good job in looking at all the different regs that they’re promulgating and revising them.”

Venner said IT leaders are interested in ways to help streamline the governance, risk and compliance process while ensuring security.

“Software should help make my life easier,” he said. “If I’m a CIO or a CISO, it should help make my life easier, and not just for doing security scans or vulnerability scans, but actually doing IT governance, risk and compliance.”

Katie Arrington, who is performing the duties of the DoD chief information officer, has talked about the need to “blow up” the current RMF. The department moved to the framework in 2018 when it transitioned away from the DoD Information Assurance Certification and Accreditation Process (DIACAP).

“I remember when we were going from DIACAP to RMF, I wanted to pull my hair out,” Arrington said earlier this year. “It’s still paper. Who reads it? What we do is a program protection plan. We write it, we put it inside the program. We say, ‘This is what we’ll be looking to protect the program.’ We put it in a file, and we don’t look at it for three years. We have to get away from paperwork. We have to get away from the way we’ve done business to the way we need to do business, and it’s going to be painful, and there are going to be a lot of things that we do, and mistakes will be made. I really hope that industry doesn’t do what industry tends to do, [which] is want to sue the federal government instead of working with us to fix the problems. I would really love that.”

Arrington launched the Software Fast Track initiative to once again tackle the challenge of quickly adopting secure software.

Evolving risk management through better automation, analytics

DoD’s new risk management construct includes a five-phase lifecycle and a set of core principles, including automation, continuous monitoring and DevSecOps.

Arrington talked about the future vision for cyber risk management within DoD earlier this year.

“I’m going to ask you, if you’re a software provider, to provide me your software bill of materials in both your sandbox and production, along with a third-party SBOM. You’re going to populate those artifacts into our Enterprise Mission Assurance Support Service,” she said. “I will have AI tools on the back end to review the data instead of waiting for a human and if all of it passes the right requirements, provisional authority to operate.”
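
The automated review Arrington describes could look roughly like the sketch below. The SBOM layout here is a simplified CycloneDX-style structure and the deny list is invented; the actual eMASS checks are not public in this form, so this is only an illustration of the idea of machine-reviewing SBOM artifacts instead of waiting for a human.

```python
import json

# Hypothetical sketch of automated SBOM review. The SBOM uses a
# simplified CycloneDX-style layout and the deny list of known-bad
# component versions is invented for illustration.

DENY_LIST = {("log4j-core", "2.14.1")}  # example of a known-vulnerable version

def review_sbom(sbom_json: str) -> list[str]:
    """Return the names of components that fail the automated check."""
    sbom = json.loads(sbom_json)
    return [
        c["name"]
        for c in sbom.get("components", [])
        if (c["name"], c["version"]) in DENY_LIST
    ]

sample = json.dumps({"components": [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "jackson-databind", "version": "2.17.0"},
]})
failures = review_sbom(sample)  # flags log4j-core
```

In the vision Arrington lays out, a clean pass through checks of this kind could yield a provisional authority to operate without a human reviewer in the critical path.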

Venner said the use of automation and AI rests on a foundation of data analytics. He argued the successful use of AI for risk management will require purpose-built models.

“Can you identify, suggest, benchmark things for me and then identify controls to mitigate these risks, and then let me know what data I need to monitor to ensure those controls are working. That’s where AI can really accelerate the conversation,” Venner said.

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.


Gen AI adoption is reshaping roles and raising tough questions about workforce strategy

3 December 2025 at 16:45

Interview transcript:

Terry Gerton I know you have studied how workers of different skill levels choose to use generative AI and the concept of AI exposure. Can you talk to us a little bit about what you’re finding there? Are there certain roles more likely to embrace AI, or certain roles that are more likely to be replaced?

Ramayya Krishnan AI exposure, to understand that, I think we have to think about how occupations are structured. So the Bureau of Labor Statistics has something, a taxonomy called O*NET. And O*NET describes all the occupations in the U.S. economy, there are 873 or so. And each of those occupations is viewed as consisting of tasks and tasks requiring certain sets of skills. AI exposure is a measure of how many of those tasks are potentially doable by AI. And thereby that becomes, then, a measure of ways in which AI could have an impact on people who are in that particular occupation. So, however, AI exposure should not be assumed to mean that that’s tantamount to AI substitution, because I think we should be thinking about how AI is deployed. And so there are capabilities that AI has. For instance, this conversation that we’re having could be automatically transcribed by AI. This conversation we are having could be automatically translated from English to Spanish by AI, for instance. Those are capabilities, right? So when you take capabilities and actually deploy them in organizational contexts, the question of how it’s deployed will determine whether AI is going to augment the human worker, or is it going to automate and replace a particular task that a human worker does? Remember, this happens at the task level, not at the occupation level. So some tasks within an occupation may get modified or adapted. So if you look at how software developers today use co-pilots to build software, that’s augmentation, where it’s been demonstrated that software developers with lower skills usually get between 20% to 25% productivity improvement. Call center employees, again, a similar type of augmentation is happening. In other cases, you could imagine, for instance, if you were my physician and I was speaking to you, today we have things called ambient AIs that will automatically transcribe the conversation that I’m having with you, the physician.
That’s an example of an AI that could potentially substitute for a human transcriber. So I gave you two examples: software developer and customer service where you’re seeing augmentation; the transcription task, I’m giving you an example of substitution. So depending on how AI is deployed, you might have some tasks being augmented, some being substituted. When you take a step back, you have to take AI exposure as a measure of capability and then ask the question, how does that then get deployed? Which then has impact on how workers are going to actually have to think about, what does this then mean for them? And if it’s complementing, how do they become fluent in AI and be able to use AI well? And if there’s a particular task where it’s being used in a substitutive manner, what does that then mean longer term for them, in terms of having to acquire new skills to maybe transition to other occupations where there might be even more demand? So I think we have to unpack what AI exposure then means for workers by thinking about augmentation versus automation.
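
Krishnan’s definition of AI exposure is simple enough to express directly: treat an occupation as a set of tasks, flag which tasks AI could plausibly perform, and take the share. The task list and flags below are invented for illustration and are not drawn from the actual O*NET data.

```python
# Hypothetical illustration of an AI-exposure score in the O*NET spirit:
# an occupation is a set of tasks, and exposure is the share of tasks
# that AI could plausibly perform. Task names and flags are invented.

def ai_exposure(tasks: dict[str, bool]) -> float:
    """Fraction of an occupation's tasks judged doable by AI."""
    if not tasks:
        return 0.0
    return sum(tasks.values()) / len(tasks)

software_developer = {
    "write boilerplate code": True,    # augmentable by co-pilots
    "debug production outage": False,
    "draft unit tests": True,
    "negotiate requirements": False,
}

print(f"exposure: {ai_exposure(software_developer):.2f}")  # exposure: 0.50
```

As Krishnan stresses, a score like this measures capability, not outcome: a 0.50 exposure says nothing by itself about whether those tasks end up augmented or automated.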

Terry Gerton There’s a lot of nuance in that. And your writings also make the point that Gen AI adoption narrows when the cost of failure is high. So how do organizations think both about augmentation versus replacement and the risk of failure as they deploy AI?

Ramayya Krishnan If you take the example of using AI in an automated fashion, its error rate has to be so low because you don’t have human oversight. And therefore, if the error rates are not sufficiently appropriate, then you need to pair the human with the AI. In some cases you might say the AI is just not ready. So we’re not going to use the AI at all. We’ll just keep human as is. In other cases, if AI can be used with the human, where there is benefits to productivity but the error rates are such you still need the human to ensure and sign off, either because the error rates are high or from an ethical standpoint or from a governance standpoint, you need the human in the loop to sign off, you’re going to see complementing the human with the AI. And then there are going to be tasks for which the AI quality is so high, that its error rates are so low, that you could actually deploy it. So when we talk about the cost of failure, you want to think about consequential tasks where failure is not an option. And so either the error rates have to be really low, and therefore I can deploy the AI in an automated fashion, or you have to ensure there is a human in the loop. And this is why I think AI measurement and evaluation prior to deployment is so essential because things like error rates, costs, all of these have to be measured and inform the decisions to deploy AI and deploy AI in what fashion? Is it in augmentation fashion or not, or is it going to be used independently?
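
The decision Krishnan outlines can be sketched as a simple function of the measured error rate and how consequential the task is. The thresholds below are invented; a real program would set them per task, based on pre-deployment measurement and its own risk appetite.

```python
# Minimal sketch of the deployment decision Krishnan describes, assuming
# error rates measured in pre-deployment evaluation. Thresholds are
# invented for illustration.

def deployment_mode(error_rate: float, consequential: bool,
                    automate_threshold: float = 0.01,
                    augment_threshold: float = 0.10) -> str:
    if consequential and error_rate > automate_threshold:
        return "human in the loop"   # failure is not an option
    if error_rate <= automate_threshold:
        return "automate"            # error rate low enough to deploy alone
    if error_rate <= augment_threshold:
        return "augment"             # AI assists, human signs off
    return "human only"              # AI not ready for this task
```

For example, a 5% error rate on a consequential task (a 911 call) yields "human in the loop", while the same error rate on a routine internal task yields "augment".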

Terry Gerton I’m speaking with Dr. Ramayya Krishnan. He’s the director of the Center for AI Measurement Science and Engineering at Carnegie Mellon University. So we’re talking there about how AI gets deployed in different organizations. How do you see this applying in the public sector? Are there certain kinds of government work where AI is more suitable for augmentation versus automation and that error rate then becomes a really important consideration?

Ramayya Krishnan I think there are going to be a number of opportunities for AI to be deployed. So you remember we talked about call centers and customer service types of centers. I mean, public sector, one aspect of what they do is they engage with citizens in a variety of ways, where they have to deliver and provide good information. Some of those are time sensitive and very consequential, like 911 emergency calls. Now, there you absolutely want the human in the loop because we want to make sure that those are dealt with in a way that we believe we need humans in the loop, which could be augmented by AI, but you know, you want humans in the loop. On the other hand, you could imagine questions about, you know, what kind of permit or what kind of form, you know, administrative kinds of questions, where there’s triage, if you will, of having better response time to those kinds of questions. The alternative to calling and speaking to somebody might be just like you could go to a website and look it up. Imagine a question-answering system that actually allows for you to ask and get these questions answered. I expect that, and in fact you’re already seeing this in local government and in state government, the deployment of these kinds of administrative kinds of question-answering systems. I’d say that’s one example. Within the organizations, there is the use of AI, not customer-facing or citizen-facing, but within the organizations, the use of these kinds of co-pilots that are being used within the organization to try and improve productivity. I think as AI gets more robust and more reliable, I expect that you will see greater use of AI in both trying to improve efficiency and effectiveness, but to do so in a responsible way, in such a way that you take into account the importance of providing service to citizens of all different abilities. 
One of the important things with the public sector is … maybe there’s multilingual support that is needed, you might need to help citizens who are disabled. How might we support different kinds of citizens with different ability levels? I think these are things where AI could potentially play an important role.

Terry Gerton AI is certainly already having a disruptive impact on the American workforce, particularly. What recommendations do you have for policymakers and employers to mitigate the disruption and think long-term about upskilling and reskilling so that folks can be successful in this new space?

Ramayya Krishnan I think this is actually one of the most important questions that we need to address. And you know, I served on the National AI Advisory Committee to the President and the White House Office of AI Initiatives, and this was very much a key question that was addressed by colleagues. And I think a recent op-ed that we have written with Patrick Harker at the University of Pennsylvania and Mark Hagerott at the University of South Dakota, really we make the case that this is an inflection point which requires a response pretty much on the scale of what President Lincoln did in 1862 with the Morrill Act in establishing land grant universities. Much like land grant universities were designed to democratize access to agricultural technology, really it enabled Americans from everywhere in the nation to harness this technology for economic prosperity both for themselves and for the nation. I think if you’re going to see AI be deployed and not have the kind of inequality that might arise from people having access to the technology and not having access to the technology, we need something like this. And we call this the Digital Land Grant Initiative that would connect our universities, the community colleges, with various ways of providing citizens, both in rural areas and urban areas, everywhere in the country, access to AI education and skilling appropriate to their context. So if I’m a farmer, how can I do precision agriculture? If I’m a mine worker, or if I’m somebody who wants to work in banking — from the whole range of occupations and professions, you could imagine AI having a transformative effect on these different occupations. And there may be new occupations that are going to emerge that you and I are not thinking about right now. So, how do we best position our citizens so that they can equip themselves with the right sets of skills that are going to be required and demanded?
I think that’s the big public policy question with regard to workforce upskilling and reskilling.


House lawmakers to try again to extend TMF through NDAA

2 December 2025 at 17:30

The Technology Modernization Fund is running out of time. In 10 days, the authorization for the 8-year-old governmentwide account that helps agencies update IT systems will expire.

If Congress doesn’t act before Dec. 12, the TMF will not be able to make any new investments, freezing more than $150 million.

“The Technology Modernization Fund remains one of the federal government’s most effective tools for rapidly strengthening cybersecurity and improving high-impact systems. Reauthorizing the TMF is essential to ensuring stable, flexible funding that helps agencies deliver secure, modern services for the American people,” said a GSA spokesperson in an email to Federal News Network. “We look forward to working with Congress on the reauthorization effort.”

There is support in the House for reauthorizing the TMF. Rep. Nancy Mace (R-S.C.) and former Congressman Gerry Connolly (D-Va.) introduced the Modernizing Government Technology (MGT) Reform Act in April that included an extension of the fund to Dec. 31, 2031.

The bill hasn’t moved out of the House Oversight and Government Reform Committee and there is no Senate companion.

The House did pass a version of this bill in May 2024, but, again, the Senate never moved on the bill.

The Senate, however, did allocate $5 million for the TMF in its version of the fiscal 2026 Financial Services and General Government appropriations bill, released last week. This comes after Congress zeroed out new funding for the program over the last three years. The House version of the FSGG bill didn’t include any new money for the TMF.

Mace tried to include her TMF bill as a provision in the House’s version of the National Defense Authorization bill, but language didn’t make it in the version passed by the lower chamber. The Senate version of the NDAA also didn’t include the TMF extension, but there is still hope to get it in during the upcoming conference committee negotiations.

“Extending and reauthorizing the Technology Modernization Fund, which expires on Dec. 12, is a high priority for the committee and we have requested in a bipartisan manner that it be included in the final Fiscal Year 2026 National Defense Authorization Act,” said an Oversight and Government Reform Committee spokesperson. “This is a shared policy priority with the administration and the Office of Management and Budget. Extending the fund also has broad industry support, specifically the Committee has support letters from the Information Technology Industry Council (ITI), the Center for Procurement Advocacy (CPA), the Professional Services Council (PSC) and the Alliance for Digital Innovation (ADI).”

TMF: 69 investments, $1 billion

ADI wrote lawmakers a letter on Nov. 24 advocating for the TMF extension.

“To date, the TMF has catalyzed transformation across government, from strengthening cybersecurity defenses to improving citizen-facing digital services. By providing flexible capital through a merit-based process overseen by federal technology leaders, the Fund enables agencies to undertake complex modernization initiatives that would otherwise remain trapped in multi-year budget cycles. This structure ensures accountability while giving agencies the agility to respond to rapidly evolving technology landscapes and emerging threats,” the industry association said in its letter to House and Senate leadership. “The MGT Reform Act provides the right framework for the TMF’s next chapter. By extending authorization for seven years, Congress would provide agencies the long-term certainty needed to plan and execute substantial and transformational modernization programs. The legislation’s transparency provisions, including the establishment of a federal legacy IT inventory, will give policymakers greater visibility into modernization progress and priorities. These reforms strengthen oversight while preserving the operational flexibility that makes the TMF effective.”

GSA says in its fiscal 2026 budget justification that the TMF currently manages more than $1.07 billion worth of systems upgrades and modernization projects totaling 69 investments across 34 federal agencies. The TMF board has received and reviewed more than 290 proposals totaling about $4.5 billion in funding demand.

The TMF board made only one new investment in calendar year 2025. It awarded $14.6 million to the Federal Trade Commission in June to develop a cloud-based analytics platform that uses artificial intelligence tools and to train staff to handle data analysis in-house.

GSA says it had more than $231 million in available funding for 2025 and it expected to have more than $158 million for the TMF in 2026.

“The government needs updated technology, and those updates need to be done efficiently. I’m proud to co-sponsor the bipartisan Modernizing Government Technology Reform Act introduced by Cybersecurity Subcommittee Chairwoman Mace,” said Rep. Shontel Brown (D-Ohio), ranking member of the Cybersecurity, IT and Government Innovation subcommittee, in an email to Federal News Network. “The best course of action would be the Oversight Committee and Congress advancing this legislation before the authorization ends.”

Technical debt would increase faster

Former federal technology executives say letting the TMF expire would set back agency modernization efforts.

Larry Bafundo, the former executive director of the TMF program office, said without the TMF, agencies will have a more difficult time finding funding to modernize legacy systems.

“We spend a vast majority of our funding on maintaining existing and outdated systems instead of adapting systems to meet changing needs. I think something is broken in the way we fund modernization of IT systems. Congress is incentivized to think in terms of projects instead of services that evolve over time. There is a huge disconnect between how the government works and how IT projects are funded,” said Bafundo, who is now president of Mo Studio, a digital services company. “There isn’t a clear, governmentwide IT modernization strategy, with a clear inventory of systems, to align programs like TMF against. As a result, we approach the problem piecemeal, rather than as part of a deliberate, or coordinated, plan. Similarly, agencies can sometimes lack incentives to modernize effectively. In many cases, they not only lack performance baselines to measure change against, but there are also very few senior executives in government today who are evaluated based on the value of the services they provide the public. Instead, they are incentivized to preserve the status quo. All of this makes showing ‘return on investment’ difficult, along with the fact that Congress is not united in its understanding of what the return on investment looks like — is it cheaper, more secure, faster, etc.? We don’t have a common definition for success when it comes to programs like TMF.”

Bafundo said the TMF works because it provides agencies with guardrails or characteristics for the types of projects the board would invest in.

“We relied on good ideas or good proposals and someone who could defend their ideas, as opposed to a set of focal areas and show us what you can do with seed funding. You can use that experience to unlock further funding,” he said. “That is how it should work instead of a 3-to-5 year plan that many programs have. In some ways, the TMF, because it relies on lengthy proposals instead of working software, is more like a grant program than a seed fund.”

Gundeep Ahluwalia, a former Labor Department chief information officer, helped the agency win TMF funding for six different projects between 2018 and 2024.

Ahluwalia, who is now an executive vice president and chief innovation officer for NuAxis Innovations, said the TMF helped Labor pay down its technical debt.

“Whether it’s improving services to Americans or protecting against foreign adversaries, the cost of not doing anything here is just too large, especially considering the investment is paltry,” he said. “The TMF used an approach very similar to the private sector where you would make your business case, tell the board how much the company would get back from the investment. This business case is a no-brainer. For $500 million or even $250 million, it could give agencies the opportunity to improve services, reduce risks and become cyber strong.”

OMB seeks change to TMF

It’s unclear why support on Capitol Hill has been tepid at best for the TMF.

Ahluwalia said lawmakers still have trouble understanding why something like the TMF is needed, and the fund no longer has an outspoken supporter of IT modernization funding like Connolly, who passed away in May.

“If you don’t understand something and there is a significant resistance to spending, this becomes yet another government program. But this isn’t just another one, the TMF is a way out of our technical debt conundrums. It’s modeled after the private sector and I don’t think people understand that,” he said.

OMB, which didn’t respond to two requests for comments on the TMF expiring, proposed through GSA’s 2026 budget request a new funding model for the program. The White House wants to make it a revolving or working capital fund of sorts that would be authorized to collect up to $100 million a year in otherwise expired funding.

The legislative proposal would let “GSA, with the approval of OMB, to collect funding from other agencies and bring that funding into the TMF,” GSA wrote in its budget justification document. “This would allow agencies to transfer resources to the TMF using funds that are otherwise no longer available to them for obligation. This provision is essential to providing the TMF with the necessary funds to help the federal government address critical technology challenges by modernizing high-priority systems, improving AI adoption and supporting cross-government collaboration and scalable services.”

If the TMF authority expires, GSA would still be able to support existing investments with already approved funding and other program support services.

The post House lawmakers to try again to extend TMF through NDAA first appeared on Federal News Network.

© Federal News Network


How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress

1 December 2025 at 14:57

Federal agencies face an ever-evolving threat landscape, with cyberattacks escalating in both frequency and sophistication. To keep pace, advancing digital modernization isn’t just an aspiration; it’s a necessity. Central to this effort is the Trusted Internet Connections (TIC) 3.0 initiative, which offers agencies a transformative approach to secure and modernize their IT infrastructure.

TIC 3.0 empowers agencies with the flexibility to securely access applications, data and the internet, providing them with the tools they need to enhance their cyber posture and meet the evolving security guidance from the Office of Management and Budget and the Cybersecurity and Infrastructure Security Agency. Yet, despite these advantages, many agencies are still operating under the outdated TIC 2.0 model, which creates persistent security gaps, slows user experience, and drives higher operating costs, ultimately hindering progress toward today’s modernization and adaptive security goals.

Why agencies must move beyond TIC 2.0

TIC 2.0, introduced over a decade ago, aimed to consolidate federal agencies’ internet connections through a limited number of TIC access points. These access points were equipped with legacy, inflexible and costly perimeter defenses, including firewalls, web proxies, traffic inspection tools and intrusion detection systems, designed to keep threats out. While effective for their time, these static controls weren’t designed for today’s cloud-first, mobile workforce. Often referred to as a “castle and moat” architecture, this perimeter-based security model was effective when TIC 2.0 first came out, but is now outdated and insufficient against today’s dynamic threat landscape.

Recognizing these limitations, OMB introduced TIC 3.0 in 2019 to better support the cybersecurity needs of a mobile, cloud-connected workforce. TIC 3.0 facilitates agencies’ transition from traditional perimeter-based solutions, such as Managed Trusted Internet Protocol Service (MTIPS) and legacy VPNs, to modern Secure Access Service Edge (SASE) and Security Service Edge (SSE) frameworks. This new model brings security closer to the user and the data, improving performance, scalability and visibility across hybrid environments.

The inefficiencies of TIC 2.0

In addition to the inefficiencies of a “castle and moat” architecture, TIC 2.0 presents significant trade-offs for agencies operating in hybrid and multi-cloud environments:

  • Latency on end users: TIC 2.0 moves data to where the security is located, rather than positioning security closer to where the data resides. This slows performance, hampers visibility, and frustrates end users.
  • Legacy systems challenges: Outdated hardware and rigid network paths prevent IT teams from managing access dynamically. While modern technologies deliver richer visibility and stronger data protection, legacy architectures hold agencies back from adopting them at scale.
  • Outages and disruptions: Past TIC iterations often struggle to integrate cloud services with modern security tools. This can create bottlenecks and downtime that disrupt operations and delay modernization efforts.

TIC 3.0 was designed specifically to overcome these challenges, offering a more flexible, distributed framework that aligns with modern security and mission requirements.

“TIC tax” on agencies — and users

TIC 2.0 also results in higher operational and performance costs. Since TIC 2.0 relies on traditional perimeter-based solutions — such as legacy VPNs, expensive private circuits and inflexible, vulnerable firewall stacks — agencies often face additional investments to maintain these outdated systems, a burden commonly referred to as the “TIC Tax.”

But the TIC Tax isn’t just financial. It also shows up in hidden costs to the end user. Under TIC 2.0, network traffic must be routed through a small number of approved TIC access points, most of which are concentrated around Washington, D.C. As a result, a user on the West Coast or at an embassy overseas may find their traffic backhauled thousands of miles before reaching its destination.

In an era where modern applications are measured in milliseconds, those delays translate into lost productivity, degraded user experience, and architectural inefficiency. What many users don’t realize is that a single web session isn’t just one exchange; it’s often thousands of tiny connections constantly flowing between the user’s device and the application server. Each of those interactions takes time, and when traffic must travel back and forth across the country — or around the world — the cumulative delay becomes a real, felt cost for the end user.
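The cumulative effect of those detours can be sketched with simple arithmetic. The sketch below uses hypothetical round-trip times and request counts (none are measurements from any agency network) to show how even a modest per-request backhaul penalty compounds across the thousands of connections in a single session:

```python
# Illustrative estimate of how backhaul latency compounds across a web session.
# All numbers below are assumed values for demonstration, not measurements.

def session_delay_ms(requests: int, rtt_direct_ms: float, rtt_backhauled_ms: float) -> float:
    """Extra cumulative delay when every request detours through a distant
    TIC access point instead of going direct-to-cloud. Treats requests as
    serialized, so this is an upper bound on the felt delay."""
    return requests * (rtt_backhauled_ms - rtt_direct_ms)

# Assume a session triggers 300 requests, direct RTT of 20 ms,
# and a coast-to-coast backhauled RTT of 90 ms.
extra = session_delay_ms(requests=300, rtt_direct_ms=20.0, rtt_backhauled_ms=90.0)
print(f"Extra delay per session: {extra / 1000:.1f} seconds")
```

In practice browsers pipeline many requests in parallel, so the user-felt delay is smaller than this serialized total, but every round trip that must cross the country still pays the same per-trip penalty.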

Every detour adds friction, not only for users trying to access applications, but also for security teams struggling to manage complex routing paths that no longer align with how distributed work and cloud-based systems operate. That’s why OMB, CISA and the General Services Administration have worked together under TIC 3.0 to modernize connectivity, eliminating the need for backhauling and enabling secure, direct-to-cloud options that prioritize both performance and protection.

For example, agencies adopting TIC 3.0 can leverage broadband internet services (BIS), a lower-cost, more flexible transport option that connects users directly to agency networks and cloud services through software-defined wide area network (SD-WAN) and SASE solutions.

With BIS, agencies are no longer constrained to rely on costly, fixed point-to-point or MPLS circuits to connect branch offices, data centers, headquarters and cloud environments. Instead, they can securely leverage commercial internet services to simplify connectivity, improve resiliency, and accelerate access to applications. This approach not only reduces operational expenses but also minimizes latency, supports zero trust principles, and enables agencies to build a safe, flexible and repeatable solution that meets TIC security objectives without taxing the user experience.

How TIC 2.0 hinders zero trust progress

Another inefficiency — and perhaps one of the most significant — of TIC 2.0 is its incompatibility with zero trust principles. As federal leaders move into the next phase of zero trust, focused on efficiency, automation and rationalizing cyber investments, TIC 2.0’s limitations are even more apparent.

Under TIC 2.0’s “castle and moat” model, all traffic, whether for email, web services or domain name systems, must be routed through a small number of geographically constrained access points. TIC 3.0, in contrast, adopts a decentralized model that leverages SASE and SSE platforms to enforce policy closer to the user and data source, improving both security and performance.

To visualize the difference, think of entering a baseball stadium. Under TIC 2.0’s “castle and moat” approach, once you show your ticket at the entrance, you can move freely throughout the stadium. TIC 3.0’s decentralized approach still checks your ticket, but ushers and staff ensure you stay in the right section, verifying continuously rather than once.

At its core, TIC 3.0 is about moving trust decisions closer to the resource. Unlike TIC 2.0, where data must travel to centralized security stacks, TIC 3.0 brings enforcement to the edge, closer to where users, devices and workloads actually reside. This aligns directly with zero trust principles of continuous verification, least privilege access and minimized attack surface.
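The stadium analogy maps cleanly onto code. This minimal sketch (hypothetical data model and function names, not any vendor's API) contrasts a one-time perimeter check with a per-request zero trust check that also verifies device posture and least-privilege access to the specific resource:

```python
# Illustrative contrast: "check once at the gate" vs. continuous verification.
# The user dictionary and resource names are hypothetical.

def perimeter_check(user: dict) -> bool:
    # TIC 2.0 "castle and moat" style: one check at the access point,
    # then free movement anywhere inside.
    return user["authenticated"]

def zero_trust_check(user: dict, resource: str) -> bool:
    # TIC 3.0 / zero trust style: every request re-verifies identity,
    # device posture and least-privilege access to this specific resource.
    return (
        user["authenticated"]
        and user["device_compliant"]
        and resource in user["allowed_resources"]
    )

analyst = {
    "authenticated": True,
    "device_compliant": True,
    "allowed_resources": {"hr-portal"},
}

print(perimeter_check(analyst))                # ticket accepted at the gate
print(zero_trust_check(analyst, "hr-portal"))  # allowed: right "section"
print(zero_trust_check(analyst, "finance-db")) # denied despite a valid ticket
```

The point of the sketch is the second function's extra arguments: the decision depends on the resource being requested and the current state of the device, not just on having passed the gate once.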

How TIC 3.0 addresses TIC 2.0 inefficiencies

By decentralizing security and embracing SASE-based architectures, TIC 3.0 reduces latency, increases efficiency and enables agencies to apply modern cybersecurity practices more effectively. It gives system owners better visibility and control over network operations while allowing IT teams to manage threats in real time. The result is smoother, faster and more resilient user experiences.

With TIC 3.0, agencies can finally break free from the limitations of earlier TIC iterations. This modern framework not only resolves past inefficiencies, it creates a scalable, cloud-first foundation that evolves with emerging threats and technologies. TIC 3.0 supports zero trust priorities around integration, efficiency and rationalized investment, helping agencies shift from maintaining legacy infrastructure to enabling secure digital transformation.

Federal IT modernization isn’t just about replacing technology; it’s about redefining trust, performance and resilience for a cloud-first world. TIC 3.0 provides the framework, but true transformation comes from operationalizing that framework through platforms that are global, scalable, and adaptive to mission needs.

By extending security to where users and data truly live — at the edge — agencies can modernize without compromise: improving performance while advancing zero trust maturity. In that vision, TIC 3.0 isn’t simply an evolution of policy; it’s the foundation for how the federal enterprise securely connects to the future.

Sean Connelly is executive director for global zero trust strategy and policy at Zscaler and former zero trust initiative director and TIC program manager at CISA.

The post How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress first appeared on Federal News Network.

© Getty Images/iStockphoto/go-un lee


Risk & Compliance Exchange 2025: Former DOJ lawyer Sara McLean on ensuring cyber compliance under the False Claims Act

1 December 2025 at 12:41

Since January 2025, the Justice Department has been aggressively holding federal contractors accountable for cybersecurity violations under the False Claims Act.

Over the last 11 months, the Trump administration has announced six settlements out of the 14 since the initiative began in 2021.

Sara McLean, a former assistant director of the DOJ Commercial Litigation Branch’s Fraud Section and now a partner with Akin, said the Trump administration has made a much more significant push to hold companies, especially those that work for the Defense Department, accountable for meeting the cyber provisions of their contracts.

Sara McLean is a former assistant director of the DOJ Commercial Litigation Branch’s Fraud Section and is now a partner with Akin.

“I think there are going to be a lot more of these announcements. There’s been a huge uptick just since the beginning of the administration. That is just absolutely going to continue,” McLean said during Federal News Network’s Risk & Compliance Exchange 2025.

“The cases take a long time. The investigations are complex. They take time to develop. So I think there are going to be many, many, many more announcements, and there’s a lot of support for them. Cyber enforcement is now embedded in what the Justice Department does every day. It’s described as the bread and butter by leadership.”

A range of high-profile cases

A few of the high-profile cases this year so far include an $875,000 settlement with Georgia Tech Research Corp. in September and a $1.75 million settlement in August with Aero Turbine Inc. (ATI), an aerospace maintenance provider, and Gallant Capital Partners, a private equity firm that owned a controlling stake in ATI during the time period covered by the settlement.

McLean, who wouldn’t comment on any one specific case, said in most instances, False Claims Act allegations focus on reckless disregard for the rules, not simple mistakes.

“We’ve seen in some of the more recent announcements new types of fact patterns. What happens is when announcements are made that DOJ has pursued a matter and has resolved a matter, that often leads to the qui tam relators and their attorneys finding more matters like that and filing them,” said McLean, who left federal service in October after almost 27 years. “It’ll be interesting to see if these newer fact patterns yield more cases that are similar.”

Recent cases that involve the security of medical devices or the qualifications of cyber workers performing on government contracts are two newer fact patterns that have emerged over the last year or so.

Launched in 2021, the Justice Department’s Civil Cyber-Fraud Initiative uses the False Claims Act to ensure contractors and grantees meet the government’s cybersecurity requirements.

President Joe Biden signed an executive order in May 2021 that directed all agencies to improve “efforts to identify, deter, protect against, detect and respond to” malicious cyberthreats.

130 DOJ lawyers focused on cyber

Justice conducted a 360 review of cyber matters and related efforts, and one of the ideas that emerged was to use the False Claims Act to hold contractors and grantees accountable and drive a change in behavior.

“The motivation was largely to improve cybersecurity and also to protect sensitive information, personal information, national security information, and to ensure a level playing field, so that you didn’t have some folks who were meeting the requirements and others who were not,” McLean said.

“It was to ensure that incidents were being reported to the extent the False Claims Act could be used around that particular issue. Because the thought was that would enable the government to respond to cybersecurity problems and that still is really the impetus now behind the enforcement.”

McLean said the Civil Cyber-Fraud Initiative is now embedded as part of the DOJ’s broader False Claims Act practice. It has about 130 lawyers, who work with U.S. attorney’s offices as well as agency inspectors general offices.

Typically, an IG begins an investigation either based on a qui tam or whistleblower filing, or a more traditional review of contracts and grants.

The IG will assign agents and DOJ lawyers will join as part of the investigative team.

McLean said the agents are on the ground, interviewing witnesses and applying all the resources that come from the IGs. DOJ then decides, based on the information the IGs bring back, to either take some sort of action, such as intervening in a qui tam lawsuit and taking it over, or to decline or settle with a company.

“They go back to the agency for a recommendation on how to proceed. So it’s really the agencies and DOJ who are really in lockstep in these matters,” she said. “DOJ is making the decision, but it’s based on the recommendation of the agencies and with the total support of the agencies.”

Many times, Justice decides to intervene in a case or seek a settlement depending on whether the company in question has demonstrated reckless disregard for federal cyber rules and regulations.

McLean said a violation of the False Claims Act requires only reckless disregard, not intentional fraud.

“It’s critically important for anyone doing business with the government, especially those who are signing a contract and agreeing to do something, to make sure that they understand what that is, especially in the cybersecurity area,” she said. “What they’ve signed on to can be quite complicated. It can be legally complicated. It can be technically complicated. But signing on the dotted line without that understanding is just a recipe for getting into trouble.”

When a whistleblower files a qui tam lawsuit, McLean said that ratchets up the entire investigation. A whistleblower can be entitled to up to 30% of the government’s recovery, whether through a decision or a settlement.

Self-disclosures encouraged

If a company doesn’t understand the requirements and doesn’t put any resources into trying to understand and comply with them, that can lead to a charge of reckless disregard.

“When it comes to employee qualifications, it’s the same thing. If a contract says that there needs to be this level of education or there needs to be this level of experience, that is what needs to be provided. Or a company can get into trouble,” McLean said.

“The False Claims Act applies to making false claims and causing false claims. It’s not just the company that’s actually directly doing business with the government that needs to worry about the risk of False Claims Act liability, because a company that’s downstream, like a subcontractor who’s not submitting the claims to the government, could be found liable for causing a false claim, or, say, an assessor could be found liable for causing a false claim, or a private equity company could be found liable for causing a false claim. There are individuals who can be found liable for causing and submitting false claims.”

She added that False Claims Act allegations can apply not only to just the one company that has the direct relationship with the government but also to their partners if they are not making a good faith effort to comply.

But when it’s a mistake, maybe an overpayment or something similar, the company can usually claim responsibility and address the problem quickly.

“DOJ has policies of giving credit in False Claims Act settlements for self-disclosure, cooperation and remediation. That is definitely something that is available and that companies have been definitely taking advantage of in this space,” McLean said. “DOJ understands that there’s more focus on cybersecurity than there used to be, and so there are companies that maybe didn’t attend to this as much as they now wish they had in the past. The companies discover that they’ve got some kind of a problem and want to fix it going forward, but then also figure out, ‘How do I make it right and in the past?’ ”

McLean said this is why vendors need to pay close attention to how they comply with the DoD’s new Cybersecurity Maturity Model Certification.

She said when vendors sign certifications that they are complying with CMMC standards without fully understanding what that means, that could be considered deliberate ignorance.

“Some courts have described it as gross negligence. Negligence would be a mistake. I don’t know if that helps for the nonlawyers, but corporations which do not inform themselves about the requirements or not taking the steps that are necessary, even if it’s not through necessarily ill intent, but it’s not what the government bargained for, and it’s not just an accident. It’s a little bit more than that, quite a bit more than that,” she said.

“The one thing that’s important about that development is it does involve more robust certifications, and that is something that can be a factor in a case being a False Claims Act case and being more or less likely to be one that the government would take over. Because signing a certification when the information is not true starts to look like a lie, which starts to look like the more intentional type of fraud … rather than a mistake. It looks reckless to be signing certifications without doing this review to know that the information that’s in there is right.”

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.

The post Risk & Compliance Exchange 2025: Former DOJ lawyer Sara McLean on ensuring cyber compliance under the False Claims Act first appeared on Federal News Network.

© Federal News Network


Risk & Compliance Exchange: Cyber AB’s Matt Travis on scaling the CMMC ecosystem

The Cybersecurity Maturity Model Certification program is officially off the ground.

CMMC is the Pentagon’s program to evaluate whether defense contractors are following requirements for protecting controlled unclassified information. The cybersecurity requirements, based on National Institute of Standards and Technology controls, have been in Defense Department contracts since 2016.

It took years for CMMC to become a reality. But the final rule to implement CMMC into contractual requirements took effect Nov. 10. The rule establishing CMMC as a program had already gone into effect last year.

DoD has a phased implementation plan for the program. During Phase 1, over the next year, the department will largely require CMMC self-assessments from contractors. But DoD programs have the discretion to require Level 2 CMMC third-party assessments over the next year as needed.

Tackling third-party CMMC assessments

During Phase 2, starting next November, those third-party assessments will become standard in applicable contracts.

Those third-party assessments are a key facet of the CMMC program and its goal to ensure defense contractors follow cybersecurity requirements.

The Cyber Accreditation Body is responsible for authorizing the CMMC third-party assessment organizations (C3PAOs) that will carry out those independent assessments. And Matthew Travis, CEO of The Cyber AB, said work is well underway on building out the scaffolding that will support the CMMC program.

“If there’s any remaining skepticism of whether or not the department was serious about this conformity regime, you can now just look at the Code of Federal Regulations and see both rules there,” Travis said during Federal News Network’s Risk & Compliance Exchange 2025. “Now, the real challenge is to scale the ecosystem.”

‘Impending bow wave’

So far, just under 500 defense contractors have voluntarily achieved a Level 2 CMMC certification, Travis shared.

But the Pentagon has estimated that the requirement for a Level 2 third-party assessment could apply to as many as 80,000 companies as CMMC is phased in.

“I am concerned about the impending bow wave that I think we’ll see in demand,” Travis said.

Some C3PAOs already have a backlog of assessments that stretch into next year.

“Now is the time to move if you’re ready,” Travis added. “People are going to start racing to the checkout line, and it’s going to be a wait. So move now if you’re ready, and if you’re not ready, get ready, because the sooner you do it, the sooner you’ll be able to get a slot.”

Among the voluntary Level 2 assessments that have occurred to date, Travis said “false starts” have been an issue for some organizations.

“We heard frequently from the C3PAOs that they had to call it off mutually once the organization seeking certification realized all the things that they hadn’t fully done,” Travis said. “And the C3PAO said, ‘We might want to pause here. Go back to work and call us when you’re ready.’ ”

Travis said the 110 security requirements under Level 2 go beyond technical controls.

“It does require an organizational commitment,” he said. “There are physical security requirements, there are training requirements that human resources has to be involved in. There are leadership requirements in terms of resourcing.”

Another key lesson gleaned from early assessments is the need for companies to understand their external service providers. Travis said most organizations rely on cloud service providers or managed service providers for many IT and cybersecurity needs.

But whether they’re a CSP or an MSP — and to what extent they are involved in an organization’s handling of controlled unclassified information — are crucial questions in a CMMC assessment.

“Knowing who’s helping you and knowing your organization is fully committed are probably the two biggest takeaways that we’re hearing from industry,” Travis said.

CMMC’s ‘long pole in the tent’

The Cyber AB, through its no-cost contract with the Pentagon, is responsible for authorizing C3PAOs and certifying the people who conduct CMMC assessments.

Travis said there are just under 600 certified CMMC assessors today. Half of them are eligible to lead assessment teams.

But to meet the envisioned scale of the CMMC program — evaluating tens of thousands of defense contractors annually — Travis estimates there’s a need for between 2,000 and 3,000 assessors.

“That’s the most important part of the ecosystem that has to be grown. … That’s a long pole in the tent,” Travis said.

Initially, the challenge to building a pool of assessors was DoD’s drawn-out rulemaking process: There was no financial incentive to become an assessor with no CMMC requirements on the horizon.

But Travis said the challenge now is getting CMMC assessors through the process quickly enough as DoD phases in the requirements. The process of becoming an assessor involves training, exams and passing a Tier 3 DoD background investigation, which is equivalent to being investigated for a secret-level security clearance. Those investigations can often take months.

Travis said assessors don’t necessarily need to start with a technical background. He pitched it as a “great way for folks to get engaged in cybersecurity.”

“Whether it’s a full time job or a side hustle, these assessors are going to be in demand,” Travis said. “And so the compensation that goes with it, I think, is compelling. We are encouraging folks, if they haven’t considered entering into the CMMC program, think about becoming an assessor.”

Discover more articles and videos now on our Risk & Compliance Exchange 2025 event page.

The post Risk & Compliance Exchange: Cyber AB’s Matt Travis on scaling the CMMC ecosystem first appeared on Federal News Network.

© Federal News Network


DOGE and its long-term counterpart remain, with a full slate of modernization projects underway

25 November 2025 at 18:30

The Department of Government Efficiency, the driving force behind the Trump administration’s cuts to the federal workforce and executive branch spending, isn’t wrapping up operations sooner than expected, according to several administration officials.

Reuters published a story on Sunday claiming that DOGE no longer exists, about eight months ahead of the deadline set by President Donald Trump. The story drew strong reactions from Trump administration officials, who rejected claims that DOGE is ending before its final day on July 4, 2026.

A DOGE spokesperson told Federal News Network on Tuesday that DOGE and its longer-term, tech-aligned counterpart, the U.S. DOGE Service, both remain — and that the latter organization is moving forward with a full slate of modernization projects.

The spokesperson, in response to written questions, confirmed DOGE still exists as a temporary organization within the U.S. DOGE Service, and that Amy Gleason remains the acting administrator of USDS.

In addition, the spokesperson said the U.S. DOGE Service — a Trump-era rebranding of the U.S. Digital Service — is working on several cross-agency projects. The spokesperson said USDS is actively involved in these projects, but the agencies in charge oversee staffing and hiring. The list of projects shared with Federal News Network closely resembles the type of work that USDS was involved in before the Trump administration.

“The U.S. DOGE Service remains deeply engaged across government, modernizing critical systems, improving public services, and delivering fast, practical solutions where the country needs them most,” the spokesperson said.

Office of Personnel Management Director Scott Kupor wrote on X that “DOGE may not have centralized leadership under USDS,” but the “principles of DOGE remain alive and well.”

Those principles, he added, include deregulation; eliminating fraud, waste and abuse; and reshaping the federal workforce.

Kupor wrote that DOGE “catalyzed these changes,” and that OPM and the Office of Management and Budget “will institutionalize them.”

It’s not clear that DOGE leadership ever set exact demands for its representatives scattered across multiple federal agencies. Current and former DOGE representatives publicly stated that DOGE leadership played a hands-off role in their day-to-day work, and that they identified primarily as employees of their agencies. Former DOGE employees said they rarely heard from Elon Musk, DOGE’s former de facto leader, once they completed their onboarding to join the Trump administration.

DOGE wrote on X that “President Trump was given a mandate by the American people to modernize the federal government and reduce waste, fraud and abuse,” and that it terminated 78 contracts worth $335 million last week.

The DOGE spokesperson said the U.S. DOGE Service is working on a project to use AI to process over 600,000 pieces of federal correspondence each month, and is working with the General Services Administration to advance “responsible AI governmentwide.”

Current U.S. DOGE Service projects include:

  • Supporting 18 million students by modernizing the FAFSA system and implementing major student loan and Pell Grant changes.
  • Improving access to benefits with a streamlined, public-option verification tool that helps states accelerate community engagement requirements for Medicaid and SNAP approvals.
  • Transforming the non-immigrant visa process to support Olympic and World Cup travel with a more reliable, adaptable digital platform.
  • Reducing delays for over 600,000 veterans each month through a modernized VA disability compensation application.
  • Building a modern National Provider Directory to speed Medicare provider enrollment and enable nationwide interoperability.
  • Launching new patient-facing apps and data access tools, first announced at the White House and rolling out beginning January 2026.
  • Digitizing the National Firearms Act process, replacing outdated paper systems.
  • Using AI responsibly to process over 600,000 pieces of federal correspondence monthly.
  • Strengthening Medicare’s digital experience with better security, fraud reporting, caregiver access and reduced paper burden.
  • Improving VA appointment management with integrated scheduling, check-ins, notifications and after-visit support.
  • Advancing responsible AI government-wide through partnership with GSA.
  • Rapid-response deployments for Customs and Border Protection, FEMA, Medicare claims modernization, FDA data consolidation.

Gleason said in September that agencies don’t have enough tech talent to deliver on the administration’s policy goals, and they would need to boost hiring.

“We need to hire and empower great talent in government,” Gleason said on Sept. 4. “There’s not enough tech talent here. We need more of it.”

Under the Trump administration, federal employees have faced mass layoffs and incentives to leave government service. The Partnership for Public Service estimates that, as of October, more than 211,000 employees left the federal workforce this year — either voluntarily or involuntarily.

Gleason, who also serves as a strategic advisor for the Centers for Medicare and Medicaid Services, said tech hiring is essential to help CMS “build modern services for the American people.” She said the agency, at the beginning of this year, had about 13 engineers managing thousands of contractors.

“If we could hire great talent for tech in the government, I think in five years, we can really transform a lot of these systems to be much more modern and user-friendly, and easy for citizens to engage with what they need,” Gleason said. “But we have to take advantage of hiring.”

The post DOGE and its long-term counterpart remain, with a full slate of modernization projects underway first appeared on Federal News Network.

© AP Photo/Jose Luis Magana

FILE - Elon Musk flashes his T-shirt that reads "DOGE" to the media as he walks on South Lawn of the White House, in Washington, March 9, 2025. (AP Photo/Jose Luis Magana, File)

A new center aims to modernize federal lending at a scale few realize exists

25 November 2025 at 16:50

 

Interview transcript:

 

Doug Criscitello Very excited to get underway at the Center for USA Lending. The idea has been building really in my mind, and on the part of others from this community, the federal lending community, for several decades really. The U.S. government runs more than 125 federal loan and loan guarantee programs, and that’s at agencies like the Federal Housing Administration, the Small Business Administration, the Department of Agriculture has a variety of loan programs, and various others. There’s about a dozen federal agencies that have loan programs. And today, the U.S. government has evolved to a point where it’s really the world’s largest financial institution. Its credit portfolio alone now totals about $5 trillion, a huge number. So given the relative complexity of making and servicing loans — and these instruments have terms that can last for decades — managing the government’s huge credit portfolio has always been a tremendous challenge. You know, particularly when you compare it with simply providing a one-time cash grant to an intended beneficiary, that’s pretty simple. You’re just disbursing cash once. When we loan money, we’re entering into a long-term relationship with the borrower, technically, so the complexity is very significant.

Terry Gerton When you think about that massive portfolio, you’d said 125 different programs, 12 agencies, $5 trillion. Are there any specific programs that rise to the top of your visibility list in terms of desperately needing attention?

Doug Criscitello Let me answer that by talking about some of the good news, because huge strides have been made in recent decades. We’ve come a long way from the days when loan repayments were recorded on three-by-five index cards in pencil, right? So many of the systems that have been developed over the past few decades are huge advances relative to what we had prior to the sort of general use of computational power across the government. But notwithstanding those advancements, the systems that we have today are fragmented, outdated, they don’t communicate with each other. So, this creates a whole lot of administrative complexity. And borrower confusion. It drives up costs at the end of the day and it makes it difficult to manage risk or detect fraud. And it generally frustrates borrowers. I think if you did a man on the street interview, it wouldn’t be hard to find folks that have been frustrated in repaying a loan to the government.

Terry Gerton Well, your press release for the Center for USA Lending mentions modernization, technology, and integrity as core priorities. You just sort of glossed over them. But when I think about the financial industry, banking, and major corporations, they’re really at the front edge of technology, cybersecurity, identity management. How are you seeing the possibilities for bringing that kind of technology into how the government operates its loan portfolio?

Doug Criscitello Exactly right. So there are a lot of financial institutions that embrace modern technologies and are continuing to advance their use of cutting-edge tools. I think artificial intelligence is a terrific application here, right, to tailor the experience of borrowers, depending on their background, both in the application process and when it comes to servicing. Our hope is to really facilitate a dialog, not only across the government, but to bridge the gaps that exist between technology, private financial institutions and what they’re doing, and the U.S. government credit apparatus. Right now, there are huge opportunities to have really seamless systems from the time a borrower applies for a loan till the day they make the final payment. One agency that I’ve worked at and around for much of my career, the Small Business Administration, has made some amazing strides since the COVID pandemic, when it was forced to disburse nearly $1 trillion in Paycheck Protection Program loans and economic injury disaster loans. They’re in the midst of just an incredible improvement in the borrower experience, the disaster loan program being a great example. And we want to encourage that type of improvement to occur at other agencies as well.

Terry Gerton I’m speaking with Doug Criscitello. He’s the new executive director at the Center for USA Lending. Doug, coordinated technology investment is a perennial problem for the federal government. But setting that aside, you just described a situation that calls out for centralized governance, that calls out for data standardization. Beyond tech investment, what are your policy priorities for the center?

Doug Criscitello You’ve touched on some of them, for sure. The notion of trying to at least have a coherent approach across agencies, where we have common data definitions and agreement in principle that having these end-to-end systems are the way forward here. We really need to automate workflows and integrate systems. I mean, that’s priority one, to ensure that can be done. So look, there’s a lot that the center can do. One thing we’re planning to do is to convene the community. Let’s get folks — we plan to have frequent gatherings of both folks in government, folks in industry — to come together to explore how best to move forward and to continually evolve. It’s not a one-time fix, you know. These systems can continually be strengthened. The government has shown no signs of reducing the size of its footprint here in the lending world. So, you know, we want to be a convener. We want to develop thought leadership. We want to pull together data from across the federal lending enterprise into a common shared platform to help all of the participants in this realm better understand how these programs are performing and what we might do differently going forward.

Terry Gerton You’ve laid out a pretty bold and expansive vision there. If you’re successful, five years from now, what looks different about federal lending?

Doug Criscitello The stakes are really high with a $5 trillion portfolio. I think if we’re successful, our work will help enhance taxpayer value, importantly, by reducing wasteful spending on duplicated systems. We hope to enhance program integrity, detect fraud faster, and streamline access to loans. Particularly when they’re needed most, right? There are times when the federal government — and the pandemic was a great example — times when funds need to be put on the street quickly, effectively and efficiently while avoiding fraud. So our goal is really to make government lending more efficient. So whether you’re a borrower seeking faster service, a private lender who wants to have a harmonized relationship across all of their various federal loan guarantee programs in which they participate, or even just a taxpayer … importantly, a taxpayer who absolutely deserves efficient government operations. The center’s modernization efforts, I think, are poised to benefit you directly. So we’re really excited to get underway.

The post A new center aims to modernize federal lending at a scale few realize exists first appeared on Federal News Network.

© The Associated Press

FILE - Dallas Koehn plants milo in his field as wind turbines rise in the distance on May 19, 2020, near Cimarron, Kan. The federal government announced Tuesday, Oct. 18, 2022, a program that will provide $1.3 billion in debt relief for about 36,000 farmers who have fallen behind on loan payments or face foreclosure. (AP Photo/Charlie Riedel, File)

IRS tech chief directs staff to take ‘skills assessment’ ahead of IT reorganization

21 November 2025 at 17:25

The IRS, ahead of an upcoming reorganization of its tech office, is putting its IT staff to the test.

The agency, in an email sent Monday, directed its IT workforce to complete a “technical skills assessment.”

IRS Chief Information Officer Kaschit Pandya told employees that the assessment is part of a broader effort to gauge the team’s technical proficiency, ahead of an “IRS IT organizational realignment.”

“Over time, hiring practices and role assignments have evolved, and we want to ensure our technical workforce is accurately aligned with the work ahead. The assessment will help establish a baseline understanding of our collective strengths and areas for development,” Pandya told staff in an email sent Monday.

Pandya’s office is leading the technical skills assessment, in coordination with the Treasury Department, the IRS human capital office and the Office of Personnel Management.

“I want to emphasize that this is a baseline assessment, not a performance rating. Your individual-level results will not affect your pay or grade,” he told staff. “I know this comes during a very busy and uncertain time, and I deeply appreciate your partnership.”

Pandya told staff that a “limited group” of IRS IT employees in technical roles — including developers, testers and artificial intelligence/machine learning engineers — have been invited to complete the test. He told staff that, as of Monday, about 100 employees were directed to complete the assessment.

On Friday, an IRS IT employee told Federal News Network that several hundred employees have now completed the assessment, and that it took employees about 90 minutes to complete it.

According to the employee, Pandya told staff in an all-hands meeting on Friday that one of the agency’s goals is to rely more on full-time IT employees, and less on outside contractors. He said during that meeting that the IRS currently has about 6,000 IT employees and about 4,500 contractors.

“It doesn’t make sense, considering all the RIFs, firings and decisions that ignored expertise,” the IRS IT employee said.

The IRS has lost more than 25% of its workforce so far this year, largely through voluntary separation incentives. Pandya told staff in an email this summer that the agency needs to “reset and reassess,” in part because more than 2,000 IT employees have separated from the IRS since January. The IRS had about 8,500 IT employees at the start of fiscal 2025.

The agency also sent mass layoff notices to its employees during the government shutdown, but has rescinded those notices as required by Congress in its spending deal that ended the shutdown.

The Treasury Department sent reduction-in-force notices to 1,377 employees during the recent government shutdown — as part of a broader RIF that targeted about 4,000 federal workers. Court documents show the IRS employees received the vast majority of those RIF notices, and that they disproportionately impacted human resources and IT personnel at the IRS.

The technical assessment is also in line with goals set by Treasury CIO and Department of Government Efficiency representative Sam Corcos, who recently said IRS IT layoffs were “painful,” but necessary for the agency’s upcoming tech reorganization.

In a recent podcast interview, Corcos said much of his time as Treasury CIO has been focused on projects at the IRS, and that the agency’s IT workforce doesn’t have the necessary skills to deliver on its long-term modernization goals.

“We’re in the process of recomposing the engineering org in the IRS, which is we have too many people within the engineering function who are not engineers,” he said. “The goal is, let’s find who our engineers are. Let’s move the people who are not into some other function, and then we’re going to bring in more engineers.”

Corcos estimated that there are about 100 to 200 IRS IT employees currently at the organization that he trusts to carry out his reorganization plans.

“When you go in and you talk to people, a lot of the people, especially an engineer, the engineers on the team, they want to solve this problem. They don’t feel good about the fact that this thing has been ongoing for 35 years and will probably never get done. They actually want to solve these problems.”

IT employees at several agencies have gone through evaluations and assessments during the Trump administration. Tech employees at the General Services Administration were also interviewed and questioned about their skills and expertise by GSA and DOGE leadership. GSA later downsized its Technology Transformation Services office and shuttered its 18F tech shop.

In March, the IRS removed 50 of its IT leaders from their jobs and put them on paid administrative leave. Corcos defended that decision, saying the IRS “has had poor technical leadership for roughly 40 years.”

Corcos said those former IRS IT leaders pushed back on DOGE’s audit of government contracts. The agency, he added, spent an “astounding” amount on cybersecurity contracts, but former leaders resisted cutting and scaling back any of those contracts.

“The initial leadership team just said, ‘Everything is critical, you can’t cut anything. In fact, we need more,’” Corcos said. “And when we swapped them out for people who were more in the weeds, who knew what these things were, we found actually quite a lot that we could cut.”

The post IRS tech chief directs staff to take ‘skills assessment’ ahead of IT reorganization first appeared on Federal News Network.

© AP Photo/Patrick Semansky

Turning the government’s contact centers into engines of intelligence to power federal modernization efforts

Evan Davis views the federal government’s modernization efforts as a strategic opportunity to rebuild trust and achieve mission success through smarter, more human-centered service design.

The recent executive order on digital design makes the timing ideal, said Davis, executive managing director for federal growth at Maximus.

The key? Agencies need to use every citizen interaction as a data point to improve systems, predict needs and personalize service, he recommended during an interview for Federal News Network’s Forward-Thinking Government series.

Lean into the better design executive order

The Improving Our Nation Through Better Design EO is a natural extension of the push to improve experience and the relationship between the government and its constituents, Davis said. “That relationship needs to get built one encounter at a time.”

That matters because the public’s digital service expectations have expanded as commercial interactions have rapidly surpassed those offered in the public sector.

But right now, “there’s a recognition that these government experiences, when looked at carefully with new technology, can meet not only those new expectations but bring federal government encounters to a place where constituents feel appreciated and feel considered in those engagements,” Davis said.

Davis has spent more than two decades helping agencies connect with their constituents, much of it partnering within federal contact centers. We asked him to share his perspective on the most effective strategies and tactics for advancing digital maturity across government.

Position contact centers as strategic intelligence hubs

For starters, it’s critical to reenvision government contact centers as far more than transactional endpoints. Davis argues that they are rich, underutilized sources of qualitative data that reveal citizen intent, frustration and unmet needs.

With artificial intelligence and analytics, agencies can mine center interactions to inform policy, improve service design and respond in real time, he said.

“I’m constantly amazed by the wealth of untapped data insights hidden within federal agency call centers,” Davis noted, adding that center staff members also have a “real-time understanding of the incredible complexity of what it means to engage with the government.”

  • 3 tactics: To take advantage of contact center data, Davis suggests that agencies should:
    • Analyze call transcripts for patterns in citizen needs.
    • Use insights to refine FAQs, digital flows and policy language.
    • Feed findings into broader customer experience and service improvement efforts.

With this approach, agencies — for the first time — “can truly use data to influence policy, to influence an understanding of what’s important to citizens,” he said.
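The first tactic, analyzing call transcripts for patterns, can be sketched with a simple keyword tally. The categories and trigger phrases below are hypothetical examples; a production system would use an NLP model rather than string matching.

```python
from collections import Counter

# Hypothetical need categories and trigger phrases -- invented for
# illustration, not drawn from any real agency taxonomy.
CATEGORIES = {
    "password_reset": ["reset my password", "locked out", "can't log in"],
    "claim_status": ["status of my claim", "claim update", "still pending"],
    "payment_issue": ["payment failed", "didn't receive my payment"],
}

def categorize_transcripts(transcripts):
    """Count how many transcripts mention each need category."""
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for category, phrases in CATEGORIES.items():
            # Count each category at most once per transcript.
            if any(p in lowered for p in phrases):
                counts[category] += 1
    return counts

calls = [
    "Hi, I'm locked out of the portal and need help.",
    "Checking on the status of my claim from March.",
    "My payment failed twice this week.",
    "What's the status of my claim? It's still pending.",
]
print(categorize_transcripts(calls).most_common(1))  # → [('claim_status', 2)]
```

Feeding counts like these into FAQ updates or digital flow redesigns is the kind of loop Davis describes: the most frequent category points at the content most worth improving first.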

Build a digital-first, omnichannel foundation

Davis stressed that digital first doesn’t mean digital only. Agencies must unify systems and channels to guide citizens to the right help, whether that’s a chatbot, a human agent or a proactive SMS update. An omnichannel foundation will enable cost savings, faster service and trust-building through transparency, he said.

“Digital first is not digital separate. … How do I use that first point of contact to get people to the right place based on where they are at the moment?”

The goal, Davis explained, is to reduce the total amount of time and individual actions that citizens must take to address a need.

  • 3 tactics: He suggested that to establish that omnichannel foundation, agencies should:
    • Consolidate legacy contact center systems into a scalable, modular platform.
    • Standardize agent interfaces and data flows.
    • Enable proactive outreach across channels.

“It will also give agents, regardless of the exact content that they’re responsible for, the same user interface, the same pane of glass to look at every day,” Davis said. “It will also allow them to start pulling in that huge amount of data and doing something with it to inform what next steps they should take.”

Use AI to decode intent and predict needs

Understanding why a citizen contacts an agency is often more complex than a dropdown menu can capture. Davis explained that AI can uncover true intent, match it to policy requirements and guide citizens to resolutions faster. It can also help leaders spot emerging issues before they escalate.

“AI has already proven incredibly adept at understanding true intent of the citizen’s needs” at the micro level and gives agencies more options to quickly respond appropriately, he said. And at the macro level, “you can rely on AI to answer things like: What’s changed today? What do I need to know when I wake up this morning as the leader of citizen engagement?”

  • 3 tactics: To speed response times through integrating AI capabilities, Davis recommended that agencies should:
    • Deploy AI-powered intelligent virtual assistants and agent-assist tools.
    • Use AI to analyze qualitative data and surface trends.
    • Train models using up-to-date knowledge management systems.

Long term, by integrating AI in these ways and moving to modernized data infrastructures, Davis expects agencies will achieve a state of ongoing transformation and be able to incrementally improve and scale services.
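Micro-level intent decoding can be illustrated with a minimal keyword-overlap router that matches a free-text request to an intent and hands it to a channel. The intents, keywords and channel names here are invented for illustration; a real deployment would rely on a trained language model, as Davis suggests.

```python
# Hypothetical intent-to-channel mapping -- a sketch of the routing idea,
# not any agency's actual configuration.
INTENTS = {
    "schedule_appointment": {
        "keywords": {"appointment", "schedule", "reschedule", "visit"},
        "channel": "self_service_portal",
    },
    "report_fraud": {
        "keywords": {"fraud", "suspicious", "scam", "unauthorized"},
        "channel": "human_agent",
    },
    "general_question": {
        "keywords": {"question", "how", "what", "info"},
        "channel": "virtual_assistant",
    },
}

def route_request(text):
    """Return the (intent, channel) pair with the highest keyword overlap."""
    words = set(text.lower().split())
    best_intent, best_score = "general_question", 0
    for intent, spec in INTENTS.items():
        score = len(words & spec["keywords"])
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent, INTENTS[best_intent]["channel"]

print(route_request("I need to reschedule my appointment"))
```

The design choice worth noting is the fallback: when nothing matches, the request still lands somewhere (here, a virtual assistant) rather than dead-ending, which mirrors the "digital first is not digital separate" principle.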

Why tackling service systems matters now

Davis tied these tactics directly to the urgency of the moment: aging systems, rising citizen expectations and the availability of transformative technologies. Agencies must act now, not just to modernize, but to deliver on their missions more effectively, he said.

The beauty of integrating contact center data sources and analyzing that data in real time, Davis pointed out, is that agencies can begin correlating the interaction circumstances that tend to increase costs with those that tend to erode trust.

“We can begin looking at incredible positive change — to both provide cleaner, simpler, more cost-effective solutions but also to rebuild trust.”

Discover more ways to use technology to reimagine how your agency meets its mission in our Forward-Thinking Government series.

The post Turning the government’s contact centers into engines of intelligence to power federal modernization efforts first appeared on Federal News Network.

© Federal News Network


When the hotspots go dark, who connects the unconnected?

19 November 2025 at 19:14

Interview transcript:

Sam Helmick The E-Rate Hotspot Lending Program is built on about three decades of the FCC’s E-Rate program, which has enabled libraries and schools to have discounts for broadband connectivity as we continue to develop 21st-century readers, learners and skills. And so traditionally that E-Rate funding could only be used for connections within libraries and school buildings. But then in 2024, the then-FCC chairwoman really launched this beautiful program called Learn Without Limits. And that expanded eligibility for the Wi-Fi hotspot devices that libraries could retain to be circulated much like books, particularly to households without reliable or affordable broadband. And the American Library Association deeply supported this. And it was executed in more than 800 libraries across the nation; school and public libraries have utilized this service. It’s about $34 million worth of hotspot funding in the year of 2025 to make meaningful connectivity change for Americans.

Eric White Okay, got it. So the FCC voted to virtually end the program back on September 30th. What happened there? What was their reasoning for giving that and does that truly mean the end of the program, or are there other avenues that the program could take to stay alive?

Sam Helmick You’re absolutely right. On September 30th of this year, the FCC voted 2-1 to rescind the hotspot lending program and the school bus Wi-Fi initiative. The majority argued that the E-Rate statute didn’t authorize funding for services used beyond library and school property. But the American Library Association, along with many of our partner organizations, disagree with that interpretation and have really urged the FCC to reconsider and maintain the program. This decision reverses rules adopted in 2024 that had just begun to take effect, and we’re already sort of seeing the 2025 E-Rate cycle being denied. And we understand that a reader denied is literacy denied, and a connectivity divide is almost like participation in civic and educational life denied.

Eric White Yeah, particularly in those rural areas where you may not have a steady connection. You can still obviously access the internet in the library, but you know, when you’re in a teaching scenario and you don’t want to take up the computer for too long because then you start to feel guilty, right? So what other options do folks have who are out in those rural areas that relied on this program?

Sam Helmick If the federal government isn’t prepared to create a robust infrastructure for broadband for our national security, entrepreneurial and economic development, and pursuit of educational wellness and happiness, then I think that we have to think about those students that are on bus rides for up to like three hours a day, back and forth, trying to accomplish their homework. Or folks who are applying for jobs on Sundays because it’s the only day they have off, but the library isn’t supported or resourced enough to be open to them for their public access computers. Also, folks who are trying to attend telehealth appointments, access government services, or even connect with loved ones. Often I think folks forget that libraries are spaces where during both triumph and trials in a community, this is where folks need to go to access internet to tell the broader world and their loved ones that they’re safe and they’re fine. And so we’re really thinking about the broad spectrum of American life and how the lack of connectivity infrastructurally has been devastating. And this was an effort to mitigate that devastation. Now to lose this really leaves a lot of Americans in the lurch.

Eric White We’re speaking with Sam Helmick, president of the American Library Association. Let’s talk about federal support for public libraries in general. I’ve spoken to your organization in the past. There were some concerns about dwindling support, and obviously cuts have come across the board for a lot of federal programs, and I’m sure that libraries are not immune to that. Do I have that correct? And you know, where do things currently stand?

Sam Helmick Oh, you’re absolutely right. In 2024, the Institute of Museum and Library Services awarded $266.7 million through grant-making, research and policy development that particularly supported not only our state libraries across the nation, but then our small and rural libraries that rely on those matching state dollar funds to make sure that our tax dollars are working twice and three times over. So with the executive order seeking to dismantle that institute, as well as the lack of robust or comprehensive release of the congressionally mandated funds that fund that institute that support libraries around the country and therefore communities around the country, libraries are experiencing resource scarcity at the federal and then the state and then at the local level. Because despite the fact that those federal dollars have been paid by the taxpayers, they’re not getting returned back. And then if you have contracts through those state consortiums or state libraries, those contracts didn’t end just because the congressionally mandated dollars were not provided to the states. And so this is creating an undue burden on state taxes and taxpayers, and then that trickles down to hurting rural communities that are the least-resourced, but probably the most in need, when it comes to their community anchor institutions, which are a public or a school or an academic library.

Eric White Yeah, I was going to say I’m in no way living in a rural area, but going to any of the libraries in my vicinity, they’re as crowded as ever. So it seems as if the need for resources is almost at an all-time high at a time when they may not have all the support they need.

Sam Helmick Increasingly you and I understand that having digital connection is going to allow us to not only thrive civically but economically, educationally, and then just socially. And so to bar that access to any American, particularly in a country that is so well-resourced and rich, feels counterintuitive to ensuring that we continue to be a nation that thrives 250 years into our story.

Eric White All right, so the situation is what it is. What steps are organizations like yourselves taking, and are there other options on the table, you know, nonprofits, things of that nature? Or is it really just going to come down to more states and more local governments are going to have to step in if they want to save these libraries?

Sam Helmick I think it’s holding anybody, regardless of where they sit on the aisle, accountable to understanding that more Americans visit libraries than they do baseball games, which is our national pastime. And that 70% of us are not interested in abridging or censoring information for any reason — not for economic reasons, not for ideological reasons. That’s a large spectrum of American life, through third-party surveys, that show us how much we value access to information. So how do we support those values? Well, first we recognize that we’re about to be 250 years old as a nation, and that this unique form of government had an essential mechanism called libraries, which is why a lot of our founders invested in them, because they wanted a robust constituency and society that was educated so that it could progress and have informed decisions when it came to civic life. And if we’re going to continue to value that, that means we need to use our libraries. We need to dust off our library cards and make sure that they’re active. Increasingly and regularly, as folks who want to get into the advocacy piece, it’s visiting ALA.org/advocacy to learn how you can write an email, invite your Congress member to come visit their local libraries in their areas of representation, join a city council, join a library board of trustees, join a school board so that your voice and fingerprints are part of the conversation. It’s writing to your legislators and reminding them that you wanted to robustly support your libraries, and so you’re asking them to write policy and create funding that will make that manifest. And then lastly, you can also visit ILoveLibraries.org, so that if you’re wanting to support the American Library Association and library practitioners that are doing this work, you can donate your store, you can donate funds to support moving this national value 250 years into the future.

Eric White You bring up the 250 years portion, and that provides me a nice segue. Your organization is almost 150 years old. From a historical standpoint, have the nation’s libraries ever really gone through anything like this before? I’m just curious if you have any historical perspective on whether we’ve been here before, you know, through tumultuous times throughout American history.

Sam Helmick Great opportunity to tell a story. I love telling stories, Eric. In 1938, Des Moines Public Library director Forrest Spaulding wrote the Library Bill of Rights. And I think he did it for a few reasons. We had just gone through a Great Depression and recognized how instrumental our libraries were to supporting their communities during economic strife, but also lifting them up to build entrepreneurial and economic development. But we were also between the world wars, recognizing that we were a melting pot. And sometimes the ideas and values of a very vibrant culture, they blend and harmonize, but sometimes they also brush and create friction. And so creating a set of values where it talks about the right to use reading rooms, the right to find books that both counter and support your own ideology, the right to assemble, the right to speak and to read were essential. And in 1939, the American Library Association adopted that to become an international [symbol] of free people reading freely. And so when I think about our history, I think libraries have been very good at growing at the pace of their societies, turning inwardly to think about how they can do the work better, and then relying on their communities to do the work best. And so while I would argue that we probably are seeing a difficult time, probably something that even counters McCarthyism in the United States, we have always turned in and relied on our communities and our values to push through. And so using your library, visiting ALA.org/advocacy, using your voice to speak to those that you’ve elected into power — this has always been the recipe. And if we all stay in character, I think we can continue to thrive.

The post When the hotspots go dark, who connects the unconnected? first appeared on Federal News Network.

© The Associated Press

St. Stephen Middle School student Lakaysha Governor works on her Chromebook on Monday, March 20, 2017, on a school bus recently outfitted with WiFi by tech giant Google, as College of Charleston professor RoxAnn Stalvey looks on in St. Stephen, S.C. Lakysha is one of nearly 2,000 students in South Carolina's rural Berkeley County benefiting from a grant from Google, which on Monday unveiled one of its WiFi-equipped school buses in the area. (AP Photo/Meg Kinnard)

OPM’s HR modernization strategy sets next sight on USA Hire

19 November 2025 at 17:12

While much attention across the federal community has been focused on the Office of Personnel Management’s strategy to consolidate 119 different human capital systems across government, the agency, at the same time and with little fanfare, kicked off another major human resources modernization effort.

OPM is planning to revamp the USA Hire platform, which provides candidate-assessment tools for agency hiring managers, with the goal of making evaluations more efficient and leading to higher-quality applicants.

OPM, working with the General Services Administration, issued a request for information on Oct. 7 and has been meeting with vendors over the last few weeks to determine what commercial technologies and systems are available. The RFI closed on Oct. 21.

“This RFI is part of OPM’s ongoing effort to ensure agencies have access to cutting-edge, high-quality assessment tools that help identify and hire the best talent across the federal government—advancing a truly merit-based hiring system in line with the president’s Merit Hiring Plan and Executive Order 14170, Reforming the Federal Hiring Process and Restoring Merit to Government Service,” said an OPM spokesperson in an email to Federal News Network. “OPM also anticipates making additional improvements to USAJOBS and USA Staffing to enhance the applicant experience and better integrate assessments into job announcements.”

OPM says in fiscal 2024, USA Hire customer agencies used the program to assess approximately 1 million applicants for over 20,000 job opportunity announcements. It provides off-the-shelf standard assessment tests covering more than 140 federal job series, access to test center locations worldwide and a broad array of assessment and IT expertise.

“USA Hire currently offers off-the-shelf assessment batteries covering over 800 individual job series/grade combinations, off-the-shelf assessment batteries covering skills and competencies shared across jobs (e.g., project management, writing, data skills, supervisory skills), and custom assessment batteries targeting the needs of individual agencies, access to test center locations worldwide, and a broad array of assessment and IT expertise,” OPM stated in the RFI.

In the RFI, OPM asked industry for details on the capabilities of their assessment systems, including:

  • Delivering assessments in a secure, unproctored asynchronous environment
  • Delivering online video-based interviews
  • Using artificial intelligence/machine learning in assessment development and scoring
  • Minimizing and/or mitigating applicant use of AI (e.g., AI chatbots) to improve assessment performance
  • Integrating and delivering assessments across multiple assessment platforms

“OPM seeks an assessment delivery system that can automatically score closed-end and open-ended responses, including writing samples. The online assessment platform shall be able to handle any mathematical formula for scoring purposes,” the RFI stated. “Based on the needs of USA Hire’s customers, OPM requires an assessment platform that supports static, multi-form, computer-adaptive (CAT), and linear-on-the-fly (LOFT) assessments delivered in un-proctored, in-person, and remote proctored settings.”

An industry executive familiar with USA Hire said OPM, through the RFI, seems to want to fix some long-standing challenges with the platform.

“RFI suggests OPM will allow third parties to integrate into USA Staffing, which has been a big problem for agencies who weren’t using USA Hire. But I’ll believe it when I see it,” said the executive, who requested anonymity in order to talk about a program they are involved with. “Agencies are not mandated to use USA Hire, but if they don’t use it, they can’t use USA Staffing because of a lack of integration.”

USA Staffing, like USA Hire, is run by OPM’s HR Solutions Office on a fee-for-service basis. The agency says it provides tools to help agencies recruit, evaluate, assess, certify, select and onboard more efficiently.

RFI is a good starting point

The executive said this lack of integration has, for some agencies, been a problem if they are using other assessment platforms.

For example, the Transportation Security Administration issued an RFI back in 2024 for an assessment capability, only to decide to use USA Hire after doing some market research.

“USA Hire is adequate for most things the government does. It’s fine for certain types of programs, but if you get out of their swim lanes, they have trouble, especially with customization or configurations. I think getting HR Solutions to do any configurations or customization is a yeoman’s effort,” the executive said. “My concern about USA Hire is it’s a monopoly and when that happens any organization gets fat and lazy. Maybe the Department of Government Efficiency folks kicked them in the butt a little and that’s maybe why we are seeing the RFI.”

The executive said the RFI is a positive step forward.

“It could be good for some companies if it comes to fruition and OPM brings in a legitimate way for other providers with some unique competencies or services to expand the offering from USA Hire,” the executive said. “It’s too early to tell if there will be an RFP, but if they do come out, what are they buying? Are they trying to bring on new assessment providers? I think a lot of us would like to know what OPM is looking for or what holes they are seeking to fill in these new solutions.”

Other industry sources say OPM has laid out a tentative schedule for a new USA Hire support services solicitation. Sources say OPM is planning to release a draft request for proposals in January with a final solicitation out in October.

This means an award will not happen before 2027.

“Due to the complexity of requirements and the amount of market research that needs to be conducted, the USA Hire PMO expects the competition timeline to be more than a year long,” OPM said in a justification and approval increasing the ceiling of the current USA Hire contract. “The government estimates that transition could take up to two years depending on the awardee’s solution.”

OPM adds $182M to current contract

OPM released the J&A at the same time it issued the RFI. In it, OPM increased the ceiling of its current USA Hire support contract with PDRI, adding $182.7 million for a total contract value of $395 million.

OPM says it needed to increase the ceiling because of the Transportation Security Administration’s (TSA) adoption of USA Hire and its need to fill thousands of vacant positions after the COVID-19 pandemic.

“Because of the EO, the need for USA Hire assessments has far exceeded the initial estimated amount, which has grown at a pace far faster than anticipated when the contract requirements and needs were first drafted and awarded,” OPM stated in the J&A. “OPM planned for the steady growth of USA Hire throughout all options of the contract; however, TSA alone has consumed 95% of the requirement in option year 2 and option year 3. The government issued a modification to realign ceiling value to support the additional assessments; however, the delivery of the assessments has increased significantly.”

An email to PDRI seeking comment on the increased ceiling and the RFI was not returned.

The OPM spokesperson said the agency expects the use of USA Hire to continue to grow over the next few years as agencies implement skills-based assessments as required under the Merit Hiring Plan and Chance to Compete Act.

OPM said in its J&A that it expects USA Hire to provide assessment services to 300,000 applicants for TSA, 10,000 entry level investigators for U.S. Immigration and Customs Enforcement, along with smaller customer agencies spanning cybersecurity positions, tax fraud investigations, entry level credit union examiners and HR specialists.

The post OPM’s HR modernization strategy sets next sight on USA Hire first appeared on Federal News Network.


From small business roots to mid-tier powerhouse, this firm is using employee ownership and AI to stay ahead in federal contracting

19 November 2025 at 14:52

Interview transcript:

 

Travis Mack Over the years, you know, growing a small business is kind of an iterative process. You learn a lot of things along the way. And we had done very well in the small business vertical, but when we got to that point where we were trying to make that inflection, that turn to trying to be a large business, there were a couple things that we were considering. Were we going to remain a small business or were we just going to blow right through it? And we decided to kind of blow right through the small business threshold. And with that, we had to do a few things differently. We certainly had to upgrade our talent, which was really important, right? We had to also look at trying to drive additional revenue streams, trying to create additional value for the federal government. And so we decided on, not only were we going to grow organically, we were going to grow inorganically as well, which kind of led to our strategy of mergers and acquisitions and incorporating that into our organic growth.

Terry Gerton Well as you say, growing past that small business to large business zone can be really, really challenging. But you’ve kept Saalex as an employee-owned company. How did that decision factor in to your growth strategy?

Travis Mack It factored in because as we were making the transition, we had to figure out how we were going to attract the best and the brightest. And it was actually one of our core strategic decisions on us trying to go and become a large business. It has been the kind of the pillar of us trying to grow. So us becoming and transitioning into an employee-owned organization was really something that I thought of and I said to myself, “if you were going to be asked to work 80, 90 hours a week, what would you want, Travis?” I said I’d probably want equity. And hence, you know, the employee-owned building blocks that we utilize today in order to attract the best and the brightest for Saalex.

Terry Gerton Is that a strategy that you think is sustainable as you continue to grow the company?

Travis Mack Absolutely. We’ve seen it demonstrated before. We think it’s an excellent strategy for us to continue to scale and for those who are willing to put in that work, put in that extra effort. We think it’s something that … because it’s not only the top of the spectrum that’s gaining, it’s the entire organization, because everyone at Saalex has equity and we want that community.

Terry Gerton So you mentioned a little bit about your growth through acquisition strategy. You’re clearly not trying to blend in, you really want to set yourself apart. How do you set yourself apart from the other big primes in the defense and federal space?

Travis Mack We think it’s part of certainly, you know, being an ESOP, having that equity component. We also think it’s from us being unique. We’ve really embraced automation, we’ve really embraced AI, we’ve really embraced security in order to give ourselves a differentiating feel to the organization. And so we think, at our size, being more agile than maybe some of the larger primes, being more efficient than maybe some of the larger primes, and really just trying to understand what the core problem is and then solving for that, we think that is a differentiating vertical for us, and we’ve leaned into that. So, you know, we’re an AI-first organization building in automation and AI through every single business system, every single component, and then that efficiency, that effectiveness really translates very seamlessly to the federal customer.

Terry Gerton So that strategy through mergers and acquisitions can really shake up company culture as you’re bringing in different organizations. How have you managed to build an organic Saalex culture and hold on to that through that growth cycle?

Travis Mack It’s a process. And you know, it takes time. It really does, especially now with all the new changes, with how you implement artificial intelligence efficiently, bringing in different organizations within one culture. We’ve launched an initiative called One Saalex, really just trying to focus everyone on — it’s one infrastructure, which is backed by an AI-first mindset, and bringing everybody in and just trying to demonstrate the efficiencies of the platform and how we are supporting our end customers. So we take it day by day. We try to talk about what the benefits are; and it’s a lot of training, Terry. It is truly a lot of training and a lot of — I kid, every single day, half of my battle is changing hearts and minds. And I’ve got to show up every day changing hearts and minds and showing the innovation and showing how, at the end of the day, it’s actually better.

Terry Gerton And you’re bringing in folks with some really amazing technical talent, clearance capability, high-tech roles. How are you finding the job market, and then how do you find the integration once you get them on board?

Travis Mack The job market right now is something that we focus a lot on, right? I mean, the lifeblood of what we do is with individuals, with people. And true enough, we’re trying to scale that with AI and things of that nature. But really it’s about us being out there in the community. It’s about us being active. It’s about us defining and identifying roles that, you know, we can fit individuals into very, very seamlessly. I think we’ve been certainly very forward-leaning with the mechanisms by which we hire. Traditional ways of hiring aren’t necessarily at the top of mind these days. So we try to be flexible, we try to be nimble, we try to be innovative, we try to do all those things that we think will entice individuals to come and work with Saalex.

Terry Gerton And one of those things, as you already mentioned, is being an AI-first company. So how do you deploy that kind of fast-moving technology, both in Saalex and then for your customers to keep them on the cutting edge?

Travis Mack Well, we’re not going out building large language models for the federal government. That’s not what we’re doing. We’re going to let them handle that. You know, ours had to be from a services perspective, right? And so we had to figure out, how do we engage and utilize AI from a services perspective? First we thought about, hey, okay, what does that look like? Our journey with AI actually started about two years ago, and we really started to focus on AI functionality within all of our business systems. We took that and then we put in the digital connectors with RPA, with robotic process automation, you’ve got to have that digital connection, and then at the end of the day, trying to deploy that from a federal perspective and integrating that with the customers and the uses and creating digital workforce agents and the whole nine yards. And so we’ve tried to be innovative. We think that utilizing AI gives us an agile advantage, you know, over some of the larger competitors that we have. We’re able to move a little bit quicker as a mid-market federal contractor, and so we’re excited about, what are those new use cases, what are those new concepts that we’re delivering? We’re thinking about the work differently, Terry, every single day, and that requires a total mind shift.

Terry Gerton Well speaking about thinking about the work differently, we’ve talked about your growth strategy, we’ve talked about your workforce culture and training, we’ve talked about your tech approach. But the world of federal contracting and defense contracting is changing very, very rapidly. So as you look forward, say five years, what do you see for Saalex, and how are you positioning them to take advantage of the opportunities you see?

Travis Mack I’m going to try to pull out my magic ball here, put my Nostradamus hat on. Difficult question because of how fast things are changing. And what we’re trying to do is just be iterative. What we don’t want to be, Terry, is late. That is the thing. And we know we’re going to have some false starts. We know we’re going to not get it right as we implement automation and AI and efficiency throughout the organization. Government agencies right now want speed, they want agility, they want efficiency, they want security, the whole nine yards, as they are trying to change how they do the work as well. Five years out, we really think that it is about the iterative process, it is about changing how we do the work, it’s about identifying where we can drive efficiencies, and it’s about how we can, in my thoughts, do more with less, honestly. Because that’s where we’re headed to. So we’re excited about building an infrastructure, building a capability that the federal government and government agencies can utilize with some of our technical services, right? We’re supplying software development, we’re doing test range management, a whole bunch of technical stuff with the Department of War. So we’re excited about, how do we deliver those services differently? And what does that look like? Because I think that’s what everyone is struggling with. What does that look like? We’re trying to help get some visibility and we know it’s iterative. We know it’s going to innovate, we know it’s going to continue to expand, but we just didn’t want to be late.

The post From small business roots to mid-tier powerhouse, this firm is using employee ownership and AI to stay ahead in federal contracting first appeared on Federal News Network.


DLA’s Tech Accelerator Team showing how to spur innovation

18 November 2025 at 15:38

The Defense Logistics Agency may have solved two problems every agency tends to struggle with — attracting new and innovative companies and changing the culture of its workforce to work with those firms.

DLA’s Tech Accelerator Team has shown it can do just that. Over the last several years it has been using what are considered traditional private sector methods to attract up-and-coming firms and take an agile approach to solving problems using interviews, data and market research.

David Koch, the director of research and development at DLA, said the agency launched the Tech Accelerator Team about six years ago with the idea of finding commercial technologies from non-traditional companies to solve their most pressing problems.

David Koch is the director of research and development at the Defense Logistics Agency.

“We don’t go into a problem with a solution in mind. We go into it solution agnostic,” Koch said in an interview with Federal News Network. “What is the problem that you want to solve? Then, let’s pull in a bunch of commercial folks that have tackled similar type of problems before. We usually do that through a request for information (RFI) that goes out to companies. We bring them in and we see what kind of solutions they throw up. We don’t go into it with a preconceived idea of how to solve this problem.”

Part of the challenge with this approach led by the Tech Accelerator Team was changing the way DLA leaders approached problems. Koch said they have done a lot of training around innovation to help DLA leaders and employees bring good ideas to fruition.

“It was more about, let’s interview senior leaders and let’s find a problem that we need to go solve. Now it’s really grown into a life of its own to where the program managers reach out and say, ‘Hey, I need a commercial solution for the problem that I have,’” he said. “I think a lot of times now it’s more internally focused, where we reach out to commercial solutions based on a problem that we know exists. We’ve become more aware of what’s going on across the organization. We know where those problem areas are, where there’s commercial opportunities to solve them.”

Koch pointed to an example of this approach in action with RGBSI Aerospace and Defense, a company providing engineering and technical support, in using digital twins differently. Koch said DLA had used digital twins for parts and for processes, but through this approach, the agency is using digital twins to improve its digital threads.

“You can pull in things like acquisition data, logistics data and manufacturing data, along with that thread so that you can pull in more industry partners and more people are available to make that part,” he said. “Now, what we do is we use a computer program to go in and follow where the data flows, and it maps the process for you. Sometimes you’re surprised when you find out how your process really works.”

The Tech Accelerator Team calls itself “DLA’s innovation broker,” working with other DoD and federal offices as part of a broad-based innovation ecosystem.

DLA spent $135 million in research and development in fiscal 2025 across three main portfolios:

  • Logistics
  • Manufacturing technology
  • Its small business innovation program

Koch said about $53 million went to manufacturing technology and about $17 million was for DLA business processes or logistics research and development. Additionally, DLA received about $44 million from Congress, most of which went into R&D for rare earth elements and other strategic materials.

Testing an automated inventory platform

Koch said heading into 2026, DLA will focus on four specific areas.

“The first one is strategic material recovery. We hosted [in September] our kickoff event for that being our newest manufacturing technology project. But that doesn’t mean that we’re just now starting strategic materials research. We’ve been doing it out of our SBIR now for the last few years. It’s very timely, it supports the stockpile and we’ve had some really good success stories,” he said. “[The second one is] additive manufacturing and it’s really about mainstreaming. We call it the joint additive manufacturing acceptability. But mainstreaming additive manufacturing is part of the normal supply chain process that the military can use when they order parts from DLA.”

The two other areas are artificial intelligence transformation and automated inventory management. Koch said DLA is testing the Marine Corps Platform Integration Center (MCPIC) and also adding new technology to the platform to help improve how they manage products across 25 distribution centers.

“We have a lot of stuff that’s outside, think big Strykers and tanks and stuff like that that are just out there in the open. So you need something like a drone that’s going to go around and capture that inventory. Then you have a lot of small things, think firearms and stuff like that that we have to do inventory. So that’s the backbone that we’re building it upon,” he said. “The idea is you walk down the aisle and your inventory populates on your laptop or your iPad. We think we can get there.”

He added that DLA is piloting the integrated technology platform at its distribution center in Anniston, Alabama.

“We spend tens of millions of dollars a year doing inventory, and it’s very people intensive. Our automated inventory project is all about automating that process,” Koch said. “The goal is that we can do 100% audit, totally automated, and save a lot of that funding, and then have that information feed into our warehouse management system. We’re definitely excited about the possibility.”

The post DLA’s Tech Accelerator Team showing how to spur innovation first appeared on Federal News Network.


Army personnel leaders are pushing hard to modernize how the service manages its people

14 November 2025 at 18:20

Interview transcript

Terry Gerton
The theme of AUSA is Agile, Adaptive, and Lethal. So how does that translate into what you all are working on in terms of the Army personnel strategy?

Brian Eifler Well, we always have to remember — and the chief just mentioned it, and the secretary did yesterday — everything we do is for the warfighter. Everything has to be from the soldier’s perspective. We could come up with some great plans, some great processes, some great things, but if they don’t help the warfighter and they just help those that are trying to help the warfighters, that’s not good enough. We need to have systems that are agile and adaptive and could be more up-to-date in the times, in the 21st century. We still do a lot of things, processes-wise, in the Pentagon, outside the Pentagon that have a lot of bureaucratic processes; some systems that are just old. I know Smoke talked about it earlier today, [we have] some of the processes that we use with paper and that we don’t really need, but we just keep doing it. Or we’ll take something, for example, a paper, and we’ll digitize it, put it in a computer and say, hey, we’re modern. That’s not what we need to be doing. We need to do something that takes away all the extra time, extra resources from a soldier and gives it back to them. Everything from coming into the Army — you come in, you input all your data, and you shouldn’t have to input it again. Every time we go to a meeting or go to the doctor or something, we are filling out forms again. We shouldn’t be doing that. There’s no need. It’s in the system. One of the big systems, [Integrated Personnel Pay System – Army (IPPS-A)] — everything feeds into that. It feeds into everything. So we’re trying to use the technology that’s available, that we have, to make it better instead of continuing to do things the old way. Transformation takes a lot of time and effort for a huge organization… It takes a little bit more time, and so I get a little impatient with it, as the chief and the secretary [said]. So, as you heard in the last 24 hours, we’re trying to push that and make it go faster. And it’s really expensive. And it shouldn’t be.
And we’ve got to cut that stuff out. We need to work with companies and contractors that are willing to do the right thing here and not just try to build the Army for a capability. And then we need to be able to fix it. Whether it’s rewriting code or fixing a piece of equipment, it shouldn’t be the right [of] the contractor to withhold that. We should be able to do that. And that’s the agility part. We’re never gonna be satisfied with our administrative system, IPPS-A. We’re not gonna finish it and say, okay, we’re done; nothing more to do. We’re always changing policies. We’re always changing something. So we’ve got to be able to adjust and tweak it to [modernize] and stay with the program, instead of staying [with] something that was — hey, it was good in 2017, and we’re still using it in the same way and not adjusting because we can’t. So that’s a little bit of what it means to me. And a lot of the paperwork, a lot of the processes for boards, how we do boards. You mentioned AI — putting AI into that to not replace people’s decisions, but to quell down the masses of files that they’re looking at, that AI can easily [be used] to deduce, here’s the population you really need to be looking at. So that’s just a little bit. I don’t want to take up all the time because I know Sgt. Maj. Stevens has got a couple of comments on that.

Smoke Stevens So, ma’am, you asked specifically [about] the strategy, and we just heard the chief talk about soldier touch points, and what does that mean when we talk about new equipment or kit? Specifically in our space, the people space, [it’s about] ensuring that our teammates, whether it’s from the Human Resources Command or from the IPPS-A team, they’re getting out to organizations, they are seeing what’s working best when it comes to our personnel and pay system. And then the same for us — we’ve gone to Poland, we’re going to Korea. We want to see what’s actually working, what’s not working, come back here to Crystal City, sit down with the IPPS-A team, with the contractors, and say, hey, make these tweaks, because we’ve seen troopers on the ground provide something that can make something better for the Army.

Terry Gerton So you all are dealing with soldiers in their whole life cycle, from recruitment to retirement. What is happening along that process? I mean, the Army’s doing pretty well with recruiting and retention right now. What’s driving your success and how are you going to sustain it?

Brian Eifler Well, there are multiple reasons, and you almost have to ask each of the trainees that are coming in why, because everybody has their own personal story, like a fingerprint, and I can’t put it down as one thing. But, [there are] a lot of good reasons to serve today, and we’re making it very appealing with the options that they can serve. Really, the [Future Soldier Prep Course (FSPC)] has been a boon of success. Those folks that maybe have been a little out of shape or didn’t test well in the academe — getting a second chance; these people have heart and grit. I [equate] it to walk-ons in football. So we get our Division 1-A scholarship players, they come in, but FSPC gives us the walk-on. These are the Rudies out there that just need a shot. And they’re probably better than Rudies because they do more than just one play, but they keep going and they have a heart and a grit that we are tapping into. And they are doing just as well as the people that didn’t go through that. So things like that are giving more opportunities to do things, and then once they get in, other opportunities to move from one specialty to another — both on the officer side and the enlisted — to do the reclass. That’s something that we’re really opening up earlier on. So you get in the Army like, hey, I didn’t really like this MOS, but I like that one. And there’s room, we can facilitate that. So those are some of the things along the way that we are doing.

Smoke Stevens Specifically with recruiting, one thing that we did identify that we got right was changing our recruiting workforce, focusing on talent acquisition, up-skilling the skill set that they had, the training they went through, sending them out to industry and making them better. That has paid dividends [tenfold]. And I will say part of the retention standpoint, I think it really goes back to leadership too. We have phenomenal leaders in our organizations that really understand the importance of, you got to get to know your service members, you got to make sure they’re going through challenging, tough, realistic training to show that, hey, this is what I came in the Army to do and I’m a part of something bigger than myself.

Terry Gerton So the professionalization of the recruiting force has been more than just a pilot. Now it’s gonna be institutionalized and really paying benefits.

Smoke Stevens Yes ma’am, absolutely. Both on the enlisted and the warrant officer side, we’re seeing the benefits of what talent acquisition is doing for us.

Terry Gerton That’s great to hear. Sir, you also mentioned at the beginning, you talked about all the [components] — active, guard and reserve. When you think about the total force, how would you assess the readiness of the total force these days?

Brian Eifler Well, I think we’re ready. And I think we never can say that we’re ready enough. I think, we always are on a journey of readiness. We always got to get better. And it’s not just becoming a trained expert in your craft, but as a collective element. So whether it’s a squad, a platoon, company, battalion, brigade, division, corps, theater, army, all of us have to continue. We can’t train enough. And that’s a challenge. It’s an expensive way to look at things, but it’s not as expensive as not doing it and going to war and paying for it in lives and the blood and treasure of the United States. So I think we always have to continue training. I’m very proud of the units I’ve been in and how we train and how [we] pursue doing hard, crucible-like, difficult, challenging training because that’s what combat’s going to bring. We have to continue to do that, and we have to keep pushing ourselves. It’s very dangerous, it’s very hard, but I don’t think we can ever be satisfied — to quote Ralph Puckett — where we are. We always have to keep getting better. And what we’re talking about really this week is the technology. We can’t just rest on the laurels. [The] M1 tank is a great tank, but it’s not good enough, not in the future. We have to make the changes that they’re talking about with some automation, some AI, and some of those things. So I think we’re really good. I still think we’re the best in the world. But I don’t think we can sit there and brag about it. I think we need to continue to measure up to that every day.

Terry Gerton Well the Army Transformation Initiative is changing equipment, it’s changing force structure. And equipment is changing so fast — the Army’s TRADOC, Training and Doctrine Command, Futures Command, now you have the Transformation Command — how is all of that playing into what a service member can expect to be able to do in the Army from the time they come in?

Smoke Stevens I think [it’s important to understand] the direction that the Army is going and knowing that, A — and both our secretary and our chief and our SMA always [say] this — the soldier and everything that we’re supposed to provide for the soldier is the focal point. As soon as you hear that, you should know right up front, I’m coming into an organization that cares about me and the things that I need to have in order to fight and win. So I think, understand the predictability of what we’re going to do for the servicemember is number one. The combination of headquarters realigning and saying, hey, we’re putting these specific resources to ensure that we’re finding efficiencies in our 30-plus years of serving — we have never seen anything like that where, again, the Department of the Army has said, this is what we need in order to give back to the soldiers.

Brian Eifler Yeah, I think it’s choose your own adventure, right? There are so many MOSs and specialties that you could do, what you can qualify for, and there’s always a journey that you can adjust from, if you don’t like something. But I think a lot of what we bring [to] retaining good talent is our leadership. People want to be valued, people want to [be] led by leaders that they want to emulate. The more people we have like that, the more people are going to want to stay in a unit or in a location or in the Army as a team because they feel like they’re valued, they’re making a difference, they’re being fulfilled, and their leaders are servant leaders. They’re caring for them and taking care of them, making sure they have the resources and the training to make them ready for war or combat or whatever happens.

Terry Gerton Well, you mentioned AI earlier and deploying AI into HR processes, but are you also bringing AI to bear in terms of continuous learning? It can’t just be, go into the basic course, go into advanced course, right? Things are changing so quickly. How are you bringing those kinds of modern tools into lifecycle management?

Brian Eifler Yeah, I think it’s, it’s a challenge, right? ’Cause it’s not something we’re used to. But as the chief just mentioned, there are two types of companies in the future, those that are powered by AI and those that are out of business. So we have to start incorporating it. We’re doing it a little bit slowly and measured because a lot of people are concerned about using too much AI, but I think we’re using it in chunks and steps forward. Like I implied with some of our promotion boards, what you could do, instead of spending several weeks on thousands of files and going through every file, you can probably use, with much success, artificial intelligence [and] machine learning to get down to a manageable file level so that you can really invest your time as a board member to not get tired, to not get jaded or lazy in your file assessments, but really focus on what you really need to do. And I think things like that are a great example of what we can do now. We should be doing that. And so we’re helping also by bringing in folks, direct commissioning folks that have data scientist backgrounds — coders and stuff like that — because we need to have some in the Army … And we have that in IPPS-A for example, we have some of those that just came in and they love doing this. They love fixing and solving Army problems that affect the entire Army. So we need, not only just, like you said, to teach it, but we need to breed it into how we do things and expectations and develop people to come in with that skill set that can help us get to the next level.

Smoke Stevens If I can add, on the enlisted side with [Professional Military Education (PME)], we started this journey a few years ago, with Digital Job Book, ensuring that — how much time does an actual human being instructing need in order to deliver a method of training? So again, we found efficiencies and will continue to lean on, you know, academia and the things that are happening out in the college world to bring into our professional military education.

Brian Eifler Yeah, at the scope and scale, we need external help. I mean, we can’t do it all ourselves.

Terry Gerton Well, you’ve talked about a lot of things that you have on your plate. A lot of initiatives, a lot of changes. What’s at the top of your Army G1 wish list?

Brian Eifler Do you want to go first?

Smoke Stevens At the top, the speed at which we can make policy changes. There are a lot of things that — and I’ll use an example, Gen. Eifler will go to a very senior sync, a whiteboard session, and it’s like, okay, this makes sense, let’s make this change. There are layers of things that we have to go through to make those changes. That’s tough when we know we want to deliver something to the field immediately. Obviously we don’t want to get ahead of anything that has to go through some type of review. But again, at the top of the list is how do we make policy changes at the speed that we need it to happen.

Brian Eifler Yeah, and how do we limit to, like, who really needs to see it? A lot of people like to have their hands on the steering wheel, but we only need one driver. And so when, like he implied, we’ll get something and everybody wants to do it. But then everybody will start to touch it, and we don’t need everybody to touch it. We just need to coordinate it. And I think that needs to be overhauled inside the Pentagon. And that’s what we’re trying to do. I think that’s good. I think one of my challenges, like the Secretary spoke yesterday, is making sure we’re holding accountable our industry partners with things — delivering, for example, IPPS-A and getting it to the next level, faster and less expensive, quite honestly. That is what keeps me up at night. This is for soldiers. We have the capability and technology, but it’s taken a little bit too long. We’re going to incorporate pay into IPPS-A. We got to get moving on that and we’ve got to change all the other systems that feed into IPPS-A as well as those that… it feeds into at the same pace, and make sure it’s all connecting. So that’s one of my big wish list [items]. And then also, if I have two wishes, the civilian workforce overhaul. I mean that’s something that we’ve been looking for, whether it’s pay scales, development. The hiring process is archaic. It’s not how we should be doing things. It’s got to be updated. [That’s] really above us, but those are the two wishes that I would probably wish for.

The post Army personnel leaders are pushing hard to modernize how the service manages its people first appeared on Federal News Network.


AI is solving problems it’s also creating

14 November 2025 at 18:01

Artificial intelligence has quickly become a centerpiece of cybersecurity strategy across government and industry. Agencies are under pressure to modernize, and AI promises to accelerate response times, automate enforcement, and increase efficiency at scale.

But there is a critical risk that’s not getting enough attention. Automation without visibility doesn’t eliminate complexity. It multiplies it. And for federal agencies operating under stringent mandates and oversight, that creates a dangerous blind spot.

When AI turns enforcement into chaos

Consider an organization that turned to AI to manage firewall rules. The idea was simple: Allow the AI to continuously generate and enforce rules, so that the network remained secure in real time. On paper, it worked. The AI delivered consistent enforcement and even a solid return on investment.

But when auditors stepped in, they discovered a problem. Instead of consolidating rules, the AI had simply layered them on repeatedly. What had been a 2,000-line ruleset grew into more than 20,000 lines. Buried within were contradictions, redundancies and overlaps.

For operators, the network functioned. But for compliance officers, it was a nightmare. Demonstrating segmentation of sensitive environments, something federal mandates and the Payment Card Industry Data Security Standard (PCI DSS) both require, meant combing through 20,000 rules line by line. AI had streamlined enforcement, but it had rendered oversight almost impossible.
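The core of the problem is mechanical: when rules are appended rather than consolidated, later rules end up fully covered by earlier ones and can never fire. As a rough illustration of the kind of check an independent auditing tool might run, here is a minimal sketch that flags shadowed rules under first-match-wins semantics. The `Rule` shape and field names are illustrative assumptions, not any vendor's format.

```python
import ipaddress
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    action: str  # "allow" or "deny"
    src: str     # source CIDR
    dst: str     # destination CIDR
    port: int

def covers(outer: Rule, inner: Rule) -> bool:
    """True if `outer` matches every packet `inner` matches."""
    return (
        outer.port == inner.port
        and ipaddress.ip_network(inner.src).subnet_of(ipaddress.ip_network(outer.src))
        and ipaddress.ip_network(inner.dst).subnet_of(ipaddress.ip_network(outer.dst))
    )

def shadowed_rules(rules: list[Rule]) -> list[int]:
    """Indices of rules fully covered by an earlier rule (first match wins).
    These rules can never fire and are candidates for consolidation review."""
    dead = []
    for i, rule in enumerate(rules):
        if any(covers(earlier, rule) for earlier in rules[:i]):
            dead.append(i)
    return dead
```

A real ruleset also needs contradiction detection (an `allow` shadowing a `deny`, or vice versa), which the same pairwise comparison supports; the point is that these checks are tractable for a tool and hopeless for a human reading 20,000 lines.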

This is the irony of AI in cybersecurity: It can solve problems while simultaneously creating new ones.

Masking complexity, not removing it

Federal IT leaders know that compliance is not optional. Agencies must not only enforce controls, but also prove to Congress, regulators and oversight bodies that controls are effective. AI-generated logic, while fast, often can’t be explained in human terms.

That creates risk. Analysts may be right that AI is enabling “preemptive” security, but it’s also masking the misconfigurations, insecure protocols and segmentation gaps that adversaries exploit. Worse, AI may multiply those issues at a scale human operators can’t easily trace.

In short, if you can’t see what AI is changing, you can’t secure it.

Federal mandates demand proof, not promises

Unlike private enterprises, federal agencies face multiple layers of oversight. From Federal Information Security Modernization Act audits to National Institute of Standards and Technology framework requirements, agencies must continuously demonstrate compliance. Regulators won’t accept “trust the AI” as justification. They want evidence.

That’s where AI-driven enforcement creates the most risk: It undermines explainability. An agency may appear compliant operationally but struggle to generate transparent reports to satisfy audits or demonstrate adherence to NIST 800-53, Cybersecurity Maturity Model Certification or zero trust principles.

In an environment where operational uptime is mission-critical, whether for Defense communications, transportation systems or civilian services, losing visibility into how security controls function is not just a compliance risk. It’s a national security risk.

Independent oversight is essential

The solution is not to reject AI. AI can and should play a vital role in federal cybersecurity modernization. But it must be paired with independent auditing tools that provide oversight, interpretation and clarity.

Independent auditing serves the same purpose in cybersecurity as it does in finance: verifying the work. AI may generate and enforce rules, but independent systems must verify, streamline and explain them. That dual approach ensures agencies can maintain both speed and transparency.

I’ve seen agencies and contractors struggle with this first-hand. AI-driven automation delivers efficiency, but when auditors arrive, they need answers that only independent visibility tools can provide. Questions like:

  • Is the cardholder or mission-critical data environment fully segmented?
  • Are insecure protocols still running on public-facing infrastructure?
  • Can we produce an auditable trail proving compliance with NIST or PCI requirements?
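The second question lends itself to automation. As a toy illustration only, the sketch below flags public-facing hosts exposing commonly insecure protocols, assuming a pre-built inventory mapping each host to its listening ports (in practice the inventory would come from a scanner, and the port list would be far longer).

```python
# Ports commonly associated with insecure or unencrypted protocols.
INSECURE_PORTS = {21: "FTP", 23: "Telnet", 80: "HTTP (unencrypted)", 161: "SNMPv1/v2"}

def flag_insecure(inventory: dict[str, list[int]]) -> dict[str, list[str]]:
    """inventory maps a public-facing host to its listening ports;
    returns only the hosts with findings, each with the protocols to remediate."""
    findings = {}
    for host, ports in inventory.items():
        hits = [INSECURE_PORTS[p] for p in ports if p in INSECURE_PORTS]
        if hits:
            findings[host] = hits
    return findings
```

The output doubles as audit evidence: an empty result is a provable "no insecure protocols" answer, and a non-empty one is a remediation list with a timestamp.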

Without these answers, federal agencies risk compliance failures and, worse, operational disruption.

The federal balancing act

Federal leaders also face a unique challenge: balancing security with mission-critical operations. In defense, for example, communication downtime in the field is catastrophic. In civilian agencies, outages in public-facing systems can disrupt services for millions of citizens.

This creates tension between network operations centers (focused on uptime) and security operations centers (focused on compliance). AI promises to keep systems running, but without visibility, it risks tipping the balance too far toward operations at the expense of oversight.

The federal mission demands both: uninterrupted operations and provable security. AI can help achieve that balance, but only if independent oversight ensures explainability.

Questions federal security leaders must ask

Before integrating AI further into their cybersecurity posture, federal leaders should ask:

  1. What visibility do we have into AI-generated changes? If you can’t explain the logic, you can’t defend it.
  2. How will we validate compliance against federal frameworks? Oversight bodies won’t accept black-box answers.
  3. What happens when AI introduces errors? Automation multiplies mistakes as quickly as it enforces controls.
  4. Do we have independent tools for oversight? Without them, auditors, regulators and mission leaders will be left in the dark.

Don’t trade clarity for convenience

AI is transforming federal cybersecurity. But speed without clarity is a liability. Agencies cannot afford to trade explainability for convenience.

The warning is clear: AI is quietly building operational debt while masking misconfigurations. Without independent oversight, that debt will come due in the form of compliance failures, operational disruption or even breaches.

Federal leaders should embrace AI’s benefits, but not at the cost of visibility. Because in cybersecurity, especially in government, if you can’t see what AI is changing, you can’t secure it.

Ian Robinson is the chief product officer for Titania.

The post AI is solving problems it’s also creating first appeared on Federal News Network.


How the administration is bringing much needed change to software license management

14 November 2025 at 17:06

Over the last 11 months, the General Services Administration has signed 11 enterprisewide software agreements under its OneGov strategy.

The agreements bring both standard terms and conditions as well as significant discounts for a limited period of time to agencies.

Ryan Triplette, the executive director of the Coalition for Fair Software Licensing, said the Trump administration seems to be taking cues from what has been working, or not working, in the private sector around managing software licenses.

Ryan Triplette is the executive director of the Coalition for Fair Software Licensing.

“They seem to be saying, ‘let’s see if we can import that into the federal agencies,’ and ‘let’s see if we can address that to mitigate some of the issues that have been occurring in some of the systemic problems that have been occurring here,’” said Triplette on Ask the CIO. “Now it’s significant, and it’s a challenge, but it’s something that we think is important that you understand any precedent that is set in one place, in this instance, in the public agencies, will have a ripple of impact over into the commercial sector.”

The coalition, which cloud service providers created in 2022 to advocate for less-restrictive rules for buying software, outlined nine principles that it would like to see applied to all software licenses, including that terms should be clear and intelligible, that customers should be free to run their on-premise software on the cloud of their choice and that licenses should cover reasonably expected software uses.

Triplette said while there still is a lot to understand about these new OneGov agreements, GSA seems to recognize there is an opportunity to address some longstanding challenges with how the government buys and manages its software.

“You had the Department of Government Efficiency (DOGE) efforts and you had the federal chief information officer calling for an assessment of the top five software vendors from all the federal agencies. And you also have the executive order that established OneGov and having them seeking to establish these enterprisewide licensees, I think they recognize that there’s an opportunity here to effect change and to borrow practices from what they have seen has worked in the commercial sector,” she said. “Now there’s so many moving parts of issues that need to be addressed within the federal government’s IT and systems, generally. But just tackling issues that we have seen within software and just tackling the recommendations that have been made by the Government Accountability Office over the past several years is important.”

Building on the success of the MEGABYTE Act

GAO has highlighted concerns about vendors applying restrictive licensing practices. In November 2024, GAO found vendor processes that limit, impede or prevent agencies’ efforts to use software in cloud computing. Meanwhile, of the six agencies auditors analyzed, none had “fully established guidance that specifically addressed the two key industry activities for effectively managing the risk of impacts of restrictive practices.”

Triplette said the data call by the federal CIO in April and the OneGov efforts are solid initial steps to change how agencies buy and manage software.

The Office of Management and Budget and GSA have tried several times over the past two decades to improve the management of software. Congress also joined the effort passing the Making Electronic Government (MEGABYTE) Act in 2016.

Triplette said despite these efforts the lack of data has been a constant problem.

“The federal government has found that even when there’s a modicum of understanding of what their software asset management uses, they seem to find a cost performance improvement within the departments. So that’s been one issue. You have the differing needs of the various agencies and departments. This has led them in previous efforts to either opt out of enterprisewide licenses or to modify them with their own terms. So even when there’s been these efforts, you find, like, a year or two or three years later, it’s all a wash,” she said. “Quite frankly, you have a lack of a central mandate and appropriations line. That’s probably the most fundamental thing and why it also differs so fundamentally from other governments that have some of these more centralized services. For instance, the UK government has a central mandate, it works quite well.”

Triplette said what has changed is what she called a “sheer force of will” by OMB and GSA.

“They are recognizing the significant amount of waste that’s been occurring and that there has been lock-in with some software vendors and other issues that need to be tackled,” she said. “I think you’ve seen where the administration has really leaned into that. Now, what is going to be interesting is because it has been so centralized, like the OneGov effort, it’s still also an opt-in process. So that’s why I keep on saying, it[’s still] to be determined how effective it will be.”

SAMOSA gaining momentum

In addition to the administration’s efforts, Triplette said she’s hopeful Congress finally passes the Strengthening Agency Management and Oversight of Software Assets (SAMOSA) Act. The Senate ran out of time to act on SAMOSA last session, after the House passed it in December.

The latest version of SAMOSA mirrors the Senate bill the committee passed in May 2023. It also is similar to the House version introduced in March by Reps. Nancy Mace (R-S.C.), the late Gerry Connolly (D-Va.), and several other lawmakers.

The coalition is a strong supporter of SAMOSA.

Triplette said one of the most important provisions in the bill would require agencies to have a dedicated executive overseeing software license asset management.

“There is an importance and a need to have greater expertise within the federal workforce, around software licensing, and especially arguably, vendor-specific software licensing terms,” she said. “I think this is one area that the administration could take a cue from the commercial sector. When they’re engaged in commercial licensing, they tend to work with consultants that are experts in the vendor licensing rules, they understand the policy and they understand the ins and outs. They often have somebody in house that … may not be solely specific to one vendor, but they may do only two or three and so you really have that depth of expertise, that you can understand some great cost savings.”

Triplette added that while finding these types of experts isn’t easy, the return on the investment of either hiring or training someone is well worth it.

She said some estimate that the government could save $50 million a year by improving how it manages its software licenses. This is on top of what the MEGABYTE Act already produced. In 2020, the Senate Homeland Security and Governmental Affairs Committee found that 13 agencies saved or avoided spending more than $450 million between fiscal 2017 and 2019 because of the MEGABYTE Act.

“The MEGABYTE Act was an excellent first step, but this, like everything, [is] part of an iterative process. I think it’s something that needs to have the requirement that it has to be done and mandated,” Triplette said. “This is something that has become new as you’ve had the full federal movement to the cloud, and the discussion of licensing terms between on-premise and the cloud, and the intersection between all of this transformation. That is something that wasn’t around during the MEGABYTE Act. I think that’s where it’s a little bit of a different situation.”

The post How the administration is bringing much needed change to software license management first appeared on Federal News Network.


Unlocking efficiency: The case for expanding shared services

13 November 2025 at 18:10

As the federal government contends with tightening budgets, lean staffing and soaring citizen expectations, it faces a unique opportunity — and obligation — to modernize service delivery by investing in shared services. The notion of centralized provisioning might conjure up all sorts of challenges and issues; however, it is a proven avenue to lower costs, eliminate duplication and elevate service performance. While shared services isn’t the answer for all functions, we possess valuable lessons learned and survey feedback on shared services that create a powerful pathway to increase government effectiveness while lowering costs.

The federal government has demonstrated tangible benefits of shared services related to administrative systems. For example, between fiscal 2002 and 2015, consolidating payroll and HR systems generated more than $1 billion in cost savings and an additional $1 billion when 22 payroll systems were consolidated into four, according to the Government Accountability Office. A 2024 report published by the Federation of American Scientists noted that consolidating the payroll centers yielded cumulative savings exceeding $3.2 billion. Other measurable results include the General Services Administration’s fleet management program, which consolidated more than 5,000 agency-owned vehicles and saved on leasing, maintenance and administrative costs. Shared IT services have also expanded steadily, with the adoption of Login.gov, USAJOBS, and data center consolidation that saved over $2.8 billion, according to GAO and the Office of Management and Budget in 2020.

Why it matters — and why now

The federal government continues to advance priorities that improve citizen services while driving down costs as we enter the artificial intelligence era, in which AI and data are transforming industries and human capabilities. Agencies are facing a pressing need to modernize IT systems, increase efficiencies by incorporating automation and AI, and increase cybersecurity. Without integrated business services, agencies will struggle to maintain infrastructure, secure their systems and modernize for the AI era. Consolidated IT investments, such as Login.gov and ID.me, have proven to provide stronger, more resilient platforms that enhance cybersecurity and protect mission-critical systems, provide standardized data and analytics and improve transparency.

Still, shared services must be implemented with care. Agencies need flexibility to select providers that best fit their mission-specific requirements. Focusing first on areas where agencies are already seeking solutions — such as accounting, fleet management and office space aligned by security requirements — offers a pragmatic path forward. Service providers must be held to strict performance standards, with service-level agreements ensuring that quality improves alongside efficiency. Equally important, strong leadership and coordination are necessary to sustain momentum.

Agencies like the Office of Personnel Management, GSA and Treasury, which have successfully acted as managing partners in the past, can provide the oversight and accountability required for long-term success. Rather than measuring Quality Service Management Offices (QSMOs) solely by their early momentum, their success should be understood in light of the current environment: smaller budgets, fewer staff and an increased focus on mission delivery. In this context, the adoption of integrated business services positions agencies for long-term gains.

Shared services provide the architecture for a more modern, efficient and mission-focused government. From payroll to fleet management to IT modernization, the federal government has demonstrated the value of this approach through billions of dollars in savings and significant performance improvements. With bipartisan policy support, proven blueprints and advances in shared platforms, the federal enterprise is well-positioned to expand shared services — carefully, collaboratively and with agency choice at its core. If pursued deliberately, shared services can become a cornerstone of fiscal responsibility and high-quality service delivery for the American people.

Erika Dinnie is the vice president of federal strategy and planning for MetTel. Before joining the company, Dinnie served as the General Services Administration’s associate chief information officer for digital infrastructure technologies for nearly 10 years, overseeing GSA’s IT infrastructure, systems, software and applications.

The post Unlocking efficiency: The case for expanding shared services first appeared on Federal News Network.
