
AI agents: The next layer of federal digital infrastructure

9 December 2025 at 14:13

For years, the conversation about artificial intelligence in government focused on model development — how to train algorithms, deploy pilots and integrate machine learning into existing workflows. That foundation remains critical. But today, federal leaders are asking a different question: What does an AI-native government look like?

The answer may lie in AI agents — autonomous, adaptive systems capable of perceiving, reasoning, planning and acting across data environments. Unlike traditional AI models that provide insights or automate discrete tasks, AI agents can take initiative, interact with other systems, and continuously adapt to mission needs. These systems depend on seamless access to 100% of mission-relevant data, not just data in a single environment. Without that foundation — data that’s unified, governed and accessible across hybrid infrastructures — AI agents remain constrained tools rather than autonomous actors. In short, they represent a move from static tools to dynamic, mission-aligned infrastructure.

For federal agencies, this shift opens up important opportunities. AI agents can help agencies improve citizen services, accelerate national security decision-making, and scale mission delivery in ways that were once unthinkable. But realizing that potential requires more than adopting new technology. It requires building the digital foundations (data architectures, governance frameworks and accountability measures) that can support AI agents as core elements of federal digital infrastructure.

A new phase for AI: Why agents are different

Federal agencies have decades of experience digitizing processes: electronic health records at the Department of Veterans Affairs, online tax filing at the IRS, and digital immigration services portals at U.S. Citizenship and Immigration Services within the Department of Homeland Security. AI has expanded those capabilities by enabling advanced analytics and automation. But most government AI systems today remain tethered to narrowly defined functions. They can classify, predict or recommend, but they do not act independently or coordinate across environments.

AI agents are different. Think of them as mission teammates rather than tools. For example, in federal cybersecurity, instead of just flagging anomalies, an AI agent could prioritize threats, initiate containment steps and escalate issues to human analysts — all while learning from each encounter. In citizen-facing services, an AI agent could guide individuals through complex benefit applications, tailoring support based on real-time context rather than static forms.
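To make that triage loop concrete, here is a toy sketch of the pattern in Python. The alert fields, risk scores and thresholds are all hypothetical illustrations, not any agency's actual pipeline:

```python
# Minimal sketch of an agent-style triage loop: score alerts, act on the
# worst ones automatically, and escalate the rest to a human analyst.
# All field names and thresholds are illustrative.

def triage(alerts, contain_threshold=0.9, escalate_threshold=0.6):
    """Return (contained, escalated, logged) lists of alert ids."""
    contained, escalated, logged = [], [], []
    for alert in sorted(alerts, key=lambda a: a["risk"], reverse=True):
        if alert["risk"] >= contain_threshold:
            contained.append(alert["id"])   # initiate containment step
        elif alert["risk"] >= escalate_threshold:
            escalated.append(alert["id"])   # hand off to human analyst
        else:
            logged.append(alert["id"])      # record for later learning
    return contained, escalated, logged

alerts = [
    {"id": "a1", "risk": 0.95},
    {"id": "a2", "risk": 0.70},
    {"id": "a3", "risk": 0.20},
]
contained, escalated, logged = triage(alerts)
```

The point of the sketch is the division of labor: routine containment is automated, ambiguous cases go to humans, and everything is recorded.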

This evolution mirrors the shift from mainframes to networks, or from static websites to dynamic cloud platforms. AI agents are not simply another application to bolt onto existing workflows. They are emerging as a new layer of digital infrastructure that will underpin how federal agencies design, deliver and scale mission services.

Building the foundations: Beyond silos

To function effectively, AI agents need access to diverse, distributed data. They must be able to perceive information across silos, reason with context and act with relevance. That makes data architecture the critical enabler.

Most federal data remains fragmented across on-premises systems, multi-cloud environments and interagency ecosystems. AI agents cannot thrive in those silos. They require hybrid data architectures that integrate separate sources, ensure interoperability and provide governed access at scale.

By investing in architectures that unify structured and unstructured data, agencies can empower AI agents to operate seamlessly across environments. For instance, in disaster response, an AI agent might simultaneously draw on Federal Emergency Management Agency data, National Oceanic and Atmospheric Administration weather models, Defense Department logistics systems, and public health records from the Department of Health and Human Services — coordinating actions across federal entities and with state partners. Without hybrid architectures, that level of coordination is impossible.
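The mechanical heart of such a hybrid architecture is mapping heterogeneous sources into one shared schema that a downstream agent can query uniformly. A toy sketch, with hypothetical source names and fields standing in for the systems above:

```python
# Toy sketch of unifying heterogeneous sources behind one common schema.
# Source names, fields and mappings are hypothetical illustrations.

def normalize(source, record):
    """Map a source-specific record into the shared schema."""
    mappers = {
        "weather": lambda r: {"region": r["zone"], "kind": "forecast", "value": r["wind_mph"]},
        "logistics": lambda r: {"region": r["area"], "kind": "supplies", "value": r["trucks"]},
    }
    return {"source": source, **mappers[source](record)}

unified = [
    normalize("weather", {"zone": "FL-01", "wind_mph": 110}),
    normalize("logistics", {"area": "FL-01", "trucks": 12}),
]
# Once normalized, one query spans every source for a region.
fl_view = [u for u in unified if u["region"] == "FL-01"]
```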

The second layer: Governance, trust, transparency

Equally important is governance. Federal leaders cannot separate innovation from responsibility. AI agents must operate within clear rules of transparency, accountability and security. Without trust, their adoption will stall.

Governance begins with ensuring that the data fueling AI agents is accurate, secure and responsibly managed. It extends to monitoring agent behaviors, documenting decision processes, and ensuring alignment with legal and ethical standards. Federal agencies must ask: How do we verify what an AI agent did? How do we ensure its reasoning is explainable? How do we maintain human oversight in critical decisions?
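One concrete building block for answering "how do we verify what an AI agent did" is an append-only, hash-chained action log: altering any past entry breaks the chain. A minimal sketch, with illustrative record fields:

```python
# Minimal sketch of a tamper-evident agent action log.
# Each entry's hash covers the action plus the previous entry's hash,
# so editing any past entry invalidates everything after it.
import hashlib
import json

def append_action(log, action):
    """Append an agent action, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"action": action, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})
    return log

def verify(log):
    """Recompute every hash; return False if any entry was altered."""
    prev_hash = "0" * 64
    for entry in log:
        body = {"action": entry["action"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_action(log, "classified case #1 as eligible")
append_action(log, "escalated case #2 to human reviewer")
```

A real deployment would anchor the chain in protected storage, but even this shape makes "what did the agent do, and when" an auditable question.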

By embedding governance frameworks from day one, agencies can avoid the pitfalls of opaque automation. Just as cybersecurity became a foundational consideration in every IT system, governance must become a foundational consideration for every AI agent deployed in the federal mission space.

For the federal government, trust is also non-negotiable. Citizens are owed AI agents that act fairly, protect their data, and align with democratic values. Transparency, meaning the ability to see how decisions are made and how outcomes are validated, will be essential to earning that trust.

Agencies can lead by adopting principles of responsible AI: documenting model provenance, publishing accountability standards, and ensuring diverse oversight. Trust is not a constraint; it is a mission enabler. Without it, the promise of AI agents will remain unrealized.

Preparing today for tomorrow

The question for federal leaders is not whether AI agents will shape the future of government service; it is how quickly agencies will prepare for that future. The steps are clear:

  • Invest in data infrastructure: Build hybrid, interoperable architectures that give AI agents access to 100% of mission-relevant federal data, wherever it resides.
  • Embed governance from the start: Establish frameworks for transparency, accountability and oversight before AI agents scale.
  • Cultivate trust: Communicate openly with citizens, publish standards and ensure that AI systems reflect public values.
  • Experiment with mission scenarios: Pilot AI agents in targeted federal use cases (cyber defense and benefits delivery, for instance) while developing playbooks for broader adoption.

We are at a turning point. Just as networks and cloud computing became indispensable layers of federal IT, AI agents are poised to become the next foundational layer of digital infrastructure. They will not replace federal employees, but they will augment them — expanding capacity, accelerating insight, and enabling agencies to meet rising expectations for speed, precision and personalization.

The future of the federal government will not be built on static systems. It will be built on adaptive, agentic infrastructure that can perceive, reason, plan and act alongside humans. Agencies that prepare today — by investing in hybrid architectures, embedding governance and cultivating trust — will be best positioned to lead tomorrow.

In the coming years, AI agents will not just support federal missions. They will help define them. The question is whether agencies will see them as one more tool, or as what they truly are: the next layer of digital infrastructure for public service.

Dario Perez is vice president of federal civilian and SLED at Cloudera.

The post AI agents: The next layer of federal digital infrastructure first appeared on Federal News Network.


Getting ahead of CMMC, FedRAMP and AI compliance before it gets ahead of you

8 December 2025 at 16:12

If 2025 felt like a whirlwind for regulatory compliance, you’re not imagining it. Between the finalization of Cybersecurity Maturity Model Certification (CMMC) 2.0 rules, the launch of FedRAMP’s 20x initiative promising faster authorizations, and new AI governance requirements from the Office of Management and Budget and the National Institute of Standards and Technology, organizations working with federal agencies faced enormous regulatory change.

As we head into 2026, the tempo isn’t slowing. The Defense Department is phasing CMMC into contracts to protect the defense industrial base. FedRAMP continues evolving as more agencies migrate critical systems to the cloud. And AI regulations are moving from principles to prescriptive requirements as governments grapple with the risks and opportunities of deploying AI at scale.

After leading hundreds of companies through compliance journeys and assessments — and going through them ourselves — we’ve learned that while each framework has nuances, three universal lessons apply.

Three lessons that apply to each framework

1) These frameworks are not like the ones you already know.

The biggest mistake? Treating CMMC like SOC 2 or assuming FedRAMP is “ISO 27001 for government.”

For example, CMMC Level 2 requires implementing all 110 NIST 800-171 requirements and 320 assessment objectives. Your system security plan alone could reach 200 pages. Budget more time, resources and specialized expertise than you think you need.

2) Scoping is a critical first step that organizations often get wrong.

Determining what’s in scope is one of the hardest and most important steps. I’ve seen companies believe 80% of infrastructure was in scope for CMMC, only to learn it was closer to 30%. Be ruthless about where controlled unclassified information actually lives. Every system you include can add months of work and tens of thousands in costs.

For FedRAMP, define your authorization boundary early. For AI governance, inventory every AI system, including embedded features in SaaS tools. Invest in scoping before implementing controls.

3) Automation is mission-critical, not optional.

Manual processes don’t scale when juggling multiple frameworks, and they leave you vulnerable to errors and inefficiencies. That’s why FedRAMP 20x and other frameworks today are evolving to put automation at the center of the process. Organizations that want continuous improvement must treat automation as core infrastructure, especially for monitoring controls, collecting evidence and surfacing real-time compliance data.

The real cost of playing catch-up

Companies treating compliance as a last-minute sprint face hundreds of thousands of dollars in average costs for CMMC Level 2 alone. They scramble, rush documentation and often fail their first assessment — and non-compliance can come at a hefty price.

Organizations that delay addressing compliance gaps are vulnerable to security risks. IBM’s 2025 Cost of a Data Breach Report showed that noncompliance with regulations increases the average cost of a breach by nearly $174,000.

Regulatory actions are rising too. The Department of Health and Human Services’ Office for Civil Rights issued 19 settlements and over $8 million in fines for HIPAA violations this year to date, already the highest on record for a single year.

Organizations that start early spend less and use compliance as a competitive advantage. When you’re behind, compliance is a burden; when you’re ahead, it’s a differentiator.

What you need to know right now

For CMMC 2.0

If you’re a prime contractor, subcontractor handling CUI, or external service provider in the DoD supply chain, start now.

Identify what type of information you handle, what certification level you need, and define your scope. Build your system security plan early and categorize assets as CUI, security-protected, contract-risk managed or out of scope.

When selecting a C3PAO assessor, look for transparent pricing, strong references and clear data-handling processes. You can achieve conditional certification with a plan of action and milestones, but you have only 180 days to remediate and must score at least 80% in SPRS.
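As a rough illustration of where that SPRS number comes from: the NIST SP 800-171 DoD assessment methodology starts from a perfect 110 and deducts a published weight (1, 3 or 5 points) for each unmet requirement. The gap list and weights below are illustrative examples, not the official table:

```python
# Sketch of the NIST SP 800-171 DoD Assessment Methodology scoring model:
# start at 110 and deduct each unmet requirement's weight (1, 3 or 5).
# The specific requirement ids and weights here are illustrative only;
# use the methodology's official table in practice.

MAX_SCORE = 110

def sprs_score(unmet):
    """unmet: dict of requirement id -> deduction weight."""
    return MAX_SCORE - sum(unmet.values())

def meets_conditional_threshold(score, threshold_pct=80):
    """True if the score clears the given percentage of the maximum."""
    return score >= MAX_SCORE * threshold_pct / 100

# Hypothetical gap list: two 5-point findings and one 3-point finding.
unmet = {"3.1.1": 5, "3.5.3": 5, "3.13.11": 3}
score = sprs_score(unmet)  # 110 - 13 = 97
```

Running the gap math early, before an assessor does it for you, is the cheapest way to learn whether the 180-day remediation window is realistic.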

For FedRAMP 20x

Keep in mind that FedRAMP isn’t a one-time audit. The true 20x objective is not just to speed up authorizations, but to achieve smarter and stronger security — and this requires preparation.

These steps are non-negotiable:

  • Build continuous monitoring infrastructure and processes from day one.
  • Ensure your authorization boundary is correct and your architecture documentation is precise. Ambiguity causes delays that stretch timelines beyond a year.
  • Automate evidence collection and continuous monitoring for monthly deliverables required to maintain authorization.
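The monitoring item above can start as small as a scripted drift check that compares observed configuration against the documented baseline and surfaces every mismatch. A minimal sketch with hypothetical control settings:

```python
# Sketch of a continuous-monitoring check: compare an observed system
# configuration against the documented baseline and surface drift.
# Setting names and values are hypothetical.

def find_drift(baseline, observed):
    """Return {setting: (expected, actual)} for every mismatch."""
    return {
        key: (expected, observed.get(key))
        for key, expected in baseline.items()
        if observed.get(key) != expected
    }

baseline = {"mfa_enforced": True, "session_timeout_min": 15, "tls_min": "1.2"}
observed = {"mfa_enforced": True, "session_timeout_min": 60, "tls_min": "1.2"}
drift = find_drift(baseline, observed)
```

Run on a schedule, a check like this turns the monthly deliverable from a scramble into an export.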

For AI governance

Federal AI regulations are quickly moving from principles to requirements. Establish AI governance councils now. Inventory AI systems comprehensively, document training data provenance, implement bias testing protocols and create transparency mechanisms.

As OMB and NIST frameworks take hold, AI governance will become a standard procurement requirement through 2026.

Five steps to start today

1) Start with an honest gap assessment.

Most companies are further behind than they think, particularly on incident response and supply chain risk management. Know your baseline before building your roadmap.

2) Treat documentation like code.

Your system security plan, policies and authorization package shouldn’t be static Word documents. Your documentation needs to be a living architecture that is version-controlled, regularly updated and, ideally, machine readable.
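One way to make "documentation as code" concrete is to keep each control entry as structured data and validate it automatically, so a missing field or a stale review fails the check the way a broken test fails a build. A minimal sketch; the field names and the one-year staleness rule are assumptions for illustration:

```python
# Sketch of machine-readable SSP entries validated like code.
# Required fields and the staleness rule are illustrative choices.
from datetime import date

REQUIRED_FIELDS = {"control_id", "implementation", "owner", "last_reviewed"}

def validate_entry(entry, max_age_days=365, today=None):
    """Return a list of problems; an empty list means the entry passes."""
    today = today or date.today()
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - entry.keys()]
    reviewed = entry.get("last_reviewed")
    if reviewed and (today - reviewed).days > max_age_days:
        problems.append("entry is stale; review required")
    return problems

entry = {
    "control_id": "3.1.1",
    "implementation": "Access limited via role-based IAM policies.",
    "owner": "infosec",
    "last_reviewed": date(2025, 6, 1),
}
problems = validate_entry(entry, today=date(2025, 12, 8))
```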

3) Build compliance into procurement.

Create vendor risk assessment processes that evaluate CMMC readiness, FedRAMP authorization status and AI governance practices before signing contracts. For CMMC, ensure vendors provide Customer Responsibility Matrices documenting which NIST 800-171 controls they are responsible for.

4) Invest in your people.

Build exceptional compliance programs by upskilling existing staff. Send operations teams to CMMC training. Have developers learn secure coding for FedRAMP environments. Create AI literacy programs. Make compliance competency a core skill.

5) Prepare for continuous monitoring.

CMMC includes provisions for ongoing assessments and affirmations of compliance. FedRAMP requires continuous monitoring. AI governance demands continuous bias testing. Invest in automation systems and tools like trust centers that are able to demonstrate your up-to-date security and compliance posture any day of the year.

The opportunity in the complexity

Despite the challenges, companies getting compliance right are winning work they couldn’t before. Defense contractors and small businesses can use CMMC certification to compete for prime contracts. Cloud service providers who achieve FedRAMP authorization can significantly accelerate their federal sales cycles, cutting months from procurement timelines. AI startups land pilots by demonstrating responsible AI practices.

The companies that thrive treat compliance as something they control, not something that happens to them. They build security-first cultures, invest in the right tools and training, and transform compliance from cost center to competitive advantage.

The best time to start was yesterday. The second-best time is today, because 2026 promises even more compliance complexity, and it’s coming faster than you think.

Shrav Mehta is the founder and CEO of Secureframe.


Three steps to build a data foundation for federal AI innovation

5 December 2025 at 17:42

America’s AI Action Plan outlines a comprehensive strategy for the country’s leadership in AI. The plan seeks, in part, to accelerate AI adoption in the federal government. However, there is a gap in that vision: agencies have been slow to adopt AI tools to better serve the public. The biggest barrier to adopting and scaling trustworthy AI isn’t policy or compute power — it’s the foundation beneath the surface. How agencies store, access and govern their records will determine whether AI succeeds or stalls. Those records aren’t just for retention purposes; they are the fuel AI models need to power operational efficiencies through streamlined workflows and uncover mission insights that enable timely, accurate decisions. Without robust digitalization and data governance, federal records cannot serve as the reliable fuel AI models need to drive innovation.

Before AI adoption can take hold, agencies must do something far less glamorous but absolutely essential: modernize their records. Many still need to automate records management, beginning with opening archival boxes, assessing what is inside, and deciding what is worth keeping. This essential process transforms inaccessible, unstructured records into structured, connected datasets that AI models can actually use. Without it, agencies are not just delaying AI adoption, they’re building on a poor foundation that will collapse under the weight of daily mission demands.

If you do not know the contents of the box, how confident can you be that the records aren’t crucial to automating a process with AI? In AI terms, if you enlist the help of a model from a provider like OpenAI, the results will only be as good as the digitized data behind it. The greater the knowledge base, the faster AI can be adopted and scaled to positively impact public service. Here is where agencies can start preparing their records — their knowledge base — to lay a defensible foundation for AI adoption.

Step 1: Inventory and prioritize what you already have

Many agencies are sitting on decades’ worth of records, housed in a mix of storage boxes, shared drives, aging databases, and under-governed digital repositories. These records often lack consistent metadata, classification tags or digital traceability, making them difficult to find, harder to govern, and nearly impossible to automate.

This fragmentation is not new. According to NARA’s 2023 FEREM report, only 61% of agencies were rated as low-risk in their management of electronic records — indicating that many still face gaps in easily accessible records, digitalization and data governance. This leaves thousands of unstructured repositories vulnerable to security risks and unable to be fed into an AI model. A comprehensive inventory allows agencies to see what they have, determine what is mission-critical, and prioritize records cleanup. Not everything needs to be digitalized. But everything needs to be accounted for. This early triage is what ensures digitalization, automation and analytics are focused on the right things, maximizing return while minimizing risk.

Without this step, agencies risk building powerful AI models on unreliable data, a setup that undermines outcomes and invites compliance pitfalls.
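An inventory-and-triage pass like the one described above can be approximated as a simple scoring exercise: every repository gets accounted for, but only the highest-priority ones enter the digitalization queue. The repository fields, weights and queue size here are hypothetical:

```python
# Sketch of records triage: everything is accounted for, but only the
# highest-priority repositories are queued for digitalization.
# Scoring fields and weights are illustrative.

def triage_score(repo):
    score = 0
    if repo["mission_critical"]:
        score += 3
    if not repo["has_metadata"]:
        score += 2   # ungoverned records carry more risk; clean up first
    if repo["format"] == "paper":
        score += 1   # physical records block any automation at all
    return score

def prioritize(repos, queue_size=2):
    ranked = sorted(repos, key=triage_score, reverse=True)
    return [r["name"] for r in ranked[:queue_size]]

repos = [
    {"name": "case-files", "mission_critical": True, "has_metadata": False, "format": "paper"},
    {"name": "hr-archive", "mission_critical": False, "has_metadata": True, "format": "digital"},
    {"name": "permits", "mission_critical": True, "has_metadata": True, "format": "digital"},
]
queue = prioritize(repos)
```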

Step 2: Make digitalization the bedrock of modernization

One of the biggest misconceptions around modernization is that digitalization is a tactical compliance task with limited strategic value. In reality, digitalization is what turns idle content into usable data. It’s the on-ramp to AI driven automation across the agency, including one-click records management and data-driven policymaking.

By focusing on high-impact records — those that intersect with mission-critical workflows, the Freedom of Information Act, cybersecurity enforcement or policy enforcement — agencies can start to build a foundation that’s not just compliant, but future-ready. These records form the connective tissue between systems, workforce, data and decisions.

The Government Accountability Office estimates that up to 80% of federal IT budgets are still spent maintaining legacy systems. If reallocated, those resources could help fund strategic digitalization and unlock real efficiency gains. The opportunity cost of delay grows every day.

Step 3: Align records governance with AI strategy

Modern AI adoption isn’t just about models and computation; it’s about trust, traceability, and compliance. That’s why strong information governance is essential.

Agencies moving fastest on AI are pairing records management modernization with evolving governance frameworks, synchronizing classification structures, retention schedules and access controls with broader digital strategies. The Office of Management and Budget’s 2025 AI Risk Management guidance is clear: explainability, reliability and auditability must be built in from the start.

When AI deployment evolves in step with a diligent records management program centered on data governance, agencies are better positioned to accelerate innovation, build public trust, and avoid costly rework. For example, labeling records with standardized metadata from the outset enables rapid, digital retrieval during audits or investigations, a need that’s only increasing as AI use expands. This alignment is critical as agencies adopt FedRAMP Moderate-certified platforms to run sensitive workloads and meet compliance requirements. These platforms raise the baseline for performance and security, but they only matter if the data moving through them is usable, well-governed and reliable.
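The metadata practice above pays off at retrieval time: records tagged consistently at creation can be filtered instantly during an audit instead of searched by hand. A minimal sketch; the tag names and retention categories are hypothetical:

```python
# Sketch of standardized metadata enabling fast retrieval for audits.
# Tag names and retention categories are illustrative.

def tag_record(record_id, *, category, retention_years, contains_pii):
    """Apply the standard metadata schema at record creation."""
    return {
        "id": record_id,
        "category": category,
        "retention_years": retention_years,
        "contains_pii": contains_pii,
    }

def audit_query(records, *, category=None, contains_pii=None):
    """Filter records on standardized fields, as an auditor might."""
    results = records
    if category is not None:
        results = [r for r in results if r["category"] == category]
    if contains_pii is not None:
        results = [r for r in results if r["contains_pii"] == contains_pii]
    return [r["id"] for r in results]

records = [
    tag_record("r1", category="benefits", retention_years=7, contains_pii=True),
    tag_record("r2", category="policy", retention_years=3, contains_pii=False),
    tag_record("r3", category="benefits", retention_years=7, contains_pii=False),
]
hits = audit_query(records, category="benefits", contains_pii=True)
```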

Infrastructure integrity: The hidden foundation of AI

Strengthening the digital backbone is only half of the modernization equation. Agencies must also ensure the physical infrastructure supporting their systems can withstand growing operational, environmental, and cybersecurity demands.

Colocation data centers play a critical role in this continuity — offering secure, federally compliant environments that safeguard sensitive data and maintain uptime for mission-critical systems. These facilities provide the stability, scalability and redundancy needed to sustain AI-driven workloads, bridging the gap between digital transformation and operational resilience.

By pairing strong information governance with resilient colocation infrastructure, agencies can create a true foundation for AI, one that ensures innovation isn’t just possible, but sustainable in even the most complex mission environments.

Melissa Carson is general manager for Iron Mountain Government Solutions.


Cybersecurity in focus: DOJ aggressively investigating contractors’ cybersecurity practices

4 December 2025 at 15:29

The Justice Department recently resolved several investigations into federal contractors’ compliance with cybersecurity requirements as part of the federal government’s Civil Cyber-Fraud Initiative. The initiative, first announced in 2021, ushered in the DOJ’s efforts to pursue cybersecurity-related fraud by government contractors and grant recipients under the False Claims Act. Since then, the DOJ has publicly announced approximately 15 settlements with federal contractors, and it has undoubtedly conducted even more investigations outside of the public’s view.

As an initial matter, these latest settlements signal that the new administration has every intention of continuing to prioritize government contractors’ cybersecurity practices and combating new and emerging cyber threats to the security of sensitive government information and critical systems. These settlements also coincide with the lead up to the Nov. 10 effective date of the Defense Department’s final rule amending the Defense Federal Acquisition Regulation Supplement, which incorporates the standards of the Cybersecurity Maturity Model Certification.

Key DOJ cyber-fraud decisions

The first of these four recent DOJ settlements was announced in July 2025, and resulted in Hill Associates agreeing to pay the United States a minimum of $14.75 million. In this case, Hill Associates provided certain IT services to the General Services Administration. According to the DOJ’s allegations, Hill Associates had not passed the technical evaluations required by GSA for a contractor to offer certain highly adaptive cybersecurity services to government customers. Nevertheless, the contractor submitted claims charging the government for such cybersecurity services, which the DOJ alleged violated the FCA.

The second settlement, United States ex rel. Lenore v. Illumina Inc., was announced later in July 2025, and resulted in Illumina agreeing to pay $9.8 million — albeit with Illumina denying the DOJ’s allegations. According to the DOJ, Illumina violated the FCA by selling federal agencies, including the departments of Health and Human Services, Homeland Security and Agriculture, certain genomic sequencing systems that contained cybersecurity vulnerabilities. Specifically, the DOJ alleged that with respect to the cybersecurity of its product, Illumina: (1) falsely represented that its software and systems adhered to cybersecurity standards, including standards of the International Organization for Standardization and National Institute of Standards and Technology; (2) knowingly failed to incorporate product cybersecurity in its software design, development, installation and on-market monitoring; (3) failed to properly support and resource personnel, systems and processes tasked with product security; and (4) failed to adequately correct design features that introduced cybersecurity vulnerabilities.

That same day, the DOJ announced its third settlement, a $1.75 million agreement with Aero Turbine Inc. and Gallant Capital Partners, LLC (collectively, “Aero”). This settlement resolved the DOJ’s allegations that Aero violated the FCA by knowingly failing to comply with the cybersecurity requirements of its contract with the Department of the Air Force. Pursuant to the contract, Aero was required to implement the security requirements outlined by NIST Special Publication 800-171, “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations,” but failed to fully do so. This included failing to control the flow of, and limit unauthorized access to, sensitive defense information when it provided an unauthorized Egypt-based software company and its personnel with files containing sensitive defense information.

The fourth and latest DOJ settlement was announced in September 2025, and resolved the DOJ’s FCA lawsuit against the Georgia Tech Research Corporation (GTRC). As part of the settlement, GTRC agreed to pay $875,000 to resolve allegations, stemming from a whistleblower complaint, that it failed to meet the cybersecurity requirements in its DoD contracts. Specifically, the DOJ alleged that until December 2021, the contractor failed to install, update or run anti-virus or anti-malware tools on desktops, laptops, servers and networks while conducting sensitive cyber-defense research for the DoD. The DOJ further alleged that the contractor did not have a system security plan setting out cybersecurity controls, as required by the government contract. Lastly, the DOJ alleged that the contractor submitted a false summary-level cybersecurity assessment score of 98 to the DoD, with the score premised on a “fictitious” environment that did not correspond to any system being used to process, store or transmit sensitive defense information.

Takeaways for federal contractors

These recent enforcement actions provide valuable guidance for federal contractors.

  • DOJ has explicitly stated that cyber fraud can exist regardless of whether a federal contractor experienced a cyber breach.
  • DOJ is focused on several practices to support allegations of cyber fraud, including a federal contractor’s cybersecurity practices during product development and deployment, as well as contractors’ statements regarding assessment scores and underlying representations.
  • DOJ takes whistleblower complaints seriously, with several of these actions stemming from complaints by federal contractors’ former employees.
  • To mitigate these risks, federal contractors should ensure that they understand and operationalize their contractual obligations, particularly with respect to the new DFARS obligations.
  • Federal contractors would be well advised to:
    • (1) review and understand their cybersecurity contractual obligations;
    • (2) develop processes to work with the appropriate internal teams (information security, information technology, etc.) to ensure that contractual obligations have been appropriately implemented; and
    • (3) develop processes to monitor compliance with the contractual obligations on an ongoing basis.

Joshua Mullen, Luke Cass, Christopher Lockwood and Tyler Bridegan are partners at Womble Bond Dickinson (US) LLP.


US cyber progress isn’t stalled — it’s evolving

2 December 2025 at 14:09

The Cyberspace Solarium Commission’s (CSC 2.0) annual implementation report has sparked fresh concern from representatives and cybersecurity leaders that U.S. cyber progress is slowing. Bureaucratic delays, budget constraints and uneven policy follow-through, particularly around the Cybersecurity and Infrastructure Security Agency’s authorities and funding, are all apparent.

But does this paint the full picture of the technical implementation and enforcement of U.S. cybersecurity? Hardly.

Beneath the policy layer, the technical and strategic modernization of U.S. cybersecurity is actually accelerating faster than ever. While there’s a lot of doom and gloom at the civilian policy level, it’s important we acknowledge the progress individual agencies have made and provide constructive steps to continue to capitalize on that progress.

A defining moment came with the finalization of the CMMC 2.0 rule, which is now effective and entered its first implementation phase on Nov. 10. More than 80,000 defense industrial base vendors will be required by contract to comply with rigorous cybersecurity controls aligned to NIST 800-171, with a hard assessment deadline in 2026.

CMMC 2.0 ensures that cybersecurity is no longer a checkbox exercise or a “nice-to-have” policy objective. It’s now a legal and contractual requirement. By the time full assessments begin in 2026, the Defense Department will have reshaped the entire DIB into a verified ecosystem of secure software and systems providers.

That’s a significant milestone that will set the stage for the operationalization of accountability for government technology.

Equally transformative is the DoD’s quiet revolution in risk management. In September 2025, the DoD introduced its Cybersecurity Risk Management Construct (CRMC). This is a long-awaited, direct successor to the outdated, paperwork-heavy Risk Management Framework.

The new construct adopts the continuous authority to operate (cATO) model, enabling near-real-time monitoring and risk response. It’s a move away from static compliance documentation toward dynamic, data-driven assurance, reflecting the pace of modern software delivery.

The DoD’s transformation is being powered by the Software Fast Track (SWFT) initiative, launched in mid-2025 to modernize acquisition. SWFT brings DevSecOps automation directly into the authorization process, ensuring secure software can reach warfighters faster and without compromising security. It’s a fundamental shift from compliance to continuous validation.

Lastly, the CSC 2.0’s report doesn’t touch on the work being done by the National Institute of Standards and Technology to operationalize the AI Risk Management Framework for 2026. This will bring much-needed clarity to secure and responsible AI adoption across government and industry.

It’s easy to equate stalled legislation or delayed budgets with a lack of progress. But in cybersecurity, the most impactful advancements rarely happen in congressional hearings. They happen in codebases, acquisition reforms and audit protocols. The policy narrative may be sluggish, but in those areas, we are actually seeing healthy progress as the technical foundation of U.S. cyber defense is advancing rapidly.

Through CMMC enforcement, cATO adoption, automated software assurance and AI governance, federal cybersecurity is entering an implementation era where secure software supply chains and continuous monitoring are not aspirations, but expectations.

With all that said, does this mean the CSC 2.0’s findings should be ignored? Absolutely not.

The reality is that we don’t have a cybersecurity problem; we have an insecure software problem. By not driving forward policy at the civilian level to change the economics in a way that incentivizes the delivery of secure software, we may be ceding the very progress I just outlined.

However, to say “U.S. cyber progress stalls” is to overlook this reality. The truth is that 2025 marks the year where U.S. cybersecurity finally shifted from policy to practice.

Antoine Harden is the vice president of federal sales at Sonatype.

The post US cyber progress isn’t stalled — it’s evolving first appeared on Federal News Network.


How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress

1 December 2025 at 14:57

Federal agencies face an ever-evolving threat landscape, with cyberattacks escalating in both frequency and sophistication. To keep pace, advancing digital modernization isn’t just an aspiration; it’s a necessity. Central to this effort is the Trusted Internet Connections (TIC) 3.0 initiative, which offers agencies a transformative approach to secure and modernize their IT infrastructure.

TIC 3.0 empowers agencies with the flexibility to securely access applications, data and the internet, providing them with the tools they need to enhance their cyber posture and meet the evolving security guidance from the Office of Management and Budget and the Cybersecurity and Infrastructure Security Agency. Yet, despite these advantages, many agencies are still operating under the outdated TIC 2.0 model, which creates persistent security gaps, slows user experience, and drives higher operating costs, ultimately hindering progress toward today’s modernization and adaptive security goals.

Why agencies must move beyond TIC 2.0

TIC 2.0, introduced over a decade ago, aimed to consolidate federal agencies’ internet connections through a limited number of TIC access points. These access points were equipped with legacy, inflexible and costly perimeter defenses, including firewalls, web proxies, traffic inspection tools and intrusion detection systems, designed to keep threats out. Often referred to as a “castle and moat” architecture, this perimeter-based security model was effective when TIC 2.0 first came out, but its static controls were never designed for today’s cloud-first, mobile workforce and are now insufficient against a dynamic threat landscape.

Recognizing these limitations, OMB introduced TIC 3.0 in 2019 to better support the cybersecurity needs of a mobile, cloud-connected workforce. TIC 3.0 facilitates agencies’ transition from traditional perimeter-based solutions, such as Managed Trusted Internet Protocol Service (MTIPS) and legacy VPNs, to modern Secure Access Service Edge (SASE) and Security Service Edge (SSE) frameworks. This new model brings security closer to the user and the data, improving performance, scalability and visibility across hybrid environments.

The inefficiencies of TIC 2.0

In addition to the inefficiencies of a “castle and moat” architecture, TIC 2.0 presents significant trade-offs for agencies operating in hybrid and multi-cloud environments:

  • Latency for end users: TIC 2.0 moves data to where the security is located, rather than positioning security closer to where the data resides. This slows performance, hampers visibility and frustrates end users.
  • Legacy system challenges: Outdated hardware and rigid network paths prevent IT teams from managing access dynamically. While modern technologies deliver richer visibility and stronger data protection, legacy architectures hold agencies back from adopting them at scale.
  • Outages and disruptions: Past TIC iterations often struggle to integrate cloud services with modern security tools. This can create bottlenecks and downtime that disrupt operations and delay modernization efforts.

TIC 3.0 was designed specifically to overcome these challenges, offering a more flexible, distributed framework that aligns with modern security and mission requirements.

“TIC tax” on agencies — and users

TIC 2.0 also results in higher operational and performance costs. Since TIC 2.0 relies on traditional perimeter-based solutions — such as legacy VPNs, expensive private circuits and inflexible, vulnerable firewall stacks — agencies often face additional investments to maintain these outdated systems, a burden commonly referred to as the “TIC tax.”

But the TIC tax isn’t just financial. It also shows up in hidden costs to the end user. Under TIC 2.0, network traffic must be routed through a small number of approved TIC access points, most of which are concentrated around Washington, D.C. As a result, a user on the West Coast or at an embassy overseas may find their traffic backhauled thousands of miles before reaching its destination.

In an era where modern applications are measured in milliseconds, those delays translate into lost productivity, degraded user experience, and architectural inefficiency. What many users don’t realize is that a single web session isn’t just one exchange; it’s often thousands of tiny connections constantly flowing between the user’s device and the application server. Each of those interactions takes time, and when traffic must travel back and forth across the country — or around the world — the cumulative delay becomes a real, felt cost for the end user.
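The cumulative cost of backhauling can be sketched with simple arithmetic. The numbers below (requests per session, added round-trip time per request) are illustrative assumptions, not measurements from any agency network, and the result is a serialized worst case since real browsers overlap many requests in parallel:

```python
def backhaul_penalty_ms(requests: int, extra_rtt_ms: float) -> float:
    """Total added delay when every request detours through a distant
    TIC access point instead of going direct to the application."""
    return requests * extra_rtt_ms

# Assumed figures: a web session issuing 300 small requests, each picking
# up roughly 60 ms of extra round-trip time from coast-to-coast backhaul.
total_ms = backhaul_penalty_ms(300, 60.0)
print(f"Serialized added delay: {total_ms / 1000:.1f} s")  # prints 18.0 s
```

Even if parallelism hides much of that delay, the per-request detour never goes away, which is why eliminating backhaul, rather than tuning it, is the TIC 3.0 approach.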

Every detour adds friction, not only for users trying to access applications, but also for security teams struggling to manage complex routing paths that no longer align with how distributed work and cloud-based systems operate. That’s why OMB, CISA and the General Services Administration have worked together under TIC 3.0 to modernize connectivity, eliminating the need for backhauling and enabling secure, direct-to-cloud options that prioritize both performance and protection.

For example, agencies adopting TIC 3.0 can leverage broadband internet services (BIS), a lower-cost, more flexible transport option that connects users directly to agency networks and cloud services through software-defined wide area network (SD-WAN) and SASE solutions.

With BIS, agencies are no longer constrained to rely on costly, fixed point-to-point or MPLS circuits to connect branch offices, data centers, headquarters and cloud environments. Instead, they can securely leverage commercial internet services to simplify connectivity, improve resiliency, and accelerate access to applications. This approach not only reduces operational expenses but also minimizes latency, supports zero trust principles, and enables agencies to build a safe, flexible and repeatable solution that meets TIC security objectives without taxing the user experience.

How TIC 2.0 hinders zero trust progress

Another inefficiency — and perhaps one of the most significant — of TIC 2.0 is its incompatibility with zero trust principles. As federal leaders move into the next phase of zero trust, focused on efficiency, automation and rationalizing cyber investments, TIC 2.0’s limitations are even more apparent.

Under TIC 2.0’s “castle and moat” model, all traffic, whether for email, web services or the domain name system, must be routed through a small number of geographically constrained access points. TIC 3.0, in contrast, adopts a decentralized model that leverages SASE and SSE platforms to enforce policy closer to the user and data source, improving both security and performance.

To visualize the difference, think of entering a baseball stadium. Under TIC 2.0’s “castle and moat” approach, once you show your ticket at the entrance, you can move freely throughout the stadium. TIC 3.0’s decentralized approach still checks your ticket, but ushers and staff ensure you stay in the right section, verifying continuously rather than once.

At its core, TIC 3.0 is about moving trust decisions closer to the resource. Unlike TIC 2.0, where data must travel to centralized security stacks, TIC 3.0 brings enforcement to the edge, closer to where users, devices and workloads actually reside. This aligns directly with zero trust principles of continuous verification, least privilege access and minimized attack surface.
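The difference between the two trust models can be made concrete in a few lines of Python. Everything here, the request fields, the policy table and the function names, is a hypothetical sketch of the two approaches, not any product’s actual policy engine:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_compliant: bool
    resource: str

def perimeter_check(authenticated_at_gate: bool, req: Request) -> bool:
    # TIC 2.0 style: one check at the "moat," then free movement inside.
    return authenticated_at_gate

def zero_trust_check(req: Request, policy: dict) -> bool:
    # TIC 3.0 style: every request is re-verified against least-privilege
    # policy and current device posture, close to the resource.
    return req.device_compliant and req.resource in policy.get(req.user, set())

policy = {"analyst": {"case-db"}}
req = Request(user="analyst", device_compliant=False, resource="payroll-db")

print(perimeter_check(True, req))     # True: ticket checked once at the gate
print(zero_trust_check(req, policy))  # False: wrong section, non-compliant device
```

The perimeter model returns True for any request once the user is inside; the zero trust model denies the same request because it evaluates the user, device and resource on every access.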

How TIC 3.0 addresses TIC 2.0 inefficiencies

By decentralizing security and embracing SASE-based architectures, TIC 3.0 reduces latency, increases efficiency and enables agencies to apply modern cybersecurity practices more effectively. It gives system owners better visibility and control over network operations while allowing IT teams to manage threats in real time. The result is smoother, faster and more resilient user experiences.

With TIC 3.0, agencies can finally break free from the limitations of earlier TIC iterations. This modern framework not only resolves past inefficiencies, it creates a scalable, cloud-first foundation that evolves with emerging threats and technologies. TIC 3.0 supports zero trust priorities around integration, efficiency and rationalized investment, helping agencies shift from maintaining legacy infrastructure to enabling secure digital transformation.

Federal IT modernization isn’t just about replacing technology; it’s about redefining trust, performance and resilience for a cloud-first world. TIC 3.0 provides the framework, but true transformation comes from operationalizing that framework through platforms that are global, scalable, and adaptive to mission needs.

By extending security to where users and data truly live — at the edge — agencies can modernize without compromise: improving performance while advancing zero trust maturity. In that vision, TIC 3.0 isn’t simply an evolution of policy; it’s the foundation for how the federal enterprise securely connects to the future.

Sean Connelly is executive director for global zero trust strategy and policy at Zscaler and former zero trust initiative director and TIC program manager at CISA.

The post How the inefficiencies of TIC 2.0 hinder agencies’ cybersecurity progress first appeared on Federal News Network.


From obsolete to obligatory: 8 best practices for upgrading federal financial systems

1 December 2025 at 13:44

Today’s evolving financial landscape requires organizations to transform obsolete systems into future-ready infrastructures. Financial systems in particular need to align with compliance standards. The shift from reactive problem-solving to proactive transformation involves eliminating redundancies, automating processes, embedding analytics, integrating cloud platforms, enhancing data interoperability and adopting zero trust cybersecurity.

Here are eight best practices for implementing automation technology or taking on any modernization challenge:

  1. Lead with people-centric change management. No technology or solution will work without day one buy-in from the people who will use it. Communication at every step sets the stage for success. Flow charts and brainstorming sessions with key stakeholders allow managers to talk through a workflow that addresses everyone’s needs. Touch points and progress updates keep stakeholders informed, and robust user training ensures adoption and mission alignment.
  2. Streamline processes before you digitize. In a federal financial system, it’s common for accounts payable and receivable operations to function in isolation. A rigorous analysis sets the stage for creating a more efficient workflow that eliminates redundancy and obsolete manual processes. It’s essential to study the policies and procedures that guide workflows in manual or legacy systems and translate them to intelligent automation technology. Here, it’s often helpful to look at other federal agencies to see what has worked and what lessons they have learned.
  3. Automate intelligently to enhance mission focus. Automation for its own sake goes nowhere. It must be targeted not just to cut costs but also to free up the skilled federal workforce from repetitive, low-value tasks better suited for machine learning. Once liberated by artificial intelligence and automation, workers can focus on high-level analysis, strategic planning, and mission-critical objectives.
  4. Mandate data interoperability for a single source of truth. In too many cases, one set of information does not connect to or integrate with another. Worse than these data silos are manual processes. Employees often work offline from Excel spreadsheets, making it even tougher to integrate the operations of federal agencies. Modernized core systems allow for the secure, seamless sharing of data to drive cross-agency collaboration, ensure comprehensive reporting and guarantee accurate analytics.
  5. Embed analytics to drive proactive decision-making. Why look behind when you can look ahead? Cleaner financial statements stem from real-time, predictive insights. Upgraded systems integrate dashboards, data visualizations, reports and other forms of business intelligence. Their streamlined views empower leaders to anticipate future needs and challenges.
  6. Build on a secure and scalable cloud framework. Rather than stack new technology on an obsolete core, cloud computing systems evolve as technology improves and make enhanced cybersecurity possible. What’s more, cloud technology can scale to handle massive data loads, integrate new tools and provide superior accessibility that a modern federal workforce demands.
  7. Secure by design with a zero trust architecture. In the current threat landscape, security cannot be an afterthought. In a zero trust environment, no external data or outsider enters a system, and third parties are granted access only after thorough vetting. Continuous authentication is crucial. To work across systems, data scientists and data miners will extract only what is needed and clean the data first.
  8. Develop your battle plan. Just as in the military, proactive transformation requires a clear, strategic roadmap. Follow these steps to produce a battle plan unique to your organization’s needs and challenges:
  • Conduct a comprehensive assessment. Analyze your current financial systems to identify pain points, redundancies and areas where manual processes are a bottleneck. This analysis will streamline your processes before you digitize them.
  • Convene key stakeholders. Map out new workflows and secure buy-in across the organization. Communication is fundamental, from initial brainstorming sessions to ongoing progress updates.
  • Create a phased process. Instead of a massive, disruptive overhaul, implement your plan in manageable iterations. Focus on intelligently automating one or two key processes first, such as accounts payable or receivable. Use these initial wins to demonstrate value and build momentum. Embed analytics as you go to measure progress and make proactive adjustments.
  • Scale up. Use this solid foundation in a secure cloud framework to integrate new tools, enhance data interoperability, and ensure that your system is both future-ready and protected by a zero trust architecture.

This structured approach will improve decision-making, transparency and service delivery while meeting compliance mandates as you move your organization from reactive problem-solving to a proactive and successful transformation.

Tabatha Turman is the founder and CEO of Integrated Finance and Accounting Solutions.

The post From obsolete to obligatory: 8 best practices for upgrading federal financial systems first appeared on Federal News Network.


Tips and strategies for navigating career transition conversations with your family during the holidays

24 November 2025 at 15:07

There are some topics families dread broaching during the holidays. For federal employees, reductions-in-force (RIFs) and government shutdown furloughs may be added to the list of tough topics to navigate this year. If you are dreading this topic, you are not alone. Here are some options for how to navigate this conversation this year.

Talking about career change: Why family conversations matter

Whether you are experiencing a workforce reduction, transitioning between industries, or determining your next steps, your family can serve as a valuable source of emotional support and practical guidance. Open, honest communication can help your loved ones understand your motivations, fears and aspirations as you navigate this transition. Changing careers in the current environment can be especially difficult, and conversations about this transition may be different from previous career conversations you have had with family and friends. Having this conversation in a healthy way can make a difference in your mental state during an already complex time. It does not have to lead to interpersonal conflicts or distress. Consider these tips when thinking about having the conversation.

1. Prepare for the conversation and remain constructive.

Have a plan. Being prepared will help you communicate your intentions clearly and confidently when you are ready to navigate career transition conversations. Before sitting down with your family for the holidays, take some time to reflect on how you want to approach the conversation and keep these things in mind.

Your mental state. If you are struggling with negative thoughts, consider taking those thoughts captive and reframing them. For example, if you are thinking, “This time last year I was talking about getting promoted; how can I tell my family I was part of the RIF?” consider reevaluating that thought. Remember that being part of a recent RIF may not be indicative of your capability, performance or value to the organization.

Your disposition. Improve your disposition by implementing or increasing healthy habits. Proper sleep, exercise and diet throughout the holidays can improve your mood so that you can converse more positively, have the patience to deal with challenging family members, and approach tough conversations with a clear head.

Clearing your cognitive palate. Limit influences that sour your mood. Feed your mind uplifting information and focus on things within your actual span of control. For example, you cannot control whether the government is open or remains closed, but you can control what you do with the time that you have during this furlough. Excepted employees still working can contribute to a supportive work environment by extending grace to fellow excepted employees each day at the office.

2. Choose the right time and setting.

Have the conversation when you are in the best mental space. If you are energized and positive upon arrival at family events, consider initiating the conversation early. If you prefer to incorporate the discussion into table chat or when the family has settled after dinner, plan to have the talk then.

If you prefer not to have the conversation during the holidays, consider discussing the transition during a designated family time. A 2023 Harvard Family Study found that families who engage in weekly meetings report 50% stronger emotional bonds compared to those who do not. If your family has designated quality time, this may be a better alternative than a holiday gathering. Holidays can be stressful and emotionally charged for some family members, making it difficult to have a healthy conversation.

3. Consider your communication strategy.

Consider with which family members it makes sense to have this conversation: Having unhealthy conversations may sap your energy and negatively affect your mood, so choose wisely. Decide how you plan to respond when you are invited to share information with the family or whether you want to initiate the talk. Having conversations with supportive family members can be helpful and cathartic. You may even find more support than you anticipated when you are open and honest.

Remember that “open and honest” does not mean full disclosure: Discussing a career transition with those other than your spouse does not require a detailed account of your personal and private matters. Acknowledging this as a “challenging time of transition” may be sufficient. Resist the temptation to invite family members into the intricacies of your marriage, bedroom or finances.

Consider the outcomes you want from the conversation. Jot down key points to help you stay focused during the discussion. For example, if you do not want to have the conversation repeatedly, carve out a specific time to share what you are comfortable sharing and close the conversation with something like, “That’s all I have to share about that.” As the questions come throughout the holidays, refer to when you will discuss or when you already discussed the topic. Here is an example: Uncle Bill says, “I heard you were let go, I thought you were doing well there, what happened?” Consider responding with a closed statement like, “We talked about it before dinner,” or “I plan to talk about it after dinner.”

Tailor the talk to your comfort level: You could share something like, “As some of you may know, [organization] conducted a RIF this year. My division was a part of that RIF. That is all I have to share about that.” If your family is helpful, you may want to add something like, “However, I am interested in learning about options if you have job leads or opportunities for me to continue using my skills and expertise in [career area].” Let your family know if you are seeking new challenges, personal fulfillment or better work-life balance if you feel that they can help. Honest communication can help build trust and understanding, and could even help you land your next job.

4. Listen to their concerns and neutralize criticism.

Your family may have questions or worries about finances, stability or future plans. Listen attentively and acknowledge their feelings. Let them know you value their opinions and support. This may sound something like, “I hear you. I am thinking about those things, too. I appreciate your concern and appreciate your support during this transition.” If the conversation is not constructive, consider closing it with something like, “I’ve heard you,” or “We’ve heard you.”

5. Share your plan.

If your family is supportive, broadly discuss how you plan to approach your career transition. Outline steps you will take, such as updating your skills, networking, or seeking guidance from mentors. Sharing your plan demonstrates responsibility and helps alleviate uncertainty. You may share varying levels of detail with different family members based on the relationship or their ability to help.

6. Invite support and collaboration.

Ask your family for their support and input. Whether it is helping with research, offering encouragement, or brainstorming together, involving them in the process can strengthen your bond and make the transition smoother. This may need to be tailored for certain family members if family dynamics are complex.

7. Keep the conversation going.

Career transitions are ongoing journeys. Keep your family updated on your progress, celebrate small victories, and address new challenges together. Regular check-ins can help maintain open lines of communication and foster a supportive environment. If your family is incapable or unwilling to be supportive through this transition, consider keeping the conversation going with a supportive group of friends, former co-workers or faith community. Support systems are not exclusive to biological families.

Don’t go it alone

Talking with your family about a career transition is an opportunity to build a foundation of understanding and support. With preparation, empathy and honest dialogue, you can navigate this change together and set the stage for a successful new chapter.

It is important to recognize that each career journey presents its own distinct challenges and opportunities. This time of transition is significant and anomalous in many ways, so you may have trepidation about having career transition conversations. Consider seeking support and guidance from family members, friends, and current and former colleagues as you pursue new endeavors. You do not have to traverse this transition alone.

Jamel Harling is an executive coach, ombuds and advisor at Better Than Good Solutions.

The post Tips and strategies for navigating career transition conversations with your family during the holidays first appeared on Federal News Network.


Strengthening executive leadership to serve our nation

20 November 2025 at 10:56

The federal government’s ability to deliver results for the American people hinges on the effectiveness of its leadership. At the core of this leadership is the Senior Executive Service, a cadre of high-level officials entrusted with the responsibility of translating presidential priorities into operational outcomes. The stakes are high: Senior executive leadership affects whether the president’s agenda is properly implemented and whether agencies deliver on their missions. SES officials are the president’s senior-most career executives, and their leadership affects the lives of millions of Americans every day.

Given the gravity of their responsibilities, SES members must be equipped with the right knowledge and tools. Statutory mandates require both the Office of Personnel Management and federal agencies to provide ongoing executive development. And for good reason. Just as private-sector CEOs may benefit from executive development programs, our senior executives need the same level of preparation tailored to the constitutional role they hold and the scale of the missions they lead.

Historically, federal leadership programs have imitated academic or private-sector programs — many of which are expensive, time-consuming and not designed to meet the unique needs of government executives. With the closure of the Federal Executive Institute, OPM is refocusing and reimagining executive development, building on the best leadership development theories and practices with a renewed focus on the specific job knowledge, skills and tools SES officials need to be successful in their jobs.

To that end, OPM has launched the Senior Executive Development Program (SEDP) — a targeted, administration-driven training initiative designed to ensure SES officials have the knowledge, context and clarity necessary to carry out the president’s agenda and best serve the American people.

The SEDP reflects a significant shift in how we approach executive development. More than just generic leadership training, the program will equip senior executives with the practical knowledge and skills to thrive as government leaders, covering topics such as constitutional governance, budget and policymaking, executive core qualifications, and strategic human capital management. Rather than relying on just generalized management theory one could get from academic or private-sector programs, this program provides high-impact, directly relevant training that speaks to the needs of today’s federal executives. The curriculum is designed around the constitutional role of the executive, the rule of law, merit system principles, and the operational realities of modern governance and leading responsive and efficient bureaucratic organizations. Participants will hear from seasoned career executive colleagues and administration officials on topics directly applicable to the complex missions and organizations they lead.

The SEDP is not a one-size-fits-all program. It’s designed to be flexible and scalable, leveraging high-quality recorded video modules delivered through OPM’s centralized learning management system. This approach allows executives to learn on demand, at their own pace, while maintaining full access to their day-to-day responsibilities. It also allows OPM to track participation and outcomes — ensuring transparency, accountability and a strong return on investment.

The goal is clear: to strengthen our federal leadership by ensuring every SES member has technical expertise and a true understanding of their constitutional responsibilities and the president’s vision. The success of any administration depends on its ability to act. And acting depends on leadership.

Through the Senior Executive Development Program, OPM is reaffirming its commitment to preparing that leadership — ensuring that America’s senior executives are ready, responsive and capable of delivering for the American people.

Scott Kupor is director of the Office of Personnel Management.

The post Strengthening executive leadership to serve our nation first appeared on Federal News Network.


Unlocking efficiency: The case for expanding shared services

13 November 2025 at 18:10

As the federal government contends with tightening budgets, lean staffing and soaring citizen expectations, it faces a unique opportunity — and obligation — to modernize service delivery by investing in shared services. The notion of centralized provisioning might conjure up all sorts of challenges and issues; however, it is a proven avenue to lower costs, eliminate duplication and elevate service performance. While shared services isn’t the answer for all functions, we can draw on valuable lessons learned and survey feedback to create a powerful pathway to increase government effectiveness while lowering costs.

The federal government has demonstrated tangible benefits of shared services related to administrative systems. For example, between fiscal 2002 and 2015, consolidating payroll and HR systems generated more than $1 billion in cost savings, with an additional $1 billion saved when 22 payroll systems were consolidated into four, according to the Government Accountability Office. A 2024 report published by the Federation of American Scientists noted that consolidating the payroll centers yielded cumulative savings exceeding $3.2 billion. Other measurable results include the General Services Administration’s fleet management program, which consolidated more than 5,000 agency-owned vehicles and saved on leasing, maintenance and administrative costs. Shared IT services have also expanded steadily, with the adoption of Login.gov and USAJOBS, and data center consolidation that saved over $2.8 billion, according to GAO and the Office of Management and Budget in 2020.

Why it matters — and why now

The federal government continues to advance priorities that improve citizen services while driving down costs as it enters the artificial intelligence era, in which AI and data are transforming industries and human capabilities. Agencies face a pressing need to modernize IT systems, increase efficiency by incorporating automation and AI, and strengthen cybersecurity. Without integrated business services, agencies will struggle to maintain infrastructure, secure their systems and modernize for the AI era. Consolidated IT investments such as Login.gov and ID.me have proven to deliver stronger, more resilient platforms that enhance cybersecurity, protect mission-critical systems, provide standardized data and analytics, and improve transparency.

Still, shared services must be implemented with care. Agencies need flexibility to select providers that best fit their mission-specific requirements. Focusing first on areas where agencies are already seeking solutions — such as accounting, fleet management and office space aligned by security requirements — offers a pragmatic path forward. Service providers must be held to strict performance standards, with service-level agreements ensuring that quality improves alongside efficiency. Equally important, strong leadership and coordination are necessary to sustain momentum.

Agencies like the Office of Personnel Management, GSA and Treasury, which have successfully acted as managing partners in the past, can provide the oversight and accountability required for long-term success. Rather than measuring Quality Service Management Offices (QSMOs) solely by their early momentum, their success should be understood in light of the current environment: smaller budgets, fewer staff and an increased focus on mission delivery. In this context, the adoption of integrated business services positions agencies for long-term gains.

Shared services provide the architecture for a more modern, efficient and mission-focused government. From payroll to fleet management to IT modernization, the federal government has demonstrated the value of this approach through billions of dollars in savings and significant performance improvements. With bipartisan policy support, proven blueprints and advances in shared platforms, the federal enterprise is well-positioned to expand shared services — carefully, collaboratively and with agency choice at its core. If pursued deliberately, shared services can become a cornerstone of fiscal responsibility and high-quality service delivery for the American people.

Erika Dinnie is the vice president of federal strategy and planning for MetTel. Before joining the company, Dinnie served as the General Services Administration’s associate chief information officer for digital infrastructure technologies for nearly 10 years, overseeing GSA’s IT infrastructure, systems, software and applications.

The post Unlocking efficiency: The case for expanding shared services first appeared on Federal News Network.

© Getty Images/NicoElNino

Merging zero trust with digital twins: The next frontier in government cyber resilience

12 November 2025 at 16:22

Cyber adversaries aren’t standing still, and our defenses can’t either. In an environment where government networks face relentless, increasingly sophisticated attacks, it’s evident that perimeter-based security models belong in the past. A zero trust framework redefines the approach: Every user, device, and connection is treated as unverified until proven otherwise — “never trust, always verify.” By assuming breach, zero trust delivers what today’s government missions demand: speed, resilience and the ability to contain damage before it spreads.

To truly operationalize zero trust, agencies must look beyond theory and embrace emerging technologies. Many federal organizations are already turning to artificial intelligence and digital twins to get there. A digital twin — a software-based replica of a real-world network — creates an invaluable proving ground. Rather than waiting for an adversary to strike live systems, agencies can safely simulate cyberattacks, test and refine policies, and validate updates before deployment. In my view, this marks a fundamental shift: Digital twins aren’t just a tool, they represent the future of proactive cyber defense, where learning, adaptation and resilience happen before a crisis, not after.

This approach doesn’t just strengthen agency defenses; it also streamlines operations. Instead of maintaining expensive, outdated physical labs, agencies can rely on digital twins to keep pace with evolving cyber threats. Most recently, a large government agency demonstrated the power of this approach by overcoming years of technical debt, rapidly reconfiguring critical systems, and building a testing environment that delivered greater speed, precision and efficiency in support of its mission and operational goals.

Strategies for anticipating compromise while ensuring operational resilience

Digital twins offer significant potential for enhancing cybersecurity, yet their widespread adoption remains nascent due to several challenges, including budget constraints and agency inertia. Agencies can reference established frameworks, such as the National Institute of Standards and Technology’s SP 800-207 and the Cybersecurity and Infrastructure Security Agency’s Zero Trust Maturity Model, to guide their zero trust journeys. However, with varied legacy systems, cloud services and devices, agencies require zero trust capabilities tailored to their specific needs. The core challenge for government then becomes how to proactively implement effective zero trust strategies that anticipate compromise while ensuring continued operations.

To address these challenges and effectively implement zero trust, here are key actions for agency leaders to consider that include people, process and tools:

  • People

Embrace change management

Zero trust implementation is as much about people and process as it is about technology. To foster cross-team buy-in, agencies must clearly articulate the “why” behind zero trust. Instead of just a technical mandate, zero trust should be framed as a strategy to improve security and efficiency. This involves creating a shared understanding of the framework’s benefits and how it impacts each team member.

Quantify and communicate value

Measuring the ROI of zero trust is complex, as preventing incidents yields invisible benefits. How will you define success: reduced risk, faster compliance, operational consistency? Agencies should set milestones for measuring security posture improvements and regulatory progress while recognizing the limitations of conventional ROI calculations.

  • Process

Adopt zero trust as a damage-limitation strategy

Rather than asking, “How do we stop every breach?” agencies should take steps to shift from prevention-only thinking to dynamic containment and defense, such as:

  • Developing an incident response plan that outlines roles, responsibilities and communication protocols for cyberattack stages.
  • Conducting regular tabletop exercises and simulations to test the plan’s effectiveness and find improvement areas.
  • Automating security workflows to accelerate response times and reduce human error.

Be thorough with zero trust planning

According to public sector best practices, projects with 90% planning and 10% execution are far more likely to succeed. Agency technology and information leaders should take an active role in driving zero trust transformation, ensuring comprehensive planning, stakeholder engagement, and organizational buy-in are prioritized from the outset.

  • Tools

Leverage digital twins

Agencies are turning to emerging technology, including AI and digital twins, to keep pace with threat actors. Government IT and SecOps teams can deploy digital twins to simulate attacks, validate controls and reduce costly physical testing environments. Digital twins should also be considered a safe space for agencies to experiment, identify vulnerabilities, and optimize policies before deployment — an invaluable asset for agencies navigating mixed legacy and cloud ecosystems. Moreover, model-based systems engineering and agile approaches, paired with digital twins, can empower agencies to “rehearse” security incidents and fine-tune architectures.

Tackle tool sprawl using informed consolidation

The sheer volume of disparate vendors and tools can undermine even the best zero trust architecture. Utilizing digital twins to map and simulate your IT environment allows for thoughtful consolidation without sacrificing security or compliance. Lastly, agencies should identify where they are duplicating capabilities and envision a streamlined, mission-focused toolset.

Accelerating zero trust at scale

To address the pace and complexity of future threats, government agencies must act boldly by embracing zero trust not only as a framework but also as a fundamental mindset for continual adaptation and resilience.

By harnessing the power of technologies like AI and digital twins, modernizing planning and response strategies, and committing to cross-team collaboration, agencies can outmaneuver adversaries and protect their most critical missions.

The path forward is clear: Operational resilience is achieved by investing today in future-ready strategies that anticipate compromise, ensure continuity and empower every stakeholder to play a proactive role in defense.


John Fair is vice president of Air Force sales and account management at Akima.

The post Merging zero trust with digital twins: The next frontier in government cyber resilience first appeared on Federal News Network.

© Getty Images/Alexander Sikov

Cyber Security Data Protection Business Technology Privacy concept.

Return-to-office mandates are undermining federal workforce readiness — especially for employees with disabilities

7 November 2025 at 16:02

In the wake of the federal government’s push to bring employees back to the office, agencies like FEMA are facing a critical crossroads. While the intent behind return-to-office policies may be rooted in tradition, optics or perceived productivity, the reality is far more complex — and far more costly.

For employees with disabilities, these mandates are not just inconvenient. They are exclusionary, legally questionable and operationally unsound.

The law is clear — even if agency practices aren’t

The Jan. 20, 2025, presidential mandate directing federal employees to return to in-person work includes a crucial caveat: It must be implemented consistent with applicable law. That includes the Rehabilitation Act of 1973 and the Americans with Disabilities Act (ADA), which require agencies to provide effective accommodations to qualified employees with disabilities unless doing so would cause undue hardship.

Yet, across the federal landscape, many agency leaders are misinterpreting this mandate as a blanket prohibition against remote work, even in cases where virtual accommodations are medically necessary and legally protected. This misapplication is not only harmful to employees, it exposes agencies to legal liability, reputational damage and operational risk.

FEMA’s case study: A broken system with real consequences

At FEMA, the consequences of this misinterpretation are playing out in real time. In fiscal year 2025 alone, FEMA employees submitted over 4,600 reasonable accommodation requests, more than triple the previous year’s total. Despite this surge, the agency’s accommodation infrastructure remains underresourced and reactive.

Supervisors, often untrained in disability law, are making high-stakes decisions without adequate support. The result? Delays, denials and errors that leave employees feeling unseen, unsupported and in some cases, forced out of the workforce entirely.

One FEMA reservist with a service-connected disability shared:

“After months of silence and no support, I gave up. I stopped applying for deployments. I felt like FEMA had no place for me anymore.”

Another permanent employee wrote:

“I wasn’t asking for anything fancy — just to do my job from home so I didn’t collapse from pain after 20 minutes in the building. Instead, I was treated like a problem.”

These stories are not isolated. They reflect a systemic failure that is both preventable and fixable.

The cost of dysfunction

When agencies deny effective accommodations, they don’t just violate the law, they lose talent, morale and money.

Consider the cost of FEMA forcing an employee to deploy in person to a disaster event when they could perform the same job virtually. Tens of thousands of dollars in airfare, lodging and meals — all paid from the Disaster Relief Fund — become unnecessary expenses. Worse, the employee may underperform due to physical hardship or burn out entirely. In contrast, virtual deployment may be a zero-cost, high-return accommodation that results in better stewardship of taxpayer dollars.

Reasonable accommodations, when applied correctly, do not remove essential job functions or lower performance standards. They enable employees to meet those standards in a way that aligns with their health and abilities. They are not a problem; they are a solution.

Return-to-office mandates are not one-size-fits-all

Federal agencies must recognize that return-to-office policies and reasonable accommodations are not mutually exclusive. Virtual work can, and should, coexist with in-person mandates when it enables qualified individuals with disabilities to perform their essential functions.

This is not just a legal imperative. It’s a strategic one.

A supported, well-equipped workforce is more productive, more mission-focused, and less likely to file complaints and grievances. Accommodations foster a positive workplace culture, which is critical for retaining skilled staff. They also align with the administration’s stated goals of rooting out inefficiency and ensuring high performance among public servants.

A smarter path forward

To modernize federal accommodation practices and align them with both legal obligations and operational goals, agencies should consider the following steps:

  1. Strategic messaging campaign
    The highest levels of leadership must publicly affirm that supporting reasonable accommodations is a legal requirement and a mission enabler — not a discretionary gesture.
  2. Training and certification for deciding officials
    Supervisors must be equipped with the knowledge and tools to make informed, lawful decisions about accommodation requests.
  3. Portability review of roles
    Agencies should classify roles based on their viability for virtual or in-person work to promote consistency, fairness, and transparency in decision-making. This classification should be grounded in the actual essential functions of each role — not tradition or “the way it’s always been done.” For FEMA and similar agencies, defining disaster-related roles by their portability (i.e., whether they can be performed remotely or require physical presence) would provide a clear, functional framework for evaluating reasonable accommodation requests. This approach enables faster, more equitable adjudication and ensures that decisions are aligned with both operational needs and employee rights.
  4. Enhanced support infrastructure
    Create interactive tool kits, office hours and just-in-time training to support supervisors and employees navigating the accommodation process.
  5. Contractor resource optimization
    Utilize existing contracts and skilled personnel to accelerate processing of virtual work-related requests, reducing backlog, and swiftly complying with strict processing time frames.
  6. Streamlined implementation
    Improve procurement and delivery of approved accommodations — such as assistive technology, sign language interpretation and ergonomic equipment.
  7. Employee feedback integration
    Use post-decision surveys to monitor effectiveness, identify barriers and improve transparency.

Accommodations are not a burden — they’re a blueprint for better governance

Federal agencies must stop treating reasonable accommodations as a bureaucratic hurdle and start recognizing them as a strategic asset. A fully optimized accommodation program enhances legal compliance, protects against risk, retains mission-critical personnel and improves operational excellence.

Return-to-office mandates may be politically popular, but they must be implemented with nuance, compassion and legal integrity. For employees with disabilities, flexibility is not a perk — it’s a lifeline. And for agencies like FEMA, it’s the key to building a workforce that’s not just present, but prepared.

Jodi Hershey is a former FEMA reasonable accommodation specialist and is now the founder of EASE, LLC.

The post Return-to-office mandates are undermining federal workforce readiness — especially for employees with disabilities first appeared on Federal News Network.

© Getty Images/Ridofranz

Mature businessman holding his head in stress, sitting at a desk with computer and documents. Indian manager working late and worried for company deadline. Stress, anxiety and burnout of a mid adult business man accounting risk and audit report or debt.