A three-judge panel ruled Friday that President Donald Trump’s without-cause firings of Cathy Harris and Gwynne Wilcox, Democratic members of the Merit Systems Protection Board and the National Labor Relations Board, respectively, were lawful.
The split 2-to-1 panel decision of the D.C. Circuit Court of Appeals has no immediate effect, since both Harris and Wilcox’s firings were finalized in May. But Friday’s ruling comes as the Supreme Court is expected to soon hear arguments on whether to overturn a 90-year-old ruling known as Humphrey’s Executor — a decision that could expand Trump’s power to shape independent agencies.
In the 1935 Humphrey’s Executor ruling, the Supreme Court unanimously found that Federal Trade Commission commissioners could be removed only for misconduct or neglect of duty, effectively limiting when presidents can fire independent board members.
But when Judges Gregory Katsas and Justin Walker ruled Friday in favor of Trump’s firings of Harris and Wilcox, they argued that MSPB and NLRB fall outside the limitations stemming from Humphrey’s Executor, and that the president can still “remove principal officers who wield substantial executive power.”
“The NLRB and MSPB wield substantial powers that are both executive in nature and different from the powers that Humphrey’s Executor deemed to be merely quasi-legislative or quasi-judicial,” the judges wrote. “So, Congress cannot restrict the President’s ability to remove NLRB or MSPB members.”
Judge Florence Pan, the dissenting panel member and a Biden appointee, argued that the two agencies do fall under the scope of Humphrey’s Executor, and that maintaining the independence of MSPB and NLRB is critical. She wrote that the Trump administration’s “extreme view of executive power sharply departs from precedent.”
“We may soon be living in a world in which every hiring decision and action by any government agency will be influenced by politics, with little regard for subject-matter expertise, the public good, and merit-based decision-making,” she wrote.
The MSPB is an independent agency responsible for adjudicating appeals from federal employees who allege prohibited personnel practices by their agencies. The NLRB investigates unfair labor practices in the private sector and oversees union elections. Both boards are typically composed of members of both political parties.
Trump fired both Wilcox and Harris within his first few weeks in office, but did not point to a specific reason for the terminations. Wilcox and Harris, both of whom were Democratic board members, sued the president over their removals, arguing that they are protected by a federal law meant to ensure MSPB and NLRB’s independence from political considerations — and that the president can only remove them “for inefficiency, neglect of duty, or malfeasance in office.”
Though a federal judge initially ruled the two terminations were unlawful, the Supreme Court reversed that decision in May, effectively green-lighting the finalization of the board members’ firings earlier this year.
In its May decision, the Supreme Court indicated that it was likely “that both the NLRB and MSPB exercise considerable executive power,” which it said would make restrictions on the president’s ability to fire them unconstitutional. Friday’s panel ruling aligns with the Supreme Court’s initial arguments.
The Supreme Court is expected to hear arguments Monday on Trump’s firing of Rebecca Slaughter, a Democratic member of the Federal Trade Commission — a case that may further influence the outcome of both Harris and Wilcox’s terminations.
The Associated Press contributed reporting.
The post Appeals court backs Trump’s firings of MSPB, NLRB members first appeared on Federal News Network.

America’s AI Action Plan outlines a comprehensive strategy for the country’s leadership in AI. The plan seeks, in part, to accelerate AI adoption in the federal government. However, there is a gap in that vision: agencies have been slow to adopt AI tools to better serve the public. The biggest barrier to adopting and scaling trustworthy AI isn’t policy or compute power; it’s the foundation beneath the surface. How agencies store, access and govern their records will determine whether AI succeeds or stalls. Those records aren’t just for retention purposes; they are the fuel AI models need to power operational efficiencies through streamlined workflows and to uncover mission insights that enable timely, accurate decisions. Without robust digitalization and data governance, federal records cannot serve that role.
Before AI adoption can take hold, agencies must do something far less glamorous but absolutely essential: modernize their records. Many still need to automate records management, beginning with opening archival boxes, assessing what is inside, and deciding what is worth keeping. This essential process transforms inaccessible, unstructured records into structured, connected datasets that AI models can actually use. Without it, agencies are not just delaying AI adoption, they’re building on a poor foundation that will collapse under the weight of daily mission demands.
If you do not know the contents of the box, how confident can you be that the records aren’t crucial to automating a process with AI? In AI terms, if you enlist the help of a model from a provider like OpenAI, the results will only be as good as the digitized data behind it. The greater the knowledge base, the faster AI can be adopted and scaled to positively impact public service. Here is where agencies can start preparing their records — their knowledge base — to lay a defensible foundation for AI adoption.
Many agencies are sitting on decades’ worth of records, housed in a mix of storage boxes, shared drives, aging databases, and under-governed digital repositories. These records often lack consistent metadata, classification tags or digital traceability, making them difficult to find, harder to govern, and nearly impossible to automate.
This fragmentation is not new. According to NARA’s 2023 FEREM report, only 61% of agencies were rated as low-risk in their management of electronic records — indicating that many still face gaps in easily accessible records, digitalization and data governance. This leaves thousands of unstructured repositories vulnerable to security risks and unable to be fed into an AI model. A comprehensive inventory allows agencies to see what they have, determine what is mission-critical, and prioritize records cleanup. Not everything needs to be digitalized. But everything needs to be accounted for. This early triage is what ensures digitalization, automation and analytics are focused on the right things, maximizing return while minimizing risk.
Without this step, agencies risk building powerful AI models on unreliable data, a setup that undermines outcomes and invites compliance pitfalls.
One of the biggest misconceptions around modernization is that digitalization is a tactical compliance task with limited strategic value. In reality, digitalization is what turns idle content into usable data. It’s the on-ramp to AI-driven automation across the agency, including one-click records management and data-driven policymaking.
By focusing on high-impact records — those that intersect with mission-critical workflows, the Freedom of Information Act, cybersecurity or policy enforcement — agencies can start to build a foundation that’s not just compliant, but future-ready. These records form the connective tissue between systems, workforce, data and decisions.
The Government Accountability Office estimates that up to 80% of federal IT budgets are still spent maintaining legacy systems. If reallocated, those resources could help fund strategic digitalization and unlock real efficiency gains. The opportunity cost of delay grows every day.
Modern AI adoption isn’t just about models and computation; it’s about trust, traceability, and compliance. That’s why strong information governance is essential.
Agencies moving fastest on AI are pairing records management modernization with evolving governance frameworks, synchronizing classification structures, retention schedules and access controls with broader digital strategies. The Office of Management and Budget’s 2025 AI Risk Management guidance is clear: explainability, reliability and auditability must be built in from the start.
When AI deployment evolves in step with a diligent records management program centered on data governance, agencies are better positioned to accelerate innovation, build public trust, and avoid costly rework. For example, labeling records with standardized metadata from the outset enables rapid, digital retrieval during audits or investigations, a need that’s only increasing as AI use expands. This alignment is critical as agencies adopt FedRAMP Moderate-certified platforms to run sensitive workloads and meet compliance requirements. These platforms raise the baseline for performance and security, but they only matter if the data moving through them is usable, well-governed and reliable.
Strengthening the digital backbone is only half of the modernization equation. Agencies must also ensure the physical infrastructure supporting their systems can withstand growing operational, environmental, and cybersecurity demands.
Colocation data centers play a critical role in this continuity — offering secure, federally compliant environments that safeguard sensitive data and maintain uptime for mission-critical systems. These facilities provide the stability, scalability and redundancy needed to sustain AI-driven workloads, bridging the gap between digital transformation and operational resilience.
By pairing strong information governance with resilient colocation infrastructure, agencies can create a true foundation for AI, one that ensures innovation isn’t just possible, but sustainable in even the most complex mission environments.
Melissa Carson is general manager for Iron Mountain Government Solutions.
The post Three steps to build a data foundation for federal AI innovation first appeared on Federal News Network.

Interview transcript
Terry Gerton You have spent a lot of time on the Hill lately talking to lawmakers about ways the VA could modernize access to care. Tell us both what your message is and what you’re hearing from the lawmakers.
Sean O’Connor Yeah. And maybe before that, Terry, just to touch on why we think this is so important or why personally it’s so important to me. And then thank you again for having us, and [I’m] looking forward to having this conversation today. So just at the start, I’m a third-generation veteran. Both my grandfathers fought and served in World War II, one in the Pacific, one in Europe. My father and my uncles all served during the Vietnam era. And I’m a 9/11 vet. So since the 1940s, my family has been, you know, leaning on and relying on the VA for all kinds of support and care. So, it’s a mission and it’s an institution that’s very important to me personally and very important to the fabric of our country. So, I think it’s no surprise the VA has struggled, you know, going from being in the early forefront of EHR … adoption to kind of being a laggard now in EHR modernization. And there’s 9 million vets that really struggle to get access to timely care for some of the services they need as the VA works to modernize. So we’ve been spending a lot of time just talking to some of the leadership on the Hill around the momentum that seems to be building to try to modernize finally and kind of make access to care easier for veterans, and trying to make sure that as community care grows and the VA and veterans have more options to seek care both inside and outside the VA, that we really move the needle on reducing time to care and improving efficiency of care delivery for veterans. So that’s what we’re trying to, you know, spend time talking to the folks in SVAC and the Hill about, and learn about some of the strategies people are trying to implement when it comes to the Dole Act and some of the other things that people are trying to advance when it comes to improving access to care for veterans. And really, we’re a small technology company that focuses on healthcare access.
And we’re just, you know, trying to support improving access to care for veterans wherever and whenever we can because it’s a really important institution. It’s the largest health system in our country. And it’s probably one of the most outdated when it comes to the complexity of modernizing care for scheduling and finding appointments for veterans. And there’s a lot of things that I think we can do to help the VA as they work to improve some of those services.
Terry Gerton You’ve said that the VA was built for the last century, and you’ve just mentioned the Electronic Health Record program, on which the VA has spent billions of dollars and still doesn’t have an operational system. What would you recommend in terms of practice for modernizing some of those administrative functions of the VA?
Sean O’Connor Yeah, it’s complicated. So I’m not suggesting this isn’t complicated. The VA has gone through four different attempts to try to modernize, and it’s still not successful in getting to the end goal of improving access to care for veterans and having a global view of care. So I think the first thing we’ve been talking to folks about is, today everything works in silos. And it’s tough to leverage the size and sophistication of the VA caregivers when everything’s in silos. And there’s close to 130 different VistA instances, a growing number of Oracle instances. And one of the leaders we talked to at the VA last time we were in D.C. said that the complexity of VA care delivery is beyond human comprehension. That’s how customized each of those VistA instances is; they’re all a unique snowflake. They don’t talk to each other, they don’t share inventory. One of the VISNs we’re talking to now about a project, there’s roughly 10,000 appointments that go unutilized every month in his hospital because these different EHR instances don’t talk to each other. So one of the first things we’re talking about is, you know, trying to break down those data silos to bring all the supply and all the demand into one queue. And this is what we do for some of the other largest health systems in the country, Kaiser and other folks, where we take this global view of inventory, and then you can use, you know, AI and some of these sophisticated navigation tools that have been built in the digital age of healthcare since the pandemic to start to look at how you load balance that network a little more efficiently, how you share resources, how you improve internal utilization, improve efficiency, and reduce care gaps across the board.
So I think until the VA finds a way, through either a massive conversion to a centralized EHR or finding ways to work with technology entrepreneurs and vendors that can break down some of these data silos, they’ll continue to have the problem of trying to transition to a large EMR system in Oracle and through that process still have these 130 other systems and up to 24 different scheduling solutions that have been customized across the various VISNs, none of them working together, none of them sharing information with each other. So you have the largest health system in the country, 9 million veterans and their family members that we’re supposed to provide for and care for, and none of this stuff talks to each other to share capacity, to share utilization, to share best practices. It’s a very fragmented, siloed and complicated environment. So until we find ways to break down those silos and leverage the power of tech and data to kind of level that playing field, it’s going to be very difficult to move anything in a substantial manner, we think.
Terry Gerton I’m speaking with Sean O’Connor. He’s a Navy veteran and co-founder and chief strategy officer at DexCare. The VA is not the only federal agency that’s bad at a big-bang tech deployment. So when you talk about an agency-wide solution that breaks down silos, anybody who’s been around for a while probably rolls their eyes at that. What intermediate sorts of technology could provide some solution while an agency-wide solution is underway?
Sean O’Connor Yeah, we’ve been a big proponent, in working with other really large healthcare systems in the country, of doing, you know, scalable, strategically thought-out proof of concepts and smaller fragments first, and then learning and scaling and iterating and adopting quickly. So I think one of the things the VA has going for it is it does have the VISN network and the ability to kind of do proof of concepts in some of these smaller regional health systems, learn, iterate and adopt, and then look to scale from there. We think that’s the best way to do this stuff. That’s how we’ve done it with Kaiser and some of these other really large healthcare systems. You do smaller proof of concepts, you learn the integration points that are important to move the needle. You begin with the end in mind, understanding the success metrics that are going to be important to drive this. And then you learn, iterate and scale quickly from there. Bottom-up and top-down support is the only way to kind of move these things. And at the same time, you have to be very conscious of the providers as well. So all of the technology companies we’ve built, we built inside of large healthcare systems. And in many cases, technology is only 50% of the problem. Understanding the provider and the change management and the amount of pressure that those folks are under to provide care, and not being disruptive to their workflows or making their lives less efficient — you have to be very thoughtful about that, or none of the stuff is going to go anywhere. You can’t just have tech for tech’s sake. It has to understand the provider world and how the provider interacts. And you have to be very purposeful in how you build these things out to scale from the bottom up over time.
Terry Gerton One of the big points that you’ve emphasized is real-time access to care, especially for mental health services and especially in rural communities. Those are two big complicating aspects of the VA’s network. How can the VA think about addressing those kinds of issues? Is it a technology solution? Is it a culture solution? How do they get to real-time care, especially in mental health?
Sean O’Connor I think it’s both. And I think the hard part is it’s probably more culture than technology. But I don’t know of a bigger issue for us to kind of rally around as a community to try to improve access to care for veterans than this. So when I transitioned from the service in 2004, the VA received roughly $21 billion to support its mission, and 17 men and women took their own lives every day: friends, brothers, sisters, husbands, wives. Fast forward to 2024, the VA received $121 billion to support its mission, and that number is still the same. Roughly 17 men and women, brothers, sisters, mothers, daughters took their own lives. We’ve lost more people to suicide in the last 20 years than we did, you know, during the 9/11 era and supporting the post-9/11 ground combat. So it’s a crisis that’s not talked about. We haven’t really moved the needle on it despite spending over $100 billion more to support the healthcare delivery mission of the VA. So it’s clearly not just a technology issue, but not having — going back to your first question, Terry — not having the ability to share resources across the network and reduce time to care and make it easier for vets to find and get into the services initially is a problem. I won’t say that’s the biggest problem, but it certainly doesn’t help. So … mental health services in the veteran community is a really complicated issue … It’s not just about having access to care. You know, a big portion of people that need the care aren’t even enrolled in the VA, and then there’s a homeless population that’s not enrolled in the VA. And how do you reach out and bring those folks in that need the help the most? So it’s a complicated issue, but not being able to have one 24/7-365 on-demand network that shares capacity across mental health services for the VA is an issue as well. And the technology issues are easier to address. We’ve just got to have people that are willing to address them.
The cultural issues and the stigma around, you know, raising your hand for help are a harder issue to address, but it’s just something we’ve got to continue to talk about, because it’s a travesty that in over 20 years that number really hasn’t moved, despite putting, you know, literally over $100 billion more toward the overall healthcare mission.
Terry Gerton Well, you talked about capacity there, and certainly building out the community network of care is a big issue and a big initiative for VA. Are there issues on the community participant side of this? Is it that community care providers don’t understand the VA as much as the VA doesn’t understand community care providers?
Sean O’Connor We’re going to run out of time on your podcast. Yes, so that’s to me like, you know, obviously selfishly, like, we want to help the VA as a technology company, but the importance of improving access to care for veterans is at the heart of everything that we’re trying to do here. So the beauty of the VA to me — I mentioned I’m a third-generation veteran — it is a unique community. So when I first got out of the military, I moved to Seattle, and, like, it was a tough transition going from the military to the corporate world. I didn’t know anybody up here. My family and I grew up in Jersey, all my family was on the East Coast. I would literally just go to the Seattle VA and hang out in the lobby and just talk to people that, you know, had their Vietnam hat on. It’s a community and a culture that, you know, should be protected in this institution and in this country. And some of the caregivers — you know, we’re talking about the technology piece here — these are some of the most mission-driven caregivers in the world. Like, they can make more money outside the VA. They choose to work with this community and this provider network for a reason. So there is an understanding there that I think we need to protect, because there is an understanding of someone that’s come back from deployment and has been through some serious high-optempo stuff, and you just get a different conversation with your primary care provider in the VA than somebody outside the VA. So I think there’s that element that we have to protect. But there’s also the element, frankly, that, you know, as a veteran, I like the option to have choice to go outside the VA for services that they may not be expert in. So certainly, you know, wound care, PTSD, that stuff, I think should stay in the VA. But maybe, you know, I’m a former athlete and tore my knee up and can get into an ortho appointment outside the VA. I want to have that optionality.
And for some stuff like that, the history isn’t as important to the veteran for some of those conditions. So, to have the optionality to go out there and do that is important. But what we’re seeing, at least in some of the areas where we work, is that the community providers, one, don’t have a lot of excess capacity to share with the VA. Every health system is stretched to the gills. Like, there’s not a ton of health systems raising their hands saying, hey, we have providers sitting on their hands. It’s six to eight months to get into an ortho appointment in some of these large health systems as it is. So to have that capacity to share with the VA, one, is difficult. Some of those things I think are bigger deals than others, to your point of, you know, should there be a continuum of care in the VA? I’d argue for some services, just do it in the VA, and some are easily, you know, sourced out. And then there’s the whole issue of, when they’re sourced out, how do you manage the care gaps for the veteran? How do we close some of those care gaps as those services continue to rise and the disparate records continue to grow across the network?
The post The VA’s size and complexity may be keeping top tech minds away, and veterans pay the price first appeared on Federal News Network.
