
At VA, cyber dominance is in, cyber compliance is out

5 December 2025 at 15:25

The Department of Veterans Affairs is moving toward a more operational approach to cybersecurity.

This means VA is applying a deeper focus on protecting the attack surfaces and closing off threat vectors that put veterans’ data at risk.

Eddie Pool, the acting principal assistant secretary for information and technology and acting principal deputy chief information officer at VA, said the agency is changing its cybersecurity posture to reflect a cyber dominance approach.


“That’s a move away from the traditional, exclusively compliance-based approach to cybersecurity, where we put a lot of our time, resources and investments in compliance-based activities,” Pool said on Ask the CIO. “For example, did someone check the box on a form? Did someone file something in the right place? We’re really moving a lot of our focus over to the risk-based approach to security, pushing things like zero trust architecture, microsegmentation of our networks and really doing things that are more focused on the operational landscape. We are more focused on protecting those attack surfaces and closing off those threat vectors in the cyber space.”

A big part of this move to cyber dominance is applying the concepts that make up a zero trust architecture, such as microsegmentation and identity and access management.

Pool said as VA modernizes its underlying technology infrastructure, it will “bake in” these zero trust capabilities.

“Over the next several years, you’re going to see that naturally evolve in terms of where we are in the maturity model path. Our approach here is not necessarily to try to map to a model. It’s really to rationalize what are the highest value opportunities that those models bring, and then we prioritize on those activities first,” he said. “We’re not pursuing it in a linear fashion. We are taking parts and pieces and what makes the most sense for the biggest thing for our buck right now, that’s where we’re putting our energy and effort.”

One of those areas that VA is focused on is rationalizing the number of tools and technologies it’s using across the department. Pool said the goal is to get down to a specific set instead of having the “31 flavors” approach.

“We’re going to try to make it where you can have any flavor you want so long as it’s chocolate. We are trying to get that standardized across the department,” he said. “That gives us the opportunity from a sustainment perspective that we can focus the majority of our resources on those enterprise standardized capabilities. From a security perspective, it’s a far less threat landscape to have to worry about having 100 things versus having two or three things.”

The business process reengineering priority

Pool added that redundancy remains a key factor in the security and tool rationalization effort. He said VA will continue to have a diversity of products in its IT investment portfolios.

“Where we are at is we are looking at how do we build that future state architecture, as elegantly and simplistically as possible, so that we can manage it more effectively and we can protect it more securely,” he said.

In addition to standardizing its cyber tools and technologies, Pool said VA is bringing the same approach to business processes for enterprisewide services.

He said over the years, VA has built up a laundry list of legacy technology all with different versions and requirements to maintain.

“We’ve done a lot over the years in the Office of Information and Technology to really standardize on our technology platforms. Now it’s time to leverage that, to really bring standard processes to the business,” he said. “What that does is that really does help us continue to put the veteran at the center of everything that we do, and it gives a very predictable, very repeatable process and expectation for veterans across the country, so that you don’t have different experiences based on where you live or where you’re getting your health care and from what part of the organization.”

Part of the standardization effort is that VA will expand its use of automation, particularly in the processing of veterans’ claims.

Pool said the goal is to take more advantage of the agency’s data and use artificial intelligence to accelerate claims processing.

“The richness of the data and the standardization of our data that we’re looking at and how we can eliminate as many steps in these processes as we can, where we have data to make decisions, or we can automate a lot of things that would completely eliminate what would be a paper process that is our focus,” Pool said. “We’re trying to streamline IT to the point that it’s as fast and as efficient, secure and accurate as possible from a VA processing perspective, and in turn, it’s going to bring a decision back to the veteran a lot faster, and a decision that’s ready to go on to the next step in the process.”

Many of these updates already are having an impact on VA’s business processes. The agency said that it set a new record for the number of disability and pension claims processed in a single year, more than 3 million. That beat its record set in 2024 by more than 500,000.

“We’re driving benefit outcomes. We’re driving technology outcomes. From my perspective, everything that we do here, every product, service capability that the department provides the veteran community, it’s all enabled through technology. So technology is the underpinning infrastructure, backbone to make all things happen, or where all things can fail,” Pool said. “First, on the internal side, it’s about making sure that those infrastructure components are modernized. Everything’s hardened. We have a reliable, highly available infrastructure to deliver those services. Then at the application level, at the actual point of delivery, IT is involved in every aspect of every challenge in the department, to again, bring the best technology experts to the table and look at how can we leverage the best technologies to simplify the business processes, whether that’s claims automation, getting veterans their mileage reimbursement earlier or by automating processes to increase the efficacy of the outcomes that we deliver, and just simplify how the veterans consume the services of VA. That’s the only reason why we exist here, is to be that enabling partner to the business to make these things happen.”

The post At VA, cyber dominance is in, cyber compliance is out first appeared on Federal News Network.


Yeske helped change what complying with zero trust means

7 November 2025 at 17:44

The Cybersecurity and Infrastructure Security Agency developed a zero trust architecture that features five pillars.

The Defense Department’s zero trust architecture includes seven pillars.

The one the Department of Homeland Security is implementing takes the best of both architectures and adds a little more to the mix.

Don Yeske, who recently left federal service after serving for the last two-plus years as the director of national security in the cyber division at DHS, said the agency had to take a slightly different approach for several reasons.


“If you look at OMB [memo] M-22-09 it prescribes tasks. Those tasks are important, but that itself is not a zero trust strategy. Even if you do everything that M-22-09 told us to do — and by the way, those tasks were due at the beginning of this year — even if you did it all, that doesn’t mean, goal achieved. We’re done with zero trust. Move on to the next thing,” Yeske said during an “exit” interview on Ask the CIO. “What it means is you’re much better positioned now to do the hard things that you had to do and that we hadn’t even contemplated telling you to do yet. DHS, at the time that I left, was just publishing this really groundbreaking architecture that lays out what the hard parts actually are and begins to attack them. And frankly, it’s all about the data pillar.”

The data pillar of zero trust is among the toughest ones. Agencies have spent much of the past two years focused on other parts of the architecture, like improving their cybersecurity capabilities in the identity and network pillars.

Yeske, who now is a senior solutions architect federal at Virtru, said the data pillar challenge for DHS is even bigger because of the breadth and depth of its mission. He said between the Coast Guard, FEMA, Customs and Border Protection and CISA alone, there are multiple data sources, requirements and security rules.

“What’s different about it is we viewed the problem of zero trust as coming in broad phases. Phase one, where you’re just beginning to think about zero trust, and you’re just beginning to adjust your approach, is where you start to take on the idea that my network boundary can’t be my primary, let alone sole line of defense. I’ve got to start shrinking those boundaries around the things that I’m trying to protect,” he said. “I’ve got to start defending within my network architecture, not just from the outside, but start viewing the things that are happening within my network with suspicion. Those are all building on the core tenets of zero trust.”

Capabilities instead of product focused

He said the initial zero trust strategy stopped there: segmenting networks and protecting data at rest.

But to get to this point, he said, agencies too often focus on implementing specific products around identity or authentication and authorization processes.

“It’s a fact that zero trust is something you do. It’s not something you buy. In spite of that, federal architecture has this pervasive focus on product. So at DHS, the way we chose to describe zero trust capability was as a series of capabilities. We chose, without malice or forethought, to measure those capabilities at the organization, not at the system, not at the component, not as a function of design,” Yeske said. “Organizations have capabilities, and those capabilities are comprised of three big parts. People: Who’s responsible for the thing you’re describing within your organization? Process: How have you chosen to do the thing that you’re describing at your organization? And products: What helps you do that?”

Yeske said the third part is technology, which, too often, is intertwined with the product part.

He said the DHS architecture moved away from focusing on product or technology, and instead tried to answer the simple, yet complex, questions: What’s more important right now? What are the things that I should spend my limited pool of dollars on?

“We built a prioritization mechanism, and we built it on the idea that each of those capabilities, once we understand their inherent relationships to one another, form a sort of Maslow’s hierarchy of zero trust. There are things that are more basic, that if you don’t do this, you really can’t do anything else, and there are things that are really advanced, that once you can do basically everything else you can contemplate doing this. And there are a lot of things in between,” he said. “We took those 46 capabilities based on their inherent logical relationships, and we came up with a prioritization scheme so that you could, if you’re an organization implementing zero trust, prioritize the products, process and technologies.”
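The dependency-driven prioritization Yeske describes can be sketched as a topological ordering over a capability graph: capabilities that others depend on come first, advanced capabilities come last. This is an illustrative sketch only; the capability names below are hypothetical placeholders, not the actual 46 DHS capabilities.

```python
# Hypothetical sketch of a "Maslow's hierarchy" prioritization: each
# capability lists the capabilities it depends on, and a topological
# sort yields an investment order where prerequisites always come first.
from graphlib import TopologicalSorter

# Placeholder capability names and dependencies, for illustration only.
dependencies = {
    "asset-inventory": set(),
    "identity-management": {"asset-inventory"},
    "phishing-resistant-mfa": {"identity-management"},
    "network-microsegmentation": {"asset-inventory"},
    "data-tagging": {"asset-inventory", "identity-management"},
    "attribute-based-access": {"phishing-resistant-mfa", "data-tagging"},
}

def prioritize(deps):
    """Return capabilities in investment order: prerequisites first."""
    return list(TopologicalSorter(deps).static_order())

print(prioritize(dependencies))
```

Because the ordering is derived from the dependency relationships themselves, an organization can spend its limited dollars on the basic capabilities first without hand-ranking all of them.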

Understanding cyber tool dependencies

DHS defined those 46 capabilities based on the organization’s ability to perform that function to protect its data, systems or network.

Yeske said, for example, with phishing-resistant, multi-factor authentication, DHS didn’t specify the technology or product needed, but just the end result of the ability to authenticate users using multiple factors that are resistant to phishing.

“We’re describing something your organization needs to be able to do because if you can’t do that, there are other things you need to do that you won’t be able to do. We just landed on 46, but that’s not actually all that weird. If you look at the Defense Department’s zero trust roadmap, it contains a similar number of things they describe as capability, which are somewhat different,” said Yeske, who spent more than 15 years working for the Navy and Marine Corps before coming to DHS. “We calculated a 92% overlap between the capabilities we described in our architecture and the ones DoD described. And the 8% difference is mainly because the DHS one is brand new. So just understanding that the definition of each of these capabilities also includes two types of relationships, a dependency, which is where you can’t have this capability unless you first had a different one.”
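The overlap figure Yeske cites amounts to comparing two capability catalogs as sets. A minimal sketch, using made-up placeholder capability names rather than the real DHS or DoD lists:

```python
# Illustrative overlap calculation between two capability catalogs.
# These small placeholder sets are hypothetical; the real catalogs each
# contain dozens of capabilities.
dhs_capabilities = {"mfa", "microsegmentation", "data-tagging", "logging"}
dod_capabilities = {"mfa", "microsegmentation", "logging", "comply-to-connect"}

shared = dhs_capabilities & dod_capabilities
overlap_pct = 100 * len(shared) / len(dhs_capabilities)
print(f"{overlap_pct:.0f}% overlap")  # 3 of 4 placeholder capabilities: 75%
```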

Yeske said before he left DHS in July, the zero trust architecture and framework had been approved for use and most of the components had a significant number of cyber capabilities in place.

He said the next step was assessing the maturity of those capabilities and figuring out how to move them forward.

If other agencies are interested in this approach, Yeske said, a copy of the DHS architecture should be available to them.

The post Yeske helped change what complying with zero trust means first appeared on Federal News Network.

© Getty Images/design master

Innovator Spotlight: Corelight

By: Gary
9 September 2025 at 12:24

The Network’s Hidden Battlefield: Rethinking Cybersecurity Defense

Modern cyber threats are no longer knocking at the perimeter – they’re already inside. The traditional security paradigm has fundamentally shifted, and CISOs...

The post Innovator Spotlight: Corelight appeared first on Cyber Defense Magazine.

Contain Breaches and Gain Visibility With Microsegmentation

1 February 2023 at 09:00

Organizations must grapple with challenges from various market forces. Digital transformation, cloud adoption, hybrid work environments and geopolitical and economic challenges all have a part to play. These forces have especially manifested in more significant security threats to expanding IT attack surfaces.

Breach containment is essential, and zero trust security principles can be applied to curtail attacks across IT environments, minimizing business disruption proactively. Microsegmentation has emerged as a viable solution through its continuous visualization of workload and device communications and policy creation to define what communications are permitted. In effect, microsegmentation restricts lateral movement, isolates breaches and thwarts attacks.

Given the spotlight on breaches and their impact across industries and geographies, how can segmentation address the changing security landscape and client challenges? IBM and its partners can help in this space.

Breach Landscape and Impact of Ransomware

Historically, security solutions have focused on the data center, but new attack targets have emerged with enterprises moving to the cloud and introducing technologies like containerization and serverless computing. Not only are breaches occurring and attack surfaces expanding, but it has also become easier for breaches to spread. Traditional prevention and detection tools provided surface-level visibility into the traffic flow that connected applications, systems and devices communicating across the network. However, they were not intended to contain and stop the spread of breaches.

Ransomware is particularly challenging, as it presents a significant threat to cyber resilience and financial stability. A successful attack can take a company’s network down for days or longer and lead to the loss of valuable data to nefarious actors. The Cost of a Data Breach 2022 report, conducted by the Ponemon Institute and sponsored by IBM Security, cites $4.54 million as the average ransomware attack cost, not including the ransom itself.

In addition, a recent IDC study highlights that ransomware attacks are evolving in sophistication and value. Sensitive data is being exfiltrated at a higher rate as attackers go after the most valuable targets for their time and money. Ultimately, the cost of a ransomware attack can be significant, leading to reputational damage, loss of productivity and regulatory compliance implications.

Organizations Want Visibility, Control and Consistency

With a focus on breach containment and prevention, hybrid cloud infrastructure and application security, security teams are expressing their concerns. Three objectives have emerged as vital for them.

First, organizations want visibility. Gaining visibility empowers teams to understand their applications and data flows regardless of the underlying network and compute architecture.

Second, organizations want consistency. Fragmented and inconsistent segmentation approaches create complexity, risk and cost. Consistent policy creation and strategy help align teams across heterogeneous environments and facilitate the move to the cloud with minimal rewriting of security policy.

Finally, organizations want control. Solutions that help teams target and protect their most critical assets deliver the greatest return. Organizations want to control communications through selectively enforced policies that can expand and improve as their security posture matures towards zero trust security.

Microsegmentation Restricts Lateral Movement to Mitigate Threats

Microsegmentation (or simply segmentation) combines practices, enforced policies and software that provide user access where required and deny access everywhere else. Segmentation contains the spread of breaches across the hybrid attack surface by continually visualizing how workloads and devices communicate. In this way, it creates granular policies that only allow necessary communication and isolate breaches by proactively restricting lateral movement during an attack.
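The policy model described above is essentially default deny: only explicitly approved workload-to-workload flows are permitted, and everything else is blocked, which is what restricts lateral movement. A minimal sketch, with hypothetical workload names and ports:

```python
# Default-deny microsegmentation sketch: an allowlist of permitted
# (source, destination, port) flows. Any flow not on the list is denied.
# Workload names and ports below are illustrative, not from any real policy.
ALLOWED_FLOWS = {
    ("web-frontend", "app-server", 8443),
    ("app-server", "payments-db", 5432),
}

def is_permitted(src: str, dst: str, port: int) -> bool:
    """Return True only for flows that were explicitly approved."""
    return (src, dst, port) in ALLOWED_FLOWS

# A compromised web server probing the database directly is blocked,
# even though both workloads sit inside the same network perimeter.
print(is_permitted("web-frontend", "payments-db", 5432))  # False
print(is_permitted("app-server", "payments-db", 5432))    # True
```

In a real deployment the allowlist is built from the continuous visualization of observed traffic the article describes, then enforced at each workload rather than at the network edge.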

The National Institute of Standards and Technology (NIST) highlights microsegmentation as one of three key technologies needed to build a zero trust architecture, a framework for an evolving set of cybersecurity paradigms that move defense from static, network-based perimeters to users, assets and resources.

Suppose existing detection solutions fail and security teams lack granular segmentation. In that case, malicious software can enter their environment, move laterally, reach high-value applications and exfiltrate critical data, leading to catastrophic outcomes.

Ultimately, segmentation helps clients respond by applying zero trust principles like ‘assume a breach,’ helping them prepare for the inevitable.

IBM Launches Segmentation Security Services

In response to growing interest in segmentation solutions, IBM has expanded its security services portfolio with IBM Security Application Visibility and Segmentation Services (AVS). AVS is an end-to-end solution combining software with IBM consulting and managed services to meet organizations’ segmentation needs. Regardless of where applications, data and users reside across the enterprise, AVS is designed to give clients visibility into their application network and the ability to contain ransomware and protect their high-value assets.

AVS will walk you through a guided experience to align your stakeholders on strategy and objectives, define the schema to visualize desired workloads and devices and build the segmentation policies to govern network communications and ring-fence critical applications from unauthorized access. Once the segmentation policies are defined and solutions deployed, clients can consume steady-state services for ongoing management of their environment’s workloads and applications. This includes health and maintenance, policy and configuration management, service governance and vendor management.

IBM has partnered with Illumio, an industry leader in zero trust segmentation, to deliver this solution. Illumio’s software platform provides attack surface visibility, enabling you to see all communication and traffic between workloads and devices across the entire hybrid attack surface. In addition, it allows security teams to set automated, granular and flexible segmentation policies that control communications between workloads and devices, only allowing what is necessary to traverse the network. Ultimately, this helps organizations to quickly isolate compromised systems and high-value assets, stopping the spread of an active attack.

With AVS, clients can harden compute nodes across their data center, cloud and edge environments and protect their critical enterprise assets.

Start Your Segmentation Journey

IBM Security Services can help you plan and execute a segmentation strategy to meet your objectives. To learn more, register for the on-demand webinar now.

The post Contain Breaches and Gain Visibility With Microsegmentation appeared first on Security Intelligence.
