At VA, cyber dominance is in, cyber compliance is out

The Department of Veterans Affairs is moving toward a more operational approach to cybersecurity.

This means VA is focusing more deeply on protecting attack surfaces and closing off the threat vectors that put veterans’ data at risk.

Eddie Pool, the acting principal assistant secretary for information and technology and acting principal deputy chief information officer at VA, said the agency is changing its cybersecurity posture to reflect a cyber dominance approach.

“That’s a move away from the traditional, exclusively compliance-based approach to cybersecurity, where we put a lot of our time, resources and investments in compliance-based activities,” Pool said on Ask the CIO. “For example, did someone check the box on a form? Did someone file something in the right place? We’re really moving a lot of our focus over to the risk-based approach to security, pushing things like zero trust architecture, microsegmentation of our networks and really doing things that are more focused on the operational landscape. We are more focused on protecting those attack surfaces and closing off those threat vectors in cyberspace.”

A big part of this move to cyber dominance is applying the concepts that make up a zero trust architecture, like microsegmentation and identity and access management.
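
Microsegmentation, one of the concepts Pool names, moves the access decision from a single network perimeter to the boundary of each individual segment. The article does not describe VA’s actual implementation, so the following is only a minimal Python sketch of a segment-level, deny-by-default check; every segment, group and field name here is hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of a microsegmented access check: instead of one
# perimeter decision, every request is evaluated against the policy of
# the specific segment that holds the resource.

@dataclass(frozen=True)
class Request:
    user: str
    user_groups: frozenset
    mfa_verified: bool
    source_segment: str
    target_segment: str

@dataclass
class SegmentPolicy:
    allowed_groups: set             # who may reach this segment
    allowed_source_segments: set    # which segments may talk to it
    require_mfa: bool = True

POLICIES = {
    # Each segment gets its own policy; a breach of one segment does
    # not grant implicit access to its neighbors.
    "claims-db": SegmentPolicy(
        allowed_groups={"claims-processors"},
        allowed_source_segments={"claims-app"},
    ),
    "claims-app": SegmentPolicy(
        allowed_groups={"claims-processors", "auditors"},
        allowed_source_segments={"user-access"},
    ),
}

def authorize(req: Request) -> bool:
    """Deny by default: identity, MFA and the segment-to-segment
    path must all be explicitly allowed."""
    policy = POLICIES.get(req.target_segment)
    if policy is None:
        return False  # unknown segments are denied outright
    if policy.require_mfa and not req.mfa_verified:
        return False  # request lacks a verified second factor
    if req.source_segment not in policy.allowed_source_segments:
        return False  # lateral path is not on the allow list
    return bool(req.user_groups & policy.allowed_groups)

# A claims processor reaching the database through the app tier passes;
# the same identity arriving from an arbitrary segment does not.
ok = authorize(Request("vsmith", frozenset({"claims-processors"}), True,
                       "claims-app", "claims-db"))
lateral = authorize(Request("vsmith", frozenset({"claims-processors"}), True,
                            "workstations", "claims-db"))
print(ok, lateral)  # True False
```

The deny-by-default branches are the operational point Pool describes: an unknown segment or an unlisted lateral path is a closed-off threat vector rather than an implicitly trusted one.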

Pool said as VA modernizes its underlying technology infrastructure, it will “bake in” these zero trust capabilities.

“Over the next several years, you’re going to see that naturally evolve in terms of where we are in the maturity model path. Our approach here is not necessarily to try to map to a model. It’s really to rationalize what are the highest value opportunities that those models bring, and then we prioritize on those activities first,” he said. “We’re not pursuing it in a linear fashion. We are taking parts and pieces of what makes the most sense for the biggest bang for our buck right now, and that’s where we’re putting our energy and effort.”

One of those areas that VA is focused on is rationalizing the number of tools and technologies it’s using across the department. Pool said the goal is to get down to a specific set instead of having the “31 flavors” approach.

“We’re going to try to make it where you can have any flavor you want so long as it’s chocolate. We are trying to get that standardized across the department,” he said. “That gives us the opportunity from a sustainment perspective that we can focus the majority of our resources on those enterprise standardized capabilities. From a security perspective, it’s a far less threat landscape to have to worry about having 100 things versus having two or three things.”

The business process reengineering priority

Pool added that redundancy remains a key factor in the security and tool rationalization effort. He said VA will continue to have a diversity of products in its IT investment portfolios.

“Where we are at is we are looking at how do we build that future state architecture as elegantly and simplistically as possible, so that we can manage it more effectively and protect it more securely,” he said.

In addition to standardizing on cyber tools and technologies, Pool said VA is bringing the same approach to business processes for enterprisewide services.

He said over the years, VA has built up a laundry list of legacy technology, all with different versions and requirements to maintain.

“We’ve done a lot over the years in the Office of Information and Technology to really standardize on our technology platforms. Now it’s time to leverage that, to really bring standard processes to the business,” he said. “What that does is really help us continue to put the veteran at the center of everything that we do, and it gives a very predictable, very repeatable process and expectation for veterans across the country, so that you don’t have different experiences based on where you live, where you’re getting your health care or what part of the organization you’re dealing with.”

Part of the standardization effort is that VA will expand its use of automation, particularly in the processing of veterans’ claims.

Pool said the goal is to take more advantage of the agency’s data and use artificial intelligence to accelerate claims processing.

“The richness and standardization of the data we’re looking at, and how we can eliminate as many steps in these processes as we can, where we have data to make decisions, or we can automate a lot of things that would completely eliminate what would be a paper process, that is our focus,” Pool said. “We’re trying to streamline IT to the point that it’s as fast and as efficient, secure and accurate as possible from a VA processing perspective, and in turn, it’s going to bring a decision back to the veteran a lot faster, and a decision that’s ready to go on to the next step in the process.”
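
VA’s claims pipeline is not described in the article, but the logic Pool outlines, automate “where we have data to make decisions,” amounts to a triage rule: a claim skips manual handling only when every decision-relevant input is already on file. A minimal, hypothetical sketch (all field names invented):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical triage sketch: route a claim straight to automated
# processing only when every decision-relevant field is present,
# otherwise fall back to manual review.

@dataclass
class Claim:
    claim_id: str
    service_records_on_file: bool
    diagnosis_code: Optional[str]
    exam_results_on_file: bool

def route(claim: Claim) -> str:
    missing = []
    if not claim.service_records_on_file:
        missing.append("service records")
    if claim.diagnosis_code is None:
        missing.append("diagnosis code")
    if not claim.exam_results_on_file:
        missing.append("exam results")
    if missing:
        # A human still handles anything the data cannot decide.
        return f"manual-review (missing: {', '.join(missing)})"
    # Complete data: the paper steps can be skipped entirely.
    return "auto-process"

print(route(Claim("C-1", True, "J45", True)))  # auto-process
print(route(Claim("C-2", True, None, False)))  # manual-review (...)
```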

Many of these updates already are having an impact on VA’s business processes. The agency said that it set a new record for the number of disability and pension claims processed in a single year, more than 3 million. That beat its record set in 2024 by more than 500,000.

“We’re driving benefit outcomes. We’re driving technology outcomes. From my perspective, everything that we do here, every product, service and capability that the department provides the veteran community, it’s all enabled through technology. So technology is the underpinning infrastructure, the backbone, to make all things happen, or where all things can fail,” Pool said. “First, on the internal side, it’s about making sure that those infrastructure components are modernized. Everything’s hardened. We have a reliable, highly available infrastructure to deliver those services. Then at the application level, at the actual point of delivery, IT is involved in every aspect of every challenge in the department, to again bring the best technology experts to the table and look at how we can leverage the best technologies to simplify the business processes, whether that’s claims automation, getting veterans their mileage reimbursement earlier or automating processes to increase the efficacy of the outcomes that we deliver, and just simplify how the veterans consume the services of VA. That’s the only reason why we exist here: to be that enabling partner to the business to make these things happen.”

The post At VA, cyber dominance is in, cyber compliance is out first appeared on Federal News Network.

The Dual Role of AI in Cybersecurity: Shield or Weapon?

Artificial intelligence isn’t just another tool in the security stack anymore – it’s changing how software is written, how vulnerabilities spread and how long attackers can sit undetected inside complex environments. Security researcher and startup founder Guy Arazi unpacks why AI has become both a powerful defensive accelerator and a force multiplier for adversaries, especially…

The post The Dual Role of AI in Cybersecurity: Shield or Weapon? appeared first on Security Boulevard.

When the Browser Becomes the Battleground for AI and Last-Mile Attacks

For years we treated the browser as just another application. That era is over. As Vivek Ramachandran points out, the browser has quietly become the new endpoint—and attackers have noticed. Users now live in the browser for work, banking, crypto, entertainment and everything in between. If that’s where the users are, that’s where the attacks…

The post When the Browser Becomes the Battleground for AI and Last-Mile Attacks appeared first on Security Boulevard.

How the administration is bringing much-needed change to software license management

Over the last 11 months, the General Services Administration has signed 11 enterprisewide software agreements under its OneGov strategy.

The agreements bring agencies both standard terms and conditions and significant discounts for a limited period.

Ryan Triplette, the executive director of the Coalition for Fair Software Licensing, said the Trump administration seems to be taking cues from what has been working, or not working, in the private sector around managing software licenses.

“They seem to be saying, ‘let’s see if we can import that into the federal agencies,’ and ‘let’s see if we can address that to mitigate some of the issues and some of the systemic problems that have been occurring here,’” said Triplette on Ask the CIO. “Now it’s significant, and it’s a challenge, but it’s important to understand that any precedent that is set in one place, in this instance in the public agencies, will have a ripple of impact over into the commercial sector.”

The coalition, which cloud service providers created in 2022 to advocate for less-restrictive rules for buying software, outlined nine principles it would like to see applied to all software licenses, including that terms should be clear and intelligible, that customers should be free to run their on-premises software on the cloud of their choice and that licenses should cover reasonably expected software uses.

Triplette said while there still is a lot to understand about these new OneGov agreements, GSA seems to recognize there is an opportunity to address some long-standing challenges with how the government buys and manages its software.

“You had the Department of Government Efficiency (DOGE) efforts and you had the federal chief information officer calling for an assessment of the top five software vendors from all the federal agencies. And you also have the executive order that established OneGov and them seeking to establish these enterprisewide licenses. I think they recognize that there’s an opportunity here to effect change and to borrow practices from what they have seen has worked in the commercial sector,” she said. “Now there’s so many moving parts of issues that need to be addressed within the federal government’s IT and systems, generally. But just tackling issues that we have seen within software and just tackling the recommendations that have been made by the Government Accountability Office over the past several years is important.”

Building on the success of the MEGABYTE Act

GAO has highlighted concerns about vendors applying restrictive licensing practices. In November 2024, GAO found vendor processes that limit, impede or prevent agencies’ efforts to use software in cloud computing. Meanwhile, of the six agencies auditors analyzed, none had “fully established guidance that specifically addressed the two key industry activities for effectively managing the risk of impacts of restrictive practices.”

Triplette said the data call by the federal CIO in April and the OneGov efforts are solid initial steps to change how agencies buy and manage software.

The Office of Management and Budget and GSA have tried several times over the past two decades to improve the management of software. Congress also joined the effort, passing the Making Electronic Government Accountable By Yielding Tangible Efficiencies (MEGABYTE) Act in 2016.

Triplette said despite these efforts, a lack of data has been a constant problem.

“The federal government has found that even when there’s a modicum of understanding of what their software asset management uses are, they seem to find a cost and performance improvement within the departments. So that’s been one issue. You have the differing needs of the various agencies and departments. This has led them in previous efforts to either opt out of enterprisewide licenses or to modify them with their own terms. So even when there’s been these efforts, you find, like, a year or two or three years later, it’s all a wash,” she said. “Quite frankly, you have a lack of a central mandate and appropriations line. That’s probably the most fundamental thing and why it also differs so fundamentally from other governments that have some of these more centralized services. For instance, the UK government has a central mandate, and it works quite well.”
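
The data gap Triplette describes is concrete: without reconciling what an agency owns against what it actually uses, the waste is invisible. A toy software asset management report, with invented products and figures, illustrates the kind of visibility the MEGABYTE Act pushed agencies toward:

```python
# Hypothetical software asset management reconciliation: compare seats
# purchased (entitlements) against seats actually in use, per product.
# All products and numbers below are invented for illustration.

entitlements = {"Product A": 10_000, "Product B": 2_500, "Product C": 800}
active_users = {"Product A": 6_200, "Product B": 2_480, "Product C": 950}
annual_cost_per_seat = {"Product A": 120.0, "Product B": 300.0, "Product C": 50.0}

for product, owned in sorted(entitlements.items()):
    used = active_users.get(product, 0)
    idle = max(owned - used, 0)          # paid-for seats nobody uses
    over = max(used - owned, 0)          # over-deployment is a compliance risk
    waste = idle * annual_cost_per_seat[product]
    print(f"{product}: {owned} owned, {used} used, "
          f"{idle} idle (~${waste:,.0f}/yr), {over} over-deployed")
```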

Triplette said what has changed is what she called a “sheer force of will” by OMB and GSA.

“They are recognizing the significant amount of waste that’s been occurring and that there has been lock-in with some software vendors and other issues that need to be tackled,” she said. “I think you’ve seen where the administration has really leaned into that. Now, what is going to be interesting is that even though it has been so centralized, like the OneGov effort, it’s still also an opt-in process. So that’s why I keep on saying, it’s to be determined how effective it will be.”

SAMOSA gaining momentum

In addition to the administration’s efforts, Triplette said she’s hopeful Congress finally passes the Strengthening Agency Management and Oversight of Software Assets (SAMOSA) Act. The Senate ran out of time to act on SAMOSA last session, after the House passed it in December.

The latest version of SAMOSA mirrors the Senate bill the committee passed in May 2023. It also is similar to the House version introduced in March by Reps. Nancy Mace (R-S.C.), the late Gerry Connolly (D-Va.), and several other lawmakers.

The coalition is a strong supporter of SAMOSA.

Triplette said one of the most important provisions in the bill would require agencies to have a dedicated executive overseeing software license asset management.

“There is an importance and a need to have greater expertise within the federal workforce around software licensing, and especially, arguably, vendor-specific software licensing terms,” she said. “I think this is one area that the administration could take a cue from the commercial sector. When they’re engaged in commercial licensing, they tend to work with consultants that are experts in the vendor licensing rules; they understand the policy and they understand the ins and outs. They often have somebody in house that … may not be solely specific to one vendor, but they may do only two or three, and so you really have that depth of expertise, and you can realize some great cost savings.”

Triplette added that while finding these types of experts isn’t easy, the return on the investment of either hiring or training someone is well worth it.

She said some estimate that the government could save $50 million a year by improving how it manages its software licenses. This is on top of what the MEGABYTE Act already produced. In 2020, the Senate Homeland Security and Governmental Affairs Committee found that 13 agencies saved or avoided spending more than $450 million between fiscal 2017 and 2019 because of the MEGABYTE Act.

“The MEGABYTE Act was an excellent first step, but this, like everything, [is] part of an iterative process. I think it’s something that needs to have the requirement that it has to be done and mandated,” Triplette said. “This is something that has become new as you’ve had the full federal movement to the cloud, and the discussion of licensing terms between on-premise and the cloud, and the intersection between all of this transformation. That is something that wasn’t around during the MEGABYTE Act. I think that’s where it’s a little bit of a different situation.”

The post How the administration is bringing much-needed change to software license management first appeared on Federal News Network.

The more people trust the systems they use, the more they’ll participate in the digital economy, and that’s how innovation truly scales

Cyber threats are evolving at an unprecedented pace, making collaboration, innovation, and resilience more essential than ever. In this exclusive interview, we sit down with Tina Mirceta, Senior Managing Consultant, Security Services, SEE at Mastercard, to discuss how the cybersecurity landscape is transforming and what organizations can do to stay ahead of increasingly complex attacks. […]

The post The more people trust the systems they use, the more they’ll participate in the digital economy, and that’s how innovation truly scales appeared first on DefCamp 2025.

Cybersecurity is no longer a separate layer – it’s at the core of digital transformation

As digital transformation accelerates, cybersecurity has become a cornerstone of business resilience. We spoke with Florin Popa, Orange Business Director, about how Orange Romania is shaping the future of ICT and cybersecurity through innovation, partnerships, and education. From the launch of SCUT, the newest cybersecurity company in Romania, to the importance of collaboration within the […]

The post Cybersecurity is no longer a separate layer – it’s at the core of digital transformation appeared first on DefCamp 2025.

Yeske helped change what complying with zero trust means

The Cybersecurity and Infrastructure Security Agency developed a zero trust architecture that features five pillars.

The Defense Department’s zero trust architecture includes seven pillars.

The one the Department of Homeland Security is implementing takes the best of both architectures and adds a little more to the mix.

Don Yeske, who recently left federal service after serving for the last two-plus years as the director of national security in the cyber division at DHS, said the agency had to take a slightly different approach for several reasons.

“If you look at OMB [memo] M-22-09 it prescribes tasks. Those tasks are important, but that itself is not a zero trust strategy. Even if you do everything that M-22-09 told us to do — and by the way, those tasks were due at the beginning of this year — even if you did it all, that doesn’t mean, goal achieved. We’re done with zero trust. Move on to the next thing,” Yeske said during an “exit” interview on Ask the CIO. “What it means is you’re much better positioned now to do the hard things that you had to do and that we hadn’t even contemplated telling you to do yet. DHS, at the time that I left, was just publishing this really groundbreaking architecture that lays out what the hard parts actually are and begins to attack them. And frankly, it’s all about the data pillar.”

The data pillar of zero trust is among the toughest ones. Agencies have spent much of the past two years focused on other parts of the architecture, like improving their cybersecurity capabilities in the identity and network pillars.

Yeske, who now is a senior solutions architect federal at Virtru, said the data pillar challenge for DHS is even bigger because of the breadth and depth of its mission. He said between the Coast Guard, FEMA, Customs and Border Protection and CISA alone, there are multiple data sources, requirements and security rules.

“What’s different about it is we viewed the problem of zero trust as coming in broad phases. Phase one, where you’re just beginning to think about zero trust, and you’re just beginning to adjust your approach, is where you start to take on the idea that my network boundary can’t be my primary, let alone sole, line of defense. I’ve got to start shrinking those boundaries around the things that I’m trying to protect,” he said. “I’ve got to start defending within my network architecture, not just from the outside, but start viewing the things that are happening within my network with suspicion. Those are all building on the core tenets of zero trust.”

Focused on capabilities, not products

He said the initial zero trust strategy stopped there: segmenting networks and protecting data at rest.

But to get to this point, he said agencies too often are focused on implementing specific products around identity or authentication and authorization processes.

“It’s a fact that zero trust is something you do. It’s not something you buy. In spite of that, federal architecture has this pervasive focus on product. So at DHS, the way we chose to describe zero trust was as a series of capabilities. We chose, without malice or forethought, to measure those capabilities at the organization, not at the system, not at the component, not as a function of design,” Yeske said. “Organizations have capabilities, and those capabilities are comprised of three big parts. People: Who’s responsible for the thing you’re describing within your organization? Process: How have you chosen to do the thing that you’re describing at your organization? And products: What helps you do that?”

Yeske said the third part is technology, which, too often, is intertwined with the product part.

He said the DHS architecture moved away from focusing on product or technology, and instead tried to answer the simple, yet complex, questions: What’s more important right now? What are the things that I should spend my limited pool of dollars on?

“We built a prioritization mechanism, and we built it on the idea that each of those capabilities, once we understand their inherent relationships to one another, form a sort of Maslow’s hierarchy of zero trust. There are things that are more basic, that if you don’t do this, you really can’t do anything else, and there are things that are really advanced, that once you can do basically everything else you can contemplate doing this. And there are a lot of things in between,” he said. “We took those 46 capabilities based on their inherent logical relationships, and we came up with a prioritization scheme so that you could, if you’re an organization implementing zero trust, prioritize the products, process and technologies.”

Understanding cyber tool dependencies

DHS defined those 46 capabilities based on the organization’s ability to perform that function to protect its data, systems or network.

Yeske said, for example, with phishing-resistant multi-factor authentication, DHS didn’t specify the technology or product needed, but just the end result: the ability to authenticate users using multiple factors that are resistant to phishing.

“We’re describing something your organization needs to be able to do because if you can’t do that, there are other things you need to do that you won’t be able to do. We just landed on 46, but that’s not actually all that weird. If you look at the Defense Department’s zero trust roadmap, it contains a similar number of things they describe as capabilities, which are somewhat different,” said Yeske, who spent more than 15 years working for the Navy and Marine Corps before coming to DHS. “We calculated a 92% overlap between the capabilities we described in our architecture and the ones DoD described. And the 8% difference is mainly because the DHS one is brand new. So it helps to understand that the definition of each of these capabilities also includes two types of relationships, starting with a dependency, which is where you can’t have this capability unless you first had a different one.”
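
The DHS architecture itself is not reproduced in the article, but the prioritization mechanism Yeske describes, ordering capabilities by their dependency relationships, is essentially a topological sort over the capability graph. A hedged sketch; the capability names and edges below are invented, not DHS’s actual 46:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical sketch of the prioritization Yeske describes: each
# capability depends on others, and ordering the graph yields the
# "Maslow's hierarchy" of zero trust he mentions.

capabilities = {
    # capability: set of capabilities that must exist first
    "inventory-of-users": set(),
    "inventory-of-devices": set(),
    "multi-factor-authentication": {"inventory-of-users"},
    "phishing-resistant-mfa": {"multi-factor-authentication"},
    "network-microsegmentation": {"inventory-of-devices"},
    "data-tagging": {"inventory-of-users"},
    "attribute-based-access": {"phishing-resistant-mfa", "data-tagging"},
}

# static_order() yields every capability after all of its prerequisites;
# a CycleError here would indicate a mis-specified architecture.
for priority, name in enumerate(TopologicalSorter(capabilities).static_order(), 1):
    print(f"{priority}. {name}")
```

Capabilities with no prerequisites print first; anything “advanced” in Yeske’s sense lands at the bottom of the list because everything it depends on must already exist.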

Yeske said before he left DHS in July, the zero trust architecture and framework had been approved for use and most of the components had a significant number of cyber capabilities in place.

He said the next step was assessing the maturity of those capabilities and figuring out how to move them forward.

If other agencies are interested in this approach, Yeske said, the DHS architecture should be available for them to obtain.

The post Yeske helped change what complying with zero trust means first appeared on Federal News Network.

The journey to the Global Cybersecurity Camp (GCC), by DefCamp

By: florina

Building strong communities in the cybersecurity industry is crucial because cyber threats are constantly evolving, and no single entity can combat them alone. Collaboration fosters knowledge sharing, strengthens defenses, and accelerates incident response. This is exactly why the Global Cybersecurity Camp (GCC) exists – and why DefCamp is proud to represent Romania at GCC for […]

The post The journey to the Global Cybersecurity Camp (GCC), by DefCamp appeared first on DefCamp 2025.

“Bad actors will begin using massive A.I. to *create* new methods to escape traditional – State of the Art – security tools”

Edition #14 of DefCamp is just around the corner, and the excitement is building! With less than a week to go, the conference promises to dive into some of the most pressing challenges and emerging trends in the ever-evolving field of cybersecurity. As the industry continues to transform rapidly, DefCamp offers the perfect platform to […]

The post “Bad actors will begin using massive A.I. to *create* new methods to escape traditional – State of the Art – security tools” appeared first on DefCamp 2025.

Securing the cloud: insights on threats, solutions, and innovations

There is no mystery that everything nowadays has a digital component. A growing number of companies are gravitating towards digital and cloud storage solutions, recognizing the unparalleled convenience of accessing documents from any corner of the world. The appeal lies in the elimination of boundaries, fostering more efficient business transactions. We are proud to have […]

The post Securing the cloud: insights on threats, solutions, and innovations appeared first on DefCamp 2025.

Striking a balance between security updates, following threats and being a team player

The world of cybersecurity is fast-paced; there’s no denying it. Innovation is constant and threats are ever-evolving. As a result, it’s not uncommon for professionals to get easily overwhelmed. But here’s a comforting thought: none of us is ever truly alone in this journey. Where the infosec community comes into play In the vast […]

The post Striking a balance between security updates, following threats and being a team player appeared first on DefCamp 2025.

Pentesting: a tool for empowering – not punishing – companies

You’ve likely caught wind of this rising tide – offensive security, pentesting, and #RedTeams are not just gaining attention; they’re setting the trend. We addressed this very topic in an article earlier this year. What adds a dash of excitement is that DefCamp 2023, scheduled for next week, promises a lot of talk on offensive security. […]

The post Pentesting: a tool for empowering – not punishing – companies appeared first on DefCamp 2025.

Cybersecurity may come in different shapes and sizes, but eventually it’s all about customer trust and confidence

Since you’re working in cybersecurity, you’re most probably doing everything in your power to keep your company and/or your customers’ businesses as safe as possible. A cybersecurity specialist will strive to reduce the attack surface, conduct regular pentests, embrace the zero-trust approach, do constant research, keep up with the latest trends in cybercrime – the […]

The post Cybersecurity may come in different shapes and sizes, but eventually it’s all about customer trust and confidence appeared first on DefCamp 2022.

Driving innovation while securing intricate infrastructures

When scanning an extensive environment for vulnerabilities, there are lots of potential attack vectors hackers can employ, aiming to infiltrate protected organizations. But how does one protect against threats when operating on large and intricate infrastructures? Alex “Jay” Balan, CISO for Happening & Superbet Group, is in a capacity to offer some guidance. Leading the […]

The post Driving innovation while securing intricate infrastructures appeared first on DefCamp 2022.

IoT: how does interconnectivity alter the cybersecurity landscape?

The first things that come to mind when discussing “interconnectivity” are PCs, tablets, smartphones and smart houses. IoT goes further and broader than that. The IoT environment allows for anything to be wired up and connected to communicate, thus creating a massive information system with the capacity to improve the quality of life and enable […]

The post IoT: how does interconnectivity alter the cybersecurity landscape? appeared first on DefCamp 2022.

Raphaël Lheureux on the importance of sharing information in cybersecurity as key to making the community thrive

Getting more context from cybersecurity pros is essential to have a clearer picture of cyber threats and see this industry through the lens of those who are actively involved in it. And we all need to understand why threat actors are still making their way into companies and wasting no opportunity to exploit their […]

The post Raphaël Lheureux on the importance of sharing information in cybersecurity as key to making the community thrive appeared first on DefCamp 2022.
