
Digital Forensics: Drone Forensics for Battlefield and Criminal Analysis

23 December 2025 at 14:53

Welcome back, aspiring digital investigators!

Over the last few years, drones have moved from being niche gadgets to becoming one of the most influential technologies on the modern battlefield and far beyond it. The war in Ukraine accelerated this shift dramatically. During the conflict, drones evolved at an incredible pace, transforming from simple reconnaissance tools into precision strike platforms, electronic warfare assets, and logistics tools. This rapid adoption did not stop with military forces. Criminal organizations, including cartels and smuggling networks, quickly recognized the potential of drones for surveillance and contraband delivery. As drones became cheaper, more capable, and easier to modify, their use expanded into both legal and illegal activities. This created a clear need for digital forensics specialists who can analyze captured drones and extract meaningful information from them.

Modern drones are packed with memory chips, sensors, logs, and media files. Each of these components can tell a story about where the drone has been, how it was used, and who may have been controlling it. At its core, digital forensics is about understanding devices that store data. If something has memory, it can be examined.

U.S. Department of Defense Drone Dominance Initiative

Recognizing how critical drones have become, the United States government launched a major initiative focused on drone development and deployment. Secretary of War Pete Hegseth announced a one-billion-dollar “drone dominance” program aimed at equipping the U.S. military with large numbers of cheap, scalable attack drones.

Modern conflicts have shown that it makes little sense to shoot down inexpensive drones using missiles that cost millions of dollars. The program focuses on producing tens of thousands of small drones by 2026 and hundreds of thousands by 2027. The focus has shifted away from a quality-over-quantity mindset toward deploying unmanned systems at scale. Analysts must be prepared to examine drone hardware and data just as routinely as laptops, phones, or servers.

Drone Platforms and Their Operational Roles

Not all drones are built for the same mission. Different models serve very specific roles depending on their design, range, payload, and level of control. On the battlefield, FPV drones are often used as precision strike weapons. These drones are lightweight, fast, and manually piloted in real time, allowing operators to guide them directly into high-value targets. Footage from Ukraine shows drones intercepting and destroying larger systems, including loitering munitions carrying explosive payloads.

Ukrainian "Sting" drone striking a Russian Shahed carrying an R-60 air-to-air missile
Ukrainian “Sting” drone striking a Russian Shahed carrying an R-60 air-to-air missile

To counter electronic warfare and jamming, many battlefield drones are now launched using thin fiber optic cables instead of radio signals. These cables physically connect the drone to the operator, making jamming ineffective. In heavily contested areas, forests are often covered with discarded fiber optic lines, forming spider-web-like patterns that reflect sunlight. Images from regions such as Kupiansk show how widespread this technique has become.

Fiber optic cables in contested drone war zones

Outside of combat zones, drones serve entirely different purposes. Commercial drones are used for photography, mapping, agriculture, and infrastructure inspection. Criminal groups may use similar platforms for smuggling, reconnaissance, or intimidation. Each use case leaves behind different types of forensic evidence, which is why understanding drone models and their intended roles is so important during an investigation.

DroneXtract – A Forensic Toolkit for DJI Drones

To make sense of all this data, we need specialized tools. One such tool is DroneXtract, an open-source digital forensics suite written in Go and available on GitHub. DroneXtract is designed specifically for DJI drones and focuses on extracting and analyzing telemetry, sensor values, and flight data.

DroneXtract, a tool for drone forensics and drone file analysis

The tool allows investigators to visualize flight paths, audit drone activity, and extract data from multiple file formats. It is suitable for law enforcement investigations, military analysis, and incident response scenarios where understanding drone behavior is critical. With this foundation in mind, let us take a closer look at its main features.

Feature 1 – DJI File Parsing

DroneXtract supports parsing common DJI file formats such as CSV, KML, and GPX. These files often contain flight logs, GPS coordinates, timestamps, altitude data, and other telemetry values recorded during a drone’s operation. The tool allows investigators to extract this information and convert it into alternative formats for easier analysis or sharing.

DJI file parsing

In practical terms, this feature can help law enforcement reconstruct where a drone was launched, the route it followed, and where it landed. For military analysts, parsed telemetry data can reveal patrol routes, observation points, or staging areas used by adversaries. Even a single flight log can provide valuable insight into patterns of movement and operational habits.
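
To make the parsing step concrete, here is a minimal Go sketch of the same idea: reading a CSV flight log and printing the recorded GPS track. The file name and column headers (time(millisecond), latitude, longitude, altitude(feet)) are assumptions about an Airdata-style export, not DroneXtract's documented interface, so adjust them to match your actual log.

// flightlog.go - minimal sketch: read a CSV flight log and print the GPS track.
package main

import (
    "encoding/csv"
    "fmt"
    "log"
    "os"
)

func main() {
    f, err := os.Open("flight_log.csv") // hypothetical Airdata-style export
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    rows, err := csv.NewReader(f).ReadAll()
    if err != nil || len(rows) < 2 {
        log.Fatal("could not read flight log")
    }

    // Map header names to column indexes so the sketch survives column reordering.
    col := map[string]int{}
    for i, name := range rows[0] {
        col[name] = i
    }
    // Assumed header names; change these to match the real export.
    for _, n := range []string{"time(millisecond)", "latitude", "longitude", "altitude(feet)"} {
        if _, ok := col[n]; !ok {
            log.Fatalf("column %q not found in header row", n)
        }
    }

    for _, r := range rows[1:] {
        fmt.Printf("t=%sms  lat=%s  lon=%s  alt=%sft\n",
            r[col["time(millisecond)"]], r[col["latitude"]],
            r[col["longitude"]], r[col["altitude(feet)"]])
    }
}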

Feature 2 – Steganography

Steganography refers to hiding information within other files, such as images or videos. DroneXtract includes a steganography suite that can extract telemetry and other embedded data from media captured by DJI drones. This hidden data can then be exported into several different file formats for further examination.

Drone steganography analysis

This capability is particularly useful because drone footage often appears harmless at first glance. An image or video shared online may still contain timestamps, unique identifiers and sensor readings embedded within it. For police investigations, this can link media to a specific location or event.
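
DroneXtract's extraction routines are format-specific, but the general idea can be illustrated with a quick triage sketch in Go that dumps printable ASCII runs from a media file, much like the Unix strings utility; embedded metadata such as XMP blocks in DJI stills often surfaces this way. This is a generic first-pass technique, not the tool's actual implementation, and the file name is hypothetical.

// mediastrings.go - triage sketch: print printable ASCII runs from a media file,
// the same idea as the Unix `strings` utility. Embedded metadata (e.g. XMP blocks
// in DJI stills) often shows up in this output. Illustrative only, not
// DroneXtract's actual extraction logic.
package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
)

func main() {
    f, err := os.Open("DJI_0001.JPG") // hypothetical capture pulled from the drone
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    const minRun = 8 // only report runs long enough to be interesting
    var run []byte
    r := bufio.NewReader(f)
    for {
        b, err := r.ReadByte()
        if err != nil { // io.EOF or a read error ends the scan
            break
        }
        if b >= 0x20 && b <= 0x7e { // printable ASCII
            run = append(run, b)
            continue
        }
        if len(run) >= minRun {
            fmt.Println(string(run))
        }
        run = run[:0]
    }
    if len(run) >= minRun {
        fmt.Println(string(run))
    }
}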

Feature 3 – Telemetry Visualization

Understanding raw numbers can be difficult, which is why visualization matters. DroneXtract includes tools that generate flight path maps and telemetry graphs. The flight path mapping generator creates a visual map showing where the drone traveled and the route it followed. The telemetry graph visualizer plots sensor values such as altitude, speed, and battery levels over time.

Drone telemetry visualization

Investigators can clearly show how a drone behaved during a flight, identify unusual movements, or detect signs of manual intervention. Military analysts can use these visual tools to assess mission intent, identify reconnaissance patterns, or confirm whether a drone deviated from its expected route.
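
As a simplified picture of what a flight path export looks like, the sketch below (in Go, with invented coordinates) writes a handful of parsed positions out as a KML LineString that can be opened in Google Earth. DroneXtract's own map generator is far more capable; this only shows the underlying format.

// flightpath_kml.go - minimal sketch: write parsed telemetry points out as a
// KML LineString so the route can be viewed in Google Earth or similar tools.
// The sample points are invented; real input would come from a parsed log.
package main

import (
    "fmt"
    "log"
    "os"
)

type point struct{ lat, lon, alt float64 }

func main() {
    track := []point{ // hypothetical positions from a parsed flight log
        {50.4501, 30.5234, 80},
        {50.4512, 30.5251, 95},
        {50.4523, 30.5270, 110},
    }

    out, err := os.Create("flight_path.kml")
    if err != nil {
        log.Fatal(err)
    }
    defer out.Close()

    fmt.Fprintln(out, `<?xml version="1.0" encoding="UTF-8"?>`)
    fmt.Fprintln(out, `<kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>`)
    fmt.Fprintln(out, `<name>Reconstructed flight path</name><LineString><coordinates>`)
    for _, p := range track {
        // KML expects longitude,latitude,altitude
        fmt.Fprintf(out, "%f,%f,%f\n", p.lon, p.lat, p.alt)
    }
    fmt.Fprintln(out, `</coordinates></LineString></Placemark></Document></kml>`)
}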

Feature 4 – Flight and Integrity Analysis

The flight and integrity analysis feature focuses on detecting anomalies. The tool reviews all recorded telemetry values, calculates expected variance, and checks for suspicious gaps or inconsistencies in the data. These gaps may indicate file corruption, tampering, or attempts to hide certain actions.

Drone flight analysis

Missing data can be just as meaningful as recorded data. Law enforcement can use this feature to determine whether logs were altered after a crime. Military analysts can identify signs of interference and malfunction, helping them assess the reliability of captured drone intelligence.
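
A stripped-down version of this kind of check fits in a few lines of Go: flag gaps between consecutive timestamps and report the min/max spread of a telemetry channel. The thresholds and sample values below are placeholders in the spirit of the tool's variance setting, not its actual algorithm.

// integrity_check.go - minimal sketch of flight-log integrity checks: flag
// large gaps between consecutive samples and report the spread of a channel.
// Thresholds and data are placeholders, not DroneXtract's actual logic.
package main

import "fmt"

type sample struct {
    tMillis  int64   // timestamp in milliseconds
    altitude float64 // metres, for example
}

func main() {
    samples := []sample{ // hypothetical telemetry
        {0, 12.0}, {200, 15.5}, {400, 18.1}, {5400, 92.7}, {5600, 95.0},
    }

    const maxGapMillis = 1000 // anything larger is treated as a suspicious gap
    minAlt, maxAlt := samples[0].altitude, samples[0].altitude

    for i := 1; i < len(samples); i++ {
        if gap := samples[i].tMillis - samples[i-1].tMillis; gap > maxGapMillis {
            fmt.Printf("suspicious gap of %dms after t=%dms\n", gap, samples[i-1].tMillis)
        }
        if a := samples[i].altitude; a < minAlt {
            minAlt = a
        } else if a > maxAlt {
            maxAlt = a
        }
    }

    spread := maxAlt - minAlt // variance in the min/max sense used by the tool's settings
    fmt.Printf("altitude spread: %.1f (min %.1f, max %.1f)\n", spread, minAlt, maxAlt)
}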

Usage

DroneXtract is built in Go, so before anything else you need to have Go installed on your system. This makes the tool portable and easy to deploy, even in restricted or offline environments such as incident response labs or field investigations.

We begin by cloning the project to our computer:

git clone https://github.com/ANG13T/DroneXtract.git

To build and run DroneXtract from source, you start by enabling Go modules. This allows Go to correctly manage dependencies used by the tool.

export GO111MODULE=on

Next, you fetch all required dependencies defined in the project. This step prepares your environment and ensures all components DroneXtract relies on are available.

go get ./...

Once everything is in place, you can launch the tool directly:

go run main.go

At this point, DroneXtract is ready to be used for parsing files, visualizing telemetry, and performing integrity analysis on DJI drone data. The entire process runs locally, which is important when handling sensitive or classified material.

Airdata Usage

DJI drones store detailed flight information in .TXT flight logs. These files are not immediately usable for forensic analysis, so an intermediate step is required. For this, we rely on Airdata’s Flight Data Analysis tool, which converts DJI logs into standard forensic-friendly formats.

The converter is available on Airdata’s website.

Once the flight logs are processed through Airdata, the resulting files can be used directly with DroneXtract:

Airdata CSV output files can be used with:

1) the CSV parser

2) the flight path map generator

3) telemetry visualizations

Airdata KML output files can be used with:

1) the KML parser for geographic mapping

Airdata GPX output files can be used with:

1) the GPX parser for navigation-style flight reconstruction

This workflow allows investigators to move from a raw drone log to clear visual and analytical output without reverse-engineering proprietary formats themselves.

Configuration

DroneXtract also provides configuration options that allow you to tailor the analysis to your specific investigation. These settings are stored as environment variables in the .env file and control how much data is processed and how sensitive the analysis should be.

TELEMETRY_VIS_DOWNSAMPLE

This value controls how much telemetry data is sampled for visualization. Higher values reduce detail but improve performance, which is useful when working with very large flight logs.

FLIGHT_MAP_DOWNSAMPLE

This setting affects how many data points are used when generating the flight path map. It helps balance visual clarity with processing speed.

ANALYSIS_DOWNSAMPLE

This value controls the amount of data used during integrity analysis. It allows investigators to focus on meaningful changes without being overwhelmed by noise.

ANALYSIS_MAX_VARIANCE

This defines the maximum acceptable variance between minimum and maximum values during analysis. If this threshold is exceeded, it may indicate abnormal behavior, data corruption, or possible tampering.
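
For reference, a .env file using these variables might look like the following; the values are illustrative placeholders, not recommended defaults.

TELEMETRY_VIS_DOWNSAMPLE=10
FLIGHT_MAP_DOWNSAMPLE=10
ANALYSIS_DOWNSAMPLE=5
ANALYSIS_MAX_VARIANCE=15.0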

Together, these settings give investigators control over both speed and precision, allowing DroneXtract to be effective in fast-paced operational environments and detailed post-incident forensic examinations.

Summary

Drone forensics is still a developing field, but its importance is growing rapidly. As drones become more capable, the need to analyze them effectively will only increase. Tools like DroneXtract show how much valuable information can be recovered from devices that were once considered disposable.

Looking ahead, it would be ideal to see fast, offline forensic tools designed specifically for battlefield conditions. Being able to quickly extract flight data, locations, and operational details from captured enemy drones could provide immediate tactical advantages. Drone forensics may soon become as essential as traditional digital forensics on computers and mobile devices.

The post Digital Forensics: Drone Forensics for Battlefield and Criminal Analysis first appeared on Hackers Arise.

Observo AI, Real Time Data Pipelines, and the Future of the Autonomous SOC: Rethinking Security Data from the Ground Up

8 September 2025 at 08:30

This morning, SentinelOne entered an agreement to acquire Observo AI—a deal that we believe will prove to be a major accelerator for our strategy and a key step forward in realizing our vision.

Data pipelines are key to any enterprise IT transformation. On-premise and cloud-native alike, data pipelines are the modern-day router for how all information technology runs. This is especially pronounced today, when highly sanitized, critically contextualized data must be made accessible to LLM-based systems to truly unlock an agentic AI future. At the same time, enterprises need to move data out of legacy systems and into scalable, ideally real-time technologies. A robust data pipeline that can move data securely from any source to any destination, across Microsoft Azure, AWS, and GCP and even between them, is critical to modernizing any IT environment. Modern data pipelines don’t stop at routing data: they filter, transform, and enrich it inline and in real time, an imperative for data efficiency and cost optimization.

Simply put, moving data freely between systems is a huge technological advantage for any enterprise, especially right now.

This is why we acquired Observo AI, the market leader in real-time data pipelines. It’s a deal that we believe will have huge benefits for customers and partners alike.

We want to make it clear that we pledge to continue offering Observo’s data pipeline to all enterprises, whether they’re SentinelOne Singularity customers or not. We support complete freedom and control, helping all customers own, secure, and route their data anywhere they want.

For security data specifically, data pipelines are the heart that pumps the blood, unifying enterprise security data from all possible sources: end products and controls, security event aggregators, data lakes, and any custom source, on-premise or cloud-based. As I mentioned above, the data pipeline juncture is a critical one for the migration of data.

The best security comes from the most visibility. Observo AI will give SentinelOne the ability to bring data instantly into our real-time data lake, allowing for unprecedented outcomes for customers and marking a huge leap forward toward unified, real-time, AI-driven security, one step closer to supervised autonomous security operations.

Data pipelines and the state of security operations

Today’s security operations teams don’t suffer from a lack of data. They suffer from a lack of usable data, from latency, and from a lack of relevant context.

The major culprit? Legacy data pipelines that weren’t built for modern, AI-enabled SOCs and today’s ever expanding attack surface. The result is increased cost, complexity, and delay—forcing compromises that reduce visibility, limit protection and slow response.

Enter Observo AI—a modern, AI-native data pipeline platform that gives enterprises full control over their data flows in real time.

With the acquisition of Observo AI, SentinelOne will address customers’ most critical security data challenges head-on.

Observo AI delivers a real-time data pipeline that ingests, enriches, summarizes, and routes data across the enterprise—before it ever reaches a SIEM or data lake. This empowers customers to dramatically reduce costs, improve detection, and act faster across any environment. As a result, we can create significant new customer and partner value by allowing for fast and seamless data routing into our AI SIEM, or any other destination.

It’s an acquisition and decision many months in the making—the result of an exhaustive technical evaluation, deep customer engagement, and a clear conviction grounded in the same disciplined approach we apply to all of our M&A activities. When you are thorough and do the hard work to identify the best possible technology, you can shorten the time to market and improve customer outcomes. And, in this case, the conclusion was clear: Observo AI is the best real time data pipeline platform on the market, by far.

Growing data, growing complexity and growing attack surface

As data volumes grow across endpoints, identity, cloud, GenAI apps, intelligent agents, and infrastructure, the core challenge is no longer about collection. It’s about control. Security teams need to act faster—across an ever expanding attack surface—with greater context and lower overhead. But today’s data pipelines are bottlenecks—built for batch processing, limited in visibility, static, and too rigid for modern environments.

To move security toward real autonomy, we need more than detection and response. We need a streaming data layer that can ingest, optimize, enrich, correlate and route data intelligently and at scale.

By joining forces with Observo AI, SentinelOne can deliver a modern, AI-native data platform that gives enterprises full control over their data flows in real time—allowing for fast and seamless data routing into our SIEM, or any other destination.

It also strengthens the value we’re already delivering with Singularity and introduces a new model for reducing data costs and improving threat detection, across any SIEM or data lake—helping customers lower data overhead, improve signal quality, and extract more value from the data they already have, no matter where it lives.

Legacy data pipelines give way to the next generation

Yesterday’s security data pipelines weren’t designed for autonomous systems and operations. They were built for manual triage, static rules, and post-ingestion filtering. As organizations move toward AI-enabled SOCs, that model breaks down.

Data today is:

  • Duplicated and noisy
  • Delayed in enrichment and normalization
  • Inconsistent across environments
  • Expensive to ingest and store
  • Dynamic in nature while solutions are rigid

The result is that too many security operations teams are forced to compromise: compromise for cost, for speed, for complexity, for innovation, and, worst of all, compromise on the right visibility at the right time.

Observo AI is defining the next generation of data pipelines that change that by acting as an AI-driven streaming control plane for data. It operates upstream of SIEMs, data lakes, and AI engines—applying real-time enrichment, filtering, routing, summarizing, and masking before the data reaches storage or analysis. All this is achieved utilizing powerful AI models that continuously learn from the data.

It doesn’t just process more data. It delivers better data, faster, and with lower operational overhead.

The result is that teams can now harness the full benefit of all data in the SOC without compromise.

Observo AI’s real-time data pipeline advantage

Observo AI ingests data from any source—on-prem, edge, or cloud—and routes data to any destination, including SIEMs, object stores, analytics engines, and AI systems like Purple AI.

Key capabilities include:

  • Open integration – Supports industry standards and formats like OCSF, OpenTelemetry, JSON, and Parquet—ensuring compatibility across diverse ecosystems.
  • ML-based summarization and reduction – Uses machine learning to reduce data volume by up to 80%, without losing critical signal.
  • Streaming anomaly detection – Detects outliers and abnormal data in flight, not after the fact.
  • Contextual enrichment – Adds GeoIP, threat intelligence, asset metadata, and scoring in real time.
  • Field-level optimization – Dynamically identifies and drops redundant or unused fields based on usage patterns.
  • Automated PII redaction – Detects and masks sensitive data across structured and semi-structured formats while streaming.
  • Policy-based routing – Supports conditional logic to forward specific subsets of data—such as failures, high-risk activity, or enriched logs—to targeted destinations.
  • Agentic pipeline interface – Enables teams to generate and modify pipelines through natural language, not just static configuration files.
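
To make the upstream model concrete, here is a conceptual sketch in Go of a pipeline stage that sits between collectors and the SIEM: it drops routine noise, masks an obvious PII pattern, and routes high-risk events to a different destination than bulk data. This is purely illustrative and is not Observo AI’s API or implementation.

// pipeline_stage.go - conceptual sketch of an upstream pipeline stage: filter
// noise, mask obvious PII, and route high-risk events separately, all before
// anything reaches the SIEM. Illustrative only, not Observo AI's API.
package main

import (
    "fmt"
    "regexp"
    "strings"
)

var emailRe = regexp.MustCompile(`[\w.+-]+@[\w.-]+\.\w+`) // crude PII pattern

type event struct {
    Severity string
    Message  string
}

// process returns the destination for an event, or "" if it should be dropped.
func process(e *event) string {
    if strings.Contains(e.Message, "heartbeat") {
        return "" // drop routine noise before it is ever stored
    }
    e.Message = emailRe.ReplaceAllString(e.Message, "[REDACTED]") // mask in flight
    if e.Severity == "high" {
        return "siem" // high-risk events go to the SIEM in full
    }
    return "datalake" // everything else lands in cheaper storage
}

func main() {
    events := []event{
        {"low", "heartbeat from agent 42"},
        {"high", "failed login for alice@example.com from 203.0.113.7"},
        {"low", "config sync completed"},
    }
    for _, e := range events {
        if dest := process(&e); dest != "" {
            fmt.Printf("-> %-8s %s\n", dest, e.Message)
        }
    }
}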

What We Learned from Evaluation and Customers

Prior to today’s announcement, we conducted a hands-on technical evaluation of the broader data pipeline landscape. We started with nine vendors and down-selected to four based on architecture, maturity, and extensibility.

To evaluate potential technology OEM partners, we conducted a structured scoring process across 11 technical dimensions, each representing a critical capability for scalable, secure, and high-performance data ingestion and transformation.

The evaluation criteria included:

  • Scalable data ingestion
  • On-prem and cloud collection support
  • Monitoring and UX
  • Speed of integration
  • Breadth of pre-built security integrations
  • OCSF mapping and normalization
  • Data transformations and enrichment capabilities
  • Filtering and streaming support
  • Sensitive data detection (PII)
  • Anomaly detection
  • Vendor lock-in mitigation (e.g., open formats, agnostic routing)

Each category was scored using a 3-tier rubric:

  • ✅ Exceeds Expectations – mature, production-grade capability
  • ⚠ Meets Expectations – functionally sufficient, may require optimization or future roadmap improvements
  • ❌ Does Not Meet Expectations – unsupported or significantly limited

Final vendor scores were calculated by normalizing across all 11 categories, enabling a comparative ranking based on technical depth, deployment readiness, and extensibility. Based on this methodology, Observo AI emerged as the clear front-runner, scoring highest across nearly every category and outperforming all other solutions in performance, UX, protocol support, and time-to-value. It wasn’t close.

We also conducted dozens of SentinelOne customer interviews across industries—ranging from high-scale technology firms to Fortune 500 enterprises. These organizations often operate at ingest volumes in the tens of terabytes per day, with clear plans to scale past 100+ TB/day.

Across those conversations, one theme was consistent: Observo AI was the best—the only next-generation, highly scalable data pipeline solution that was in serious consideration.

Other solutions were seen as either too rigid, too complex to manage, or lacking in automation and scale. Some were viewed as solid first-generation attempts—good for basic log shipping, but not built for real-time, AI-enabled operations.

Observo AI stood out for its ease of deployment, intuitive interface, rapid time to ROI, and overall maturity across cost optimization, AI support, and customer experience. As Lucas Moody, CISO of Alteryx, put it: “Observo AI solves our data sprawl issue so we can focus our time, attention, energy, and love on things that are going to matter downstream.”

In summary

  • Legacy data pipelines built for another era are forcing compromises that reduce visibility, limit protection and slow response for security operations teams managing today’s SOC
  • Observo AI is the defining AI-native, real-time data pipeline that ingests, enriches, summarizes, and routes data across the enterprise—before it ever reaches a SIEM or data lake
  • With Observo AI we will help customers dramatically reduce costs, improve detection, and act faster across any environment
  • This will be an accelerant to our AI SIEM strategy and our data solutions—creating significant new customer and partner value and bringing the autonomous SOC one step closer to reality

We’re excited to welcome the Observo AI team to SentinelOne, and even more excited about what this unlocks for our customers—a data pipeline built for the age of AI and autonomous security operations.

For any customer looking to route, ingest, or optimize any type of enterprise data, Observo AI, with its vast integration ecosystem and ML-driven pipelines, is the best technology on the market and the fastest to deploy to start seeing real outcomes now.

How to setup your own Basic Telemetry Lab with Cisco XR

By: Jo
22 February 2021 at 10:29
In this article, we will be talking about setting up a basic Lab for testing Telemetry on a Cisco NC55XX router. Telemetry – “Tele” means remote, “metry” means metrics or measurements, together this word simply

Continue reading: How to setup your own Basic Telemetry Lab with Cisco XR
