In a previous article, we explored some of the ARM assembly instructions. Today, we will delve into the practical application of the ADD instruction. By leveraging the GNU Debugger (GDB), we will explore how to analyze and manipulate this instruction to gain deeper insight into the ARM architecture.
Prepare an Environment
Before starting to learn assembly, we should prepare an environment. For possible ways to do so, you can check out this article. I’ll be using a Raspberry Pi with 32-bit Raspbian OS.
To check if your system is running a 32-bit userland, run:
raspberrypi> getconf LONG_BIT
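A 32-bit userland prints 32; a 64-bit one prints 64.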
Next, check what architecture your binaries are:
raspberrypi> file /bin/bash
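The output will look something like this (illustrative; exact details vary by image):

/bin/bash: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked, interpreter /lib/ld-linux-armhf.so.3

A 32-bit binary combined with a 64-bit kernel (uname -m reporting aarch64) is exactly the setup described below.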
In the case above, you can see a pretty common setup on modern Raspberry Pis: Raspbian OS is 32-bit, but it uses a 64-bit kernel. This is an optimal installation, because you get 32-bit compatibility for all your applications and libraries, plus better hardware support from the 64-bit kernel.
ADD Instruction
This instruction adds an immediate value to a register value and writes the result to the destination register.
The syntax is as follows:
ADD{S}{<c>}{<q>} {<Rd>,} <Rn>, #<const>
Where S – if present, the instruction updates the flags (we’ll talk about flags later); <Rd> – the destination register; <Rn> – the register holding the first operand; <const> – the immediate value to be added to the value obtained from <Rn>; <c> and <q> – optional assembler fields.
Let’s move on to the practical stage and write the code. I’ll create a file instructions.s and open it with Vim.
The beginning of the file is as usual – declare the “_start” label globally. I’ve explained this step in more detail in the following article. Also, I’ll add a comment with the add instruction syntax for ease of learning.
First of all, we need a register (<Rn>) holding the value to which our constant (#<const>) will be added. We’re going to set up a general-purpose register with the mov instruction.
As you might already remember from my previous article, general-purpose registers are r0-r12.
To set up a general-purpose register with a value of our choice, we can use the following command:
mov r0, #7
Where mov – the instruction that copies a value into a register; r0 – the destination register, where we’re going to store a temporary value; #7 – the pound sign signifies that the following value is a constant. For this example, I’ve used the number 7; you can choose any value you want.
After that, we’re good to go with our add instruction.
add r1, r0, #3
Where r1 – the destination register where we’re going to store the sum of 7 + 3; r0 – our first operand, holding the value 7; #3 – the constant value that will be added to r0. I’ve used the value 3.
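Put together, instructions.s looks roughly like this (a sketch; the exit syscall at the end is my addition so the program terminates cleanly instead of faulting after the add):

.global _start

_start:
    /* add{s}{<c>}{<q>} {<rd>,} <rn>, #<const> */
    mov r0, #7        @ first operand: r0 = 7
    add r1, r0, #3    @ r1 = r0 + 3 = 10

    mov r7, #1        @ exit syscall number on 32-bit ARM Linux (my addition)
    mov r0, #0        @ exit status 0
    svc #0            @ hand control to the kernel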
At this point, let’s assemble this code and see in gdb (GNU Debugger) what is happening.
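One way to assemble and link in a single step is with GCC (the output name instructions is my choice):

raspberrypi> gcc -g -nostdlib -static -o instructions instructions.s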
Where -g – include debugging information; -nostdlib – don’t link with the standard library (since we’re not using it); -static – create a static executable.
Now we can open the executable with GDB, but before that, I’ll install GEF (GDB Enhanced Features), which provides automatic register monitoring, color-coded output, and more.
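At the time of writing, GEF’s README suggests a one-line install (verify the current command on the project page before piping anything into your shell):

raspberrypi> bash -c "$(curl -fsSL https://gef.blah.cc/sh)"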
First of all, I’m going to disable displaced stepping to avoid some possible errors in GDB.
(gdb) set displaced-stepping off
After that, we can set a breakpoint at the _start label so execution stops there:
(gdb) break _start
Run our program:
(gdb) run
Here we can see that the program started execution but stopped in _start because of the breakpoint.
Let’s check the value of all registers:
(gdb) info registers
They are all zero at this point. Let’s step through one assembly instruction:
(gdb) stepi
And check the values of only registers r0 and r1:
(gdb) info registers r0 r1
And here we can see that register r0 already stores the value 0x7 or 7 in decimal.
If we step through the next assembly instruction and check the register value again with the same commands, we can see the value of the r1 register.
The value of r1 is 0xa, or 10 in decimal, just like we programmed.
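The session at this point looks roughly like this (illustrative output; GEF’s display is more verbose):

(gdb) info registers r0 r1
r0             0x7                 7
r1             0xa                 10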
Summary
In this article, we take a look at the ADD instruction in ARM assembly language. We walk through assembling the code with GCC and using GDB (GNU Debugger) to monitor execution and inspect register values, demonstrating how the results reflect the programmed additions. Understanding such low-level behavior is essential in exploit development, where manipulating register values and controlling program flow—such as redirecting execution or crafting return-oriented programming (ROP) chains—depends on precise knowledge of how instructions like ADD affect the system state.
Did you know that there are approximately 12.52 million credit card users in Australia, along with 43.77 million actively issued debit cards? These figures reflect Australia’s heavy reliance on digital payments and card-based transactions for everyday purchases and online commerce. However, with this widespread adoption comes an equally significant risk which is the growing threat of data breaches and payment fraud.
As digital transactions continue to grow, so do the challenges of protecting sensitive customer data. This is where PCI DSS (Payment Card Industry Data Security Standard) compliance becomes essential for Australian businesses.
In today’s article, we are going to learn how PCI DSS compliance protects businesses from data breaches. So, if you are wondering why you should invest in PCI DSS compliance in Australia and how it can safeguard your organization, keep reading to find out.
A brief introduction to PCI DSS
PCI DSS is a global data security framework that protects businesses handling cardholder data (CHD) from data breaches, fraud, and identity theft. It was first introduced in December 2004 by its founding members: American Express, Discover, JCB, MasterCard, and Visa International.
PCI DSS applies to any and every organization, regardless of size, that accepts, processes, stores, or transmits payment card data. Its framework consists of 12 core PCI DSS requirements grouped into six control objectives, which include:
Building and maintaining a secure network: Implementing firewalls and secure configurations.
Protecting cardholder data: Encrypting sensitive data during transmission.
Maintaining a vulnerability management program: Regularly updating anti-virus software and conducting vulnerability scans.
Implementing strong access control measures: Limiting access to cardholder data based on job responsibilities.
Regular monitoring and testing of networks: Performing routine security assessments.
Maintaining an information security policy: Establishing a documented security strategy.
The latest version, PCI DSS v4.0, was released on March 31, 2022, introducing enhanced security measures to address evolving cyber threats. These updates include increased flexibility for businesses and stronger authentication requirements, ensuring better protection in today’s dynamic digital landscape.
You may also check our latest YouTube video on PCI DSS 4.0 requirements which explains the changes from version 3.2.1 to 4.0.
The growing threat of data breaches in Australia
As Australia’s digital landscape continues to expand, the frequency and severity of data breaches are becoming increasingly concerning, posing a growing threat to businesses and individuals alike.
In the first quarter of 2024 alone, around 1.8 million accounts were leaked, representing a 388% increase in compromised user accounts. This underscores the severity of data breaches driven by rapidly expanding technology and compliance negligence.
The financial implications of these breaches are profound. According to IBM’s annual Cost of a Data Breach Report 2024, the average cost of a data breach in Australia is estimated at AUD $4.26 million, which is said to have increased by 27% since 2020. These breaches not only affect an organization’s financial stability but also damage its reputation and erode customer trust. As cybercriminals continue to evolve their tactics, businesses must prioritize strong cybersecurity measures to mitigate these risks.
This is where the PCI DSS comes into play. While PCI DSS is not mandated by the Australian government, it is considered an important industry standard enforced by payment card brands. Achieving PCI DSS compliance ensures strong protection of sensitive payment data, reducing the risk of breaches and associated penalties. Moreover, compliance demonstrates your commitment to cybersecurity, boosting customer confidence in your business.
How PCI DSS protects your business from data breaches
PCI DSS provides a comprehensive framework that helps businesses defend against data breaches and payment fraud by implementing security measures specifically designed for handling payment card data. Here’s how PCI DSS compliance safeguards Australian businesses:
1. Encryption of payment card data
One of the key requirements of PCI DSS is the encryption of cardholder data both in transit and at rest. This ensures that even if cybercriminals manage to intercept the data, they will not be able to decrypt it and misuse it. By implementing robust encryption, businesses can significantly reduce the likelihood of their payment card data being exposed during a breach.
2. Secure network architecture
PCI DSS mandates businesses to establish and maintain a secure network with firewalls and other security configurations to protect against unauthorized access. By isolating payment card systems from the rest of the corporate network, businesses can minimize vulnerabilities and reduce the risk of data breaches.
3. Regular vulnerability scanning and penetration testing
PCI DSS requires ongoing vulnerability scans and penetration testing to identify and remediate potential security flaws before they can be exploited. This proactive approach ensures that systems are continuously evaluated for weaknesses and can quickly adapt to emerging cyber threats.
4. Access control and authentication
PCI DSS enforces stringent access control measures, ensuring that only authorized personnel can access sensitive payment card data. Through multi-factor authentication (MFA) and role-based access controls, businesses can limit exposure to potential breaches by restricting access based on job responsibilities.
5. Monitoring and logging
Constant monitoring and logging of payment systems are essential for detecting suspicious activities and mitigating data breaches. PCI DSS requires businesses to log all access and activities involving payment card data, which can be used to identify anomalies and investigate potential breaches swiftly.
6. Security awareness and staff training
Employees are often the weakest link in cybersecurity. PCI DSS emphasizes the importance of regular security training to ensure staff members understand the latest threats and best practices for safeguarding payment data. This fosters a culture of security within the organization and helps prevent human errors that could lead to breaches.
To Conclude
The rising threat of data breaches in Australia underscores the critical importance of robust cybersecurity practices. For businesses handling payment card data, PCI DSS compliance is a vital step toward safeguarding sensitive information, building customer trust, and mitigating financial and reputational risks. By adopting this globally recognized framework, organizations can strengthen their security posture and stay resilient against evolving cyber threats.
DNSRecon is a DNS scanning and enumeration tool written in Python, which allows you to perform different tasks, such as enumeration of standard records for a defined domain (A, NS, SOA, and MX) and top-level domain expansion for a defined domain.
With this graph-oriented user interface, the different records of a specific domain can be observed, classified and ordered in a simple way.
Install
git clone https://github.com/micro-joan/dnsrecon-gui
cd dnsrecon-gui/
chmod +x run.sh
./run.sh
After executing the application launcher, all components need to be installed. The launcher will check them one by one and, if any component is missing, it will show you the command you must enter to install it:
Use
When the tool is ready to use, the installer will give you a URL that you must open in a private browser window. Every time you do a search, you will have to open a new private window or clear your browser cache to refresh the graphics.
This toolkit contains materials that can be potentially damaging or dangerous for social media. Refer to the laws in your province/country before accessing, using, or in any other way utilizing this in a wrong way.
This Tool is made for educational purposes only. Do not attempt to violate the law with anything contained here. If this is your intention, then Get the hell out of here!
Extensible Azure Security Tool (later referred to as EAST) is a tool for assessing Azure and, to some extent, Azure AD security controls. The primary use case of EAST is security data collection for evaluation in Azure assessments. This information (JSON content) can then be used in various reporting tools, which we use to further correlate and investigate the data.
Installation now accounts for the use of Azure Cloud Shell's updated version with regard to dependencies (Cloud Shell now has Node.js v16 installed)
Checking of Databricks cluster types as per advisory
Audits Databricks clusters for potential privilege elevation. This control typically requires permissions on the Databricks cluster.
content.json now has key- and content-based sorting. This enables delta checks with git diff HEAD^1 ¹, as content.json has a predetermined order of results
¹Word of caution: if you want to check deltas of content.json, then content.json will need to be "unignored" from .gitignore, exposing results to any upstream you might have configured.
Use this feature with caution, and ensure you don't have a public upstream set for the branch you are using this feature on
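For example, a delta check against the previous commit could then be run as (assuming content.json is tracked in the branch):

git diff HEAD^1 -- content.json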
Changed programming patterns to avoid possible race conditions with larger datasets. This mostly means changing var to let in for await -style loops
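A minimal illustration of why this matters (pages, evaluate, and report are hypothetical stand-ins, not EAST's actual code):

// var is function-scoped: every async callback closes over one shared binding
for await (const page of pages) {
  var result = await evaluate(page);
  setTimeout(() => report(result)); // may observe a later iteration's value
}

// let is block-scoped: each iteration gets its own binding
for await (const page of pages) {
  let result = await evaluate(page);
  setTimeout(() => report(result)); // always reports this iteration's value
}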
Important
Current status of the tool is beta
Fixes, updates etc. are done on "Best effort" basis, with no guarantee of time, or quality of the possible fix applied
We do some additional tuning before using EAST in our daily work, such as applying various run and environment restrictions, besides familiarizing ourselves with the environment in question. Thus, we currently recommend that EAST is run only in test environments, and with read-only permissions.
All the calls in the service are largely to Azure cloud IPs, so it should work well in hardened environments where outbound IP restrictions are applied. This reduces the risk of this tool containing malicious packages which could "phone home" without also having C2 in Azure.
Essentially, running it in read-only mode reduces a lot of the risk associated with possibly compromised NPM packages (Google "compromised NPM")
Bugs etc.: you can protect your environment against certain mistakes in this code by running the tool with Reader-only permissions
A lot of the code is "AS IS": meaning it has so far served only the purpose of producing a certain result; a lot of cleaning up and modularizing remains to be finished
There are no tests at the moment, apart from certain manual checks that are run after changes to main.js and the more advanced controls.
The control descriptions at this stage are not the final product, so feedback on them, while appreciated, is not the focus of the tooling right now
As the name implies, we use it as a tool to evaluate environments. It is not meant to be run unmonitored for the time being, and should not be run in any internet-exposed service that accepts incoming connections.
Documentation could be described as incomplete for the time being
EAST is mostly focused on PaaS resources, as most of our Azure assessments focus on this resource type
No input sanitization is performed on launch params, as it is always assumed that the input of these parameters is controlled. That being said, the tool makes extensive use of exec() – while I have not reviewed all paths, I believe that achieving shell command execution through them is trivial. This tool does not assume hostile input, thus the recommendation is that you don't paste launch arguments into the command line without reviewing them first.
Tool operation
Dependencies
To reduce the amount of code, the following dependencies are used for operation and aesthetics (kudos to the maintainers of these fantastic packages)
Other dependencies for running the tool (if you are planning to run this in Azure Cloud Shell, you don't need to install the Azure CLI):
This tool does not include or distribute the Microsoft Azure CLI, but rather uses it when it has been installed on the source system (such as Azure Cloud Shell, which is the primary platform for running EAST)
Azure Cloud Shell (BASH) or applicable Linux Distro / WSL
// Example control logic: mark the result healthy when the
// registry's admin user is disabled
if (item.properties?.adminUserEnabled == false) {
    returnObject.isHealthy = true
}
Advanced
Advanced controls include checks beyond the initial ARM object, often invoking new requests to get further information about the resource in scope and its relation to other services.
Example: Role Assignments
Besides checking the role assignments of the subscription, an additional check is performed via Azure AD Conditional Access reporting for MFA, verifying that privileged accounts are not protected only by passwords (SPNs with client secrets)
Azure Data Factory pipeline mapping combines pipelines -> activities -> data targets, and then checks the run history of those activities for secrets leaked into the logs.
Composite
Composite controls combine two or more control results from the pipeline in order to form one or more new controls. Using composites solves two use cases for EAST
You can't guarantee the order of control results being returned in the pipeline
You need to return more than one control result from a single check
Get alerts from Microsoft Cloud Defender on the subscription check
Form new controls per resourceProvider for the alerts (a sketch of this fan-out follows below)
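A hypothetical sketch of that fan-out in Node.js (function and field names are my own, not EAST's actual implementation):

// Hypothetical composite: fan one Defender alert list out into one
// control result per resourceProvider. Field names are illustrative.
function compositeAlertsByProvider(alerts, providersInScope) {
  // Seed the map so providers with no alerts still produce a healthy control
  const byProvider = new Map(providersInScope.map((p) => [p, []]));
  for (const alert of alerts) {
    const provider = alert.resourceProvider || "unknown";
    if (!byProvider.has(provider)) byProvider.set(provider, []);
    byProvider.get(provider).push(alert);
  }
  return [...byProvider.entries()].map(([provider, items]) => ({
    control: `defenderAlerts-${provider}`,
    isHealthy: items.length === 0, // healthy only when no alerts exist
    findings: items,
  }));
}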
Reporting
EAST is not focused on providing automated report generation; it mostly provides JSON files with control and evaluation status. The idea is to use separate tooling to create reports, which is fairly trivial to automate via markdown creation scripts and tools such as Pandoc
While the focus is not on reporting, this repo includes example automation for report creation with Pandoc to ease reading of the results in a single-document format.
Running EAST scan
This part is a guide on how to run the scan either on BASH on Linux, or BASH on Azure Cloud Shell (obviously Cloud Shell is Linux too, but it does not require that you have your own Linux box)
⚠️If you are running the tool in Cloud Shell, you might need to reapply some of the installations again as Cloud Shell does not persist various session settings.
Detailed prerequisites (this is if you opted not to do the "fire and forget" version)
Prerequisites
git clone https://github.com/jsa2/EAST --branch preview
cd EAST; npm install
Pandoc installation on cloud shell
# Get pandoc for reporting (first time only)
wget "https://github.com/jgm/pandoc/releases/download/2.17.1.1/pandoc-2.17.1.1-linux-amd64.tar.gz"
tar xvzf "pandoc-2.17.1.1-linux-amd64.tar.gz" --strip-components 1 -C ~
Installing pandoc on distros that support APT
# Get pandoc for reporting (first time only)
sudo apt install pandoc
Login Az CLI and run the scan
# Relogin is required to ensure token cache is placed on session on cloud shell
az account clear
az login
# cd EAST
# replace the subId below with your subscription ID!
subId=6193053b-408b-44d0-b20f-4e29b9b67394
node ./plugins/main.js --batch=10 --nativescope=true --roleAssignments=true --helperTexts=true --checkAad=true --scanAuditLogs --composites --subInclude=$subId
Generate report
cd EAST; node templatehelpers/eastReports.js --doc
If you want to include all Azure Security Benchmark results in the report
cd EAST; node templatehelpers/eastReports.js --doc --asb
Share relevant controls across multiple environments as a community effort
Company use
Companies have the possibility to develop company-specific controls which apply to company-specific work. Companies can then control these implementations by deciding to share, or not share, them based on the operating principles of that company.
Non-IPR components
Code logic and functions are under the MIT license. Since code logic and functions are already based on open-source components and vendor APIs, it does not make sense to restrict something that is already based on open source
If you use this tool as part of your commercial effort, we only require that you follow the very relaxed terms of the MIT license
Uses the rich and maintained context of Microsoft Azure CLI login & commands with Node.js control flow, which supplies enhanced REST requests and maps results to a schema.
✅Using the Node.js runtime as orchestrator takes advantage of Node's asynchronous nature, allowing batching of requests. Batching utilizes the full extent of Azure Resource Manager's speed.
✅Compared to running requests one by one, the speedup can be up to 10x when Node executes a batch of requests instead of a single request at a time (sketched below)
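A minimal sketch of the batching idea (fetchJson and the URL list are hypothetical stand-ins, not EAST's actual code):

// Sequential: each request waits for the previous one to finish
async function runSequential(urls, fetchJson) {
  const results = [];
  for (const url of urls) {
    results.push(await fetchJson(url));
  }
  return results;
}

// Batched: up to `batch` requests are in flight at once (cf. --batch=10)
async function runBatched(urls, fetchJson, batch = 10) {
  const results = [];
  for (let i = 0; i < urls.length; i += batch) {
    const slice = urls.slice(i, i + batch);
    results.push(...(await Promise.all(slice.map(fetchJson))));
  }
  return results;
}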
Parameters reference for the scan (default if undefined: no values, for all of the below):

… – clears tokens in the session folder; use this if you get authorization errors, or have just changed to another az login account (use az account clear if you want to clear the AZ CLI cache too)
--tag – filter all results at the end based on a single tag, e.g. --tag=svc=aksdev
--ignorePreCheck – use this option with browser-delegated tokens
--helperTexts – will append text descriptions from general to manual controls
--reprocess – will update results in the existing content.json; useful for incremental runs
Parameters reference for example report:

node templatehelpers/eastReports.js --asb

--asb – gets all ASB results available to users (default if undefined: no values)
--policy – gets all Policy results available to users (default if undefined: no values)
--doc – prints the pandoc string for export to the console (default if undefined: no values)
(Highly experimental) Running in restricted environments where only browser use is available
⚠️Detects principals in privileged subscription roles protected only by password-based single-factor authentication.
Checks for users without MFA policies applied for a set of conditions
Checks for service principals protected only by a password (as opposed to using a certificate credential, workload federation, and/or a workload identity CA policy)
An unused credential on an application can result in a security breach. While it's convenient to use password secrets as a credential, we strongly recommend that you use x509 certificates as the only credential type for getting tokens for your application
The following methods work for contributing for the time being:
Submit a pull request with code / documentation change
Submit an issue
An issue can be a:
⚠️Problem (issue)
Feature request
❔Question
Other
By default, EAST tries to work with the current dependencies – introducing new (direct) dependencies is not encouraged. If such a vital dependency is introduced, then review the licensing of that dependency and update readme.md – dependencies
There is nothing to prevent you from creating your own fork of EAST with your own dependencies