A Hong Kong industry group has urged the city’s regulators to ease aspects of the Organisation for Economic Co-operation and Development’s (OECD) crypto reporting rules ahead of their implementation.
Association Pushes To Soften CARF Requirements
On Monday, the Hong Kong Securities & Futures Professionals Association (HKSFPA) released a response to the implementation of the OECD’s Crypto Asset Reporting Framework (CARF) and the related amendments made to Hong Kong’s Common Reporting Standard (CRS).
In its official response, the association shared concerns about certain elements of the CARF and CRS amendments, warning that they could create operational and liability risks for market participants.
Notably, the HKSFPA affirmed that it mostly supports the proposals, but urged regulators to ease the record-keeping requirements for dissolved entities. “We generally agree with the six-year retention period to align with existing inland revenue and CRS standards,” they explained, “but we have concerns regarding the obligations placed on individuals post-dissolution.”
The industry group argued that holding directors or principal officers personally liable for record-keeping after dissolution poses significant practical challenges, noting that former officers of dissolved companies may lack the resources, infrastructure, and legal standing to maintain sensitive personal data of former clients.
As a result, they suggested the government “allow for the appointment of a designated third-party custodian (such as a liquidator or a licensed corporate service provider) to fulfill this obligation, rather than placing indefinite personal liability and logistical burden on former individual officers.”
Moreover, the association cautioned against the proposed uncapped per-account penalties for minor technical errors, asserting that these could lead to “disproportionately astronomical fines for systemic software errors affecting thousands of accounts where there was no intent to defraud.”
To solve this, they proposed a “reasonable cap” on total penalties for unintentional administrative errors or first-time offenses to ensure that the per-account calculation “is reserved for cases of willful negligence or intentional evasion.”
Additionally, the group suggested a “lite” registration or a simplified annual declaration process for Reporting Crypto-Asset Service Providers (RCASPs) that anticipate filing Nil Returns, to reduce administrative costs while still satisfying the Inland Revenue Department’s oversight requirements.
Hong Kong’s Crypto Hub Efforts
Hong Kong is among the 76 markets committed to implementing the upcoming crypto reporting framework, the OECD’s new global standard for exchanging tax information on crypto assets.
The CARF is designed to prevent tax evasion by bringing crypto users across borders under global tax transparency rules, similar to the OECD’s existing CRS for traditional finance. Hong Kong will be among the 27 jurisdictions that will begin their first cross-border exchanges of crypto reporting data in 2028.
Over the past few years, Hong Kong’s financial authorities have been actively working to develop a comprehensive framework that supports the expansion of the digital assets industry, part of the city’s strategy to become a leading global crypto hub.
As reported by Bitcoinist, the city is exploring rules to allow insurance companies to invest in cryptocurrencies and the infrastructure sector. The Hong Kong Insurance Authority recently proposed a framework that could channel insurance capital into cryptocurrencies and stablecoins.
Moreover, the Hong Kong Monetary Authority (HKMA) is expected to grant the first batch of stablecoin issuer licenses in the first few months of the year. The HKMA enacted the Stablecoins Ordinance in August, which directs any individual or entity seeking to issue a stablecoin in Hong Kong, or any Hong Kong Dollar-pegged token, to obtain a license from the regulator.
Multiple companies have applied for the license, with over 30 applications filed in 2025, including logistics technology firm Reitar Logtech and the overseas arm of Chinese mainland financial technology giant Ant Group.
In an era where technology plays a core part in everything, fintech and blockchain have emerged as transformative forces for businesses. They not only reshape the financial landscape but also promise unparalleled transparency, efficiency, and security as the world moves toward digital currency. That’s why staying updated on SOX compliance in blockchain and fintech is more important than ever.
As per the latest statistics from DemandSage, there are around 29,955 fintech startups in the world, of which over 13,100 are based in the United States. This shows how businesses are increasingly embracing technology to innovate and address evolving financial needs. It also highlights the global shift towards digital-first solutions, driven by a demand for greater accessibility and efficiency in financial services.
On the other hand, blockchain technology, also known as distributed ledger technology (DLT), is currently valued at approximately USD 8.70 billion in the United States and is estimated to grow to an impressive USD 619.28 billion by 2034, according to data from Precedence Research.
However, as this digital revolution continues, businesses embracing these technologies must also prioritize compliance, security, and accountability. This is where SOX (Sarbanes-Oxley) compliance plays an important role. In today’s article we are going to explore the reasons SOX compliance is crucial for the fintech and blockchain industries. So, let’s get started!
Understanding SOX compliance
The Sarbanes-Oxley Act (SOX), passed in 2002, aims to enhance corporate accountability and transparency in financial reporting. It applies to all publicly traded companies in the U.S. and mandates strict adherence to internal controls, accurate financial reporting, and executive accountability to prevent corporate fraud.
Blockchain technology and fintech solutions disrupt traditional financial systems by offering decentralized and automated alternatives. While these innovations bring significant benefits, they can also obscure transparency and accountability, two principles that SOX aims to uphold. SOX compliance focuses on accurate financial reporting, strong internal controls, and prevention of fraud, aligning with both the potential and risks of emerging technologies.
Key reasons why SOX compliance matters
1. Ensuring accurate financial reporting
Blockchain technology is often touted for its transparency and immutability. However, errors in smart contracts, incorrect data inputs, or cyberattacks can lead to inaccurate financial records. SOX compliance mandates stringent controls over financial reporting, ensuring that organizations maintain reliable records even when leveraging blockchain.
2. Mitigating risks in decentralized systems
Fintech platforms and blockchain ecosystems often operate without centralized oversight, making it challenging to identify and address fraud or anomalies. SOX’s requirement for management’s assessment of internal controls and independent audits provides a critical layer of oversight, helping organizations address vulnerabilities in decentralized environments.
3. Building stakeholder trust
The trust of investors, customers, and regulators is paramount for fintech and blockchain companies. Adhering to SOX requirements demonstrates a commitment to transparency and accountability, promoting confidence among stakeholders and distinguishing compliant organizations from their competitors.
4. Addressing regulatory scrutiny
As blockchain and fintech solutions gain adoption, regulatory scrutiny is intensifying. SOX compliance ensures that organizations are prepared to meet these demands by maintaining rigorous financial practices and demonstrating accountability in their operations.
5. Adapting to hybrid financial models
Many organizations are integrating traditional financial systems with blockchain-based solutions. This hybrid approach can create gaps in controls and reporting mechanisms. Leveraging blockchain in compliance with SOX helps bridge these gaps by enforcing comprehensive internal controls that adapt to both traditional and innovative systems.
6. Promoting operational efficiency
By enforcing stringent controls and systematic processes, SOX compliance encourages better business practices and operational efficiency. This results in more accurate financial reporting, reduced manual interventions, and streamlined processes, which ultimately support better decision-making and resource allocation.
7. Future-proofing against emerging technologies
Blockchain and fintech are continuously evolving, and organizations must adapt to new technologies. SOX compliance offers a flexible framework that can scale and evolve with these changes, ensuring that financial reporting and internal controls remain relevant and effective in the face of new technological challenges and opportunities.
Tips to get SOX compliant for fintech and blockchain companies
1. Understand SOX Requirements
Familiarize yourself with the key SOX sections, especially Section 302 (corporate responsibility for financial reports) and Section 404 (internal control over financial reporting).
Identify the specific areas that apply to your company’s financial reporting, internal controls, and auditing processes.
2. Form a Compliance Team
Assemble an internal team including executives, compliance officers, and IT staff.
Consider hiring external experts like auditors to guide the process.
3. Assess Current Financial Processes
Review existing financial systems, processes, and internal controls to identify gaps.
Document and ensure that these processes are auditable and compliant with SOX.
4. Implement Financial Reporting Systems
Automate financial reporting to ensure timely, accurate results.
Regularly conduct internal audits to confirm financial controls are working effectively.
5. Strengthen Data Security
Implement strong encryption, multi-factor authentication, and role-based access control (RBAC) to secure financial data.
Ensure regular backups and disaster recovery plans are in place.
6. Create and Document Policies
Develop formal policies for internal controls, financial reporting, and data handling.
Train employees on SOX compliance and ensure clear communication about financial responsibilities.
7. Establish Internal Control Framework
Build a solid internal control framework, focusing on accuracy, completeness, and fraud prevention in financial reporting.
Regularly test and validate controls, and consider third-party validation for independent assurance.
8. Disclose Material Changes in Real-Time
Develop a process for promptly disclosing any material changes to financial data, ensuring transparency with stakeholders.
9. Prepare for External Audits
Engage an independent auditor to review your financial processes and internal controls.
Organize records and ensure a clear audit trail to make the audit process smoother.
10. Monitor and Maintain Compliance
Continuously monitor financial systems and internal controls to detect errors or fraud.
Review and update systems regularly to ensure ongoing SOX compliance.
11. Develop a Compliance Culture
Encourage a company-wide focus on SOX compliance, transparency, and accountability.
Provide regular training and leadership to instill a culture of compliance.
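As a concrete illustration of tip 5, role-based access control over financial records can be as simple as mapping roles to permitted actions. The roles and permissions below are hypothetical examples, not a prescribed SOX control set:

```javascript
// Hypothetical RBAC mapping for financial records (illustrative roles
// and permissions, not a prescribed SOX control set).
const ROLE_PERMISSIONS = {
  auditor:    ['read'],
  accountant: ['read', 'write'],
  cfo:        ['read', 'write', 'approve'],
};

// Returns true only if the role exists and grants the action.
function canAccess(role, action) {
  return (ROLE_PERMISSIONS[role] ?? []).includes(action);
}
```

For example, an auditor can read records but cannot modify them, supporting the separation of duties that SOX auditors look for.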
Conclusion
In the fast-paced era of blockchain and fintech, SOX compliance has evolved from a regulatory necessity to a strategic cornerstone. By driving accurate financial reporting, minimizing risks, and cultivating trust, it sets the stage for lasting growth and innovation. Companies that prioritize compliance and auditing standards don’t just safeguard their operations; they also position themselves as forward-thinking leaders in the rapidly transforming financial landscape.
Extensible Azure Security Tool (later referred to as EAST) is a tool for assessing Azure and, to some extent, Azure AD security controls. The primary use case of EAST is security data collection for evaluation in Azure assessments. This information (JSON content) can then be used in various reporting tools, which we use to further correlate and investigate the data.
Installation now accounts for Azure Cloud Shell's updated dependencies (Cloud Shell now has Node.js v16 installed)
Checking of Databricks cluster types as per the advisory
Audits Databricks clusters for potential privilege elevation. This control typically requires permissions on the Databricks cluster
content.json now has key- and content-based sorting. This enables delta checks with git diff HEAD^1 ¹, as content.json has a predetermined order of results
¹ Word of caution: if you want to check deltas of content.json, then content.json will need to be "unignored" from .gitignore, exposing results to any upstream you might have configured.
Use this feature with caution, and ensure you don't have public upstream set for the branch you are using this feature for
Changed programming patterns to avoid possible race conditions with larger datasets. This mostly means changing var to let in for await -style loops
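The var-to-let change can be illustrated with a minimal sketch (not taken from the EAST source). `var` is function-scoped, so deferred callbacks created inside a `for await` loop all close over one shared binding and see the final value; `let` gives each iteration its own binding:

```javascript
// Illustrative sketch (not from the EAST source) of the pattern change.
async function* items() {
  yield 1; yield 2; yield 3;
}

async function collectWith(keyword) {
  const deferred = [];
  if (keyword === 'var') {
    // Buggy pattern: one shared `v`, every callback reads the last value
    for await (var v of items()) deferred.push(() => v);
  } else {
    // Fixed pattern: fresh `x` per iteration, callbacks stay correct
    for await (let x of items()) deferred.push(() => x);
  }
  return deferred.map(f => f());
}
// collectWith('var') resolves to [3, 3, 3]; collectWith('let') to [1, 2, 3]
```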
Important
Current status of the tool is beta
Fixes, updates, etc. are done on a "best effort" basis, with no guarantee of timing or of the quality of any fix applied
We do some additional tuning before using EAST in our daily work, such as applying various run and environment restrictions, besides familiarizing ourselves with the environment in question. Thus we currently recommend that EAST be run only in test environments, and with read-only permissions.
The calls the tool makes go largely to Azure cloud IPs, so it should work well in hardened environments where outbound IP restrictions are applied. This reduces the risk of the tool containing malicious packages that could "phone home" without also having C2 infrastructure in Azure.
Essentially, running it in read-only mode reduces a lot of the risk associated with possibly compromised NPM packages (Google "compromised NPM")
Bugs etc.: you can protect your environment against certain mistakes in this code by running the tool with Reader-only permissions
A lot of the code is "as is": it has served only the purpose of producing a certain result; a lot of cleaning up and modularizing remains to be finished
There are no tests at the moment, apart from certain manual checks that are run after changes to main.js and the more advanced controls.
The control descriptions at this stage are not the final product, so feedback on them, while appreciated, is not the focus of the tooling at this stage
As the name implies, we use it as a tool to evaluate environments. It is not meant to be run unmonitored for the time being, and should not be run in any internet-exposed service that accepts incoming connections.
Documentation could be described as incomplete for the time being
EAST is mostly focused on PaaS resources, as most of our Azure assessments focus on this resource type
No input sanitization is performed on launch parameters, as it is always assumed that the input of these parameters is controlled. That said, the tool uses exec() extensively; while I have not reviewed all paths, I believe that achieving shell command execution is trivial. This tool does not assume hostile input, so the recommendation is not to paste launch arguments into the command line without reviewing them first.
Tool operation
Dependencies
To reduce the amount of code, the following dependencies are used for operation and aesthetics (kudos to the maintainers of these fantastic packages)
Other dependencies for running the tool: if you are planning to run this in Azure Cloud Shell, you don't need to install the Azure CLI:
This tool does not include or distribute the Microsoft Azure CLI; rather, it uses the CLI when it is installed on the source system (such as Azure Cloud Shell, which is the primary platform for running EAST)
Azure Cloud Shell (BASH) or applicable Linux Distro / WSL
if (item.properties?.adminUserEnabled === false) { returnObject.isHealthy = true }
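A basic control of this shape can be sketched as a small function over the ARM object. This is an illustrative sketch; the function name and result shape are assumptions, not EAST's actual control interface:

```javascript
// Illustrative sketch (not EAST's actual control interface): a basic
// control evaluating a single ARM object, here a container registry
// where the admin user should be disabled.
function evaluateAcrAdminUser(item) {
  const returnObject = {
    control: 'acrAdminUserDisabled',
    resource: item.id,
    isHealthy: false, // assume unhealthy until the property proves otherwise
  };
  if (item.properties?.adminUserEnabled === false) {
    returnObject.isHealthy = true;
  }
  return returnObject;
}
```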
Advanced
Advanced controls include checks beyond the initial ARM object, often invoking new requests to get further information about the resource in scope and its relation to other services.
Example: Role Assignments
Besides checking the role assignments of the subscription, an additional check is performed via Azure AD Conditional Access reporting for MFA, verifying that privileged accounts are not protected only by passwords (SPNs with client secrets)
Azure Data Factory pipeline mapping combines pipelines -> activities -> data targets, and then checks the run history of those activities for secrets leaked into the logs.
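The mapping-and-scan step described above might look roughly like the following sketch. The data shapes and the secret pattern are illustrative assumptions, not EAST's actual implementation:

```javascript
// Hypothetical sketch of the pipeline -> activity -> run-history scan;
// data shapes and the secret pattern are assumptions, not EAST's code.
const SECRET_PATTERN = /(password|accountkey|secret)\s*=\s*\S+/i;

function findLeakedSecrets(pipelines) {
  const findings = [];
  for (const pipeline of pipelines) {
    for (const activity of pipeline.activities) {
      for (const entry of activity.runHistory) {
        // Flag log messages that look like embedded credentials
        if (SECRET_PATTERN.test(entry.message)) {
          findings.push({
            pipeline: pipeline.name,
            activity: activity.name,
            message: entry.message,
          });
        }
      }
    }
  }
  return findings;
}
```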
Composite
Composite controls combine two or more control results from the pipeline to form one or more new controls. Using composites solves two use cases for EAST:
You can't guarantee the order of control results returned in the pipeline
You need to return more than one control result from a single check
Get alerts from Microsoft Cloud Defender on subscription check
Form new controls per resourceProvider for alerts
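A composite of the Defender-alerts kind described above could be sketched as follows. Function and field names are hypothetical, not EAST's real API; the point is that the composite is order-independent over the pipeline results and emits one new control per resource provider:

```javascript
// Hypothetical composite (illustrative names, not EAST's real API):
// gather Defender alerts from the pipeline results, then emit one new
// control result per resource provider in scope.
function compositeAlertsByProvider(controlResults, providers) {
  // Order-independent lookup: filter by control name, not position
  const alerts = controlResults
    .filter(r => r.control === 'defenderAlerts')
    .flatMap(r => r.alerts);

  // One new control result per resource provider
  return providers.map(provider => {
    const list = alerts.filter(a => a.resourceProvider === provider);
    return {
      control: `defenderAlerts_${provider}`,
      isHealthy: list.length === 0,
      alerts: list,
    };
  });
}
```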
Reporting
EAST is not focused on providing automated report generation; it mostly produces JSON files with control and evaluation status. The idea is to use separate tooling to create reports, which is fairly trivial to automate via markdown-creation scripts and tools such as Pandoc
While the focus is not on reporting, this repo includes example automation for report creation with Pandoc to ease reading of the results in a single-document format.
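A markdown-creation script of the kind mentioned above can be very small. This sketch assumes a simplified result shape and is not the repo's actual eastReports.js; Pandoc can then convert the markdown into a single document:

```javascript
// Sketch of a markdown-creation step (assumed result shape, not the
// repo's actual eastReports.js). The output string is Pandoc-ready.
function toMarkdown(results) {
  const lines = ['# EAST scan results', ''];
  for (const r of results) {
    lines.push(`## ${r.control}`);
    lines.push(`- Resource: ${r.resource}`);
    lines.push(`- Healthy: ${r.isHealthy ? 'yes' : 'NO'}`);
    lines.push('');
  }
  return lines.join('\n');
}
```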
cff-version: 1.2.0
title: Pandoc
message: "If you use this software, please cite it as below."
type: software
url: "https://github.com/jgm/pandoc"
authors:
  - given-names: John
    family-names: MacFarlane
    email: jgm@berkeley.edu
    orcid: 'https://orcid.org/0000-0003-2557-9090'
  - given-names: Albert
    family-names: Krewinkel
    email: tarleb+github@moltkeplatz.de
    orcid: '0000-0002-9455-0796'
  - given-names: Jesse
    family-names: Rosenthal
    email: jrosenthal@jhu.edu
Running EAST scan
This part guides you through running the scan either with Bash on Linux, or Bash on Azure Cloud Shell (obviously Cloud Shell is Linux too, but it does not require that you have your own Linux box).
⚠️If you are running the tool in Cloud Shell, you might need to reapply some of the installations, as Cloud Shell does not persist various session settings.
Detailed prerequisites (if you opted not to do the "fire and forget" version)
Prerequisites
git clone https://github.com/jsa2/EAST --branch preview
cd EAST; npm install
Pandoc installation on cloud shell
# Get pandoc for reporting (first time only)
wget "https://github.com/jgm/pandoc/releases/download/2.17.1.1/pandoc-2.17.1.1-linux-amd64.tar.gz"
tar xvzf "pandoc-2.17.1.1-linux-amd64.tar.gz" --strip-components 1 -C ~
Installing pandoc on distros that support APT
# Get pandoc for reporting (first time only)
sudo apt install pandoc
Log in with the Az CLI and run the scan
# Relogin is required to ensure token cache is placed on session on cloud shell
az account clear
az login

# cd EAST
# replace the subid below with your subscription ID!
subId=6193053b-408b-44d0-b20f-4e29b9b67394
# node ./plugins/main.js --batch=10 --nativescope=true --roleAssignments=true --helperTexts=true --checkAad=true --scanAuditLogs --composites --subInclude=$subId
Generate report
cd EAST; node templatehelpers/eastReports.js --doc
If you want to include all Azure Security Benchmark results in the report
cd EAST; node templatehelpers/eastReports.js --doc --asb
Share relevant controls across multiple environments as a community effort
Company use
Companies have the possibility to develop company-specific controls that apply to their own work. Companies can then control these implementations by deciding whether or not to share them, based on their operating principles.
Non IPR components
Code logic and functions are under the MIT license. Since the code logic and functions are already based on open-source components and vendor APIs, it does not make sense to restrict something that is already based on open source
If you use this tool as part of a commercial effort, we only require that you follow the very relaxed terms of the MIT license
Use the rich and maintained context of Microsoft Azure CLI login & commands with Node.js control flow, which supplies enhanced REST requests and maps results to a schema.
This tool does not include or distribute the Microsoft Azure CLI; rather, it uses the CLI when it is installed on the source system (such as Azure Cloud Shell, which is the primary platform for running EAST)
✅Using the Node.js runtime as orchestrator utilises Node's asynchronous nature, allowing batching of requests. Batching uses the full extent of Azure Resource Manager's impressive speed.
✅Compared to running requests one by one, the speedup can be up to 10x when Node executes a batch of requests instead of a single request at a time
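The batching idea (compare the --batch=10 parameter used in the scan command) can be sketched like this; runBatched and worker are illustrative names, not EAST's actual functions:

```javascript
// Sketch of batched requests (illustrative names, not EAST's actual
// functions): run `batch` workers concurrently, await the group, then
// move on to the next slice.
async function runBatched(items, worker, batch = 10) {
  const results = [];
  for (let i = 0; i < items.length; i += batch) {
    const slice = items.slice(i, i + batch);
    // Every request in this slice is in flight at the same time
    results.push(...await Promise.all(slice.map(worker)));
  }
  return results;
}
```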
| Param | Description | Default if undefined |
| --- | --- | --- |
|  | Clears tokens in the session folder; use this if you get authorization errors, or have just changed to another az login account. Use az account clear if you want to clear the AZ CLI cache too | no values |
| --tag | Filter all results at the end based on a single tag, e.g. --tag=svc=aksdev | no values |
| --ignorePreCheck | Use this option with browser-delegated tokens | no values |
| --helperTexts | Will append text descriptions from general to manual controls | no values |
| --reprocess | Will update results in the existing content.json. Useful for incremental runs | no values |
Parameters reference for the example report:

node templatehelpers/eastReports.js --asb

| Param | Description | Default if undefined |
| --- | --- | --- |
| --asb | Gets all ASB results available to users | no values |
| --policy | Gets all Policy results available to users | no values |
| --doc | Prints the pandoc string for export to the console | no values |
(Highly experimental) Running in restricted environments where only browser use is available
⚠️Detects principals in privileged subscription roles protected only by password-based single-factor authentication.
Checks for users without MFA policies applied for a set of conditions
Checks for service principals protected only by a password (as opposed to using a certificate credential, workload federation, and/or a workload identity CA policy)
An unused credential on an application can result in a security breach. While it's convenient to use password secrets as a credential, we strongly recommend that you use x509 certificates as the only credential type for getting tokens for your application
The following methods work for contributing for the time being:
Submit a pull request with a code / documentation change
Submit an issue
An issue can be:
⚠️Problem (issue)
Feature request
❔Question
Other
By default EAST tries to work with its current dependencies - introducing new (direct) dependencies is not directly encouraged with EAST. If such a vital dependency is introduced, review the licensing of that dependency and update readme.md - dependencies
There is nothing to prevent you from creating your own fork of EAST with your own dependencies