
DOGE’s cost-cutting database dives offer cybersecurity pros vital lessons in cloud security


Cybersecurity has been politically agnostic until recently, but with the many rapid changes introduced by the Trump administration, it has become somewhat politicized and fraught with questions and self-examination.

Claims on social media, rumors, and information disseminated by questionable sources are having a palpable impact on the security world, even at the highest of levels — the Cybersecurity and Infrastructure Security Agency (CISA) recently had to clarify that it had not pulled back from its longstanding defense against cyber threats from Russia.

But the move that has perhaps caused the most disruption is the administration's DOGE efficiency team's review of government purchases and spending.

I have not found any technology-focused reporting that reveals details of the processes and types of computer access these teams have been given. There have been suggestions that the team is using an Azure instance and artificial intelligence tools, but no one is asking the more detailed questions I'd like to see answered.

To me, these are key questions not only of interest to those concerned with the changes to government, but relevant to any business looking to use such tools to reduce waste and build efficiency: Can you ensure the consultants and tools you bring in have appropriate and secure access? Can you ensure the integrity of databases?

Fundamental questions that need to be answered

If I were tasked to connect to a government database and run tests on the underlying data, there are a few items that I would consider foundational.

First, given that this is government data that I would be working with, I would ensure that the Azure license I had procured was deemed appropriate for government use. Microsoft provides cloud resources specifically designated for use by the US government, with US-based servers that carry documentation attesting to their compliance with various federal and state regulations.

Some services are held to the even-stricter Department of Defense (DoD) Impact Level 5 (IL5) standard of isolation. As Microsoft notes in its documentation, “This separation ensures that a virtual machine that could potentially compromise the physical host can’t affect a DoD workload. To remove the risk of runtime attacks and ensure long-running workloads aren’t compromised from other workloads on the same host, all IL5 virtual machines and virtual machine scale sets should be isolated by DoD mission owners via Azure Dedicated Host or isolated virtual machines.”

Microsoft regularly patches security bugs in Hyper-V, its hardware virtualization software, including flaws that could allow a guest machine to take over the host or escalate privileges.

For example, in January, patches for CVE-2025-21334, CVE-2025-21333, and CVE-2025-21335 fixed issues that, while not directly impacting the Hyper-V server itself, created elevation-of-privilege vulnerabilities in the NT kernel integration virtual service provider (VSP) layer.

Critical logging and access control procedures need to be followed

Exploiting these vulnerabilities allows an attacker to run arbitrary code in the context of the Hyper-V host, giving them potentially unrestricted access to the underlying hardware. As noted, the impact could be significant. Once an attacker gains unrestricted access to the Hyper-V host, they can manipulate the resources allocated to the guest operating systems, exfiltrate sensitive information from the guest machines, and potentially compromise or delete entire guest operating systems.

I would want to see anyone accessing sensitive data of this magnitude use services and tools in an isolated configuration, with logging and zero-trust processes in place. Proper logging may require specific licensing tiers, so confirm that beforehand. Copies of the logs should be kept in external storage to document the access; cloud logs are all too often lost because the steps to capture them were not taken ahead of time.

When retrieving data from various data centers, logging must be performed in Coordinated Universal Time (UTC); it is difficult to align timestamps across systems without standardizing on one time zone. Ensure that any extra admin instances or permissions created to facilitate the review are removed once the project is over. Enterprises often add access for a consultant and then fail to monitor or expire those permissions in a timely manner.
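The UTC-normalization point can be made concrete. Below is a minimal, hypothetical Python sketch (the logger name and log message are invented) showing how to force every audit-log timestamp into UTC so records from different data centers can be aligned:

```python
import logging
from datetime import datetime, timezone

class UTCFormatter(logging.Formatter):
    """Formatter that renders every timestamp in UTC (ISO 8601)."""
    def formatTime(self, record, datefmt=None):
        # record.created is a POSIX timestamp; convert it explicitly to UTC
        return datetime.fromtimestamp(record.created, tz=timezone.utc).isoformat()

handler = logging.StreamHandler()
handler.setFormatter(UTCFormatter("%(asctime)s %(levelname)s %(message)s"))
audit = logging.getLogger("audit")  # hypothetical audit logger
audit.addHandler(handler)
audit.setLevel(logging.INFO)

audit.info("Read 1,204 rows from payments_db by reviewer svc-account")
```

Every entry then carries a `+00:00` offset, so timestamps from different regions sort and correlate without time-zone arithmetic.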

It helps to ensure that all such access for consulting projects has an expiration date. Review Entra ID (formerly Azure AD) permissions with PowerShell and confirm that each application has been granted only the access you intended.
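The expiration-date discipline above can be sketched in code. This is a hypothetical Python example (the `AccessGrant` record, principals, and scopes are invented for illustration); in practice the grant data would come from a directory export, such as the Entra ID review mentioned above:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AccessGrant:
    principal: str      # consultant account or app registration
    scope: str          # what it can touch
    expires: datetime   # mandatory expiration date, in UTC

def expired_grants(grants, now=None):
    """Return grants whose expiration has passed and should be revoked."""
    now = now or datetime.now(timezone.utc)
    return [g for g in grants if g.expires <= now]

grants = [
    AccessGrant("consultant-1", "payments_db:read",
                datetime(2025, 1, 31, tzinfo=timezone.utc)),
    AccessGrant("review-app", "audit_logs:read",
                datetime(2026, 1, 31, tzinfo=timezone.utc)),
]
for g in expired_grants(grants, now=datetime(2025, 6, 1, tzinfo=timezone.utc)):
    print(f"REVOKE {g.principal} ({g.scope}), expired {g.expires:%Y-%m-%d}")
```

Making `expires` a required field means no grant can be created without an end date, which is exactly the failure mode described above.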

Finally, you should ensure that whatever modifications you have made to a database leave it in a condition that can be relied upon going forward. Is there a hash or checksum process to verify that the data that was in place before has not been tampered with?

Databases typically have some sort of process to ensure the integrity of the data has not been impacted. Whenever you bring in outside tools or consultants, this process needs to be performed to ensure integrity has not been compromised.
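As a minimal sketch of that integrity check, the following hypothetical Python example (the sample rows are invented) computes a SHA-256 digest over a table's rows; running the same canonical query before and after an outside review and comparing digests reveals any tampering. Real rows would come from the database, ordered by primary key so the digest is stable:

```python
import hashlib

def table_digest(rows):
    """SHA-256 over rows serialized in a canonical, order-sensitive form."""
    h = hashlib.sha256()
    for row in rows:
        # unit separator between fields, record separator between rows
        h.update("\x1f".join(str(field) for field in row).encode("utf-8"))
        h.update(b"\x1e")
    return h.hexdigest()

before = table_digest([(1, "ACME", 1200), (2, "Globex", 850)])
after = table_digest([(1, "ACME", 1200), (2, "Globex", 850)])
assert before == after          # untouched data yields the same digest

tampered = table_digest([(1, "ACME", 9999), (2, "Globex", 850)])
assert tampered != before       # any change to a field is detectable
```

Record the "before" digest in the same external log store as the access logs, so the baseline itself cannot be quietly rewritten.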
