How agencies can improve zero trust architecture by addressing their data problem
Carmelo McCutcheon, the public sector CTO for VAST Data Federal, offers six ways agencies can improve their data protection.
The dynamic cybersecurity landscape presents increasingly complex challenges for enterprises. Federal agencies, in particular, are a ripe target for cyberattacks because they manage massive amounts of sensitive data. The National Institute of Standards and Technology suggests that “cybersecurity architects and administrators need to develop a comprehensive monitoring process that can handle the volume of data needed for the dynamic nature of zero trust.”
In November 2024, federal agencies were required to provide plans for implementing zero trust to the Office of the National Cyber Director and the Office of Management and Budget. To meet these expectations, federal agencies must shift their focus from solely building protective barriers to implementing robust data-centric strategies that align with zero-trust principles.
The zero trust architecture (ZTA) framework has become a bulwark against progressively sophisticated cyber threats. However, despite the widespread adoption of the zero trust security model, many federal agencies still struggle to effectively combat these threats.
These agencies don’t necessarily have a security problem; they have a data problem. Unless they properly address the data pillar of the ZTA framework, they will fail to strengthen their cybersecurity posture.
What is the data pillar?
An effective ZTA framework relies on carefully categorizing data. That data is then encrypted and restricted to authorized personnel through a least privilege access policy.
According to the National Security Agency, “the data pillar focuses on securing and enforcing access to data at rest and in transit through various methods, including encryption, tagging and labeling, data loss prevention (DLP) strategies, and application of data rights management (DRM) tools.”
Securing the ZTA data pillar
The three basic building blocks of the data pillar are categorization, authorization and segmentation.
Federal agencies must have visibility over all of their data resources. More importantly, they need a granular and detailed understanding of what data exists and how it’s used to ensure classified data is not used incorrectly. This can be achieved by deploying advanced data discovery tools to automate the identification and categorization of data across systems, coupled with real-time analytics to track data usage patterns. By leveraging these technologies, agencies can establish an accurate inventory of their data, assess its sensitivity, and apply appropriate security protocols tailored to the data’s context. Classifying and labeling data is foundational to an effective zero trust architecture.
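The kind of rule-based scan that discovery tools perform can be sketched in a few lines. This is a minimal illustration, not a real product: the regex patterns and label names are assumptions chosen for the example, and production tools use far richer detectors (ML classifiers, dictionaries, document fingerprinting).

```python
import re

# Illustrative classification rules: pattern -> sensitivity label.
# These regexes and labels are assumptions for the sketch only.
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "PII/SSN"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "PII/EMAIL"),
    (re.compile(r"(?i)\b(secret|classified)\b"), "RESTRICTED"),
]

def classify(text: str) -> set[str]:
    """Return the set of sensitivity labels whose patterns match the text."""
    return {label for pattern, label in RULES if pattern.search(text)}

def label_inventory(files: dict[str, str]) -> dict[str, set[str]]:
    """Build a labeled inventory: file name -> labels (empty set = unlabeled)."""
    return {name: classify(body) for name, body in files.items()}
```

The resulting inventory, mapping each data object to its sensitivity labels, is what downstream access policies and encryption decisions key off of.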
Based on data classifications, federal agencies can apply identity management policies like role-based access controls (RBAC) and attribute-based access controls (ABAC) to govern how, when and by whom data can be accessed. Further segmentation and micro-segmentation allow them to enforce the principle of least privilege to limit access to authenticated and authorized users and devices.
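A hybrid RBAC/ABAC check layers the two models: the role gates which actions are possible at all, then attributes of the user and the resource refine the decision. The sketch below is a toy illustration under assumed names (`ROLE_PERMISSIONS`, the clearance scale, the `privacy_training` attribute are all hypothetical), not any agency's actual policy engine.

```python
from dataclasses import dataclass, field

# Hypothetical role -> permitted actions table (the RBAC layer).
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "data_steward": {"read", "write"},
}

@dataclass
class User:
    role: str
    clearance: int                  # e.g. 1 = public ... 4 = most sensitive
    attributes: dict = field(default_factory=dict)

def authorize(user: User, action: str, resource_labels: set[str],
              required_clearance: int) -> bool:
    """RBAC gate first, then ABAC refinements on user/resource attributes."""
    # RBAC: the role must permit the action at all.
    if action not in ROLE_PERMISSIONS.get(user.role, set()):
        return False
    # ABAC: clearance must meet the resource's classification level.
    if user.clearance < required_clearance:
        return False
    # ABAC: PII-labeled data additionally requires privacy training.
    if any(label.startswith("PII") for label in resource_labels):
        return bool(user.attributes.get("privacy_training", False))
    return True
```

Because the attribute checks run after the role check, administrators keep the simplicity of role assignment while still expressing fine-grained, per-resource conditions.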
Only after data resources have been classified and tagged can federal agencies apply the appropriate level of security to data in each category to shield sensitive information from unauthorized access.
Once federal agencies gain an understanding of their data resources and encrypt them appropriately with Federal Information Processing Standard (FIPS) compliant tools, they should continuously audit and monitor how and when that data is being accessed, both in transit and at rest. For example, federal agencies must proactively and properly maintain logs and analyze data access patterns in order to detect and respond to potential security threats.
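Two of the simplest access-pattern checks that log analysis can apply are flagging off-hours access and flagging unusually high per-user access volume. The sketch below assumes a hypothetical audit record shape and illustrative thresholds; real baselines would be derived from an agency's historical data.

```python
from collections import Counter
from datetime import datetime

def flag_suspicious(events: list[tuple[str, str, str]],
                    business_hours: tuple[int, int] = (7, 19),
                    per_user_threshold: int = 100) -> list[str]:
    """Flag off-hours access and high per-user access volume.

    Each event is a hypothetical (user, resource, ISO timestamp) record;
    the thresholds here are illustrative assumptions, not recommendations.
    """
    flags = []
    counts = Counter(user for user, _, _ in events)
    for user, resource, ts in events:
        hour = datetime.fromisoformat(ts).hour
        if not (business_hours[0] <= hour < business_hours[1]):
            flags.append(f"off-hours: {user} accessed {resource} at {ts}")
    for user, n in counts.items():
        if n > per_user_threshold:
            flags.append(f"high-volume: {user} made {n} accesses")
    return flags
```

In practice these flags would feed an alerting pipeline rather than a returned list, but the principle is the same: logs are only useful if something is continuously reading them.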
Finally, agencies need to ensure both the integrity and availability of their data. This remains critical even during a cybersecurity incident: authorized users should retain access to data to minimize downtime and avoid any unplanned business disruptions.
Improve ZTA effectiveness
Exponential growth in the amounts of data federal agencies collect and store presents a major challenge for their cybersecurity strategy. Even with the necessary protocols of zero trust architecture in place, they can still be susceptible to cyber threats if they don’t appropriately manage data. Here are some additional steps agencies can take to improve data protection within the ZTA framework.
- Automated data labeling: Manually categorizing and labeling vast amounts of data has become untenable for many federal agencies. It requires too much time and manpower, and it’s subject to human error. Automating the process of defining labels and attributes that every object, file and directory inherits will accelerate the process and improve the accuracy of data classification.
- Hybrid authorization model: Employing a hybrid RBAC and ABAC model enhances traditional access control mechanisms with dynamic, attribute-based policies. This will help agencies identify user roles, attributes and attribute access policies for more granular control, streamlined administration and flexibility in assigning permissions.
- Microsegmentation: Agencies need to control which virtual local area networks (VLANs) have access to views and what protocols can be accessed on them. However, they should also filter based on endpoint IP and make microsegmentation decisions controlling not only what an endpoint can access but also its level of access.
- In-flight and at-rest encryption: Encryption must cover data in flight across access protocols such as S3 over HTTPS, NFS, NFS over RDMA (NFSoRDMA), and NFS over RoCEv2 (NFSoROCEv2). Additionally, agencies should utilize secure key exchange and at-rest encryption that supports crypto-erase capabilities at both the tenant and view levels, providing robust key management and secure data erasure options.
- AI-powered anomaly detection: Artificial intelligence and machine learning technologies are now capable of efficiently and reliably executing functions to strengthen an organization’s cybersecurity strategy. Agencies should leverage tools that use machine learning algorithms to detect anomalies and better identify suspicious user activities and data patterns characteristic of security threats.
- Immutable data backups: In the event of a ransomware attack, federal agencies should have a secure backup of all critical data. On top of that, they must ensure those backups cannot be altered or deleted by attackers. Indestructible snapshots of an agency’s data provide an additional layer of protection by making backup copies immutable, rendering ransomware attacks ineffective.
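The anomaly-detection idea above can be illustrated with a deliberately simple statistical baseline: compare today's access count for a user against that user's history and flag large deviations. This toy z-score check stands in for the far more capable machine learning models real tools use; the threshold and data are assumptions for the sketch.

```python
import statistics

def anomalous(history: list[float], today: float,
              z_threshold: float = 3.0) -> bool:
    """Return True if today's access count deviates more than z_threshold
    standard deviations from the historical baseline.

    A toy stand-in for ML-based detection; threshold is illustrative.
    """
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold
```

Even this crude baseline shows why the data pillar matters for detection: without a trustworthy inventory of who accesses what, there is no history to compare against.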
Zero trust hinges on data management
There are many steps that federal agencies should take to protect data in the evolving cybersecurity landscape. As attackers develop new tactics to circumvent traditional zero-trust policies, agencies need to adopt new tools and methods to protect sensitive data.
Adopting ZTA is the first step toward protecting data. However, data visibility, classification and control are essential to strengthening the data pillar and overall effectiveness of a ZTA framework. Without addressing the data problem, the rest of a cybersecurity strategy is likely to fall apart.
Carmelo McCutcheon is the public sector chief technology officer for VAST Data Federal. In that role, he leverages his expertise to simplify complex technology solutions for customers, focusing on optimizing AI/ML workloads and securing AI/ML environments to meet stringent federal standards.
Copyright
© 2025 Federal News Network. All rights reserved.