Wednesday, March 29, 2023

One of the ways to secure data is to control access. The options include access keys, shared access signatures, and Azure AD-based access. Access keys provide administrative access to an entire resource, such as a storage account, and Microsoft recommends they be used only for administrative purposes. Shared access signatures are token-like credentials that grant granular access to resources within a storage account; access is available to whoever or whatever holds the signature. Role-based access control can be used to control access to both the management and data layers. Shared access signatures can also be generated in association with an Azure AD identity (a user delegation SAS) instead of being signed with a storage account access key. Access to files from domain-joined devices is secured using Azure AD identities for authentication and authorization.
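
To make this concrete, here is a minimal sketch using the Python azure-storage-blob and azure-identity packages: it requests a user delegation key with an Azure AD identity and uses it to issue a short-lived, read-only SAS for a single blob. The account, container, and blob names are hypothetical.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobSasPermissions,
    BlobServiceClient,
    generate_blob_sas,
)

# Hypothetical names used for illustration only.
ACCOUNT = "mystorageaccount"
ACCOUNT_URL = f"https://{ACCOUNT}.blob.core.windows.net"

# Authenticate with an Azure AD identity rather than an account key.
credential = DefaultAzureCredential()
service = BlobServiceClient(ACCOUNT_URL, credential=credential)

# A user delegation key ties the SAS to the Azure AD identity above,
# so the application never handles a storage account access key.
start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)
delegation_key = service.get_user_delegation_key(start, expiry)

# Grant read-only access to a single blob for one hour.
sas_token = generate_blob_sas(
    account_name=ACCOUNT,
    container_name="reports",
    blob_name="q1-summary.csv",
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)
print(f"{ACCOUNT_URL}/reports/q1-summary.csv?{sas_token}")
```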

Another way to secure data is to protect it using storage encryption, Azure Disk Encryption, and immutable storage. Storage encryption involves server-side encryption for data at rest and Transport Layer Security (TLS) for data in transit. By default, data is encrypted with a Microsoft-managed account encryption key, but security can be extended through customer-managed keys, which are usually stored in a Key Vault. Azure Disk Encryption encrypts the boot OS and data volumes to further protect the data.
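
As a rough illustration of extending the default encryption with customer-managed keys, the sketch below uses the Python azure-mgmt-storage package to point a storage account's encryption at a Key Vault key. All identifiers are placeholders, and it assumes the account's managed identity already has wrap/unwrap access to the key.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    Encryption,
    KeyVaultProperties,
    StorageAccountUpdateParameters,
)

# Hypothetical identifiers; replace with real values.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-rg"
ACCOUNT = "mystorageaccount"

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Switch the account's encryption key source from Microsoft-managed keys
# to a customer-managed key held in Key Vault. This assumes the account's
# managed identity already has permissions on the key in the vault.
parameters = StorageAccountUpdateParameters(
    encryption=Encryption(
        key_source="Microsoft.Keyvault",
        key_vault_properties=KeyVaultProperties(
            key_name="storage-cmk",
            key_vault_uri="https://my-vault.vault.azure.net/",
        ),
    )
)
client.storage_accounts.update(RESOURCE_GROUP, ACCOUNT, parameters)
```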

A checklist helps when migrating sensitive data to the cloud, making it easier to avoid common pitfalls regardless of where the data originates. It serves as a blueprint for a smooth, secure transition.

Characterizing permitted use is the first step data teams need to take to address data protection for reporting. Modern privacy laws specify not only what constitutes sensitive data but also how that data may be used. Data obfuscation and redaction help protect against exposure. In addition, data teams must classify both the usages and the consumers. Once sensitive data is classified and purpose-based usage scenarios are addressed, role-based access control must be defined in a way that accommodates future growth.
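
A toy Python sketch of the idea, with invented column names and roles: columns are classified once, and each consumer role maps to the sanitization applied to sensitive fields.

```python
# Classify each reporting column once; the policy below reuses this map.
CLASSIFICATION = {
    "customer_name": "sensitive",
    "email": "sensitive",
    "order_total": "public",
}

# Permitted use per consumer role: analysts see masked values,
# auditors see clear text, everyone else sees nothing sensitive.
ROLE_POLICY = {
    "analyst": "mask",
    "auditor": "clear",
}

def mask(value: str) -> str:
    """Keep the first character, obscure the rest."""
    return value[0] + "*" * (len(value) - 1) if value else value

def render_row(row: dict, role: str) -> dict:
    policy = ROLE_POLICY.get(role, "redact")
    out = {}
    for column, value in row.items():
        if CLASSIFICATION.get(column) != "sensitive" or policy == "clear":
            out[column] = value
        elif policy == "mask":
            out[column] = mask(str(value))
        else:  # unknown roles fall back to full redaction
            out[column] = "[REDACTED]"
    return out

row = {"customer_name": "Alice", "email": "alice@example.com", "order_total": 42.5}
print(render_row(row, "analyst"))  # masked sensitive fields
print(render_row(row, "intern"))   # redacted sensitive fields
```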

Devising a strategy for governance is the next step; it is meant to keep intruders out and to strengthen data protection through encryption and sound database management. Fine-grained access control, such as attribute-based or purpose-based access control, also helps in this regard.
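
A minimal sketch of a purpose-based check in Python, with hypothetical datasets and purposes: a request is allowed only when its declared purpose is among those permitted for the dataset.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    principal: str
    dataset: str
    purpose: str

# Each dataset declares the purposes for which it may be accessed.
PERMITTED_PURPOSES = {
    "patient_records": {"treatment", "billing"},
    "web_analytics": {"marketing", "product_improvement"},
}

def is_allowed(request: AccessRequest) -> bool:
    """Grant access only for a purpose the dataset explicitly permits."""
    return request.purpose in PERMITTED_PURPOSES.get(request.dataset, set())

print(is_allowed(AccessRequest("svc-reporting", "patient_records", "billing")))    # True
print(is_allowed(AccessRequest("svc-reporting", "patient_records", "marketing")))  # False
```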

Embracing a standard for defining data access policies can help limit the explosion of mappings between users and data access permissions; this gains significance when a monolithic data management environment is migrated to the cloud. Failing to establish such a standard can lead to unauthorized data exposure.
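
A back-of-the-envelope illustration in Python of why such a standard helps: per-user, per-table grants grow multiplicatively, while a role-by-category policy stays small and portable. The counts and policy entries are invented.

```python
# Without a standard, every (user, table) pair needs its own grant.
users, tables = 500, 200
print(users * tables)  # 100,000 individual mappings to maintain

# With a standard, the policy surface collapses to (role, data category)
# pairs, independent of how many users or tables exist.
POLICY = {
    ("analyst", "sensitive"): "mask",
    ("analyst", "public"): "clear",
    ("auditor", "sensitive"): "clear",
    ("auditor", "public"): "clear",
}
print(len(POLICY))  # 4 mappings, applied uniformly on any platform
```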

Migrating to the cloud in a single stage, with all data moved at once, must be avoided because it is operationally risky. It is critical to develop a plan for incremental migration that facilitates the development, testing, and deployment of a data protection framework that ensures proper governance. Decoupling data protection and security policies from the underlying platform allows organizations to tolerate subsequent migrations.
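
One way to structure an increment, sketched with the Python azure-storage-blob package and hypothetical account names: migrate one container at a time, so the data protection framework can be validated on each slice before the next one moves.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Hypothetical endpoints; the point is the incremental, per-container plan.
credential = DefaultAzureCredential()
source = BlobServiceClient("https://legacyaccount.blob.core.windows.net", credential=credential)
target = BlobServiceClient("https://newaccount.blob.core.windows.net", credential=credential)

def migrate_container(name: str) -> None:
    """Copy a single container as one migration increment."""
    src_container = source.get_container_client(name)
    dst_container = target.get_container_client(name)
    if not dst_container.exists():
        dst_container.create_container()
    for blob in src_container.list_blobs():
        src_blob = src_container.get_blob_client(blob.name)
        # Server-side copy; a private source needs a SAS-authorized URL.
        dst_container.get_blob_client(blob.name).start_copy_from_url(src_blob.url)

migrate_container("reports")  # further increments follow after validation
```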

There are different types of sanitization, such as redaction, masking, obfuscation, encryption, tokenization, and format-preserving encryption. Among these, static protection, in which clear-text values are sanitized and stored in their modified form, and dynamic protection, in which clear-text data is transformed into ciphertext, are the most commonly used.
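
The contrast can be sketched in a few lines of Python. The hash-based token below is a stand-in for a real tokenization service, and the reversible encryption uses the cryptography package's Fernet recipe.

```python
import hashlib

from cryptography.fernet import Fernet  # pip install cryptography

# Static protection: the clear text is sanitized once and only the
# modified form is stored; the original is not recoverable from it.
def static_tokenize(value: str) -> str:
    return "tok_" + hashlib.sha256(value.encode()).hexdigest()[:12]

stored = static_tokenize("4111 1111 1111 1111")
print(stored)  # the token is stored in the table instead of the card number

# Dynamic protection: clear text is transformed into ciphertext and can
# be decrypted on demand for an authorized consumer holding the key.
key = Fernet.generate_key()
f = Fernet(key)
ciphertext = f.encrypt(b"4111 1111 1111 1111")
print(f.decrypt(ciphertext).decode())  # restored for an authorized reader
```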

Finally, defining and implementing data protection policies brings several additional processes, such as validation, monitoring, logging, reporting, and auditing. Having the right tools and processes in place when migrating sensitive data to the cloud will allay concerns about compliance and provide proof that can be submitted to oversight agencies.
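
A minimal sketch of the logging piece in Python, with hypothetical principals and datasets: each access decision is emitted as a structured record that reporting and audits can consume.

```python
import json
import logging
from datetime import datetime, timezone

# Every data access decision becomes one structured audit record.
audit = logging.getLogger("data_access_audit")
logging.basicConfig(level=logging.INFO, format="%(message)s")

def record_access(principal: str, dataset: str, action: str, allowed: bool) -> None:
    audit.info(json.dumps({
        "time": datetime.now(timezone.utc).isoformat(),
        "principal": principal,
        "dataset": dataset,
        "action": action,
        "allowed": allowed,
    }))

record_access("svc-reporting", "patient_records", "read", True)
```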
