Wednesday, May 31, 2017

We continue our discussion of system design for an online store from the previous post, turning now to securing data at rest and in transit. Security is generally applied in layers, following the principle of least privilege. For example, the lower levels of the software stack, such as the operating system, operate with elevated, internal account privileges, independently of the applications and services that run on them. The higher levels of the stack are external to the operating system and do not require the same privileges; they can be run under specific service accounts on the lower levels.

The same considerations apply on the user side. Code executing on behalf of a user must cross clearly demarcated authentication and authorization lines of control before transitioning into an internal context for execution, and every user access and interface to the system must be secured. Role-based access control can differentiate user access. This enables separation of concerns, such as between user and administrator work. It also lets us specify security policies, and change them, without affecting the data. Access and privilege to user data should be as granular as possible so that a change to one does not affect another. As long as the policies and the system can be separated, we can change the policies without affecting the system. This comes in useful for throttling and controlling access to a server that may be under duress from excessive and unwanted connections.

Authentication and encryption protect data in transit, so we use them for all network connections; for web traffic, we prefer https over http. But it is not just the higher-privileged user that we want to secure. Connections arriving with the lowest classification, such as the anonymous or guest user, must have the least privilege to run code anywhere in the system.
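The role-based access control described above can be sketched as a simple permission lookup. This is a minimal illustration, not any particular framework's API; the role names and permission table are assumptions for the example.

```python
# Illustrative role-to-permission table for an online store.
# Each role grants only the actions it explicitly needs (least privilege).
PERMISSIONS = {
    "guest": {"browse_catalog"},
    "user": {"browse_catalog", "place_order", "view_own_orders"},
    "admin": {"browse_catalog", "place_order", "view_own_orders",
              "manage_inventory", "change_policy"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the role explicitly grants the action.

    Unknown roles get an empty permission set, i.e. least privilege.
    """
    return action in PERMISSIONS.get(role, set())

# The guest/anonymous role can only browse; everything else is denied.
assert is_allowed("guest", "browse_catalog")
assert not is_allowed("guest", "manage_inventory")
```

Because the table is data rather than code, the policies can change independently of the system that enforces them, which is exactly the separation argued for above.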
This is particularly important in cloud computing, where we rely on authentication tokens as a form of user credentials. If a token issuer malfunctions, it could grant an anonymous user the same privileges as a regular user. Therefore, security reviews with Strengths-Weaknesses-Opportunities-Threats (SWOT) analysis need to be conducted on all flows, both control and data, through the system. A fail-fast mechanism prevents unwanted access or execution; consequently, routines at every level and entry point check their parameters and validate their assumptions. The Cloud Security Alliance announced ten major challenges in big data security and privacy and suggested best practices for their mitigation. These include:
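The fail-fast check at an entry point might look like the sketch below. The claim names (`sub`, `role`, `exp`) and the anonymous-role rule are illustrative assumptions, not a specific token format.

```python
import time

def validate_token(token: dict) -> None:
    """Fail fast: reject the request before any internal work happens.

    Every entry point validates its assumptions up front, so a
    malfunctioning token issuer cannot silently elevate privilege.
    """
    # Reject tokens that are missing required claims.
    for claim in ("sub", "role", "exp"):
        if claim not in token:
            raise PermissionError(f"missing claim: {claim}")
    # Reject expired tokens before doing anything else.
    if token["exp"] < time.time():
        raise PermissionError("token expired")
    # Anonymous connections keep the least privilege.
    if token["role"] == "anonymous":
        raise PermissionError("anonymous tokens may not enter internal context")
```

A caller would invoke `validate_token` as the first statement of every externally reachable routine, so a bad request fails before transitioning into internal context.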
1. Secure code - check authentication, authorization, auditing, and security properties; mask personally identifiable information.
2. Secure non-relational data stores - use fuzzing techniques for penetration testing.
3. Secure data storage and logs - use Secure Untrusted Data Repository (SUNDR) methods.
4. Secure devices - use conventional endpoint protection, discussed earlier, along with anomaly detection.
5. Secure based on monitoring - this applies at the cloud level, cluster level, application level, and so on. Violations of compliance standards should also be monitored.
6. Protect data privacy - code does not differentiate which data it operates on as long as both are valid, so policies are used instead. Similarly, if data exists in two or more sources, a malicious user could correlate them.
7. Secure data with encryption - perform encryption without sharing keys.
8. Secure with restrictions - implement them through granular access control and federation.
9. Secure with granular audits - use a Security Information and Event Management (SIEM) solution.
10. Secure provenance - protect the data that describes how other data was derived, even if it bloats storage size.
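Masking personally identifiable information, from item 1 above, can be sketched as a transformation applied before logging or secondary storage. The field names here are assumptions for the example.

```python
def mask_pii(record: dict) -> dict:
    """Return a copy of the record with PII fields masked.

    Only the masked copy should ever reach logs or lower-trust stores;
    the original stays in the secured data path.
    """
    masked = dict(record)
    # Keep only the first character of the local part of the email.
    if "email" in masked:
        user, _, domain = masked["email"].partition("@")
        masked["email"] = user[0] + "***@" + domain
    # Keep only the last four digits of a card number.
    if "card" in masked:
        masked["card"] = "****" + masked["card"][-4:]
    return masked

print(mask_pii({"email": "alice@example.com", "card": "4111111111111111"}))
# → {'email': 'a***@example.com', 'card': '****1111'}
```

Because the masking lives in one place, an audit (item 9) can verify that every logging path calls it, rather than inspecting every log statement individually.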
#codingexercise

