Tuesday, January 31, 2017

In the previous posts, we talked about public cloud offerings, which leads us to compare public clouds. Let us do so today with a comparison of their private network offerings.

Azure has offered virtual networks since 2013, while AWS was an early starter. Azure's Virtual Network (VNet) resembles the AWS VPC in many aspects, but we will also enumerate the differences. That said, Azure has kicked it up a notch.

Subnets:
Subnets divide the network into smaller networks and are very helpful for creating private networks. AWS allows a VPC to be created with:
1) single public subnet
2) public and private subnets
3) public and private subnets with hardware VPN access
4) private subnet only with a hardware VPN access.
Azure gives similar flexibility to create subnets of any size. The difference is in the setup wizards: AWS's wizards have been in use for a while longer.
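To make the subnet sizing concrete, here is a small illustrative sketch using Python's standard ipaddress module; the address range and subnet names are made up for the example:

```python
# Carve a hypothetical VPC/VNet address space into four equal subnets,
# e.g. two public and two private. CIDR blocks here are illustrative only.
import ipaddress

vpc = ipaddress.ip_network("10.0.0.0/16")      # the whole virtual network
subnets = list(vpc.subnets(new_prefix=18))     # four equal /18 subnets
layout = dict(zip(["public-a", "public-b", "private-a", "private-b"], subnets))
for name, net in layout.items():
    print(name, net, f"({net.num_addresses} addresses)")
```

The same arithmetic underlies the wizards in both clouds: a /16 network yields four /18 subnets of 16,384 addresses each (both providers reserve a handful of addresses per subnet for their own use).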

Security:
Security is one of the most important considerations for a virtual private network. AWS provides security services at the instance level, the subnet level, and for the overall network. Azure has a simpler design. Inbound and outbound rules can be configured in both.

Security Groups:
Security groups, which help control access to instances, are available from both providers. Azure calls its version a Network Security Group, and it is available only for regional virtual networks.

Network ACLs:
Azure and AWS both support network ACLs, which allow users to selectively permit or deny traffic to their networks.

Custom routing tables:
Subnets inherit the main routing table. In AWS, we can customize routes within a subnet. In Azure, the support was somewhat limited until recently.

Virtual network interfaces:
Multiple NICs can be activated on virtual machine instances; however, both cloud providers only allow this on large instances.
documentation on Azure: https://docs.microsoft.com/en-us/azure/
documentation on AWS: https://aws.amazon.com/documentation/ 

#codingexercise
int GetMinSumSquares(List<int> positives)
{
    if (positives == null || positives.Count == 0) return 0;
    int inclusive = positives[0] * positives[0];
    int exclusive = 0;
    for (int i = 1; i < positives.Count; i++)
    {
        var min = Math.Min(inclusive, exclusive);
        inclusive = exclusive + positives[i] * positives[i];
        exclusive = min;
    }
    return Math.Min(inclusive, exclusive);
}
Given an integer array of n integers, find the sum of bit differences over all pairs that can be formed from the array elements.
For example, the bit difference between 2, represented by 010, and 7, represented by 111, is 2. Therefore an array with {2,7} will have the pairs (2,2), (2,7), (7,2) and (7,7) with a sum of bit differences = 0 + 2 + 2 + 0 = 4.
int GetSumBitDifferences(List<int> A)
{
    int ret = 0;
    for (int i = 0; i < 32; i++)
    {
        int count = 0;
        for (int j = 0; j < A.Count; j++)
            if ((A[j] & (1 << i)) != 0)
                count++;

        // pairs that differ at bit i: count set vs (A.Count - count) unset, ordered both ways
        ret += count * (A.Count - count) * 2;
    }
    return ret;
}
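As a sanity check, here is the same bit-counting idea in a quick Python sketch, compared against a brute force over all ordered pairs:

```python
def sum_bit_differences(a):
    # For each bit position, pairs that differ there contribute count * (n - count) * 2
    ret = 0
    for i in range(32):
        count = sum(1 for x in a if x & (1 << i))
        ret += count * (len(a) - count) * 2
    return ret

def brute_force(a):
    # Sum of differing bits over all ordered pairs, via XOR popcount
    return sum(bin(x ^ y).count("1") for x in a for y in a)

print(sum_bit_differences([2, 7]))   # 4, matching the worked example
```

The per-bit version runs in O(32n) instead of O(n^2) for the brute force.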
Another way to do this would be to enumerate all combinations of pairs from the array using the Combine method discussed earlier:
int GetSumBitDifferencesUsingCombinations(List<int> A)
{
    int ret = 0;
    var pairs = new List<Tuple<int,int>>();
    var seq = new List<int>();
    int start = 0;
    int level = 0;
    Combine(A, ref seq, ref pairs, start, level); // Combine enumerates the pairs as discussed earlier
    pairs.ForEach(x => ret += GetDiff(x.Item1, x.Item2));
    return ret;
}
The combinations above also include pairs where the first and second elements are the same; however, those contribute nothing to the result.

Monday, January 30, 2017


An introduction to private cloud versus public cloud masquerading as private cloud can be found here: https://1drv.ms/w/s!Ashlm-Nw-wnWrF0qVoeSgL7Xnu1w 
Here are a few specific ways to differentiate and make the private cloud more appealing while not stealing any light from public cloud. These include
1)     Provide container resources in addition to virtual machines to expand the pool of computing resources
2)     Provide services that are customized to frequent usages by private cloud customers. This includes not only making it easier to use some services but also provisioning those that many customers often use.
3)     Anticipate customer requests and suggest compute resources based on past history and measurements.
4)     Provide additional services so that more customers are drawn to the services and not just to the cloud. Additionally, the customers won’t mind when the weight of the services is shifted between public and private cloud infrastructure as the costs dictate.
5)     Provide additional services that won’t be offered elsewhere. For example, data tiering, aging, archival, deduplication, file services, backup and restore, naming and labeling, accelerated networks etc. offer major differentiation that do not necessarily have to lean towards machine learning to make the private cloud smart.
6)     Offer major periodic maintenance and activities on behalf of the customer such as monitoring disk space and adding storage, checking for usages and making in place suggestions on the portal.
7)     Reduce the volume of service desk tickets aggressively with preemptive actions and minimizing them to only failures.  This is paying off debt so it may not translate to new services.
8)     Improve regional experiences not only with savvy resources but also with improved networks for major regions
9)     Provide transparency, accounting and auditing so that users can always choose to get more information for self-help and troubleshooting. FAQs and documentation could be improved, preferably with a search field.
10)  Enable subscriptions to any or all alerts that can be setup by the customer on various activities. This gives the user informational emails with subjects that can be used to filter and treat at appropriate levels.
 #codingexercise
int GetMinSumWeightedNegatives(List<int> negatives, uint scalar)
{
    if (negatives == null || negatives.Count == 0) return 0;
    int inclusive = (int)scalar * negatives[0];
    int exclusive = 0;
    for (int i = 1; i < negatives.Count; i++)
    {
        var min = Math.Min(inclusive, exclusive);
        inclusive = exclusive + (int)scalar * negatives[i];
        exclusive = min;
    }
    return Math.Min(inclusive, exclusive);
}

It may be interesting to note that scalar weight cannot be negative unless the count of included negatives is even.

Sunday, January 29, 2017

Today we continue to review the book "From Smart to Wise" by Kaipa and Radjou. The authors say that "In our experience, wise leadership succeeds where smart leadership cannot." This book is about six capabilities which are essential for wise leadership, and these are:
perspective, action orientation, role clarity, decision logic, fortitude and motivation. The leaders are classified as follows:
1) functional smart leaders, categorized as the blue zone, and
2) business smart leaders, categorized as the red zone
The blue zone leaders are experts in their field. They execute effectively. They are risk-averse. They are detail oriented. They deliver exceptional results and they are efficient. Tim Cook is an example of this category
The red zone leaders are visionaries. They see the big picture. They are risk takers. They are very competitive. They are strategic entrepreneurs. They are highly motivated. Bill Gates is considered an example of this category.
Yesterday we read about what wise leaders do to apply the blue zone or the red zone technique as appropriate. Each of the capabilities was explained and described in this context. Today we read about how the blue zone and the red zone compare against each capability. The authors state that the blue zone and red zone perspectives are based on our knowledge and our experiences. With regard to finding purpose, blue zone leaders focus on short-term goals and execute them perfectly with like-minded people's perspectives. They miss subtle external changes and are not innovative outside their area of expertise. The red zone perspective focuses on long-term goals and the big picture; red zone leaders are innovative and, like the blue zone, also focus on like-minded people's perspectives, but they struggle to execute their grand vision. The authors quote Einstein as saying "one cannot solve a problem with the same mind-set that created it in the first place."
With regard to acting authentic and appropriate, the blue zone leaders focus on doing things the right way technically but not ethically. They follow a rule book or have a static style.  The red zone leaders are impulsive and follow their emotions. They need to feel successful and tend to believe that the ends justify the means.
With regard to role clarity, the blue zone leaders demonstrate leadership only in their domain of expertise and are willing to put emotions aside to fulfill their role. They get stuck in their role and can lose their identities. The Red zone leaders are very attached  to their role as a leader. They are emotional/passionate but they do not lose themselves within their role. To cultivate role clarity, each perspective could let go by cultivating a beginner's mind-set.
With regard to decision logic, blue zone leaders are cautious decision makers and better at short term decisions. They are open to feedback and tend to focus on validating comments. They rely on their instincts and take a long time to make decisions. The red zone leaders like high risk and thus high reward decisions. They make decisions quickly and are overconfident in their decision making capabilities. They don't pursue other viewpoints. The wise leaders operate in a grey zone. They exhibit "both clarity and pragmatism".
With regard to flexible fortitude, the functional smart leaders work very methodically, sometimes slowly and within limitations and boundaries. They are not very flexible and are stubborn, but they end up completing high-quality projects. The red zone leaders only care about projects that lead to rewards and recognition. They can be moody and are not good at multitasking, but they tend to be resilient when faced with failures. Both types of smart leaders exhibit great fortitude yet have their faults; only the wise show flexible fortitude.
#codingexercise
int GetMinSumNegatives(List<int> negatives)
{
    if (negatives == null || negatives.Count == 0) return 0;
    int inclusive = negatives[0];
    int exclusive = 0;
    for (int i = 1; i < negatives.Count; i++)
    {
        var min = Math.Min(inclusive, exclusive);
        inclusive = exclusive + negatives[i];
        exclusive = min;
    }
    return Math.Min(inclusive, exclusive);
}

Saturday, January 28, 2017

Today we write a review of a book "From Smart to Wise" by Kaipa and Radjou. The authors say that "In our experience, wise leadership succeeds where smart leadership cannot."
This book is about six capabilities which are essential for wise leadership, and these are:
perspective, action orientation, role clarity, decision logic, fortitude and motivation.
It starts off by classifying leaders as two types - functional smart leaders and categorized as blue zone and business smart leaders categorized as red zone.
The blue zone leaders are experts in their field. They execute effectively. They are risk-averse. They are detail oriented. They deliver exceptional results and they are efficient. Tim Cook is an example of this category
The red zone leaders are visionaries. They see the big picture. They are risk takers. They are very competitive. They are strategic entrepreneurs. They are highly motivated. Bill Gates is considered an example of this category.
Wisdom is about being equally dexterous in both categories and knowing when to apply which. It often comes with a perspective of what will benefit others, while smartness is about oneself.
The authors suggest that we should figure out which zone we are operating in, decide where we are on our path, create a road map, and find tools to help us stay on the outlined path.
This book articulates that to be a wise leader, we must find and follow a noble purpose which will guide us like the North Star. In fact, this perspective is considered the most critical capability. For wise leaders, this perspective means becoming sensitive to the context around them and being able to see the world without any filters. Since this perspective transcends personal gain and ego, it gives meaning and a path to happiness. Wise leaders exhibit mindfulness and have a growth mindset.
A leader's action needs to be authentic and appropriate. Authentic means being true to oneself. Appropriate means the actions should be done in context. Often we fall short of that, but here's a way to bridge the gap: what we say must line up with what we do, what we say must line up with what we actually feel, and who we are must line up with what we do.
Role clarity, as per the authors, is the capability to perform a chosen role convincingly and with enthusiasm, and without losing sense of who we are behind our role. They demonstrate leadership in any role and are good team players. They empower others and amplify the smarts and capabilities  of the people around them.
To quote the authors, the decision logic is the system, processes and principles of reasoning used in making important decisions. If the decisions exhibit ethical clarity and pragmatism, not complicated with emotions and alternatives are weighed while listening to all viewpoints, this capability is demonstrated.
Flexible fortitude is to know when to hold and when to fold. We need to strike a balance between having the courage to stay the course and being flexible enough to change direction. The leaders in both the blue and the red zone demonstrate great fortitude but risk being inflexible, whereas the wise leader will be comfortable striking a balance.
Finally, wise leaders believe that what is in the public interest will eventually be in the interest of all individuals and groups, including themselves. They demonstrate strong intrinsic motivation and they are keen on serving.

#codingexercise
Yesterday we described a coding exercise to find the maximum product of a subsequence in an array where no two elements are adjacent. If the integers were both positive and negative, we would need to make sure the count of selected negatives is even. We would keep track of two products, one with the negative integers only and the other regular.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.

The same logic holds for sum, but the skipped elements should have a value of zero.

int GetMaxSumBothPositiveAndNegative(List<int> A)
{
    var positives = new List<int>(A);
    for (int i = 0; i < positives.Count; i++)
        if (positives[i] <= 0)
            positives[i] = 0; // skipped elements contribute nothing to the sum
    return GetMaxSum(positives);
}
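The approach above can be sketched in a few lines of Python (an illustrative translation, not production code); non-positive entries are zeroed out so that skipping them costs nothing:

```python
def max_sum_non_adjacent(a):
    # Classic inclusive/exclusive DP: incl includes a[i], excl skips it
    incl, excl = a[0], 0
    for x in a[1:]:
        incl, excl = excl + x, max(incl, excl)
    return max(incl, excl)

def max_sum_mixed(a):
    # Zero out non-positive values so skipped elements contribute nothing
    return max_sum_non_adjacent([x if x > 0 else 0 for x in a])

print(max_sum_mixed([3, -2, 5, 10, 7]))   # 15 (3 + 5 + 7)
```

Note that on an all-negative input this returns 0, i.e. the empty selection, which matches the "skipped elements have a value of zero" convention.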

Friday, January 27, 2017

Today we continue to read up on the compliance overview of Adobe Cloud Services. Adobe is able to meet these standards and regulations because it practices a bottom-up physical-layer securing approach called the Adobe Common Controls Framework (CCF) and a top-down software-layer securing approach called the Adobe Secure Product Lifecycle (SPL). The former manifests as a set of Adobe-specific controls that map to approximately a dozen security standards, and the latter is a set of security activities that span several product development practices, processes and tools. Therefore there is a layering of software supported on both infrastructure and operations which is available over a physical layer. This is secured from both directions, top and bottom, with the SPL and CCF.
The CCF is built so that it is not specific to any of the products. Instead, teams can inherit control capabilities from other parts of the organization. This way a software engineer does not have to be concerned about data center security; they inherit it from the data center operations team. The CCF conceptual model promotes Adobe corporate standards such as business continuity, security governance, data privacy, and people resources over product engineering, which is based on product operations, and that in turn operates on the Adobe Product cloud which assures physical and environmental security. Product engineering adopts the SPL approach, while product operations enforces monitoring, configuration, asset management, change management, logical access and incident response. The CCF has approximately 273 common controls across twenty control domains. These include asset management, backup management, business continuity, change management, configuration management, data management, identity and access management, incident response, mobile device management, network operations, people resources, risk management, security governance, service lifecycle, site operations, system design documentation, systems monitoring, third party management, training and awareness, and vulnerability management.
Compliance is reviewed periodically.
#codingexercise
Yesterday we described a coding exercise to find the maximum product of a subsequence in an array where no two elements are adjacent. If the integers were both positive and negative, we would need to make sure the count of selected negatives is even. We would keep track of two products, one with the negative integers only and the other regular.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.
int GetMaxProductBothPositiveAndNegative(List<int> A)
{
    var positives = new List<int>(A);
    for (int i = 0; i < positives.Count; i++)
        if (positives[i] <= 0)
            positives[i] = 1; // 1 is the neutral element for products
    var max_positives = GetMaxPositiveList(positives);
    var negatives = new List<int>(A);
    for (int i = 0; i < negatives.Count; i++)
        if (negatives[i] >= 0)
            negatives[i] = 1;
    var max_negatives = GetMaxNegativeList(negatives);
    return RemoveAdjacencies(max_positives, max_negatives, A).Product();
}
List<int> GetMaxPositiveList(List<int> A)
{
    var ret = new List<int>();
    if (A == null || A.Count == 0) return ret;
    assert(A.All(x => x > 0));
    int inclusive = A[0];
    int exclusive = 1; // the empty product is 1, not 0
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        if (max == inclusive)
            ret.Add(A[i-1]);
        inclusive = exclusive * A[i];
        exclusive = max;
    }
    int last = Math.Max(inclusive, exclusive);
    if (last == inclusive)
        ret.Add(A[A.Count-1]);
    return ret;
}

Thursday, January 26, 2017

Today we continue to describe migratory assets. A significant percentage of the end users do not care about where the assets come from. Consequently we can move assets from public cloud to public cloud, from public cloud to private cloud, or even from private cloud to private cloud while letting the users access their resources by name and in some cases by the same IP.
There are two factors contributing to this: 
1) a large number of requests are for fine grained number of assets 
2) Most assets are managed by system center managers that don't depend on the host but only on the agent running inside the asset. 
These promote migratory assets. 
Another important criterion is the offloading of infrastructure activities to existing automations in the public cloud. Take, for example, the periodic evaluation of unused resources and the reminders to customers. This is no longer done with custom or proprietary automations. Instead the same technologies that monitor usages and trigger alerts in the public cloud can be reused. There are significant savings in reusing out-of-box technologies. First, the tools and technologies are widely accepted and have already matured through feedback across organizational usages. Second, there is no need to increase ownership and maintenance of in-house tools and technologies that serve the same purpose.
By a similar counterargument, there is always a place for in house tools and automations that can benefit private cloud implementations and boost the overall offering to the customer. Cheap, dirty and fast solutions are easily facilitated in a private cloud and we welcome this with independently controlled private cloud as an alternative offering to the customers.  For example, we can create clusters and containers in a private cloud that can benefit as additional resources to the user. 
Lastly, applications and services can leverage appliances, technologies and services that are not available in a private cloud. We have storage appliances, network accelerators, firewalls, routers and many other technologies that can apply only in private cloud. These then become foundations for Infrastructure as a service, Platform as a service and software as a service layers which bring their own advantages. Arguably though these upper layers for the most part should not differentiate between whether the resources are requested from private cloud or public cloud providers
#codingexercise
Find the maximum product of a subsequence of an array of integers such that no two elements are adjacent.
int GetMaxProduct(List<int> A)
{
    if (A == null || A.Count == 0) return 0;
    assert(A.All(x => x > 0));
    int inclusive = A[0];
    int exclusive = 1; // the empty product is 1, not 0
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        inclusive = exclusive * A[i];
        exclusive = max;
    }
    return Math.Max(inclusive, exclusive);
}
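A quick Python sketch of the same DP, with the empty product seeded as 1 rather than 0 (seeding with 0 would collapse every inclusive value to zero), cross-checked against a brute force over non-adjacent index subsets:

```python
from itertools import combinations

def max_product_non_adjacent(a):
    # Inclusive/exclusive DP for positive ints; excl starts at 1, the empty product
    incl, excl = a[0], 1
    for x in a[1:]:
        incl, excl = excl * x, max(incl, excl)
    return max(incl, excl)

def brute_force(a):
    # Try every non-empty subset with no two adjacent indices
    best = max(a)
    n = len(a)
    for r in range(2, n + 1):
        for idx in combinations(range(n), r):
            if all(j - i > 1 for i, j in zip(idx, idx[1:])):
                p = 1
                for i in idx:
                    p *= a[i]
                best = max(best, p)
    return best

print(max_product_non_adjacent([3, 2, 5, 10, 1]))   # 30 (3 * 10)
```

The seed matters: on [1, 100, 1] the correct answer is 100, which the DP only finds when the empty product is 1.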
If the integers were both positive and negative, we would need to make sure the count of selected negatives is even. We would keep track of two products, one with the negative integers only and the other regular.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.

Wednesday, January 25, 2017

Migratory Assets
          Today many organizations use one or more public clouds to meet the demand for the compute resource by its employees. A large number of these requests are fine grained where customers request a handful of virtual machines for their private use. Usually not more than twenty percent of the customers have demands that are very large ranging to about a hundred or more virtual machines. 

 There are a few challenges in servicing these demands. First, the virtual machines for the individual customers are sticky. Customers don’t usually release their resource and even identify it by its name or IP address for their day-to-day work.  They host applications, services and automations on their virtual machines and often cannot let go of a virtual machine unless files and programs have a migration path to another compute resource. Typically they do not take the step of creating regular backups and keeping the resource mobile. While container platforms for Platform-as-a-Service (PaaS) have enabled software to be deployed without any recognition of the host and frequently rotated from one host to another, end users' adoption of PaaS platforms depends on the production readiness of the applications and services. The push for PaaS adoption has made little or no change to the use and proliferation of virtual machines by individual users. Therefore the second criterion is that individual users' habits don’t change.
Consequent to these two factors, the cloud services provider can package services such as additional storage, regular backup schedule, patching schedule, system management, securing and billing at the time of request for each asset. However such services depend on the cloud where the services are requested. For private cloud a lot of the service is in-house adding to the costs even if the inventory is free.

I describe that a significant percentage of the end users do not care about where the assets come from. Consequently we can move assets from public cloud to public cloud, from public cloud to private cloud, or even from private cloud to private cloud while letting the users access their resources by name and in some cases by the same IP.

#codingexercise
Find the maximum sum of a subsequence of an array of integers such that no two elements are adjacent.
int GetMaxSum(List<int> A)
{
    if (A == null || A.Count == 0) return 0;
    int inclusive = A[0];
    int exclusive = 0;
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        inclusive = exclusive + A[i];
        exclusive = max;
    }
    return Math.Max(inclusive, exclusive);
}
Courtesy: geeksforgeeks

Tuesday, January 24, 2017

We read how Google protects data. While Google is an innovator in two-step authentication and encryption methods and may even have a larger scale of operations than many other companies, all that other companies need to do is to ensure that they don't fall short of the considerations from Google's security efforts. Their implementations can vary but the threats will have been mitigated. Where many companies may lead the initiative is in showing how to use the public cloud by setting security standards for its adoption. I enumerate the areas of control. These include:
1) network integration between on-premise and public cloud
2) penetration testing of virtual private network in the public cloud
3) Active Directory connector between virtual private network and on premise
4) System Center Manager for public cloud assets
5) Setting up of monitoring and alerts on all threat related operational activities
6) Successful key management for all assets and operations related key usages
7) Isolation among security groups or access controls for individual instances and groups.
8) Metering and reports on all usages so that anomalies can be detected
9) Metrics on all distributions across regions and fault tolerance
10) Size of large groups and the percentage of fine granularity in resource groups
11)Time based variations over a period

Most public cloud providers publish their own security guidelines and practices while providing documentation and community support forums. These best practices cover many tools and features that the cloud providers give to empower the companies that adopt the public clouds. However, the degree to which each one uses these clouds and their compliance with minimum security standards is largely not governed. The contract between public cloud providers and organizations safeguards the public cloud provider but does not secure the end user or their data in all aspects of their usage. For example, with the use of the public cloud for private networks, an organization may or may not guarantee isolation of user data on the same server, its robustness and reliability against loss, theft or mishandling, its availability for export when the user no longer wishes to use the services, and its portability. Moreover, administrators have their own administration tasks that require a certain level of security compliance, as we saw with our coverage of how Google secures its data. These tasks include writing rules that dictate data retention, showing responsibility towards legal hold and law enforcement response, maintaining transparency in their actions and the publication of their reports, or showing discretion in other maintenance activities.
Therefore, adoption of public cloud computing for organizational needs requires certain criteria that can evolve as an internal security compliance requirement.
More reading at https://1drv.ms/w/s!Ashlm-Nw-wnWrF0qVoeSgL7Xnu1w


#codingexercise
Yesterday we were discussing a coding exercise that attempted to sort a doubly linked list of 0s and 1s.
It had a helper function to cut an element and insert it before or after the transition point. We implemented the insertion after the transition point; now we implement the insertion before it.
dll CutAndInsertBefore(dll transition, dll current, dll root)
{
    assert(current != null && root != null);
    if (transition == null) return root; // nothing to do
    if (transition == current) return root;
    if (current == root) return root; // this op applies only when current comes after transition
    // root == transition is acceptable
    // cut current out of the list
    if (current.next != null)
        current.next.prev = current.prev;
    if (current.prev != null)
        current.prev.next = current.next;
    // insert before transition
    if (transition.prev != null)
        transition.prev.next = current;
    else
        root = current; // inserting at the head moves the root
    current.next = transition;
    current.prev = transition.prev;
    transition.prev = current;
    return root;
}
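To see the helper in action, here is a rough Python sketch of the overall 0/1 sort (node and function names are mine): walk past the leading 0s to the first 1 (the transition), then move every later 0 to just before it.

```python
class Node:
    def __init__(self, val):
        self.val, self.prev, self.next = val, None, None

def from_list(vals):
    # Build a doubly linked list; return its head (root)
    root, prev = None, None
    for v in vals:
        n = Node(v)
        if prev is None:
            root = n
        else:
            prev.next, n.prev = n, prev
        prev = n
    return root

def to_list(root):
    out = []
    while root is not None:
        out.append(root.val)
        root = root.next
    return out

def cut_and_insert_before(transition, current, root):
    # Mirrors CutAndInsertBefore: unlink current, splice it in front of transition
    if transition is None or transition is current or current is root:
        return root
    if current.next is not None:
        current.next.prev = current.prev
    if current.prev is not None:
        current.prev.next = current.next
    current.prev = transition.prev
    current.next = transition
    if transition.prev is not None:
        transition.prev.next = current
    else:
        root = current          # inserting at the head moves the root
    transition.prev = current
    return root

def sort_binary(root):
    # transition = the first 1; every 0 found after it is moved to just before it
    transition = root
    while transition is not None and transition.val == 0:
        transition = transition.next
    cur = transition.next if transition is not None else None
    while cur is not None:
        nxt = cur.next          # save before any relinking
        if cur.val == 0:
            root = cut_and_insert_before(transition, cur, root)
        cur = nxt
    return root

print(to_list(sort_binary(from_list([1, 0, 1, 0, 0, 1]))))   # [0, 0, 0, 1, 1, 1]
```

Because each 0 is spliced in just before the transition node, the moved 0s keep their relative order and the whole pass is O(n).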

Monday, January 23, 2017

Today we conclude reading up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between the data centers. They build redundancy into their server design, data storage, network and internet connectivity, and even the software services. Their audits cover compliance standards and we reviewed some of them. We also reviewed some of the tools used to empower users and administrators to improve security and compliance. We were discussing user authentication, data management and email management features. We reviewed data retention, which is facilitated by the eDiscovery feature. Administrators can specify retention rules that determine how long certain messages are retained before they are removed from user mailboxes and expunged from all Google systems. Administrators can enforce policies over mobile devices in their organization, encrypt data on devices, and perform actions like remotely wiping or locking lost or stolen devices. Administrators can also recover data by restoring a deleted user account or restoring a user's drive or gmail data. They can publish reports that provide transparency to their actions.
Thus Google protects data in its infrastructure, applications and personnel operations. The same security considerations are acknowledged by the industry. We mentioned a comparison to Adobe's efforts in this regard. While Google is an innovator in two-step authentication and encryption methods and may even have a larger scale of operations than many other companies, all that other companies need to do is to ensure that they don't fall short of the considerations from Google's security efforts. Their implementations can vary but the threats will have been mitigated. Adobe has secured its data by virtue of the very same compliances that Google has made. These include SOC, ISO 27001, FedRAMP and the Payment Card Industry Data Security Standard. In addition, Adobe fulfils regulatory compliances such as the Gramm-Leach-Bliley Act (GLBA) for financial institutions, HIPAA (the Health Insurance Portability and Accountability Act) and protected health information (PHI), the Code of Federal Regulations Title 21, and the US Family Educational Rights and Privacy Act.
Adobe is able to meet these standards and regulations because it practices a bottom-up physical-layer securing approach called the Adobe Common Controls Framework and a top-down software-layer securing approach called the Adobe Secure Product Lifecycle. The former manifests as a set of Adobe-specific controls that map to approximately a dozen security standards, and the latter is a set of security activities that span several product development practices, processes and tools.
Where Adobe could become a pioneer is in putting together a security standard for using hybrid or public clouds for organizational use. While Google claims its public cloud has better security than on-premise computing, Adobe could be the one to show that its embrace of the public cloud is more secure than the on-premise computing of many other companies.
#codingexercise
Yesterday we were discussing a coding exercise that attempted to sort a doubly linked list of 0s and 1s.
It had a helper function to cut an element and insert it before or after the transition point. It goes like this:
dll CutAndInsertAfter(dll transition, dll current, dll root)
{
assert(current != null && root != null);
if (transition == null) return root; // nothing to do
if (transition == current) return root;
if (current == root) return root; // this op applies only when current comes after transition
// root == transition is acceptable
// cut current out of the list
var temp = current.next;
if (temp != null)
     temp.prev = current.prev;
if (current.prev != null)
     current.prev.next = current.next;
current.next = null;
current.prev = null;
// insert current after transition
if (transition.next != null)
     transition.next.prev = current;
current.next = transition.next;
current.prev = transition;
transition.next = current;
// root remains unmodified for an insertion after transition
return root;
}

The CutAndInsertBefore method is similar to the one above, except that the root might change.

Sunday, January 22, 2017

Today we continue to read up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, and they use penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between data centers. They build redundancy into their server design, data storage, network and internet connectivity, and even the software services. Their audits cover compliance standards, and we reviewed some of them. We also reviewed some of the tools used to empower users and administrators to improve security and compliance. We were discussing user authentication, data management and email management features. We now review data retention, which is facilitated by the eDiscovery feature. eDiscovery is the means to prepare for lawsuits and other legal matters. Google Vault is the eDiscovery solution for Google Apps that lets customers retain, archive, search and export their business Gmail. Administrators can specify retention rules that govern how long certain messages are retained before they are removed from user mailboxes and expunged from all Google systems. Retention rules can be specified based on duration, content, labels and delegation. A legal hold is a binding requirement on a user to preserve all their emails and on-the-record chats indefinitely. Under a legal hold, messages may be deleted from the user's view but not from the message servers. Email and chat can be searched by user account, organizational unit, date or keyword.
Results can be exported as evidence reports.
Endpoints are also secured. For example, administrators can enforce policies over mobile devices in their organization, encrypt data on devices, and perform actions like remotely wiping or locking lost or stolen devices. This comes with mobile device management in Google Apps. Chrome browser security can be managed by policies that the administrator specifies remotely. There are over 280 policies that can be specified. The Google Apps admin console applies policy to Chrome devices.
Administrators can also recover data by restoring a deleted user account or restoring a user's drive or gmail data. They can publish reports that provide transparency to their actions.
#codingexercise
Given a doubly linked list of 0 and 1 integers, sort them in place.

dll Sort0And1(dll root)
{
if (root == null || root.next == null) return root;
var current = root;
dll transition = null;
while (current != null)
{
var next = current.next; // save the successor before current is relinked
if (current.data == 1)
{
if (transition == null)
     transition = current;
root = CutAndInsertAfter(transition, current, root);
}else{
root = CutAndInsertBefore(transition, current, root);
}
current = next;
}
return root;
}
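For comparison, here is a runnable Python sketch of the same 0/1 sort. It is my own variant, not the post's exact method: instead of moving nodes around a transition pointer, it detaches each node onto a zeros chain or a ones chain and splices the chains together at the end. The node class and helper names are illustrative.

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

def sort_zeros_ones(root):
    """Stable in-place partition of a 0/1 doubly linked list."""
    zeros_head = zeros_tail = None
    ones_head = ones_tail = None
    current = root
    while current:
        nxt = current.next              # save successor before relinking
        current.prev = current.next = None
        if current.data == 0:
            if zeros_tail:
                zeros_tail.next, current.prev = current, zeros_tail
                zeros_tail = current
            else:
                zeros_head = zeros_tail = current
        else:
            if ones_tail:
                ones_tail.next, current.prev = current, ones_tail
                ones_tail = current
            else:
                ones_head = ones_tail = current
        current = nxt
    if zeros_tail:                      # splice ones after zeros
        zeros_tail.next = ones_head
        if ones_head:
            ones_head.prev = zeros_tail
        return zeros_head
    return ones_head

def from_list(values):
    root = tail = None
    for v in values:
        n = Node(v)
        if tail:
            tail.next, n.prev = n, tail
        else:
            root = n
        tail = n
    return root

def to_list(root):
    out = []
    while root:
        out.append(root.data)
        root = root.next
    return out
```

Saving `nxt` before any relinking is the key detail: once a node has been moved, its `next` pointer no longer reflects the original traversal order.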

Saturday, January 21, 2017

Today we continue to read up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, and they use penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between data centers. They build redundancy into their server design, data storage, network and internet connectivity, and even the software services. Their audits cover compliance standards, and we reviewed some of them. We also reviewed some of the tools used to empower users and administrators to improve security and compliance.
Specifically, the user authentication/authorization features involved a 2-step verification that adds an extra layer of security to Google Apps accounts by requiring users to enter a verification code, which was also enhanced with a physical security key. SAML single sign-on allowed customers access to multiple services using the same sign-in page. OAuth and OpenID extended the single sign-on to multiple cloud solutions. In addition to these user access management enhancements, the data is also managed with features such as information rights management, which can disable sharing of information meant for one or a select few people. The Drive audit log lists every access that involves a data manipulation. Specific actions taken by a user can also trigger alerts using Drive content compliance or alerting. There are trusted domains for Drive sharing, where whitelists are maintained. Sharing of data, such as by email, is done over encrypted channels. Even the from address in emails, if forged in a common compromise attack called phishing, is detected and the threat mitigated. All emails have data loss prevention. Objectionable content is removed. Thus emails conform with content compliance. Administrators can even restrict the email addresses users can exchange mail with. Google Vault provides an eDiscovery feature that lets customers retain, archive, search and export their business Gmail. This is very helpful to stay prepared in case of lawsuits and other legal matters.
#codingexercise
Find four elements a, b, c and d in an array such that a+b = c+d.
We discussed several solutions to this exercise. Another way to do it would be as follows:
1) generate combinations of four-element subsets from the overall array
2) check whether the four elements satisfy the given condition
For distinct integers, this can be written as:
void Combine(List<int> a, List<int> b, int start, int val)
{
    if (b.Count == 4)
    {
        if (IsValid(b, val))
            Console.WriteLine(string.Join(",", b));
        return;
    }
    for (int i = start; i < a.Count; i++)
    {
        b.Add(a[i]);
        Combine(a, b, i + 1, val);
        b.RemoveAt(b.Count - 1);
    }
}
bool IsValid(List<int> nums, int val)
{
    assert(nums.Count == 4);
    var sorted = new List<int>(nums);
    sorted.Sort();
    return sorted[0] + sorted[3] == sorted[1] + sorted[2] &&
           sorted[0] + sorted[3] == val;
}
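The combination approach above is O(n^4). A faster alternative, which the post does not cover, is to hash pair sums: record one index pair per sum and report when a second, disjoint pair collides with it. A Python sketch with illustrative names:

```python
def find_equal_pair_sums(nums):
    """Return (a, b, c, d) with a+b == c+d over four distinct
    positions, or None if no such quadruple exists. O(n^2) pairs."""
    seen = {}  # pair sum -> first (i, j) index pair seen with that sum
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            s = nums[i] + nums[j]
            if s in seen:
                a, b = seen[s]
                if len({a, b, i, j}) == 4:  # require disjoint pairs
                    return (nums[a], nums[b], nums[i], nums[j])
            else:
                seen[s] = (i, j)
    return None
```

Keeping only the first pair per sum is enough here because any later colliding pair either shares an index (skipped) or yields a valid answer immediately.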

Friday, January 20, 2017

Today we continue to read up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, and they use penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between data centers. They build redundancy into their server design, data storage, network and internet connectivity, and even the software services. Their audits cover compliance standards, and we reviewed some of them.
Next we started reviewing some of the tools used to empower users and administrators to improve security and compliance.
These included user authentication/authorization features such as 2-step verification, which requires users to sign in with their username and password as well as one-time-password tokens. In addition, a security key developed by Google as an actual physical key to use with the Google account enhances the 2-step verification. The purpose of the security key is to send an encrypted code rather than a plain code, which means the login cannot be phished. Services such as single sign-on enable a user to use multiple services with the same login. OAuth 2.0 and OpenID allow customers to extend this single sign-on service to multiple cloud solutions.
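The one-time-password tokens mentioned here are commonly generated with the TOTP algorithm (RFC 6238); Google's own implementation is not public, so the following is only an illustrative Python sketch of the standard HMAC-SHA1 variant:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)           # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code is derived from the current 30-second window, a phished value expires almost immediately, which is the property the post alludes to; the physical security key goes further by never exposing a typeable code at all.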
In addition to securing user, data management features are also provided to users and administrators.
Data management includes such things as securing information rights with the Drive audit log, Drive content compliance, and trusted domains for Drive sharing, as well as email security features such as secure TLS, phishing prevention, data loss prevention, content compliance, and restrictions on recipients and objectionable content.

#codingexercise
Find four elements a, b, c and d in an array such that a+b = c+d.
List<Tuple<int,int>> PairSums(List<int> nums, int sum)
{
    var ret = new List<Tuple<int,int>>();
    if (nums.Count <= 1) return ret;
    nums.Sort();
    int left = 0;
    int right = nums.Count - 1;
    while (left < right)
    {
        if (nums[left] + nums[right] == sum)
        {
            ret.Add(new Tuple<int,int>(nums[left], nums[right]));
            left++;
            right--;
        }
        else if (nums[left] + nums[right] < sum)
        {
            left++;
        }
        else
        {
            right--;
        }
    }
    return ret;
}
// CombinePairs (discussed previously) matches pairs with equal sums to yield a+b = c+d
CombinePairs(PairSums(nums, sum));