Saturday, January 28, 2017

Today we review the book "From Smart to Wise" by Kaipa and Radjou. The authors say that "In our experience, wise leadership succeeds where smart leadership cannot."
This book is about six capabilities that are essential for wise leadership: perspective, action orientation, role clarity, decision logic, fortitude and motivation.
It starts off by classifying smart leaders into two types: functional smart leaders, categorized as the blue zone, and business smart leaders, categorized as the red zone.
The blue zone leaders are experts in their field. They execute effectively. They are risk-averse. They are detail-oriented. They deliver exceptional results and they are efficient. Tim Cook is an example of this category.
The red zone leaders are visionaries. They see the big picture. They are risk takers. They are very competitive. They are strategic entrepreneurs. They are highly motivated. Bill Gates is considered an example of this category.
Wisdom is about being equally dexterous in both zones and knowing when to apply which one. It often comes with a perspective of what will benefit others, while smartness is more about oneself.
The authors suggest that we should figure out which zone we are operating in, decide where we are on our path, create a road map, and find tools to help us stay on the outlined path.
This book articulates that to be a wise leader, we must find and follow a noble purpose that will guide us like the North Star. In fact, this perspective is considered the most critical capability. For wise leaders, this perspective means becoming sensitive to the context around them and being able to see the world without any filters. Since this perspective transcends personal gain and ego, it gives meaning and a path to happiness. Wise leaders exhibit mindfulness and have a growth mindset.
A leader's actions need to be authentic and appropriate. Authentic means being true to oneself. Appropriate means the actions are done in context. We often fall short here, and the way to bridge the gap is alignment: what we say must line up with what we do, what we say must line up with what we actually feel, and who we are must line up with what we do.
Role clarity, as per the authors, is the capability to perform a chosen role convincingly and with enthusiasm, without losing a sense of who we are behind that role. Leaders with role clarity demonstrate leadership in any role and are good team players. They empower others and amplify the smarts and capabilities of the people around them.
To quote the authors, decision logic is the system, processes and principles of reasoning used in making important decisions. This capability is demonstrated when decisions exhibit ethical clarity and pragmatism, are not complicated by emotions, and alternatives are weighed while listening to all viewpoints.
Flexible fortitude is knowing when to hold and when to fold. We need to strike a balance between having the courage to stay the course and the flexibility to change direction. Leaders in both the blue and the red zones demonstrate great fortitude but risk being inflexible, whereas the wise leader is comfortable striking a balance.
Finally, wise leaders believe that what is in the public interest will eventually be in the interest of all individuals and groups, including themselves. They demonstrate strong intrinsic motivation and are keen on serving.

#codingexercise
Yesterday we described a coding exercise to find the maximum product of a subsequence in an array where the selected elements are not adjacent. If the integers can be both positive and negative, we need to make sure the count of selected negatives is even. We keep track of two products, one with the negative integers only and the other with the positives.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.

The same logic holds for the sum, but the skipped elements should have a value of zero.

int GetMaxSumBothPositiveAndNegative(List<int> A)
{
    // zero out the negatives so that the sum simply skips them
    var positives = new List<int>(A);
    for (int i = 0; i < positives.Count; i++)
        if (positives[i] <= 0)
            positives[i] = 0;
    return GetMaxSum(positives);
}
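As a quick sanity check, here is a hypothetical usage sketch; the array values are made up for illustration, and GetMaxSum is the routine from the earlier post below.
void TestGetMaxSumBothPositiveAndNegative()
{
    var A = new List<int> { 5, -5, 10, -100, 10, 5 };
    // the negatives are zeroed to 5, 0, 10, 0, 10, 5 and the best
    // non-adjacent selection is 5 + 10 + 10 = 25
    Console.WriteLine(GetMaxSumBothPositiveAndNegative(A)); // 25
}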

Friday, January 27, 2017

Today we continue to read up on the compliance overview of Adobe Cloud Services. Adobe is able to meet these standards and regulations because it practices a bottom-up, physical-layer securing approach called the Adobe Common Controls Framework (CCF) and a top-down, software-layer securing approach called the Adobe Secure Product Lifecycle (SPL). The former manifests as a set of Adobe-specific controls that map to approximately a dozen security standards, and the latter is a set of security activities that span several product development practices, processes and tools. Therefore there is a layering of software supported on both infrastructure and operations, which in turn is available over a physical layer. This stack is secured from both directions, top and bottom, with the SPL and the CCF.
The CCF is built so that it is not specific to any of the products. Instead, teams can inherit control capabilities from other parts of the organization. This way a software engineer does not have to be concerned about data center security but can inherit it from the data center operations team. The CCF conceptual model places Adobe corporate standards such as business continuity, security governance, data privacy, and people resources over product engineering, which rests on product operations, which in turn operates on the Adobe product cloud that assures physical and environmental security. Product engineering adopts the SPL approach, and product operations enforces monitoring, configuration, asset management, change management, logical access and incident response. The CCF has approximately 273 common controls across twenty control domains. These include asset management, backup management, business continuity, change management, configuration management, data management, identity and access management, incident response, mobile device management, network operations, people resources, risk management, security governance, service lifecycle, site operations, system design documentation, systems monitoring, third party management, training and awareness, and vulnerability management.
Compliance is reviewed periodically.
#codingexercise
Yesterday we described a coding exercise to find the maximum product of a subsequence in an array where the selected elements are not adjacent. If the integers can be both positive and negative, we need to make sure the count of selected negatives is even. We keep track of two products, one with the negative integers only and the other with the positives.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.
int GetMaxProductBothPositiveAndNegative(List<int> A)
{
    // replace the negatives with one so that they do not affect the product
    var positives = new List<int>(A);
    for (int i = 0; i < positives.Count; i++)
        if (positives[i] <= 0)
            positives[i] = 1;
    var max_positives = GetMaxPositiveList(positives);
    // replace the positives with one so that only the negatives remain
    var negatives = new List<int>(A);
    for (int i = 0; i < negatives.Count; i++)
        if (negatives[i] >= 0)
            negatives[i] = 1;
    var max_negatives = GetMaxNegativeList(negatives);
    // GetMaxNegativeList, RemoveAdjacencies and Product are helpers used by this exercise
    return RemoveAdjacencies(max_positives, max_negatives, A).Product();
}
List<int> GetMaxPositiveList(List<int> A)
{
    var ret = new List<int>();
    if (A == null || A.Count == 0) return ret;
    assert(A.All(x => x > 0));
    int inclusive = A[0];
    int exclusive = 1; // empty product, so the first multiplication is not zeroed out
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        // record the previous element whenever including it gave the running maximum
        if (max == inclusive)
            ret.Add(A[i-1]);
        inclusive = exclusive * A[i];
        exclusive = max;
    }
    int last = Math.Max(inclusive, exclusive);
    if (last == inclusive)
        ret.Add(A[A.Count-1]);
    return ret;
}
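GetMaxNegativeList, RemoveAdjacencies and Product are left as helpers above. For small inputs, a brute-force checker can validate whatever the combined routine produces; this is a hypothetical helper, not part of the original exercise, that enumerates every non-adjacent subset and keeps the best product with an even count of negatives.
long GetMaxProductBruteForce(List<int> A)
{
    long best = long.MinValue;
    for (int mask = 1; mask < (1 << A.Count); mask++)
    {
        // reject selections that pick two adjacent elements
        if ((mask & (mask << 1)) != 0) continue;
        long product = 1;
        int negatives = 0;
        for (int i = 0; i < A.Count; i++)
        {
            if ((mask & (1 << i)) == 0) continue;
            product *= A[i];
            if (A[i] < 0) negatives++;
        }
        // the count of selected negatives must be even
        if (negatives % 2 != 0) continue;
        best = Math.Max(best, product);
    }
    return best;
}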

Thursday, January 26, 2017

Today we continue to describe migratory assets. A significant percentage of end users do not care where their assets come from. Consequently we can move assets from public cloud to public cloud, from public cloud to private cloud, or even from private cloud to private cloud, while letting users access their resources by name and, in some cases, by the same IP.
There are two factors contributing to this: 
1) a large number of requests are fine-grained, involving only a small number of assets
2) Most assets are managed by system center managers that don't depend on the host but only on the agent running inside the asset. 
These promote migratory assets. 
Another important criterion is the offloading of infrastructure activities to existing automations in the public cloud. Take, for example, the periodic evaluation of unused resources and the reminders to customers. This is no longer done with custom or proprietary automations. Instead, the same technologies that monitor usage and trigger alerts in the public cloud can be reused. There are significant savings in reusing out-of-the-box technologies. First, the tools and technologies are widely accepted and have already matured through feedback across organizational usage. Second, there is no need to increase the ownership and maintenance of in-house tools and technologies that serve the same purpose.
By a similar counterargument, there is always a place for in-house tools and automations that can benefit private cloud implementations and boost the overall offering to the customer. Quick-and-dirty solutions are easily facilitated in a private cloud, and we welcome an independently controlled private cloud as an alternative offering to customers. For example, we can create clusters and containers in a private cloud that benefit the user as additional resources.
Lastly, applications and services can leverage appliances, technologies and services that are not available in a public cloud. We have storage appliances, network accelerators, firewalls, routers and many other technologies that apply only in a private cloud. These then become foundations for Infrastructure as a Service, Platform as a Service and Software as a Service layers, which bring their own advantages. Arguably, though, these upper layers should for the most part not differentiate between whether resources are requested from a private cloud or from public cloud providers.
#codingexercise
Find the maximum product of a subsequence of an array of integers such that no two elements are adjacent.
int GetMaxProduct(List<int> A)
{
    if (A == null || A.Count == 0) return 0;
    assert(A.All(x => x > 0));
    int inclusive = A[0];  // best product of a selection that includes A[i]
    int exclusive = 1;     // best product of a selection that excludes A[i]; the empty product is 1
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        inclusive = exclusive * A[i];
        exclusive = max;
    }
    return Math.Max(inclusive, exclusive);
}
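A quick, hypothetical walkthrough with made-up values: for {1, 5, 4, 3} the best non-adjacent product is 5 * 3 = 15.
void TestGetMaxProduct()
{
    var A = new List<int> { 1, 5, 4, 3 };
    // candidate selections: {1,4} = 4, {1,3} = 3, {5,3} = 15, and the singletons
    Console.WriteLine(GetMaxProduct(A)); // 15
}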
If the integers were both positive and negative, we would need to make sure the count of selected negatives is even. We would keep track of two products, one with the negative integers only and the other with the positives.
The key observation here is that we skip the negatives and find the desired max among the positives. Then we take the negatives alone and find their desired max with the additional criterion that they have to be even in count. Whichever is greater dictates the list that will be included as is, and elements of the other list will be included such that there are no adjacencies.

Wednesday, January 25, 2017

Migratory Assets
Today many organizations use one or more public clouds to meet the demand for compute resources from their employees. A large number of these requests are fine-grained, where customers request a handful of virtual machines for their private use. Usually not more than twenty percent of the customers have demands that are very large, ranging to a hundred or more virtual machines.

There are a few challenges in servicing these demands. First, the virtual machines of individual customers are sticky. Customers don't usually release their resources, and they identify them by name or IP address in their day-to-day work. They host applications, services and automations on their virtual machines and often cannot let go of a virtual machine unless the files and programs have a migration path to another compute resource. Typically they do not take the step of creating regular backups and keeping the resource mobile. While container platforms for Platform-as-a-Service (PaaS) have enabled software to be deployed without any recognition of the host and to be frequently rotated from one host to another, end users' adoption of a PaaS platform depends on the production readiness of their applications and services. The push for PaaS adoption has made little or no change to the use and proliferation of virtual machines by individual users. Therefore the second criterion is that individual users' habits don't change.
As a consequence of these two factors, the cloud services provider can package services such as additional storage, a regular backup schedule, a patching schedule, system management, securing and billing at the time of request for each asset. However, such services depend on the cloud where they are requested. For a private cloud, much of this service is in-house, adding to the costs even if the inventory is free.

I describe how a significant percentage of end users do not care where their assets come from. Consequently we can move assets from public cloud to public cloud, from public cloud to private cloud, or even from private cloud to private cloud, while letting users access their resources by name and, in some cases, by the same IP.

#codingexercise
Find the maximum sum of a subsequence of an array of integers such that no two selected elements are adjacent.
int GetMaxSum(List<int> A)
{
    if (A == null || A.Count == 0) return 0;
    int inclusive = A[0];  // best sum of a selection that includes A[i]
    int exclusive = 0;     // best sum of a selection that excludes A[i]
    for (int i = 1; i < A.Count; i++)
    {
        var max = Math.Max(inclusive, exclusive);
        inclusive = exclusive + A[i];
        exclusive = max;
    }
    return Math.Max(inclusive, exclusive);
}
Courtesy: geeksforgeeks
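A quick check with a made-up example: for {5, 5, 10, 100, 10, 5} the best non-adjacent sum is 5 + 100 + 5 = 110.
void TestGetMaxSum()
{
    var A = new List<int> { 5, 5, 10, 100, 10, 5 };
    Console.WriteLine(GetMaxSum(A)); // 110 (5 + 100 + 5)
}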

Tuesday, January 24, 2017

We read how Google protects data. While Google is an innovator in two-step authentication and encryption methods and may even have a larger scale of operations than many other companies, all that other companies need to do is ensure that they do not fall short of the considerations behind Google's security efforts. Their implementations can vary, but the threats will have been mitigated. Where many companies may lead the initiative is in showing how to use the public cloud by setting security standards for its adoption. I enumerate the areas of control. These include:
1) network integration between on-premise and public cloud
2) penetration testing of virtual private network in the public cloud
3) Active Directory connector between virtual private network and on premise
4) System Center Manager for public cloud assets
5) Setting up of monitoring and alerts on all threat related operational activities
6) Successful key management for all assets and operations related key usages
7) Isolation among security groups or access controls for individual instances and groups.
8) Metering and reports on all usages so that anomalies can be detected
9) Metrics on all distributions across regions and fault tolerance
10) Size of large groups and the percentage of fine granularity in resource groups
11) Time-based variations over a period

Most public cloud providers publish their own security guidelines and practices while providing documentation and community support forums. These best practices cover many tools and features that the cloud providers offer to empower the companies that adopt their public clouds. However, the degree to which each organization uses these clouds and its compliance with minimum security standards is largely not governed. The contract between public cloud providers and organizations safeguards the public cloud provider but does not secure the end user or their data in all aspects of their usage. For example, with the use of a public cloud for private networks, an organization may or may not guarantee isolation of user data on the same server, its robustness and reliability against loss, theft or mishandling, its availability for export when the user no longer wishes to use the services, and its portability. Moreover, administrators have their own administration tasks that require a certain level of security compliance, as we saw with our coverage of how Google secures its data. These tasks include writing rules that dictate data retention, showing responsibility towards legal hold and law enforcement response, maintaining transparency in their actions and the publication of their reports, and showing discretion in other maintenance activities.
Therefore, adoption of public cloud computing for organizational needs requires certain criteria that can evolve as an internal security compliance requirement.
More reading at https://1drv.ms/w/s!Ashlm-Nw-wnWrF0qVoeSgL7Xnu1w


#codingexercise
Yesterday we were discussing a coding exercise to sort a doubly linked list of 0s and 1s.
It has a helper function to cut an element and insert it before or after the transition point. We implemented the insertion after the transition point yesterday; today we implement the insertion before the transition point.
dll CutAndInsertBefore(dll transition, dll current, dll root)
{
    assert(current != null && root != null);
    if (transition == null) return root; // nothing to do
    if (transition == current) return root;
    if (current == root) return root; // this op applies only when current comes after transition
    // root == transition is acceptable
    // cut current out of the list
    var temp = current.next;
    if (temp != null)
        temp.prev = current.prev;
    if (current.prev != null)
    {
        current.prev.next = current.next;
        current.next = null;
    }
    // insert current before transition
    if (transition.prev != null)
        transition.prev.next = current;
    else
        root = current; // transition was the head, so current becomes the new head
    current.next = transition;
    current.prev = transition.prev;
    transition.prev = current;
    return root;
}

Monday, January 23, 2017

Today we conclude reading up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, and they use penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between the data centers. They build redundancy into their server design, data storage, network and internet connectivity and even the software services. Their audits cover compliance standards and we reviewed some of them. We also reviewed some of the tools used to empower users and administrators to improve security and compliance. We were discussing user authentication, data management and email management features. We reviewed data retention, which is facilitated by the eDiscovery feature. Administrators can specify retention rules that determine how long certain messages are retained before they are removed from user mailboxes and expunged from all Google systems. Administrators can enforce policies over mobile devices in their organization, encrypt data on devices, and perform actions like remotely wiping or locking lost or stolen devices. Administrators can also recover data by restoring a deleted user account or restoring a user's drive or gmail data. They can publish reports that provide transparency to their actions.
Thus Google protects data in its infrastructure, applications and personnel operations. The same security considerations are acknowledged by the industry. We mentioned a comparison to Adobe's efforts in this regard. While Google is an innovator in two-step authentication and encryption methods and may even have a larger scale of operations than many other companies, all that other companies need to do is ensure that they do not fall short of the considerations behind Google's security efforts. Their implementations can vary, but the threats will have been mitigated. Adobe has secured its data by virtue of the very same compliances that Google has made. These include SOC, ISO 27001, FedRAMP and the Payment Card Industry Data Security Standard. In addition, Adobe fulfils regulatory compliances such as the Gramm-Leach-Bliley Act (GLBA) for financial institutions, HIPAA (the Health Insurance Portability and Accountability Act) together with protected health information (PHI), the Code of Federal Regulations Title 21, and the US Family Educational Rights and Privacy Act.
Adobe is able to meet these standards and regulations because it practices a bottom-up, physical-layer securing approach called the Adobe Common Controls Framework and a top-down, software-layer securing approach called the Adobe Secure Product Lifecycle. The former manifests as a set of Adobe-specific controls that map to approximately a dozen security standards, and the latter is a set of security activities that span several product development practices, processes and tools.
Where Adobe could become a pioneer is in putting together a security standard for using hybrid or public clouds for organizational use. While Google claims its public cloud has better security than on-premise computing, Adobe could be the one to show that its embrace of the public cloud is more secure than the on-premise computing of many other companies.
#codingexercise
Yesterday we were discussing a coding exercise to sort a doubly linked list of 0s and 1s.
It has a helper function to cut an element and insert it before or after the transition point. It goes like this:
dll CutAndInsertAfter(dll transition, dll current, dll root)
{
    assert(current != null && root != null);
    if (transition == null) return root; // nothing to do
    if (transition == current) return root;
    if (current == root) return root; // this op applies only when current comes after transition
    // root == transition is acceptable
    // cut current out of the list
    var temp = current.next;
    if (temp != null)
        temp.prev = current.prev;
    if (current.prev != null)
    {
        current.prev.next = current.next;
        current.next = null;
    }
    // insert current after transition
    if (transition.next != null)
        transition.next.prev = current;
    current.next = transition.next;
    current.prev = transition;
    transition.next = current;
    // root remains unmodified for insertion after transition
    return root;
}

The CutAndInsertBefore method is similar to the one above, except that the root might change.

Sunday, January 22, 2017

Today we continue to read up on how Google protects data. Google has incredible expertise in this area. Security is an integral part of their operations. They scan for security threats using tools, and they use penetration testing, quality assurance processes, security reviews and external audits. They have a rigorous incident management process for security events that affect the confidentiality, integrity or availability of systems or data. They apply a defense-in-depth perspective to securing their custom-designed servers, proprietary operating systems and geographically distributed data centers. Data is encrypted in transit, at rest and on backup media, whether moving over the internet or traveling between the data centers. They build redundancy into their server design, data storage, network and internet connectivity and even the software services. Their audits cover compliance standards and we reviewed some of them. We also reviewed some of the tools used to empower users and administrators to improve security and compliance. We were discussing user authentication, data management and email management features. We now review data retention, which is facilitated by the eDiscovery feature. eDiscovery is the means to prepare for lawsuits and other legal matters. Google Vault is the eDiscovery solution for Google Apps that lets customers retain, archive, search and export their business gmail. Administrators can specify retention rules that determine how long certain messages are retained before they are removed from user mailboxes and expunged from all Google systems. Retention rules can be specified based on duration, content, labels and delegation. A legal hold is a binding requirement on a user to preserve all their emails and on-the-record chats indefinitely. In a legal hold, messages may be deleted from the user's view but not from the message servers. Email and chat can be searched using user accounts, organizational units, date or keyword. Results can be exported as evidence reports.
Endpoints are also secured. For example, administrators can enforce policies over mobile devices in their organization, encrypt data on devices, and perform actions like remotely wiping or locking lost or stolen devices. This comes with mobile device management in Google Apps. Chrome browser security can be managed by policies specified by the administrator remotely; there are over 280 policies that can be specified. The Google Apps admin console applies policy to Chrome devices.
Administrators can also recover data by restoring a deleted user account or restoring a user's drive or gmail data. They can publish reports that provide transparency to their actions.
#codingexercise
Given a doubly linked list of 0 and 1 integers, sort it in place.

dll Sort0And1(dll root)
{
    if (root == null || root.next == null) return root;
    var current = root;
    dll transition = null; // first node of the 1s region
    while (current != null)
    {
        if (current.data == 1)
        {
            if (transition == null)
                transition = current;
            root = CutAndInsertAfter(transition, current, root);
        }
        else
        {
            root = CutAndInsertBefore(transition, current, root);
        }
        current = current.next;
    }
    return root;
}
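The posts above use a dll node type without defining it. A minimal sketch of the shape these routines assume, using the field names data, prev and next from the code, could be:
class dll
{
    public int data;
    public dll prev;
    public dll next;
}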