Thursday, January 11, 2024

 

This is a summary of a book titled “Unsupervised” written by Stuart Leaf and Daniel Doll-Steinberg, both of whom are investors in emerging technologies. They argue that technological innovation is accelerating at an exponential rate, leading to a "Cognitive Revolution." Artificial intelligence (AI) is a foundational technology of the Cognitive Revolution, with various types of AI being developed. These include weak or narrow AI, strong or artificial general intelligence (AGI), and artificial super intelligence (ASI). AI can hold conversations, create visual art, write computer code, and compose graduate school-level papers. Methodologies for simulating intelligence in machines include machine learning, neural networks, deep learning, and generative pre-trained transformer models.

Quantum computing, another foundational technology, could significantly increase computing power. Traditional computers store information in bits, each representing a binary choice, while quantum computers store information in quantum bits (qubits), which can represent a vast range of outcomes at once. Quantum computers can therefore run many calculations simultaneously, producing many possible conclusions. However, developers must resolve technical problems before general-purpose quantum computers become common. The Cognitive Revolution is expected to affect many sectors of society, eliminating some jobs and creating others.

Advances in wireless communication, such as 5G cellular and satellite networks, will be crucial to the evolution of foundational technologies such as the Internet of Things. Enabling technologies, such as blockchain, smart contracts, decentralized autonomous organizations (DAOs), and tokenization, will facilitate their adoption. Hardware advances will fuel the Cognitive Revolution, while software, cognitive capabilities, miniaturization, and improved interfaces grant robots and other physical technologies increased functionality. The metaverse, a digitally generated virtual world, could revolutionize commerce and entertainment. The energy sector will experience significant change, with emerging technology bringing new efficiencies to solar, wind, and geothermal power. AI and other emerging technologies will affect many aspects of healthcare, from research to treatment. Fintech is poised to change banking, investment, and the broader financial industry. Overall, the future of technology depends on everyone's ownership and participation.

Technology is transforming many sectors of the economy, including manufacturing, education, healthcare, and leisure and recreation. Advances in additive manufacturing, such as 3D printing, are accelerating the adoption of digitally controlled automation to cut costs and improve speed, quality, and safety. This shift could reduce dependence on foreign manufacturers, a concern highlighted by the COVID-19 era's supply-chain issues. Machine-learning advancements could turn computer programs into information gatekeepers, potentially eliminating as many as 300 million jobs. Eventually, machines may take control of their own development, writing their own code and installing their own updates. Employment in the face of frontier technological advances will depend on existing jobs that can be done differently and on new jobs that do not yet exist. The shift to digital newspapers and the rise of computers and smartphones also drive physical dematerialization, while computers increasingly take over human reasoning and skills to drive cognitive dematerialization, as in securities trading.


Wednesday, January 10, 2024

 

Some of the design choices for infrastructure consolidation fall into well-known patterns, including the following:

1.       Dedicated instance: In this pattern, one dedicated instance serves or maintains other instances of the same resource type. A case in point is the deployment of Overwatch monitoring to Databricks workspaces: a single workspace is chosen to host the dashboards that Overwatch creates, while logs, events, and metrics flow in from the associated workspaces. This pattern draws minimal resource utilization from the other participating instances and facilitates overall cost management and reporting of resource utilization from the dedicated instance.

2.       Separation of resources by purpose: Compute and storage often go hand in hand in many deployments. Depending on the purpose of the deployment, or of a group within it, one may see heavier use than the other, calling for deployment forms that lean either way. One such example is Kubernetes versus app services. Applications that are not necessarily stateful and are more compute-oriented can become more cloud-native in the form of Function and Web applications, whereas a Kubernetes deployment makes it easier to move entire applications that use both compute and storage from on-premises to the cloud. This pattern also makes scaling and load balancing independent for compute and storage.

3.       Colocation of data: It is easy to put data in an external storage account or S3 bucket with little or no thought about whether data is being written across regions, or whether each write is replicated across regions, especially when a reference to the store can easily be shared among multiple applications. The same goes for messaging infrastructure. Colocating messages with their subscribers, and storage accounts with their data consumers, is not only beneficial for performance and cost but also a necessity for maintenance, assignment, and scoping. Colocation also makes object iteration easier, especially when objects share the same prefix in their addressability, and grouping objects by account helps with account throttling. Isolation improves as well, which further helps with troubleshooting. A small sketch of a colocation check appears after this list.

4.       Vertical integration: Since compute and storage go together, it is possible to store compute logic as modules alongside the data and associate both with the same catalog. This helps the compute locate the data locally and render results faster to the caller. It is not only a performance improvement but also a convenience, since the logic can be shared in the same way as the data. Databases have used this to their advantage, and the design has stood the test of data migration to the cloud.

5.       Infrastructure layering: This pattern is evident in container infrastructure layering, which allows even more scale because it virtualizes the operating system. While traditional virtual machines enable hardware virtualization and Hyper-V adds isolation plus performance, containers are cheap and barely anything more than the applications themselves. Azure Container Service serves both Linux and Windows containers, with standard Docker tooling and API support and streamlined provisioning of DC/OS and Docker Swarm. Job-based computations use larger sets of resources, such as compute pools that involve automatic scaling, regional coverage, automatic recovery of failed tasks, and input/output handling. Azure also demonstrates many fine-grained, loosely coupled microservices, such as an HTTP listener, page content, authentication, usage analytics, order management, reporting, product inventory, and customer databases.
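To make the colocation pattern (item 3) concrete, here is a minimal sketch in Python that flags compute resources reading from stores in a different region. The inventory shape and field names are assumptions for illustration only; a real check would draw this metadata from the cloud provider's resource inventory.

```python
# Minimal sketch of a colocation check over a hypothetical resource inventory.
# The inventory shape (name, type, region, reads) is an assumption for
# illustration, not the output of any specific cloud API.

inventory = [
    {"name": "ordersdata", "type": "storage", "region": "eastus"},
    {"name": "orders-api", "type": "compute", "region": "eastus", "reads": ["ordersdata"]},
    {"name": "reporting-job", "type": "compute", "region": "westus", "reads": ["ordersdata"]},
]

def colocation_violations(resources):
    """Return (consumer, store) pairs that live in different regions."""
    region_of = {r["name"]: r["region"] for r in resources}
    violations = []
    for r in resources:
        for store in r.get("reads", []):
            if store in region_of and region_of[store] != r["region"]:
                violations.append((r["name"], store))
    return violations

if __name__ == "__main__":
    for consumer, store in colocation_violations(inventory):
        print(f"{consumer} reads {store} across regions -- consider colocating them")
```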

Previous articles: IaCResolutionsPart62.docx

Tuesday, January 9, 2024

The power of infrastructure consolidation:


Infrastructure deployed to the public clouds is often characterized by shared-nothing deployments dedicated to different workloads. With the convenience of repeatable and automated deployments via Infrastructure-as-Code (IaC), it becomes easy to spin up as many resources as necessary to separate the usages. This is advantageous in many ways. Deployments can evolve differently and have different lifetimes and growth spurts. Ownership can be handed off to independent teams. Access control can be set up differently. Different workloads can require different tunings, which might manifest as resource-level settings and configurations. The deployments can scale independently. It is also possible to bill them to different accounts, or at least to tag them differently. State may be captured differently in the tags and labels of resources of the same type across those deployments. The explosion in the number of resources is not a concern for automation that repeats the same steps for each. Workloads can be studied more effectively, and they do not share the same fate. Troubleshooting and maintenance become easier and cheaper. It is even possible to establish a control set as a baseline.
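As a rough illustration of how such per-workload separation can stay manageable, the sketch below applies one tagging convention uniformly across shared-nothing deployments so that ownership, billing, and lifetime remain visible. The workload catalog and tag keys are assumptions for illustration, not a requirement of any particular cloud.

```python
# A minimal sketch, assuming a simple in-code catalog of workloads; the tag
# keys (workload, owner, cost-center, lifecycle) are illustrative conventions.

workloads = {
    "ingestion": {"owner": "data-eng", "cost_center": "CC-101", "lifecycle": "long-lived"},
    "ml-training": {"owner": "ml-platform", "cost_center": "CC-202", "lifecycle": "ephemeral"},
}

def tags_for(workload: str) -> dict:
    """Build the tag set that every resource in a workload's deployment receives."""
    meta = workloads[workload]
    return {
        "workload": workload,
        "owner": meta["owner"],
        "cost-center": meta["cost_center"],
        "lifecycle": meta["lifecycle"],
    }

# The same IaC template can then be stamped out once per workload,
# differing only in tag values and any workload-specific tuning.
for name in workloads:
    print(name, tags_for(name))
```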

But consolidation has its own merits. For example, web applications deployed to independent App Services can be hosted on the same App Service plan when they do not have widely varying requirements. A single App Service plan requires only one subnet for virtual network integration to guarantee that all outbound traffic goes via the virtual network, which is especially helpful for robust private-plane connectivity between the storage and compute resources these web applications might need. Cost savings come from avoiding higher-SKU App Service plans as well as from associated resources such as subnets and networking appliances. Consolidation provides another level of differentiation between infrastructure and workload perspectives, albeit at the resource level instead of the resource-group level. While simultaneously deployed resources and their dependencies are easy to interpret from the written IaC, the savings have a ripple effect on other non-functional aspects such as logging and monitoring. With a reduction in maintenance, consolidation also lowers costs for skills, training, and dedicated staffing. Consolidation as a strategy is even a prerequisite for creating service tiers so that quality of service (QoS) can be better articulated. Resource pooling and the classification of workloads to pools is an established pattern for improving resource utilization in any infrastructure, whether on-premises or at cloud scale.
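To show how such a consolidation decision might be reasoned about, here is a small, hypothetical grouping heuristic in Python that places web applications onto a shared plan only when their SKU matches and their combined load stays under a chosen threshold. The attributes and thresholds are assumptions for illustration; a real heuristic would also weigh memory, scaling rules, and deployment slots.

```python
# A rough sketch of a consolidation heuristic for App Service plans.
# The app attributes (sku, expected_rps) and the threshold are illustrative.

from itertools import groupby

apps = [
    {"name": "catalog", "sku": "P1v3", "expected_rps": 50},
    {"name": "checkout", "sku": "P1v3", "expected_rps": 80},
    {"name": "reports", "sku": "P2v3", "expected_rps": 10},
]

def plan_key(app):
    # Apps only share a plan when they need the same SKU.
    return app["sku"]

def group_into_plans(candidates, max_rps_per_plan=200):
    """Pack apps into shared plans per SKU, splitting when load exceeds the cap."""
    plans = []
    for sku, members in groupby(sorted(candidates, key=plan_key), key=plan_key):
        current, load = [], 0
        for app in members:
            if current and load + app["expected_rps"] > max_rps_per_plan:
                plans.append((sku, current))
                current, load = [], 0
            current.append(app["name"])
            load += app["expected_rps"]
        if current:
            plans.append((sku, current))
    return plans

for sku, members in group_into_plans(apps):
    print(f"one plan ({sku}): {members}")
```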

Even the savviest cloud account owners are wary of ballooning costs for resources over time, and many features are available natively from the cloud to address some of these concerns, but no cloud can truly understand the resource utilization or possible refactoring without some involvement from those owners. A case in point is the use of specific resource types and monitoring practices to set up feedback loops. AWS and Azure offer resource types such as X-Ray and Application Insights to analyze and debug distributed applications; these collect data about the requests an application serves and provide tools to view, filter, and gain insights into that data. Yet owners seldom take advantage of setting up monitoring and dashboards for all their assets. Infrastructure providers, that is, IaC deployers, are left to fill the gap, but they have one advantage that others don't: they can do this for entire resource groups and deployments, not just specific web applications or resource types. Even if the dashboards they create and maintain to study infrastructure usage patterns remain private, they are empowered to advise the account owners on tactical consolidations. They are not limited to making inferences from cost-management and utilization dashboards, and the patterns they observe can also help them validate their choice of SKUs, reservations, and other assertions. They are no strangers to alerts, notifications, and resource-level chores, but they could benefit from creative ways to set up deployment-scope feedback cycles.
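One way such a deployment-scope feedback cycle might look is a periodic sweep over the resource groups a deployer owns, flagging resources that emit no telemetry. The helpers below (list_resources, has_monitoring) are hypothetical placeholders rather than real SDK calls; in practice they would wrap the provider's inventory and diagnostics APIs.

```python
# A sketch of a deployment-scope feedback check. list_resources() and
# has_monitoring() are hypothetical stand-ins, not calls from any real SDK.

def list_resources(resource_group: str):
    # Placeholder inventory; in practice this would come from the cloud's
    # resource listing for the group.
    return [
        {"name": "web-frontend", "type": "web_app", "monitored": True},
        {"name": "orders-db", "type": "database", "monitored": False},
    ]

def has_monitoring(resource: dict) -> bool:
    # Placeholder for "does this resource emit to a workspace or dashboard?"
    return resource.get("monitored", False)

def monitoring_gaps(resource_groups):
    """Yield (group, resource) pairs that lack monitoring, as advisory output."""
    for group in resource_groups:
        for resource in list_resources(group):
            if not has_monitoring(resource):
                yield group, resource["name"]

for group, name in monitoring_gaps(["rg-shop-prod"]):
    print(f"{group}/{name} has no monitoring configured -- candidate for a shared dashboard")
```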


Monday, January 8, 2024

 

This is a summary of the book “No Limits” by John C. Maxwell, published by Center Street in 2018. Maxwell is a renowned leadership author who emphasizes the importance of maximizing personal abilities and overcoming mental limits. He suggests that by eliminating the ways we hold ourselves back and focusing on our strengths, we can move from supposed limits to limitless potential. Maxwell's advice is to focus on and expand our strengths rather than dwell on our weaknesses. To achieve our full potential, we must unleash ourselves and undergo significant change.

The story of Mount Everest, where George Mallory led three unsuccessful expeditions, serves as a reminder of the potential of human beings to grow and improve. By focusing on our strengths and not dwelling on our weaknesses, we can make the right choices and achieve our goals.

Maxwell's advice is to discard any limits we have placed on ourselves and focus on building our strengths rather than dwelling on our weaknesses. This approach is similar to how most pastors spend their time equipping their parishioners, helping them extend and fortify their strengths. Achieving our full potential requires embracing change and taking action to build our abilities. Reaching our full capacity calls for meaningful changes, which are rarely easy or quick. Investing our energy and time wisely, even in small ways, can have a positive effect on our lives. It is crucial to recognize our limitations and strive to become the best version of ourselves.

Nick Vujicic, a motivational speaker born with limited physical capacities, exemplifies this concept. He was rejected by the first 52 organizations he approached for a paid speaking engagement, but eventually, he received 35,000 requests annually to speak to groups worldwide.

To improve our capacity, we focus on two areas: energy capacity and emotional capacity. Energy capacity involves utilizing physical abilities and managing energy more effectively than time. Emotional capacity involves becoming the master of our emotions, aiming to become emotionally strong and able to cope with challenges. By taking charge of our emotions, we can better handle problems, change, and stress.

The book highlights the importance of developing various capacities in individuals.

Thinking capacity involves logical reasoning and creating a system for organizing and recording ideas. People capacity involves establishing and maintaining relationships; individuals who excel in this area focus on helping others achieve their goals. Creative capacity involves finding viable solutions to problems and sharing ideas with others. Productive people are futurists who learn from the past and leverage the present to shape the future.

Responsibility capacity involves taking control of one's life and accepting accountability. Character capacity is built on values and is the foundation of success. Abundance capacity involves having an abundance mindset, believing in the possibility of good things and the possibility of improvement. Discipline capacity involves exercising self-discipline, controlling habits, and never making pace an excuse for failing to fulfill responsibilities.

Intentionality capacity involves understanding what is meaningful and significant in life, which includes deliberateness, consistency, and willfulness. Attitude capacity involves maintaining a positive attitude, taking intelligent risks, and building spirituality. Risk capacity involves taking risks to move forward and adjusting accordingly. Spiritual capacity provides optimism and strength, while partnership capacity multiplies the impact of other efforts. Growth capacity releases the other capacities, leading to personal confidence and autonomy in building and directing one's life and career. By focusing on these capacities, individuals can achieve greater success and personal growth.

In this way, Maxwell offers seventeen areas to choose from in becoming stronger and the best we can be: Energy capacity, Emotional capacity, Thinking capacity, People capacity, Creative capacity, Production capacity, Leadership capacity, Responsibility capacity, Character capacity, Abundance capacity, Discipline capacity, Intentionality capacity, Attitude capacity, Risk capacity, Spiritual capacity, Partnership capacity, and Growth capacity.

Previous Book Summary: BookSummary36.docx

Summarizing Software: SummarizerCodeSnippets.docx

#codingexercise: CodingExercise-01-08-2024.docx

Sunday, January 7, 2024

 

This is a summary of the book “The Age of Prediction” by Christopher Mason and Igor Tulchinsky, published by MIT Press in 2023. The book highlights the potential of AI algorithms and big data to reduce uncertainty and risk across industries. The authors suggest that while exponential data growth won't eliminate risk, it can create both dystopian possibilities and life-saving solutions. Future employers could use genetic markers for performance prediction, while AI-enabled weapons could threaten humanity. More accurate predictive polling could lead to voter manipulation, threatening democracy. As humankind's ability to make accurate predictions increases, complex new risks emerge. The COVID-19 pandemic exemplified this, with BioNTech using machine-learning-trained algorithms to create a COVID-19 vaccine in just nine months. The exponential growth of data production has led to increased competition and turbulence, requiring companies to embrace flexibility and empiricism. As the world's data continues to grow exponentially, it is essential to leverage diverse perspectives when building models to gain better insights from big data.

Predictive models can be more effective when leveraging diverse data sets, including proxy data from interconnected sources. For example, Kinsa's smart thermometers helped researchers predict COVID-19 outbreaks more accurately. WorldQuant Predictive uses an approach called "idea arbitrage" to apply financial prediction tools across industries. However, exponential data growth won't eliminate risk, as complex systems are constantly in flux. As predictive accuracy increases, "moral hazard" can increase when actors perceive diminished risks. The garden of biology and data is constantly growing and changing, and data scientists must re-analyze older data when newer data emerges. New predictive tools create both dystopian possibilities and life-saving solutions. Scientists can use cell-free DNA analysis to predict health problems, heart transplant success, and detect tissue death. The body's signals can also be interpreted, such as mitochondrial DNA serving as a warning system for stress or suicide.

Predictive technologies have led to a demand for data that many people view as intimate or private. As data-enabled predictions become more ubiquitous, eroding privacy, new risks such as genetic discrimination and weaponizing identity tracing arise. Insurers could easily change life insurance premiums if an individual had genetic markers indicating a vulnerability to certain diseases. As human life is increasingly quantified and scientists collect new forms of data, people are becoming increasingly resistant to sharing data.

Future employers could draw on new forms of data, such as genetic markers, when predicting performance. However, predicting performance accurately is not always easy, and companies rely on external recruiters who often fail to accurately predict which candidates will perform well. In the future, predicting careers based on genetic data may become the norm, as employers may expect workers to modify their genomes. AI-enabled predictions are transforming modern warfare, with militaries exploring how to rapidly analyze nonstop streams of data and make rapid predictions needed for decision-making during combat. Precision-guided bombs and missiles enable militaries to wreak devastation by improving accuracy, using controls similar to those in video games.

The development of autonomous weapons systems raises ethical concerns as machine decision-making could replace human decision-making in selecting human targets. The US Department of Defense is reluctant to give robots full autonomy during warfare, and autonomous weapons could end up on the black market, posing a threat to humanity. Accurate predictive polling can lead to voter manipulation, threatening democracy. As humankind's power to make accurate predictions increases, complex new risks emerge, including economic disruption, job loss, privacy threats, mass surveillance, anti-science sentiments, epistemic confusion, and authoritarianism. As machines become more opaque, it becomes harder for humans to detangle the logic behind predictions. Humans must reflect on what life might be like if machines obtain a monopoly on predictable areas, such as politics and war. The potential benefits of predictive AI algorithms outweigh the risks, as AI could help humans achieve various goals, such as colonizing Mars or extending life expectancy.

Previous Book Summaries: BookSummary35.docx 

Summarizing Software: SummarizerCodeSnippets.docx
#codingexercise: CodingExercise-01-07-2024.docx

Saturday, January 6, 2024

 

This is a summary of the book “Virtual You” by Peter Coveney and Roger Highfield, published by Princeton University Press in 2023. Digital twins, computer simulations of the human body, are being used by doctors to provide personalized medical care. These simulations would mimic the reactions of the patient to drugs or treatments and alert doctors to potential health issues. The concept of digital twins has been applied to the design of machinery and manufacturing processes, and now scientists aim to create detailed simulations of individual patients. This "Virtual You" would model the functions of the body, including molecular interactions and the workings of the heart, lungs, bones, and brain. This would allow doctors to offer "healthcasts" to predict and prevent specific illnesses, based on variables such as diet and lifestyle. The creation of Virtual You requires amassing health data from sources like lab tests, DNA sequencing, and molecular biology discoveries. However, computer power restricts the level of practical detail, and the necessary granularity depends on the specific use case.

Life sciences, including medical science, have traditionally relied on mathematical expressions to make predictions, but this approach has been less dependent on theory. In the 1950s, British scientists Alan Hodgkin and Andrew Huxley advanced biological theory by describing the function of the axon in longfin inshore squid. Medical science relies on post-hoc explanations of observed phenomena, such as diagnosing and explaining symptoms of a disease. To add a predictive dimension to medical science, data must be transformed into equations that reveal natural laws. This would enable doctors to predict the effectiveness of treatments and diets for individual patients. Creating Virtual You would require new levels of computer power, with the Manhattan Project and the ENIAC (Electronic Numerical Integrator and Computer) leading to advances in simulating molecular phenomena. Advances in modeling efforts have accelerated with the increasing power of supercomputers and artificial intelligence. Simulated cell research could lead to new understandings of molecular biology and new medications.

Cell simulations have become increasingly important in medicine, as they provide insights into the human body's functions and potential treatments. Researchers at Keio University in Japan created a model of the bacterium Mycoplasma genitalium in the late 1990s, which was the first bacterium model to address all known gene functions. The University of Connecticut and Mount Sinai School of Medicine used these models to study the kidney, pancreas, and brain. Simulations could be useful in designing personalized medical treatments, drug development, and trials. Researchers are also working to simulate and study human organs, using multiscale and multiphysics approaches. The Barcelona Supercomputing Center (BSC) has been working since the 1990s to model the entire human heart, which can provide insights into heart failure and raise red flags on potential dangers of various heart medications. The BSC collaborates with Medtronic to design simulations for medical device development, such as pacemakers. Numerous projects are underway worldwide to develop schemes for simulating the entire body.

Virtual You simulations could help doctors control inflammation and suppress infections, as well as provide insights into disorders affecting elderly and overweight populations. However, digital twins are generalized and require new types of computers. Researchers may use updated versions of analog computers, such as metamaterials, which can manipulate light waves. Neuromorphic computing, inspired by the human brain, is another approach, with applications like Europe's Human Brain Project. Quantum computers, which use qubits to represent ones and zeroes simultaneously, could be the best future option for work in chemistry and biochemistry.

Previous Book Summaries: BookSummary34.docx

Summarizing Software: SummarizerCodeSnippets.docx

Friday, January 5, 2024

 

This is a summary of the book “Becoming Data Literate” by David Reed, published by Harriman House in 2021. Reed is a data and analytics expert who emphasizes the importance of data in helping organizations realize their vision. He suggests that organizations should embrace a data literacy framework and embark on data transformation to become data natives. To do this, organizations should start with a vision and move through five stages. This includes inspiring stakeholders, fostering a culture in which teams share data and a common language, learning novel approaches for managing data practitioners, and establishing a clear data ethics code. The DataIQ Way provides a framework to guide leaders in creating data-driven organizations, incorporating five dimensions: vision, business strategy, value creation, culture, and data foundations. The framework aims to bridge gaps between company culture and data culture, aligning them with shared objectives, rewards, and metrics. By embracing a data literacy framework, organizations can become data literate within three years.

This involves understanding the value of data, providing employees with the necessary data, trusting the data, and fostering a culture of data across the organization. Companies can assess their current level of data literacy by evaluating both how well data consumers understand data and how well data departments and producers align with business goals. As an organization matures, it moves through five stages: "data user," "data driven," "data literate," "data cultured," and "data native." To ensure adoption, organizations should demonstrate the value of data transformation and secure buy-in from stakeholders. The analytics team should engage with stakeholders by collaborating, communicating, championing, and challenging the status quo when data reveals a need for change. By showing the benefits of data, organizations can demonstrate the potential of data transformation and make their strategies more achievable.

Leaders play a crucial role in promoting behavior change in organizations by connecting actions to the organizational vision. They can do this by providing context, connecting actions to desired outcomes, and offering symbolic messages. Building a data culture involves sharing data and a common language for discussing it, fostering a data culture that embraces data democratization and clear data governance. Data leaders should develop a common language for communicating about data and consider introducing data literacy programs. They should also align data strategy with brand values at every consumer touchpoint. Data leaders must learn new approaches for managing data practitioners, prioritizing factors such as social cohesion, perceived supervisory support, information sharing, goal and vision clarity, and trust. They must demonstrate continuous improvement in the performance of the data department through skill development, productivity, and engagement. Data is an intangible asset with no standardized valuation metrics, making it difficult for data leaders to establish metrics to demonstrate its value.

Data practitioners often use metrics to identify their benefits to organizations, but leaders must establish and implement a clear data ethics code. Consumers are becoming cautious about sharing data with corporations, and leaders should embrace transparency around data governance. Research shows that over 20% of consumers share their data without properly reading privacy notices. Data leaders should focus on four principles: autonomy, beneficence, nonmaleficence, and fairness. They should ensure that data-driven decisions are demonstrably fair and that they do not infringe on citizens' rights. Data practitioners lack a system of sanctions for operating outside a professional code of ethics, so leaders must create mechanisms to respond to ethical breaches.