Thursday, December 8, 2022

Problem Statement: An application intends to use S3 for storing and retrieving documents uploaded by users who are not yet onboarded to the application. An external identity provider can confirm the validity of a user, but the serverless function must authenticate and authorize their requests prior to upload and download. 

Solution:  

The solution revolves around the creation of a user pool that integrates with a third-party identity provider. This allows a high degree of flexibility in choosing appropriate access management for an API Gateway deployment that can onboard existing users, support robust operations (troubleshooting), and improve agility in developing the serverless capability. There are two options for this authentication module pilot: 

  1. An Amazon Cognito user pools authorizer for Lambda integrations running behind API Gateway, as an IdP-agnostic option, using tools the team is already familiar with. Here, the benefit is consolidating all serverless access via API Gateway, allowing the team to focus on building serverless capabilities via Lambda functions with little operational overhead. 

  2. An AWS custom Lambda authorizer, set up against the user pool, for authorizing access to serverless functions via API Gateway. Here, the benefit is greater control over the issuance of identity and access tokens, at the cost of increased maintenance considerations for the team (a minimal sketch of this option follows this list). 
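
To make the second option concrete, here is a minimal sketch of a custom Lambda token authorizer. It assumes the tokens are Cognito-issued ID tokens (JWTs) and uses PyJWT for verification; the region, user pool ID, and app client ID are placeholders.

```python
# Minimal sketch of a custom Lambda (token) authorizer for API Gateway.
# Assumes the token is an ID token (JWT) issued by a Cognito user pool;
# the region, pool ID, and app client ID are placeholders.
import jwt  # PyJWT
from jwt import PyJWKClient

REGION = "us-east-1"                     # placeholder
USER_POOL_ID = "us-east-1_EXAMPLE"       # placeholder
APP_CLIENT_ID = "example-app-client-id"  # placeholder
JWKS_URL = (f"https://cognito-idp.{REGION}.amazonaws.com/"
            f"{USER_POOL_ID}/.well-known/jwks.json")

jwks_client = PyJWKClient(JWKS_URL)

def handler(event, context):
    token = event["authorizationToken"].removeprefix("Bearer ").strip()
    try:
        signing_key = jwks_client.get_signing_key_from_jwt(token)
        claims = jwt.decode(token, signing_key.key,
                            algorithms=["RS256"], audience=APP_CLIENT_ID)
        effect, principal = "Allow", claims["sub"]
    except jwt.PyJWTError:
        effect, principal = "Deny", "anonymous"
    # IAM policy document that API Gateway evaluates for this request.
    return {
        "principalId": principal,
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```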

Given these choices, the Cognito user pool authorizer is preferred for the following reasons.  

  • S3 access using AWS technologies such as Lambda and API Gateway integrates well with Cognito, which supports external identity providers and works with both identity and access tokens. The less overhead and maintenance the development team carries, the more it can focus on the serverless and S3 access patterns. 

  • Cognito user pool authorizers provide smoother onboarding of existing and new users into user pools. 

  • The proposed user pool authorizer integrates with third-party OIDC and OAuth2 providers seamlessly and with little overhead. 

  • Overall, the Cognito user pool authorizer is an out-of-the-box technology, with extensive documentation, examples, and community support. 

However, both authorizers are strong contenders; they offer many of the same benefits and are superior to any ad hoc implementation of an authorization module by virtue of being AWS core technologies. Specifically, both offer: 

  • Ways to validate identity and access tokens 

  • A means of enabling access to S3 (see the sketch after this list) 

  • Extensive documentation and community support 

  • Strong integration with applications using REST APIs 
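
Whichever authorizer admits the request, the Lambda function behind API Gateway still has to grant the actual S3 access. The sketch below shows one possible shape, assuming presigned URLs are an acceptable mechanism; the bucket name and query parameters are hypothetical.

```python
# Sketch of the Lambda behind API Gateway that grants S3 upload and
# download access once the authorizer has admitted the request.
# The bucket name and query parameters are placeholders.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-user-documents"  # placeholder

def handler(event, context):
    params = event.get("queryStringParameters") or {}
    key = params.get("key", "unnamed-document")
    operation = "put_object" if params.get("action") == "upload" else "get_object"
    # Presigned URL: time-limited S3 access without exposing credentials.
    url = s3.generate_presigned_url(
        operation,
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=300,  # five-minute window
    )
    return {"statusCode": 200, "body": json.dumps({"url": url})}
```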

 

Wednesday, December 7, 2022

 CloudFormation versus Terraform – a choice 

 

Infrastructure-as-code is a declarative paradigm: a language for describing infrastructure and the state it must achieve. The service that understands this language supports tags, RBAC, declarative syntax, locks, policies, and logs for resources and their create, update, and delete operations, all of which can be exposed via a command-line interface, scripts, web requests, and a user interface. The declarative style also helps boost agility, productivity, and quality of work within organizations. 

Such a service for the AWS public cloud is called AWS CloudFormation. Terraform is the open-source equivalent, which helps users set up and provision data center infrastructure independent of any one cloud. These configuration files can be shared among team members, treated as code, edited, reviewed, and versioned. 

AWS CloudFormation has a certain appeal for being AWS native, with a common language to model and provision AWS and third-party resources. It abstracts the nuances of managing AWS resources and their dependencies, making it easier to create and delete resources in a predictable manner. It makes versioning and iterating on the infrastructure more accessible, and it supports iterative testing as well as rollback. 

Terraform’s appeal is that it can be used for multi-cloud deployment. For example, it can deploy serverless functions with AWS Lambda, manage Microsoft Azure Active Directory resources, and provision a load balancer in Google Cloud. 

Both facilitate state management. With CloudFormation, users can perform drift detection on all of their assets and get notified when something changes; it also determines dependencies and performs certain validations before a delete command is honored. Terraform stores the state of the infrastructure on the provisioning computer, or at a remote site, in its own JSON format, which serves to describe and configure the resources. CloudFormation handles state management automatically with no user involvement, whereas Terraform requires you to specify a remote store or fall back to local disk to save state. 
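
Drift detection, for example, can be driven programmatically. A minimal sketch, assuming boto3 and an existing stack (the stack name is a placeholder):

```python
# Sketch: start CloudFormation drift detection and poll for the result.
import time
import boto3

cf = boto3.client("cloudformation")

detection_id = cf.detect_stack_drift(StackName="example-stack")["StackDriftDetectionId"]

# Drift detection runs asynchronously; poll until it completes.
while True:
    status = cf.describe_stack_drift_detection_status(
        StackDriftDetectionId=detection_id
    )
    if status["DetectionStatus"] != "DETECTION_IN_PROGRESS":
        break
    time.sleep(5)

print(status["StackDriftStatus"])  # e.g. IN_SYNC or DRIFTED
```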

Both have their own ways of addressing flexibility for changing requirements. Terraform has modules, which are containers for multiple resources that are used together, while CloudFormation uses a system called “nested stacks,” in which templates can be called from within templates. A benefit of Terraform here is increased flexibility over CloudFormation regarding modularity. 

They also differ in how they handle configuration and parameters. Terraform uses provider-specific data sources, and the implementation is modular, allowing data to be fetched and reused. CloudFormation accepts a bounded number of parameters per template (a limit historically set at 60 and since raised), and each parameter must be of a type that CloudFormation understands. Parameters must be declared in the template, or retrieved from the Systems Manager Parameter Store, before being used within the template. 
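
A minimal sketch of parameter passing with boto3; the template body, stack name, and values are illustrative placeholders:

```python
# Sketch: declaring a typed parameter in a template and supplying its
# value when the stack is created. Names and values are placeholders.
import boto3

cf = boto3.client("cloudformation")

template = """
AWSTemplateFormatVersion: '2010-09-09'
Parameters:
  BucketNamePrefix:
    Type: String
Resources:
  DocumentBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub '${BucketNamePrefix}-documents'
"""

cf.create_stack(
    StackName="example-parameterized-stack",
    TemplateBody=template,
    Parameters=[
        {"ParameterKey": "BucketNamePrefix", "ParameterValue": "example"},
    ],
)
```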

Both are powerful cloud infrastructure management tools, but Terraform is the more favorable choice here for its cloud-agnostic support. It also ties in well with DevOps automation such as GitLab pipelines. Finally, having an abstraction over cloud lock-in might also benefit the organization in the long run. 

 

 

Tuesday, December 6, 2022

 

The Architecture-Driven Modernization process comprises two main steps: Knowledge Discovery Metamodel (KDM) model extraction and metrics report generation. The process can be walked through in this manner: the source code is converted into an Abstract Syntax Tree model using a code-to-model transformation; the Abstract Syntax Tree model is converted into a KDM model using a model-to-model transformation; the KDM model is converted into a metrics model using another model-to-model transformation; and, finally, a metrics report is generated from the metrics model using a model-to-text transformation.
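
A toy version of this chain, using Python's own ast module as the parser and plain dictionaries in place of real KDM and metrics models (every element name here is a stand-in, not the actual specification):

```python
# Toy sketch of the ADM chain: code -> AST model -> KDM-like model
# -> metrics model -> text report.
import ast

source = """
def total(xs):
    s = 0
    for x in xs:
        s += x
    return s
"""

# Code-to-model: parse the source into an abstract syntax tree.
tree = ast.parse(source)

# Model-to-model: derive a KDM-like model of program elements.
kdm = {
    "functions": [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)],
    "loops": sum(isinstance(n, (ast.For, ast.While)) for n in ast.walk(tree)),
}

# Model-to-model: derive a metrics model from the KDM-like model.
metrics = {"function_count": len(kdm["functions"]), "loop_count": kdm["loops"]}

# Model-to-text: render the metrics report.
print("\n".join(f"{name}: {value}" for name, value in metrics.items()))
```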


If we take the example of a set of SQL statements converted to a PL/SQL Abstract Syntax Tree Metamodel, it will consist of definitions such as RDBTableDefinition and RDBColumnDefinition; primitive types such as RDBTableType, RDBColumnType, and RDBDatabaseType; statements such as RDBSelectStatement, RDBModifyStatement, and RDBInsertStatement; and binary expressions such as RDBSelectExpression and RDBHostVariableExpression.

When models are extracted from GPL code, the main task is collecting the scattered information needed to create the model elements from source code. The scattering occurs because of references between elements: while such references are explicit in models, in source code they are established implicitly through identifiers, such as the reference between a variable and its declaration. Transforming an identifier-based reference into an explicit reference involves looking up the identified element in the source code. Dedicated parsers arise from this challenge, because the scattering problem requires complex processing to locate the correspondences between source code and model elements. A powerful XPath-like language built specifically for resolving references can help here.
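
A small sketch of such reference resolution, using a Python syntax tree as the stand-in for a GPL model (the analysis is deliberately simplified to top-level assignments):

```python
# Sketch: turning identifier-based references into explicit ones by
# linking each variable use back to the assignment that declared it.
import ast

source = """
rate = 0.07
price = 100
total = price + price * rate
"""

tree = ast.parse(source)
declarations = {}  # identifier -> declaring AST node
references = []    # (identifier, use node, declaring node)

for node in ast.walk(tree):
    if isinstance(node, ast.Assign):
        for target in node.targets:
            if isinstance(target, ast.Name):
                declarations[target.id] = node
    elif isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load):
        if node.id in declarations:
            references.append((node.id, node, declarations[node.id]))

for name, use, decl in references:
    print(f"{name}: use at line {use.lineno} -> declaration at line {decl.lineno}")
```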



With the popularity of machine learning techniques and softmax classification, extracting domain classes according to the syntax tree metamodel and semantic graph information has become more meaningful. The two-step process of parsing to yield the Abstract Syntax Tree Metamodel and restructuring to express the Knowledge Discovery Metamodel is enhanced with collocation and dependency information. This results in classifications at code organization units that were previously omitted. For example, code organization and call graphs can be used for such learning, as shown in reference 1. The discovery of KDM and SMM can also be broken down into independent learning mechanisms, with Dependency Complexity being one of them. 

Monday, December 5, 2022

Model-driven Software development

 


Model-driven software development facilitates the creation of new software systems and also evolves existing ones.

The salient features of model-driven software development include:

1. Domain-specific languages (DSLs) that express models at different abstraction levels.

2. DSL notations (concrete syntaxes) that are defined separately from the abstract syntax.

3. Model transformations for generating code from models, either directly by model-to-text transformations or indirectly through intermediate model-to-model transformations.

An abstract syntax is defined by a metamodel, which uses a metamodeling language to describe a set of concepts and their relationships. These languages use object-oriented constructs to build metamodels. The relationship between a model and its metamodel can be described as a “conforms-to” relationship.
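
A minimal sketch of the conforms-to relationship, with hypothetical Entity/Attribute concepts and Python classes standing in for the object-oriented metamodeling constructs:

```python
# Sketch: a metamodel as object-oriented constructs, a model as
# instances, and a "conforms-to" check. Names are hypothetical.
from dataclasses import dataclass, field

# --- Metamodel: concepts and their relationships ---
@dataclass
class Attribute:
    name: str
    type_name: str

@dataclass
class Entity:
    name: str
    attributes: list[Attribute] = field(default_factory=list)

# --- Model: built from (and conforming to) the metamodel above ---
customer = Entity(
    name="Customer",
    attributes=[Attribute("id", "int"), Attribute("email", "str")],
)

def conforms_to(element: object, concept: type) -> bool:
    # A model element conforms to the metamodel concept that typed it.
    return isinstance(element, concept)

print(conforms_to(customer, Entity))  # True
```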

There are seven metamodels: the Knowledge Discovery Metamodel (KDM), the Abstract Syntax Tree Metamodel (ASTM), the Software Measurement Metamodel (SMM), and metamodels for program analysis, visualization, refactoring, and transformation.

ASTM and KDM are complementary in modeling software systems’ syntax and semantics. ASTMs use abstract syntax trees to represent mainly the source code’s syntax, while KDM represents semantic information about a software system, ranging from source code to higher levels of abstraction. KDM is the language of architecture and provides a common interchange format intended for representing software assets and enabling tool interoperability. Platform, user interface, and data can each have their own KDM representations, which are organized as packages. These packages are grouped into four abstraction layers to improve modularity and separation of concerns: infrastructure, program elements, runtime resources, and abstractions.

SMM is the metamodel that can represent both metrics and measurements. It includes a set of elements to describe the metrics in KDM models and their measurements.

Taking the example of the modernization of a database forms application and migrating it to a Java platform, an important part of the migration could involve PL/SQL triggers in legacy Forms code. In a Forms application, the sets of SQL statements corresponding to triggers are tightly coupled to the User Interface. The cost of the migration project is proportional to the number and complexity of these couplings. The reverse engineering process involves extracting KDM models from the SQL code.

An extractor that generates the KDM model from SQL code can be automated. A framework that provides domain-specific languages for model extraction is available, and it can be used to create a model that conforms to a target KDM from a program that conforms to a grammar. Dedicated parsers can help with this code-to-model transformation.
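
A toy extractor is sketched below; the regular expression stands in for a dedicated parser, and the element names are simplified stand-ins rather than the real KDM specification:

```python
# Toy sketch of a code-to-model extractor: SQL statements in,
# simplified KDM-like elements out.
import re

sql = """
SELECT name, email FROM customers;
INSERT INTO orders (id, total) VALUES (1, 99.50);
UPDATE customers SET email = 'x@y.com' WHERE id = 7;
"""

STATEMENT_KINDS = {
    "SELECT": "RDBSelectStatement",
    "INSERT": "RDBInsertStatement",
    "UPDATE": "RDBModifyStatement",
}

model = []
for statement in filter(None, (s.strip() for s in sql.split(";"))):
    keyword = statement.split()[0].upper()
    tables = re.findall(r"(?:FROM|INTO|UPDATE)\s+(\w+)", statement, re.IGNORECASE)
    model.append({"element": STATEMENT_KINDS.get(keyword, "RDBStatement"),
                  "tables": tables})

for element in model:
    print(element)
```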

With the popularity of machine learning techniques and softmax classification, extracting domain classes according to the syntax tree metamodel and semantic graph information has become more meaningful. The two-step process of parsing to yield the Abstract Syntax Tree Metamodel and restructuring to express the Knowledge Discovery Metamodel is enhanced with collocation and dependency information. This results in classifications at code organization units that were previously omitted. For example, code organization and call graphs can be used for such learning. The discovery of KDM and SMM can also be broken down into independent learning mechanisms, with Dependency Complexity being one of them.

Sunday, December 4, 2022

 

A summary of the book “How The Secret Changed My Life: Real People, Real Stories” by Rhonda Byrne

This book is a collection of stories from people of all walks of life who bring forward their testimonies about the book “The Secret” by Rhonda Byrne.

The short stories that follow are categorized by the daily teachings.

“Ask, believe, receive”

                A lady from Kerala talks about undergoing a divorce from her husband of just under a year and facing possible deportation at the same time. Once she stopped worrying and, whenever asked about her status, would say that things were following the process, she started feeling better. She kept visualizing her green card with an expiry several years later. She received a green card that had an even later expiration date.

                A lady had lost her pet. She put the pet’s bed in her bedroom, constantly believed he was around, and even put out the dish for his food daily. She put up posters around the neighborhood. Finally, she received a call from a person returning from out of state whose niece had found a lost dog, asked around, and finally brought it back to her state.

See it, feel it, receive it

                A lady was having a tough time and wanted to get a scholarship. She kept feeling good about getting it and about how it would end all her concerns. She finally got it, and it worked out well for her.

                A person had not had a great romance. She started writing a romance book, putting in pictures and visualizing the ring she wanted. She forgot about the book as she got busy with a new job but kept feeling exactly how she wanted to feel. Finally, she met a man who took her to the same place and even happened to give her the engagement ring she had pictured.

Be happy now, feel good now

                We are entirely free to choose how we feel now and that power can change the outcomes we face.

                A couple was trying for a second baby and to move into their first home, but their attempts had not been successful. Instead, they started feeling happy in the present, and soon results started going their way.

Abundance

                Many people feel the pinch, and money is a very real problem, but some have found a way to wish abundance into their lives and realize it.

                A soldier deployed to Iraq had been praying for a revelation amid all the bad things she was seeing around her. She downloaded this book and read it in two days. Initially she attracted small things; then she started attracting big things, such as a high-paying civilian job, three raises, and a wonderful husband. She started believing in abundance and set a goal for an amount that would give her the financial freedom she wanted. She put it on her corkboard along with all her wish lists and kept revisiting it year after year. Each year she would manage to work through all the items on her board except for the large amount. In the process, she learned a lot about her financial responsibilities. Then one year she found that she had attained it, even though she had stopped keeping track of her progress and had simply started to believe that she was already getting there.

                A lady in Manhattan had started following the book by getting up early, going to her balcony, and expressing her gratitude for the day. One day she found a penny on her balcony. Several months later she found seven dollar bills. One fine day she found two twenty-dollar bills. Each time she asked the people around her whether they were missing money, she found no one who was.

                A family living in a dilapidated house had struggled for fourteen years to move but could neither afford to nor cope with their place. They watched The Secret together and started believing. They remembered a house for sale that they had liked and wanted to live in but couldn’t afford, and they collectively visualized it. Soon their own house appreciated while the one they wanted remained on the market and even had a price drop. Finally, they moved into their dream house.

                A lady wanted to go on a world tour of all the landmarks she had collected in her yearbook. She heard about a dream job selling high-end jewelry on a cruise ship. She ended up visiting all the places she had put in her yearbook. The highlight of her belief was that she wished for a sales commission matching a certain number down to the last digit, and on one of her cruises she got a commission of that very amount.

Think good thoughts, take good actions, get good results

Change relationships

There are more uplifting examples of people believing, getting ready, and finally receiving what they want.

A single mother desperate to find a loving partner accidentally wandered into a wedding shop on an unfamiliar corner and ended up buying a wedding dress, feeling silly afterwards. But she kept visualizing, and one day she ran into a person who was looking for the same address she was. In the time that followed, they became loving husband and wife.

Love

A lady from British Columbia had been suffering from ulcerative colitis, bleeding every day for several years. One day she found this book and started making positive affirmations and showing love. Each day her gratitude and love were given unconditionally, and the universe returned them to her. Her disease is now in remission and she feels better.

A lady whose dog had a terminal disease wished for him to be happy and healthy again. She would use every spare moment for this purpose. And it worked. There are numerous real and heartfelt tales of people with different disease conditions making a positive difference to their lifestyle by changing every negative thought and attitude they have.

 

Saturday, December 3, 2022

Reverse engineering using models for software engineering

 

Software models are not just great for learning about software; they are essential to software modernization, which usually consists of three steps: reverse engineering, restructuring, and forward engineering. When extracted from legacy software, models are easier to transform to a different architecture, such as microservices. This activity can be performed by tools that bridge the grammarware and model-driven technical spaces. Such dedicated parsers could also come with a query language that eases the retrieval of scattered information in syntax trees, and could incorporate extensibility and grammar-reuse mechanisms.

Model-driven software development raises the level of abstraction and automation in the construction of software. Although commonly used for building new software systems, models also have the potential to evolve existing ones. They help reduce software evolution costs and improve the quality of the artifacts. Migration and modernization can both benefit from model-driven approaches. A set of standard metamodels, which represent the information normally managed in modernization tasks, has been popularized by solutions and tools. This is true for language-to-language migrations as well, where the first step is to extract models from application code written in the source language. Similarly, in a modernization effort, models can be extracted from schemas to improve data quality. Once these initial models are obtained, transformations can be applied to generate higher-level abstraction models, followed by the generation of artifacts such as code in another language or an improved data schema. Other operations on models, such as comparison and synchronization, can also be applied.

The relationship between the paired concepts grammar/program and metamodel/model is an example of a bridge between two different technical spaces, namely grammarware and modelware. Dedicated parsers are implemented to obtain models from code conforming to a grammar. These parsers perform model generation tasks in addition to code parsing: first, a syntax tree is created from the source code by static analysis, and then this syntax tree is traversed to obtain the information needed to create the model elements. This is a complex task that requires both collecting scattered information and resolving references in the syntax tree.

Model transformations are classified into three categories: text-to-model transformations, which obtain models from existing source code; model-to-model transformations, whose input and output are models; and model-to-text transformations, which generate software artifacts from a source model. When a model transformation language is designed, two key design choices are how to express the mappings between source and target elements and how to navigate the source artifact. The former involves binding; the latter involves querying.
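
As a small illustration of the third category, here is a sketch of a model-to-text transformation; the model shape and the generated class are hypothetical:

```python
# Sketch of a model-to-text transformation: a small model (a dict
# standing in for a real metamodel instance) rendered as source text.
model = {
    "entity": "Customer",
    "fields": [("id", "int"), ("email", "String")],
}

def to_text(entity_model: dict) -> str:
    lines = [f"public class {entity_model['entity']} {{"]
    for name, type_name in entity_model["fields"]:
        lines.append(f"    private {type_name} {name};")
    lines.append("}")
    return "\n".join(lines)

print(to_text(model))
```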

Certain improvements can be imagined: 1) reuse mechanisms at the rule level, 2) a new kind of rule for dealing with expressions efficiently, and 3) an extensibility mechanism for adding new operators. New functionality could also target expressiveness, usability, and performance.

Friday, December 2, 2022

 Application Modernization Field Guide continued.

Application modernization is also about selectivity. The business needs that drive modernization take higher priority than others. Tools native to the cloud can help with gaining insights and with the performance analysis that brings together the technical priorities. The modernization journey begins with matching effort to business priority. The first few projects can be selected to be short in duration and high in potential business value. Then all possible modernization options can be evaluated and ranked based on their complexity, cost, and business value. Finally, the options can be prioritized and assessed for feasibility.

As an example, if a Node.js application deployed on a Linux server must be modernized, one way to modernize it would be to decouple the application into microservices and package the application into a Docker container, which is then run on cloud-native infrastructure. Certain cloud services are designed to make it easy to run, stop, and manage Docker containers on a cluster. It is even possible to host the cluster on serverless infrastructure managed by the cloud provider. These services also create a consistent deployment and build experience, manage and scale batch and Extract-Transform-Load workloads, and support sophisticated application architectures on a microservice model. A sketch of launching such a container follows.
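
As a hedged illustration, launching the containerized application on serverless container infrastructure could look like the sketch below, which assumes an ECS cluster and task definition already exist; every name and subnet ID is a placeholder.

```python
# Sketch: run a container for the modernized Node.js service on
# serverless container infrastructure (ECS on Fargate) with boto3.
# Cluster, task definition, and subnet IDs are placeholders.
import boto3

ecs = boto3.client("ecs")

response = ecs.run_task(
    cluster="example-cluster",
    taskDefinition="example-node-service:1",
    launchType="FARGATE",
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0example"],
            "assignPublicIp": "ENABLED",
        }
    },
)
print(response["tasks"][0]["taskArn"])
```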

Writing an application into a container image is a good first step toward modernization, but many applications aren’t optimized for containers. Load balancing, application-state handling, and monitoring can be different when hosted in the cloud, which requires rewriting portions of the application. Similarly, performance tuning and DevOps processes must be aligned to containers. Prebuilt containerized middleware and services help with this journey. Finally, managing and monitoring operations can help maximize speed and agility.

Some applications are best exposed as APIs that are easily reused for building new capabilities that augment the existing software. Those APIs can then be reused to integrate future applications into the ecosystem. APIs can adhere to the Representational State Transfer (REST) architectural style and expose existing capabilities for access from any endpoint. These APIs can be placed under management control to improve security, performance, and visibility. New applications can build on freshly exposed APIs without requiring changes to the existing applications. Businesses can also use this opportunity to harness their data to create better outcomes and customer experiences.

When an application is pushed to production, the DevOps processes and configuration must be managed and modernized as well. Applications are monitored to ensure availability and performance according to service-level agreements. Service management must also transform to support this paradigm shift. Monitoring metrics and logs helps determine application health, and dashboards are easy to generate with cloud computing. Using such tools and automation, and tracking issues, helps promote visibility of incidents to everyone.

Lastly, there is no single formula that holds, but rather an approach that is unique to the business goals and needs. The modernization goals of agile delivery, transform-and-innovate, cost reduction, replacement with SaaS, and cloud migration can be planned for by analyzing for insights and applying one or more modernization patterns, which include pivoting away from monolithic strategies, adding new capabilities as microservices, refactoring a monolith into microservices, exposing APIs, and migrating a monolith to a cloud runtime.