Saturday, January 21, 2023

Some Lambda errors and resolutions:

1. Lambda times out and provides no response:

A docker-compose setup like the following may help:

version: '3.9'
services:
  api:
    image: public.ecr.aws/sam/build-nodejs14.x:1.26.0
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./dist:/var/task:ro
    ports:
      - 3000:3000
    command: sam local start-api --template stack.yaml --host 0.0.0.0 --docker-network application --container-host host.docker.internal --warm-containers EAGER

networks:
  default:
    name: application

Essentially, the idea is to increase the timeout so that the large dependencies have time to load, or to use a warm start (note the --warm-containers EAGER option above).
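As a sketch, the timeout can also be raised in the SAM template itself; the values here are illustrative:

Globals:
  Function:
    Timeout: 60
    MemorySize: 1024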

2. Runtime.HandlerNotFound: index.handler is undefined or not exported

One of the following remediation steps could help, assuming that a function handler exists as the entry point for the Lambda (a minimal sketch of the export styles follows this list):

1. module.exports.handler = handler (CommonJS)

2. export { handler }; (ES modules)

3. Ensure that the file with the handler is at the root level.

4. Ensure that the handler reference in the template is correctly path-qualified.
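For reference, a minimal sketch of the two export styles the runtime accepts; the handler body is illustrative:

// CommonJS (index.js): the runtime resolves this as "index.handler"
exports.handler = async (event) => {
  return { statusCode: 200, body: 'ok' };
};

// ES module alternative (index.mjs, or index.js with "type": "module"):
// export const handler = async (event) => ({ statusCode: 200, body: 'ok' });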

3. The handler rejects import statements in the Node.js code:

i. Use require, the CommonJS syntax that the JavaScript console prefers, or

ii. Use ES modules, as preferred by the newer Node.js runtimes.

4. The size of the code exceeds 50MB:

i. If the archive exceeds 50MB, upload it to S3 (see the command sketch after this list).

ii. Separate the dependencies into layers.

iii. Use a container image.
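As a sketch, an archive staged in S3 can be attached to the function with the AWS CLI; the function, bucket, and key names are placeholders:

aws lambda update-function-code --function-name my-function --s3-bucket my-bucket --s3-key function.zip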

Friday, January 20, 2023

 

A previous article described the test cases for validating a Lambda function handler. This article covers some of the issues encountered and their resolutions.

First, the version of the Lambda runtime might differ from the development environment. This cascades into version incompatibilities with the package dependencies of the invoked code. The package-lock.json used with a Node.js based Lambda function handler articulates the versions expected for each dependency. Removing the dependencies folder (the ‘node_modules’ folder) and refreshing it with the commands “npm install” and “npm audit fix” will adjust the versions to suit the runtime. Usually, a higher runtime version has backward compatibility with lower versions, so if the Lambda code works with a lower version of the runtime, it should work with the latest.

Simple Lambda code such as the following:

const AWS = require('aws-sdk')
const s3 = new AWS.S3()

exports.handler = async function(event) {
  return s3.listBuckets().promise()
}

will work on older versions of the runtime.

If we use the JavaScript v3 SDK, we might have syntax as follows:

// Import required AWS SDK clients and commands for Node.js.
import { ListBucketsCommand } from "@aws-sdk/client-s3";
import { s3Client } from "./libs/s3Client.js";

export const run = async () => {
  try {
    const data = await s3Client.send(new ListBucketsCommand({}));
    console.log("Success", data.Buckets);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};

And some of the errors encountered might read like “cannot use import statement outside a module” in the AWS Lambda console. This can be quite a pesky issue, even driving the code to change to the require syntax. If the Lambda console allowed imports directly, it would alleviate much of the hassle, but there is an easy resolution: the package.json can include the attribute “type”: “module” to denote that this is an ECMAScript module. There are some differences between ECMAScript 5 and ECMAScript 6, and specifying this attribute informs the Lambda runtime to use ES modules rather than the traditional CommonJS syntax.
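A minimal package.json sketch with this attribute; the name and version fields are illustrative:

{
  "name": "my-lambda",
  "version": "1.0.0",
  "type": "module"
}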

It is also better to use configuration layers for Node.js modules. Lambda layers are a convenient way to package dependencies so that the size of the uploaded deployment archives is reduced. A layer can contain libraries, a custom runtime, data, or configuration files. Layers promote reusability and separation of responsibilities. Layer contents are archived into a zip file and uploaded to S3. They are extracted under the /opt directory at execution. If the expected folder structure is used in the layer zip archive, the function code can access the content without needing to specify the path, as sketched below.
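A sketch of that layout, assuming lodash is the dependency being packaged into a layer:

// Layer zip contents (Node.js layer convention):
//   nodejs/node_modules/lodash/...
// Lambda extracts the layer under /opt, and by convention
// /opt/nodejs/node_modules is on the module resolution path,
// so the function code can require the package directly:
const _ = require('lodash');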

 

Thursday, January 19, 2023

 

Testing Node.js applications and serverless functions:

A previous post described running the application locally as a way of testing the code without requiring deployment anywhere beyond the development server. This write-up follows up on it with a test case.

One way to write unit tests and integration tests is to use Jest. It’s a testing framework with a name that matches its approach: delightfully simple. It works with all forms of JavaScript and TypeScript, such as Node, React, Angular, and others.

A sample Jest unit script might look like this:

__tests__\unit\handlers\simple.test.js:

// Mock uuid (the variable name must start with "mock" so that the
// hoisted jest.mock factory is allowed to reference it)
const mockUuidValue = 'f8216640-91a2-11eb-8ab9-57aa454facef'
jest.mock('uuid', () => ({ v1: () => mockUuidValue }));

// This includes all tests for documents handler
describe('Test simple handler', () => {
    let sendSpy;

    // One-time setup and teardown, see more in https://jestjs.io/docs/en/setup-teardown
    beforeAll(() => {
        // Mock s3 methods
        // https://jestjs.io/docs/en/jest-object.html#jestspyonobject-methodname
        sendSpy = jest.spyOn(Array.prototype, 'push');
    });

    // Clean up mocks
    afterAll(() => {
        sendSpy.mockRestore();
    });

    it('should simply return', async () => {
        const items = {
            "Items": [],
            "Count": 0,
            "ScannedCount": 0
        };

        const expectedResult = {
            statusCode: 200,
            body: JSON.stringify(items),
            headers: {
                "Content-Type": "application/json"
            }
        };

        // Return the specified value whenever the spied function is called
        sendSpy.mockReturnValue(expectedResult);

        // A sample Lambda event (not consumed by this stand-in handler)
        const event = {
            "httpMethod": "GET",
            "rawPath": "/documents",
            "requestContext": {
                "requestId": "e0GDshQXoAMEJug="
            }
        };

        // Invoke the stand-in for the Lambda handler (the spied push)
        const foo = new Array("foo", "bar");
        const result = foo.push("echo");

        // Compare the result with the expected result
        expect(result).toEqual(expectedResult);
    });
});

node_modules\.bin\jest

PASS  ./__tests__/unit/handlers/simple.test.js
  Test simple handler
    √ should simply return (1 ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        0.279 s, estimated 1 s
Ran all test suites.

 

Wednesday, January 18, 2023

 

Applications wishing to test a serverless function locally have one of the following options:

1. Invoke the Serverless Application Model (SAM) CLI from the AWS public cloud command-line interface; the steps are:

a. sam build -t template.yml

b. sam local invoke

The command that builds the libraries for a default index.js Lambda function handler comes with the option to create a container image. This can be very useful for portability, and the image can be executed on any container framework.

While this is optional, local invocation might require that all the source code and its dependencies be transpiled into a format that is easy to interpret and run.

Enter Babel, a well-known transpiler for this purpose; it makes it easy to run the different flavors of JavaScript and TypeScript notation.

2. Another option for testing the Lambda function handlers is to write unit tests and integration tests. These tests can be run with Jest, a test framework that makes it easy to execute the tests by discovering them from the build folder. All test files are usually named .test.js or .spec.js, and the test cases follow the describe/it readable specifications that have made unit tests a pleasure to read and execute.

Jest is local to the node_modules folder where all the dependencies of the Lambda function handler are installed. It can be installed with the --save-dev option, which eliminates the need to install it for production builds.

Executing Jest unit tests requires mocking, and by virtue of JavaScript’s support for object prototypes, features can be inherited and substituted.

3. A final option is to use a middleware that exercises just the handler methods of the Lambda function handler, so that they can be invoked across the wire by curl commands. Enter Koa, a lightweight middleware that is not bundled with anything else; writing an HTTP server becomes as simple as:

const Koa = require('koa');
const app = new Koa();

// response
app.use(ctx => {
  ctx.body = 'Hello Koa';
});

app.listen(3000);

 

and curl http://localhost:3000/ will return ‘Hello Koa’.

This comes with the nice benefit that each of the handler methods can now be tried individually, and the results will be similar to how the Lambda functions behave when invoked.

 

Another way to exercise the methods would be to include the routes:

const Koa = require('koa');
const http = require('http');
const Router = require('koa-router');
// koa-body v4 style: the default export returns the middleware
const bodyParser = require('koa-body')();

const router = new Router();

// Parse the request body and respond to POST /resource
router.post('/resource', bodyParser, async (ctx, next) => {
  console.log(ctx.request.body);
  ctx.status = 200;
  ctx.body = 'some output for post requests';
  await next();
});

startServerOne();

function startServerOne() {
  const app = new Koa();
  app.use(router.routes());
  http.createServer(app.callback()).listen(8081);
  console.log('Server at Port 8081');
}

These are some of the ways to test lambda function handlers.

Tuesday, January 17, 2023

 

Writing a serverless method that uploads documents:

Description: One of the most common techniques for uploading documents involves a form submission from the frontend application. The user points to a file at a folder location on her computer, and the client-side script in the frontend reads the file as a stream. When the same file content needs to be made available to the middleware, HTTP-based methods struggle with the right way to send the data.

The key to sending the data is to specify it as multipart/form-data. This is the most efficient content type for sending binary data to the server. Multipart means that the data is sent to the server in separate parts. Each part may have a different content type, file name, and data. The parts are separated from each other by a boundary string. When a command-line tool like curl sends this request, it does so as a multipart request with a specially formatted POST message body and a series of parts separated by MIME boundaries.

For example,

POST /echo HTTP/1.1
Content-Type: multipart/form-data; boundary=WD9543A
Content-Length: 100

--WD9543A
Content-Disposition: form-data; name="user-name"

John
--WD9543A
Content-Disposition: form-data; name="text-data"; filename="user.txt"
Content-Type: text/plain

[Text-Data]
--WD9543A--

When the controller receives this request, it can refer directly to the parts, for example through req.body['user-name'] and req.file.

For example:

// Assuming an Express router, which multer plugs into as middleware
const express = require('express');
const os = require('os');
const multer = require('multer');

const router = express.Router();
const upload = multer({ dest: os.tmpdir() });

router.post('/upload', upload.single('file'), function (req, res) {
  const title = req.body.title;
  const file = req.file;
  console.log(title);
  console.log(file);
  res.sendStatus(200);
});
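With a server listening on this router, the upload can be tried with, for example, curl -F "title=test" -F "file=@user.txt" http://localhost:3000/upload; the port is illustrative.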

 

Monday, January 16, 2023

 

Public clouds provide an adoption framework for businesses that helps to create an overall cloud adoption plan that guides programs and teams in their digital transformation. The plan methodology provides templates to create backlogs and plans to build necessary skills across the teams. It helps rationalize the data estate, prioritize the technical efforts, and identify the data workloads. It’s important to adhere to a set of architectural principles which help guide development and optimization of the workloads. A well-architected framework stands on five pillars of architectural excellence which include:

- Reliability
- Security
- Cost Optimization
- Operational Excellence
- Performance Efficiency

The elements that support these pillars are a review, a cost and optimization advisor, documentation, patterns, support and service offers, reference architectures, and design principles.

This guidance provides a summary of how these principles apply to the management of the data workloads.

Cost optimization is one of the primary benefits of using the right tool for the right solution. It helps to analyze the spend over time as well as the effects of scale out and scale up. An advisor can help improve reusability, on-demand scaling, reduced data duplication, among many others.

Performance is usually based on external factors and is very close to customer satisfaction. Continuous telemetry and reactiveness are essential to tuning up performance. The shared environment controls for management and monitoring create alerts, dashboards, and notifications specific to the performance of the workload. Performance considerations include storage and compute abstractions, dynamic scaling, partitioning, storage pruning, enhanced drivers, and multilayer cache.

Operational excellence comes with security and reliability. Security and data management must be built right into the system at every layer, for every application and workload. The data management and analytics scenario focuses on establishing a foundation for security. Although workload-specific solutions might be required, the foundation for security is built with the Azure landing zones and managed independently from the workload. Confidentiality and integrity of data, including privilege management, data privacy, and appropriate controls, must be ensured. Network isolation and end-to-end encryption must be implemented. SSO, MFA, conditional access, and managed service identities are involved to secure authentication. Separation of concerns between the Azure control plane and data plane, as well as RBAC access control, must be used.

The key considerations for reliability are how to detect change and how quickly the operations can be resumed. The existing environment should also include auditing, monitoring, alerting and a notification framework.

In addition to all the above, some consideration may be given to improving individual service level agreements, redundancy of workload specific architecture, and processes for monitoring and notification beyond what is provided by the cloud operations teams.

Sunday, January 15, 2023

#codingexercise 

Merge K sorted Lists:

List<Integer> merge(List<List<Integer>> slices)
{
    var result = new ArrayList<Integer>();
    while (slices.size() > 0) {
        // Sort so that the slice with the smallest head element comes first
        slices.sort(new ListComparator());
        // Discard null or exhausted slices from the front
        while (slices.size() > 0 && (slices.get(0) == null || slices.get(0).size() == 0)) {
            slices.remove(0);
        }
        if (slices.size() == 0) {
            return result;
        }
        result.add(slices.get(0).remove(0));
    }
    return result;
}

public class ListComparator implements Comparator<List<Integer>> {
    public int compare(List<Integer> l1, List<Integer> l2) {
        // Null and empty slices sort first so that the merge loop can discard them
        if (l1 == null && l2 == null) return 0;
        if (l1 == null) return -1;
        if (l2 == null) return 1;
        if (l1.size() == 0 && l2.size() == 0) return 0;
        if (l1.size() == 0) return -1;
        if (l2.size() == 0) return 1;
        return l1.get(0).compareTo(l2.get(0));
    }
}

 

Test cases:

1. Null, Null -> []
2. Null, [] -> []
3. [], Null -> []
4. [],[] -> []
5. [1],[] -> [1]
6. [],[1] -> [1]
7. [1],[1] -> [1,1]
8. [1],[2] -> [1,2]
9. [2],[1] -> [1,2]
10. [1],[2,3] -> [1,2,3]
11. [1,2],[3] -> [1,2,3]
12. [1,2,3],[] -> [1,2,3]
13. [1,3],[2] -> [1,2,3]
14. [],[1,2,3] -> [1,2,3]
15. [1],[2],[3] -> [1,2,3]
16. [1],[2],[3],[5],[6] -> [1,2,3,5,6]
17. [],[],[1,2,3] -> [1,2,3]
18. [],[1,2,3],[] -> [1,2,3]
19. [1,2],[3,5,6],[] -> [1,2,3,5,6]
20. [1,2],[3],[4,5],[6] -> [1,2,3,4,5,6]
21. [1,2,3,4,5,6,7],[] -> [1,2,3,4,5,6,7]
Reference: previous post