This is a continuation of the earlier posts starting with this one: http://ravinote.blogspot.com/2020/09/best-practice-from-networking.html
While container platforms for Platform-as-a-Service (PaaS) have enabled software to be deployed without any awareness of the underlying host and to be rotated frequently from one host to another, the end user's adoption of a PaaS platform depends on the production readiness of the applications and services. The push for PaaS adoption has made little or no change to the use and proliferation of virtual machines by individual users.
The cloud services provider can package services such as additional storage, regular backup schedules, patching schedules, system management, security, and billing at the time each asset is requested. However, such services depend on the cloud where they are requested. For a private cloud, much of this service is delivered in-house, adding to the cost even when the inventory itself is free.
The use of a virtual machine image as a storage artifact highlights the role of large files in storage and networking. Images are usually saved on a datastore in the datacenter, but nothing prevents the end user from owning the machine, taking periodic backups of the VM image, and uploading them with tools like duplicity. These files can then be stashed in storage products such as object storage. S3's support for multipart upload makes it easier to move such large files around.
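As a minimal sketch of that last point, the snippet below uses boto3's managed transfer, which switches to multipart upload automatically once a file crosses a size threshold. The bucket name, key, and local image path are hypothetical placeholders.

import boto3
from boto3.s3.transfer import TransferConfig

# Hypothetical names: the bucket and the local image path are placeholders.
BUCKET = "vm-backups"
IMAGE_PATH = "/var/backups/dev-vm.qcow2"

# Use multipart upload for anything over 64 MB and send parts in parallel.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=4,
)

s3 = boto3.client("s3")
# upload_file performs a multipart upload under the hood when the file
# exceeds the configured threshold.
s3.upload_file(IMAGE_PATH, BUCKET, "images/dev-vm.qcow2", Config=config)

Tuning the chunk size and concurrency is a trade-off between memory use on the uploader and throughput over the network.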
The use of large files also helps test much of the bookkeeping associated with logic that depends on the size of the artifact. While performance optimizations remove redundant operations across layers to streamline a use case, the unoptimized code path is better exercised with large files.
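One way to exercise such size-dependent code paths without keeping multi-gigabyte artifacts around is to generate a sparse test file on demand, as in this sketch (the file name and size are arbitrary choices, not part of the original post):

import os

# Arbitrary test parameters; adjust the size to the code path under test.
TEST_FILE = "large-test.img"
SIZE_BYTES = 8 * 1024 * 1024 * 1024  # 8 GiB logical size

# Create a sparse file: seek to the desired size and write a single byte.
# The logical size drives any size-based logic without consuming 8 GiB of disk.
with open(TEST_FILE, "wb") as f:
    f.seek(SIZE_BYTES - 1)
    f.write(b"\0")

print(os.path.getsize(TEST_FILE))  # reports the full logical size

Note that a sparse file is enough for testing size bookkeeping, but code that reads actual content would still need the blocks to be written out.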