The discussion of archival using streams continues:
Traditionally, blob storage has offered several benefits for archival:
1) Backups tend to be binary, and blobs can be both large and binary.
2) Backup and archival platforms can ingest data over protocols, and blobs can be transferred natively or via those same protocols.
3) The cold storage class of object storage is well suited to archival platforms.
Stream stores allow one stream to be converted to another and do away with storage classes altogether. Instead, the cold, warm and hot regions of a stream serve the equivalent purpose of classes.
Data transformation routines can be offloaded to compute outside the storage, if necessary, to transform the data prior to archival. The idea here is to package a common technique across data sources that would otherwise each handle their own archival preparations per stream. All in all, this becomes an archival management system rather than remaining a mere store.
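The region-based approach above can be sketched in a few lines. This is a minimal illustration, not an actual stream-store API: the age thresholds, the `Segment` shape, and the use of compression as the pre-archival transformation are all assumptions chosen for the example.

```python
import zlib
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical region boundaries; a real stream store would expose
# its own hot/warm/cold demarcation rather than fixed ages.
WARM_AFTER = timedelta(days=7)
COLD_AFTER = timedelta(days=30)

@dataclass
class Segment:
    stream: str
    data: bytes
    written_at: datetime

def region_of(segment: Segment, now: datetime) -> str:
    """Classify a segment into the hot, warm or cold region of its stream."""
    age = now - segment.written_at
    if age >= COLD_AFTER:
        return "cold"
    if age >= WARM_AFTER:
        return "warm"
    return "hot"

def prepare_for_archival(segment: Segment) -> bytes:
    """The offloadable transformation step (compression as a stand-in).

    In practice this routine would run on compute outside the storage."""
    return zlib.compress(segment.data)

def archive_cold_segments(segments, now):
    """Transform every segment currently in the cold region for archival."""
    return [prepare_for_archival(s) for s in segments
            if region_of(s, now) == "cold"]
```

The point of the sketch is that the archival decision is driven by the segment's region within the stream, not by moving the blob between storage classes.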
Let us compare a stream archival policy evaluator with Artifactory. Both emphasize the following:
1) Proper naming, with a format like “<team>-<technology>-<maturity>-<locator>”, where:
The team is the primary identifier for the project.
The technology indicates the tool or package type being used.
The maturity level indicates the stage of processing of the binary objects, such as debug or release.
The locator indicates the topology of the artifact.
With such proper naming in place, both rely on rules that exploit this organization for effective processing.
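A rule that exploits the naming convention might start by parsing names into their four components. The following is a sketch; the exact character sets and the restriction of maturity to debug/release are assumptions for illustration, not a published specification.

```python
import re

# The "<team>-<technology>-<maturity>-<locator>" convention from the text.
# Field alphabets here are illustrative; real rules would be stricter.
NAME_RE = re.compile(
    r"^(?P<team>[a-z0-9]+)-(?P<technology>[a-z0-9]+)-"
    r"(?P<maturity>debug|release)-(?P<locator>[a-z0-9./]+)$"
)

def parse_name(name: str) -> dict:
    """Split a repository/stream name into its four components,
    raising ValueError if the name does not follow the convention."""
    m = NAME_RE.match(name)
    if not m:
        raise ValueError(f"name does not follow convention: {name!r}")
    return m.groupdict()
```

For example, a hypothetical name like "payments-docker-release-us.east" would parse into its team, technology, maturity and locator parts, which downstream rules can then match on.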
2) Both make use of three main categories: security, performance and operability.
Security permissions determine who has access to the streams.
Performance considerations determine cleanup policies so that the repositories keep performing efficiently.
Operability considerations determine whether objects need to be placed in different repositories to improve read, write and delete access and to reduce interference.
3) Both make heavy use of local, remote and virtual repositories to their advantage when getting and putting objects.
While Artifactory relies on organizations to determine their own policies, the stream policy evaluator is a manifestation of those policies and spans repositories, organizations and administrative responsibilities.
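One way to picture such a policy evaluator is as a set of rules grouped under the three categories above, run against each stream. This is a sketch under stated assumptions: the `Stream` shape and the three example rules are hypothetical, not drawn from any real product.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Set

@dataclass
class Stream:
    name: str
    owner_team: str
    size_gb: float
    readers: Set[str] = field(default_factory=set)

@dataclass
class Rule:
    category: str          # "security" | "performance" | "operability"
    description: str
    check: Callable[[Stream], bool]

def evaluate(stream: Stream, rules: List[Rule]) -> List[str]:
    """Run every rule against a stream; return descriptions of violations."""
    return [r.description for r in rules if not r.check(stream)]

# Illustrative rules, one per category from the comparison above.
RULES = [
    Rule("security", "owning team must retain read access",
         lambda s: s.owner_team in s.readers),
    Rule("performance", "streams over 500 GB need a cleanup policy",
         lambda s: s.size_gb <= 500),
    Rule("operability", "name must carry a team prefix for repository routing",
         lambda s: "-" in s.name),
]
```

Because the evaluator owns the rules rather than delegating them to each organization, the same checks apply uniformly across repositories and administrative boundaries, which is the distinction drawn against Artifactory above.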