Thoughts on Healthcare data and technology integrations
The following case study concerns a large healthcare mission to
modernize its data and technology operations. A previous article discussed
the mission's adoption of the public cloud and the journey involved; this
article looks ahead to future improvements in overall ambition and growth.
First, it must be said that the mission is just over a
decade old and has made significant strides in all areas of computing, including
migrations and modernizations toward the public cloud. For instance, the
detection and remediation of incorrect or fraudulent payment activities alone
amount to hundreds of millions of dollars each year. The technology stack is
similarly broad: current languages, libraries, packages, and hosting solutions
such as Golang and Python; significant investments in GPU-based machine
learning models; and both established and new data science and statistical
analysis software, kept at recent versions to reduce technical debt and
overhead. Internal portals let teams request compute, storage, and network
resources as well as cloud infrastructure and platform-as-a-service stacks,
reflecting current industry best practices.
Second, the venture has already tried and tested multiple
levels of development, architecture, and cross-cutting initiatives which,
although they began on-premises, have moved nimbly to the cloud. Leaders and
pioneers have forged significant trends and patterns with little concern for
organizational mindsets that favor remaining on-premises or adhering to
outdated practices.
With this background, the improvements worth calling out are
in the areas of data and machine learning pipelines, because this
infrastructure is a significant contributor to the spirit and practice of
exploring new opportunities for business improvements. The next sections take
each in turn:
Data pipelines:
Some of the lessons learned from centralizing data and
moving on-premises data to the cloud are that access to the data, and the
manner of its use, matter just as much as making the data available.
Pass-through authentication lets a variety of clients access data under their
own identities, so that each caller remains individually accountable for its
actions.
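As a minimal sketch of the idea in Python, assuming a small Flask-fronted data
service and a downstream store that accepts bearer tokens (the store endpoint
here is a hypothetical placeholder), the service forwards the caller's own
token rather than substituting a shared service credential:

    import flask
    import requests

    app = flask.Flask(__name__)

    # Hypothetical downstream store endpoint; any REST-fronted store
    # that accepts bearer tokens would behave the same way.
    DATA_STORE_URL = "https://datastore.example.com/files"

    @app.route("/data/<path:name>")
    def get_data(name):
        # Pass-through authentication: forward the caller's own bearer
        # token instead of a shared service credential, so the store
        # authorizes and audits each caller individually.
        token = flask.request.headers.get("Authorization")
        if not token:
            return "Missing Authorization header", 401
        resp = requests.get(f"{DATA_STORE_URL}/{name}",
                            headers={"Authorization": token})
        return resp.content, resp.status_code

Because the store sees each caller's own identity, authorization decisions and
audit trails attach to the caller rather than to the intermediary service.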
Another significant challenge is that network utilization
becomes an increasing bottleneck: data transfers must be spread across the
clock and the calendar to justify the bandwidth they consume.
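One common way to spread transfers out is to confine them to an off-peak
window and cap their rate. The following is a rough sketch; the window and
bandwidth cap are hypothetical values that would come from an actual capacity
plan:

    import datetime
    import time

    # Hypothetical off-peak window and bandwidth cap.
    WINDOW_START = datetime.time(22, 0)   # 10 PM
    WINDOW_END = datetime.time(5, 0)      # 5 AM
    MAX_BYTES_PER_SEC = 50 * 1024 * 1024  # 50 MB/s

    def in_off_peak_window(now=None):
        t = (now or datetime.datetime.now()).time()
        # The window wraps past midnight, so a time qualifies if it
        # falls after the start or before the end.
        return t >= WINDOW_START or t <= WINDOW_END

    def throttled_copy(src, dst, chunk_size=4 * 1024 * 1024):
        # Copy file-like src to dst in chunks, pausing outside the
        # off-peak window and sleeping per chunk to stay under the cap.
        while True:
            if not in_off_peak_window():
                time.sleep(60)  # wait for the window to open
                continue
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
            time.sleep(len(chunk) / MAX_BYTES_PER_SEC)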
The third challenge involves standing up a variety of
central data stores for structured, unstructured, and event or streaming data
architectures.
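To make the separation concrete, here is a toy routing sketch in Python; the
three stores are hypothetical stand-ins for a relational warehouse, an object
store, and an event broker:

    from dataclasses import dataclass

    # Hypothetical stand-ins for three purpose-built central stores.
    @dataclass
    class Store:
        name: str

        def write(self, record):
            print(f"{self.name} <- {record}")

    warehouse = Store("warehouse")        # structured: tables and schemas
    object_store = Store("object_store")  # unstructured: blobs, documents
    broker = Store("broker")              # events and streams

    def route(record: dict) -> Store:
        # Pick the store that matches the shape of the record.
        if "schema" in record:
            return warehouse
        if "event_time" in record:
            return broker
        return object_store

    route({"schema": "claims", "id": 1}).write({"id": 1})

Keeping each kind of data in a store built for it avoids forcing one system to
serve access patterns it was never designed for.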