Monday, March 13, 2023

Continuous event analysis for Fleet Management software:

Use case: Continuous events from fleet management operations include data such as geospatial telemetry from driverless vehicles, weblogs in clickstream analytics, and point-of-sale records from inventory control. The real-time fleet management of a station involves routing incoming vehicles through the station and scheduling their departures with the objective of optimizing the punctuality and regularity of transit service. The purpose is to develop an automated vehicle traffic control system. The scheduling problem is modeled as a bicriteria job shop scheduling problem with additional constraints. There are two objective functions in lexicographical order: first, the minimization of tardiness/earliness, and second, the headway optimization. The problem is solved in two steps: a heuristic builds a feasible solution by considering the first objective function, and then the regularity is optimized. This approach also works well for simulated data at the station. This article investigates the use of a data pipeline and cloud-native resources for the management of such a fleet.
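The two-step idea can be sketched roughly as follows. This is only an illustrative simplification in Python, not the bicriteria job-shop formulation itself: the names (Vehicle, schedule_departures, smooth_headways, MIN_SEPARATION) and the single-platform minimum-separation rule are assumptions made for the example, and a real solver would also handle the routing of vehicles through the station and the additional constraints.

# A minimal sketch of the two-step lexicographic heuristic; all names and the
# single-platform separation rule are assumptions, not the article's algorithm.
from dataclasses import dataclass
from typing import List

MIN_SEPARATION = 2  # assumed minimum minutes between consecutive departures

@dataclass
class Vehicle:
    vehicle_id: str
    planned_departure: int     # planned departure time, in minutes
    scheduled_departure: int = 0

def schedule_departures(vehicles: List[Vehicle]) -> List[Vehicle]:
    """Step 1: greedily build a feasible schedule that keeps each departure
    as close to its plan as the separation constraint allows (tardiness/earliness)."""
    ordered = sorted(vehicles, key=lambda v: v.planned_departure)
    last_slot = None
    for v in ordered:
        slot = v.planned_departure
        if last_slot is not None:
            slot = max(slot, last_slot + MIN_SEPARATION)
        v.scheduled_departure = slot
        last_slot = slot
    return ordered

def smooth_headways(schedule: List[Vehicle], slack: int = 1) -> List[Vehicle]:
    """Step 2: nudge each departure by at most `slack` minutes from the step-1
    schedule to even out headways, preserving order and minimum separation."""
    for prev, curr, nxt in zip(schedule, schedule[1:], schedule[2:]):
        ideal = (prev.scheduled_departure + nxt.scheduled_departure) // 2
        low = max(curr.scheduled_departure - slack, prev.scheduled_departure + MIN_SEPARATION)
        high = min(curr.scheduled_departure + slack, nxt.scheduled_departure - MIN_SEPARATION)
        if low <= high:
            curr.scheduled_departure = min(max(ideal, low), high)
    return schedule

if __name__ == "__main__":
    fleet = [Vehicle("V1", 0), Vehicle("V2", 1), Vehicle("V3", 9)]
    for v in smooth_headways(schedule_departures(fleet)):
        print(v.vehicle_id, v.planned_departure, v.scheduled_departure)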

Implementing a data pipeline:

The example here refers to the Azure public cloud in order to point to specific products and principles, but any equivalent public cloud resources can be used. There is a point of ingestion from data sources, typically via Azure Event Hubs, IoT Hub, or Blob storage. Event ordering options and time windows can be suitably adjusted to perform aggregations. The query language is SQL, and it can be extended with JavaScript or C# user-defined functions. Queries written in SQL are easy to apply to filtering, sorting, and aggregation. Open-source stream analytics software such as Apache Flink also provides SQL-like querying in addition to the structured query operations familiar from collections and per-event processing methods.

The topology between ingestion and delivery is handled by the stream analytics service, while allowing extensions with the help of reference data stores, Azure Functions, and real-time scoring via machine learning services. Event Hubs, Azure Blob storage, and IoT Hub collect data on the ingestion side, while the analyzed results are distributed via alerts and notifications, dynamic dashboards, data warehousing, and storage/archival. The fan-out of data to different services is itself a value addition, but the ability to transform events into processed events also opens up more possibilities for downstream usage, including reporting and visualizations.

As with all services in the Azure portfolio, a data pipeline comes with standard deployment via Azure Resource Manager templates, health monitoring via Azure Monitor, usage-based billing that can drive down costs, and various programmability options such as SDKs, REST APIs, command-line interfaces, and PowerShell automation. It can be offered as a fully managed PaaS offering, so the infrastructure and workflow need not be set up by hand for most deployments. It can also run directly in the cloud, rather than on infrastructure such as Kubernetes hosted in the cloud, and scale to many events with relatively low latency. Such a cloud-native continuous-event fleet management service can be not only production-ready but also reliable in mission-critical deployments. Security and compliance are not sacrificed for the sake of performance, in keeping with the best practices for cloud resources.
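As a concrete illustration of the ingestion side, the sketch below publishes a small batch of vehicle telemetry to Event Hubs. It assumes the azure-eventhub Python SDK (v5-style API); the connection string, hub name, and telemetry fields are placeholders.

# A minimal ingestion sketch, assuming the azure-eventhub Python SDK.
# The connection string, hub name, and telemetry fields are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "fleet-telemetry"  # hypothetical hub for vehicle events

def send_vehicle_events(events):
    """Publish a small batch of fleet telemetry events to Event Hubs."""
    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        for event in events:
            batch.add(EventData(json.dumps(event)))
        producer.send_batch(batch)

if __name__ == "__main__":
    send_vehicle_events([
        {"vehicle_id": "V1", "lat": 47.60, "lon": -122.33, "speed_kmph": 38},
        {"vehicle_id": "V2", "lat": 47.61, "lon": -122.34, "speed_kmph": 0},
    ])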

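The windowed aggregations mentioned above can also be illustrated outside the service. The sketch below mimics tumbling-window semantics in plain Python, roughly what a GROUP BY over a fixed time window would express in the stream analytics query; the event shape and the five-minute window size are assumptions for the example.

# A pure-Python illustration of tumbling-window aggregation semantics:
# fixed, non-overlapping windows keyed by vehicle, averaging speed per window.
from collections import defaultdict

WINDOW_MINUTES = 5  # assumed window size

def average_speed_per_window(events):
    """Group events into tumbling windows per vehicle and average the speed."""
    buckets = defaultdict(list)
    for e in events:
        window_start = (e["minute"] // WINDOW_MINUTES) * WINDOW_MINUTES
        buckets[(e["vehicle_id"], window_start)].append(e["speed_kmph"])
    return {key: sum(speeds) / len(speeds) for key, speeds in buckets.items()}

if __name__ == "__main__":
    stream = [
        {"vehicle_id": "V1", "minute": 0, "speed_kmph": 30},
        {"vehicle_id": "V1", "minute": 3, "speed_kmph": 50},
        {"vehicle_id": "V1", "minute": 6, "speed_kmph": 20},
        {"vehicle_id": "V2", "minute": 1, "speed_kmph": 0},
    ]
    print(average_speed_per_window(stream))
    # {('V1', 0): 40.0, ('V1', 5): 20.0, ('V2', 0): 0.0}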