Friday, February 11, 2022

 

Microsoft Graph 

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. The previous article discussed the connectors used with Microsoft Graph. This article introduces Microsoft Graph Data Connect. Microsoft Graph enables integration with the best of Microsoft 365, Windows 10, and Enterprise Mobility and Security services in Microsoft 365, using REST APIs and client libraries.

Microsoft Graph provides a unified programmability model, similar in utility to a Kusto cluster and database. The Microsoft Graph model allows Microsoft Graph Connectors to access data from different data sources and provides a common way to query the data. It is the gateway to data and intelligence in Microsoft 365. It can also act as a source for downstream Azure data stores that require data to be delivered. Microsoft Graph Data Connect provides a set of tools to streamline the secure and scalable delivery of Microsoft Graph data.

The emphasis is on heterogeneity of data in the form of files, messages, meetings, users, people, devices, groups, calendars, coworkers, insights, chats, teams, and tasks. The unified programming access it provides can reach data in and across Microsoft services, including the cloud, hybrid cloud, and third-party clouds. A thin aggregation layer routes incoming requests to their corresponding destination services. This pattern of data virtualization is not uncommon, but the breadth of data collected and the unified model provide an incredible opportunity for developers.

Microsoft Graph Data Connect augments Microsoft Graph’s transactional model with an intelligent way to access rich data at scale. It is ideal to connect big data and for machine learning.

It allows us to develop applications for analytics, intelligence, and business process optimization by extending Microsoft 365 data into Azure. It uses Azure Data Factory to copy Microsoft 365 data to the application's storage at configurable intervals, and it provides a set of tools to streamline the delivery of this data into Microsoft Azure. It allows us to manage the data, see who is accessing it, and request specific properties of an entity. This refines the Microsoft Graph permission model, which otherwise grants or denies applications access to entire entities. The granular data consent model allows applications to access only specific properties in an entity and opens new use cases on the same data without compromising security and isolation. It supports Azure-native capabilities such as encryption, geo-fencing, auditing, and policy enforcement.
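The idea of requesting only specific properties of an entity also exists on the transactional side of Microsoft Graph through the standard $select query option. A minimal sketch of building such a request follows; the base URL and $select option are standard Microsoft Graph conventions, while the resource and property names are just examples:

```python
# Build a Microsoft Graph request URL that asks for only the
# properties an application needs, rather than whole entities.
# The endpoint and $select option are standard Microsoft Graph
# conventions; the property list below is illustrative.
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_select_url(resource: str, properties: list[str]) -> str:
    """Return a Graph URL restricted to the given properties."""
    query = urlencode({"$select": ",".join(properties)})
    return f"{GRAPH_BASE}/{resource}?{query}"

url = build_select_url("users", ["displayName", "mail"])
# → https://graph.microsoft.com/v1.0/users?%24select=displayName%2Cmail
```

The same minimization principle applies whether data is pulled transactionally or delivered in bulk through Data Connect: request only what the application needs.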

 

 

Thursday, February 10, 2022

 

Microsoft Graph

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. This article continues to elaborate on the connectors used with Microsoft Graph. Microsoft Graph enables integration with the best of Microsoft 365, Windows 10, and Enterprise Mobility and Security services in Microsoft 365, using REST APIs and client libraries.

It uses the concepts of users and groups to elaborate on these functionalities. A user is an individual who uses Microsoft 365 cloud services; for Microsoft Graph, it is the focal point whose identity is protected and whose access is managed. The data associated with this entity, and the opportunities to enrich the context, provide real-time information, and surface deep insights, are what make Microsoft Graph so popular. A group is the fundamental entity that lets users collaborate, and it integrates with other services to enable scenarios for task planning, teamwork, education, and more.

Since Microsoft Graph is the data fabric that empowers intelligent experiences, it needs mechanisms to bring content from external services to Microsoft Graph which enables external data to power Microsoft 365 experiences.

Connectors offer a simple and intuitive way to do just that. For example, the data brought in from the organization can appear in Microsoft Search results. This expands the type of content sources that are searchable in Microsoft 365 productivity applications and the broader ecosystem.

There are over a hundred connectors currently available from Microsoft and partners, including Azure services, Box, ServiceNow, Salesforce, Google services, MediaWiki, and more. Writing a custom connector illustrates the details of how they work.

There is a set of connector REST APIs available from Microsoft Graph. These are used to:

1.       Create and manage external data connections

2.       Define and register the schema of the external data type(s)

3.       Ingest external data items into Microsoft Graph

4.       Sync external groups

A connection is a logical unit for external data that can be managed as a single unit. The Connection API provides the connection resource and can be used to create, update, and delete connections in Microsoft Graph.

The connection schema determines how the content is used in various Microsoft 365 experiences. A schema is a flat list of all the properties that can be added to the connection, along with their attributes, labels, and aliases. The schema must be registered before items are ingested into Microsoft Graph.

Items that can be added to the Microsoft Search service are represented by the externalItem resource in Microsoft Graph. Items in the external service can be granted or denied access via ACLs for different types of non-Azure Active Directory groups. When the items are ingested into Microsoft Graph, they must honor these ACLs. The External Groups API sets permissions on external items ingested into Microsoft Graph. The connector must be registered as an application in the Azure AD admin center.
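A minimal sketch of the JSON bodies involved in these steps is shown below. The endpoint paths in the comments follow the Microsoft Graph connectors API under /external/connections; the connection id, property names, group id, and content are hypothetical, and the exact ACL attributes should be checked against the current API reference:

```python
# Sketch of the JSON bodies used with the Microsoft Graph connector APIs:
# creating a connection, registering a schema, and ingesting an item with
# an ACL. All ids, property names, and content below are hypothetical.

def make_connection(connection_id: str, name: str, description: str) -> dict:
    # POST https://graph.microsoft.com/v1.0/external/connections
    return {"id": connection_id, "name": name, "description": description}

def make_schema(properties: list[dict]) -> dict:
    # PATCH https://graph.microsoft.com/v1.0/external/connections/{id}/schema
    # Schema is a flat list of properties; must be registered before ingestion.
    return {"baseType": "microsoft.graph.externalItem", "properties": properties}

def make_external_item(acl: list[dict], properties: dict, content: str) -> dict:
    # PUT https://graph.microsoft.com/v1.0/external/connections/{id}/items/{itemId}
    return {
        "acl": acl,                                   # honors the source ACLs
        "properties": properties,
        "content": {"type": "text", "value": content},
    }

item = make_external_item(
    acl=[{"type": "externalGroup", "value": "external-group-id",
          "accessType": "grant"}],
    properties={"title": "Onboarding guide"},
    content="How to get started...",
)
```

The externalGroup ACL entry is what ties ingestion to the External Groups API: items remain visible only to members of the synced group.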

 

 

 

Wednesday, February 9, 2022

 

Microsoft Graph

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. This article continues to elaborate on the best practices in working with Microsoft Graph. Microsoft Graph enables integration with the best of Microsoft 365, Windows 10, and Enterprise Mobility and Security services in Microsoft 365, using REST APIs and client libraries. It uses the concepts of users and groups to elaborate on these functionalities. A user is an individual who uses Microsoft 365 cloud services; for Microsoft Graph, it is the focal point whose identity is protected and whose access is managed. The data associated with this entity, and the opportunities to enrich the context, provide real-time information, and surface deep insights, are what make Microsoft Graph so popular. A group is the fundamental entity that lets users collaborate, and it integrates with other services to enable scenarios for task planning, teamwork, education, and more.

The Graph Explorer helps in getting to know the API and is the easiest way to start experimenting with the data available. Proper REST requests can be made, and the responses are representative of those encountered programmatically, which eliminates surprises and errors during implementation. Authentication for Microsoft Graph is made easier using the Microsoft Authentication Library (MSAL), which acquires an access token.
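Under the covers, app-only token acquisition is an OAuth 2.0 client-credentials request to the Microsoft identity platform; MSAL wraps this in a single call. The sketch below only builds the request rather than sending it, and the tenant, client id, and secret are placeholders:

```python
# Sketch of the token request MSAL issues for app-only access: an
# OAuth 2.0 client-credentials POST to the Microsoft identity platform
# token endpoint. Tenant and client values are placeholders; in practice
# MSAL's ConfidentialClientApplication handles this and caches the token.
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests all application permissions already consented
        "scope": "https://graph.microsoft.com/.default",
    })
    return url, body

url, body = build_token_request("contoso.onmicrosoft.com", "app-id", "secret")
```

The access token returned by this request then goes into the Authorization header of every Graph call.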

The best practices include the following:

·        Using least privilege so that the APIs are called only with the permissions that are necessary.

·        Using the correct permission type based on the scenario, which is particularly important for delegated permissions; if the code runs without a signed-in user, using delegated permissions can lead to a vulnerability.

·        Configuring the application properly for end-user and administrator experiences.

·        Using multi-tenant applications, since customers may have application and consent controls configured in different states.

·        Using Pagination when the responses to the requests made to Microsoft Graph are large and the results must be browsed efficiently.

·        Handling expected errors for robustness and user-convenience. Certain errors are retriable while others need to be translated to the user.

·        Adding members to existing enumerations can break applications. Evolvable enumerations provide a better alternative. They have a common sentinel member called unknownFutureValue, which demarcates known members that were defined in the enum initially from unknown members that are added subsequently or will be defined in the future. Members of evolvable enums can be referenced by their string values.

·        Making calls to Microsoft Graph for real-time data and storing data locally only when required and the terms of use and privacy policy can be upheld.

·        Getting only the minimum amount of data for improving performance, security and privacy.

·        Choosing only the properties that the application needs and no more.

·        Using webhooks to get push notifications when data changes.

·        Using delta query to efficiently keep data up to date.

·        Using webhooks and delta query together, because if only one is used, the right polling interval must be chosen. Webhook notifications can serve as the trigger to make delta query calls.

·        Batching, which enables optimization of the application by combining multiple requests into a single JSON object.

·        Combining individual requests into a single batch can save significant network latency and conserve connection requests.

·        Using and honoring TLS to support all capabilities of Microsoft Graph.

·        Opening connections to all advertised DNS answers and generating a unique GUID to send in the HTTP request headers and to use for logging.

These are some of the considerations toward best practices in working with Microsoft Graph.
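Several of the practices above can be sketched in a few lines of code: following @odata.nextLink for pagination, retrying throttled responses, and composing a $batch body. The nextLink and $batch shapes are standard Graph conventions; the `fetch` callable here is an injected stand-in for a real HTTP client so the control flow can be shown without a network:

```python
# Sketch of three best practices: paging with @odata.nextLink, retrying
# throttled (HTTP 429) responses, and grouping calls into a $batch body.
import time

def fetch_all_pages(url: str, fetch, max_retries: int = 3) -> list:
    """Follow @odata.nextLink until the collection is exhausted."""
    items = []
    while url:
        response = fetch(url)
        if response.get("status") == 429:          # throttled; back off
            if max_retries == 0:
                raise RuntimeError("throttled too many times")
            time.sleep(response.get("retryAfter", 1))
            max_retries -= 1
            continue
        items.extend(response.get("value", []))
        url = response.get("@odata.nextLink")      # absent on the last page
    return items

def make_batch(requests_: list) -> dict:
    """Combine (method, url) pairs into a single JSON $batch body."""
    return {"requests": [
        {"id": str(i), "method": method, "url": url}
        for i, (method, url) in enumerate(requests_, start=1)
    ]}
```

A single POST of the $batch body to /v1.0/$batch replaces several round trips, which is where the latency savings mentioned above come from.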

Tuesday, February 8, 2022

Writing automation that retrieves data via APIs versus retrieving it from Kusto.

 

Introduction:

Many partners want access to the data that a service team maintains in its inventory. One popular technique to open up data about the resources provisioned by the service involves the use of a Kusto database and cluster. APIs provide real-time access to the data, while Kusto provides continuously replicated data. Sometimes there is a lag in the refresh of the data in the shared Kusto database, and it can vary from table to table depending on size and use. Aside from this lag, the data is expected to be the same between the two, and this article explores the appropriate usage of one versus the other.

Description:

Kusto data access is an option to access the data directly from a database. It is very helpful when we want to browse through the data or explore it. APIs encapsulate logic with access to the data and provide validation, error translation, and response formatting, along with diagnosability, telemetry, and troubleshooting help. These are owned by the service team that also owns the data. The APIs are also versioned, giving the applications that use them some reassurance about compatibility and a migration path. APIs are also very performant and already have fast paths for critical scenarios. They are continuously maintained for streamlined access to data, with proper controls and overrides for desired custom behavior. APIs guarantee robustness, predictability, and SLAs for existing and new capabilities that the service team authors. In this sense, it is managed access to the data.

This can be compared to Kusto data access, where the compute required to extract, transform, and load the data is similar in nature to the implementation of the API but now falls under the do-it-yourself onus of the client. If the client teams were to invest in building access to the data via Kusto queries, they would also own the maintenance and the total cost of ownership, which accrues enormously over extended periods of time. One of the unaccounted costs for Kusto comes from its fragility. The queries, the formatting of the data, the semantics, and the deprecation of the schema associated with the data are all susceptible to change without any notification to the applications and their authors. Even the values of the data can change, and assumptions made on them can break. This implies that the cost of data access is higher for Kusto.

Another dimension of comparison is storage. The APIs provide read-write capabilities, while Kusto is essentially for read-only purposes, mandating local storage for stashed results and transformations during vectorized executions. The size of the storage is also a consideration when the frequency and the access patterns are high. An application that wishes to enrich the data in place must make a copy of the original, transform it, and save it in local or remote stores, but not at the source. If the data supported user-defined objects and dictionaries, then the APIs would provide a way to enhance them so that the next data access gets the additional state persisted with the access.

APIs become the first choice for accessing data, but Kusto can be useful in automations that cannot wait for the functionality to be available via APIs published by the service team. It is also very useful for writing one-off automations that are special-purpose or dedicated, without any impact to customers. Most commercial systems will rely on APIs for interacting between services, especially for production environments and cloud scale. In-house projects and reporting dashboards can make use of Kusto directly, or Azure Data Explorer, or automations based on them. Kusto queries can also be quite elaborate or custom-defined to suit specific needs, making them faster and more lightweight compared to the staged management, release pipelines, and scheduled delivery of features introduced into APIs. The ability to include such queries in background automation is sometimes useful because they don't involve interactivity with the customer, and those automations can be kicked off periodically or on demand.

Both Kusto and API data access can be programmatic, involving the use of a query provider and an HTTP client respectively. But the code for Kusto data access will likely involve more packing and unpacking into objects, as well as conversions, whereas the requests and responses for the API come already composed and versioned along with their corresponding APIs. This investment can be made if the query language and provider offer usefulness that is not available otherwise or that would require much more code to be written on the API side. No-code or low-code scenarios prefer this approach, but those scenarios do not include cases where the transfer of data must be made formal.
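The versioning difference can be made concrete with a small sketch. The base URL, resource path, api-version value, and KQL table and columns below are all illustrative; the point is that the API request pins its contract in the URL, while the raw KQL text carries no such guarantee:

```python
# Illustrative contrast of the two access styles. The API call pins a
# version in the URL, so responses stay stable across service changes;
# the Kusto path is a raw query string whose table and column schema can
# change underneath the caller. All names and values are hypothetical.

def build_api_request(base: str, resource: str, api_version: str) -> str:
    # A versioned, Azure Resource Manager-style GET
    return f"{base}/{resource}?api-version={api_version}"

def build_kusto_query(table: str, days: int) -> str:
    # Raw KQL: nothing in the text guarantees the table or columns exist
    return (f"{table} | where Timestamp > ago({days}d) "
            f"| project ResourceId, State")

api_url = build_api_request("https://management.example.com",
                            "subscriptions/sub-1/resources", "2021-04-01")
kql = build_kusto_query("ResourceInventory", 7)
```

If the service team renames a column in the replicated table, the KQL string breaks silently; the versioned API either keeps serving the old contract or fails the version negotiation explicitly.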

Conclusion:

Data access, its mode and delivery are governed by factors that together weigh in favor of one versus the other.

Monday, February 7, 2022

Microsoft Graph

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. This article continues to elaborate on the best practices in working with Microsoft Graph.

Microsoft Graph enables integration with the best of Microsoft 365, Windows 10, and Enterprise Mobility and Security services in Microsoft 365, using REST APIs and client libraries. It uses the concepts of users and groups to elaborate on these functionalities.

A user is an individual who uses Microsoft 365 cloud services. Throughout Microsoft Graph, it is the focal point whose identity is protected and whose access is managed. It is the data associated with this entity, and the opportunities to enrich the context, provide real-time information, and surface deep insights, that make Microsoft Graph so popular. The services supporting this entity are Azure AD and most productivity, collaboration, intelligence, and education services.

A group is the fundamental entity that lets users collaborate and integrate with other services which enable scenarios for task planning, teamwork, education and more.

The Graph Explorer helps to know the API and is the easiest way to start experimenting with the data available. Proper REST requests can be made and the responses are representative of those encountered programmatically which eliminates surprises and errors during implementation.

Authentication for Microsoft Graph is made easier using the Microsoft Authentication Library API, MSAL which acquires an access token.

The best practices for consent and authorization involve the following:

-          Using least privilege so that the APIs are called only with the permissions that are necessary.

-          Using the correct permission type based on the scenario, which is particularly important for delegated permissions; if the code runs without a signed-in user, using delegated permissions can lead to a vulnerability.

-          Configuring the application properly for end-user and administrator experiences.

-          Using multi-tenant applications, since customers may have application and consent controls configured in different states.

Responses can be large for the requests made to Microsoft Graph. Pagination can help browse the results efficiently.

Handling of expected errors is required from the application using the Microsoft Graph for robustness and user-convenience. Certain errors are retriable while others need to be translated to the user.

Adding members to existing enumerations can break applications. Evolvable enumerations provide a better alternative. They have a common sentinel member called unknownFutureValue, which demarcates known members that were defined in the enum initially from unknown members that are added subsequently or will be defined in the future. Members of evolvable enums can be referenced by their string values.
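Client-side handling of an evolvable enum can be sketched as follows. The enum and its members here are hypothetical; the point is that wire values the client does not recognize map to the unknownFutureValue sentinel instead of raising an error:

```python
# Sketch of client-side handling for an evolvable enum. The enum and its
# members other than unknownFutureValue are hypothetical; values the
# client does not recognize fall back to the sentinel rather than failing.
from enum import Enum

class MeetingProvider(Enum):
    TEAMS = "teamsForBusiness"
    SKYPE = "skypeForBusiness"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"

def parse_provider(value: str) -> MeetingProvider:
    """Map a wire value to the enum, tolerating members added later."""
    try:
        return MeetingProvider(value)
    except ValueError:
        return MeetingProvider.UNKNOWN_FUTURE_VALUE
```

With this fallback, the service can add new enum members without breaking deployed clients, which is exactly the failure mode evolvable enumerations are designed to avoid.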

These are some of the considerations toward best practices in working with Microsoft Graph.

Sunday, February 6, 2022

 

Microsoft Graph

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. This article continues to elaborate on the services and features in Microsoft Graph.

Microsoft Graph enables integration with the best of Microsoft 365, Windows 10, and Enterprise Mobility and Security services in Microsoft 365, using REST APIs and client libraries. We use the concepts of users and groups to elaborate on these functionalities.

A user is an individual who uses Microsoft 365 cloud services. Throughout Microsoft Graph, it is the focal point whose identity is protected and whose access is managed. It is the data associated with this entity, and the opportunities to enrich the context, provide real-time information, and surface deep insights, that make Microsoft Graph so popular. The services supporting this entity are Azure AD and most productivity, collaboration, intelligence, and education services.

A group is the fundamental entity that lets users collaborate and integrate with other services which enable scenarios for task planning, teamwork, education and more.

The services included in the Microsoft 365 portfolio are:

1.       Identity and Access Management service supported by Azure AD which creates and manages directory resources.

2.       Calendar which is managed by Outlook that is used to setup appointments and meetings.

3.       Files which are used to manage and share on OneDrive and SharePoint

4.       Mail supported by Outlook, which helps users communicate and organize messages

5.       Notes supported by OneNote which lets users plan and organize ideas and information

6.       Contacts manager on the web, mobile, and desktop devices.

7.       Workbooks and charts which let users use Excel spreadsheets to do complex calculations, and to track, analyze, and generate reports

8.       To-Do tasks which let users manage their personal tasks across work and life. It is integrated with Outlook, Teams, Planner and Cortana.

9.       Cloud communications supported by Teams and Skype which lets apps and services interact with users

10.   Sites and lists which are web-based platform for users and Microsoft 365 groups to share, organize, manage, and discover content

11.   Tasks and Plans which enable users in Microsoft 365 groups to create plans and track progress

12.   Teamwork supported by Microsoft Teams which enables hub-and-chat based workspace for teams to share files, notes, calendar, and plans.

13.   People supported by Azure AD, Outlook, SharePoint, and more, to get information about people ordered by their relevance to a user

14.   Profile and profile card, which provide a lightweight mechanism for storing and retrieving information within a tenant

15.   Document insights, which use advanced analytics and machine learning techniques to surface insights from documents

16.   Analytics uses advanced analytics and machine learning techniques to provide insights.

17.   Cloud printing, which uses the Universal Print API for a simple, rich, and secure print service

18.   Corporate management of devices by enrolling, configuring, and managing them

19.   Cloud PC which works with Windows 365 cloud PCs using the Microsoft Graph API

20.   Device Updates which provide control over the approval, scheduling, monitoring, and safeguarding of content delivered from Windows update.

21.   Multi-tenant management, which lets managed service providers remotely manage multiple customer tenants

22.   Service Health and communications which provides access to the health status and message center posts about Microsoft Cloud services.

23.   Usage reports, which cover usage of Microsoft Teams, OneDrive, Outlook, SharePoint, Skype for Business, and Yammer

24.   Security integration which provides a unified gateway to security insights and actions across Microsoft and ecosystem partners.

25.   Cross-device experiences which enable application experiences that go beyond a single device

26.   User notifications which enable cross-platform notification experiences

27.   Reports which provide activity and usage information for a supporting service

28.   Education which provides information relevant for education scenarios, including schools, classes, students, teachers, and assignment information

29.   Customer booking which targets organizations to enable their users and customers to book services directly

30.   Financials which enable management of financial data, automation and securing of the supply chain, sales management and improved customer service.

Saturday, February 5, 2022

 

Microsoft Graph

This is a continuation of a series of articles on Azure services from an operational engineering perspective, with the most recent introduction to this topic linked here. This article continues with the developer experience before talking about operations.

Microsoft Graph (aka MSGraph) provides rich context for applications, such as which users are out of office, what documents they have been working on, and so on. It provides deep insights generated from usage patterns, such as trending documents, best team meeting times, or who people typically work with. It provides real-time updates that help respond to changes in the data, such as rescheduling a meeting based on responses, notifying others when a file is modified, or continuing a process after it has been approved. It helps build solutions that target enterprise users in Azure and Microsoft 365, consumers in Office online, or both.

The Graph Growth and telemetry landing page is a central hub for telemetry on Microsoft Graph. It provides a centralized dashboard to view the data.

Authentication and authorization are simplified with Microsoft Graph. There is a single authentication protocol and a single resource. Clients need to manage just one token, one sign-in, and a single set of permission scopes to consent to. Each data access to the dedicated stores behind Microsoft Graph uses a separate identity; workloads don't share identity. Once authentication and authorization complete at the entry point of the Microsoft Graph APIs, the resolved user, application, and permission scopes are passed along to the data stores accessed. The on-behalf-of flow in OAuth 2.0 is leveraged for this purpose. The access token is not passed directly to the stores behind Microsoft Graph, to avoid token replay attacks. The Protected Forwarded Token pattern applies to Azure Active Directory and MSA accounts as well.
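The shape of an on-behalf-of token request can be sketched as below. The grant_type and requested_token_use values are the ones the Microsoft identity platform defines for this flow; the client id, secret, incoming token, and scope are placeholders, and the request only gets built here, not sent:

```python
# Sketch of the on-behalf-of token exchange described above: the middle
# tier exchanges the caller's access token (as an assertion) for a token
# to the downstream store, instead of forwarding the original token.
# Client and scope values are placeholders.
from urllib.parse import urlencode

def build_obo_request(client_id: str, client_secret: str,
                      incoming_token: str, downstream_scope: str) -> str:
    return urlencode({
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "client_id": client_id,
        "client_secret": client_secret,
        "assertion": incoming_token,        # the caller's access token
        "scope": downstream_scope,
        "requested_token_use": "on_behalf_of",
    })

obo_body = build_obo_request("api-app-id", "secret", "caller-token",
                             "https://graph.microsoft.com/.default")
```

Because the downstream store receives a freshly minted token rather than the original one, a captured token cannot simply be replayed against the backing stores.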

The MSGraph APIs can be improved for better customer satisfaction with the following strategies:

1.       Separate out the high data volume services as premium services

2.       Separate out the high data volume with added value as high-speed lanes

3.       Use the marketplace for ISV solutions.

4.       The monetization occurs with Azure Consumed Revenue where the revenue is associated with usage of Azure metered and suite services.

5.       The APIs must be written such that they satisfy the customer's need for automation, integration, and reporting

6.       It should help ISV partners integrate for success

7.       It should support backend scale-up and scale-out for the needed backend reliability.

8.       It should provide modern authentication aligned with Azure AD.

9.       It should support testability for its functionalities.

10.   Admin and end-user personas can be separated

11.   The API usage must be seamless across backend data access.