Note
The Livy API for Fabric Data Engineering is in preview.
Applies to: ✅ Data Engineering and Data Science in Microsoft Fabric
Get started with the Livy API for Fabric Data Engineering by creating a Lakehouse, authenticating with a Microsoft Entra app token, and submitting either batch or session jobs from a remote client to Fabric Spark compute. You'll discover the Livy API endpoint, submit jobs, and monitor the results.
Fabric Premium or Trial capacity with a Lakehouse
Enable the Tenant Admin Setting for Livy API (preview)
A remote client such as Visual Studio Code with Jupyter notebook support, PySpark, and Microsoft Authentication Library (MSAL) for Python
A Microsoft Entra app token is required to access the Fabric REST API. See Register an application with the Microsoft identity platform.
You can use various programming languages or GUI clients to interact with REST API endpoints. In this article, we use Visual Studio Code, configured with Jupyter Notebooks, PySpark, and the Microsoft Authentication Library (MSAL) for Python.
To work with Fabric APIs including the Livy API, you first need to create a Microsoft Entra application and obtain a token. Your application needs to be registered and configured adequately to perform API calls against Fabric. For more information, see Register an application with the Microsoft identity platform.
Several Microsoft Entra scope permissions are required to execute Livy jobs. This example uses simple Spark code plus storage access and SQL:
Note
During the public preview, a few additional granular scopes will be added. If you use this approach, your Livy app will break when those additional scopes are added. Check this list, as it will be updated with the additional scopes.
Some customers want more granular permissions than the prior list. You can remove Item.ReadWrite.All and replace it with these more granular scope permissions:
When you've registered your application, you need both the Application (client) ID and the Directory (tenant) ID.
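As a minimal sketch, acquiring a token with MSAL for Python might look like the following. The function name is illustrative, and the tenant ID, client ID, and scopes are placeholders you would replace with the values from your own app registration:

```python
def acquire_fabric_token(tenant_id: str, client_id: str, scopes: list) -> str:
    """Interactively sign in and return a Microsoft Entra access token
    for calling the Fabric REST API. Placeholder IDs must be replaced
    with values from your own app registration."""
    from msal import PublicClientApplication  # pip install msal

    app = PublicClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
    )
    # Opens a browser window for interactive sign-in.
    result = app.acquire_token_interactive(scopes=scopes)
    return result["access_token"]
```

Pass the returned token as a Bearer token in the Authorization header of your Livy API requests.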
The authenticated user calling the Livy API needs to be a member, with a Contributor role, of the workspace where both the API and the data source items are located. For more information, see Give users access to workspaces.
A Lakehouse artifact is required to access the Livy endpoint. Once the Lakehouse is created, the Livy API endpoint can be located within the settings panel.
The Livy API endpoint follows this pattern:
https://api.fabric.microsoft.com/v1/workspaces/<ws_id>/lakehouses/<lakehouse_id>/livyapi/versions/2023-12-01/
The URL is appended with either sessions or batches, depending on what you choose.
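As a sketch, the endpoint URL described above can be assembled from the workspace and Lakehouse IDs; the helper name and placeholder IDs here are illustrative:

```python
def livy_endpoint(workspace_id: str, lakehouse_id: str, kind: str) -> str:
    """Build the Livy API URL for a Lakehouse.
    kind is either 'sessions' or 'batches'."""
    if kind not in ("sessions", "batches"):
        raise ValueError("kind must be 'sessions' or 'batches'")
    return (
        "https://api.fabric.microsoft.com/v1/"
        f"workspaces/{workspace_id}/lakehouses/{lakehouse_id}/"
        f"livyapi/versions/2023-12-01/{kind}"
    )

# Example with placeholder IDs:
url = livy_endpoint("<ws_id>", "<lakehouse_id>", "batches")
```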
For each Fabric workspace, a default starter pool is provisioned; by default, all Spark code executes on this starter pool. You can use Fabric Environments to customize the Livy API Spark jobs.
The full swagger files for the Livy API are available here.
Now that setup of the Livy API is complete, you can choose to submit either batch or session jobs.
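A batch submission is an authenticated POST to the batches endpoint. The sketch below uses only the standard library; the payload fields follow the open-source Livy batch shape ("name", "file"), and the abfss path is a hypothetical placeholder, not a value from this article:

```python
import json
import urllib.request

def submit_batch_job(endpoint: str, token: str, payload: dict) -> dict:
    """POST a batch job definition to a Livy batches endpoint and
    return the parsed JSON response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Illustrative payload; 'file' points at a main definition file
# stored in your Lakehouse (placeholder path).
payload = {
    "name": "livybatchdemo",
    "file": "abfss://<container>@<account>.dfs.core.windows.net/<path>/demo.py",
}
```

Session jobs follow the same pattern against the sessions endpoint, with statements submitted to the created session.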
You can use the Monitoring Hub to see your prior Livy API submissions and debug any submission errors.
Training
Module
Use Apache Spark in Microsoft Fabric - Training
Apache Spark is a core technology for large-scale data analytics. Microsoft Fabric provides support for Spark clusters, enabling you to analyze and process data at scale.
Certification
Microsoft Certified: Fabric Data Engineer Associate - Certifications
As a Fabric data engineer, you should have subject matter expertise in data loading patterns, data architectures, and orchestration processes.
Documentation
Submit Spark session jobs using the Livy API - Microsoft Fabric
Learn how to submit Spark session jobs using the Livy API.
Livy API overview - Microsoft Fabric
Learn about the Microsoft Fabric Livy API for submitting jobs to Spark
Submit Spark batch jobs using the Livy API - Microsoft Fabric
Learn how to submit Spark batch jobs using the Livy API.