diff --git a/.openpublishing.redirection.json b/.openpublishing.redirection.json index 194f317b21..4773f5921a 100644 --- a/.openpublishing.redirection.json +++ b/.openpublishing.redirection.json @@ -4431,8 +4431,8 @@ }, { "source_path": "powerbi-docs/webinars.md", - "redirect_url": "/power-bi/fundamentals/webinars", - "redirect_document_id": true + "redirect_url": "https://community.fabric.microsoft.com/t5/Upcoming-Events/bd-p/Community", + "redirect_document_id": false }, { "source_path": "powerbi-docs/developer/embedded/power-bi-permissions.md", @@ -5511,6 +5511,11 @@ "redirect_url": "/power-bi/consumer/mobile/mobile-apps-for-mobile-devices", "redirect_document_id": false }, + { + "source_path": "powerbi-docs/fundamentals/webinars.md", + "redirect_url": "https://community.fabric.microsoft.com/t5/Upcoming-Events/bd-p/Community", + "redirect_document_id": false + }, { "source_path": "powerbi-docs/developer/embedded/jupyter-quick-report.md", "redirect_url": "/power-bi/create-report/mobile-apps-for-mobile-devices", diff --git a/powerbi-docs/consumer/end-user-faq.yml b/powerbi-docs/consumer/end-user-faq.yml index cb5c0af76a..7057d8527c 100644 --- a/powerbi-docs/consumer/end-user-faq.yml +++ b/powerbi-docs/consumer/end-user-faq.yml @@ -86,7 +86,6 @@ sections: The following resources are available to help get you started: * [Power BI Blog](https://powerbi.microsoft.com/blog/) - * [Webinars](../fundamentals/webinars.md) * Getting started videos on our [YouTube Channel](https://www.youtube.com/user/mspowerbi) * [Get started with Power BI as a *business user*](index.yml) article * [Join our community](https://community.powerbi.com/) and ask questions diff --git a/powerbi-docs/create-reports/service-metrics-submetrics.md b/powerbi-docs/create-reports/service-metrics-submetrics.md index 4f76ca0ebc..0aac7cceb2 100644 --- a/powerbi-docs/create-reports/service-metrics-submetrics.md +++ b/powerbi-docs/create-reports/service-metrics-submetrics.md @@ -1,6 +1,6 @@ --- -title: 
Create submetrics in the Power BI service -description: Submetrics make scorecards more readable by collecting related metrics together under a single parent metric. +title: Create subgoals in the Power BI service +description: Subgoals make scorecards more readable by collecting related goals together under a single parent goal. author: maggiesMSFT ms.author: maggies ms.reviewer: '' @@ -11,94 +11,91 @@ ms.topic: how-to ms.date: 10/10/2022 --- -# Create submetrics in the Power BI service +# Create subgoals in the Power BI service [!INCLUDE [applies-no-desktop-yes-service](../includes/applies-no-desktop-yes-service.md)] -Metrics in Power BI let customers curate their metrics and track them against key business objectives, in a single pane. *Submetrics* make scorecards more readable by collecting related metrics together under a single parent metric. You can expand and collapse the parent metric. Submetrics can either be standalone values, unrelated to the parent value, or they can roll up to the parent value. Read about [*rollups*](#get-started-creating-rollups) later in this article. +Goals in Power BI let customers curate their goals and track them against key business objectives, in a single pane. *Subgoals* make scorecards more readable by collecting related goals together under a single parent goal. You can expand and collapse the parent goal. Subgoals can either be standalone values, unrelated to the parent value, or they can roll up to the parent value. Read about [*rollups*](#get-started-creating-rollups) later in this article. -## Create a submetric +## Create a subgoal -You can define one or more submetrics for a metric. Like their parent metrics, submetrics can be either connected or set manually. There are two entry points to create a submetric. +You can define one or more subgoals for a goal. Like their parent goals, subgoals can be either connected or set manually. There are two entry points to create a subgoal. 1. Open a scorecard and select **Edit**. 
:::image type="content" source="media/service-goals-create/power-bi-goals-edit-scorecard.png" alt-text="Screenshot of Select the Edit pencil to edit the scorecard."::: -1. Select the metric you want to create a submetric for, and select **Add submetric** on top of the scorecard. +1. Select the goal you want to create a subgoal for, and select **Add subgoal** on top of the scorecard. - :::image type="content" source="media/service-goals-create/power-bi-goals-add-subgoal-button.png" alt-text="Screenshot of Select the Add Submetric button."::: + :::image type="content" source="media/service-goals-create/power-bi-goals-add-subgoal-button.png" alt-text="Screenshot of Select the Add Subgoal button."::: - Or hover over the metric you want to create a submetric for, select **More options (...)** > **New submetric**. + Or hover over the goal you want to create a subgoal for, select **More options (...)** > **New subgoal**. - :::image type="content" source="media/service-goals-create/power-bi-goals-add-subgoal-more-options.png" alt-text="Screenshot of Select more options, then add submetric."::: + :::image type="content" source="media/service-goals-create/power-bi-goals-add-subgoal-more-options.png" alt-text="Screenshot of Select more options, then add subgoal."::: -1. Repeat the first step as needed to create more submetrics. +1. Repeat the first step as needed to create more subgoals. - Make sure you've selected the metric so you can create submetrics. + Make sure you've selected the goal so you can create subgoals. -1. See these articles for details on creating manual or connected metrics: +1. 
See these articles for details on creating manual or connected goals: - - [Create a manual metric](service-goals-create.md#step-2-create-a-manual-metric) - - [Create connected metrics](service-goals-create-connected.md) + - [Create a manual goal](service-goals-create.md#step-2-create-a-manual-metric) + - [Create connected goals](service-goals-create-connected.md) -## Create submetrics that roll up to the metric +## Create subgoals that roll up to the goal -You can also create *rollup* submetrics, whose values automatically aggregate up to a parent metric. These aggregations recalculate anytime the submetric values change, and capture the history. Rollups are a great way to keep a parent metric up to date without having it backed by a report connection. Rollups work when submetrics are manual or connected, so you can always roll up whatever the current value is, regardless of where it's coming from. +You can also create *rollup* subgoals, whose values automatically aggregate up to a parent goal. These aggregations recalculate anytime the subgoal values change, and capture the history. Rollups are a great way to keep a parent goal up to date without having it backed by a report connection. Rollups work when subgoals are manual or connected, so you can always roll up whatever the current value is, regardless of where it's coming from. ### Get started creating rollups First, make sure you're in edit mode. -1. Select the **pencil icon** to enter inline editing of a metric. +1. Select the **pencil icon** to enter inline editing of a goal. :::image type="content" source="media/service-metrics-submetrics/roll-up-edit-metric.png" alt-text="Screenshot showing a scorecard in edit mode."::: -1. Next to the value and target, you see an indication of what type of metric it is: +1. 
Next to the value and target, you see an indication of what type of goal it is: - - **Manual metric** - - **Use submetrics** + - **Manual goal** + - **Use subgoals** - **Connected to data** - :::image type="content" source="media/service-metrics-submetrics/roll-up-open-options.png" alt-text="Screenshot showing the types of metrics."::: + :::image type="content" source="media/service-metrics-submetrics/roll-up-open-options.png" alt-text="Screenshot showing the types of goals."::: -1. When you choose rollup submetrics, you can choose the aggregation type: +1. When you choose rollup subgoals, you can choose the aggregation type: - - **SUM**: a sum of the submetric values - - **AVERAGE**: an average of the submetric values - - **MIN**: reflects the lowest submetric value - - **MAX**: reflects the largest submetric value - - :::image type="content" source="media/service-metrics-submetrics/roll-up-open-options-two.png" alt-text="Screenshot showing the types of rollup aggregations."::: + - **SUM**: a sum of the subgoal values + - **AVERAGE**: an average of the subgoal values + - **MIN**: reflects the lowest subgoal value + - **MAX**: reflects the largest subgoal value The option you choose automatically calculates the appropriate value and shows it in either the **Current value** or **Target value** area. You can set up different rollup aggregation types on the current and target value, allowing for maximum flexibility. - :::image type="content" source="media/service-metrics-submetrics/roll-up-sum-saved.png" alt-text="Screenshot showing a metric calculating the sum of its submetrics."::: ### Preserve historical values -You may want to preserve historical values if you're switching metric type from connected to rollup type. +You may want to preserve historical values if you're switching the goal type from connected to rollup. -1. 
In **Edit** mode for the scorecard, select **More options (...)** > **See details** next to the goal. :::image type="content" source="media/service-metrics-submetrics/metrics-more-options-see-details.png" alt-text="Screenshot showing selecting More options, then See details."::: -2. On the **Connections** tab, ensure that you turn off the toggle labeled **Clear metric values when connecting to a new report**. +2. On the **Connections** tab, ensure that you turn off the toggle labeled **Clear goal values when connecting to a new report**. :::image type="content" source="media/service-metrics-submetrics/roll-up-settings.png" alt-text="Screenshot showing scorecard data connection settings toggle."::: ### Rollups considerations - Rollups take place from the day they're set up moving forward. They don't remove, recalculate, or override historical data with retroactive rollup values, in almost all scenarios. -- One scenario where connected historical data is deleted is if the person creating the rollup is *not* the metric connection owner. To preserve the history, ensure the metric connection owner is the same person as the rollup creator. -- If a scorecard author changes a metric from a rollup to a connected metric and they bring in the history via report connection, the report history overrides historical rollup values. +- One scenario where connected historical data is deleted is if the person creating the rollup is *not* the goal connection owner. To preserve the history, ensure the goal connection owner is the same person as the rollup creator. +- If a scorecard author changes a goal from a rollup to a connected goal and they bring in the history via report connection, the report history overrides historical rollup values. - New rollup types calculate with the new aggregation moving forward, but don't replace the history. -- Rollups always show the same aggregation value of all submetrics, regardless of metric level permissions. 
This means if person A only has view access to 3 out of 5 submetrics, they still see the same parent metric value as someone who has access to all submetrics. +- Rollups always show the same aggregation value of all subgoals, regardless of goal level permissions. This means if person A only has view access to 3 out of 5 subgoals, they still see the same parent goal value as someone who has access to all subgoals. ## Related content -- [Get started with metrics in Power BI](service-goals-introduction.md) -- [Create scorecards and manual metrics in Power BI](service-goals-create.md) -- [Create connected metrics in Power BI](service-goals-create-connected.md) +- [Get started with goals in Power BI](service-goals-introduction.md) +- [Create scorecards and manual goals in Power BI](service-goals-create.md) +- [Create connected goals in Power BI](service-goals-create-connected.md) More questions? [Try the Power BI Community](https://community.powerbi.com/). diff --git a/powerbi-docs/fundamentals/TOC.yml b/powerbi-docs/fundamentals/TOC.yml index 275fd60f50..760bd0c9d3 100644 --- a/powerbi-docs/fundamentals/TOC.yml +++ b/powerbi-docs/fundamentals/TOC.yml @@ -63,9 +63,7 @@ - name: Resources items: - name: "Videos" - href: videos.md - - name: "Webinars" - href: webinars.md + href: videos.md - name: "Power BI updates archive" href: desktop-latest-update-archive.md - name: "Power BI Desktop change log" diff --git a/powerbi-docs/fundamentals/index.yml b/powerbi-docs/fundamentals/index.yml index 7a2c9a040e..f822fc2c63 100644 --- a/powerbi-docs/fundamentals/index.yml +++ b/powerbi-docs/fundamentals/index.yml @@ -72,5 +72,4 @@ landingContent: links: - text: "Power BI videos" url: videos.md - - text: "Webinars" - url: webinars.md + diff --git a/powerbi-docs/fundamentals/webinars.md b/powerbi-docs/fundamentals/webinars.md deleted file mode 100644 index 0cf7d85a8a..0000000000 --- a/powerbi-docs/fundamentals/webinars.md +++ /dev/null @@ -1,374 +0,0 @@ ---- -title: Power BI webinars 
-description: Register for upcoming Power BI live webinars or watch recorded sessions on-demand for features such as Getting started, Partners solutions, and much more. -author: kfollis -ms.author: kfollis -ms.reviewer: '' -ms.service: powerbi -ms.subservice: pbi-fundamentals -ms.topic: conceptual -ms.date: 08/15/2024 ---- -# Power BI :::no-loc text="webinars"::: -[//]: # "Tatevik Tatero (tatevik.teroyan@simple-concepts.com) and Anna Khachatryan (v-annakh) are vendors who work with Chauncy Freels and help to maintain this list of webinars" - -Register for our upcoming live webinars or watch our recorded sessions on-demand. - -## Upcoming :::no-loc text="webinars"::: - -[Upcoming events from the Power BI community](https://community.powerbi.com/t5/Upcoming-Events/bd-p/Community) - -## Featured :::no-loc text="webinars"::: - -Get started with these popular on-demand webinars: - -Starter guide for Power BI consumers -by Will Thompson -[Watch now](https://info.microsoft.com/ww-ondemand-Starter-Guide-for-Power-BI-Consumers.html) - -Drive productivity and effective decision-making with Excel and Power BI -by Ikechukwu Edeagu -[Watch now](https://info.microsoft.com/ww-ondemand-Drive-Productivity-And-Effective-Decision-Making-With-Excel-And-Power-BI.html) - -Unleash your Dynamics 365 data with Azure Synapse Analytics and Power BI -
by Cillian Mitchell and Scott Sewell -[Watch now](https://info.microsoft.com/ww-ondemand-unleash-your-data-with-azure-synapse-analytics-and-power-bi.html) - -Quickly transform your organization with a data-driven culture through Power BI -by Lukasz Pawlowski -[Watch now](https://info.microsoft.com/ww-ondemand-quickly-transform-your-organization-with-a-data-driven-culture-through-power-bi.html) - -Power BI: Security and governance for your organization -by Anton Fritz and Rick Xu -[Watch now](https://info.microsoft.com/ww-ondemand-security-and-governance-for-your-organization.html) - -Power BI Apps: Distribute content to your organization -by Anshul Rampal -[Watch now](https://info.microsoft.com/ww-landing-powerbidistributecontent.html) - -Power BI 101: Create reports quickly and effectively -by Amanda Rivera -[Watch now](https://info.microsoft.com/ww-landing-bi101createreportseffectively.html) - -Power BI how-to: Analyze real-time data with streaming dataflows -by Mohammad Ali and Jeroen ter Heerdt -[Watch now](https://info.microsoft.com/ww-ondemand-power-bi-how-to-analyze-real-time-data-with-streaming-dataflows.html) - -:::no-loc text="Webinars"::: series: Mastering data modeling with Power BI -Episode 1 - Data modeling 101: Increasing the impact of Power BI -by Jeroen ter Heerdt, Microsoft and Marc Lelijveld, Macaw Netherlands -[Watch now](https://info.microsoft.com/ww-Landing-Mastering-Data-Modeling-with-Power-BI.html) - -:::no-loc text="Webinars"::: series: Mastering data modeling with Power BI -Episode 2 - Learn advanced data modeling with Power BI -by Jeroen ter Heerdt, Microsoft and Marc Lelijveld, Macaw Netherlands -[Watch now](https://info.microsoft.com/ww-Landing-Mastering-Data-Modeling-with-PowerBI.html?LCID=EN-US) - -:::no-loc text="Webinars"::: series: Mastering data modeling with Power BI -Episode 3 - Data modeling for experts with Power BI -by Jeroen ter Heerdt, Microsoft and Marc Lelijveld, Macaw Netherlands -[Watch 
now](https://info.microsoft.com/ww-landing-MasteringDataModelingWithPowerBI.html?LCID=EN-US) - -:::no-loc text="Webinars"::: series: Data modeling with Power BI -Episode 4 - Calculation groups and composite models -
By Jeroen ter Heerdt, Microsoft and Marc Lelijveld, Macaw Netherlands -[Watch now](https://info.microsoft.com/ww-ondemand-calculation-groups-and-composite-models.html) - -Behind the scenes with the Power BI Team -by Jeroen ter Heerdt and Miguel Martinez -[Watch now](https://info.microsoft.com/ww-landing-behindthescenespowerbiteam.html) - -Enable greater data agility with Azure Purview and Power BI -by Chandru Sugunan and Gaurav Malhotra -[Watch now](https://info.microsoft.com/ww-Ondemand-Enable-Greater-Data-Agility-with-Azure-Purview-and-Power-BI.html) - -Best Practices for deploying Power BI Embedded -
by Alon Baram -
[Watch now](https://info.microsoft.com/ww-Ondemand-Best-Practices-for-Deploying-Power-BI-Embedded.html) - -Harness Power BI for self-service data prep with dataflows -by Charles Webb -[Watch now](https://info.microsoft.com/ww-Landing-harness-PowerBI-SelfService-Prep-Dataflows.html?LCID=EN-US) - -Understanding Power BI Premium Gen 2 -by David Magar -[Watch now]( https://info.microsoft.com/ww-Landing-Understanding-Power-BI-Premium-Gen-2.html?LCID=EN-US) - -Demystifying Power BI semantic models -by Peter Myers, Bitwise Solutions and Chris Webb, Microsoft -[Watch now](https://info.microsoft.com/ww-landing-demystifying-PowerBI-datasets.html?LCID=EN-US) - -Securing your data in motion and at rest with Power BI -by Anton Fritz and Yitzhak Kesselman -[Watch now](https://info.microsoft.com/ww-Landing-SecuringyourdatainmotionandatrestwithPowerBI.html?LCID=EN-US) - -Quickstart guide to navigating Power BI -by Miguel Martinez -[Watch now](https://info.microsoft.com/ww-Landing-Quickstart-Guide-to-Navigating-Power-BI.html) - -Data-driven insights for real-time decisions and stronger customer connections -
by Shruti Shukla and Chandra Stevens -[Watch now](https://info.microsoft.com/ww-Landing-DataDrivenInsights.html) - -Get up and running quickly with Power BI -by Miguel Martinez -[Watch now](https://info.microsoft.com/ww-landing-get-up-and-running-quickly-with-power-bi.html) - -Drive a remote data culture with Power BI and Microsoft Teams -by Lukasz Pawlowski -[Watch now](https://info.microsoft.com/ww-Landing-RemoteDataCulturePowerBIandMicrosoftTeams.html) - -Better together: Five benefits Excel users will get from using Power BI -by Miguel Martinez and Carlos Otero -[Watch now](https://info.microsoft.com/ww-Landing-FiveBenefitsExcelUsersWillGetFromUsingPowerBI-Webinar.html) - -Monitor your data in real-time with Microsoft Power BI -by Miguel Martinez, Microsoft and Peter Myers, Bitwise Solutions -[Watch now](https://info.microsoft.com/ww-landing-Monitor-Your-Data-in-Real-time-with-Microsoft-Power-BI.html) - -Accelerate Power BI on Azure Data Lake Storage with Dremio -by Chris Webb, Microsoft and Tomer Shiran, Dremio -
[Watch now]( https://info.microsoft.com/ww-landing-Accelerate-Power-BI-on-Azure-Data-Lake-Storage-with-Dremio.html) - -Build scalable BI Solutions using Power BI and Snowflake -by Chris Webb, Microsoft, Craig Collier, Snowflake, and Chris Holliday, Visual BI -[Watch now](https://info.microsoft.com/ww-landing-build-scalable-BI-solutions-using-power-BI-and-snowflake.html) - -Boost user satisfaction with best practices for managing BI content -by Nimrod Shalit -[Watch now](https://info.microsoft.com/ww-landing-Boost-User-Satisfaction-with-Best-Practices-for-Managing-BI-Content.html) - -From insight to action: Driving a data culture with Power BI -by Arun Ulagaratchagan and Amir Netz -[Watch now]( https://info.microsoft.com/ww-landing-From-Insight-to-Action-Driving-a-Data-Culture-with-Power-BI.html) - -Enable better analytics with Power BI Embedded -by Alon Baram -[Watch now](https://info.microsoft.com/ww-landing-Enable-Better-Analytics-with-Power-BI-Embedded.html) - -How the Miami HEAT used Power BI to drive business decisions -by Edson Crevecoeur, Miami HEAT, Frank Mesa, Microsoft, and Xinrou Tan, Microsoft -[Watch now]( https://info.microsoft.com/ww-Landing-How-the-Miami-HEAT-Used-Power-BI-to-Drive-Business-Decisions.html) - -Simplify big data prep and analysis with Power BI -by Priya Sathy -[Watch now](https://info.microsoft.com/ww-landing-simplify-big-data-prep-and-analysis-with-power-BI.html) - -Improve decision-making with Power BI -by Kim Manis and Lukasz Pawlowski -[Watch now](https://info.microsoft.com/ww-Landing-Improve-Decision-Making-with-Power-BI.html) - -## On-demand webinars - -Watch recorded sessions at any time. 
- -Explore the total economic impact of Microsoft Power BI -by Megan Tomlin, Microsoft and Jonathan Lipsitz, Forrester Consulting -[Watch now]( https://info.microsoft.com/ww-landing-Explore-the-Total-Economic-Impact-Of-Microsoft-Power-BI.html) - -Analytics in Azure virtual event: Accelerate time to insight with Azure Synapse Analytics -by Gayle Sheppard and John Macintyre -[Register and watch now](https://info.microsoft.com/Analytics-in-Azure-virtual-event-Accelerate-Time-to-Insight-with-Azure-Synapse-Analytics-On-Demand-Registration.html) - -How Microsoft is changing BI data protection -by Anton Fritz and Adi Regev -[Register and watch now](https://info.microsoft.com/ww-landing-How-Microsoft-Is-Changing-BI-Data-Protection.html) - -How to become an insights-driven business -by Amir Netz, Microsoft and Boris Evelson, Forrester -[Register and watch now](https://info.microsoft.com/ww-landing-how-to-become-an-insights-driven-business.html) - -Three ways AI is changing BI -by Justyna Lucznik -[Register and watch now](https://info.microsoft.com/ww-landing-Three-Ways-AI-Is-Changing-BI.html) - -Power BI and the future of Modern and Enterprise BI -by Arun Ulagaratchagan and Amir Netz -[Register and watch now](https://info.microsoft.com/ww-landing-The-Future-of-Modern-and-Enterprise-BI-video.html) - -Nine trends shaping the future of big data analytics -by Vijay Gopalakrishnan -[Register and watch now](https://info.microsoft.com/ww-landing-Nine-Trends-Shaping-the-Future-of-Big-Data-Analytics.html) - -Getting started with Power BI -by Miguel Martinez -[Register and watch now](https://info.microsoft.com/getting-started-with-power-bi-ondemand.html?Is=Website) - -Get started with the Power BI mobile app -
by Maya Shenhav -[Register and watch now](https://info.microsoft.com/ww-Landing-Getting-Started-with-the-Power-BI-Mobile-App-Video.html) - -Learn to navigate your way through a Power BI dashboard in 20 minutes -by Miguel Martinez -[Register and watch now](https://info.microsoft.com/powerbi-dashboard-in-20-min.html?Is=Website) - -Strengthen your data modeling skills with Power BI -by Kasper de Jonge -[Register and watch now](https://info.microsoft.com/Strengthen-Your-Data-Modeling-Skills-with-PowerBI-Registration.html?Is=Website) - -Using Power BI with Dynamics 365 Finance and Operations -by Kevin Horlock -[Register and watch now](https://info.microsoft.com/ww-landing-Using-Power-BI-with-Dynamics-365-Finance-and-Operations.html) - -Microsoft runs on Power BI: Financial planning & analysis made easy -by Cory Hrncirik and Miguel Martinez -[Register and watch now](https://info.microsoft.com/Microsoft-Runs-on-Power-BI-OnDemandRegistration.html?Is=Website) - -Microsoft runs on Power BI: Using Power BI in Modern Treasury -by Pankaj Gudimella and Guru Kirthigavasan -[Register and watch now](https://info.microsoft.com/Microsoft-Runs-on-Power-BI-Using-Power-BI-in-Modern-Treasury-Registration.html) - -Supercharge your applications using the Power BI JavaScript API -by Nimrod Shalit -[Register and watch now](https://info.microsoft.com/ww-landing-PBI-JavaScript-API-video.html) - -Power BI, Excel, and Microsoft 365: Optimize your Enterprise Data -by Olaf Hubel and Miguel Martinez -[Register and watch now](https://info.microsoft.com/Unlocking-the-Value-of-your-Enterprise-Data-OnDemandRegistration.html?Is=Website) - -Simply compelling—Tips for better visualization design -by Miranda Li -[Register and watch now](https://info.microsoft.com/ww-landing-powerbi-tips-for-better-visualization-design.html?Is=Website) - ->[Browse the library of Power BI on-demand webinars from our community 
experts.](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/bd-p/VideoTipsTricks?filter=webinars&featured=yes&Is=Website) - -## Getting started - -Automate day-to-day business processes with Power BI, PowerApps, and Power Automate -by Wim Coorevits and Enrique Plaza Garcia -[Register and watch now](https://info.microsoft.com/Automate-Day-to-Day-Business-Processes-with-Power-BI-Power-Apps-and-Microsoft-Flow-OnDemandRegistration.html) - -Best practices for managing Power BI embedded analytics for multi-tenant deployments -by Nimrod Shalit -[Register and watch now](https://info.microsoft.com/ww-landing-PBI-webinar-Best-Practices-for-Managing-Power-BI-Embedded-video.html) - -Power BI: Analytics done right -by Gohul Shanmugalingam -[Register and watch now](https://info.microsoft.com/CA-PowerBI-WBNR-FY19-11Nov-08-PowerBIAnalyticsDoneRight-MCW0008690_01Registration-ForminBody.html?Is=Website) - -Make your Power BI data visual: Core chart types and how to use them -by Miranda Li -[Register and watch now](https://info.microsoft.com/Make-your-Power-BI-Data-Visual-Registration.html?Is=Website) - -How to design visually stunning Power BI reports -by Charles Sterling -[Watch now](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/5-3-17-Webinar-How-to-Design-Visually-Stunning-Power-BI-Reports/m-p/168204?Is=Website) - -The total economic impact of Power Automate and PowerApps -by Jonathan Lipsitz, Forrester and Enrique Plaza Garcia, Microsoft -[Register and watch now](https://info.microsoft.com/The-TEI-of-PowerApps-and-Microsoft-Flow-OnDemandRegistration.html?Is=Website) - -Better together: 5 benefits Excel users will get from using Power BI -by Carlos Otero and Miguel Martinez -[Register and watch now](https://info.microsoft.com/excel-powerbi-better-together.html?Is=Website) - -Learn about Power BI Embedded in 20 minutes -by Megan Assarrane and Colin Murphy -[Register and watch 
now](https://info.microsoft.com/ww-landing-power-bi-embedded-in-20-min.html?Is=Website) - -Beyond the spreadsheet -by Gohul Shanmugalingam -[Register and watch now](https://info.microsoft.com/CA-PowerBI-WBNR-FY18-05May-09-DataBeyondtheSpreadsheet-MCW0006385_01Registration-ForminBody.html?Is=Website) - -Draw the right insights with Power BI and Visio -by Shakun Grover -[Register and watch now](https://info.microsoft.com/ww-landing-powerbi-and-visio.html?Is=Website) - -Transforming a report from good to GREAT! -by Reid Havens -[Watch now](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/Power-BI-Transforming-A-Report-From-Good-to-GREAT/m-p/315119?Is=Website) - -## Partner Solutions Series - -[Watch this series](https://info.microsoft.com/ww-landing-PartnerWebinarSeriesPage.html) - -Power BI: How to get insights from your Workday HR data -by Iman Eftekhari, Agile Analytics, Julia Paton, Agile Analytics, and Shahram Karimi, QBE Insurance -[Register and watch now](https://info.microsoft.com/How-to-Get-insights-from-Your-Workday-HR-Data-Registration.html) - -Achieving a win-win for consumer product goods, manufacturers, and retailers -by Liz McCreesh, Thorogood -[Register and watch now](https://info.microsoft.com/Achieving-a-Win-Win-for-Consumer-Packaged-Goods-Manufacturers-and-Retailers-registration.html) - -Transform customer data into retail success with Power BI -by Angad Soni, Hitachi Solutions -[Register and watch now](https://info.microsoft.com/Transform-Your-Customer-Data-into-Retail-Success-OnDemandRegistration.html?wt.mc_id=undefined) - -Proven healthcare solutions to improve both patient outcomes and profitability -by Stephen Cracknell, UA Medical IT and Stuart Macanliss, US Medical IT -[Register and watch now](https://info.microsoft.com/Proven-Techniques-for-Building-Effective-Dashboards-Registration.html?Is=Website) - -Manufacturers: Your industry is going through a digital transformation - Maintain leadership by leveraging analytics to maximize 
profitability -by Jon Thompson, Blue Margin and Jim Pastor, Elgin Fastener Group -[Register and watch now](https://info.microsoft.com/digital-transformation-in-manufacturing.html?) - -Visualize public or private datasets with the new Power BI and data.world connector -by Patrick McGarry and Miguel Martinez -[Register and watch now](https://info.microsoft.com/data-world-connector-powerbi.html?Is=Website) - -Boost your BI with location intelligence -by Scott Ball, Esri and Enrique Plaza, Microsoft -[Register and watch now](https://info.microsoft.com/ww-ondeamnd-boost-powerbi-with-arcgis.html?Is=Website) - -Five habits of a successful trend curator -by Rohit Bhargava, Non-Obvious -[Register and watch now](https://info.microsoft.com/ww-landing-5-Habits-of-a-Successful-Trend-Curator-Video.html) - -## Community - -Visit the [Community Webinars and Video Gallery](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/bd-p/VideoTipsTricks) for more resources. - -Power BI tricks, tips, and tools from the owners of PowerBI.Tips -by Mike Carlo and Seth Bauer -[Watch now](https://www.youtube.com/watch?v=fnj1_e3HXow) - -Storytelling with your data and Power BI -by Tristan Malherbe -[Watch now](https://www.youtube.com/watch?v=egk0suekwHo) - -Practical DAX for Power BI -by Phil Seamark -[Watch now](https://www.youtube.com/watch?v=1fGfqzS37qs) - -Developing with Power BI Embedding – The April 2018 Update -
by Ted Pattison -[Watch now](https://www.youtube.com/watch?v=swnGlrRy588) - -Power BI security deep dive -by Kasper de Jonge -[Watch now](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/5-23-2017-Power-BI-security-deep-dive-by-Kasper-de-Jonge/m-p/161476?Is=Website) - -Ask a Partner: Developing Power BI visuals for Power BI -by Ted Pattison -[Watch now](https://community.powerbi.com/t5/Webinars-and-Video-Gallery/Ask-a-Partner-Developing-Custom-Visuals-for-Power-BI/m-p/150368?Is=Website) - -## Advanced Topics - -Advanced analytics with Excel and Power BI -by Nagasaikiran Kambhampati, Myriad Consulting and Miguel Martinez, Microsoft -[Register to watch](https://info.microsoft.com/ww-landing-advanced-analytics-excel-powerbi.html?Is=Website) -[Download the Advanced Analytics Starter Kit to follow along](https://aka.ms/pbiaawebinar) - -Power BI adoption framework webinar series -by Manu Kanwarpal and Paul Henwood -[Register and watch now - Part 1 - Adoption: Adopt a data-driven culture](https://info.microsoft.com/ww-landing-powerbi-adoption-ondemand.html?Is=Website) -[Register and watch now - Part 2 - Governance: Govern your Power BI usage](https://info.microsoft.com/ww-ondemand-powerbi-governance.html?Is=Website) -[Register and watch now - Part 3 - Service Management: Power BI Service Management Insights](https://info.microsoft.com/ww-landing-pbi-adoption-framework-part3.html?Is=Website) -[Register and watch now - Part 4 - Security: Keeping your data secure with Power BI](https://info.microsoft.com/ww-landing-pbi-adoption-framework-part4.html?Is=Website) -[Register and watch now - Part 5 - Rollout: Successfully rolling out Power BI](https://info.microsoft.com/ww-landing-powerbi-adoption-part5-rollout.html?Is=Website) - -Be a full stack Power BI Jedi - A walkthrough of Power BI most advanced features through Star Wars data -by Gil Raviv -[Watch now](https://www.youtube.com/watch?v=r0Qk5V8dvgg) - -## Related content - -- [Power BI 
whitepapers](../guidance/whitepapers.md) - -- [What is Power BI?](power-bi-overview.md) - -- Follow [@MSPowerBI on Twitter](https://twitter.com/mspowerbi) - -- Subscribe to our [YouTube channel](https://www.youtube.com/mspowerbi) - -More questions? [Try asking the Power BI Community](https://community.powerbi.com/) diff --git a/powerbi-docs/guidance/TOC.yml b/powerbi-docs/guidance/TOC.yml index 2793e0f8e4..7a773b93dc 100644 --- a/powerbi-docs/guidance/TOC.yml +++ b/powerbi-docs/guidance/TOC.yml @@ -277,6 +277,8 @@ href: powerbi-implementation-planning-auditing-info-protection-data-loss-prevention.md - name: "Data gateways" href: powerbi-implementation-planning-data-gateways.md + - name: "Integration with other services" + href: powerbi-implementation-planning-integration-with-other-services.md - name: "Auditing and monitoring" items: - name: "Auditing and monitoring overview" diff --git a/powerbi-docs/guidance/powerbi-implementation-planning-integration-with-other-services.md b/powerbi-docs/guidance/powerbi-implementation-planning-integration-with-other-services.md new file mode 100644 index 0000000000..b392f4fb9b --- /dev/null +++ b/powerbi-docs/guidance/powerbi-implementation-planning-integration-with-other-services.md @@ -0,0 +1,527 @@ +--- +title: "Power BI implementation planning: Integration with Other Services" +description: "This article helps you to plan how and when to integrate Power BI and Microsoft Fabric with other services." +author: peter-myers +ms.author: daengli +ms.reviewer: asaxton +ms.service: powerbi +ms.subservice: powerbi-resource +ms.topic: conceptual +ms.custom: fabric-cat +ms.date: 10/12/2024 +ms.collection: ce-skilling-ai-copilot +--- + +# Power BI implementation planning: Integration with other services + +[!INCLUDE [powerbi-implementation-planning-context](includes/powerbi-implementation-planning-context.md)] + +This article helps you to plan how and when to integrate Power BI and Microsoft Fabric with other services. 
This article is primarily targeted at:
+
+- **BI and analytics directors and managers**: Decision makers who are responsible for overseeing the BI program and strategy. These individuals decide whether to use other services to support specific strategic objectives, or to complement Fabric or Power BI.
+- **Fabric administrators**: The administrators who are responsible for overseeing Fabric in the organization. Fabric administrators control which services can integrate with Fabric by enabling [Integration tenant settings](/fabric/admin/service-admin-portal-integration), and they set up tenant-level integration with services in Azure or Microsoft Teams. Often, Fabric administrators need to collaborate with other administrators to facilitate this integration.
+- **Center of Excellence (COE), IT, and BI teams**: The teams that are responsible for overseeing Power BI in the organization. These teams look for opportunities to use services that, when integrated, help people solve problems or use Power BI more effectively.
+- **Content owners and content creators**: The teams and individuals that champion analytics in a team or department. These teams and individuals perform workspace-level and solution-level integration to support specific needs and use cases, where permitted.
+
+When you use Power BI, you might experience certain needs or challenges that you can't address with the core Power BI tools and features. In these situations, you can consider integrating Power BI with other services. Most of these services are Microsoft services, like Azure or Microsoft 365, but you can also integrate Power BI with custom or third-party services. Extending the functionality of Power BI in these ways can help solve new problems, and it allows people to become more effective with their regular tasks.
+
+Here are some common scenarios involving the integration of Power BI with other services:
+
+- You have specific requirements that mandate the use of another service.
For example, you must integrate with [Azure Private Link](/azure/private-link/private-link-overview) to connect to services over a [private endpoint](/azure/private-link/private-endpoint-overview) in your virtual network. +- You encounter specific challenges that can't be overcome with Power BI alone. For example, you use [Azure Log Analytics](../transform-model/log-analytics/desktop-log-analytics-overview.md) integration to obtain detailed query diagnostics of your semantic models for troubleshooting and auditing. +- You want to use services that you already use, or extend the capabilities of Power BI. For example, you can allow Excel users to connect to semantic models by using the [Excel add-in](../collaborate-share/service-analyze-in-excel.md) to insert connected PivotTables. + +You can integrate Power BI with other services at the level of your tenant, your workspace, or individual solutions (like semantic models and reports): + +- **Tenant-level integration**: Affects the entire tenant and is set up by Fabric administrators, usually in collaboration with other administrators. For example, [Teams integration](../collaborate-share/service-collaborate-microsoft-teams.md) is set up at the tenant level. Another example that affects networking is [Azure ExpressRoute](/power-platform/guidance/expressroute/overview). +- **Workspace-level integration**: Affects all content in the workspace and is set up by workspace administrators. For example, [Git integration](/fabric/cicd/git-integration/intro-to-git-integration) is set up at the workspace level to achieve source control with [Azure Repos](/azure/devops/repos/get-started/what-is-repos?view=azure-devops&preserve-view=true), which is a service of [Azure DevOps](/azure/devops/user-guide/what-is-azure-devops?view=azure-devops&preserve-view=true). +- **Solution-level integration**: Affects a single content item and is set up by the content creator. 
For example, [Python](../connect-data/desktop-python-visuals.md) or [R](../connect-data/service-r-packages-support.md) is set up at the solution level to enable the creation of custom, interactive visuals.
+
+For all three of these levels, there are considerations you should keep in mind when you integrate Power BI with other services:
+
+- **Security considerations**: Integrating other services inevitably results in more risks that you must mitigate to use them successfully. For example, integration with AI services has the potential to expose internal data to external services that train their models. To mitigate this risk, ensure that you proactively evaluate any security risks and considerations of integrating a service. Also, identify concrete actions to ensure compliance with data security and privacy policies in your region and organization.
+- **Licensing considerations**: Integrating other services might require a specific subscription or license. For example, integrating Power BI reports with [Power Apps](/power-apps/powerapps-overview) is only possible when you have the appropriate Power Apps licenses. For each service, ensure that you evaluate whether you need a specific license or subscription to integrate it, and what the estimated cost is per user or capacity. Do this evaluation not only for the services, but also for Fabric and Power BI per-user and per-capacity licenses.
+- **Governance considerations**: Integrating other services results in more diverse activities and operations that people undertake in your tenant, some of which might lead to inappropriate practices. For example, integration of Power BI reports with OneDrive or SharePoint might lead to people sharing Power BI Desktop (.pbix) files directly with report viewers. This approach deviates from the best practice of publishing the report to a workspace and sharing it via direct access, workspace viewer roles, or a Power BI app.
Therefore, you should proactively identify any potential governance risks before you integrate a service, and identify the effort needed to monitor and support the service in your tenant. +- **Mentoring and user enablement considerations**: Integrating other services might require time and effort to train users to use any new capabilities effectively. For example, if you allow users to integrate Excel with Power BI, you should train users about how to effectively use [Analyze in Excel](../collaborate-share/service-analyze-in-excel.md#analyze-in-excel). Training should guide them on when to use it, and inform them of its considerations and limitations. Ensure that you proactively plan how to train and support people that will use this integration. + +The remainder of this article describes the possibilities to integrate Power BI with other services at the level of your tenant, workspaces, and individual solutions (like reports or semantic models). + +> [!NOTE] +> This article provides an overview of the different services that you can integrate with Power BI, and the potential use cases to do so. The purpose of this article _is not_ to guide you in the technical steps required to set up or troubleshoot the integration. You will find links to technical information in each respective section of this article. + +## Tenant-level integration + +Fabric administrators can integrate some services for use across the entire tenant. Typically, this integration facilitates broader interoperability between Fabric or Power BI and related services, like those available in Azure. Tenant-level integration can also affect how certain data is handled. + +> [!IMPORTANT] +> For an overview of the relevant administration settings a Fabric administrator can use to control integration of Microsoft Fabric or Power BI with external services, see [Integration tenant settings](/fabric/admin/service-admin-portal-integration). 
A Fabric administrator can control integration with services across all levels with these tenant settings. + +### Integration with Azure services + +You can integrate your tenant with a wide range of Azure services that you might already use to store or manage your data. This integration helps you apply the scope and benefits of Azure services from within Fabric and Power BI. It also enables more advanced capabilities that can support many roles, from administrators and centralized teams to decentralized content owners or creators. + +Integrating with Azure services requires that you have an active Azure subscription for them. Additionally, there are some specific licensing considerations for this option. Using sensitivity labels and DLP policies requires an Azure Information Protection Premium P1 or Premium P2 license. Users might require a Power BI Pro or Premium Per User (PPU) license to use features resulting from this integration, like applying sensitivity labels. Finally, some of these services also require that you have Fabric or Premium capacity, and they might use your capacity resources. 
+
+For guidance on how to integrate with Azure services, see:
+
+- [Azure Information Protection](../enterprise/service-security-enable-data-sensitivity-labels.md) for sensitivity labels and data loss prevention (DLP) policies
+- [Azure Synapse Analytics](/fabric/onelake/onelake-azure-synapse-analytics)
+- [Azure Databricks](/fabric/onelake/onelake-azure-databricks)
+- [Databricks Unity Catalog](/fabric/onelake/onelake-unity-catalog)
+- [Azure HDInsight](/fabric/onelake/onelake-azure-hdinsight)
+- [Azure Automation](/azure/automation/overview)
+
+While not necessarily Azure services, you can also use the following tools available for tenant-level integration with Power BI:
+
+- [PowerShell](/fabric/onelake/onelake-powershell)
+- [Azure Storage Explorer](/fabric/onelake/onelake-azure-storage-explorer)
+
+#### Integration with AI services
+
+Apart from [Copilot in Fabric](/fabric/get-started/copilot-fabric-overview), there are various AI services that you can integrate with Fabric and Power BI. These services can help you perform advanced analytics by applying specific models to your data, depending on your needs and use cases.
+
+Integrating with Azure AI services requires that you have an active Azure subscription for them. Additionally, some of these services also require that you have Fabric or Premium capacity, and they will use your capacity resources. To ensure that these workloads don't have a negative impact on your capacity utilization, set a memory limit for AI workloads within your capacity. That way, you can avoid unexpected usage of your capacity units (CUs). For more information, see [Manage impact on a Premium capacity](../transform-model/desktop-ai-insights.md#manage-impact-on-a-premium-capacity).
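These AI services are exposed as standard REST endpoints, so you can also call them outside of Power Query. The following sketch builds a language detection request against the Text Analytics v3.1 route; the resource name and key are placeholders, and the request itself isn't sent in this example.

```python
# Hedged sketch: calling the Azure AI Language detection endpoint directly.
# The resource name and key are placeholders, not real values.

def build_language_detection_request(resource: str, texts: list[str]):
    """Build the URL and JSON body for a Text Analytics v3.1 languages call."""
    url = f"https://{resource}.cognitiveservices.azure.com/text/analytics/v3.1/languages"
    body = {
        "documents": [
            {"id": str(i), "text": text}
            for i, text in enumerate(texts, start=1)
        ]
    }
    return url, body

def detect_languages(resource: str, key: str, texts: list[str]):
    """Send the request (needs the requests package, a real Azure AI Language
    resource, and network access -- not executed in this sketch)."""
    import requests  # third-party: pip install requests
    url, body = build_language_detection_request(resource, texts)
    response = requests.post(url, json=body, headers={"Ocp-Apim-Subscription-Key": key})
    response.raise_for_status()
    return response.json()
```

The same pattern applies to the other text endpoints (key phrases, sentiment); only the route and response shape change.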
+ +For guidance on how to integrate with the different AI services in Azure, see: + +- [Azure OpenAI Services](/fabric/data-science/ai-services/how-to-use-openai-via-rest-api) +- [Azure AI Services (Text Analytics and Vision)](../transform-model/desktop-ai-insights.md#use-text-analytics-and-vision) +- [Azure AI Translator](/fabric/data-science/ai-services/how-to-use-text-translator?tabs=rest) +- [Azure Machine Learning](../transform-model/desktop-ai-insights.md#use-azure-machine-learning) + +#### Integrate Azure AI Services in Power Query + +You can invoke specific AI functions in Power Query by using Azure AI Services. These functions run by using Fabric capacity or Premium capacity for a selected workspace. They can derive useful information from less-structured text or image data. + +Use cases for Azure AI Services integration with a semantic model or dataflow include: + +- [Language detection](../transform-model/desktop-ai-insights.md#detect-language) from text in a field. +- [Key phrase extraction](../transform-model/desktop-ai-insights.md#extract-key-phrases) from text in a field. +- [Sentiment analysis](../transform-model/desktop-ai-insights.md#score-sentiment) of text from input fields. +- [Image classification](../transform-model/desktop-ai-insights.md#tag-images) of images of recognizable objects, entities, scenes, or actions. + +#### Integrate Azure Machine Learning in Power Query + +Similarly to how you can use Azure AI Services, you can apply machine learning models to your data by [invoking dynamic Power Query functions](../transform-model/desktop-ai-insights.md#invoke-an-azure-machine-learning-model-in-power-query). These machine learning models must have schema files generated in Python by the model creator. + +Dataflow Gen1 creators can also use [AutoML](../transform-model/dataflows/dataflows-machine-learning-integration.md#work-with-automl) to create their own machine learning models by using Power BI during data preparation. 
Creators can choose a specific type of model, either [binary prediction](../transform-model/dataflows/dataflows-machine-learning-integration.md#binary-prediction-models), [general classification](../transform-model/dataflows/dataflows-machine-learning-integration.md#classification-models), or [regression](../transform-model/dataflows/dataflows-machine-learning-integration.md#regression-models). Next, they train these models with input data, and evaluate the results before they [apply the model](../transform-model/dataflows/dataflows-machine-learning-integration.md#apply-the-automl-model) to new or updated data after dataflow refresh.
+
+Use cases for Azure Machine Learning integration with a semantic model or dataflow Gen1 include:
+
+- Conduct predictive modeling in Power BI without needing deep expertise in data science tools or Python.
+- Perform simple churn prediction and forecasting.
+- Apply organizational models in Azure Machine Learning to enrich data in Power BI.
+
+### Integration for independent software vendors
+
+Independent software vendors (ISVs), who produce and sell software, can integrate with Fabric to support and extend their applications.
+
+There are three different models that ISVs can use to integrate with Fabric:
+
+- **[Interop model](/fabric/cicd/partners/partner-integration#interop-with-fabric-onelake)**: ISVs can integrate with OneLake through various tools, such as the OneLake APIs.
+- **[Develop on Fabric model](/fabric/cicd/partners/partner-integration#develop-on-fabric)**: ISVs can develop their own products and services on Fabric, and even embed the capabilities of Fabric in their software.
+- **[Build a Fabric workload model](/fabric/cicd/partners/partner-integration#build-a-fabric-workload)**: ISVs can use the Microsoft Fabric Workload Development Kit to create and [monetize](/fabric/workload-development-kit/monetization) workloads.
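For the interop model, OneLake exposes an ADLS Gen2-compatible DFS endpoint, so existing Azure Storage SDK code can read Fabric data directly. A minimal sketch, assuming the azure-storage-file-datalake and azure-identity packages and placeholder workspace and lakehouse names:

```python
# Hedged sketch of the interop model: reading OneLake through its ADLS
# Gen2-compatible endpoint. Workspace and lakehouse names are placeholders.

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"

def lakehouse_files_prefix(lakehouse: str) -> str:
    """OneLake path convention: files live under <lakehouse>.Lakehouse/Files."""
    return f"{lakehouse}.Lakehouse/Files"

def list_lakehouse_files(workspace: str, lakehouse: str) -> list[str]:
    """List file paths in a lakehouse (needs azure-storage-file-datalake,
    azure-identity, and network access -- not executed in this sketch)."""
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(ONELAKE_URL, credential=DefaultAzureCredential())
    file_system = service.get_file_system_client(workspace)  # the workspace acts as the container
    return [p.name for p in file_system.get_paths(path=lakehouse_files_prefix(lakehouse))]
```

Because the endpoint is ADLS Gen2-compatible, the same approach works from any tool that speaks the DFS API, which is what makes this the lightest-weight of the three models.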
+
+For more information about how ISVs can integrate with Fabric, see [Microsoft Fabric Integration Pathways for ISVs](/fabric/cicd/partners/partner-integration).
+
+### Microsoft Teams integration
+
+You can integrate your tenant with Microsoft Teams to allow users to access Fabric and Power BI from within the Teams application. This capability is a convenient way to centralize collaboration and promote adoption of both Teams and Power BI.
+
+For more information about how to integrate Teams with Power BI, see:
+
+- [Add the Power BI app to Microsoft Teams](../collaborate-share/service-microsoft-teams-app.md): Integrate the Power BI experience into Microsoft Teams.
+- [Embed interactive reports in Teams channels and chats with a Power BI tab](../collaborate-share/service-embed-report-microsoft-teams.md): Help colleagues find and discuss your team's data.
+- [Use interactive reports in Teams meetings](../consumer/business-user-teams-meetings.md): Discuss a report during a meeting, or use the report to support the meeting objectives.
+- [Create a link preview in the Teams message box](../collaborate-share/service-teams-link-preview.md): Paste links to reports, dashboards, or Power BI apps.
+- [Chat in Microsoft Teams directly from within the Power BI service](../collaborate-share/service-share-report-teams.md): Share a filtered view of reports and dashboards, and start conversations.
+- [View all the Power BI tabs you have in Microsoft Teams](../collaborate-share/service-teams-pivot.md): Select the _In Teams_ tab on the Power BI app home page.
+- [Get notified in the Teams activity feed](../collaborate-share/service-teams-notifications.md): Quickly learn when important events happen in Power BI.
+
+Use cases for Teams integration with Power BI include:
+
+- Curate a [centralized portal](../guidance/fabric-adoption-roadmap-mentoring-and-user-enablement.md#centralized-portal) for your [community of practice](../guidance/fabric-adoption-roadmap-community-of-practice.md) and embed key Power BI reports and resources.
+- Create dedicated teams or Teams channels for content distributed from a Power BI app, where people can share feedback, report issues, or ask questions about the content.
+- Train users to make [shared views](../collaborate-share/service-share-reports.md) that they can share via Teams to discuss specific perspectives or data points.
+
+### Geospatial services integration
+
+When you work with geospatial data, you'll probably want to visualize it in interactive map visuals with Power BI. However, these visuals require integration with other services, which you can control at the tenant level by using the tenant settings. These visuals can be effective in reports that present geospatial data, but you should ensure that using these services doesn't violate any data residency or compliance requirements.
+
+For more information about how to integrate Power BI with various geospatial services, see:
+
+- [ArcGIS visualizations in Power BI reports](../visuals/power-bi-visualizations-arcgis.md), which use Esri services.
+- [Azure Maps visualizations for Power BI reports](https://go.microsoft.com/fwlink/?linkid=2132636), which use Azure services.
+- [Map](../visuals/desktop-shape-map.md) and [filled map visuals](../visuals/power-bi-visualization-filled-maps-choropleths.md), which use Bing services.
+
+> [!WARNING]
+> Geospatial services might use other services that are outside of the geographic region of your Power BI tenant, compliance boundary, or national cloud instance.
Furthermore, these services might store and process your data where they maintain facilities, and use of these services might be subject to separate terms and privacy policies beyond Power BI.
+>
+> This warning also applies to any third-party custom visual that you use to visualize geospatial information.
+
+## Workspace-level integration
+
+You can integrate certain services at the level of individual workspaces. These services can enable capabilities to help you develop, manage, and view content in a workspace.
+
+### Git integration
+
+If your workspace uses Fabric capacity, Premium capacity, or PPU license modes, you can use [Git integration](/fabric/cicd/git-integration/intro-to-git-integration) to connect a workspace to a remote Git repository to support more advanced lifecycle management scenarios. A remote Git repository facilitates [source control](../guidance/powerbi-implementation-planning-content-lifecycle-management-develop-manage.md#source-control-by-using-a-remote-git-repository) of files, which allows content creators to [track and manage changes](../guidance/powerbi-implementation-planning-content-lifecycle-management-develop-manage.md#decide-how-youll-use-version-control). Git integration also promotes [collaboration](../guidance/powerbi-implementation-planning-usage-scenario-enterprise-content-publishing.md#collaboration-flow-diagram) among developers, particularly when using [branches](/fabric/cicd/git-integration/manage-branches) to isolate development of specific features before merging those changes into the main branch ahead of deployment.
+
+In brief, content creators can develop content either locally or in the Power BI service, then commit and push those changes to a remote Git repository, like [Azure Repos](/azure/devops/repos/get-started/what-is-repos) or [GitHub Enterprise](/visualstudio/subscriptions/access-github).
For information about how to set up and use Git integration for Power BI and Fabric, see [Get started with Git integration](/fabric/cicd/git-integration/git-get-started?tabs=commit-to-git) or [Tutorial: end-to-end lifecycle management](/fabric/cicd/cicd-tutorial).
+
+Content creators store Power BI Project (.pbip) files, metadata files, and documentation in a central Azure Repos remote repository. These files are curated by a [technical owner](../guidance/powerbi-adoption-roadmap-content-ownership-and-management.md#ownership-and-stewardship). While content creators develop a solution, the technical owner is responsible for managing it, reviewing the changes, and merging them into a single solution. Azure Repos provides more sophisticated options for tracking and managing changes compared to SharePoint and OneDrive. Maintaining a well-curated, documented repository is essential because it's the foundation of all content and collaboration.
+
+Consider using source control to track and manage changes in the following scenarios:
+
+- Centralized or decentralized teams create and manage the content.
+- Content creators collaborate by using Azure DevOps.
+- Content creators are familiar with Git, source control management, or [DataOps architecture design](/azure/architecture/data-guide/azure-dataops-architecture-design).
+- Content creators manage complex or important content, or they expect the content to scale and grow in complexity and importance.
+
+To help you effectively use source control with Azure DevOps, you need to be aware of considerations and meet certain prerequisites:
+
+- **Git**: To commit and push changes to a remote repository, content creators need to [download](https://git-scm.com/downloads) and install _Git_. Git is a distributed version control system that tracks changes in your files. To learn about the basics of Git, see [What is Git?](/devops/develop/git/what-is-git).
+- **Tools**: To use Git, content creators need to use either a [command line interface (CLI)](https://git-scm.com/book/en/v2/Getting-Started-The-Command-Line) or a graphical user interface (GUI) client that has integrated source control management (SCM), like [Visual Studio](/visualstudio/extensibility/internals/source-control) or [Visual Studio Code](https://code.visualstudio.com/docs/sourcecontrol/overview).
+- **Licenses and permissions**: To create and use an Azure Repos Git repository, content creators must:
+  - Have their Azure DevOps [access level](/azure/devops/organizations/billing/buy-basic-access-add-users?view=azure-devops&preserve-view=true) set to _Basic_ (as opposed to _Stakeholder_).
+  - Belong to an Azure DevOps [organization](/azure/devops/user-guide/manage-organization-collection?view=azure-devops&preserve-view=true#add-users-to-your-organization) and a [project](/azure/devops/organizations/security/add-users-team-project?view=azure-devops&tabs=preview-page&preserve-view=true).
+  - Have appropriate Azure DevOps [repository permissions](/azure/devops/repos/git/set-git-repository-permissions?view=azure-devops&preserve-view=true).
+  - Work only with Power BI items, due to Git integration constraints when using Power BI Premium capacity (A SKUs) or PPU workspaces.
+- **Fabric Git integration**: To sync content in a remote repository with a Fabric workspace, content creators use [Fabric Git integration](/fabric/cicd/git-integration/intro-to-git-integration). This tool is important because it tracks and manages changes to content that's created in the Fabric portal, like dataflows.
+
+### Integrate Azure Log Analytics
+
+You can use Azure Log Analytics to gather valuable information to support [data-level auditing](../guidance/powerbi-implementation-planning-auditing-monitoring-data-level-auditing.md#azure-log-analytics) of workspace items. Azure Log Analytics is a component of the [Azure Monitor](/azure/azure-monitor/) service.
Specifically, [Azure Log Analytics integration with Power BI](../transform-model/log-analytics/desktop-log-analytics-overview.md) allows you to capture semantic model events from all semantic models in a Power BI workspace. It's only supported for workspaces that use Fabric or Premium capacity. For information about how to set up and use Azure Log Analytics for Power BI and Fabric, see [Data-level auditing: Azure Log Analytics](../guidance/powerbi-implementation-planning-auditing-monitoring-data-level-auditing.md#azure-log-analytics) and [Configure Azure Log Analytics in Power BI](../transform-model/log-analytics/desktop-log-analytics-configure.md). + +After you set up Azure Log Analytics integration and the connection is enabled (for a supported workspace), semantic model events are automatically captured and continually sent to an Azure Log Analytics workspace. The semantic model logs are stored in [Azure Data Explorer](/azure/data-explorer/data-explorer-overview), which is an append-only database that's optimized for capturing high-volume, near real-time telemetry data. + +Use cases for using Azure Log Analytics include: + +- You want to monitor strategically important semantic models, like centralized models that you provide to decentralized teams in a [managed self-service](../guidance/powerbi-implementation-planning-usage-scenario-managed-self-service-bi.md) usage scenario. +- You want to audit or investigate semantic models that have a high impact on resource utilization, like Fabric capacity. +- You want detailed analytics about query and usage patterns for semantic models. + +To use Azure Log Analytics, you must set up and pay for an [Azure Log Analytics workspace](/services-hub/unified/health/log-analytics-workspace) as part of your Azure subscription. You pay for Azure Log Analytics with a pay-as-you-go subscription. For more information, see [Azure Log Analytics Pricing](/services-hub/unified/health/azure-pricing). 
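To illustrate the kind of analysis this integration enables, the following sketch queries the captured semantic model events for slow queries by using the azure-monitor-query SDK. The workspace ID is a placeholder, and the column names follow the typical `PowerBIDatasetsWorkspace` event schema rather than an exhaustive reference, so verify them against your own workspace.

```python
# Hedged sketch: find slow semantic model queries in Azure Log Analytics.
# The workspace ID is a placeholder; column names should be verified against
# the PowerBIDatasetsWorkspace table in your own Log Analytics workspace.
from datetime import timedelta

def slow_query_kql(threshold_ms: int = 5000) -> str:
    """Build a KQL query for QueryEnd events slower than the threshold."""
    return (
        "PowerBIDatasetsWorkspace\n"
        '| where OperationName == "QueryEnd"\n'
        f"| where DurationMs > {threshold_ms}\n"
        "| project TimeGenerated, ArtifactName, ExecutingUser, DurationMs\n"
        "| order by DurationMs desc"
    )

def run_slow_query_report(workspace_id: str):
    """Execute the query (needs azure-monitor-query, azure-identity, and
    network access -- not executed in this sketch)."""
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())
    response = client.query_workspace(
        workspace_id=workspace_id,  # a Log Analytics workspace GUID in practice
        query=slow_query_kql(),
        timespan=timedelta(days=1),
    )
    return [row for table in response.tables for row in table.rows]
```

The same KQL runs unchanged in the Log Analytics query editor, so you can prototype the query interactively before automating it.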
+ +### Integrate Azure Data Lake Storage Gen2 + +You can connect a workspace to an Azure Data Lake Storage (ADLS) Gen2 account. When you connect a workspace to ADLS Gen2, you can store data for Power BI dataflows (also called dataflows Gen1) and semantic model backups. For information about how to set up and use ADLS Gen2 to store data from Power BI dataflows, see [Configuring dataflow storage to use Azure Data Lake Gen 2](../transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md). + +Setting [Azure connections](../admin/service-admin-portal-azure-connections.md) in the Fabric admin portal doesn't mean that all Power BI dataflows for the tenant are stored by default in an ADLS Gen2 account. To use a specific storage account (instead of internal storage), each workspace must be explicitly connected. It's critical that you set the workspace Azure connections _before you create any Power BI dataflows_ in the workspace. + +The following two sections present reasons why you might integrate a workspace with ADLS Gen2. + +#### Storage of Power BI dataflows data + +If you bring your own data lake, the data for Power BI dataflows (Gen1) can be accessed directly in Azure. Direct access to [dataflow storage in ADLS Gen2](../transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md) is helpful when you want other users or processes to view or access the data. It's especially helpful when your goal is to reuse dataflows data beyond Power BI. + +There are two options for assigning storage: + +- [Tenant-level storage](../guidance/powerbi-implementation-planning-usage-scenario-self-service-data-preparation.md#tenant-level-storage): This option is helpful when you want to centralize all data for Power BI dataflows into one ADLS Gen2 account. 
+- [Workspace-level storage](../guidance/powerbi-implementation-planning-usage-scenario-self-service-data-preparation.md#workspace-level-storage): This option is helpful when business units manage their own data lake or have certain data residency requirements. + +> [!TIP] +> If you use Fabric, we recommend that you use [dataflows Gen2](/fabric/data-factory/dataflows-gen2-overview), which can store data in [different destinations](/fabric/data-factory/dataflows-gen2-overview#data-destinations), including [OneLake](/fabric/onelake/onelake-overview). Dataflows Gen2 are more flexible than dataflows Gen1, because they provide more options to [integrate with other data pipelines](/fabric/data-factory/dataflows-gen2-overview#integration-with-data-pipelines) and they benefit from [high-scale compute](/fabric/data-factory/dataflows-gen2-overview#high-scale-compute). + +#### Backup and restore for Power BI semantic models + +The [Power BI semantic model backup and restore feature](../enterprise/service-premium-backup-restore-dataset.md) is supported for workspaces that are assigned to Fabric capacity, Premium capacity, or PPU. This feature uses the same ADLS Gen2 account that's used to store Power BI dataflows data (described in the previous section). + +Semantic model backups help you: + +- Comply with data retention requirements. +- Store routine backups as part of a [disaster recovery strategy](/azure/well-architected/reliability/disaster-recovery). +- Store backups in a different region. +- Migrate a data model. + +## Solution-level integration + +You can integrate certain services at the level of individual items, like semantic models or reports. These integrations can enable specific use cases and extend the functionality of your Power BI items. + +### Integration with Microsoft Fabric + +Power BI is part of Fabric, but Power BI is a distinct workload in Fabric that can integrate with the other experiences that are unified under the Fabric umbrella. 
If you're used to working only with Power BI, it's important to understand the possibilities and opportunities to apply other workloads, items, and features in Fabric.
+
+The following sections present examples of how you can integrate Power BI content with Fabric to extend the capabilities of Power BI.
+
+#### OneLake integration with semantic models
+
+Content creators who make Power BI semantic models can use [OneLake integration](../enterprise/onelake-integration-overview.md) to write model tables to [Delta tables in OneLake](/fabric/data-engineering/lakehouse-and-delta-tables). After the initial export of the in-memory tables, they can be reused from OneLake for other use cases without further copying. The Delta tables are accessible via a [lakehouse](/fabric/data-engineering/lakehouse-overview) in Fabric. Users can also create shortcuts to access the tables so they can use them from another lakehouse or a different item type, like a [data warehouse](/fabric/data-warehouse/data-warehousing).
+
+Use cases for OneLake integration with semantic models include:
+
+- Reuse data from a semantic model that isn't already available in OneLake.
+- Reuse data from a semantic model for use in another Fabric experience.
+- Create snapshots of a semantic model table.
+
+#### Semantic link integration with semantic models via notebooks
+
+Content creators who build semantic models or analyze data in notebooks can use [semantic link](/fabric/data-science/semantic-link-overview) to read and write to semantic models from a notebook in Fabric. Semantic link has a wide range of benefits for Power BI developers, including enhanced productivity, automation, and the ability to quickly and easily perform ad hoc analysis in code.
+
+Use cases for semantic link integration with semantic models include:
+
+- Automate testing of semantic models by evaluating DAX queries and comparing the results to known baselines.
+- Programmatically manage semantic models by running [Best Practice Analyzer](../guidance/powerbi-implementation-planning-auditing-monitoring-data-level-auditing.md#best-practice-analyzer) over multiple models at the same time to identify and classify possible issues.
+- Save common templates and patterns for DAX measures and business logic (like currency conversion) that can be applied to new semantic models.
+- Analyze and visualize data from a semantic model by using Python.
+- Validate models created by data scientists by using the business logic from a semantic model.
+- Use data from a semantic model to enrich analysis.
+
+> [!TIP]
+> The [semantic-link-labs](https://github.com/microsoft/semantic-link-labs/) Python library further extends the utility of semantic link. It's a valuable tool for anyone who creates and manages semantic models and wants to improve the productivity and efficiency of model creation and management.
+
+Even if you don't know Python, you can use [Copilot](/fabric/get-started/copilot-notebooks-chat-pane) and [Chat-magics](/fabric/get-started/copilot-notebooks-chat-magics) for help writing functional Python code.
+
+#### Data Activator integration with Power BI reports
+
+Content creators or consumers who build or use Power BI reports can use [Data Activator](/fabric/data-activator/data-activator-get-data-power-bi) to automate actions and notifications based on data changes. Similar to [data alerts from dashboard tiles](../create-reports/service-set-data-alerts.md), a user can set alerts on a Power BI visual and define triggers for those alerts. The user can also extend this functionality with [custom actions that trigger a Power Automate flow](/fabric/data-activator/data-activator-trigger-power-automate-flows), which can initiate other downstream changes.
+ +Use cases for Data Activator integration with Power BI include: + +- Automated anomaly detection, by setting an alert to trigger when a value exceeds a threshold. +- Automated regression testing of business-critical reports, by setting an alert to trigger when a value (like previous year sales, or a budget variance) exceeds a threshold. + +### Integration with Microsoft Office 365 + +There are many ways to integrate Power BI with Microsoft 365 products, like [Excel](../collaborate-share/service-connect-power-bi-datasets-excel.md), [PowerPoint](../collaborate-share/service-power-bi-powerpoint-add-in-about.md), and [Outlook](/power-platform-release-plan/2022wave1/power-bi/power-bi-integration-outlook-office-hub). + +#### Use Power BI data in Excel + +Users who prefer working in Excel can use either Analyze in Excel or live connected tables to use Power BI data. + +Content consumers who have Build permission for a semantic model can connect to the model from Excel to use [Analyze in Excel](../collaborate-share/service-analyze-in-excel.md). This approach allows users to explore models so they can perform their own ad hoc analysis with PivotTables. + +Use cases for Analyze in Excel include: + +- Users prefer to analyze data in Excel instead of using Power BI. +- Users want to conduct [personal BI](../guidance/powerbi-implementation-planning-usage-scenario-personal-bi.md) to create their own reports in Excel. +- Users want to use Power BI data to support existing analysis in Excel. + +> [!TIP] +> If you expect users to connect to a semantic model from Excel, ensure that you take the necessary steps to train them how to use it, and organize your semantic model in a helpful way. For example, organize fields into display folders, and hide tables and fields that aren't intended for use in reports. + +Analyze in Excel uses Multidimensional Expressions (MDX) for queries instead of Data Analysis Expressions (DAX) used by Power BI reports. 
MDX queries can perform worse than equivalent DAX queries. Ensure that users understand they should use Analyze in Excel for high-level, aggregate analysis, and perform more detailed analysis by using Power BI or other Fabric experiences.
+
+Also, some features in a semantic model, like [field parameters](../create-reports/power-bi-field-parameters.md) and [dynamic measure format strings](../create-reports/desktop-dynamic-format-strings.md), don't work in Analyze in Excel. For other considerations and limitations, see [Considerations and limitations](../collaborate-share/service-analyze-in-excel.md#considerations-and-limitations).
+
+You can also get Power BI data in Excel by using [live-connected tables](../collaborate-share/service-analyze-in-excel.md#export-to-excel-with-live-connection). With this approach, users who export data from a Power BI report visual receive an Excel workbook that contains a table populated with data. The table query automatically retrieves the latest data when the workbook is opened, or when the table is manually refreshed.
+
+Use cases for live-connected tables include:
+
+- Users want to investigate or analyze the data in a particular visual.
+- Users need to regularly export data to support a valid business case.
+- You're performing manual testing of a [semantic model](../guidance/powerbi-implementation-planning-content-lifecycle-management-validate.md#manually-test-semantic-models) or [report](../guidance/powerbi-implementation-planning-content-lifecycle-management-validate.md#manually-test-reports).
+
+While exporting live-connected tables is better than exporting disconnected tables from a Power BI report, you should encourage users to avoid exporting any data. Exported data presents governance challenges and data security risks that can lead to data exfiltration from the organization.
Instead, consider training users to connect to semantic models from Excel or Power BI Desktop to perform their own analysis, and to safely share the results with their colleagues. + +Managing data exports is an important [change management](../guidance/fabric-adoption-roadmap-change-management.md) exercise to improve the maturity of your [data culture](../guidance/fabric-adoption-roadmap-data-culture.md) and enable people to use Power BI effectively. + +#### Integrate Power BI reports in PowerPoint + +You can use the [Power BI add-in for PowerPoint](../collaborate-share/service-power-bi-powerpoint-add-in-about.md) to add live, interactive Power BI reports or specific visuals to PowerPoint slides. This feature is a good alternative to inserting static screenshots because the visuals can be filtered and interacted with during a presentation. + +PowerPoint is a useful tool to complement existing Power BI reports, but it doesn't scale as a primary distribution method. Instead, use report distribution methods, like Power BI apps, and look for opportunities for PowerPoint integration to complement or extend them. + +Managing distribution of reports as flat files and PowerPoint presentations is an important [change management](../guidance/fabric-adoption-roadmap-change-management.md) exercise to improve the maturity of your [data culture](../guidance/fabric-adoption-roadmap-data-culture.md) and [content delivery scope](../guidance/fabric-adoption-roadmap-content-delivery-scope.md), and to enable people to use Power BI effectively. + +Use cases of Power BI integration in PowerPoint include: + +- [Continuously play a presentation in slide show mode](../collaborate-share/service-power-bi-powerpoint-add-in-view-present.md#automatically-refresh-data-during-slide-shows) with up-to-date Power BI reports, for example on a large screen in a factory. 
+- Freeze snapshots of a specific view so that report data doesn't update automatically, for example when you want to review point-in-time reports from a past date.
+- Share a presentation with live Power BI reports so people can see the latest data, for example when you want an audience to review the presentation and reports before you present it.
+
+Fabric administrators can control use of the add-in with the _[Enable Power BI add-in for PowerPoint](/fabric/admin/service-admin-portal-export-sharing#enable-power-bi-add-in-for-powerpoint)_ tenant setting. For other considerations and limitations, see [Considerations and limitations](../collaborate-share/service-power-bi-powerpoint-add-in-about.md#considerations-and-limitations).
+
+### Integration with Power Platform
+
+Power BI is part of [Power Platform](/power-platform/). As such, Power BI integrates well with other applications in the Power Platform family, such as Power Apps, Power Automate, and Power Pages.
+
+- [Power Apps](/power-apps/powerapps-overview) lets you quickly create and deploy low-code applications in your organization.
+- [Power Automate](/power-automate/getting-started) lets you automate tasks and workflows by creating logical flows that trigger either automatically, on a schedule, or in response to a manual action. You can create [cloud flows](/power-automate/overview-cloud) that run unattended without a dedicated machine. You can also use the [Power Automate desktop application](/power-automate/desktop-flows/getting-started-windows-11) to author [desktop flows](/power-automate/desktop-flows/introduction) that require a machine because they use robotic process automation to simulate user actions.
+- [Power Pages](/power-pages/introduction) lets you create external-facing business websites with a low-code user interface.
+
+#### Use the Power Apps visual in a Power BI report
+
+You can integrate Power Apps in Power BI by using the [Power Apps visual](/power-apps/maker/canvas-apps/powerapps-custom-visual).
This visual allows you to display an interactive, functional Power Apps canvas app within a Power BI report. In Power BI, you can select fields to add to the Power Apps visual. Then, in Power Apps, you can use these fields to create data-driven labels and functionality to enhance your app. Together, the integration of Power BI reports and Power Apps enables a wide range of use cases that can help people make decisions and take actions by using data in a report. + +There are some licensing considerations to keep in mind if you take this approach. To use the Power Apps visual in the Power BI report, a report viewer must have a [Power Apps license](/power-platform/admin/pricing-billing-skus) in addition to any required Power BI per-user licenses. Alternatively, you can use a [pay-as-you-go plan](/power-platform/admin/pay-as-you-go-overview) for Power Apps and Power Automate. + +Use cases of the Power Apps visual include: + +- Facilitate writeback to a database, for example to add comments to certain customers or to modify forecast values from within a Power BI report. +- Facilitate direct actions informed by the Power BI report, such as contacting customers from a customer satisfaction report. +- Allow users to submit forms from within the Power BI report, such as feedback forms, polls, or surveys. + +In an embedded scenario, the Power Apps visual is only supported for the [Embed for your organization](../developer/embedded/embed-organization-app.md) scenario and not the [Embed for your customers](../developer/embedded/embed-customer-app.md) scenario. For other limitations, see [Limitations of the Power Apps visual](/power-apps/maker/canvas-apps/powerapps-custom-visual#limitations-of-the-power-apps-visual). + +#### Integrate a Power BI report in a Power Apps canvas app + +You can integrate [Power BI dashboard tiles within a Power Apps canvas app](/power-apps/maker/canvas-apps/how-to/build-powerbi-visual). 
With this approach, the primary consumption medium is the Power App, which is enhanced by the Power BI tile. You embed tiles by using the [Power BI tile control](/power-apps/maker/canvas-apps/controls/control-power-bi-tile) during canvas app development.
+
+#### Take actions in Power BI from Power Automate
+
+You can use Power Automate to automate specific actions in Power BI, such as report export, semantic model refresh, or DAX query evaluation. This capability can help streamline certain tasks and improve productivity.
+
+Use cases for automating Power BI from Power Automate include:
+
+- Trigger refresh of a semantic model when an upstream data source is updated.
+- Automate distribution of Power BI reports or paginated reports.
+- Add rows to a Power BI semantic model table when a flow is triggered.
+
+#### Trigger a Power Automate flow from Power BI
+
+You can also use Power BI to trigger a Power Automate cloud flow in three ways:
+
+- Use the Power Automate visual in a Power BI report.
+- Use data alerts from a Power BI dashboard tile.
+- [Create flows to notify changed goals](../create-reports/service-goals-power-automate.md) in Power BI.
+
+With this approach, you aren't automating Power BI actions as much as you're responding to events that happen in Power BI. These events can be triggered either manually (like the Power Automate visual) or automatically (like data alerts). You can also use data from Power BI downstream in the flows, which can help you automate more specific and relevant actions.
+
+There are some licensing considerations to keep in mind with this approach. To use the Power Automate visual in a Power BI report, the report viewer must have access to the Power Automate flow and a [Power Automate license](/power-platform/admin/power-automate-licensing/types), if necessary, in addition to any required Power BI per-user licenses.
Alternatively, you can use a [pay-as-you-go plan](/power-platform/admin/pay-as-you-go-overview) for Power Apps and Power Automate.
+
+Use cases for triggering a Power Automate flow from Power BI include:
+
+- Update or add rows to an Excel table from within a Power BI report by using the Power Automate visual.
+- Automate regression testing by setting up reports and dashboards to report differences between current values and known baselines, and by setting data alerts on dashboard tiles.
+- Notify a team or individual when there are unexpected values or anomalies in semantic model data by using data alerts.
+
+#### Embed a Power BI report in a Power Pages website
+
+You can [embed a Power BI report in a Power Pages website](/power-pages/admin/set-up-power-bi-integration), which allows you to show Power BI reports on your external-facing website made with Power Pages. This approach streamlines the Embed for your customers scenario by [enabling the Power BI Embedded service](/power-pages/admin/set-up-power-bi-integration#enable-power-bi-embedded-service) from the Power Platform admin center.
+
+There are some licensing considerations to keep in mind with this approach. To embed Power BI reports in a Power Pages website, you must have an F, P, EM, or A SKU. You also need an appropriate [Power Pages license](/power-platform/admin/powerapps-flow-licensing-faq#power-pages).
+
+Use cases for embedding a Power BI report in a Power Pages website include:
+
+- Distribute reports via a custom portal to external users or customers.
+- Display website analytics, like subscribers or traffic for your website. +- Enhance your Power Pages website with interactive Power BI visualizations. + +In addition to the [limitations of Power BI Embedded](../developer/embed-service-principal.md#considerations-and-limitations), there are also [specific limitations for embedding a report in Power Pages](/power-pages/admin/set-up-power-bi-integration#considerations-and-limitations). For example, the report must be published to the same workspace as its connected semantic model. Ensure that you take these considerations into account before you decide to embed Power BI content in a Power Pages website. + +### OneDrive and SharePoint integration + +OneDrive and SharePoint are commonly used because they're convenient options to store content and data files for Power BI. By integrating OneDrive and SharePoint, you can further enhance their sharing capabilities. + +#### OneDrive refresh of Power BI Desktop files + +When you save a Power BI Desktop (.pbix) file to OneDrive for Work or School, or SharePoint, you can import that file into a workspace from OneDrive instead of publishing it from Power BI Desktop. By doing so, you can benefit from [OneDrive refresh](../connect-data/refresh-data.md#onedrive-refresh), where the data model is automatically updated, usually within an hour. + +Use cases for using OneDrive refresh include: + +- Self-service users want to streamline publishing of Power BI Desktop files. +- Content creators want to track and manage changes while collaborating in OneDrive. + +In addition to integrating OneDrive for an individual .pbix file for semantic models and reports, you can also set up [workspace-level integration with OneDrive](../collaborate-share/service-create-the-new-workspaces.md#set-a-workspace-onedrive). 
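OneDrive refresh runs automatically in the background, so it can be useful to confirm programmatically that a model was updated. The following minimal sketch (not an official sample) queries the Power BI REST API refresh history endpoint with only the Python standard library; the workspace ID, semantic model ID, and access token are placeholders that you supply, and token acquisition isn't shown.

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_history_url(workspace_id: str, model_id: str, top: int = 5) -> str:
    """Build the refresh history REST URL for a semantic model in a workspace."""
    return f"{API_ROOT}/groups/{workspace_id}/datasets/{model_id}/refreshes?$top={top}"

def get_refresh_history(token: str, workspace_id: str, model_id: str) -> list:
    """Return recent refresh entries; OneDrive refreshes appear alongside scheduled ones."""
    request = urllib.request.Request(
        refresh_history_url(workspace_id, model_id),
        # A Microsoft Entra access token with permission to read the model
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response)["value"]
```

Each returned entry includes fields like the refresh status and start time, so a script can alert on failures or stale data instead of waiting for users to notice.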
+ +#### Preview Power BI Desktop files in OneDrive and SharePoint + +When you share a Power BI Desktop file with people via OneDrive or SharePoint, they can [preview the report](../collaborate-share/service-sharepoint-viewer.md) from OneDrive or SharePoint without opening it in Power BI Desktop. This capability works only for reports that are connected to a shared semantic model, or Power BI Desktop files that contain a report and an import semantic model. Additionally, you can't preview Power BI Desktop files that are 1 GB or larger. For more information, see [Considerations and limitations](../collaborate-share/service-sharepoint-viewer.md#considerations-and-limitations). + +There are some licensing considerations to keep in mind with this approach. Users require a Power BI Pro license to preview Power BI Desktop files in OneDrive or SharePoint. For more information, see [Prerequisites to viewing reports in OneDrive and SharePoint](../collaborate-share/service-sharepoint-viewer.md#prerequisites-to-viewing-report-in-onedrive-and-sharepoint). + +Use cases for using OneDrive to preview files include: + +- Content creators use OneDrive or SharePoint to facilitate collaboration. +- Content creators who use OneDrive integrations, like [OneDrive refresh](../connect-data/refresh-data.md#onedrive-refresh), or use it to [track and manage changes](../collaborate-share/service-sharepoint-viewer.md#version-history) to .pbix files, want the convenience to preview the files before they open them. + +#### Embed Power BI reports in SharePoint Online + +You can integrate Power BI with SharePoint by [embedding Power BI reports in SharePoint Online](../collaborate-share/service-embed-report-spo.md) (also known as _secure embed_). The report experience is the same as when users view them in a Fabric workspace by using a link shared with direct access. [Row-level security](/fabric/security/service-admin-row-level-security) is enforced, together with item permissions. 
Users must have direct access to reports in order to view them in a SharePoint site. + +Use cases for embedding Power BI reports in SharePoint Online include: + +- You want to distribute reports from a SharePoint portal instead of via a Fabric workspace. This approach can be useful when you want to distribute reports from several workspaces to a specific audience. +- You want to embed reports that support collaboration or decision making in your SharePoint site. + +### Integration with Visual Studio and VS Code + +Many developers are familiar with using [Visual Studio](/visualstudio/get-started/visual-studio-ide) or [Visual Studio Code (VS Code)](/shows/visual-studio-code/) to manage source files and metadata. These tools provide several options to integrate with Power BI and Fabric. + +#### Develop semantic models by using Visual Studio with Analysis Services projects + +If developers prefer working in Visual Studio, they can [develop and deploy semantic models from Visual Studio](/analysis-services/tools-and-applications-used-in-analysis-services?view=power-bi-premium-current&preserve-view=true) instead of Power BI Desktop. In this case, they need Visual Studio 2017 or a later edition, and the 2.9.14 version (or higher) of the SQL Server Data Tools (SSDT) extension. + +> [!TIP] +> Developers who prefer a Visual Studio-like experience to build and manage semantic models may find it more effective to use [Tabular Editor](https://tabulareditor.com/). Tabular Editor is an external tool that connects to a local model open in Power BI Desktop, or a remote model via the XMLA read/write endpoint. It also supports scripting and batch tasking to improve developer productivity. +> +> For more information, see [Advanced data model management](../guidance/powerbi-implementation-planning-usage-scenario-advanced-data-model-management.md). 
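A practical side effect of developing models this way is that the model definition lives in a Model.bim file in the TMSL JSON format, which scripts can inspect or validate, for example in a build pipeline. Here's a minimal sketch that lists the tables a model defines; the model fragment is illustrative, not taken from a real project.

```python
import json

# A trimmed, illustrative Model.bim (TMSL) fragment like the ones that
# Analysis Services projects in Visual Studio produce.
MODEL_BIM = """
{
  "name": "SalesModel",
  "compatibilityLevel": 1600,
  "model": {
    "tables": [
      {"name": "Sales", "columns": [{"name": "Amount", "dataType": "decimal"}]},
      {"name": "Date", "columns": [{"name": "Date", "dataType": "dateTime"}]}
    ]
  }
}
"""

def list_tables(bim_text: str) -> list:
    """Return the names of the tables defined in a TMSL model definition."""
    database = json.loads(bim_text)
    return [table["name"] for table in database.get("model", {}).get("tables", [])]

print(list_tables(MODEL_BIM))  # prints ['Sales', 'Date']
```

The same approach extends to simple checks, such as verifying that every table has a description before the model is deployed.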
+ +#### Manage items with VS Code + +If developers prefer working in VS Code, they can use extensions to facilitate some of their work with Power BI via the VS Code application. + +There are several tools that they can use to manage different parts of Power BI from VS Code: + +- **[TMDL](https://github.com/microsoft/vscode-tmdl)**: An official VS Code extension from Microsoft that provides language support for the Tabular Model Definition Language (TMDL) to work with semantic models that use the TMDL metadata format. +- **[Power BI Studio](https://marketplace.visualstudio.com/items?itemName=GerhardBrueckl.powerbi-vscode)**: A community-developed VS Code extension that uses the Power BI REST API to view and manage items in a workspace. +- **[Power BI VSCode Extension Pack](https://marketplace.visualstudio.com/items?itemName=GerhardBrueckl.powerbi-vscode-extensionpack)**: A collection of VS Code extensions that enables developers to work in VS Code with Fabric and Power BI. It includes both the TMDL and Power BI Studio extensions. + +VS Code integration is also supported by other Fabric experiences, like [notebooks](/fabric/data-engineering/setup-vs-code-extension) for data engineering and data science, or for managing Power BI semantic models by using [semantic link](#semantic-link-integration-with-semantic-models-via-notebooks) (described earlier). + +### Python or R integration + +You can run Python or R scripts in Power BI semantic models and reports to extend the functionality of these items. This capability can be helpful for content creators who are familiar with Python or R, and who create and distribute content for business users via Power BI. + +Content owners or creators who are proficient in Python or R might benefit from using [notebook items](/fabric/data-engineering/how-to-use-notebook) in a Fabric capacity. For many use cases, notebooks are a preferred option over Python and R integration with Power BI. 
That's because notebooks provide more options to create and maintain solutions built in these languages, have fewer limitations, and typically involve less effort to support.
+
+#### Run Python or R code in a semantic model
+
+You can integrate Python or R code as part of the data transformations that you perform in a semantic model that uses import storage mode. This integration lets you transform data or perform advanced analytics with Python or R whenever you refresh the model.
+
+To refresh a published semantic model that uses Python or R integrated in Power Query, you must use an on-premises data gateway in [personal mode](../connect-data/service-gateway-personal-mode.md). That's because the Python or R code runs locally, using the Python or R installation on the machine. This setup is typically challenging to manage and maintain. If you need to use Python or R in a semantic model, we recommend alternative approaches, like notebooks in Fabric.
+
+#### Create Python or R visuals in Power BI reports
+
+You can integrate Python or R with Power BI reports to create custom visuals with Python libraries like Seaborn, or R packages like ggplot2. These visuals are fully customizable and support interactive features in Power BI like rendering a filtered result, cross-filtering, custom tooltips, drilldown, and drillthrough.
+
+Ensure that all your Python or R visuals use [Python libraries](../connect-data/service-python-packages-support.md#python-packages-that-are-supported-in-power-bi) and [R packages](../connect-data/service-r-packages-support.md#r-packages-that-are-supported-in-power-bi) that are supported in Fabric. If you use an unsupported library or package, the visual won't render in the Power BI service, even if it renders in your report in Power BI Desktop.
+
+While you can transform data and make calculations as part of a Python or R custom visual, it isn't recommended.
Placing this logic in the Python or R visual can result in slower render times, and makes it more difficult to maintain the visual and keep calculation logic consistent across visuals and reports.
+
+Instead, add your logic to DAX calculations by creating measures, and perform your transformations further upstream, such as in Power Query or the data source, if possible.
+
+### Custom visuals for Power BI reports
+
+There are other options for creating custom visuals in Power BI reports aside from Python and R. While not strictly integration, it's possible to use custom visuals in Power BI reports for advanced or specific use cases. You can [create your own custom visual](../developer/visuals/develop-power-bi-visuals.md), which requires no integration with other services, or obtain a visual from [AppSource](https://appsource.microsoft.com/marketplace/apps?product=power-bi-visuals&exp=kyyw&page=1&filters=pay-as-you-go), which can be free or [require a license](../developer/visuals/custom-visual-licenses.md). Depending on the custom visual, it might integrate with a third-party service, in which case you'll need to agree to that service's license terms.
+
+If you're thinking about using custom visuals to extend the functionality of Power BI reports, consider [Deneb](https://deneb-viz.github.io/). Deneb is a community-developed, [certified custom visual](../developer/visuals/power-bi-custom-visuals-certified.md) that allows you to use the declarative [JSON syntax](https://www.json.org/json-en.html) of the [Vega](http://vega.github.io/vega) or [Vega-Lite](http://vega.github.io/vega-lite) languages to build your own visualizations. Deneb has a large community and many templates, which makes it a good choice for report creators who want to create their own visuals without using JavaScript, Python, or R.
+
+### Integration with other third-party services
+
+There are other third-party services that offer integration with Power BI.
+
+The following section presents third-party services, together with use cases that are important to consider.
+
+#### Integration with semantic models via the XMLA endpoint
+
+In Power BI, external tools can connect to your Power BI semantic models by using the XMLA endpoint. There are both open source and commercially available tools that you can use to enhance productivity or extend the functionality of your existing semantic models.
+
+Here are some examples of tools that can integrate with semantic models via the XMLA endpoint:
+
+- [PowerShell cmdlets](/analysis-services/powershell/analysis-services-powershell-reference?view=power-bi-premium-current&preserve-view=true) to automate certain semantic model tasks.
+- [Power BI Report Builder](../paginated-reports/report-builder-power-bi.md) to query semantic models with DAX and build paginated reports.
+- [Tabular Editor](https://tabulareditor.com/), a third-party tool to develop and manage semantic models.
+- [DAX Studio](https://daxstudio.org/), a third-party tool to author and optimize DAX queries.
+- [ALM Toolkit](http://alm-toolkit.com/), a third-party tool to compare and deploy semantic models.
+
+For more information about XMLA endpoints and the client applications and tools that use them, see [Semantic model connectivity and management with the XMLA endpoint in Power BI](../enterprise/service-premium-connect-tools.md#client-applications-and-tools). The XMLA endpoint is only supported for workspaces that have their license mode set to Fabric capacity, Premium capacity, or Premium Per User.
+
+You can [enable the XMLA endpoint](../enterprise/troubleshoot-xmla-endpoint.md#enabling-the-xmla-endpoint) and set it to read or read/write from the Power BI workload options in the [admin portal](../enterprise/service-admin-premium-workloads.md#outbound-connectivity).
There are also [several tenant settings](../enterprise/service-premium-connect-tools.md#security) that you can use to control which users and groups can use the XMLA endpoint.
+
+:::image type="icon" source="media/common/checklist.svg" border="false":::
+
+**Checklist** - When planning to integrate Power BI with other services, key decisions and considerations include:
+
+- **Define the requirement**: Describe what you're trying to achieve and what benefit you expect from doing so.
+- **Describe why you can't accomplish the task in Power BI alone**: Define the challenges or limitations that prevent you from fulfilling this requirement with the built-in tools and features in Power BI.
+- **Identify the services that can help you fulfill the requirement**: Compile a list of the services that can help you achieve your objective. Depending on the requirement, there might be only one plausible option.
+- **Identify any potential risks, limitations, or considerations**: Carefully plan and consider the implications of this integration for different areas, like security, licensing, governance, and user enablement.
+- **Research how you'll set up the integration**: Read the appropriate technical documentation and compile a step-by-step plan for how you'll integrate Power BI with the service or tool in your specific scenario. Pay special attention to any troubleshooting or customization of the integration that you might need to do.
+- **Conduct a test or proof of concept (POC)**: Before you set up the integration for your tenant, workspace, or item, first perform a representative trial to test any assumptions and reveal any challenges or limitations.
+- **Set up training and monitoring**: Ensure that centralized teams are equipped to monitor the new service and its effect on usage in your tenant. Prepare relevant training material that helps people use the new service and avoid issues.
+ +## Related content + +For more considerations, actions, decision-making criteria, and recommendations to help you with Power BI implementation decisions, see [Power BI implementation planning](powerbi-implementation-planning-introduction.md). diff --git a/powerbi-docs/guidance/powerbi-implementation-planning-introduction.md b/powerbi-docs/guidance/powerbi-implementation-planning-introduction.md index 9f4b4e38d8..e0f5205408 100644 --- a/powerbi-docs/guidance/powerbi-implementation-planning-introduction.md +++ b/powerbi-docs/guidance/powerbi-implementation-planning-introduction.md @@ -1,14 +1,14 @@ --- title: "Power BI implementation planning" description: "An introduction to the Power BI implementation planning series of articles." -author: denglishbi +author: peter-myers ms.author: daengli -ms.reviewer: maroche +ms.reviewer: asaxton ms.service: powerbi ms.subservice: powerbi-resource ms.topic: conceptual ms.custom: fabric-cat, video-RWUWA9 -ms.date: 06/27/2024 +ms.date: 10/12/2024 --- # Power BI implementation planning @@ -34,15 +34,10 @@ When you implement Power BI, there are many subject areas to consider. The follo - [Security](powerbi-implementation-planning-security-overview.md) - [Information protection and data loss prevention](powerbi-implementation-planning-info-protection-data-loss-prevention-overview.md) - [Data gateways](powerbi-implementation-planning-data-gateways.md) -- Integration with other services +- [Integration with other services](powerbi-implementation-planning-integration-with-other-services.md) - [Auditing and monitoring](powerbi-implementation-planning-auditing-monitoring-overview.md) - [Adoption tracking](powerbi-implementation-planning-adoption-tracking.md) -> [!NOTE] -> The series is a work in progress. We will gradually release new and updated articles over time. -> -> In addition to these subject areas, managing your Fabric or Premium capacity usage is an important part of your Power BI implementation. 
It affects not only Power BI, but also the other experiences in Fabric. For information about how to manage your Fabric capacity, see [Manage your Fabric capacity](/fabric/admin/capacity-settings?tabs=power-bi-premium&preserve-view=true) and [Evaluate and optimize your Microsoft Fabric capacity](/fabric/enterprise/optimize-capacity). - ## Usage scenarios The series includes usage scenarios that illustrate different ways that creators and consumers can deploy and use Power BI: @@ -54,10 +49,7 @@ The series includes usage scenarios that illustrate different ways that creators ## Purpose -When completed, the series will: - -- Complement the [Fabric adoption roadmap](fabric-adoption-roadmap.md), which describes considerations for successful Microsoft Fabric and Power BI adoption and a healthy data culture. Power BI implementation planning guidance that correlates with the adoption roadmap goals will be added to this series. -- Replace the [Power BI adoption framework](https://github.com/pbiaf/powerbiadoption) (together with the [Fabric adoption roadmap](fabric-adoption-roadmap.md)), which is a lightweight set of resources (videos and presentation slides) that were designed to help Microsoft partners deploy Power BI solutions for their customers. Relevant adoption framework action items will be merged into this series. +This series complements the [Fabric adoption roadmap](fabric-adoption-roadmap.md), which describes considerations for successful Microsoft Fabric and Power BI adoption and a healthy data culture. Power BI implementation planning guidance that correlates with the adoption roadmap goals will be added to this series. ## Recommendations @@ -66,7 +58,7 @@ To set yourself up for success, we recommend that you work through the following 1. Read the complete [Fabric adoption roadmap](fabric-adoption-roadmap.md), familiarizing yourself with each roadmap subject area. 
Assess your current state of Fabric adoption, and gain clarity on the data culture objectives for your organization. 1. Explore Power BI implementation planning articles that are relevant to you. Start with the [Power BI usage scenarios](powerbi-implementation-planning-usage-scenario-overview.md), which convey how you can use Power BI in diverse ways. Be sure to understand which usage scenarios apply to your organization, and by whom. Also, consider how these usage scenarios might influence the implementation strategies you decide on. 1. Read the articles for each of the subject areas that are listed above. You might choose to initially do a broad review of the contents from top to bottom. Or you might choose to start with subject areas that are your highest priority. Carefully review the key decisions and actions that are included for each topic (at the end of each section). We recommend that you use them as a starting point for creating and customizing your plan. -1. When necessary, refer to [Power BI documentation](/power-bi/) for details on specific topics. +1. When necessary, refer to [Power BI documentation](/power-bi/) and [Fabric documentation](/fabric/) for details on specific topics. ## Target audience @@ -87,7 +79,7 @@ This series is certain to be helpful for organizations that are in their early s ## Acknowledgments -The Power BI implementation planning articles are written by Melissa Coates, Kurt Buhler, and Peter Myers. Matthew Roche, from the Fabric Customer Advisory Team, provides strategic guidance and feedback to the subject matter experts. +The Power BI implementation planning articles are written by [Melissa Coates](https://www.linkedin.com/in/melissacoatesprofile), [Kurt Buhler](https://www.linkedin.com/in/kurtbuhler), and [Peter Myers](https://www.linkedin.com/in/peterjsmyers). 
[Matthew Roche](https://www.linkedin.com/in/matthewroche), from the Fabric Customer Advisory Team, provides strategic guidance and feedback to the subject matter experts. ## Related content diff --git a/powerbi-docs/paginated-reports/paginated-capacity-planning.md b/powerbi-docs/paginated-reports/paginated-capacity-planning.md index a40d640124..dd0708778a 100644 --- a/powerbi-docs/paginated-reports/paginated-capacity-planning.md +++ b/powerbi-docs/paginated-reports/paginated-capacity-planning.md @@ -91,15 +91,15 @@ Run the report several times, and use the metrics app to get the average CPU sec ### Calculate the max report renders -Use this formula to calculate the maximum concurrent report renders that a capacity can handle, before it [overloads](/fabric/enterprise/throttling#track-overages-and-rejected-operations). To learn more about Capacity Units (CU), SKU and Power BI v-cores, refer to [Capacity](/fabric/enterprise/licenses#capacity). +Use this formula to calculate the maximum concurrent report renders that a capacity can handle, before it [overloads](/fabric/enterprise/throttling#track-overages-and-rejected-operations). -```$ \text {max concurrent report renders} = {\text {capacity units for your capacity} \times {30} \over \text {your report's CPU processing time (in seconds)} \times {8}} $``` +$ \text {max concurrent report renders} = {\text {number of capacity SKU cores} \times {30} \over \text {your report's CPU processing time (in seconds)}} $ ### Calculate the max number of users Using the estimated [five percent concurrency](#how-many-users-can-a-sku-handle) for the correlation between the number of total users, and the maximum concurrent renders, you can get the number of total users a SKU can handle. 
-```$ \text {max SKU users} = {\text {max concurrent report renders} \over 0.05} $``` +$ \text {max SKU users} = {\text {max concurrent report renders} \over 0.05} $ ### Calculate capacity resources for multiple reports @@ -107,7 +107,7 @@ You can use an extended formula to estimate the capacity needed for different re Upload several paginated reports with different number of daily renders, and use the metrics app to get the average CPU processing time for each one. The sum of all your report renders per day should be equal to 100%. When you have all the information, use this formula. -```$ \text {max concurrent report renders} = {\text {capacity units for your capacity} \times {30} \over {8} \times {{\text {A renders} \times \text {A processing time}} + \text {B renders} \times \text {B processing time} + \text {...} + \text{N renders} \times \text{N processing time}}}$``` +$ \text {max concurrent report renders} = {\text {number of capacity SKU cores} \times {30} \over {\text {A renders} \times \text {A processing time}} + \text {B renders} \times \text {B processing time} + \text {...} + \text{N renders} \times \text{N processing time}}$ ## Examples @@ -117,11 +117,11 @@ This section includes two examples, one for the [regular calculation](#regular-c Let’s assume that you're running a paginated report on an *F64* or *P1* SKU that has eight cores. The total CPU usage for 10 runs is 40 seconds, so the average CPU time per reports is four seconds. -```$ 60 = {8 \times {30} \over 4} $``` +$ 60 = {8 \times {30} \over 4} $ When using the second formula, you get a maximum of 1,200 users. -```$ 1,200 = {60 \over 0.05} $``` +$ 1,200 = {60 \over 0.05} $ For *F128* or *P2* SKUs, you can multiply these numbers by two, as the capacity has twice the number of CPU cores. 
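As an illustrative sketch only (not part of the product guidance), the formulas in this article can be expressed in a few lines of Python. The core count and CPU times below reuse the example values from this article; substitute your own measurements from the metrics app:

```python
def max_concurrent_renders(sku_cores: int, cpu_seconds_per_render: float) -> float:
    """Maximum concurrent paginated report renders before the capacity overloads."""
    return sku_cores * 30 / cpu_seconds_per_render

def max_concurrent_renders_mixed(sku_cores: int, render_mix: list[tuple[float, float]]) -> float:
    """Extended formula for multiple reports.

    render_mix holds (share of daily renders, CPU seconds per render) pairs;
    the shares should sum to 1 (100% of daily renders).
    """
    weighted_cpu = sum(share * cpu_seconds for share, cpu_seconds in render_mix)
    return sku_cores * 30 / weighted_cpu

def max_users(concurrent_renders: float, concurrency: float = 0.05) -> float:
    """Total users a SKU can handle, assuming five percent concurrency."""
    return concurrent_renders / concurrency

# F64/P1 example from this article: 8 cores, 4 CPU seconds per render
renders = max_concurrent_renders(8, 4)
print(renders)             # 60.0
print(max_users(renders))  # 1200.0
```

For an F128/P2 capacity, doubling `sku_cores` doubles both results, matching the guidance above.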
diff --git a/powerbi-docs/support/service-tips-for-finding-help.md b/powerbi-docs/support/service-tips-for-finding-help.md index 548b1d0c04..355ff3acc9 100644 --- a/powerbi-docs/support/service-tips-for-finding-help.md +++ b/powerbi-docs/support/service-tips-for-finding-help.md @@ -51,7 +51,6 @@ Do videos fit your learning style better? Power BI has two sets: The training options available are nearly endless, from in-person lab training to short videos. - [Microsoft Learn training for Power BI.](/training/powerplatform/power-bi?WT.mc_id=powerbi_landingpage-docs-link) -- [Free Power BI webinars,](../fundamentals/webinars.md) live and on-demand. You can find more options online, such as: diff --git a/powerbi-docs/transform-model/dataflows/dataflows-develop-solutions.md b/powerbi-docs/transform-model/dataflows/dataflows-develop-solutions.md index ca9a7dab27..35a47b9d87 100644 --- a/powerbi-docs/transform-model/dataflows/dataflows-develop-solutions.md +++ b/powerbi-docs/transform-model/dataflows/dataflows-develop-solutions.md @@ -6,12 +6,12 @@ ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows -ms.topic: how-to +ms.topic: concept-article ms.date: 11/10/2023 LocalizationGroup: Data from files +#customer intent: As a Power BI user I want to learn how to use data flows in Power BI. --- - # Develop solutions with dataflows Power BI *dataflows* are an enterprise-focused data prep solution that enables an ecosystem of data that's ready for consumption, reuse, and integration. This article presents some common scenarios, links to articles, and other information to help you understand and use dataflows to their full potential. @@ -36,7 +36,7 @@ Getting access to these [Premium features of dataflows](dataflows-premium-featur You can't consume PPU dataflows (or any other content) outside the PPU environment (such as in Premium or other SKUs or licenses). 
-For Premium capacities, your consumers of dataflows in Power BI Desktop don't need explicit licenses to consume and publish to Power BI. But to publish to a workspace or share a resulting semantic model, you'll need at least a Pro license. +For Premium capacities, your consumers of dataflows in Power BI Desktop don't need explicit licenses to consume and publish to Power BI. But to publish to a workspace or share a resulting semantic model, you need at least a Pro license. For PPU, everyone who creates or consumes PPU content must have a PPU license. This requirement varies from the rest of Power BI in that you need to explicitly license everyone with PPU. You can't mix Free, Pro, or even Premium capacities with PPU content unless you migrate the workspace to a Premium capacity. @@ -100,7 +100,7 @@ Imagine you have a dataflow that's large, but you want to build semantic models ### Solution: Use DirectQuery dataflows -DirectQuery can be used whenever a workspace's enhanced compute engine (ECE) setting is configured explicitly to **On**. This setting is helpful when you have data that doesn't need to be loaded directly into a Power BI model. If you're configuring the ECE to be **On** for the first time, the changes that allow DirectQuery will occur during the next refresh. You'll need to refresh it when you enable it to have changes take place immediately. Refreshes on the initial dataflow load can be slower because Power BI writes data to both storage and a managed SQL engine. +DirectQuery can be used whenever a workspace's enhanced compute engine (ECE) setting is configured explicitly to **On**. This setting is helpful when you have data that doesn't need to be loaded directly into a Power BI model. If you're configuring the ECE to be **On** for the first time, the changes that allow DirectQuery will occur during the next refresh. You need to refresh it when you enable it to have changes take place immediately. 
Refreshes on the initial dataflow load can be slower because Power BI writes data to both storage and a managed SQL engine.

To summarize, by using DirectQuery with dataflows enables the following enhancements to your Power BI and dataflows processes:

@@ -139,7 +139,7 @@ Imagine you run a query on the source system, but you don't want to provide dire

### Solution 1: Use a view for the query or optimize the query

-By using an optimized data source and query is your best option. Often, the data source operates best with queries intended for it. Power Query has advanced query-folding capabilities to delegate these workloads. Power BI also provides step-folding indicators in Power Query Online. Read more about types of indicators in the [step-folding indicators documentation](/power-query/step-folding-indicators).
+Using an optimized data source and query is your best option. Often, the data source operates best with queries intended for it. Power Query has advanced query-folding capabilities to delegate these workloads. Power BI also provides step-folding indicators in Power Query Online. Read more about types of indicators in the [step-folding indicators documentation](/power-query/step-folding-indicators).

### Solution 2: Use Native Query

@@ -151,7 +151,7 @@ By breaking a dataflow into separate ingestion and consumption dataflows, you ca

## Ensure customers use dataflows whenever possible

-Imagine you have many dataflows that serve common purposes, such as conformed dimensions like customers, data tables, products, and geographies. Dataflows are already available in the ribbon for Power BI. Ideally, you want customers to primarily use the dataflows you've created.
+Imagine you have many dataflows that serve common purposes, such as conformed dimensions like customers, data tables, products, and geographies. Dataflows are already available in the ribbon for Power BI. Ideally, you want customers to primarily use the dataflows you created.
### Solution: Use endorsement to certify and promote dataflows diff --git a/powerbi-docs/transform-model/dataflows/dataflows-premium-workload-configuration.md b/powerbi-docs/transform-model/dataflows/dataflows-premium-workload-configuration.md index fc3e9e2328..f53745cc2f 100644 --- a/powerbi-docs/transform-model/dataflows/dataflows-premium-workload-configuration.md +++ b/powerbi-docs/transform-model/dataflows/dataflows-premium-workload-configuration.md @@ -1,15 +1,16 @@ --- title: Configure Power BI Premium dataflow workloads -description: How to configure Power BI Premium for dataflow workloads +description: Learn how to configure Power BI Premium for dataflow workloads, including enabling dataflows, refining settings, and optimizing performance. author: luitwieler ms.author: jeluitwi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 03/27/2022 +ms.date: 10/07/2024 ms.custom: references_regions LocalizationGroup: Data from files +#customer intent: As a Power BI user I want to learn how to configure dataflow workloads in Power BI Premium. --- # Configure Power BI Premium dataflow workloads @@ -23,13 +24,13 @@ The first requirement for using dataflows in your Power BI premium subscription ![Admin portal for dataflows in Power BI premium](media/dataflows-premium-workload-configuration/dataflows-premium-workloads-01.png) -After enabling the dataflows workload, it is configured with default settings. You might want to tweak these settings as you see fit. Next, we'll describe where these settings live, describe each, and help you understand when you might want to change the values to optimize your dataflow performance. +After enabling the dataflows workload, it's configured with default settings. You might want to tweak these settings as you see fit. Next, we describe where these settings live, describe each, and help you understand when you might want to change the values to optimize your dataflow performance. 
## Refining dataflow settings in Premium

-Once dataflows are enabled, you can use the **Admin portal** to change, or refine, how dataflows are created and how they use resources in your Power BI Premium subscription. Power BI Premium doesn't require memory settings to be changed. Memory in Power BI Premium is automatically managed by the underlying system. The following steps show how to adjust your dataflow settings.
+Once dataflows are enabled, you can use the **Admin portal** to change, or refine, how dataflows are created and how they use resources in your Power BI Premium subscription. Power BI Premium doesn't require memory settings to be changed. Memory in Power BI Premium is automatically managed by the underlying system. The following steps show how to adjust your dataflow settings.

-1. In the **Admin portal**, select **Tenant settings** to list all capacities that have been created. Select a capacity to manage its settings.
+1. In the **Admin portal**, select **Tenant settings** to list all capacities created. Select a capacity to manage its settings.

   ![Select a capacity to manage settings](media/dataflows-premium-workload-configuration/dataflows-premium-workloads-02.png)

@@ -37,13 +38,13 @@ Once dataflows are enabled, you can use the **Admin portal** to change, or refin

   ![Change the size of a capacity](media/dataflows-premium-workload-configuration/dataflows-premium-workloads-03.png)

-#### Premium capacity SKUs - scale up the hardware
+### Premium capacity SKUs - scale up the hardware

Power BI Premium workloads use v-cores to serve fast queries across the various workload types. [Capacities and SKUs](../../enterprise/service-premium-what-is.md#capacities-and-skus) includes a chart that illustrates the current specifications across each of the available workload offerings. Capacities of A3 and greater can take advantage of the compute engine, so when you want to use the enhanced compute engine, start there.
#### Enhanced compute engine - an opportunity to improve performance

-The [enhanced compute engine](dataflows-premium-features.md#the-enhanced-compute-engine) is an engine that can accelerate your queries. Power BI uses a compute engine to process your queries and refresh operations. The enhanced compute engine is an improvement over the standard engine, and works by loading data to a SQL Cache and uses SQL to accelerate table transformation, refresh operations and enables DirectQuery connectivity. When configured to **On** or **Optimized** for computed entities, if your business logic allows for it, Power BI uses SQL speed up the performance. Having the engine **On** also provides for DirectQuery connectivity. Make sure your dataflow usage is leveraging the enhanced compute engine properly. Users can configure the enhanced compute engine to be on, optimized, or off on a per-dataflow basis.
+The [enhanced compute engine](dataflows-premium-features.md#the-enhanced-compute-engine) is an engine that can accelerate your queries. Power BI uses a compute engine to process your queries and refresh operations. The enhanced compute engine is an improvement over the standard engine, and works by loading data to a SQL cache, using SQL to accelerate table transformations and refresh operations, and enabling DirectQuery connectivity. When configured to **On** or **Optimized** for computed entities, if your business logic allows for it, Power BI uses SQL to speed up performance. Having the engine **On** also provides for DirectQuery connectivity. Make sure your dataflows use the enhanced compute engine properly. Users can configure the enhanced compute engine to be on, optimized, or off on a per-dataflow basis.

> [!NOTE]
> The enhanced compute engine is not yet available in all regions.
@@ -56,9 +57,9 @@ This section provides guidance for common scenarios when using dataflow workload

Slow refresh times are usually a parallelism issue.
You should review the following options, in order: -1. A key concept for slow refresh times is the nature of your data preparation. Whenever you can optimize your slow refresh times by taking advantage of your data source actually doing the preparation and performing upfront query logic, you should do so. Specifically, when using a relational database such as SQL as your source, see if the initial query can be run on the source, and use that source query for your initial extraction dataflow for the data source. If you cannot use a native query in the source system, perform operations that the dataflows [engine can fold to the data source](/power-query/power-query-folding). +1. A key concept for slow refresh times is the nature of your data preparation. Whenever you can optimize your slow refresh times by taking advantage of your data source actually doing the preparation and performing upfront query logic, you should do so. Specifically, when using a relational database such as SQL as your source, see if the initial query can be run on the source, and use that source query for your initial extraction dataflow for the data source. If you can't use a native query in the source system, perform operations that the dataflows [engine can fold to the data source](/power-query/power-query-folding). -2. Evaluate spreading out refresh times on the same capacity. Refresh operations are a process that requires significant compute. Using our restaurant analogy, spreading out refresh times is akin to limiting the number of guests in your restaurant. Just as restaurants will schedule guests and plan for capacity, you also want to consider refresh operations during times when usage is not at its full peak. This can go a long way toward alleviating strain on the capacity. +2. Evaluate spreading out refresh times on the same capacity. Refresh operations are a process that requires significant compute. 
Using our restaurant analogy, spreading out refresh times is akin to limiting the number of guests in your restaurant. Just as restaurants schedule guests and plan for capacity, you also want to consider refresh operations during times when usage isn't at its full peak. This can go a long way toward alleviating strain on the capacity. If the steps in this section don't provide the desired degree of parallelism, consider upgrading your capacity to a higher SKU. Then follow the previous steps in this sequence again. @@ -82,12 +83,12 @@ Take the following steps when investigating scenarios where the Compute engine i 1. Limit computed and linked entities that exist across workspace. -2. When you perform your initial refresh with the compute engine turned on, then data gets written in the lake and in the cache. This double write means these refreshes will be slower. - -3. If you have a dataflow linking to multiple dataflows, make sure you schedule refreshes of the source dataflows so that they do not all refresh at the same time. +2. When you perform your initial refresh with the compute engine turned on, the data gets written in the lake and in the cache. This double write means refreshes are slower. +3. If you have a dataflow linking to multiple dataflows, make sure you schedule refreshes of the source dataflows so that they don't all refresh at the same time. 
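As a small illustrative sketch (the dataflow names and off-peak window here are hypothetical, not from the product), spreading refresh start times evenly across a window so that source dataflows don't all refresh at once might look like this:

```python
from datetime import datetime, timedelta

def stagger_refreshes(dataflow_names, window_start, window_hours):
    """Spread dataflow refresh start times evenly across an off-peak window."""
    step = timedelta(hours=window_hours) / max(len(dataflow_names), 1)
    return {name: window_start + i * step for i, name in enumerate(dataflow_names)}

# Hypothetical dataflows refreshed across a 6-hour overnight window starting at 01:00
schedule = stagger_refreshes(["Customers", "Products", "Geographies"],
                             datetime(2024, 10, 7, 1, 0), 6)
for name, start in schedule.items():
    print(f"{name}: {start:%H:%M}")  # 01:00, 03:00, 05:00
```

You would then apply the computed start times as the scheduled refresh times in the Power BI service.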
## Related content + The following articles provide more information about dataflows and Power BI: * [Introduction to dataflows and self-service data prep](dataflows-introduction-self-service.md) diff --git a/powerbi-docs/transform-model/dataflows/dataflows-streaming.md b/powerbi-docs/transform-model/dataflows/dataflows-streaming.md index 5a031d1227..6883da990e 100644 --- a/powerbi-docs/transform-model/dataflows/dataflows-streaming.md +++ b/powerbi-docs/transform-model/dataflows/dataflows-streaming.md @@ -7,27 +7,23 @@ ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 11/10/2023 +ms.date: 10/07/2024 LocalizationGroup: Data from files +#customer intent: As a Power BI user I want to learn how to use streaming dataflows in Power BI. --- # Streaming dataflows (preview) Organizations want to work with data as it comes in, not days or weeks later. The vision of Power BI is simple: the distinctions between batch, real-time, and streaming should disappear. Users should be able to work with all data as soon as it's available. - -> [!IMPORTANT] -> Streaming dataflows has been retired, and is no longer available. [Azure Stream Analytics](/azure/stream-analytics/no-code-stream-processing) has merged the functionality of streaming dataflows. For more information about the retirement of streaming dataflows, see the [retirement announcement](https://powerbi.microsoft.com/en-us/blog/announcing-the-retirement-of-streaming-dataflows/). - Analysts usually need technical help to deal with streaming data sources, data preparation, complex time-based operations, and real-time data visualization. IT departments often rely on custom-built systems, and a combination of technologies from various vendors, to perform timely analyses on the data. Without this complexity, they can't provide decision makers with information in near real time. 
-*Streaming dataflows* allow authors to connect to, ingest, mash up, model, and build reports based on streaming in near real-time data directly in the Power BI service. The service enables drag-and-drop, no-code experiences.
-
+*Streaming dataflows* allow authors to connect to, ingest, mash up, model, and build reports based on streaming data in near real time, directly in the Power BI service. The service enables drag-and-drop, no-code experiences. You can mix and match streaming data with batch data if you need to, through a user interface (UI) that includes a *diagram view* for easy data mashup. The final item produced is a dataflow, which can be consumed in real time to create highly interactive, near real-time reporting. All of the data visualization capabilities in Power BI work with streaming data, just as they do with batch data.
+> [!IMPORTANT]
+> Streaming dataflows has been retired, and is no longer available. [Azure Stream Analytics](/azure/stream-analytics/no-code-stream-processing) has merged the functionality of streaming dataflows. For more information about the retirement of streaming dataflows, see the [retirement announcement](https://powerbi.microsoft.com/en-us/blog/announcing-the-retirement-of-streaming-dataflows/).
:::image type="content" source="media/dataflows-streaming/dataflows-streaming-01.png" alt-text="Diagram showing an example of mixed streaming and batch data in a simple workflow that creates real-time reports in Power BI.":::
- Users can perform data preparation operations like joins and filters. They can also perform time-window aggregations (such as tumbling, hopping, and session windows) for group-by operations.
- Streaming dataflows in Power BI empower organizations to:

* Make confident decisions in near real time. Organizations can be more agile and take meaningful actions based on the most up-to-date insights.
@@ -114,7 +110,7 @@ When streaming dataflows detect the fields, you can see them in the list.
There'
You can always edit the field names, or remove or change the data type, by selecting more options (**...**) next to each field. You can also expand, select, and edit any nested fields from the incoming messages, as shown in the following image.

-:::image type="content" source="media/dataflows-streaming/dataflows-streaming-07.png" alt-text="Screenshot that shows remove, rename and data type options for input data.":::
+:::image type="content" source="media/dataflows-streaming/dataflows-streaming-07.png" alt-text="Screenshot that shows remove, rename, and data type options for input data.":::

### Azure IoT Hub

@@ -365,7 +361,7 @@ You can also see the details of a specific record (a "cell" in the table) by sel

### Static preview for transformations and outputs

-After you add and set up any steps in the diagram view, you can test their behavior by selecting the static data button.
+After you add and set up any steps in the diagram view, you can test their behavior by selecting the static data button.

:::image type="icon" source="media/dataflows-streaming/dataflows-streaming-26.png":::

After you do, streaming dataflows evaluate all transformation and outputs that a

:::image type="content" source="media/dataflows-streaming/dataflows-streaming-27.png" alt-text="Screenshot that shows a static data preview with the refresh and hide options highlighted.":::

-You can refresh the preview by selecting **Refresh static preview** (1). When you do this, streaming dataflows take new data from the input and evaluate all transformations and outputs again with any updates that you might have performed. The **Show or Hide details** option is also available (2).
+You can refresh the preview by selecting **Refresh static preview** (1). When you do this, streaming dataflows take new data from the input and evaluate all transformations and outputs again with any updates that you might have made. The **Show or Hide details** option is also available (2).
### Authoring errors diff --git a/powerbi-docs/transform-model/datamarts/datamarts-access-control.md b/powerbi-docs/transform-model/datamarts/datamarts-access-control.md index 994c71af83..07b67ab841 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-access-control.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-access-control.md @@ -1,14 +1,16 @@ --- title: Control access to datamarts (preview) -description: Control who can access and use datamarts +description: Learn how to control access to datamarts in Power BI, including setting workspace roles, viewer restrictions, and configuring row-level security. author: davidiseminger ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 11/10/2023 +ms.date: 09/30/2024 LocalizationGroup: Data from files +ms.custom: FY25Q1-Linter +#customer intent: As a Power BI user I want to learn how to control access to datamarts. --- # Control access to datamarts @@ -16,7 +18,8 @@ LocalizationGroup: Data from files This article describes access control to datamarts, including row level security, rules in Power BI Desktop, and how datamarts might become inaccessible or unavailable. ## Workspace roles -Assigning users to the various workspace roles provides the following capabilities with respect to Datamarts: + +Assigning users to the various workspace roles provides the following capabilities with respect to Datamarts: | Workspace role | Description | |---|---| @@ -27,74 +30,72 @@ Assigning users to the various workspace roles provides the following capabiliti ## Viewer restrictions -The Viewer role is a more limited role in comparison with the other workspace roles. In addition to fewer SQL permissions given to viewers, there are more actions they are restricted from performing. +The Viewer role is a more limited role in comparison with the other workspace roles. In addition to fewer SQL permissions given to viewers, there are more restricted actions. 
| Feature | Limitation |
|---|---|
-|**Settings**|Viewers have read-only access, so they cannot rename datamart, add description, or change sensitivity label.|
+|**Settings**|Viewers have read-only access, so they can't rename the datamart, add a description, or change the sensitivity label.|
|**Model view**|Viewers have read-only mode on the Model view.|
-|**Run queries**|Viewers do not have full DML/DDL capabilities unless granted specifically. Viewers can read data using SELECT statement in SQL query editor and use all tools in the toolbar in the Visual query editor. Viewers can also read data from Power BI Desktop and other SQL client tools.|
-|**Analyze in Excel**|Viewers do not have permission to Analyze in Excel.|
-|**Manually update semantic model**|Viewers cannot manually update the default semantic model to which the datamart is connected.|
-|**New measure**|Viewers do not have permission to create measures.|
-|**Lineage view**|Viewers do not have access to reading the lineage view chart.|
-|**Share/Manage permissions**|Viewers do not have permission to share datamarts with others.|
-|**Create a report**|Viewers do not have access to create content within the workspace and hence cannot build reports on top of the datamart.|
-
+|**Run queries**|Viewers don't have full DML/DDL capabilities unless granted specifically. Viewers can read data using a SELECT statement in the SQL query editor and use all tools in the toolbar in the Visual query editor.
Viewers can also read data from Power BI Desktop and other SQL client tools.|
+|**Analyze in Excel**|Viewers don't have permission to Analyze in Excel.|
+|**Manually update semantic model**|Viewers can't manually update the default semantic model to which the datamart is connected.|
+|**New measure**|Viewers don't have permission to create measures.|
+|**Lineage view**|Viewers don't have access to read the lineage view chart.|
+|**Share/Manage permissions**|Viewers don't have permission to share datamarts with others.|
+|**Create a report**|Viewers don't have access to create content within the workspace and hence can't build reports on top of the datamart.|
## Row level security

Row-level security (RLS) can be used to restrict data access for specified users to a datamart. Filters restrict data access at the row level, and you can define filters within roles. In the Power BI service, members of a workspace have access to datamarts in the workspace, and RLS doesn't restrict such data access.

-You can configure RLS for datamarts in the **Datamart editor**. The configured RLS on datamarts automatically gets applied to downstream items, including the auto-generated semantic models and reports.
+You can configure RLS for datamarts in the **Datamart editor**. The configured RLS on datamarts automatically gets applied to downstream items, including the auto-generated semantic models and reports.

> [!NOTE]
> Datamarts use the enhanced row-level security editor, which means that not all row-level security filters supported in Power BI can be defined. Limitations include expressions that today can only be defined using DAX including dynamic rules such as USERNAME() or USERPRINCIPALNAME(). To define roles using these filters switch to use the DAX editor.

### Define Row Level Security (RLS) roles and rules for Datamarts
-
To define RLS roles, take the following steps:

-1. Open your datamart and select **Manage Roles** from the ribbon.
- :::image type="content" source="media/datamarts-access-control/datamarts-access-control-02.png" alt-text="Screenshot of the manage roles ribbon button."::: +1. Open your datamart and select **Manage Roles** from the ribbon. + :::image type="content" source="media/datamarts-access-control/datamarts-access-control-02.png" alt-text="Screenshot of the manage roles ribbon button." lightbox="media/datamarts-access-control/datamarts-access-control-02.png"::: -2. Create new RLS roles using the **Row security settings** window. You can define a combination of filters on tables and select **Save** to save the role. - :::image type="content" source="media/datamarts-access-control/datamarts-access-control-03.png" alt-text="Screenshot of the row security settings window."::: +2. Create new RLS roles using the **Row security settings** window. You can define a combination of filters on tables and select **Save** to save the role. + :::image type="content" source="media/datamarts-access-control/datamarts-access-control-03.png" alt-text="Screenshot of the row security settings window." lightbox="media/datamarts-access-control/datamarts-access-control-03.png"::: -3. Once the role is saved, select **Assign** to add users to the role. Once assigned, select **Save** to save the role assignments and close the RLS settings modal. +3. Once the role is saved, select **Assign** to add users to the role. Once assigned, select **Save** to save the role assignments and close the RLS settings modal. :::image type="content" source="media/datamarts-access-control/datamarts-access-control-04.png" alt-text="Screenshot of the row security settings selections."::: To validate the roles created, take the following steps: -1. Select the **View as** button from the ribbon. - :::image type="content" source="media/datamarts-access-control/datamarts-access-control-05.png" alt-text="Screenshot of the view as ribbon button."::: +1. Select the **View as** button from the ribbon. 
+ :::image type="content" source="media/datamarts-access-control/datamarts-access-control-05.png" alt-text="Screenshot of the view as ribbon button." lightbox="media/datamarts-access-control/datamarts-access-control-05.png"::: -2. Select the role to be validated by selecting the check box for the role, then select **OK**. +2. Select the role to be validated by selecting the check box for the role, then select **OK**. :::image type="content" source="media/datamarts-access-control/datamarts-access-control-06.png" alt-text="Screenshot of the manage view as role window."::: -3. The data view shows the access that the selected role has. - :::image type="content" source="media/datamarts-access-control/datamarts-access-control-07.png" alt-text="Screenshot of the view as results."::: +3. The data view shows the access that the selected role has. + :::image type="content" source="media/datamarts-access-control/datamarts-access-control-07.png" alt-text="Screenshot of the view as results." lightbox="media/datamarts-access-control/datamarts-access-control-07.png"::: To revert to your access, select the **View as** button on the ribbon again, and select **None**. :::image type="content" source="media/datamarts-access-control/datamarts-access-control-08.png" alt-text="Screenshot of the view as role window with none selected."::: - ## How datamarts become unavailable A datamart can get marked as an unavailable datamart when one of the following situations occurs. **Situation 1:** When a Premium workspace is changed from Premium to non-premium, all datamarts in that workspace become unavailable. The **Datamart editor** becomes unavailable and downstream usage of the datamart and auto-generated semantic models is blocked. Users or administrators must upgrade the workspace to its original Premium capacity to restore datamarts. 
-**Situation 2:** When dataflow updates a datamart and associated semantic model, but due to a system lock the datamart or semantic model update is pending, the datamart becomes unavailable. The **Datamart editor** isn't accessible when a datamart goes into unavailable state. The **try again** action, shown in the following image, enables users to trigger synchronization between dataflow, datamart and semantic model. It may take a few minutes to complete the requested action but downstream consumption can be continued. +**Situation 2:** When a dataflow updates a datamart and its associated semantic model, but a system lock leaves the datamart or semantic model update pending, the datamart becomes unavailable. The **Datamart editor** isn't accessible when a datamart goes into an unavailable state. The **try again** action, shown in the following image, enables users to trigger synchronization between the dataflow, datamart, and semantic model. It can take a few minutes to complete the requested action, but downstream consumption can continue. :::image type="content" source="media/datamarts-access-control/datamarts-access-control-01.png" alt-text="Screenshot of the request access setting."::: -**Situation 3:** When a Premium workspace is migrated to another Premium capacity in a different region, the datamart will become unavailable with the error: "Unable to open the datamart because the workspace region has changed. +**Situation 3:** When a Premium workspace is migrated to another Premium capacity in a different region, the datamart becomes unavailable with the error: "Unable to open the datamart because the workspace region has changed. 
To open the datamart, reconnect the workspace to the region connected when the datamart was created." This behavior is by design, since the region where the datamarts were created must be the region where the workspace resides, and migrations aren't supported. ## Related content -This article provided information about controlling access to datamarts. + +This article provided information about controlling access to datamarts. The following articles provide more information about datamarts and Power BI: @@ -105,8 +106,7 @@ The following articles provide more information about datamarts and Power BI: * [Create reports with datamarts](datamarts-create-reports.md) * [Datamart administration](datamarts-administration.md) - For more information about dataflows and transforming data, see the following articles: + * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) * [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) - diff --git a/powerbi-docs/transform-model/datamarts/datamarts-administration.md b/powerbi-docs/transform-model/datamarts/datamarts-administration.md index cdd3dae443..cfabf936bc 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-administration.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-administration.md @@ -1,21 +1,23 @@ --- title: Administration of datamarts (preview) -description: Manage and administer datamarts +description: Learn how to manage and administer datamarts in Power BI, including enabling datamarts, tracking usage, viewing audit logs, and understanding limitations. author: davidiseminger ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 11/10/2023 +ms.date: 10/07/2024 LocalizationGroup: Data from files +#customer intent: As a Power BI user, I want to learn how to manage and administer datamarts in Power BI. 
--- # Administration of datamarts -You can administer the use and settings for datamarts just like you can administer other aspects of Power BI. This article describes and explains how to administer your datamarts, and where to find the settings. +You can administer the use and settings for datamarts just like you administer other aspects of Power BI. This article describes how to administer your datamarts and where to find the settings. ## Enabling datamarts in the admin portal + Power BI administrators can enable or disable datamart creation for the entire organization or for specific security groups, using the setting found in the Power BI **admin portal**, as shown in the following image. :::image type="content" source="media/datamarts-administration/datamarts-administration-06.png" alt-text="Screenshot of the admin portal to enable or disable datamarts."::: @@ -26,42 +28,40 @@ In the Power BI admin portal, you can review a list of datamarts along with all :::image type="content" source="media/datamarts-administration/datamarts-administration-02.png" alt-text="Screenshot of the admin portal to track datamarts."::: - Existing Power BI admin APIs for getting workspace information work for datamarts as well, such as *GetGroupsAsAdmin* and the workspace scanner API. Such APIs enable you, as the Power BI service administrator, to retrieve datamart metadata along with other Power BI item information, so you can monitor workspace usage and generate relevant reports. ### Viewing audit logs and activity events -Power BI administrators can audit datamart operations from the **Microsoft 365 Admin Center**. Audit operations supported on datamarts are the following: +Power BI administrators can audit datamart operations from the **Microsoft 365 Admin Center**. The following audit operations are supported on datamarts: * Create * Rename * Update * Delete -* Refresh +* Refresh * View To get audit logs, complete the following steps: -1. 
Sign in to the Power BI admin portal as the administrator and navigate to **Audit logs**. -2. In the **Audit logs** section, select the button to go to **Microsoft 365 Admin Center** +1. Sign in to the Power BI admin portal as the administrator and navigate to **Audit logs**. +2. In the **Audit logs** section, select the button to go to the **Microsoft 365 Admin Center**. :::image type="content" source="media/datamarts-administration/datamarts-administration-03.png" alt-text="Screenshot of the admin portal to view audit logs."::: -3. Get audit events by applying search criteria. +3. Get audit events by applying search criteria. :::image type="content" source="media/datamarts-administration/datamarts-administration-04.png" alt-text="Screenshot of the Microsoft 365 admin center audit section."::: -4. Export audit logs and apply filter for datamart operations. +4. Export audit logs and apply a filter for datamart operations. :::image type="content" source="media/datamarts-administration/datamarts-administration-05.png" alt-text="Screenshot of the Microsoft 365 admin center audit search criteria."::: - ### Using REST APIs for activity events Administrators can export activity events on datamarts by using existing supported REST APIs. The following articles provide information about the APIs: + * [Admin - Get Activity Events - REST API (Power BI REST APIs)](/rest/api/power-bi/admin/get-activity-events) * [Track user activities in Power BI](/power-bi/admin/service-admin-auditing) ## Capacity utilization and reporting -Datamart CPU usage is free during preview, including datamarts and queries on SQL endpoints of a datamart. Autogenerated semantic model usage is reported for throttling and autoscaling. To avoid incurring costs during the preview period, consider using a Premium Per User (PPU) trial workspace. - +Datamart CPU usage is free during preview, including datamarts and queries on SQL endpoints of a datamart. 
Autogenerated semantic model usage is reported for throttling and autoscaling. To avoid incurring costs during the preview period, consider using a Premium Per User (PPU) trial workspace. ## Considerations and limitations @@ -69,25 +69,24 @@ The following limitations should be considered when using datamarts: * Datamarts aren't currently supported in the following Power BI SKUs: EM1/EM2 and EM3. * Datamarts aren't available in workspaces that are bound to an Azure Data Lake Gen2 storage account. -* Datamarts aren't available in sovereign or government clouds. -* Datamart extract, transform and load (ETL) operations can currently only run for up to 24 hours -* Datamarts currently officially support data volumes of up to 100 GB. +* Datamarts aren't available in sovereign or government clouds. +* Datamart extract, transform, and load (ETL) operations can currently only run for up to 24 hours. +* Datamarts officially support data volumes of up to 100 GB. * Currently datamarts don’t support the currency data type, and such data types are converted to float. * Data sources behind a VNET or using private links can't currently be used with datamarts; to work around this limitation you can use an on-premises data gateway. * Datamarts use port 1948 for connectivity to the SQL endpoint. Port 1433 needs to be open for datamarts to work. * Datamarts only support Microsoft Entra ID and do *not* support managed identities or service principals at this time. * Beginning February 2023, datamarts support any SQL client. * Datamarts aren't currently available in the following Azure regions: - * West India - * UAE Central - * Poland - * Israel - * Italy +  * West India +  * UAE Central +  * Poland +  * Israel +  * Italy Datamarts are supported in all other Azure regions. - -**Datamart connectors in Premium workspaces** +## Datamart connectors in Premium workspaces Some connectors aren't supported for datamarts (or dataflows) in Premium workspaces. 
When using an unsupported connector, you may receive the following error: *Expression.Error: The import "<connector name>" matches no exports. Did you miss a module reference?* @@ -116,10 +115,9 @@ The following connectors aren't supported for dataflows and datamarts in Premium The use of the previous list of connectors with dataflows or datamarts is only supported in workspaces that aren't Premium. - - ## Related content -This article provided information about the administration of datamarts. + +This article provided information about the administration of datamarts. The following articles provide more information about datamarts and Power BI: @@ -131,5 +129,6 @@ The following articles provide more information about datamarts and Power BI: * [Access control in datamarts](datamarts-access-control.md) For more information about dataflows and transforming data, see the following articles: + * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) * [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) diff --git a/powerbi-docs/transform-model/datamarts/datamarts-analyze.md b/powerbi-docs/transform-model/datamarts/datamarts-analyze.md index 28ac56805d..6ffe448b21 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-analyze.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-analyze.md @@ -1,14 +1,15 @@ --- title: Analyzing datamarts (preview) -description: Analyze your datamarts with various tools that are available +description: Learn how to analyze your datamarts using various tools such as the Datamart editor and SQL Query Editor, and get insights into your data effectively. 
author: davidiseminger ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows -ms.topic: how-to -ms.date: 11/10/2023 +ms.topic: concept-article +ms.date: 10/01/2024 LocalizationGroup: Data from files +#customer intent: As a Power BI user, I want to learn how to analyze my datamarts. --- # Analyzing datamarts @@ -19,10 +20,9 @@ You can analyze your datamarts with multiple tools, including the **Datamart edi The **Datamart editor** provides an easy visual interface to analyze your datamarts. The following sections provide guidance on how to use the **Datamart editor** to gain insights into your datamarts, and your data. - ### Visual query -Once you've loaded data into your datamart, you can use the **Datamart editor** to create queries to analyze your data. You can use the Visual Query editor for a no-code experience to create your queries. +Once you load data into your datamart, you can use the **Datamart editor** to create queries to analyze your data. You can use the Visual Query editor for a no-code experience to create your queries. There are two ways to get to the Visual query editor: @@ -30,13 +30,13 @@ In the **Data grid** view, create a new query using the **+ New Query** button o :::image type="content" source="media/datamarts-analyze/datamarts-analyze-01.png" alt-text="Screenshot of the new query button on the data grid ribbon."::: -Alternatively you can use the **Design view** icon found along the bottom of the Datamart editor window, shown in the following image. +Alternatively, you can use the **Design view** icon found along the bottom of the Datamart editor window, shown in the following image. :::image type="content" source="media/datamarts-analyze/datamarts-analyze-02.png" alt-text="Screenshot of the design view icon in the datamart editor."::: -To create a query, drag and drop tables from the Object explorer on the left on to the canvas. 
+To create a query, drag and drop tables from the Object explorer on the left onto the canvas. -:::image type="content" source="media/datamarts-analyze/datamarts-analyze-03.png" alt-text="Screenshot of dragging a table onto the canvas of the datamart editor."::: +:::image type="content" source="media/datamarts-analyze/datamarts-analyze-03.png" alt-text="Screenshot of dragging a table onto the canvas of the datamart editor." lightbox="media/datamarts-analyze/datamarts-analyze-03.png"::: Once you drag one or more tables onto the canvas, you can use the visual experience to design your queries. The datamart editor uses a similar Power Query diagram view experience to enable you to easily query and analyze your data. Learn more about [Power Query diagram view](/power-query/diagram-view). @@ -44,26 +44,27 @@ As you work on your Visual query, the queries are automatically saved every few The following image shows a sample query created using the no-code Visual Query editor to retrieve the *Top customers by Orders*. -:::image type="content" source="media/datamarts-analyze/datamarts-analyze-04.png" alt-text="Screenshot of sample query results in the datamart editor."::: +:::image type="content" source="media/datamarts-analyze/datamarts-analyze-04.png" alt-text="Screenshot of sample query results in the datamart editor." lightbox="media/datamarts-analyze/datamarts-analyze-04.png"::: There are a few things to keep in mind about the Visual Query editor: -* You can only write DQL (not DDL or DML) + +* You can only write DQL (not DDL or DML) * Only a subset of Power Query operations that support [Query folding](/power-query/power-query-folding) are currently supported * You can't currently open the visual query in Excel - ### SQL Query Editor -The **SQL Query Editor** provides a text editor to write queries using T-SQL. To access the built-in SQL query editor, select the **SQL query editor view** icon located at the bottom of the datamart editor window. 
+The **SQL Query Editor** provides a text editor to write queries using T-SQL. To access the built-in SQL query editor, select the **SQL query editor view** icon located at the bottom of the datamart editor window. :::image type="content" source="media/datamarts-analyze/datamarts-analyze-05.png" alt-text="Screenshot of the S Q L query editor view icon."::: -The SQL Query editor provides support for intellisense, code completion, syntax highlighting, client-side parsing and validation. Once you’ve written the T-SQL query, select **Run** to execute the query. As you work on your SQL query, the queries are automatically saved every few seconds. A “saving indicator” that shows up in your query tab at the bottom indicates that your query is being saved. The **Results** preview is displayed in the **Results** section. The **Download in Excel** button opens the corresponding T-SQL Query to Excel and executes the query, enabling you to view the results in Excel. The **Visualize results** allows you to create reports from your query results within the SQL query editor. +The SQL Query editor provides support for IntelliSense, code completion, syntax highlighting, client-side parsing, and validation. Once you write the T-SQL query, select **Run** to execute the query. As you work on your SQL query, the queries are automatically saved every few seconds. A “saving indicator” that shows up in your query tab at the bottom indicates that your query is being saved. The **Results** preview is displayed in the **Results** section. The **Download in Excel** button opens the corresponding T-SQL query in Excel and executes the query, enabling you to view the results in Excel. The **Visualize results** option allows you to create reports from your query results within the SQL query editor. 
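As a quick illustration, a DQL query like the following could be run in the SQL query editor. The schema, table, and column names here are hypothetical placeholders, not objects from any specific sample model:

```sql
-- Hypothetical example: returns the ten customers with the most orders.
-- Table and column names are placeholders for your own datamart objects.
SELECT TOP (10)
    c.CustomerName,
    COUNT(o.OrderID) AS OrderCount
FROM dbo.Customers AS c
INNER JOIN dbo.Orders AS o
    ON o.CustomerID = c.CustomerID
GROUP BY c.CustomerName
ORDER BY OrderCount DESC;
```

Because the editor accepts DQL only, statements such as INSERT, UPDATE, or CREATE TABLE aren't accepted.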
There are a few things to keep in mind about the SQL query editor: -* You can only write DQL (not DDL or DML) -:::image type="content" source="media/datamarts-analyze/datamarts-analyze-15.png" alt-text="Screenshot of the SQL query editor query results."::: +* You can only write DQL (not DDL or DML) + +:::image type="content" source="media/datamarts-analyze/datamarts-analyze-15.png" alt-text="Screenshot of the SQL query editor query results." lightbox="media/datamarts-analyze/datamarts-analyze-15.png"::: ## Analyze outside the editor @@ -73,19 +74,22 @@ Datamarts provide a SQL DQL (query) experience through your own development envi ### When to Use In-Built Querying vs External SQL Tooling -The no-code visual query editor and datamart editor are available within Power BI for your datamart. The no-code visual query editor enables users who aren't familiar with the SQL language, while the datamart editor is helpful for quick monitoring of the SQL DB. +The no-code visual query editor and datamart editor are available within Power BI for your datamart. The no-code visual query editor supports users who aren't familiar with SQL, while the datamart editor is helpful for quick monitoring of the SQL DB. For a querying experience that provides a more comprehensive utility, combining a broad group of graphical tools with many rich script editors, SQL Server Management Studio (SSMS) and Azure Data Studio (ADS) are more robust development environments. ### When to Use SQL Server Management Studio vs Azure Data Studio + While both analysis experiences offer extensive development environments for SQL querying, each environment is tailored toward separate use cases. 
You can use SSMS for: + * Complex administrative or platform configuration * Security management, including user management and configuration of security features * Live query statistics or client statistics Use ADS for: + * macOS and Linux users * Mostly editing or executing queries * Quick charting and visualizing set results @@ -96,9 +100,7 @@ For developers and analysts with SQL experience, using SQL Server Management Stu To connect to a datamart’s SQL endpoint with client tooling, navigate to the semantic model settings page by selecting the **Datamarts (Preview)** tab in Power BI. From there, expand the **Server settings** section and copy the connection string, as shown in the following image. -:::image type="content" source="media/datamarts-analyze/datamarts-analyze-07.png" alt-text="Screenshot of the server settings connection string."::: - - +:::image type="content" source="media/datamarts-analyze/datamarts-analyze-07.png" alt-text="Screenshot of the server settings connection string." lightbox="media/datamarts-analyze/datamarts-analyze-07.png"::: ### Get started with SSMS @@ -110,19 +112,19 @@ Once the **Connect to Server** window is open, paste the connection string copie :::image type="content" source="media/datamarts-analyze/datamarts-analyze-09.png" alt-text="Screenshot of the S Q L server connect to server window."::: -When the connection has become established, the object explorer displays the connected SQL DB from your datamarts and its respective tables and views, all of which are ready to be queried. +When the connection is established, the object explorer displays the connected SQL DB from your datamarts and its respective tables and views, all of which are ready to be queried. 
:::image type="content" source="media/datamarts-analyze/datamarts-analyze-10.png" alt-text="Screenshot of the object explorer showing datamart tables and views."::: -To easily preview the data within a table, right-click on a table and select **Select Top 1000 Rows** from the context menu that appears. An autogenerated query returns a collection of results displaying the top 1,000 rows based on the primary key of the table. +To easily preview the data within a table, right-click on a table and select **Select Top 1000 Rows** from the context menu that appears. An autogenerated query returns a collection of results displaying the top 1,000 rows based on the primary key of the table. :::image type="content" source="media/datamarts-analyze/datamarts-analyze-11.png" alt-text="Screenshot of the context menu in object explorer."::: The following image shows the results of such a query. -:::image type="content" source="media/datamarts-analyze/datamarts-analyze-12.png" alt-text="Screenshot of the context menu query results."::: +:::image type="content" source="media/datamarts-analyze/datamarts-analyze-12.png" alt-text="Screenshot of the context menu query results." lightbox="media/datamarts-analyze/datamarts-analyze-12.png"::: -To see the columns within a table, expand the table within **Object explorer**. +To see the columns within a table, expand the table within **Object explorer**. :::image type="content" source="media/datamarts-analyze/datamarts-analyze-13.png" alt-text="Screenshot of the object explorer information."::: @@ -130,11 +132,9 @@ When you connect to datamart using SSMS or other client tools, you can see views A datamart shows two other roles as *admin* and *viewer* under security when connected using SSMS. Users added to a workspace in any of the *Admin* or *Member* or *Contributor* roles get added to the *admin* role on the datamart. Users added to the *Viewer* role in the workspace get added to *viewer* role in the datamart. 
- ## Relationships metadata -The extended property *isSaaSMetadata* added in the datamart lets you know that this metadata is getting used for SaaS experience. You can query this extended property as below: - +The extended property *isSaaSMetadata* added in the datamart lets you know that this metadata is used for the SaaS experience. You can query this extended property as shown: ```sql SELECT [name], [value] FROM sys.extended_properties WHERE [name] = N'isSaaSMetadata' ``` - -The clients (such as the SQL connector) could read the relationships by querying the table-valued function like the following: - +The clients (such as the SQL connector) can read the relationships by querying the table-valued function, as in the following example: ```sql SELECT * FROM [metadata].[fn_relationships](); ``` - -Notice there are *relationships* and *relationshipColumns* named views under metadata schema to maintain relationships in the datamart. The following tables provide a description of each of them, in turn: +There are *relationships* and *relationshipColumns* views under the metadata schema that maintain relationships in the datamart. The following tables describe each of them, in turn: [metadata].[relationships] @@ -165,12 +162,10 @@ Notice there are *relationships* and *relationshipColumns* named views under met | ToSchemaName | Nvarchar(128) | Schema name of the sink table "To" on which the relationship is defined | | ToObjectName | Nvarchar(128) | Table/View name "To" on which the relationship is defined | | TypeOfRelationship | Tinyint | Relationship cardinality, the possible values are: 0 – None 1 – OneToOne 2 – OneToMany 3 – ManyToOne 4 – ManyToMany | -| SecurityFilteringBehavior | Tinyint | Indicates how relationships influence filtering of data when evaluating row-level security expressions. 
The possible values are 1 – OneDirection 2 – BothDirections 3 – None - | +| SecurityFilteringBehavior | Tinyint | Indicates how relationships influence filtering of data when evaluating row-level security expressions. The possible values are 1 – OneDirection 2 – BothDirections 3 – None| | IsActive | Bit | A boolean value that indicates whether the relationship is marked as Active or Inactive. | | RelyOnReferentialIntegrity | Bit | A boolean value that indicates whether the relationship can rely on referential integrity or not. | -| CrossFilteringBehavior | Tinyint | Indicates how relationships influence filtering of data. The possible values are: 1 – OneDirection 2 – BothDirections 3 – Automatic - | +| CrossFilteringBehavior | Tinyint | Indicates how relationships influence filtering of data. The possible values are: 1 – OneDirection 2 – BothDirections 3 – Automatic| | CreatedAt | Datetime | Date the relationship was created. | | UpdatedAt | datetime | Date the relationship was modified. | | DatamartObjectId | Nvarchar(32) | Unique identifier for the datamart | @@ -186,9 +181,7 @@ Notice there are *relationships* and *relationshipColumns* named views under met | CreatedAt | datetime | Date the relationship was created. | | DatamartObjectId | Nvarchar(32) | Unique identifier for the datamart | - -You can join these two views to get relationships added in the datamart. The following query will join these views: - +You can join these two views to get the relationships added in the datamart. The following query joins these views: ```sql SELECT @@ -204,12 +197,14 @@ FROM [METADATA].[relationships] AS R JOIN [metadata].[relationshipColumns] AS C ON R.RelationshipId=C.RelationshipId ``` + ## Limitations -- Visualize results currently does not support SQL queries with an ORDER BY clause. +Visualize results currently doesn't support SQL queries with an ORDER BY clause. ## Related content -This article provided information about analyzing data in datamarts. 
+ +This article provided information about analyzing data in datamarts. The following articles provide more information about datamarts and Power BI: @@ -220,7 +215,7 @@ The following articles provide more information about datamarts and Power BI: * [Access control in datamarts](datamarts-access-control.md) * [Datamart administration](datamarts-administration.md) - For more information about dataflows and transforming data, see the following articles: + * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) * [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) diff --git a/powerbi-docs/transform-model/datamarts/datamarts-create-reports.md b/powerbi-docs/transform-model/datamarts/datamarts-create-reports.md index c3e825a80f..64ab15376d 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-create-reports.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-create-reports.md @@ -1,22 +1,21 @@ --- title: Create reports using datamarts (preview) -description: Use your datamarts to create reports and share with users +description: Learn how to create and share reports using datamarts in Power BI, including live connections, composite models, and SQL Endpoints. author: davidiseminger ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 11/10/2023 +ms.date: 10/01/2024 LocalizationGroup: Data from files +#customer intent: As a Power BI user, I want to learn how to share reports using datamarts in Power BI. --- # Create reports using datamarts Datamarts let you create reusable and auto-generated semantic models to create reports in various ways in Power BI. This article describes the various ways you can use datamarts, and their auto-generated semantic models, to create reports. 
- -For example, you can establish a live connection to a shared semantic model in the Power BI service and create many different reports from the same semantic model. You can create your perfect data model in Power BI Desktop and publish it to the Power BI service. Then you and others can create multiple different reports in separate .pbix files from that common data model and save them to different workspaces. - +For example, you can establish a live connection to a shared semantic model in the Power BI service and create many different reports from the same semantic model. You can create your perfect data model in Power BI Desktop and publish it to the Power BI service. Then you and others can create multiple different reports in separate .pbix files from that common data model and save them to different workspaces. Advanced users can build reports from a datamart using a composite model or using the SQL Endpoint. Reports that use datamarts can be created in either of the following two tools: @@ -26,13 +25,13 @@ Reports that use datamarts can be created in either of the following two tools: Let's take a look at how datamarts can be used with each, in turn. -## Create reports in the Power BI service +## Create reports in the Power BI service **Scenario 1:** From within the datamart experience, using the ribbon and the main home tab, navigate to the **New report** button. This provides a native, quick way to create reports. Selecting **New report** opens a browser tab to the report editing canvas to a new report that is built on the semantic model. When you save your new report you're prompted to choose a workspace, provided you have write permissions for that workspace. If you don't have write permissions, or if you're a free user and the semantic model resides in a Premium-capacity workspace, the new report is saved in your *My workspace*. 
-**Scenario 2:** Using the auto-generated semantic model and action menu in the workspace: In the Power BI workspace, navigate to the auto-generated semantic model and select the **More** menu (...) to create a report in the Power BI service. +**Scenario 2:** Using the auto-generated semantic model and action menu in the workspace: In the Power BI workspace, navigate to the auto-generated semantic model and select the **More** menu (...) to create a report in the Power BI service. Selecting the **More** opens the report editing canvas to a new report that is built on the semantic model. When you save your new report, it's saved in the workspace that contains the semantic model as long as you have write permissions on that workspace. If you don't have write permissions, or if you're a free user and the semantic model resides in a Premium-capacity workspace, the new report is saved in your *My workspace*. @@ -40,15 +39,14 @@ Selecting the **More** opens the report editing canvas to a new report that is b In the data hub, you'll see datamarts and their associated auto-generated semantic models. Select the datamart to navigate to the datamart's details page where you can see the datamart’s metadata, supported actions, lineage and impact analysis, along with related reports created from that datamart. Auto-generated semantic models derived from datamarts behave the same as any semantic model. -To find the datamart, you begin with the data hub. The image below shows the data hub in the Power BI service, with the following numbered information: +To find the datamart, you begin with the data hub. The image below shows the data hub in the Power BI service, with the following numbered information: -1. Select a datamart to view its datamart details page -2. Select the **More** menu (...) to display the options menu -3. Select **Details** to view details summary. +1. Select a datamart to view its datamart details page +2. Select the **More** menu (...) 
to display the options menu +3. Select **Details** to view the details summary. :::image type="content" source="media/datamarts-create-reports/datamarts-create-reports-01.png" alt-text="Screenshot of how to find datamarts in the power B I service." lightbox="media/datamarts-create-reports/datamarts-create-reports-01.png"::: - ## Create reports using Power BI Desktop You can build reports from semantic models with **Power BI Desktop** using a Live connection to the semantic model. For information on how to make the connection, see [connect to semantic models from Power BI Desktop](/power-bi/connect-data/desktop-report-lifecycle-datasets). @@ -57,19 +55,19 @@ For advanced situations where you want to add more data or change the storage mo Complete the following steps to connect to a datamart in Power BI Desktop: -1. Navigate to the datamart settings in your workspace and copy the SQL endpoint connection string. -2. In Power BI Desktop select the **SQL Server connector** from the ribbon or from **Get Data**. -3. Paste the connection string into the connector dialog. -4. For authentication, select *organizational account*. -5. Authenticate using Microsoft Entra ID - MFA (the same way you would connect to Power BI) -6. Select **Connect**. -7. Select the data items you want to include or not include in your semantic model. - -For more information, see [connect to on-premises data in SQL Server](/power-bi/connect-data/service-gateway-sql-tutorial). You don't need to set up a gateway with datamarts to use them in Power BI. +1. Navigate to the datamart settings in your workspace and copy the SQL endpoint connection string. +2. In Power BI Desktop, select the **SQL Server connector** from the ribbon or from **Get Data**. +3. Paste the connection string into the connector dialog. +4. For authentication, select *organizational account*. +5. Authenticate using Microsoft Entra ID - MFA (the same way you would connect to Power BI). +6. Select **Connect**. +7. 
Select the data items you want to include or not include in your semantic model. +For more information, see [connect to on-premises data in SQL Server](/power-bi/connect-data/service-gateway-sql-tutorial). You don't need to set up a gateway with datamarts to use them in Power BI. ## Related content -This article provided information about creating reports using datamarts. + +This article provided information about creating reports using datamarts. The following articles provide more information about datamarts and Power BI: @@ -80,7 +78,7 @@ The following articles provide more information about datamarts and Power BI: * [Access control in datamarts](datamarts-access-control.md) * [Datamart administration](datamarts-administration.md) - For more information about dataflows and transforming data, see the following articles: + * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) * [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) diff --git a/powerbi-docs/transform-model/datamarts/datamarts-discovery.md b/powerbi-docs/transform-model/datamarts/datamarts-discovery.md index d2bf346be0..b09fbd792d 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-discovery.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-discovery.md @@ -9,6 +9,7 @@ ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: concept-article ms.date: 09/24/2024 +ms.custom: FY25Q1-Linter LocalizationGroup: Data from files --- @@ -30,7 +31,7 @@ For more information about a datamart, to explore reports, to view lineage, or t A page displays the information about the datamart, provides a button to create a new report, share datamart, pull data into Excel or view lineage. Related reports for the selected datamart are also displayed, if any exist. You can also navigate to the datamart editor, its settings, or manage permissions. 
-The page also shows the workspace where the datamart is located, its endorsement status, its last refresh time, and any sensitivity settings that have been applied. +The page also shows the workspace where the datamart is located, its endorsement status, its last refresh time, and any sensitivity settings that are applied. The following image shows the datamarts information page. @@ -44,7 +45,7 @@ The following image shows the lineage of a datamart. To view any dependent items of the selected datamart, select the **Impact analysis** menu, which is displayed along the right side of the screen. -:::image type="content" source="media/datamarts-discovery/datamarts-discovery-03.png" alt-text="Screenshot of datamart impact analysis pane."::: +:::image type="content" source="media/datamarts-discovery/datamarts-discovery-03.png" alt-text="Screenshot of datamart impact analysis pane." lightbox="media/datamarts-discovery/datamarts-discovery-03.png"::: ### Data hub in Power BI Desktop @@ -64,7 +65,7 @@ Selecting a datamart from the list enables the **Connect** button in the window. ## Related content -This article provided information about creating reports using datamarts. +This article provided information about discovering and viewing datamarts. 
The following articles provide more information about datamarts and Power BI: diff --git a/powerbi-docs/transform-model/datamarts/datamarts-sharing-manage-permissions.md b/powerbi-docs/transform-model/datamarts/datamarts-sharing-manage-permissions.md index 4b77bd3e75..4b2e2094c5 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-sharing-manage-permissions.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-sharing-manage-permissions.md @@ -1,14 +1,16 @@ --- -title: Sharing Power BI datamarts and managing permissions (preview) -description: Share and manage permissions using Power BI datamarts. +title: Sharing Power BI datamarts and managing permissions +description: Learn how to share Power BI datamarts and manage permissions effectively to provide users with specific access and enhance collaboration. author: davidiseminger ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows -ms.topic: how-to -ms.date: 11/10/2023 +ms.topic: concept-article +ms.date: 09/30/2024 +ms.custom: FY25Q1-Linter LocalizationGroup: Data from files +#customer intent: As a Power BI user I want to learn how to share Power BI datamarts and manage permissions. --- # Sharing datamarts and managing permissions (preview) @@ -18,27 +20,26 @@ This article describes the ways you can share your datamarts and manage its perm ## Sharing datamarts for consumption Once a datamart has been created, you can share it for downstream consumption by other users in your organization. Sharing a datamart enables the recipient to access the datamart in the following ways: + * **SQL connection string:** Connect to the datamart’s underlying SQL connection string and query the datamart from SQL client tools. + * **Auto-generated semantic model:** Build content based on the datamart’s underlying semantic model, by providing *Build* permissions. -There are a few ways to share a datamart, described in the following sections. 
+There are a few ways to share a datamart, described in the following sections. ### Share from a workspace While in the workspace, select the **Share** option from the datamart’s context menu, as shown in the following image. -:::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-01.png" alt-text="Screenshot of sharing a datamart from a workspace."::: - +:::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-01.png" alt-text="Screenshot of sharing a datamart from a workspace." lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-01.png"::: ### Share from the data hub -To share a datamart from the data hub, select **Share** from the datamart’s context menu within the data hub. You can perform this sharing from any of tabs in that window: **All**, **My data**, **Trusted in your org** or **Recommended**. +To share a datamart from the data hub, select **Share** from the datamart’s context menu within the data hub. You can perform this sharing from any of the tabs in that window: **All**, **My data**, **Trusted in your org**, or **Recommended**. The following image shows selecting the context menu from within the data hub, and selecting **Share**. - -:::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-02.png" alt-text="Screenshot of sharing a datamart from the data hub." 
lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-02.png"::: ### Share from datamart information page @@ -52,14 +53,13 @@ You can also select the **Share datamart** button from the information panel its :::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-04.png" alt-text="Screenshot of sharing a datamart from the information panel in the data hub using a button on the datamart information screen." lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-04.png"::: - ### The share datamart process -Regardless of which way you choose to share a datamart, the **Grant people access** window appears so you can enter the names or email addresses of the people or groups (distribution groups or security groups) in your organization with whom you want to grant access to the datamart. +Regardless of which way you choose to share a datamart, the **Grant people access** window appears so you can enter the names or email addresses of the people or groups (distribution groups or security groups) in your organization with whom you want to share the datamart. -You can choose whether recipients can reshare the datamart with others in the organization, by selecting the checkbox next to **Allow recipients to share this datamart**. There's an option to allow users to create Power BI reports (from scratch, autocreate, paginated reports) on top of the default semantic model that is connected to the datamart by selecting the checkbox next to **Build reports on the default semantic model**. Both of these options are selected by default. +You can choose whether recipients can reshare the datamart with others in the organization, by selecting the checkbox next to **Allow recipients to share this datamart**. 
There's also an option to allow users to create Power BI reports (from scratch, autocreate, or paginated reports) on top of the default semantic model that is connected to the datamart, by selecting the checkbox next to **Build reports on the default semantic model**. Both of these options are selected by default. -You can also choose to send recipients a message to provide more context, by typing a message into the **Add a message (optional)** field in the **Grant people access** window. +You can also choose to send recipients a message to provide more context, by typing a message into the **Add a message (optional)** field in the **Grant people access** window. The following image shows the **Grant people access** window. @@ -67,7 +67,7 @@ The following image shows the **Grant people access** window. Once you grant access, recipients receive an email stating they've been granted access to the datamart. The email includes a button titled **Open this datamart** that opens the datamart's information page. -When recipients open the link or otherwise navigate to the shared datamart, its information page shows the SQL connection string for connecting to the datamart. Users can use client tools other than Power BI, such as SSMS, to query the datamart using T-SQL. +When recipients open the link or otherwise navigate to the shared datamart, its information page shows the SQL connection string for connecting to the datamart. Users can use client tools other than Power BI, such as SSMS, to query the datamart using T-SQL. The following image highlights the **SQL connection string** in a datamart information window. @@ -79,16 +79,14 @@ The following image highlights the **Create a report** entry point in a datamart :::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-05.png" alt-text="Screenshot of Create a report for a datamart." 
lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-05.png"::: - > [!NOTE] > Sharing a datamart allows the recipient to access the datamart for downstream consumption and not to collaborate on the datamart creation. To enable other creators to collaborate on the datamart, you must provide Member, Admin or Contributor access to the workspace where the datamart is created. - ## Manage permissions The Manage permissions page shows the list of users who have been given access by either assigning to Workspace roles or item permissions (as described earlier in this article). -If you're an Admin or Member, go to your workspace and select **More options** which shows the context menu and select **Manage permissions**. +If you're an Admin or Member, go to your workspace and select **More options**, which shows the context menu, and then select **Manage permissions**. :::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-08.png" alt-text="Screenshot of selecting Manage Permissions from the workspace context menu." lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-08.png"::: @@ -96,12 +94,13 @@ For users who were provided workspace roles, it shows the corresponding user, wo :::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-09.png" alt-text="Screenshot of the datamart Manage Permissions page." lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-09.png"::: -You can choose to add or remove permissions using the **Manage permissions** experience. **Remove reshare** removes the *Reshare* permissions. +You can choose to add or remove permissions using the **Manage permissions** experience. **Remove reshare** removes the *Reshare* permissions. 
**Remove access** removes all item permissions and stops sharing the datamart with the specified user. :::image type="content" source="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-10.png" alt-text="Screenshot of the Remove reshare permission selected from the datamart Manage Permissions page." lightbox="media/datamarts-sharing-manage-permissions/datamarts-sharing-manage-permissions-10.png"::: ## Related content -This article provided information about creating reports using datamarts. + +This article provided information about sharing datamarts and managing permissions. The following articles provide more information about datamarts and Power BI: @@ -113,7 +112,7 @@ The following articles provide more information about datamarts and Power BI: * [Access control in datamarts](datamarts-access-control.md) * [Datamart administration](datamarts-administration.md) - For more information about dataflows and transforming data, see the following articles: + * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) -* [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) \ No newline at end of file +* [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) diff --git a/powerbi-docs/transform-model/datamarts/datamarts-understand.md b/powerbi-docs/transform-model/datamarts/datamarts-understand.md index 98162edde5..ccfbcb75c0 100644 --- a/powerbi-docs/transform-model/datamarts/datamarts-understand.md +++ b/powerbi-docs/transform-model/datamarts/datamarts-understand.md @@ -6,9 +6,10 @@ ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows -ms.topic: how-to -ms.date: 11/10/2023 +ms.topic: concept-article +ms.date: 10/07/2024 LocalizationGroup: Data from files +#customer intent: As a Power BI user I want to learn about best practices and concepts for datamarts in Power 
BI. --- # Understand datamarts @@ -17,14 +18,14 @@ This article describes and explains important concepts about datamarts. ## Understand semantic model (default) -Datamarts provide a semantic layer that is automatically generated and synchronized with the contents of the datamart tables, their structure, and underlying data. This layer is provided in an automatically generated semantic model. This automatic generation and synchronization enables you to further describe the domain of data with things like hierarchies, friendly names and descriptions. You can also set formatting specific to your locale or business requirements. With datamarts, you can create measures and standardized metrics for reporting. Power BI (and other client tools) can create visuals and provide results for such calculations based on the data in context. +Datamarts provide a semantic layer that is automatically generated and synchronized with the contents of the datamart tables, their structure, and underlying data. This layer is provided in an automatically generated semantic model. This automatic generation and synchronization enables you to further describe the domain of data with things like hierarchies, friendly names, and descriptions. You can also set formatting specific to your locale or business requirements. With datamarts, you can create measures and standardized metrics for reporting. Power BI (and other client tools) can create visuals and provide results for such calculations based on the data in context. -The **default** Power BI semantic model created from a datamart eliminates the need to connect to a separate semantic model, set up refresh schedules, and manage multiple data elements. 
Instead, you can build your business logic in a datamart and its data will be immediately available in Power BI, enabling the following: +The **default** Power BI semantic model created from a datamart eliminates the need to connect to a separate semantic model, set up refresh schedules, and manage multiple data elements. Instead, you can build your business logic in a datamart and its data is immediately available in Power BI, enabling the following: * Datamart data access through the Semantic model Hub. * Capability to analyze in Excel. * Capability to quickly create reports in the Power BI service. -* No need to refresh, synchronize data or understand connection details. +* No need to refresh, synchronize data, or understand connection details. * Build solutions on the web without needing Power BI Desktop. During preview, default semantic model connectivity is available using [DirectQuery](../../connect-data/desktop-directquery-about.md) only. The following image shows how datamarts fit into the process continuum starting with connecting to data, all the way through creating reports. @@ -33,8 +34,8 @@ During preview, default semantic model connectivity is available using [DirectQu Default semantic models are different from traditional Power BI semantic models in the following ways: -* The XMLA endpoint supports read-only operations and users can't edit the semantic model directly. With XMLA read-only permission you can query the data in a query window. -* The default semantic models don't have data source settings and users don't need to enter credentials. Rather, they use automatic single sign-on (SSO) for queries. +* The XMLA endpoint supports read-only operations and users can't edit the semantic model directly. With XMLA read-only permission, you can query the data in a query window. +* The default semantic models don't have data source settings and users don't need to enter credentials. Rather, they use automatic single sign-on (SSO) for queries. 
* For refresh operations, semantic models use the semantic model author credentials to connect to the managed datamart’s SQL endpoint. With Power BI Desktop users can build composite models, enabling you to connect to the datamart’s semantic model and do the following: @@ -46,19 +47,19 @@ Finally, if you don't want to use the default semantic model directly, you can c ### Understand what's in the default semantic model -Currently, tables in the datamart are automatically added to the default semantic model. Users can also manually select tables -or views from the datamart they want included in the model for more flexibility. Objects that are in the default semantic model -will be created as a layout in the model view. +Currently, tables in the datamart are automatically added to the default semantic model. Users can also manually select tables +or views from the datamart they want included in the model for more flexibility. Objects that are in the default semantic model +are created as a layout in the model view. -The background sync that includes objects (tables and views) will wait for the downstream semantic model to not be in use to -update the semantic model, honoring bounded staleness. Users can always go and manually pick tables they want or not want in -the semantic model. +The background sync that includes objects (tables and views) waits until the downstream semantic model isn't in use to +update the semantic model, honoring bounded staleness. Users can always manually choose which tables to include in +the semantic model. ## Understand incremental refresh and datamarts You can create and modify incremental data refresh, similar to dataflows and semantic model incremental refresh, using the datamart editor. Incremental refresh extends scheduled refresh operations by providing automated partition creation and management for datamart tables that frequently load new and updated data. 
-For most datamarts, incremental refresh will involve one or more tables that contain transaction data that changes often and can grow exponentially, such as a fact table in a relational or star database schema. If you use an incremental refresh policy to partition the table, and refresh only the most recent import partitions, you can significantly reduce the amount of data that must be refreshed. +For most datamarts, incremental refresh involves one or more tables that contain transaction data that changes often and can grow exponentially, such as a fact table in a relational or star database schema. If you use an incremental refresh policy to partition the table, and refresh only the most recent import partitions, you can significantly reduce the amount of data that must be refreshed. Incremental refresh and real-time data for datamarts offers the following advantages: @@ -94,7 +95,7 @@ Use *Deployment Pipelines* for changes to ensure the best performance, and to en ### Considerations and limitations for proactive caching * Power BI currently caps the duration of caching operations to 10 minutes. -* Constraints of uniqueness/non-null for particular columns will be enforced in the Import model and will fail the cache building if the data doesn't conform. +* Constraints of uniqueness/non-null for particular columns are enforced in the Import model, and cache building fails if the data doesn't conform. 
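The uniqueness/non-null constraint above can be checked up front. As a hedged sketch (the `FactSales` table and `[OrderKey]` column are placeholder names, not from the article), a DAX query against the semantic model can count the blanks and duplicates that would make cache building fail:

```dax
// Hedged sketch; FactSales and [OrderKey] are hypothetical names.
// A nonzero value in either column means the data wouldn't satisfy a
// uniqueness/non-null constraint in the Import model.
EVALUATE
ROW (
    "BlankCount",
        COUNTROWS ( FILTER ( FactSales, ISBLANK ( FactSales[OrderKey] ) ) ),
    "DuplicateCount",
        COUNTROWS ( FactSales ) - COUNTROWS ( DISTINCT ( FactSales[OrderKey] ) )
)
```

Running a check like this before enabling proactive caching avoids refresh-time surprises when source data drifts.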
## Related content @@ -113,5 +114,3 @@ For more information about dataflows and transforming data, see the following ar * [Introduction to dataflows and self-service data prep](../dataflows/dataflows-introduction-self-service.md) * [Tutorial: Shape and combine data in Power BI Desktop](../../connect-data/desktop-shape-and-combine-data.md) - - diff --git a/powerbi-docs/transform-model/desktop-connect-dataflows.md b/powerbi-docs/transform-model/desktop-connect-dataflows.md index 76207e643d..578f38a781 100644 --- a/powerbi-docs/transform-model/desktop-connect-dataflows.md +++ b/powerbi-docs/transform-model/desktop-connect-dataflows.md @@ -1,5 +1,5 @@ --- -title: Connect to data created by Power Platform dataflows in Power BI Desktop +title: Connect to Power Platform dataflows in Power BI Desktop description: Learn how to easily connect to, use, and get the best performance while using dataflows in Power BI Desktop. author: davidiseminger ms.author: davidi @@ -7,8 +7,9 @@ ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-dataflows ms.topic: how-to -ms.date: 12/30/2022 +ms.date: 09/30/2024 LocalizationGroup: Connect to data +#customer intent: As a Power BI user I want to learn how to connect to dataflows and get the best performance in Power BI Desktop. --- # Connect to data created by Power Platform dataflows in Power BI Desktop @@ -27,7 +28,7 @@ To use the Power Platform dataflows connector, you must be running a recent vers ## Desktop performance -**Power BI Desktop** runs locally on the computer on which it's installed. Ingestion performance of dataflows is determined by various factors. Those factors include the size of the data, your computer's CPU and RAM, network bandwidth, distance from the data center, and other factors. +**Power BI Desktop** runs locally on the computer on which it's installed. Ingestion performance of dataflows is determined by various factors. 
Those factors include the size of the data, your computer's CPU and RAM, network bandwidth, distance from the data center, and other factors. You can improve data ingestion performance for dataflows. For example, if the ingested data size is too large for **Power BI Desktop** to manage on your computer, you can use linked and computed entities in dataflows to aggregate the data (within dataflows) and ingest only the pre-prepared, aggregated data. @@ -35,7 +36,7 @@ In that manner, the processing of large data is performed online in dataflows, r ## Other considerations -Most dataflows reside in the Power BI service tenant. However, **Power BI Desktop** users can't access dataflows that are stored in Azure Data Lake Storage Gen2 account, unless they're the owner of the dataflow, or they've been explicitly authorized to the dataflow’s CDM folder. Consider the following situation: +Most dataflows reside in the Power BI service tenant. However, **Power BI Desktop** users can't access dataflows that are stored in an Azure Data Lake Storage Gen2 account, unless they're the owner of the dataflow, or they are explicitly authorized to the dataflow’s CDM folder. Consider the following situation: 1. Anna creates a new workspace and configures it to store dataflows in the organization’s data lake. 2. Ben, who is also a member of the workspace Anna created, wants to use Power BI Desktop and the dataflow connector to get data from the dataflow Anna created. 
diff --git a/powerbi-docs/transform-model/desktop-formula-editor.md b/powerbi-docs/transform-model/desktop-formula-editor.md index caa7cb6536..f028224c6d 100644 --- a/powerbi-docs/transform-model/desktop-formula-editor.md +++ b/powerbi-docs/transform-model/desktop-formula-editor.md @@ -6,9 +6,11 @@ ms.author: davidi ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-transform-model -ms.topic: conceptual -ms.date: 02/10/2023 +ms.topic: concept-article +ms.date: 09/30/2024 LocalizationGroup: Transform and shape data +ms.custom: FY25Q1-Linter +#customer intent: As a Power BI user I want to learn how to use keyboard shortcuts in Power BI Desktop. --- # Formula editor in Power BI Desktop diff --git a/powerbi-docs/transform-model/desktop-import-and-display-kpis.md b/powerbi-docs/transform-model/desktop-import-and-display-kpis.md index 17beb39559..22be564095 100644 --- a/powerbi-docs/transform-model/desktop-import-and-display-kpis.md +++ b/powerbi-docs/transform-model/desktop-import-and-display-kpis.md @@ -7,8 +7,10 @@ ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-transform-model ms.topic: how-to -ms.date: 1/3/2023 +ms.date: 09/30/2024 LocalizationGroup: Model your data +ms.custom: FY25Q1-Linter +#customer intent: As a Power BI user I want to learn how to import KPIs from an Excel workbook. --- # Import and display KPIs in Power BI @@ -27,6 +29,6 @@ To import and display KPIs: 1. Imported KPIs are best used in standard visualization types, such as the **Table** type. Power BI also includes the **KPI** visualization type, which should only be used to create new KPIs. - :::image type="content" source="media/desktop-import-and-display-kpis/desktoppreviewfeatureon3.png" alt-text="Screenshot of Power BI Desktop showing Table1 fields selected in Field pane."::: + :::image type="content" source="media/desktop-import-and-display-kpis/desktoppreviewfeatureon3.png" alt-text="Screenshot of Power BI Desktop showing Table1 fields selected in Field pane." 
lightbox="media/desktop-import-and-display-kpis/desktoppreviewfeatureon3.png"::: You can use KPIs to highlight trends, progress, or other important indicators. diff --git a/powerbi-docs/transform-model/desktop-mobile-geofiltering.md b/powerbi-docs/transform-model/desktop-mobile-geofiltering.md index 90a56d904e..ff207c8562 100644 --- a/powerbi-docs/transform-model/desktop-mobile-geofiltering.md +++ b/powerbi-docs/transform-model/desktop-mobile-geofiltering.md @@ -1,13 +1,14 @@ --- -title: Set geographic filters in Power BI Desktop for the mobile apps +title: Set geographic filters in Power BI Desktop description: Learn how to set your model’s geographic filtering in Power BI Desktop, so you can automatically filter data for your location in Power BI mobile apps. author: paulinbar ms.author: painbar ms.service: powerbi ms.subservice: pbi-transform-model ms.topic: how-to -ms.date: 02/28/2023 +ms.date: 10/01/2024 LocalizationGroup: Model your data +#customer intent: As a Power BI user I want to learn how to set geographic filtering for Power BI models. --- # Set geographic filters in Power BI Desktop for use in the mobile app @@ -26,7 +27,7 @@ For example, say you're a sales manager that travels to meet customers, and you 2. Select a column with geographic data, for example, a City column. - :::image type="content" source="media/desktop-mobile-geofiltering/power-bi-desktop-geo-column.png" alt-text="Screenshot of the Data Category dropdown list with City highlighted."::: + :::image type="content" source="media/desktop-mobile-geofiltering/power-bi-desktop-geo-column.png" alt-text="Screenshot of the Data Category dropdown list with City highlighted." lightbox="media/desktop-mobile-geofiltering/power-bi-desktop-geo-column.png"::: 3. On the **Column tools** tab, select **Data category**, then the correct category, in this example, **City**. 
@@ -41,9 +42,9 @@ For example, say you're a sales manager that travels to meet customers, and you ## Create visuals with your geographic data -1. Switch to the Report view, :::image type="icon" source="media/desktop-mobile-geofiltering/power-bi-desktop-report-icon.png"::: and create visuals that use the geographic fields in your data. +1. Switch to the Report view :::image type="icon" source="media/desktop-mobile-geofiltering/power-bi-desktop-report-icon.png":::, and create visuals that use the geographic fields in your data. - :::image type="content" source="media/desktop-mobile-geofiltering/power-bi-desktop-geo-report.png" alt-text="Screenshot of Report view showing a map and a clustered bar chart visual."::: + :::image type="content" source="media/desktop-mobile-geofiltering/power-bi-desktop-geo-report.png" alt-text="Screenshot of Report view showing a map and a clustered bar chart visual." lightbox="media/desktop-mobile-geofiltering/power-bi-desktop-geo-report.png"::: In this example, the model also contains a calculated column that brings city and state together into one column. To learn more, see [creating calculated columns in Power BI Desktop](desktop-calculated-columns.md). 
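The calculated column mentioned above can be sketched in DAX as follows. This is an illustrative sketch, not the article's exact formula: the table name `Geography` and the columns `City` and `State` are assumptions, so substitute the names from your own model.

```dax
// Combine city and state into a single column so it can be
// assigned a geographic data category (for example, Place)
// and used for filtering in the mobile apps.
City State = Geography[City] & ", " & Geography[State]
```

After creating the column, set its data category the same way as shown for the City column above.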
diff --git a/powerbi-docs/transform-model/desktop-visual-calculations-overview.md b/powerbi-docs/transform-model/desktop-visual-calculations-overview.md index a8ee608e03..0781ab36ca 100644 --- a/powerbi-docs/transform-model/desktop-visual-calculations-overview.md +++ b/powerbi-docs/transform-model/desktop-visual-calculations-overview.md @@ -7,7 +7,7 @@ ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-transform-model ms.topic: how-to -ms.date: 09/30/2024 +ms.date: 10/10/2024 LocalizationGroup: Model your data no-loc: [RUNNINGSUM, MOVINGAVERAGE, COLLAPSE, COLLAPSEALL, EXPAND, EXPANDALL, PREVIOUS, NEXT, FIRST, LAST, ROWS, COLUMNS, ROWS COLUMNS, COLUMNS ROWS, NONE, HIGHESTPARENT, LOWESTPARENT, ISATLEVEL, RANGE, WINDOW, OFFSET, INDEX, ORDERBY] --- @@ -67,7 +67,7 @@ The visual calculations window opens in **Edit** mode. The **Edit** mode screen :::image type="content" source="media/desktop-visual-calculations-overview/desktop-visual-calculations-03.png" alt-text="Screenshot showing areas of the visual calculations edit screen."::: -To add a visual calculation, type the expression in the formula bar. For example, in a visual that contains **Sales Amount** and **Total Product Cost** by **Fiscal Year**, you can add a visual calculation that calculates the profit for each year by typing: `"Profit = [Sales Amount] – [Total Product Cost]"`. +To add a visual calculation, type the expression in the formula bar. For example, in a visual that contains **Sales Amount** and **Total Product Cost** by **Fiscal Year**, you can add a visual calculation that calculates the profit for each year by typing: `Profit = [Sales Amount] - [Total Product Cost]`. 
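As a further sketch, a visual calculation can also reference other visual calculations on the same visual. For example, a running total could be built on top of the Profit calculation above using RUNNINGSUM, one of the visual-calculation functions this article lists (an illustrative example, not product output):

```dax
Profit = [Sales Amount] - [Total Product Cost]

// A second visual calculation can refer to the first one by name:
Running Profit = RUNNINGSUM([Profit])
```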
:::image type="content" source="media/desktop-visual-calculations-overview/desktop-visual-calculations-04.png" alt-text="Screenshot of entering a visual calculation."::: diff --git a/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md b/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md index 57b12b97f6..780b9b41fa 100644 --- a/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md +++ b/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md @@ -7,8 +7,9 @@ ms.reviewer: '' ms.service: powerbi ms.subservice: pbi-transform-model ms.topic: how-to -ms.date: 11/10/2023 +ms.date: 10/01/2024 LocalizationGroup: Transform and shape data +#customer intent: As a Power BI user I want to learn about Azure Log Analytics integration. --- # Azure Log Analytics in Power BI - FAQ @@ -16,73 +17,75 @@ Power BI is integrating with Azure Log Analytics (LA) to enable administrators a ## Frequently asked questions -**Question:** What areas of Power BI are available for Log Analytics integration? +### What areas of Power BI are available for Log Analytics integration? *Answer:* Semantic model activity logs (such as Analysis Services engine traces) are currently available. -**Question:** When should I use Log Analytics for the Analysis Services engine? +### When should I use Log Analytics for the Analysis Services engine? *Answer:* Engine logs are detailed and can be high volume and large, averaging 3-4 KB each for complex semantic models. Therefore we recommend carefully considering when to use logging for the Analysis Service engine. **Typical use cases for logging are performance investigations, scale/load testing or pre-release validation.** -**Question:** Which Analysis Services events are supported? What will the logs look like? +### Which Analysis Services events are supported? What do the logs look like? 
*Answer:* For information on events and logs see [events and schema](desktop-log-analytics-configure.md#events-and-schema). -**Question:** I can't get Owner permissions for Azure Log Analytics in my organization, is there a workaround? +### I can't get Owner permissions for Azure Log Analytics in my organization. Is there a workaround? + +*Answer:* Yes, you need some help from administrators: -*Answer:* Yes, you'll need some help from administrators: OPTION 1: An Azure admin can grant you Owner rights in Log Analytics only to perform the initial configuration in Power BI. After you complete the initial configuration, they can reduce your access to Contributor or lower as required. + OPTION 2: For workspace level configuration, you can add an Azure admin as a Power BI workspace admin and ask them to configure logging for your workspace. After logging is configured, you can remove their access to your workspace. -**Question:** I can't get workspace Admin permissions for Power BI in my organization, is there a workaround? +### I can't get workspace Admin permissions for Power BI in my organization. Is there a workaround? *Answer:* Yes. Refer to option 2 in the previous question. -**Question:** What happens if I send logs from many Power BI workspaces to the same Log Analytics workspace? How do I differentiate? +### What happens if I send logs from many Power BI workspaces to the same Log Analytics workspace? How do I differentiate? -*Answer:* Each log entry is marked with the correspondent Power BI Workspace Id. +*Answer:* Each log entry is marked with the corresponding Power BI Workspace ID. -**Question:** Can we configure Log Analytics for non-Premium workspaces? +### Can we configure Log Analytics for non-Premium workspaces? *Answer:* No, currently only Premium workspaces are supported. -**Question:** How long does it take for logs to appear in Log Analytics? +### How long does it take for logs to appear in Log Analytics? 
*Answer:* Typically within 5 minutes of the activity being generated in Power BI. The logs are sent continuously. -**Question:** What happens when I disconnect Log Analytics? Will I lose my data? +### What happens when I disconnect Log Analytics? Will I lose my data? -*Answer:* Disconnecting is a non-destructive operation. Logs stop flowing to Log Analytics, but everything else remains unchanged. Power BI won't alter permissions or delete data. +*Answer:* Disconnecting is a nondestructive operation. Logs stop flowing to Log Analytics, but everything else remains unchanged. Power BI doesn't alter permissions or delete data. -**Question:** How much data is retained in Log Analytics? +### How much data is retained in Log Analytics? *Answer:* The default retention period is 31 days. You can adjust data retention period within the Azure portal, which currently can be increased to 730 days (two years). -**Question:** What if the tenant administrator disables workspace-level logging? +### What if the tenant administrator disables workspace-level logging? -*Answer:* No new Log Analytics configurations can be made at the workspace-level if that occurs. Any existing workspaces that have Log Analytics already configured will continue to send logs. +*Answer:* No new Log Analytics configurations can be made at the workspace level if that occurs. Any existing workspaces that have Log Analytics already configured continue to send logs. -**Question:** Do you support Blob Store and Event Hubs destinations in Log Analytics? +### Do you support Blob Store and Event Hubs destinations in Log Analytics? *Answer:* Blob Store and Event Hubs destinations aren't currently supported, but your feedback is welcomed on how useful you would find those destinations. -**Question:** What happens if I move my workspace out of a Premium capacity? +### What happens if I move my workspace out of a Premium capacity? 
-*Answer:* Currently the Log Analytics configuration won't be deleted, but logs will stop flowing when the semantic model isn't in a Premium capacity. If you move it back to Premium capacity, logs will begin to flow again. +*Answer:* Currently the Log Analytics configuration isn't deleted, but logs stop flowing when the semantic model isn't in a Premium capacity. If you move it back to Premium capacity, logs begin to flow again. -**Question:** Do you support workspace v1 for Log Analytics? +### Do you support workspace v1 for Log Analytics? *Answer:* There's no ability to configure Log Analytics for individual v1 workspaces. -**Question:** There are numerous events logged from the Analysis Services engine. Can I choose which ones I want? +### There are numerous events logged from the Analysis Services engine. Can I choose which ones I want? *Answer:* Currently you can't choose which events to log. -**Question:** How much will Log Analytics cost? +### How much will Log Analytics cost? -*Answer:* Azure Log Analytics bills storage, ingestion, and analytical queries independently. Cost also depends on the geographic region. It will vary depending on how much activity is generated, how long you choose to store the data, and how often you query it. An average Premium capacity generates about 35 GB of logs monthly, but the storage size of logs can be higher for heavily utilized capacities. For for information, see the [pricing calculator](https://azure.microsoft.com/pricing/calculator/). +*Answer:* Azure Log Analytics bills storage, ingestion, and analytical queries independently. Cost also depends on the geographic region. It varies depending on how much activity is generated, how long you choose to store the data, and how often you query it. An average Premium capacity generates about 35 GB of logs monthly, but the storage size of logs can be higher for heavily utilized capacities. 
For more information, see the [pricing calculator](https://azure.microsoft.com/pricing/calculator/). ## Related content diff --git a/powerbi-docs/transform-model/quick-measure-suggestions.md b/powerbi-docs/transform-model/quick-measure-suggestions.md index 55c7c26a3b..fb51c13127 100644 --- a/powerbi-docs/transform-model/quick-measure-suggestions.md +++ b/powerbi-docs/transform-model/quick-measure-suggestions.md @@ -1,22 +1,23 @@ --- title: Quick measure suggestions -description: Quick measure suggestions assist creation of DAX measures using natural language +description: Quick measure suggestions help you create DAX measures using natural language, making it easier and faster to generate common DAX calculations. author: Sujata994 ms.author: sunaraya ms.reviewer: '' ms.custom: '' ms.service: powerbi ms.subservice: pbi-transform-model -ms.topic: how-to -ms.date: 09/21/2023 +ms.topic: concept-article +ms.date: 10/07/2024 LocalizationGroup: Create reports +#customer intent: As a Power BI user I want to know more about DAX measures using natural language. --- -# Quick measure suggestions -Quick measure suggestions assist creation of DAX measures using natural language instead of using templates or writing DAX from scratch. -:::image type="content" source="media/quick-measure-suggestions/dax-measure-suggestion.png" alt-text="Screenshot of an example of a DAX measure suggestion."::: +# Quick measure suggestions +Quick measure suggestions assist in the creation of DAX measures using natural language instead of using templates or writing DAX from scratch. 
This feature can be used to jump-start creation of common DAX measure scenarios such as: + - Aggregated columns (Optional filters) - Count of rows (Optional filters) - Aggregate per category @@ -31,12 +32,15 @@ This feature can be used to jump-start creation of common DAX measures scenarios - Top N values for a category - Information functions +:::image type="content" source="media/quick-measure-suggestions/dax-measure-suggestion.png" alt-text="Screenshot of an example of a DAX measure suggestion."::: + ## Enable measure suggestions -To enable the feature, you will need to first navigate to the **Options** menu of Power BI Desktop and turn on the preview switch for **Quick measure suggestions**: + +To enable the feature, you'll need to first navigate to the **Options** menu of Power BI Desktop and turn on the preview switch for **Quick measure suggestions**: :::image type="content" source="media/quick-measure-suggestions/enable-preview.png" alt-text="Screenshot of how to enable preview from the options menu of Power BI Desktop."::: -After you have enabled the feature, you can access the Quick measure suggestions, by launching Quick measure from the Home or Modeling tab of the ribbon and selecting **Suggestions**: +After you enable the feature, you can access the Quick measure suggestions by launching Quick measure from the Home or Modeling tab of the ribbon and selecting **Suggestions**: :::image type="content" source="media/quick-measure-suggestions/suggestions-tab.png" alt-text="Screenshot of how to access the feature from the suggestions tab of the Quick measure pane."::: @@ -44,171 +48,214 @@ Here you can describe the measure you want to create and hit **Generate** (or en :::image type="content" source="media/quick-measure-suggestions/dax-measure-suggestion.png" alt-text="Screenshot of an example of a DAX measure suggestion."::: -You should always validate the DAX suggestions to make sure they meet your needs. 
If you’re satisfied with a suggested measure, you can click the **Add** button to automatically add the measure to your model. +You should always validate the DAX suggestions to make sure they meet your needs. If you’re satisfied with a suggested measure, you can select the **Add** button to automatically add the measure to your model. ## Natural language examples -To help demonstrate the feature here are some natural language examples for each of the supported measure scenarios. -### Aggregated columns + +To help demonstrate the feature, here are some natural language examples for each of the supported measure scenarios. + +## Aggregated columns + Apply aggregations to a column to return a single value. Our supported aggregations include sum, count, distinct count, distinct count no blanks, average, min, max, median, variance, and standard deviation. Examples: -- Show me sum of sales -- Get total sales -- Count products -- How many products are there -- Unique users -- Distinct count of users no blanks -- Get the number of unique users and exclude blanks -- What is the max price -- Median age - -#### Optional filters + +- Show me sum of sales +- Get total sales +- Count products +- How many products are there +- Unique users +- Distinct count of users no blanks +- Get the number of unique users and exclude blanks +- What is the max price +- Median age + +## Optional filters for aggregated columns + For aggregated columns, you can also specify one or more filter conditions. If there are multiple filter conditions, you can specify if you want an intersection (&&/AND) or union (||/OR) of the filters. 
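For illustration, a request such as "Calculate sales where Product is Word and Region is North" might be translated into CALCULATE-based measures along these lines. This is a sketch only, not guaranteed feature output; the `Sales` table and its column names are assumptions:

```dax
// Intersection (&&/AND) of two filter conditions:
Filtered Sales AND =
CALCULATE (
    SUM ( Sales[Sales Amount] ),
    Sales[Product] = "Word",
    Sales[Region] = "North"
)

// Union (||/OR) of the same conditions:
Filtered Sales OR =
CALCULATE (
    SUM ( Sales[Sales Amount] ),
    FILTER ( Sales, Sales[Product] = "Word" || Sales[Region] = "North" )
)
```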
Examples: -- How many customers in London -- Total sold units in 2022 -- Calculate sales where Product is Word and Region is North -- Sales where Product is Word or Region is North -- Sales filtered to Product is Word && Region is North -- Sales for Product is Word || Region is North - -### Count of rows + +- How many customers in London +- Total sold units in 2022 +- Calculate sales where Product is Word and Region is North +- Sales where Product is Word or Region is North +- Sales filtered to Product is Word && Region is North +- Sales for Product is Word || Region is North + +## Row counts + Count the number of records in the specified table. You don’t need to specify the table if there is only one table. Examples: + - Count records of sales table -- Count sales table -- Sales table row count -- Count rows of sales table + +- Count sales table +- Sales table row count +- Count rows of sales table + +## Optional filters for row counts -#### Optional filters For row counts, you can also specify one or more filter conditions. If there are multiple filter conditions, you can specify if you want an intersection (&&/AND) or union (||/OR) of the filters. Examples: - +- Count rows of sales table where Product is Word and Region is North +- Count of sales table where Product is Word or Region is North - Count record of sales table filtered to Product is Word && Region is North - Get the row count of sales table for Product is Word || Region is North -### Aggregate per category +## Aggregate per category + Compute a measure for each distinct value in a category and then aggregate the results to return a single value. Our supported aggregates include average, weighted average, min, max, variance. 
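A request like "Average sales per store" might correspond to an iterator over the distinct category values, roughly as below. The table and column names are assumed for the sketch:

```dax
// Total sales per store, then the average of those per-store totals.
Average Sales per Store =
AVERAGEX (
    VALUES ( Sales[Store] ),
    CALCULATE ( SUM ( Sales[Sales Amount] ) )
)
```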
-Examples: -- Average sales per store -- Average score per category weighted by priority -- Min score per product -- Max units per store +Examples: + +- Average sales per store +- Average score per category weighted by priority +- Min score per product +- Max units per store + +## Mathematical operations -### Mathematical operations Perform mathematical operations with numeric columns, measures, or aggregated columns. For scenarios across columns within a table, you can either average (AVERAGEX) or sum up (SUMX) the result in order to return a single value. -Examples: -- Sales - Cogs -- Sales minus Cogs -- Sales divided by target revenue times 100 -- Sales / target revenue * 100 -- EU Sales + JP Sales + NA Sales -- For each row in Sales table calculate Price * Units and sum up the result -- For each row in Sales table sum up Price * Units -- For each row in Sales table calculate Price * Discount and then get the average -- For the Sales table get the average of Price * Discount +Examples: -### Selected value -Get the selected value of a column. This is typically used when paired with a single-select slicer or filter so that the measure will return a non-blank value. +- Sales - Cogs +- Sales minus Cogs +- Sales divided by target revenue times 100 +- Sales / target revenue * 100 +- EU Sales + JP Sales + NA Sales +- For each row in Sales table calculate Price * Units and sum up the result +- For each row in Sales table sum up Price * Units +- For each row in Sales table calculate Price * Discount and then get the average +- For the Sales table get the average of Price * Discount + +## Selected value + +Get the selected value of a column. This pattern is typically used when paired with a single-select slicer or filter so that the measure returns a non-blank value. Examples: -- What is the selected product -- Which product is selected -- Selected value for product -### If condition -Return values based on conditions. 
If you are returning string values, you will need to use double quotes. Conditions can use the following comparison operators: =, ==, <>, <, >, <=, >= +- What is the selected product +- Which product is selected +- Selected value for product + +## If condition + +Return values based on conditions. If returning string values, use double quotes. Conditions can use the following comparison operators: =, ==, <>, <, >, <=, >= Examples: -- If sales > 10,000 return "high sales" else "low sales" -- If sales are greater than 10,000 display "high sales" otherwise display "low sales" -- If selected value for product is blank, display "no product selected" else show selected product -- If selected product = Power BI, show "PBI" else "other" -### Text operations +- If sales > 10,000 return "high sales" else "low sales" +- If sales are greater than 10,000 display "high sales" otherwise display "low sales" +- If selected value for product is blank, display "no product selected" else show selected product +- If selected product = Power BI, show "PBI" else "other" + +## Text operations + Perform text operations with columns, measures, or aggregated columns. For scenarios across columns within a table, we’ll merge (CONCATENATEX) the result in order to return a single value. 
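As a sketch of the text-operations scenario, a request such as "For each row in Geography Dim table concatenate State & ", " & City and combine the result" could produce a CONCATENATEX expression like the following (table and column names assumed):

```dax
// Build one string per row, then merge all rows into a single value.
All Locations =
CONCATENATEX (
    'Geography Dim',
    'Geography Dim'[State] & ", " & 'Geography Dim'[City],
    "; "
)
```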
Examples: -- "The selected product is " & selected product -- Display "The selected product is " concatenated with the selected product -- Header_measure & " - " & Subheader_measure -- For each row in Geography Dim table concatenate State & ", " & City and combine the result -- For each row in Geography Dim table get State & ", " & City and merge -### Time intelligence +- "The selected product is " & selected product +- Display "The selected product is " concatenated with the selected product +- Header_measure & " - " & Subheader_measure +- For each row in Geography Dim table concatenate State & ", " & City and combine the result +- For each row in Geography Dim table get State & ", " & City and merge + +## Time intelligence + These time intelligence scenarios require using a properly marked date table or auto date/time hierarchy. For YTD scenarios you can specify "fiscal" or "fiscal calendar" to base the calculation on the fiscal calendar (ends on June 30th). Examples: -- YTD sales -- Sales fiscal YTD -- Get the sales year to date -- Sales MTD -- Quarter to date sales -- YTD sales for US and Canada -- Change of sales from the previous year -- Sales YoY change -- Month over month change for sales -- Sales QoQ Percent change -- Sales for the same period last year -- Sales for the same period last month -- 28 day rolling average sales -- 28 – day rolling avg sales - -### Relative time filtered value + +- YTD sales +- Sales fiscal YTD +- Get the sales year to date +- Sales MTD +- Quarter to date sales +- YTD sales for US and Canada +- Change of sales from the previous year +- Sales YoY change +- Month over month change for sales +- Sales QoQ Percent change +- Sales for the same period last year +- Sales for the same period last month +- 28 day rolling average sales +- 28 – day rolling avg sales + +## Relative time filtered value + Apply a relative time filter that filters your measure or aggregated column to the last N hours / days / months / years. 
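A relative time request such as "Total sales for the last 6 months" might be expressed with DATESINPERIOD, as in this sketch (a marked date table named 'Date' and a Sales table are assumptions):

```dax
Sales Last 6 Months =
CALCULATE (
    SUM ( Sales[Sales Amount] ),
    DATESINPERIOD ( 'Date'[Date], MAX ( 'Date'[Date] ), -6, MONTH )
)
```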
Examples: -- Unique users in the last 4 hours -- Unique users in the last 5 days -- Total sales for the last 6 months -- Total sales for the last 2 years -### Most / least common value +- Unique users in the last 4 hours +- Unique users in the last 5 days +- Total sales for the last 6 months +- Total sales for the last 2 years + +## Most / least common value + Return the value with the most or least number of occurrences in a specified column. Examples: -- Most common value in Product -- Which value in Product is most common -- What is the most common value in Product -- Which value in Product is least common -- What is the least common value in Product -### Top N filtered value +- Most common value in Product +- Which value in Product is most common +- What is the most common value in Product +- Which value in Product is least common +- What is the least common value in Product + +## Top N filtered value + Compute a measure or aggregated column that is filtered to the top N categorical values based on that same measure or aggregated column. Examples: -- Total sales for the top 3 products -- Sum of sales filtered to the top 3 products -- Average score for the top 5 students -- Avg score filtered to the top 5 students -### Top N values for a category +- Total sales for the top 3 products +- Sum of sales filtered to the top 3 products +- Average score for the top 5 students +- Avg score filtered to the top 5 students + +## Top N values for a category + Get a concatenated list of the top N values within a column based on a measure or aggregated column. Examples: -- Top 3 products with the most total sales -- Top 3 products by sales -- What are the top 3 products in sales -### Information functions +- Top 3 products with the most total sales +- Top 3 products by sales +- What are the top 3 products in sales + +## Information functions + Return system or user information such as the current date/time or the current user's email, domain, or username. 
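The information-function requests map onto DAX functions such as TODAY, NOW, USERNAME, and USERPRINCIPALNAME. For example (an illustrative sketch, not feature output):

```dax
Todays Date = TODAY ()

// USERNAME returns domain\username; USERPRINCIPALNAME
// typically returns the user principal name (usually the email).
Current User = USERNAME ()
Current UPN = USERPRINCIPALNAME ()
```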
Examples: + - Today's date -- Now -- Return the current user email -- Return the current domain name and username -- Return the current user’s domain login +- Now +- Return the current user email +- Return the current domain name and username +- Return the current user’s domain sign-in ## Limitations and considerations -- Quick measure suggestions are NOT a replacement for learning DAX. The suggestions provided by the feature are meant to help fast track measure creation; however, you will still need to validate the DAX suggestions because they can be wrong or not match your intent. + +The following are limitations and considerations: + +- Quick measure suggestions are NOT a replacement for learning DAX. The suggestions provided by the feature are meant to help fast-track measure creation; however, you still need to validate the DAX suggestions because they can be wrong or not match your intent. - The feature isn't supported for LiveConnect data models. -- The feature is powered by a machine learning model that is currently only deployed to US datacenters (East US and West US). If your data is outside the US, the feature will be disabled by default unless your tenant admin enables **Allow user data to leave their geography tenant setting**: - +- The feature is powered by a machine learning model that is currently only deployed to US datacenters (East US and West US). 
If your data is outside the US, the feature is disabled by default unless your tenant admin enables the **Allow user data to leave their geography** tenant setting: + :::image type="content" source="media/quick-measure-suggestions/quick-measure-suggestions-admin-setting.png" alt-text="Screenshot of the admin setting for measure suggestions."::: + +## Related content + +You might also be interested in the following articles: + +- [Use quick measures for common calculations](desktop-quick-measures.md) +- [Create calculated columns in Power BI Desktop](desktop-calculated-columns.md) +- [Create calculated tables in Power BI Desktop](desktop-calculated-tables.md)