Using Power BI across tenants
Our company has two Azure tenants. The first tenant has a Power BI Pro license, and the second tenant's Data Lake contains data we want to visualize with Power BI. To visualize the data in the second tenant, can we use the Power BI in the first tenant by way of the on-premises data gateway? Thank you in advance.
Use ADF to load a file in Teams into ADL
Is there a way to load an xlsx file in Teams into Azure Data Lake using Azure Data Factory? Thanks, Geert
Azure Data Lake Gen2 - Use Case Advice
I am collecting weather data (history and forecast) from a third-party web service. Since there will be a lot of data, and it will not have high use, I was planning to use Azure Data Lake Gen2 with blob storage, and to store the data in JSON files. My…
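For this kind of append-heavy, low-read workload, a date-partitioned folder layout keeps listing and pruning cheap. A minimal sketch of one possible convention; the names (`weather`, `history`, the city value) are hypothetical, not from the post:

```python
import json
from datetime import date

def blob_path(kind: str, city: str, day: date) -> str:
    """Build a date-partitioned path so old data stays cheap to list and prune."""
    return (f"weather/{kind}/city={city}/"
            f"year={day.year}/month={day.month:02d}/day={day.day:02d}.json")

record = {"city": "oslo", "temp_c": 4.2, "observed": "2021-03-15T12:00:00Z"}
path = blob_path("history", "oslo", date(2021, 3, 15))
payload = json.dumps(record)
print(path)  # weather/history/city=oslo/year=2021/month=03/day=15.json
```

Partition keys in the path (`city=…/year=…`) also let query engines such as Synapse serverless or Spark prune files later without opening them.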
Error while loading data from Azure Data Lake Gen2 to Azure Synapse Analytics using Azure Data Factory
Hi All, I am getting an error while loading data from Azure Data Lake Gen2 to Azure Synapse Analytics using Azure Data Factory. I am unable to tell what exactly the error is. What permission has to be provided, and how? This error occurs when I use PolyBase…
Data Factory Copy activity failing with Error code 2200 - Operation failed as split count exceeding upper bound of 1000000
Hi, I am copying large data from ADLS Gen1 to ASDW (Azure Synapse Analytics) using Data Factory (PolyBase set to yes, with blob storage settings specified). The source data is in *.parquet format and is partitioned. The Copy activity fails. In the first attempt…
HDInsight Azure ADLS Gen2 'InternalServerError' ARM template deployment
Creating an Azure HDInsight Spark cluster with ADLS Gen2 and a user-assigned managed identity with the Storage Blob Data Owner role. The MSI role was successfully assigned to storage, but the HDInsight deployment fails with an internal server error. There's some issue with…
Getting the size of each folder in containers in ADLS Gen2
Hi, I need to get the length of folders and files in ADLS Gen2 using PowerShell. My requirement is like this: I will pass a container name, then it should recursively show the length of files and all folders, and it should iterate through all folders…
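The asker wants PowerShell, but the core of the problem is the roll-up step: given a recursive listing of file paths and sizes (for example what `Get-AzDataLakeGen2ChildItem -Recurse` returns), sum each file into every ancestor folder. An illustrative Python sketch of just that aggregation, with hypothetical paths:

```python
from collections import defaultdict

def folder_sizes(files):
    """Roll each file's size up into every ancestor folder's total.

    `files` is an iterable of (path, size_in_bytes) pairs from a
    recursive listing of the container.
    """
    totals = defaultdict(int)
    for path, size in files:
        parts = path.split("/")[:-1]              # drop the file name itself
        for depth in range(1, len(parts) + 1):
            totals["/".join(parts[:depth])] += size
    return dict(totals)

listing = [("raw/2021/a.csv", 100), ("raw/2021/b.csv", 50), ("raw/2022/c.csv", 7)]
sizes = folder_sizes(listing)
print(sizes)  # {'raw': 157, 'raw/2021': 150, 'raw/2022': 7}
```

The same loop translates directly to PowerShell with a hashtable keyed by folder path.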
How to get list of child files/directories having parent DataLakeDirectoryClient class instance
Hello All! I have to scan the whole data lake file system. I have code like: PagedIterable<PathItem> pItems = ((DataLakeFileSystemClient)prmParent).listPaths(); for( PathItem pItem : pItems ){ if( pItem.isDirectory() ){ …
Unable to add service principal, groups to the $logs container in ADLS Gen2
Recently enabled storage analytics on an ADLS Gen2 storage account. I can see the $logs container and the logs are written to it on an hourly basis. But when I try to add a service principal to this container, I get permission denied. I'm able to…
Data Masking in Azure Data Factory
We are using Azure Data Factory to move data from sources like Azure SQL and Azure Postgres to Azure Data Lake as the destination. There is some sensitive data which needs to be masked. Is it possible to do data masking in Azure Data Factory during…
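One common pattern when the pipeline itself lacks a masking transform is to replace sensitive columns with deterministic, irreversible tokens (e.g. in a Data Flow expression or a pre-load step), so joins on the masked column still work. A sketch of the hashing idea only; the column names and key are hypothetical, and a real key should live in Key Vault:

```python
import hashlib
import hmac

MASK_KEY = b"rotate-me"   # hypothetical secret; store and rotate it in Key Vault

def mask(value: str) -> str:
    """Deterministic, irreversible token: equal inputs map to equal tokens."""
    return hmac.new(MASK_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

row = {"id": 1, "email": "a@b.com", "city": "Oslo"}
masked = {**row, "email": mask(row["email"])}
```

Keyed HMAC (rather than a bare hash) matters here: without the secret key, common values like known email addresses could be re-identified by hashing guesses.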
Getting data from Azure Data Lake through an API
Hi Team, Looking for any API available to get data from Azure Data Lake directly, without using any data transformation tools like pipelines/dataflows. CSV or JSON files sit in data lake storage folders, so the same data or files can be fetched…
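ADLS Gen2 does expose a REST API directly on the `dfs` endpoint (the "Path - Read" operation), so no pipeline or dataflow is needed just to fetch a file. A sketch of building the request URL; the account, filesystem, and path below are hypothetical placeholders:

```python
def read_url(account: str, filesystem: str, path: str) -> str:
    """URL for the ADLS Gen2 'Path - Read' REST operation (GET returns file bytes)."""
    return f"https://{account}.dfs.core.windows.net/{filesystem}/{path}"

url = read_url("mylake", "raw", "sales/2021/jan.csv")
# Authenticate with an AAD bearer token (or append a SAS token to the URL):
# requests.get(url, headers={"Authorization": f"Bearer {token}"})
```

Alternatively, the `azure-storage-file-datalake` SDK wraps the same REST calls if a library dependency is acceptable.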
My partitions are returning inconsistent results (Mapped JSON file data)
I have a set of tables based on a number of JSON files taken from a Data Lake Gen2 container. I'm building an external table using data derived from JSON mapping, then using derived columns based on those columns to generate an identifier. When I go to…
Azure Data Lake Gen2 PUT authorization
I'm trying to create a Shared Access Signature client-side in my Node app. The reason is that I do not want to stream files through my app: I want the user to be able to upload a file directly to my Azure Data Lake Gen2 Blob Storage container. I have…
Writing parquet file throws…An HTTP header that's mandatory for this request is not specified
I have two ADLS Gen2 storage accounts, both hierarchical-namespace enabled. In my Python notebook, I'm reading a CSV file from one storage account and, after some enrichment, writing it as a parquet file to the other storage account. I am getting the below error when…
Where to store delta files
Hey guys! I have been wondering about the question below for a while and I hope you can help me get a good night's sleep again. Question: When you are working in Azure Databricks to do transformations, you can save the results using the delta format.…
Error Setting Access Rights on ADL (Gen1) User Folders During HDInsight Cluster Creation
Hello All, The issue occurs when the cluster is created. One of the last operations that Ambari performs is 'Post user creation hook for 1 users'. This fails with the attached errors. Here is some more information: it's an ESP cluster -- we…
Can't connect to our ADLS Gen2 in Power BI Online (scheduled refresh)
Hi all, we have recently set up an ADLS Gen2 account and we want to update our reports automatically. In Power BI Desktop everything works, but online I get the following error: Failed to update data source credentials: The credentials provided…
Mounting an entire ADLS account on Azure Databricks
Hi, I want to mount an entire ADLS storage account on Databricks. I've checked the documentation: I can mount a single filesystem at a time, but I want to mount the entire ADLS account on Databricks. I have around 70 containers in my ADLS account and I want to mount all of them…
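Since `dbutils.fs.mount` takes one container (filesystem) at a time, mounting "all of ADLS" in practice means looping over the container names. A sketch of generating the mount pairs; the account name, container list, and OAuth `extra_configs` are hypothetical:

```python
def mount_points(account: str, containers):
    """Pair each container's abfss source URI with a /mnt/<container> target."""
    return [(f"abfss://{c}@{account}.dfs.core.windows.net/", f"/mnt/{c}")
            for c in containers]

pairs = mount_points("mylake", ["bronze", "silver"])
# In a Databricks notebook, assuming `configs` holds the OAuth settings:
# for source, target in pairs:
#     dbutils.fs.mount(source=source, mount_point=target, extra_configs=configs)
```

With ~70 containers this loop is still a one-off cell; the alternative is to skip mounts entirely and read `abfss://` URIs directly with credential passthrough or a service principal.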
ADLS Availability Alerts
Hi Team, I have set alerts on my ADLS Gen2 such that if availability is less than 80 percent it should send an alert notification. My question: this availability is Microsoft's responsibility, right? How do I find out the root cause of why this availability…
Copy different types of files from Azure Data Lake Gen1 to Gen2 with attributes (like last updated)
I need to migrate all my data from Azure Data Lake Gen1 to Gen2. In my lake we have different types of files mixed together (.txt, .zip, .json and many others). We want to move them as-is to the Gen2 lake. Along with that, we also want to maintain the last updated…