Filtering data for last 24 months in Mapping Data Flows
How can I filter data using Mapping Data Flows to keep only the last 24 months, starting from the max date and going back? The date column is in MM/dd/yyyy format.
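In Mapping Data Flows itself, one common approach is to parse the column with `toDate(col, 'MM/dd/yyyy')`, derive the maximum date with an Aggregate or Window transformation, and then filter with `date >= addMonths(maxDate, -24)`. The filter logic can be sketched in plain Python; the column name and sample values below are placeholders:

```python
import calendar
from datetime import datetime

def months_back(d, n):
    """Return the date n months before d, clamping the day to the target month's length."""
    total = d.year * 12 + (d.month - 1) - n
    year, month = divmod(total, 12)
    month += 1
    day = min(d.day, calendar.monthrange(year, month)[1])  # e.g. Mar 31 -> Feb 28
    return d.replace(year=year, month=month, day=day)

def filter_last_24_months(rows, date_key="order_date"):
    """Keep rows whose MM/dd/yyyy date falls within 24 months of the max date."""
    parsed = [(r, datetime.strptime(r[date_key], "%m/%d/%Y")) for r in rows]
    max_date = max(d for _, d in parsed)
    cutoff = months_back(max_date, 24)
    return [r for r, d in parsed if d >= cutoff]

rows = [
    {"order_date": "01/15/2020"},
    {"order_date": "06/30/2023"},
    {"order_date": "03/12/2024"},
]
print(len(filter_last_24_months(rows)))  # the 2020 row falls outside the window -> 2
```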
Azure Synapse Link with F&O missing tables
Hello, I'm facing a problem with Synapse Link. I did the following: connected the F&O environment with Power Apps and established an incremental link with the data lake. I can see D365 Finance and Operations in the list of tables under Manage tables of the link. My problem…
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a lakehouse, but I didn't find any problems with the raw data.
How to copy blobs/files on Azure Data Lake Gen2 using Python
Hello: I am trying to use Python from my desktop to copy blobs/files between containers (folders) on Azure Data Lake Gen2. I know I have access to the folders, and I know my key is correct because I have another script that allows me to upload the files…
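Since an ADLS Gen2 account also exposes the Blob endpoint, one way to do this is a server-side copy with the `azure-storage-blob` SDK's `start_copy_from_url`. A minimal sketch, assuming placeholder account, container, and path names; the SDK import is deferred into the function so the URL helper works even without the package installed:

```python
def source_blob_url(account_name: str, container: str, blob_path: str) -> str:
    """Build the HTTPS URL that the server-side copy reads from."""
    return f"https://{account_name}.blob.core.windows.net/{container}/{blob_path}"

def copy_blob(conn_str, account_name, src_container, src_path, dst_container, dst_path):
    """Kick off an asynchronous server-side copy between containers in one account."""
    from azure.storage.blob import BlobServiceClient  # deferred: needs azure-storage-blob

    service = BlobServiceClient.from_connection_string(conn_str)
    dst = service.get_blob_client(container=dst_container, blob=dst_path)
    # For a private source blob, append a SAS token to the source URL.
    dst.start_copy_from_url(source_blob_url(account_name, src_container, src_path))

print(source_blob_url("myaccount", "raw", "in/data.csv"))
# https://myaccount.blob.core.windows.net/raw/in/data.csv
```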
Is the data lake going to be replaced by Delta Lake?
Hi friends, recently I have seen an architecture that uses Delta Lake for the bronze, silver, and gold layers and ADF as the overall ingestion and movement service. Data validation is mostly done using Azure Databricks in the silver layer, and each layer was…
BlobModifiedWhileReading exception when writing to Azure Data Lake Storage from multiple k8s pods
Hello, we are using Azure Data Lake Storage Gen2 for storing data consumed and processed in our microservices. The data written is in JSONL format, i.e. JSON messages separated by newline characters. The data is being stored in a blob file which is…
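`BlobModifiedWhileReading` generally means the blob changed underneath a reader mid-download, which is easy to trigger when several pods write to the same blob. Common mitigations are one blob per pod (compacted later), append blobs, or ETag-conditioned writes with retry. The retry pattern can be sketched against an in-memory stand-in; `FakeBlob` and `PreconditionFailed` below are stand-ins, not SDK types:

```python
class PreconditionFailed(Exception):
    """Stand-in for the HTTP 412 the storage service returns on an ETag mismatch."""

class FakeBlob:
    """In-memory stand-in for a JSONL blob guarded by an ETag."""
    def __init__(self):
        self._data = ""
        self._etag = 0
    def read(self):
        return self._data, self._etag
    def write(self, data, if_match):
        if if_match != self._etag:
            raise PreconditionFailed()
        self._data = data
        self._etag += 1

def append_line(blob, line, max_retries=5):
    """Optimistic-concurrency append: re-read and retry on ETag mismatch."""
    for _ in range(max_retries):
        data, etag = blob.read()
        try:
            blob.write(data + line + "\n", if_match=etag)
            return True
        except PreconditionFailed:
            continue  # another writer won the race; re-read and retry
    return False

blob = FakeBlob()
append_line(blob, '{"msg": 1}')
append_line(blob, '{"msg": 2}')
print(blob.read()[0])
```

With the real SDK the equivalent condition is passed via the `etag`/`match_condition` arguments on the write call.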
I get the following error while Sinking data in Azure: ErrorCode: 'EndpointUnsupportedAccountFeatures'. Message: 'This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint.
When I try to select the file path under Linked service while sinking the data (ADLS Gen2), I get this error: "ErrorCode: 'EndpointUnsupportedAccountFeatures'. Message: 'This endpoint does not support BlobStorageEvents or SoftDelete. Please disable…
Getting "Failure happened on 'Source' side. ErrorCode=UserErrorOdbcOperationFailed"
I am getting this error while performing incremental data loading from an Oracle database to Azure Data Lake Storage: Failure happened on 'Source' side.…
Azure Data Lake compute engine
Hi friends, I have read that a data lake is simply storage: unlike a relational data warehouse, there is no compute engine associated with it. But in the Azure Data Lake documentation I read that it has Hadoop and Spark compute, so please help me to…
How to create log file for pipeline execution in Azure Synapse Analytics?
What are the steps to create a log file for the execution of the pipeline in Azure Synapse Analytics? I am looking to include specific columns, such as pipeline_run_id, pipeline status, pipeline start time, pipeline end time, pipeline duration, rows…
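One common pattern (a sketch, not the only way): end the pipeline with a Copy or Script activity that writes pipeline system variables to a log file or table. Assuming a copy activity named `CopyData` (a placeholder name), the logged fields might map to expressions roughly like:

```json
{
  "pipeline_run_id": "@{pipeline().RunId}",
  "pipeline_name": "@{pipeline().Pipeline}",
  "pipeline_start_time": "@{pipeline().TriggerTime}",
  "pipeline_end_time": "@{utcnow()}",
  "pipeline_duration_sec": "@{div(sub(ticks(utcnow()), ticks(pipeline().TriggerTime)), 10000000)}",
  "rows_copied": "@{activity('CopyData').output.rowsCopied}"
}
```

Note that a pipeline cannot log its own final status from inside itself; capturing overall status usually requires a failure-path activity or a separate "watcher" pipeline querying the run history.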
Identify the user who acquired a lease over a file in ADLS
Hi, I have two queries below. Is there a way to identify the "last modified by" user for a file stored in ADLS? Since we can easily see the last modified date, is there a mechanism to add "last modified by" as well so that it is easier to…
Upsert load into a data lake through ADF
What is the best way to design an upsert load into a data lake? I can do upserts in a DW/DB easily but need more understanding of how to achieve this in a data lake, since the data has a file structure. My source is SQL Server (DB); ingestion will be done through ADF…
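In a file-based lake there is no in-place UPDATE, so the usual choices are (a) a Delta Lake sink in an ADF data flow with an Alter Row transformation (upsert on a key), or (b) partition rewrite: read the affected files, merge the incoming batch by key, and write the result back. The merge step can be sketched in plain Python; the record shape and key name are illustrative:

```python
def upsert(existing, incoming, key="id"):
    """Merge incoming records into existing ones by key: update matches, insert the rest."""
    merged = {r[key]: r for r in existing}
    merged.update({r[key]: r for r in incoming})
    return sorted(merged.values(), key=lambda r: r[key])

current = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
batch   = [{"id": 2, "name": "B"}, {"id": 3, "name": "c"}]
print(upsert(current, batch))
# [{'id': 1, 'name': 'a'}, {'id': 2, 'name': 'B'}, {'id': 3, 'name': 'c'}]
```

A Delta sink makes this transactional; with plain Parquet/CSV files you have to rewrite whole files or partitions yourself.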
How long does the Azure Fluid Relay Service store customer data?
The Azure Fluid Relay Service documentation talks about data storage; however, there isn't any mention of how long the service stores customer data.
ADLS Gen2 with hierarchical namespace is not allowing failover
I have created an ADLS Gen2 account with hierarchical namespace enabled, and it's not allowing failover; I need to do failover testing.
Copy Activity error in Synapse Pipelines
I am encountering an error when attempting to copy data to SQL Server from a JSON file located in the data lake. Specifically, we've encountered the following error: { "errorCode": "2200", "message":…
Architecture design tool and implementation
Hi friends, I have to design and work on some PoCs and RFPs. I have looked at some tools such as draw.io, but it is extremely inflexible and it is hard to get Azure icons as per requirements; the icons provided by Azure are small, and how do I resize them? This task…
Copy Data activity from Synapse serverless SQL pool to ADLS Gen2
Hi, I have a Synapse serverless SQL pool configured against an ADLS Gen2 account where the container is filled with files from D365 using Synapse Link. I need to build a data warehouse in Azure. We have very tight restrictions on budget and don't want to go…
Create job_execution_id and get pipeline_run_id in the data flow
I am creating a file in the data flow and want to create a column for job_execution_id so that every time the job runs I get a unique id for the file, and I also want to get the pipeline_run_id on every execution. Sample job_execution_id and…
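A common pattern (a sketch; the parameter name is a placeholder): pass the run id from the pipeline into a data flow parameter, then emit it with a Derived Column. In the Execute Data Flow activity, the parameter assignment looks roughly like this, with the single quotes present because a string parameter value is interpreted as a data flow expression:

```json
"parameters": {
  "pipeline_run_id": {
    "value": "'@{pipeline().RunId}'",
    "type": "Expression"
  }
}
```

Inside the data flow, a Derived Column can then set a column to `$pipeline_run_id`; for a per-execution `job_execution_id`, the data flow `uuid()` function (or a concatenation of the run id and a timestamp) is a common choice.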
Read data from SQL Server and update via Synapse Spark pool
In the following manner I am reading a table from SQL Server using a Synapse Spark pool. How can I update the data from the Synapse Spark pool, e.g. update a column?
Error consuming Azure AutoML time series forecasting model in Power BI.
I've deployed a time series forecasting model using Azure AutoML, but when attempting to consume it in Power BI I encounter an error (screenshots attached: Screenshot 2024-03-12 102758.png, Model.png, error.png, and the data in data1.png). The error message doesn't provide…