I'm glad you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same problem can easily reference it! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.
Ask: When using the dataset in a dataflow, I also need to set the parameters when including the dataflow in a pipeline or when starting a debug session.
When I try to fetch a Data Preview in the Source section of the dataflow, I get this error: `at Source '***file': org.apache.hadoop.fs.azure.AzureException: No credentials found for account ***-files in the configuration, and its container $root is not accessible using anonymous credentials. Please check if the container exists first. If it is not publicly available, you have to provide account credentials.`
When I trigger the pipeline, I get the same error.
The linked service has the correct authentication settings for the storage account. The dataset works well in pipeline activities but fails in the dataflow.
Solution: The problem was how the parameters of your dynamic dataset map to the storage container.
Wrong:
When you fill in the parameters like this:
- Path: `container/path`
- File: `file.xml`
…the dataset will work in your pipelines but will fail in your dataflow with the credentials error mentioned above.
Correct:
- Path: `container`
- File: `path/file.xml`
This way, the dataflow also works with the credentials of the linked service. The dataflow runtime resolves storage credentials per container, so the Path value must contain only the container name; when a folder path is appended after a slash, the lookup fails and the runtime falls back to the `$root` container, which is why the error message mentions `$root`.
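For reference, here is a minimal sketch of what such a parameterized dataset could look like in JSON. The dataset name `DynamicXmlDataset`, the linked service name `MyStorageLinkedService`, and the `Xml` dataset type are illustrative assumptions, not taken from the original post (JSON does not allow comments, so all caveats live here):

```json
{
    "name": "DynamicXmlDataset",
    "properties": {
        "type": "Xml",
        "linkedServiceName": {
            "referenceName": "MyStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "Path": { "type": "string" },
            "File": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": {
                    "value": "@dataset().Path",
                    "type": "Expression"
                },
                "fileName": {
                    "value": "@dataset().File",
                    "type": "Expression"
                }
            }
        }
    }
}
```

With this mapping, you would pass `Path = container` and `File = path/file.xml`: the full blob path is assembled from the container and file name fields, and the dataflow can look up the linked service's credentials using the plain container name.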
If I missed anything, please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.
Please don’t forget to Accept Answer and mark Yes for "was this answer helpful" wherever the information provided helps you; this can be beneficial to other community members.