The Save-AzDataFactoryLog cmdlet downloads log files associated with Azure HDInsight processing of Pig or Hive projects, or with custom activities, to your local hard drive.
First run the Get-AzDataFactoryRun cmdlet to get the ID of an activity run for a data slice, and then use that ID to retrieve the log files from the binary large object (Blob) storage associated with the HDInsight cluster.
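As a sketch of that workflow (the resource group, data factory, dataset, and timestamp below are hypothetical, and running it requires an Azure subscription with the Az.DataFactory module):

```powershell
# Hypothetical names: resource group "ADF", data factory "LogProcessingFactory",
# dataset "EnrichedGameEventsTable". Requires a signed-in Azure session.
$run = Get-AzDataFactoryRun -ResourceGroupName "ADF" `
    -DataFactoryName "LogProcessingFactory" `
    -DatasetName "EnrichedGameEventsTable" `
    -StartDateTime "2023-05-21T16:00:00Z"

# The Id property of the returned run is what Save-AzDataFactoryLog expects.
$run.Id
```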
If you do not specify the DownloadLogs parameter, the cmdlet returns only the location of the log files.
If you specify DownloadLogs without an output folder (Output parameter), the log files are downloaded to the default Documents folder.
If you specify DownloadLogs together with an output folder (Output), the log files are downloaded to that folder.
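For instance, to retrieve only the log files' location without downloading anything, omit DownloadLogs (a minimal sketch; the run ID and names here are hypothetical):

```powershell
# Without -DownloadLogs, only the Blob storage location of the logs is returned.
Save-AzDataFactoryLog -ResourceGroupName "ADF" `
    -DataFactoryName "LogProcessingFactory" `
    -Id "841b77c9-d56c-48d1-99a3-8c16c3e77d39"
```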
Example 1: Save log files to a specified folder
This command saves the log files for the activity run with ID 841b77c9-d56c-48d1-99a3-8c16c3e77d39, where the activity belongs to a pipeline in the data factory named LogProcessingFactory in the resource group named ADF. The log files are saved to the C:\Test folder.
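A command matching this description might look like the following (a sketch assuming the standard Az.DataFactory parameter names):

```powershell
# -DownloadLogs fetches the files; -Output directs them to C:\Test.
Save-AzDataFactoryLog -ResourceGroupName "ADF" `
    -DataFactoryName "LogProcessingFactory" `
    -Id "841b77c9-d56c-48d1-99a3-8c16c3e77d39" `
    -DownloadLogs `
    -Output "C:\Test"
```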
Example 2: Save log files to default Documents folder
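A hedged sketch of such a command, reusing the run ID from Example 1, would be:

```powershell
# Omitting -Output saves the logs under a subfolder of the default Documents folder.
Save-AzDataFactoryLog -ResourceGroupName "ADF" `
    -DataFactoryName "LogProcessingFactory" `
    -Id "841b77c9-d56c-48d1-99a3-8c16c3e77d39" `
    -DownloadLogs
```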
Indicates that this cmdlet downloads the log files to your local computer.
If the Output folder is not specified, the files are saved to a subfolder of the default Documents folder.
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable,
-InformationAction, -InformationVariable, -OutBuffer, -OutVariable, -PipelineVariable,
-ProgressAction, -Verbose, -WarningAction, and -WarningVariable. For more information, see
about_CommonParameters.