Hello @alexander tikhomirov,
Thanks for the question and using MS Q&A platform.
Unfortunately, the Spark job definition activity doesn't populate an output, so there is no activity output that can be consumed by subsequent activities.
Note: By default, the Spark job definition activity writes output to the path that you have passed in the command-line arguments.
Since you already know the output folder of the Spark job definition, once the Spark job definition activity succeeds, you can directly parameterize that output path in the next activity.
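For example, if the output folder is supplied to the Spark job as a command-line argument, the same value can come from a single pipeline parameter and be reused in the downstream activity. Below is a minimal, illustrative sketch of the relevant pipeline JSON; the parameter name `outputFolder`, the activity names, and the activity `type` values are assumptions for illustration, not taken from your actual pipeline definition:

```json
{
  "parameters": {
    "outputFolder": { "type": "string", "defaultValue": "output/run1" }
  },
  "activities": [
    {
      "name": "RunSparkJob",
      "type": "SparkJob",
      "typeProperties": {
        "args": [ "@pipeline().parameters.outputFolder" ]
      }
    },
    {
      "name": "CopyResults",
      "type": "Copy",
      "dependsOn": [
        { "activity": "RunSparkJob", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "source": { "folderPath": "@pipeline().parameters.outputFolder" }
      }
    }
  ]
}
```

Because both activities read the same parameter, the next activity points at the folder the Spark job wrote to, even though the Spark job definition activity itself produces no usable output.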
Hope this helps. Please let us know if you have any further queries.
- Please don't forget to click on Accept Answer or the upvote button whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how.
- Want a reminder to come back and check responses? Here is how to subscribe to a notification.
- If you are interested in joining the VM program and helping shape the future of Q&A, here is how you can become a Q&A Volunteer Moderator.