I have no experience with Azure Data Factory, and there is just too much I don't know about your environment to give you an exact answer.
Somewhere there needs to be an "event" that indicates the install process can now proceed. What options do you have in Azure Data Factory? Can it run a process, create a file, or write to an event log somewhere? If nothing else, you may have to write a program/script that periodically queries the Azure side to verify that the jobs completed successfully, so the install process can proceed.
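As a hedged sketch of that polling idea: assuming the Az.DataFactory module and an already-authenticated Az session, you could poll a pipeline run's status until it leaves the queued/running states, then drop a marker file as the "event". The resource group, factory name, and marker path below are placeholders for your environment.

```powershell
# Poll an ADF pipeline run until it finishes (placeholder names throughout).
$params = @{
    ResourceGroupName = 'MyResourceGroup'   # assumption
    DataFactoryName   = 'MyDataFactory'     # assumption
    PipelineRunId     = $runId              # captured when the pipeline was triggered
}
do {
    Start-Sleep -Seconds 60
    $run = Get-AzDataFactoryV2PipelineRun @params
} while ($run.Status -in 'Queued', 'InProgress')

if ($run.Status -eq 'Succeeded') {
    # The "event": a marker file that the install process watches for.
    New-Item -ItemType File -Path '\\Server1\Signals\adf-complete.flag' -Force
} else {
    throw "Pipeline run ended with status '$($run.Status)'"
}
```

The watcher on the install side only needs `Test-Path` in a loop against that flag file, which keeps the two halves loosely coupled.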
When someone says "log into a VM", to me that implies an interactive (desktop) session. If you are trying to automate something, you don't want anything that requires a graphical interface. "Click OK to continue" is bad; scripts have no good way to "see the screen".
But all you really need to do is launch a process on the target machine under an account that has sufficient rights (Administrator?) to run the queries/programs. You mentioned "some automation tool" — do you have something installed, or a particular tool you wish to use?
If not, and again depending on your environment, one solution is to use PowerShell's Invoke-Command to run the install on the target machine. So you would have Server1, which monitors the Azure processes. When it determines that it is "time to go", it would copy any scripts, install files, and data files to the C drive of TargetMachine2, then run "Invoke-Command -ComputerName TargetMachine2 -ScriptBlock {whatever}" to execute the install on that machine.
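A minimal sketch of that copy-then-invoke pattern, assuming PowerShell 5.0+ on Server1 (for `Copy-Item -ToSession`) and placeholder machine, path, and installer names:

```powershell
# Placeholder names throughout; the session runs under your (admin) credentials.
$target  = 'TargetMachine2'
$session = New-PSSession -ComputerName $target

# Stage the install files on the target's C: drive.
Copy-Item -Path 'C:\Staging\*' -Destination 'C:\Install\' -ToSession $session -Recurse

# Run the install remotely and surface its exit code to the caller.
Invoke-Command -Session $session -ScriptBlock {
    & 'C:\Install\setup.exe' /quiet /norestart   # placeholder installer switches
    $LASTEXITCODE
}

Remove-PSSession $session
```

Reusing one `New-PSSession` for both the copy and the invoke avoids authenticating twice and keeps the whole install inside a single remoting session.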
That's going to require that you have WinRM configured to allow the connectivity. If not, then as I mentioned earlier you can use the Windows Task Scheduler or PsExec to launch the processes on the target machine.
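For the no-WinRM route, one hedged sketch: create a one-shot scheduled task remotely with `schtasks.exe` (which talks RPC/SMB rather than WinRM), or use Sysinternals PsExec. Account, machine, and path names are placeholders; the install files must already be staged on the target.

```powershell
# One-shot scheduled task on the remote machine (placeholder names/credentials):
schtasks /Create /S TargetMachine2 /RU 'DOMAIN\InstallAcct' /RP '<password>' `
    /TN 'OneShotInstall' /TR 'C:\Install\run-install.cmd' /SC ONCE /ST 23:00
schtasks /Run /S TargetMachine2 /TN 'OneShotInstall'

# Or with PsExec (copies nothing itself; files must already be on the target):
# psexec \\TargetMachine2 -u DOMAIN\InstallAcct -p <password> C:\Install\run-install.cmd
```

Note that a task created this way runs in a non-interactive session, which is exactly what you want per the "no graphical interface" point above.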
In any non-interactive script that you develop, be sure to capture stdout and stderr of every program it executes. Write that to a log file so that when something goes wrong, you can review the logs to figure out what happened.
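One common way to do that in PowerShell is `Start-Process` with its redirect parameters — a sketch with placeholder paths:

```powershell
# Run an installer non-interactively, capturing both streams for post-mortem review.
$log = 'C:\Install\Logs'   # placeholder log directory
New-Item -ItemType Directory -Path $log -Force | Out-Null

$proc = Start-Process -FilePath 'C:\Install\setup.exe' `
    -ArgumentList '/quiet' `
    -RedirectStandardOutput "$log\setup.out.log" `
    -RedirectStandardError  "$log\setup.err.log" `
    -NoNewWindow -Wait -PassThru

# Record the exit code alongside the output; non-zero usually means failure.
"Exit code: $($proc.ExitCode)" | Add-Content "$log\setup.out.log"
```

`-Wait -PassThru` blocks until the process exits and returns the process object, so the exit code is available to your script rather than silently discarded.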