The script should be run under a "Service Account", and you shouldn't really need to watch the running script if you are generating appropriate log files.
I guess it all depends on how the errors surface when it locks up. If the errors are in the script itself, then you should be able to capture them.
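For example, a minimal try/catch sketch for capturing those errors; the log path here is just a placeholder for your own:

```powershell
# Minimal sketch: wrap the script's work in try/catch and log terminating errors.
# The log path is a placeholder; point it at your central share.
$LogPath = '\\fileserver\logs\myscript.log'
try {
    # ... the script's real work goes here ...
    Get-Content 'C:\NoSuchFile.txt' -ErrorAction Stop   # stand-in for a call that may fail
}
catch {
    # $_ is the ErrorRecord; timestamp it so the log is useful later
    "$(Get-Date -Format s) ERROR: $_" | Add-Content -Path $LogPath
    throw   # rethrow so the scheduled task also registers the failure
}
```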
Generate the log files in a central location so everyone can see them.
If you want to go further, use a dashboard to view the log files; if it's set up and refreshed appropriately, you should be able to see near-real-time results (and therefore know if it's not running for any reason).
If this script takes a long time, just write the start time and end time to another log file. If the end time isn't written within the expected time frame, then someone needs to investigate.
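A bare-bones version of that heartbeat idea (paths assumed, not anything from your setup):

```powershell
# Bare-bones heartbeat: anyone watching this file can tell whether the script
# started and finished. The path is a placeholder.
$HeartbeatLog = '\\fileserver\logs\myscript-heartbeat.log'
"$(Get-Date -Format s) START" | Add-Content -Path $HeartbeatLog

# ... long-running work here ...
Start-Sleep -Seconds 5   # stand-in for the real workload

"$(Get-Date -Format s) END" | Add-Content -Path $HeartbeatLog
```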
Well yes, a log would be good enough, but not to the manager's liking. My manager stored his remote connection on my computer. Now I can see what he sees.
Sounds like you have "trust" issues.
You should not be logging into the server with the service account.
The scheduled task uses the service account to run the script. You log on with your own "admin" account to configure the scheduled task.
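Roughly like this, assuming the ScheduledTasks module (Windows 8 / Server 2012 and later); the task name, script path, and account are placeholders:

```powershell
# Sketch: register the task to run under the service account while you are
# logged on with your own admin account. Names and paths are placeholders.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\MyScript.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'MyScript Nightly' -Action $action -Trigger $trigger `
    -User 'DOMAIN\svc-myscript' -Password (Read-Host 'Service account password')
```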
Not sure where you are running the scripts from, but usually it would be some sort of integration server, which may be running various scripts for various purposes.
Access to said server would usually be limited to those who support the scripts/server and no one else. Tech staff only.
Even better if you set it up to run as a Windows service using something like NSSM (the Non-Sucking Service Manager). I'd recommend NSSM since you mentioned the main issue is that the script sometimes locks up.
If you use the general tools from the Windows Server Resource Kit to create a service from a PowerShell script, they don't give you good control of the execution state: if the script crashes, the service doesn't know and just keeps running. NSSM doesn't have those weaknesses. This way, you don't even need to be logged onto the server. Still need to sort out the logging, though.
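For reference, a rough sketch of the NSSM setup from an elevated prompt; the service name, account, and paths are placeholders, not anything from this thread:

```powershell
# Sketch of wrapping the script with NSSM. Service name, account, and paths
# are placeholders; adjust for your environment.
nssm install MyScriptService "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" `
    "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\MyScript.ps1"
nssm set MyScriptService ObjectName "DOMAIN\svc-myscript" "ServiceAccountPassword"
nssm set MyScriptService AppStdout \\fileserver\logs\myscript-out.log   # capture stdout
nssm set MyScriptService AppStderr \\fileserver\logs\myscript-err.log   # capture stderr
nssm set MyScriptService AppExit Default Restart   # restart the script if it dies
nssm start MyScriptService
```

The AppStdout/AppStderr redirection would also go some way towards sorting out the logging, since everything the script prints lands on the central share.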