PowerShell, don't copy locked files

Debbie Coates 1 Reputation point
2021-03-10T14:15:42.733+00:00

I have written a PowerShell script to copy Excel files from one location to another. However, I believe I am also copying files that were still in the middle of being written, so I get the file, but it's corrupt and I can't open it. I need to edit my script so that it picks up all but the most recent file (as this will be the file in use). My current script picks up all the files received in the last 10 minutes; can anyone help me filter out the most recent one? This is my script:

    $StartDate = (Get-Date).AddMinutes(-10)
    $EndDate = Get-Date
    Write-Output $StartDate
    Write-Output $EndDate
    $StrSource = "filesystem::\\ipl\dfs\SecureFTP\Rospen-Yield"
    $StrTarget = "filesystem::\\ipl\dfs\Shares\KPI Team\Secure\Backing Data\Data Uploads\Test"
    Get-ChildItem $StrSource -Filter "Report_B*.xlsx" -Recurse | Where-Object {($_.LastWriteTime.Date -ge $StartDate.Date) -and ($_.LastWriteTime.Date -le $EndDate.Date)} | Copy-Item -Destination $StrTarget

Tags: Windows for business, Windows Server, User experience, PowerShell

2 answers

  1. Michael Taylor 60,161 Reputation points
    2021-03-10T15:01:40.643+00:00

    Yes, that is easily doable in PowerShell, but I'm not convinced it will solve your problem. It sounds like Excel files are being dropped into a particular location on disk and you're trying to copy them elsewhere. It is possible for your script to run while a large Excel file is still being written, although I would have expected Windows to fail the request with a sharing violation, since somebody is writing the file while you're trying to read it.

    Personally I think the best option here is to change your timing. Rather than looking for all files written in the last 10 minutes, also require that they were last written at least 1 minute ago. This should ensure that you aren't trying to read a file that is still being written; it remains possible, although highly unlikely, that a file takes over a minute to write.

       $StartDate = (Get-Date).AddMinutes(-10)  
       $EndDate = (Get-Date).AddMinutes(-1)  
       Write-Output $StartDate  
       Write-Output $EndDate  
       $StrSource = "filesystem::\\ipl\dfs\SecureFTP\Rospen-Yield"  
       $StrTarget = "filesystem::\\ipl\dfs\Shares\KPI Team\Secure\Backing Data\Data Uploads\Test"  
         
       # Compare the full timestamps; using .LastWriteTime.Date would truncate
       # both sides to midnight and match every file written today
       Get-ChildItem $StrSource -Filter "Report_B*.xlsx" -Recurse | Where-Object {($_.LastWriteTime -ge $StartDate) -and ($_.LastWriteTime -le $EndDate)} | Copy-Item -Destination $StrTarget  
    

    Your script is also dependent upon running every 10 minutes exactly; otherwise you'll copy either too many files or too few. Time-based scripts are unreliable in my experience. You might consider moving the files instead of copying them. Then you can simplify your script to grab all the files and try to move each one. If a move fails, ignore it: the file is still being written, so it will get picked up next time.
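    A minimal sketch of that move-based approach, using the paths from the question. `-ErrorAction Stop` turns a sharing violation into a terminating error so the `catch` block can skip the file; everything else is an assumption about how you'd wire it up:

       $StrSource = "filesystem::\\ipl\dfs\SecureFTP\Rospen-Yield"  
       $StrTarget = "filesystem::\\ipl\dfs\Shares\KPI Team\Secure\Backing Data\Data Uploads\Test"  
         
       Get-ChildItem $StrSource -Filter "Report_B*.xlsx" -Recurse | ForEach-Object {  
           try {  
               # Move-Item fails if the file is still open for writing  
               Move-Item -LiteralPath $_.FullName -Destination $StrTarget -ErrorAction Stop  
           }  
           catch {  
               # Still being written (or otherwise locked) -- leave it for the next run  
               Write-Verbose "Skipping locked file: $($_.Exception.Message)"  
           }  
       }  

    Because a failed move leaves the file in place, nothing is lost: the next run simply tries again.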

    But to answer your original question about skipping the "last one": it depends upon the file naming convention. If you have Report_B_1, Report_B_2 and Report_C_1, then which one is the "last one"? I'm going to assume that you have a serialized process copying the files into this folder one by one, so the "last one" is the most recently written file. In that case, sort by last write time and use Select-Object to take all but the last one. Alternatively, order by last write time descending and skip the first one.

       $items = Get-ChildItem ...  
       $items | Sort-Object -Property LastWriteTime | Select-Object -SkipLast 1  
       # or, equivalently, newest first and skip the newest:  
       $items | Sort-Object -Property LastWriteTime -Descending | Select-Object -Skip 1  
    

  2. Rich Matheisen 47,901 Reputation points
    2021-03-10T16:12:38.667+00:00

    If you want to process only files that aren't "in use", try to open each file in "exclusive" (i.e., no sharing) mode. Here's an example -- just be certain to Close/Dispose of the FileStream before you try using the file!

    $FilePaths = "C:\Junk\Test_Formula.xlsx", "C:\Junk\WordList.txt"
    
    ForEach ($File in $FilePaths){
        $OpenOK = $false
        Try{
            # File.Open(path, mode, access) defaults to FileShare.None,
            # so this fails if any other process has the file open
            $FileStream = [System.IO.File]::Open($File,'Open','Write')
            $OpenOK = $true
        }
        Catch{
            $_.Exception.InnerException.Message
        }
        if ($OpenOK){
            # Release the exclusive lock before doing anything with the file
            $FileStream.Close()
            $FileStream.Dispose()
        }
        else {
            Continue    # locked (or inaccessible) -- skip this file
        }
        Write-Host "Processing the file $File"
    }
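    
    Putting this together with the question's copy loop, a sketch of how the exclusive-open test could gate the copy. The paths are the ones from the question; the helper name `Test-FileNotLocked` is my own, not part of any module:

       function Test-FileNotLocked {  
           param([string]$Path)  
           try {  
               # Exclusive open fails while another process is still writing the file  
               $fs = [System.IO.File]::Open($Path,'Open','Write')  
               $fs.Close()  
               $fs.Dispose()  
               return $true  
           }  
           catch {  
               return $false  
           }  
       }  
         
       $StrSource = "filesystem::\\ipl\dfs\SecureFTP\Rospen-Yield"  
       $StrTarget = "filesystem::\\ipl\dfs\Shares\KPI Team\Secure\Backing Data\Data Uploads\Test"  
         
       Get-ChildItem $StrSource -Filter "Report_B*.xlsx" -Recurse |  
           Where-Object { Test-FileNotLocked $_.FullName } |  
           Copy-Item -Destination $StrTarget  

    Note there is still a small race window: a writer could reopen the file between the test and the copy, so this reduces rather than eliminates the risk.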
    
