Reputation: 29
Consider a scenario:
My logic is: 1) read the CSV file for each user one by one, 2) pass that list to RoboCopy to copy the files from Server A to Server B into each user's folder. 3) It works, but it is super slow and I want to make it faster.
Note: Mandatory to use PowerShell
My code looks like this:
# Read the list of file names from the user's CSV file
$Get_CSV_File_INFO = @(Get-Content $Full_CSV_Path)
$SourcePath = "z:\"
$RS_Report_Name = New-Object System.Collections.ArrayList

# Keep only the .zip entries
foreach ($a in $Get_CSV_File_INFO)
{
    if ($a -match '\.zip')
    {
        $RS_Report_Name.Add($a) | Out-Null
    }
}

# Copy the files one at a time - one Robocopy call per file
$RS_Report_Name | % { Robocopy $SourcePath $path $_ } | Out-Null
Any suggestions would help.
I am not logging and not showing any output; that improved the speed, but copying a total of 2.57 MB of files still took around 9 minutes, which is not acceptable in a real-life scenario.
Upvotes: 1
Views: 1521
Reputation: 29
Thanks to Matthew Wetmore for the suggestion to run the PowerShell script on the destination server, and thanks to Travis for the suggestion to divide the ArrayList:
# Slice the next batch of file names out of the full list, then copy the whole batch in one call
$Report_array2 = $Report_array[$Initial_Counter .. $Counter]
Robocopy "$SourcePath" "$path" $Report_array2 /MT /Z | Out-Null
I also used /MT (multi-threading) and /Z (restart mode); performance improved from minutes to a few seconds.
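For completeness, a minimal sketch of the surrounding loop those two lines imply; the chunk size of 500 and the way the counters advance are my assumptions, not part of the original answer:

# Sketch: advance $Initial_Counter/$Counter in fixed-size chunks so each
# Robocopy call copies a whole batch of files instead of a single file.
$ChunkSize = 500    # assumed batch size; tune it to stay under the command-line length limit
for ($Initial_Counter = 0; $Initial_Counter -lt $Report_array.Count; $Initial_Counter += $ChunkSize)
{
    $Counter = [Math]::Min($Initial_Counter + $ChunkSize - 1, $Report_array.Count - 1)
    $Report_array2 = $Report_array[$Initial_Counter .. $Counter]

    # /MT = multi-threaded copy, /Z = restartable mode
    Robocopy "$SourcePath" "$path" $Report_array2 /MT /Z | Out-Null
}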
Upvotes: 0
Reputation: 2413
Assuming $RS_Report_Name is an array filtered to only the files you want to copy, changing the robocopy call to copy all of the files at once instead of one by one should have a big impact.
robocopy $SourcePath $path $RS_Report_Name
If that still exceeds the maximum command-line length, split the array into smaller groups, say 10-50 files each, and run Robocopy on each group, as sketched below.
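A minimal sketch of that grouping approach, assuming a group size of 50 and that $SourcePath and $path are already set as in the question:

# Sketch: one Robocopy call per group of file names instead of one per file
$GroupSize = 50
for ($i = 0; $i -lt $RS_Report_Name.Count; $i += $GroupSize)
{
    $End = [Math]::Min($i + $GroupSize - 1, $RS_Report_Name.Count - 1)
    robocopy $SourcePath $path $RS_Report_Name[$i .. $End] | Out-Null
}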
Upvotes: 1