Reputation: 15
I have a .txt file with 19,500,000 lines (each line is the full path of a file or folder). I need to loop over them with a foreach loop and get the ACLs, then save the output to CSV, but I need to split the CSV into a new file every 1000 lines.
I am at a loss; I would appreciate any help or recommendations.
Upvotes: 0
Views: 182
Reputation: 175085
As Abraham Zinala mentions, you might want to take advantage of Get-Content -ReadCount to read the file in chunks of 1000 lines at a time, then output each chunk to CSV and continue with the next chunk:
$outputCounter = 1
Get-Content path\to\huge\file.txt -ReadCount 1000 | ForEach-Object {
    # Iterate over the file paths in the chunk, fetch ACL information, then write all 1000 records to a new CSV file
    $_ | ForEach-Object {
        Get-Item -LiteralPath $_ -PipelineVariable item | Get-Acl | Select-Object @{Name='Path';Expression={$item.FullName}},Owner -ExpandProperty Access
    } | Export-Csv ".\path\to\output_${outputCounter}.csv" -NoTypeInformation

    # Increment output file counter
    $outputCounter++
}
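If it helps to see why the inner ForEach-Object is needed, here's a minimal sketch (the input path is just a placeholder): with -ReadCount, each object coming down the pipeline is an array of lines rather than a single line.

# Each object emitted with -ReadCount 1000 is a string array of up to 1000 lines
# (the final chunk may be smaller), so the chunk itself must be enumerated
Get-Content path\to\huge\file.txt -ReadCount 1000 | ForEach-Object {
    "This chunk contains $($_.Count) paths"
}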
Upvotes: 1