Reputation: 133
I have a text file that contains file names, file sizes, and created dates for files before (.txt) and after (.txt.Z) compression. The data is comma-separated and looks like this:
Note: the file names below are not the actual file names. I will be receiving this type of file weekly, so the file names will be different each week.
File1.txt,1449124525,Jul 09 01:13
File2.txt,2601249364,Jul 09 01:30
File3.txt,18105630,Jul 09 01:01
File4.txt,732235442,Jul 09 01:17
File1.txt.Z,130652147,Jul 09 01:13
File2.txt.Z,217984273,Jul 09 01:30
File3.txt.Z,2320129,Jul 09 01:01
File4.txt.Z,61196011,Jul 09 01:17
etc...
Currently, the code that I have inserts the first row into SQL 44 times (there are 22 distinct file names, so 44 rows in total counting before and after compression).
$file = Get-Content "MY_FILE.txt"
$line = $null
foreach ($line in $file)
{
    #Split fields into values
    $line = $file -split (",")
    $FileName = $line[0]
    $FileSize = $line[1]
    $FileDate = $line[2]
    #Format Date Field
    $DateString = $FileDate
    $DateFormat = "MMM dd HH:mm"
    $Culture = $(New-Object System.Globalization.CultureInfo -ArgumentList "en-US")
    $DateString = $DateString -replace "\s+"," "
    $NewDate = [Datetime]::ParseExact($DateString, $DateFormat, $Culture)
    $FileDate = Get-Date $NewDate -Format "yyyy-MM-dd HH:mm:ss"
    #SQL Connection Info
    $Connection = New-Object System.Data.SQLClient.SQLConnection
    $Connection.ConnectionString = "server='MY_SERVER';database='MY_DATABASE';trusted_connection=true;"
    $Connection.Open()
    $Command = New-Object System.Data.SQLClient.SQLCommand
    $Command.Connection = $Connection
    #Insert into SQL
    $sql = "INSERT INTO [MY_DATABASE].[dbo].[MY_TABLE] ([FileName],[FileSize],[FileDate]) VALUES ('" + $FileName + "'," + $FileSize + ",'" + $FileDate + "')"
    $Command.CommandText = $sql
    $Command.ExecuteReader()
}
$Connection.Close()
Another tricky thing I would love to be able to do is load each file, with its corresponding size and date, into the same row for before and after compression. I cannot seem to grasp how to do that; the part above is more important, though. I would want it to look like this in SQL:
| InFileName | InFileSize | InFileDate   | OutFileName | OutFileSize | OutFileDate  |
--------------------------------------------------------------------------------------
| File1.txt  | 1449124525 | Jul 09 01:13 | File1.txt.Z | 130652147   | Jul 09 01:13 |
| File2.txt  | 2601249364 | Jul 09 01:30 | File2.txt.Z | 217984273   | Jul 09 01:30 |
| File3.txt  | 18105630   | Jul 09 01:01 | File3.txt.Z | 2320129     | Jul 09 01:01 |
| File4.txt  | 732235442  | Jul 09 01:17 | File4.txt.Z | 61196011    | Jul 09 01:17 |
Thanks!
Upvotes: 0
Views: 11322
Reputation: 28963
I suggest you process the data first, before importing it into SQL Server. Since the data is comma-separated, use Import-Csv.
In addition, you can specify column names when importing, so I've added empty columns for the compressed files.
Then loop over the rows and merge in the compressed files; note that their names come from the "InFileName" column and get moved to the "OutFileName" column. It's not particularly efficient: for every file without a .Z extension, it loops through all the rows to find the corresponding .Z file.
$fileHeaders = 'InFileName','InFileSize','InFileDate','OutFileName','OutFileSize','OutFileDate'
$inData = Import-Csv D:\f.txt -Header $fileHeaders
$outData = foreach ($row in $inData) {
    if ($row.InFileName -notmatch '\.z$') {
        # -eq is case-insensitive in PowerShell, so this also matches ".Z"
        $outFile = $inData | Where-Object { $_.InFileName -eq "$($row.InFileName).Z" }
        $row.OutFileName = $outFile.InFileName
        $row.OutFileSize = $outFile.InFileSize
        $row.OutFileDate = $outFile.InFileDate
        $row
    }
}
e.g. after that:
$outData | ft -AutoSize
InFileName InFileSize InFileDate OutFileName OutFileSize OutFileDate
---------- ---------- ---------- ----------- ----------- -----------
File1.txt 1449124525 Jul 09 01:13 File1.txt.Z 130652147 Jul 09 01:13
File2.txt 2601249364 Jul 09 01:30 File2.txt.Z 217984273 Jul 09 01:30
File3.txt 18105630 Jul 09 01:01 File3.txt.Z 2320129 Jul 09 01:01
File4.txt 732235442 Jul 09 01:17 File4.txt.Z 61196011 Jul 09 01:17
Then loop over $outData; you'll have to change your SQL inserts and so on to handle the six fields. You'll still need all the date-parsing / field-processing code as well, which I've left out completely.
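As a minimal sketch of that insert loop, reusing the placeholder MY_SERVER / MY_DATABASE / MY_TABLE names from the question and assuming a six-column table matching $outData (column names here are illustrative, not from the original). A parameterized command also avoids the string-concatenation quoting in the original and the ExecuteReader-for-INSERT quirk:

```powershell
# Sketch only: assumes the placeholder server/database names from the
# question and a six-column [dbo].[MY_TABLE].
$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "server='MY_SERVER';database='MY_DATABASE';trusted_connection=true;"
$Connection.Open()

$Command = $Connection.CreateCommand()
$Command.CommandText = @"
INSERT INTO [dbo].[MY_TABLE]
    ([InFileName],[InFileSize],[InFileDate],[OutFileName],[OutFileSize],[OutFileDate])
VALUES (@InName, @InSize, @InDate, @OutName, @OutSize, @OutDate)
"@
foreach ($p in 'InName','InSize','InDate','OutName','OutSize','OutDate') {
    [void]$Command.Parameters.Add("@$p", [System.Data.SqlDbType]::VarChar, 255)
}

foreach ($row in $outData) {
    $Command.Parameters['@InName'].Value  = $row.InFileName
    $Command.Parameters['@InSize'].Value  = $row.InFileSize
    $Command.Parameters['@InDate'].Value  = $row.InFileDate
    $Command.Parameters['@OutName'].Value = $row.OutFileName
    $Command.Parameters['@OutSize'].Value = $row.OutFileSize
    $Command.Parameters['@OutDate'].Value = $row.OutFileDate
    [void]$Command.ExecuteNonQuery()   # ExecuteNonQuery, not ExecuteReader, for INSERTs
}
$Connection.Close()
```

Opening the connection once, outside the loop, also fixes the original code's habit of opening a new connection per row.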
Upvotes: 2
Reputation: 2718
It looks like in the first line of your foreach loop you are calling -split on the entire $file array rather than on the $line you are working with. It should work if you swap $line = $file -split (",") for $line = $line -split (","), although you may want to use a different name for either the loop variable or the variable you assign the split result to.
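A short sketch of the corrected loop with a distinct variable name (inline sample data stands in for Get-Content "MY_FILE.txt"):

```powershell
# $file stands in for: $file = Get-Content "MY_FILE.txt"
$file = 'File1.txt,1449124525,Jul 09 01:13',
        'File1.txt.Z,130652147,Jul 09 01:13'
foreach ($line in $file) {
    $fields   = $line -split ','   # split THIS line, not the whole $file array
    $FileName = $fields[0]
    $FileSize = $fields[1]
    $FileDate = $fields[2]
    # ... date formatting and SQL insert as in the question ...
    "{0} -> {1} bytes at {2}" -f $FileName, $FileSize, $FileDate
}
```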
Upvotes: 2