Reputation: 4332
I have the following script (adapted from here) for uploading files via FTP for a website.
function Recurse($path) {
    $files = @(dir -Path $path)
    foreach ($file in $files) {
        if ($file.GetType().FullName -eq 'System.IO.FileInfo') {
            "uploading $file"
            $uri = New-Object System.Uri($ftp + $file.Name)
            $webclient.UploadFile($uri, $file.FullName)
        } elseif ($file.GetType().FullName -eq 'System.IO.DirectoryInfo') {
            Recurse $file.FullName
        }
    }
}
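(For context, the snippet assumes $ftp and $webclient were set up earlier, roughly along these lines; the server name and credentials below are placeholders:)

$ftp = "ftp://contoso.com/"                    # placeholder server; the trailing slash matters
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential("user", "pass")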
This works fine if all of the files go in the root of the site. The problem I am having is that the site has subdirectories under the root, so this places (as expected) all files at the root regardless of where they exist in the actual directory structure.
Is there a simple way to transfer all of the files while maintaining the directory structure of the source? I'm sure I could put something together using Split-Path, but I just wanted to make sure that I wasn't overlooking something before I went any further.
Thanks.
Upvotes: 1
Views: 823
Reputation: 6223
Per request converted from the comments:
geekswithblogs.net has a solution for recursive FTP copy.
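For anyone who can't reach that link, here is a minimal sketch of the same idea (not the linked author's exact code): walk the local tree, issue an FtpWebRequest MakeDirectory for each folder, and upload files to the mirrored remote path. The server, credentials, and local root are placeholders:

$ftp       = "ftp://contoso.com/site/"    # placeholder remote base
$localRoot = "C:\inetpub\wwwroot"         # placeholder local root
$cred      = New-Object System.Net.NetworkCredential("user", "pass")
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = $cred

function Copy-FtpTree($localDir) {
    foreach ($item in Get-ChildItem -Path $localDir) {
        # Rebuild the path relative to the local root so the remote
        # tree mirrors the local one.
        $relative = $item.FullName.Substring($localRoot.Length).TrimStart('\') -replace '\\', '/'
        $uri = New-Object System.Uri($ftp + $relative)
        if ($item.PSIsContainer) {
            # Create the remote directory first; ignore the error if it already exists.
            try {
                $request = [System.Net.FtpWebRequest]::Create($uri)
                $request.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
                $request.Credentials = $cred
                $request.GetResponse().Close()
            } catch { }
            Copy-FtpTree $item.FullName
        } else {
            "uploading $relative"
            $webclient.UploadFile($uri, $item.FullName)
        }
    }
}

Copy-FtpTree $localRoot

The Substring/TrimStart line is what keeps the remote tree aligned with the local one; everything else matches the flat upload in the question.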
Upvotes: 2
Reputation: 72610
Perhaps the Microsoft documentation can help here:
The URI may be relative or absolute. If the URI is of the form "ftp://contoso.com/%2fpath" (%2f is an escaped '/'), then the URI is absolute, and the current directory is /path. If, however, the URI is of the form "ftp://contoso.com/path", first the .NET Framework logs into the FTP server (using the user name and password set by the Credentials property), then the current directory is set to /path.
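In PowerShell terms, the two forms from that quote might be built like this (contoso.com taken straight from the docs):

# Path is relative to whatever directory the FTP login lands in:
$relUri = New-Object System.Uri("ftp://contoso.com/path/file.txt")
# %2f is an escaped '/', so the path is absolute from the server root:
$absUri = New-Object System.Uri("ftp://contoso.com/%2fpath/file.txt")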
Upvotes: 0