Reputation: 6675
I have an application written in .NET 3.5 that uses FTP to upload/download files from a server. The app works fine, but there are performance issues:
1. It takes a long time to connect to the FTP server. The server is on a different network and runs Windows Server 2003 (IIS FTP). When multiple files are queued for upload, switching from one file to the next creates a new connection with FtpWebRequest, which takes a long time (around 8-10 seconds).
2. Is it possible to reuse the connection? I am not sure about the KeepAlive property: which connections are kept alive and reused?
3. IIS FTP on Windows Server 2003 does not support SSL, so anyone can easily see the username/password with a packet sniffer such as Wireshark. I found that Windows Server 2008 supports FTP over SSL in the new version of IIS, 7.0.
I basically want to improve the upload/download performance of my application. Any ideas will be appreciated.
** Please note that 3 is not an issue, but I would like to hear people's comments on it.
Upvotes: 33
Views: 37818
Reputation: 749
I did some benchmarks on FtpWebRequest, similar to @Sid's answer above. Setting KeepAlive to true does improve performance, but the asynchronous calls did not in my test. The test consists of:
1) FtpWebRequest to check for file existence
2) FtpWebRequest to upload
3) FtpWebRequest to rename the file on the server
Test FTP client 30 files of size 5 Kbytes took ... 14.897 seconds
Test upload (alive true, connection name) 30 files of size 5 Kbytes took ... 34.229 seconds
Test async(alive true, connection name) 30 files of size 5 Kbytes took ... 40.659 seconds
Test send thread (alive true, connection name) 30 files of size 5 Kbytes took ... 38.926 seconds
What did improve performance was an implementation of an FTP client using the Socket class.
The benchmark is here:
https://github.com/pedro-vicente/ftp_t
Upvotes: 0
Reputation: 59
Try the code below; you will get better performance:
private void Upload144_Click(object sender, EventArgs e)
{
    OpenFileDialog fileobj = new OpenFileDialog();
    fileobj.InitialDirectory = "C:\\";
    //fileobj.Filter = "Video files (*.mp4)";
    if (fileobj.ShowDialog() == DialogResult.OK && fileobj.CheckFileExists)
    {
        // Record the file path in the database
        string test = Properties.Settings.Default.Connection;
        using (SqlConnection con = new SqlConnection(test))
        {
            con.Open();
            string correctfilename = System.IO.Path.GetFileName(fileobj.FileName);
            SqlCommand cmd = new SqlCommand("Insert into Path(ID,Path5) VALUES ((select isnull(MAX(id),0) + 1 from Path),'\\Videos\\" + correctfilename + "')", con);
            cmd.ExecuteNonQuery();
        }

        // For progress bar
        timer5.Enabled = true;

        // FTP file upload
        string uploadfile = fileobj.FileName;
        string uploadFileName = new FileInfo(uploadfile).Name;
        string uploadUrl = "ftp://ftp.infotech.com";
        try
        {
            // Read the whole file into memory; a single fs.Read call may
            // return fewer bytes than requested, so avoid it here.
            byte[] buffer = File.ReadAllBytes(uploadfile);
            string ftpUrl = string.Format("{0}/{1}", uploadUrl, uploadFileName);
            FtpWebRequest requestObj = (FtpWebRequest)WebRequest.Create(ftpUrl);
            requestObj.Method = WebRequestMethods.Ftp.UploadFile;
            requestObj.Credentials = new NetworkCredential("[email protected]", "test@123");
            using (Stream requestStream = requestObj.GetRequestStream())
            {
                requestStream.Write(buffer, 0, buffer.Length);
            }
            // Complete the transfer and release the connection.
            using ((FtpWebResponse)requestObj.GetResponse()) { }
        }
        catch (Exception ex)
        {
            MessageBox.Show("File upload/transfer failed.\r\nError message:\r\n" + ex.Message,
                "Upload", MessageBoxButtons.OK, MessageBoxIcon.Error);
        }
    }
}
Upvotes: -1
Reputation: 1
I was working on this for a few days, and the speed was really low, nothing close to FileZilla. I finally solved it with multiple threads: 10 threads making connections for download gives me a good rate, and it could probably be improved further. This is with a standard FtpWebRequest configuration:
PeticionFTP.ConnectionGroupName = "MyGroupName"
PeticionFTP.ServicePoint.ConnectionLimit = 4
PeticionFTP.ServicePoint.CloseConnectionGroup("MyGroupName")
PeticionFTP.KeepAlive = False
PeticionFTP.UsePassive = False
PeticionFTP.UseBinary = True
PeticionFTP.Credentials = New NetworkCredential(lHost.User, lHost.Password)
Upvotes: 0
Reputation: 2114
This link describes the effects of ConnectionGroupName and KeepAlive: WebRequest ConnectionGroupName
Upvotes: 4
Reputation:
I know it's an old thread, but I recently had to go through a similar situation.
We needed to download 70+ XML files from an ftp server in under 25 minutes without opening more than 5 connections during that time-frame.
We tried a number of alternatives, but ended up using an old-fashioned FTP batch script. It's fast, and it uses only one connection to download all the files. It isn't flexible, but it's much faster than everything else I tried (75 files in under 20 minutes).
Upvotes: 1
Reputation: 1546
I have done some experimentation (uploading about 20 files of various sizes) on FtpWebRequest with the following factors:
KeepAlive = true/false
ftpRequest.KeepAlive = isKeepAlive;
Connection group name = user-defined or null
ftpRequest.ConnectionGroupName = "MyGroupName";
Connection limit = 2 (default), 4, or 8
ftpRequest.ServicePoint.ConnectionLimit = ConnectionLimit;
Mode = synchronous or asynchronous
see this example
My results:
Use ConnectionGroupName,KeepAlive=true took (21188.62 msec)
Use ConnectionGroupName,KeepAlive=false took (53449.00 msec)
No ConnectionGroupName,KeepAlive=false took (40335.17 msec)
Use ConnectionGroupName,KeepAlive=true;async=true,connections=2 took (11576.84 msec)
Use ConnectionGroupName,KeepAlive=true;async=true,connections=4 took (10572.56 msec)
Use ConnectionGroupName,KeepAlive=true;async=true,connections=8 took (10598.76 msec)
Conclusions
FtpWebRequest is designed to support an internal connection pool. To ensure the connection pool is used, make sure ConnectionGroupName is set.
Setting up a connection is expensive. If you are connecting to the same FTP server with the same credentials, setting KeepAlive to true will minimize the number of connections set up.
Asynchronous is the recommended way if you have a lot of files to FTP.
The default number of connections is 2. In my environment, a connection limit of 4 gave me the best overall performance. Increasing the number of connections may or may not improve performance. I recommend making the connection limit a configuration parameter so that you can tune it in your environment.
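A minimal sketch of the fastest combination above (ConnectionGroupName set, KeepAlive true, ConnectionLimit 4, asynchronous uploads). The server URL, credentials, and file list are placeholders, and the Task-based methods shown require .NET 4.5+; on .NET 3.5 you would use BeginGetRequestStream/EndGetRequestStream instead:

```csharp
// Sketch only: ftp.example.com and the credentials are placeholders,
// not values from the benchmark above.
using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;

class FtpUploader
{
    static async Task UploadAsync(string localPath)
    {
        var uri = "ftp://ftp.example.com/" + Path.GetFileName(localPath);
        var request = (FtpWebRequest)WebRequest.Create(uri);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");
        request.KeepAlive = true;                   // keep the control connection open
        request.ConnectionGroupName = "MyFtpPool";  // opt in to the connection pool
        request.ServicePoint.ConnectionLimit = 4;   // allow 4 parallel connections

        using (var source = File.OpenRead(localPath))
        using (var target = await request.GetRequestStreamAsync())
        {
            await source.CopyToAsync(target);
        }
        // Complete the transfer and return the connection to the pool.
        using ((FtpWebResponse)await request.GetResponseAsync()) { }
    }

    static Task UploadAllAsync(string[] files)
    {
        var tasks = new Task[files.Length];
        for (int i = 0; i < files.Length; i++)
            tasks[i] = UploadAsync(files[i]);
        return Task.WhenAll(tasks);
    }
}
```

Because every request shares the same ConnectionGroupName and credentials, the pool hands the four underlying connections out across all the uploads instead of opening a fresh one per file.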
Hope you would find this useful.
Upvotes: 41
Reputation: 11
To resolve the performance problem, you simply need to set:
ftpRequest.ConnectionGroupName = "MyGroupName";
ftpRequest.KeepAlive = false;
ftpRequest.ServicePoint.CloseConnectionGroup("MyGroupName");
Upvotes: 1
Reputation: 5183
A single point of advice:
LOWER BUFFER/CHUNK SIZES SIGNIFICANTLY REDUCE PERFORMANCE
Reason: many more disk I/O and memory I/O operations, FTP stream initializations, and many other factors.
Upvotes: 0
Reputation: 294187
It doesn't matter if the individual connections take long to connect as long as you can launch many in parallel. If you have many items to transfer (say hundreds) then it makes sense to launch tens and even hundreds of WebRequests in parallel, using the asynchronous methods like BeginGetRequestStream and BeginGetResponse. I worked on projects that faced similar problems (long connect/authenticate times) but by issuing many calls in parallel the overall throughput was actually very good.
Also, it makes a huge difference whether you use the async methods or the synchronous ones as soon as you have many (tens, hundreds) of requests. This applies not only to the WebRequest methods, but also to the Stream read/write methods you'll use after obtaining the upload/download stream. The Improving .NET Performance and Scalability book is a bit outdated, but much of its advice still stands, and it is free to read online.
One thing to consider is that the ServicePointManager class sits lurking in the Framework with one sole purpose: to ruin your performance. Make sure you obtain the ServicePoint of your URL and change the ConnectionLimit to a reasonable value (at least as high as the number of concurrent requests you intend).
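For example (a sketch; the server URL and the limit of 20 are placeholder values, not recommendations from this answer), you would raise the per-host connection limit before launching the parallel requests:

```csharp
// Sketch: raise the per-host connection limit up front so parallel
// requests are not throttled to the default of 2 connections.
// ftp.example.com is a placeholder.
using System;
using System.Net;

class Program
{
    static void Main()
    {
        Uri server = new Uri("ftp://ftp.example.com/");
        ServicePoint sp = ServicePointManager.FindServicePoint(server);
        sp.ConnectionLimit = 20;  // at least as high as your intended concurrency

        // ...now issue the BeginGetRequestStream / BeginGetResponse calls...
    }
}
```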
Upvotes: 18
Reputation: 28608
I strongly suggest Starksoft FTP/FTPS Component for .NET and Mono. It has a connection object that you can cache and reuse.
Upvotes: 2
Reputation: 31161
Debug Network
A few tricks for simple network debugging:
- Run a trace route (tracert from a DOS shell).
- Open a command-line FTP session using the ftp command.
- Connect to the FTP port directly with telnet server 21.
The results will provide clues to solving the problem.
Network Hardware
A slow trace route points to a problem with the network hardware along the route.
Network Configuration
A slow ping suggests a problem with the network configuration.
Validate API
A slow command-line FTP session will tell you that the problem is not isolated to the FTP API you are using. It does not eliminate the API as a potential problem, but certainly makes it less likely.
Network Errors
If packets are being dropped between the source and destination, ping will tell you. You might have to increase the packet size to 1500 bytes to see any errors.
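For example (Windows ping syntax; the host name is a placeholder for your FTP server):

```shell
# Send a few 1500-byte pings to surface drops that small packets miss.
# -n sets the count, -l sets the payload size in bytes (Windows ping).
ping -n 4 -l 1500 ftp.example.com
```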
FTP Queue Server
If you have no control over the destination FTP server, have an intermediary server receive uploaded files. The intermediary then sends the files to the remote server at whatever speed it can. This gives the illusion that the files are being sent quickly. However, if the files must exist on the remote server as soon as they are uploaded, then this solution might not be viable.
FTP Server Software
Use a different FTP daemon on the FTP server, such as ProFTPd as a Windows service. (ProFTPd has plug-ins for various databases that allow authentication using SQL queries.)
FTP Server Operating System
A Unix-based operating system might be a better option than a Microsoft-based one.
FTP Client Software
There are a number of different APIs for sending and receiving files via FTP. It might take some work to make your application modular enough that you can simply plug in a new file transfer service. A few different APIs are listed as answers here.
Alternate Protocol
If FTP is not an absolute requirement, try an alternative protocol.
Upvotes: 5
Reputation: 33476
Look at this page - http://www.ietf.org/rfc/rfc959.txt
It mentions using a different port when connecting so that the connection can be reused.
Does that work?
Upvotes: 2
Reputation: 14187
I have had good results with EDT's ftp library:
http://www.enterprisedt.com/products/edtftpnet/overview.html
Upvotes: 1
Reputation: 5764
I'd recommend switching to rsync.
Pros:
Optimised for reducing transfer time.
Supports SSH for secure transfer.
Uses TCP, which makes your IT dept/firewall guys happier.
Cons:
No native .NET support.
Geared towards Linux server installations, though there are decent Windows ports like DeltaCopy.
Overall, though, it's a much better choice than FTP.
Upvotes: 1
Reputation: 6163
AFAIK, each FtpWebRequest has to set up a new connection, including logging on to the server. If you want to speed up FTP transfers, I recommend using an alternate FTP client instead. Some of these clients can log in once and then perform multiple actions over the same command connection.
Examples of such clients include http://www.codeproject.com/KB/IP/FtpClient.aspx, which also includes a good explanation of why these libraries can operate faster than the standard FtpWebRequest, and http://www.codeproject.com/KB/macros/ftp_class_library.aspx, which looks like a simple enough implementation as well.
Personally, I rolled my own implementation of FTP back in the .NET 1.1 days before the FtpWebRequest was introduced and this still works well for .NET 2.0 onwards.
Upvotes: 0
Reputation: 69242
You should definitely check out BITS which is a big improvement over FTP. The clear-text passwords aren't the only weakness in FTP. There's also the issue of predicting the port it will open for a passive upload or download and just overall difficulty when clients are using NAT or firewalls.
BITS works over HTTP/HTTPS using IIS extensions and supports queued uploads and downloads that can be scheduled at low priority. It's overall just a lot more flexible than FTP if you are using Windows on the client and server.
Upvotes: 3
Reputation: 9575
KeepAlive does work. FtpWebRequest caches connections internally, so they can be reused for some time. For details and an explanation of this mechanism, look at ServicePoint.
Another good source of information is the FtpWebRequest source itself (you can step into it in VS2008).
Upvotes: 0
Reputation: 9660
Personally I have migrated all of our apps away from using FTP for file upload/download, and instead rolled a solution based on XML Web Services in ASP.NET.
Performance is much improved, security is as much or as little as you want to code (and you can use the stuff built in to .NET) and it can all go over SSL with no issues.
Our success rate getting our clients' connections out through their own firewalls is FAR better than running FTP.
Upvotes: 2