Getting error when trying to upload large file

Apr 25, 2011 at 6:28 AM
Edited Apr 26, 2011 at 5:48 AM

Hi Richard,

When I am trying to upload a 21 MB file, I get the following error.

Error: File transfer failed after transferring 6,619,136 bytes in 65 seconds

During the transfer, it got disconnected twice, and it finally failed with the above error.

Can you please suggest what I should do about this? Also, can you let me know what the maximum supported file size is?

Thanks



 

Command: STOR cuteftppro.exe
Response: 150 Opening connection for data transfer.

Error: Disconnected from server: ECONNABORTED - Connection aborted

 

Apr 26, 2011 at 5:57 AM

Hi Richard,

Are you looking into this issue? I am still waiting for your reply. Please reply to this thread.

Thanks

Coordinator
Apr 26, 2011 at 8:52 AM

Hello

Apologies for the slight delay in responding - we have had some national holidays here in the UK ;)

That is not expected behavior - are you connecting in ACTIVE mode?

I have uploaded much larger files than this with success. Are you able to provide me with any trace information and specific exception data so that I can look into this further when I get time?

Thanks,

Richard.

Apr 26, 2011 at 9:55 AM

Hi Richard,

Thanks for the response.

I am using FileZilla in Active mode only. The default timeout in FileZilla is 20 seconds.

I will try to add some logging to your code and provide you with the trace.

Thanks

Apr 26, 2011 at 6:26 PM

Hi Richard

The error is because of a timeout. I am not sure where I can set the timeout. Here is the error:

Status: Connecting to 65.52.5.52:21...

Status: Connection attempt failed with "ETIMEDOUT - Connection attempt timed out".

Error: Could not connect to server

Thanks

 Pradyumna

Coordinator
Apr 27, 2011 at 2:30 PM

Hi Pradyumna,

Thanks for the extra info - I will make a note and try to investigate at my earliest opportunity.

Richard

Apr 29, 2011 at 5:53 AM

Hi Richard,

What type of instance have you used on Azure? I have tried both Small and Large instances and then tried to upload the 21 MB file, but I still get the same problem:

Error: Disconnected from server: ECONNABORTED - Connection aborted

 

Thanks

Feb 1, 2012 at 11:41 AM

Hi Pradyumna,

 

You can find a solution here (it's in French):

http://blog.d-cube.fr/post/2012/02/01/Windows-Azure-Load-Balancing-Timeout.aspx

You can try enabling TCP keep-alive in FTPServer.ThreadRun(), just after the SocketHelpers.CreateTcpListener() call, on the socket m_socketListen.Server; see the sketch below.
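A rough sketch of what that might look like (a hedged illustration, not tested against the FTP2Azure source; it assumes m_socketListen is the TcpListener returned by SocketHelpers.CreateTcpListener(), and the 30-second probe values are an assumption chosen to stay under the 60-second load-balancer timeout):

    // In FTPServer.ThreadRun(), after SocketHelpers.CreateTcpListener().
    // Requires the System and System.Net.Sockets namespaces.
    Socket server = m_socketListen.Server;

    // Turn on TCP keep-alive probes for the socket.
    server.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.KeepAlive, true);

    // On Windows, the probe timing can be tuned with a 12-byte tcp_keepalive
    // structure: { on/off, idle time before first probe (ms), probe interval (ms) }.
    byte[] keepAliveValues = new byte[12];
    BitConverter.GetBytes(1u).CopyTo(keepAliveValues, 0);      // enable keep-alive
    BitConverter.GetBytes(30000u).CopyTo(keepAliveValues, 4);  // first probe after 30 s idle
    BitConverter.GetBytes(30000u).CopyTo(keepAliveValues, 8);  // then probe every 30 s
    server.IOControl(IOControlCode.KeepAliveValues, keepAliveValues, null);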

 

Manu

Mar 11, 2012 at 2:57 AM

It actually has nothing to do with the size of the file, but rather how long it takes to upload it. The Azure load balancer closes an idle connection after 60 seconds, and since the FTP control connection carries no traffic while the data transfer runs, an upload that takes more than 60 seconds is still considered idle. FileZilla has a "keep-alive" feature that sends a periodic keep-alive message to keep the connection open, but it doesn't appear to work.

The problem with your solution is that it's on the wrong side of the equation. The timeout is happening between the FTP client and the Azure load balancer, not at the worker role.

So, in summary, I too am still searching for a solution.

Coordinator
Mar 11, 2012 at 1:43 PM

There are a couple of workarounds posted, mpantana, though I have yet to find time to integrate them into the codebase. As this is an open-source project, feel free to make the changes and contribute them back to the community.

For TCP socket communication on Azure, ServicePointManager.SetTcpKeepAlive(true, 30000, 30000) is listed as an option. TCP keep-alive packets will keep the connection from your client to the load balancer open during a long-running HTTP request. For example, if you're using .NET WebRequest objects in your client, you would set ServicePointManager.SetTcpKeepAlive(…) appropriately.
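As a minimal, self-contained sketch of the client-side call (the URL and request here are hypothetical stand-ins; only ServicePointManager.SetTcpKeepAlive itself is the documented API):

    using System;
    using System.Net;

    class KeepAliveClient
    {
        static void Main()
        {
            // Send TCP keep-alive probes after 30 s of idleness, then every 30 s.
            // This applies process-wide to connections made through ServicePoint,
            // i.e. WebRequest/HttpWebRequest clients.
            ServicePointManager.SetTcpKeepAlive(true, 30000, 30000);

            // Hypothetical long-running request; the keep-alive probes stop the
            // Azure load balancer from dropping the idle connection mid-transfer.
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/bigfile");
            using (var response = request.GetResponse())
            {
                Console.WriteLine("Response length: {0}", response.ContentLength);
            }
        }
    }

Note this only helps clients built on .NET's WebRequest stack; it does nothing for a third-party FTP client such as FileZilla.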

Another solution would be to force the FTP2Azure bridge to send some data to the client every 30 seconds, for example.
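One hypothetical shape for that server-side idea (nothing below is from the FTP2Azure codebase; controlWriter stands in for a StreamWriter wrapping the session's control connection):

    using System;
    using System.Threading;

    // While a long data transfer runs, emit a harmless "150-" continuation line
    // on the control connection every 30 seconds so the load balancer sees traffic.
    Timer keepAliveTimer = new Timer(_ =>
    {
        lock (controlWriter)
        {
            controlWriter.Write("150-Transfer in progress\r\n");
            controlWriter.Flush();
        }
    }, null, TimeSpan.FromSeconds(30), TimeSpan.FromSeconds(30));

    // ... perform the blob upload/download here ...

    keepAliveTimer.Dispose(); // stop pinging once the transfer completes

Whether clients tolerate mid-transfer "150-" continuation lines would need testing; the point is simply to put bytes on the control connection before the 60-second idle window expires.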

HTH,

Richard.

Mar 11, 2012 at 4:38 PM

Hey Richard,

Thanks for the project, it's pretty awesome. I saw the suggestions and tried the ServicePointManager one, but it didn't seem to have any effect. It's quite possible I just don't have it in the right place.

That said, it still seems like the problem lies with the incoming client connection, not the ftp2azure worker role. You mentioned TCP keep-alive from the client, but since this is an FTP bridge, the client could be any FTP client; in my case, FileZilla. FileZilla does have an option to send keep-alives, but it doesn't solve the issue: you still get disconnected after 60 seconds. Admittedly, I'm in a bit over my head here, so I could be missing the obvious.

I'm really hoping to get this to work, so I'll keep poking around at it and post any findings.

Thanks,

Matt

