How do I download a file across multiple interfaces on OS X or Linux?

I have a large file that I want to download from a server that I have root access to. I also have several different concurrent Internet connections from my machine to the server at my disposal.

Do you know of any server and client combination for any file transfer protocol ((S)FTP, HTTP, AFP, or anything else) that supports multi-threaded downloads over different connections?


3 answers


One option is the "old-fashioned" multipart approach: split the file into pieces, transfer the pieces separately, and rejoin them on the other side.

split -b 50m hugefile multiparthugefile_

      

This will create multiparthugefile_aa, multiparthugefile_ab, and so on. To rejoin the pieces, use cat:

cat multiparthugefile_* > hugefile_rejoined
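A minimal end-to-end sketch of the split-and-rejoin round trip. It runs on a small scratch file so the demo stays quick; in practice you would use your real hugefile and the 50 MB chunk size (-b 50m) from above:

```shell
#!/bin/sh
# Create a small scratch file standing in for the real hugefile (300 KB).
dd if=/dev/zero of=hugefile bs=1024 count=300 2>/dev/null

# Split into pieces; 128 KB here, -b 50m for a genuinely large file.
split -b 128k hugefile multiparthugefile_

# split names the pieces multiparthugefile_aa, _ab, _ac, ...; the shell
# glob sorts them back into that order, so cat reassembles the original
# byte-for-byte.
cat multiparthugefile_* > hugefile_rejoined

# Confirm the rejoined copy is identical to the original.
cmp hugefile hugefile_rejoined && echo "rejoined copy matches"
```

Transferring the pieces over different connections and rejoining on the destination works the same way; only the cat step has to happen after all pieces have arrived.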

      



For the actual file transfers over different interfaces, wget's --bind-address=ADDRESS flag should work:

--bind-address=ADDRESS    bind to ADDRESS (hostname or IP) on local host.
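To combine the bound-interface idea with the split-download idea in one step, you can fetch HTTP byte ranges in parallel, one range per local address. The sketch below uses curl, whose --interface flag is the counterpart of wget's --bind-address, and whose --range flag requests a byte range. The URL, file size, and local addresses are placeholders, so the script only prints the commands it would run:

```shell
#!/bin/sh
# Dry-run sketch: one byte-range download per local interface address.
# URL, SIZE and the two 192.0.2.x addresses are placeholders; in practice
# read SIZE from the Content-Length header (curl -sI "$URL").
URL="http://example.com/hugefile"
SIZE=100000000
set -- 192.0.2.10 192.0.2.20   # one local IP per interface
n=$#

i=0
for addr in "$@"; do
    start=$(( SIZE * i / n ))
    end=$(( SIZE * (i + 1) / n - 1 ))   # last chunk ends at SIZE-1
    echo "curl --interface $addr --range $start-$end -o part_$i '$URL' &"
    i=$(( i + 1 ))
done
echo "wait; cat part_* > hugefile"
```

Running the printed commands downloads each chunk over a different interface concurrently; the final cat reassembles them, just like the multipart approach above.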

      

This problem seems like something BitTorrent is positioned to do well, but I'm not sure exactly how you would go about it.

Maybe create a temporary tracker (or use something like OpenBitTorrent.com) and run multiple clients locally. Assuming the clients support local peer transfer, each client will grab different pieces from the server and share them with the other (local) clients. You will end up with multiple copies of the file locally, but it will only be transmitted over the Internet once.


Any of them? You will need a web server hosting the same file, reachable over all of the interfaces.





HTTP: check one of the various download managers, e.g. the DownThemAll Firefox extension (http://www.downthemall.net/). There are also FTP downloaders that support multiple streams.
