ftp vs http vs scp

From: Jules Richardson <julesrichardsonuk_at_yahoo.co.uk>
Date: Fri May 28 05:39:13 2004

Bottom line to me is that HTTP is a pretty heavyweight and bloated
protocol, whereas FTP is a lot cleaner. So for raw data transfer I'd
always prefer an FTP server.

Password security is an issue because it's perhaps not as good as HTTPS
- but then with HTTPS aren't we getting into pay-through-the-nose server
certificate territory? And it's certainly no worse than HTTP. Of course
it depends on your actual intended setup anyway - if on a LAN or over a
VPN then plain-text password transfer may be less of an issue, or maybe
anonymous FTP access is fine for a particular situation anyway.

It's worth thinking about what may be between your clients and servers
too. HTTP data is much more likely to be transparently cached somewhere
along the line (which may have security implications), but the flipside
is that port 80 is generally less likely to be blocked than port 21.

My memory's hazy here - I wrote both an FTP client and server a few
years ago for a particular project - but I seem to remember that there
are essentially two methods of data transfer in FTP: active mode, where
the server makes the data connection back to the client, and passive
mode, where the server tells the client what port to connect to and the
client then opens the data connection to that port on the server. The
active method really messes up firewalls and NAT on the client side,
since the client has to accept an incoming connection, while passive
mode means the server has to leave a range of data ports open :)
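In passive mode the server's reply to PASV packs the host and port into six decimal numbers, with the port split into high and low bytes (per RFC 959). A minimal sketch of parsing such a reply - the function name and example reply are my own, not from any particular client:

```python
import re

def parse_pasv_reply(reply):
    """Parse an FTP 227 reply such as
    '227 Entering Passive Mode (192,168,0,1,19,136)'.
    Returns (host, port); the port is p1*256 + p2 per RFC 959."""
    m = re.search(r'\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)', reply)
    if not m:
        raise ValueError("not a valid 227 reply: %r" % reply)
    nums = [int(n) for n in m.groups()]
    host = '.'.join(str(n) for n in nums[:4])
    port = nums[4] * 256 + nums[5]   # high byte, low byte
    return host, port

# 19*256 + 136 = 5000
print(parse_pasv_reply('227 Entering Passive Mode (192,168,0,1,19,136)'))
# → ('192.168.0.1', 5000)
```

The client then opens a fresh TCP connection to that host and port for the data transfer, which is why passive mode is so much kinder to client-side firewalls.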

Determining file type etc. is pretty much a non-issue I'd say; it's a
higher-level server function than the raw protocol, so it shouldn't
affect file transfer - it only dictates what the server ultimately does
with the file.

> HTTP also offers nothing like REST

REST (the restart command) is fantastic, and was historically a big
reason why I preferred FTP over HTTP for file transfer. However, not
all FTP servers support it, which is a shame.

HTTP does support restarts, at least in later incarnations - HTTP/1.1
added byte-range requests - I think the problem's more that a lot of
common browsers weren't smart enough to use them. I've used wget to
restart transfers from HTTP servers before though.
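The mechanics of an HTTP/1.1 resume are simple enough to sketch: the client asks for everything from its current local offset onward via a Range header, and a cooperating server answers 206 with a Content-Range header describing the slice. The helper names below are mine, and this only covers the header plumbing, not the actual download loop:

```python
import os
import re

def resume_range_header(partial_path):
    """Build the Range header for resuming a download:
    request everything from the current local file size onward."""
    offset = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    return {'Range': 'bytes=%d-' % offset}

def parse_content_range(value):
    """Parse a 206 response's Content-Range header,
    e.g. 'bytes 500-999/1000' -> (start, end, total)."""
    m = re.match(r'bytes (\d+)-(\d+)/(\d+)$', value)
    if not m:
        raise ValueError('unexpected Content-Range: %r' % value)
    return tuple(int(g) for g in m.groups())

print(parse_content_range('bytes 500-999/1000'))   # → (500, 999, 1000)
```

This is essentially what wget's -c (continue) option does under the hood; a server that ignores the Range header just replies 200 with the whole file, which a careful client has to detect before appending.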

> manipulation commands like ACCT, CWD, CDUP, SMNT, STOU, APPE, RNFR,

Agreed - and there are some nice FTP clients out there if your users
aren't into typing stuff on the command line. (Aside - I don't believe
the FTP protocol defines passing wildcards to the DELE command, nor is
there any form of 'MDEL' for multiple deletes, and RMD doesn't allow
deleting a directory unless it's empty. Deleting lots of files without
a graphical client can be a pain!)
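Because RMD insists on an empty directory, a client has to do the recursion itself: delete the files, then remove the directory. A rough sketch using Python's ftplib method names (the function itself is mine, and real servers vary in whether NLST returns bare names or full paths):

```python
from ftplib import error_perm

def ftp_rmtree(ftp, path):
    """Recursively delete `path`: try DELE first; if the server
    refuses (because it's a directory), recurse into it, then RMD it.
    Assumes NLST returns full paths relative to the current directory."""
    try:
        ftp.delete(path)          # DELE works only on files
        return
    except error_perm:
        pass                      # probably a directory - recurse
    for name in ftp.nlst(path):   # NLST lists the directory's entries
        if name in (path, '.', '..'):
            continue
        ftp_rmtree(ftp, name)
    ftp.rmd(path)                 # RMD requires the directory be empty
```

Each file costs a round trip, which is exactly why the lack of any MDEL-style bulk delete makes this painful over a slow link.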

> indeed HTTP doesn't really offer file transfer at all -

I'm actually very anti-HTTP to be honest. It's great for what it was
originally intended for, but it really bugs me that the current view is
"the Web is the Internet", and that unless something can be hacked to
run over HTTP it isn't worth doing.

> it's just common for a request to be
> satisfied by serving up a copy of a file, and this is (ab)used as a
> file-transfer mechanism.

Upload's even more of a mess from what I remember, requiring something
at the server end (be it Perl, compiled CGI, Java or whatever) to handle
and save the incoming data stream - i.e. there's no standard for
actually saving an upload to the filestore. Similarly, if you do want to
administer files behind a web server then the solution is going to be
localised as there's no standard for that either.
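To make the point concrete: all the client can do is wrap the file in a multipart/form-data body (RFC 2388 era) and POST it; what happens to the bytes after that is entirely up to whatever script sits behind the server. A hand-rolled sketch of that encoding, with my own helper name and a made-up field name:

```python
import uuid

def multipart_body(field, filename, data):
    """Hand-build a multipart/form-data body for a single file
    upload; returns (content_type, body_bytes). The boundary just
    has to be a string unlikely to occur in the payload."""
    boundary = uuid.uuid4().hex
    head = ('--%s\r\n'
            'Content-Disposition: form-data; name="%s"; filename="%s"\r\n'
            'Content-Type: application/octet-stream\r\n'
            '\r\n' % (boundary, field, filename)).encode('ascii')
    tail = ('\r\n--%s--\r\n' % boundary).encode('ascii')
    content_type = 'multipart/form-data; boundary=%s' % boundary
    return content_type, head + data + tail
```

Note that nothing in this says where - or whether - the server stores the file; that's precisely the "no standard for saving an upload" gap, which FTP's STOR simply doesn't have.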

You may even have encoding issues when uploading a file, and we've all
come across problems with referencing files by HTTP due to the protocol
not allowing certain characters and some servers handling illegal
characters differently to others.
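The usual workaround for those illegal characters is percent-encoding, as in RFC 2396. A quick illustration using Python's urllib (the filename is invented):

```python
from urllib.parse import quote, unquote

# Spaces, '#', '(' and so on aren't legal raw in a URL path and must
# be percent-encoded; servers differ in how forgiving they are of
# raw occurrences, which is where the inconsistency bites.
name = 'my report #2 (final).txt'
encoded = quote(name)             # '/' is left alone by default
print(encoded)                    # my%20report%20%232%20%28final%29.txt
print(unquote(encoded) == name)   # True
```

A raw '#' is particularly nasty, since an unencoded one is parsed as the start of a fragment and the rest of the filename silently disappears from the request.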

scp? Never used it. How portable is it to different platforms?

