Hosting Files with FTP in 2022?

Hey all, I just went through a lot of reading and configuration steps to get FTP working on my firewall, only to end up looking for another solution. FTP was great 10 years ago when I used it at work, but I’d like to do something a bit more modern.
I have a NAS that is securely inaccessible from the WAN. I have been trying to come up with a good solution for when someone wants to share a large data set with me. The NAS comes with a cloud access utility but I am a little concerned with its security.
I have been tossing around the idea of exposing the NAS with a container recipe, if you will, that I found on Reddit. They make use of their private domain to access it via DDNS. It seems a bit convoluted and less secure than using OpenVPN and restricting access to just the NAS and the one user I need to add for the time being. Not really sure how to go about that for the best user experience, though.
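Roughly the OpenVPN setup I’m picturing, to make it concrete (the VPN subnet, NAS address, and username below are all made up):

```
# Sketch: pin the one VPN user to a known address, then let that
# address reach the NAS and nothing else.

# In the OpenVPN server config, enable per-client config files:
#   client-config-dir /etc/openvpn/ccd

# Give the user a fixed VPN address (subnet topology assumed):
echo "ifconfig-push 10.8.0.10 255.255.255.0" > /etc/openvpn/ccd/datauser

# Firewall: allow that address to reach the NAS only (192.168.1.10 is a placeholder):
iptables -A FORWARD -s 10.8.0.10 -d 192.168.1.10 -j ACCEPT
iptables -A FORWARD -s 10.8.0.10 -j DROP
```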

Here is the container recipe that gave me the idea and has started me down this rabbit hole.
How do you share large datasets in 2022?

EDIT: I have a non-static IP

Either a static IP or dynamic DNS is probably required for all “classic” server-client solutions. Solutions that use peer-to-peer protocols usually don’t have that restriction.
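If you go the dynamic DNS route, the usual trick is a small cron job that reports your current address to your DNS provider. A minimal sketch (the update URL and token are placeholders, since every provider has its own API):

```
#!/bin/sh
# Sketch of a DDNS updater; dyndns.example.com and the token are made up.
IP=$(curl -s https://ifconfig.me)
curl -s "https://dyndns.example.com/update?hostname=nas.example.com&myip=${IP}&token=SECRET"
# Run it from cron every few minutes, e.g.:
#   */5 * * * * /usr/local/bin/ddns-update.sh
```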

What does that even mean? :sweat_smile: I guess the NAS is not accessible from the internet?

If you’re looking for a sort of drop-in replacement for FTP, I guess ~~SCP~~ SFTP is a safe bet. In general, I find SSH to be quite versatile. If you don’t want to use ~~SCP~~ SFTP, you can tunnel arbitrary TCP connections (like FTP, Samba, WebDAV, etc.) using the -L option.
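For example (host names made up), to reach the NAS’s SMB share through a tunnel:

```
# Forward local port 8445 to the NAS's SMB port via the SSH host.
# "gateway.example.com" and "nas.lan" are placeholders.
ssh -L 8445:nas.lan:445 user@gateway.example.com
# Then point an SMB client at localhost:8445 while the session is open.
```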

In terms of security I think SSH is among the best protocols out there. The only downside I can think of is that SSH usually (unless configured differently) gives access to a shell and the ability to execute arbitrary commands on the server, so it might take some effort to lock down.
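To sketch what that lock-down can look like in sshd_config (the username and path are just examples): force the SFTP subsystem and chroot the user, so there is no shell and no command execution:

```
# Append an SFTP-only jail for one user (sketch, run as root):
cat >> /etc/ssh/sshd_config <<'EOF'
Match User datauser
    ChrootDirectory /srv/share
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
EOF
# Note: ChrootDirectory must be owned by root and not user-writable.
systemctl restart sshd
```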

1 Like

Yeah, locking it down isn’t really something I know how to do. FTP was kind of a turnkey solution for this.

I really thought having a NAS would make this easier but I was half tempted to buy a CrushFTP license and revert to what I know.

So, someone wants to share a bunch of files with you and you are providing the place to put the files? That seems backwards to me. Let me explain…

If they have the files on their storage system, then they should share them from that place (click the buttons and make them available for you to download by whatever means makes sense). Whereas if you have the files and want to share with them, then you would share them from your NAS and click the buttons to make them available for them to download. If you both need to work on the files at the same time, that’s what collaboration platforms are for, usually something cloud-based, but it could be a “local” cloud on your NAS (though that can get complicated fast).

Which protocol(s) you use really depends on what you are doing. I would say FTP is probably a bad idea these days, but bear with me for a sec. When FTP was built into web browsers it was pretty convenient, as long as you didn’t care about the lack of encryption. These days, if you’re going to take the time to install a program to interact with the files (such as FileZilla), you might as well use something more secure like SFTP, which is just a slight modification to the FTP server settings you already know.
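For example, once SFTP is enabled, anything with OpenSSH on it can move files without extra software (host name is a placeholder):

```
# Interactive session:
sftp user@nas.example.com
# Or a one-shot fetch straight into the current directory:
sftp user@nas.example.com:/share/dataset.tar.gz
```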

I am lazy and just use Dropbox for this kind of thing.

Yeah, it felt a little backwards. But I think he has limited access to cloud storage in Dropbox and Google Drive. The transfer would take several passes.

SFTP on this server is lacking; I would need to make the user an admin, which is not something I want to do. I think the solution is an SFTP server in a container. I reached out to official support for their official answer.

The official support channels are always the best place to get started! I’m not sure what sort of NAS you have, but on Synology setting up SFTP is pretty easy, and the user shouldn’t have access to anything more than needed (though depending on your DSM version it might take some finagling, such as editing the user’s shell in /etc/passwd and making sure FTP access is enabled for the user).
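If it does come to the /etc/passwd route, the change is a one-liner (the username, IDs, and paths below are just examples; your DSM may differ):

```
# Before: the nologin shell blocks SFTP logins for the user.
#   datauser:x:1027:100::/var/services/homes/datauser:/sbin/nologin
# After: swap in a real shell.
#   datauser:x:1027:100::/var/services/homes/datauser:/bin/sh
sudo vi /etc/passwd
```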

Dropbox is $9.99/month for 2 TB. Not sure how long your project is or how much your time is worth but it’s something to think about. Best of luck!

Thanks ext1580
I use a few servers for work and don’t normally need this kind of storage for myself. Otherwise paying for some storage makes sense.
I think Nextcloud is actually a pretty decent option. I am looking at a few configurations for running it before I pick one.
ownCloud also seems pretty decent.
I’ll let you know how it goes on here.

WinSCP works pretty well.
Do you have to expose the SSH port to the internet?
I feel like Reddit has a lot of people who are super opposed to opening any ports, but I don’t really know the alternatives.

1 Like

Of course, the server port needs to be exposed to the internet; otherwise you cannot connect to the file server from the internet. That applies just as well to FTP or any other server-based service.

The sole fact that there is an SSH server listening on that port automatically raises red flags for some people, but in a case like this I think that is quite unfounded. You are normally warned not to expose your SSH port to the internet if possible, and I absolutely agree with that sentiment - as long as we are talking about a port that belongs to a server which is used to execute arbitrary commands, manage the machine, etc. (you know, the stuff that you normally do with SSH).

In this case though I’m talking about a dedicated SSH server (separate from the management SSH server) with its own user database that preferably runs inside a VM or container and is locked down in such a way that it can only be used for manipulating files in a certain directory.

1 Like

Do you have any resources that describe the process further? It sounds like a good way to run things.

Unfortunately, no. And of course it depends on your environment. If you can run Docker containers, atmoz/sftp might be worth a look (haven’t tested it myself). I wrote SCP earlier, but I really meant SFTP (I confused them).
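From its README, usage looks roughly like this (again, untested; the credentials, port, and path are just examples):

```
# One user "foo" (password "pass", uid 1001) who only sees an
# "upload" folder, backed by /srv/share on the host:
docker run -d -p 2222:22 \
  -v /srv/share:/home/foo/upload \
  atmoz/sftp foo:pass:1001
# Then connect with: sftp -P 2222 foo@your-host
```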

OK, so after looking at a bunch of options I went with Nextcloud… after I installed ownCloud, tried it out, and got it mostly configured for my share, I read about Nextcloud and its rising popularity. It seems to be where the action is these days. So I started over and installed Nextcloud, and I have to say it’s a pretty great option for file sharing: it’s fast, it’s user friendly, and it looks clean. Highly recommend. I can see why people say avoid FTP.
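If anyone wants to kick the tires, the quickest route I found was the official Docker image. A throwaway sketch (for anything real you’d want a proper database and TLS in front):

```
# Trial instance: defaults to SQLite, no TLS, data kept in a named volume.
docker run -d --name nextcloud \
  -p 8080:80 \
  -v nextcloud_data:/var/www/html \
  nextcloud
# Browse to http://localhost:8080 and create the admin account.
```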

1 Like

Reddit seems filled with people who don’t actually realize what an open port is. Opening a port isn’t a bad thing; the security risk comes from the software on the other side of the open port. And that is the same regardless of whether you open a port or use something like Tailscale.

To directly answer your question: FTP is fine. There’s nothing inherently vulnerable about it. It just fell out of favor because it required additional software to do anything beyond downloads and it’s unencrypted by default.

SFTP and FTPS exist (one uses SSH and the other is FTP with a TLS certificate), but it just became easier to do all the transfers over plain old HTTPS.

I chose Nextcloud for basic file sharing after encountering lots of customers who block all popular file-sharing sites and services. I didn’t want to host it internally, so I have NC hosted on an inexpensive Namecheap web hosting plan, and it has been working great for my simple file-sharing needs. I’m having to be careful with the number and size of the files I host on NC to avoid violating the Namecheap web hosting TOS, but it’s been fine for my particular limited use case.

I thought about using my hosting service, but I didn’t want to blow out the bandwidth on my website.
I am now running into the issue of pfBlocker stopping legitimate out-of-country visitors from accessing my site. I don’t want to open up fully, and most people don’t have static IPs, so I am starting to see the benefits of hosting it elsewhere.

OR… You could adjust pfBlocker to build aliases for you (I use Alias Deny for blocking and Alias Native for allowing) that you can use in firewall rules you create, giving you control over exactly which rules go where while the aliases update automatically. Then you upgrade that web server to a Turing Pi 2 k8s cluster and profit. :grin: https://www.youtube.com/watch?v=tdf9UmmKEFY

We actually had a requirement for this exact scenario. Look at FileSender.org. I would set it up as a Linux VM that lives in a DMZ and drops the files onto your NAS via a policy. If the files are really huge, set up a second interface on the Linux box with a static IP in the internal subnet (just don’t put a gateway on the internal NIC), which will keep the firewall out of the way for the transfers.
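For that second interface, the idea is something like this (addresses are made up); the point is that there is deliberately no default route on it:

```
# Internal transfer NIC: static address, no gateway, so the box talks
# to the NAS subnet directly on this leg and the firewall stays out of it.
ip addr add 192.168.50.5/24 dev eth1
ip link set eth1 up
# Deliberately no "ip route add default ..." for eth1.
```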

Alternatively, if you stick with SFTP (please go SFTP, not FTP), put your DNS behind Cloudflare (it’s free) and set up rules to only accept traffic from his specific IP (assuming it doesn’t change, or from a DDNS hostname that is kept up to date if it does).

2 Likes

Probably not the techie answer that the original poster is looking for…but…

Virtually zero effort involved:

Slightly more involved, but relatively straightforward:

  • Set up AWS Transfer Family with an S3 bucket behind it (rough CLI sketch below) - only drawback is the eye-watering starting cost of ~£200 / month, which for transferring one-off files is a tad pricey
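A rough sketch of that with the AWS CLI (the server ID, role ARN, bucket, and key are all placeholders):

```
# 1. Create an SFTP endpoint with service-managed users:
aws transfer create-server \
  --protocols SFTP \
  --identity-provider-type SERVICE_MANAGED

# 2. Add a user mapped onto the S3 bucket (use the ServerId returned above):
aws transfer create-user \
  --server-id s-1234567890abcdef0 \
  --user-name datauser \
  --role arn:aws:iam::123456789012:role/transfer-s3-access \
  --home-directory /my-dataset-bucket \
  --ssh-public-key-body "ssh-ed25519 AAAA... user@host"
```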

All depends on your specific use case really

How do you set that up?