Are you faced with migrating or transferring an enormous volume of files or data? If so, join the crowd! With increasingly distributed infrastructure and applications, and the advent of cloud platforms, moving data around has never been more prevalent.
But do yourself and your organization a favor and resist the temptation to go to your browser and punch in search terms like: “how to optimize my ftp server” or “make ftp upload faster”.
The reality is that FTP is outdated, insecure, relatively slow, and prone to issues that are likely to occupy your valuable time. And worst of all, data and files can easily be intercepted in transit. If you don’t believe me, check out this quick screen capture sequence with Wireshark, demonstrating that, absent careful configuration, your credentials travel over the network in plain text.
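To see why a packet capture reveals credentials so easily, here is a minimal sketch of what an FTP client writes to the control channel during login. The function name is illustrative, not from any library; these are simply the bytes the protocol puts on the wire, which is exactly what Wireshark displays.

```python
# FTP authenticates with plaintext USER and PASS commands on the
# control channel. These exact bytes cross the network unencrypted,
# so a capture tool (e.g. Wireshark with the "ftp" display filter)
# shows the password verbatim.

def ftp_login_bytes(user: str, password: str) -> bytes:
    """Build the plaintext login commands as they appear on the wire."""
    return f"USER {user}\r\nPASS {password}\r\n".encode("ascii")

wire = ftp_login_bytes("alice", "s3cret")
print(wire)  # b'USER alice\r\nPASS s3cret\r\n' -- password fully readable
```

No decryption step is involved; the password is simply there in the packet payload.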
And lest we forget, selecting and deploying an FTP client involves additional time and effort, and when you are finally ready to start, you can look forward to time spent babysitting your data transfer jobs.
Got it up and running yet? Get ready for the following:
- FTP file transfer failed
- FTP server broken pipe
- 4xx transient errors such as 426 Connection closed; transfer aborted.
- 552 Requested file action aborted. Exceeded storage allocation.
- 553 Requested action not taken. File name not allowed.
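Those numeric codes follow a pattern: 4xx replies are transient failures where a retry might help, while 5xx replies are permanent failures that no amount of retrying will fix. Here is a small sketch that classifies server reply lines along those lines; the function name is illustrative, not part of any library.

```python
# FTP reply codes: 1xx-3xx indicate success or progress, 4xx are
# transient failures (retrying may succeed), 5xx are permanent
# failures (retrying is pointless).

def classify_ftp_reply(reply: str) -> str:
    """Return 'ok', 'transient', 'permanent', or 'unknown' for a reply line."""
    code = reply[:3]
    if not code.isdigit():
        return "unknown"
    first = code[0]
    if first in "123":
        return "ok"
    if first == "4":
        return "transient"   # e.g. 426 Connection closed; transfer aborted.
    return "permanent"       # e.g. 552 Exceeded storage allocation.

print(classify_ftp_reply("426 Connection closed; transfer aborted."))  # transient
print(classify_ftp_reply("552 Requested file action aborted."))        # permanent
```

Of course, distinguishing the two classes is the easy part; deciding what to actually do about each failure is where FTP leaves you on your own.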
Another well-known fact is that FTP error handling is nearly non-existent.
And please don’t fall into the familiar trap of building your own retry automation: it is more manual work, and it is likely to fail given the huge variety of file types, systems, character encodings, endpoint APIs, and transient network issues. And last but not least, if you are moving truly large amounts of data, you’ll have to run another sync job at the end to pick up all the new files that weren’t there when you started. Good luck with that one!
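To illustrate the trap, here is the kind of naive retry wrapper people reach for, sketched with a simulated flaky call. It papers over a dropped connection, but it does nothing for encoding mismatches, quota errors, or files added mid-run; those failures either loop forever or surface anyway.

```python
# A naive retry wrapper of the kind the text warns against. It helps
# only with genuinely transient errors; permanent failures just get
# re-attempted and then raised anyway.
import time

def retry(fn, attempts=3, delay=0.01):
    """Call fn, retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise                      # permanent failures surface here
            time.sleep(delay * (2 ** attempt))

# Simulate a transfer that fails twice with a broken pipe, then succeeds.
calls = {"n": 0}
def flaky_transfer():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("broken pipe")
    return "done"

print(retry(flaky_transfer))  # 'done', but only after two silent failures
```

Note that the wrapper has no idea whether the exception it caught was worth retrying, which is precisely the problem with bolting retries onto a protocol that reports errors so poorly.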
What about the end user? A successful cloud migration or backup is not complete without retaining file and data ownership and permissions. If you are managing a migration or backup for tens, hundreds, or thousands of users, FTP cannot map permissions between an EFSS, a CMS, or cloud storage.
To be sure, FTPS is a step up due to encryption before transfer, however, management issues become clear very quickly:
- Purchasing an SSL certificate is expensive, and a self-signed one is burdensome to manage and distrusted by default.
- What happens when there are many sources and targets?
- What about cloud sources and targets that are strictly API based?
- Have any non-technical FTPS users?
- And what about sharing with contractors or external users?
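To make the certificate burden concrete, here is a sketch of the client-side setup FTPS requires using Python's standard ftplib, with a context that actually verifies the server certificate. The hostname and credentials are hypothetical. With a self-signed certificate, this verification fails unless every client also installs and trusts that certificate, which is exactly the management overhead described above.

```python
# FTPS client setup with real certificate verification. A self-signed
# server cert fails this check unless it is distributed to and trusted
# by every client machine.
import ssl
from ftplib import FTP_TLS

ctx = ssl.create_default_context()          # verifies against system CAs
assert ctx.verify_mode == ssl.CERT_REQUIRED

client = FTP_TLS(context=ctx)
# Connection steps shown for illustration only (hypothetical host):
# client.connect("ftps.example.com", 21)
# client.login("alice", "s3cret")
# client.prot_p()                           # encrypt the data channel too
```

And this only covers one client talking to one server; multiply it by every source, target, and non-technical user in the earlier list.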
Now, you may be thinking, “is this guy hiding under a rock—hasn’t he heard of SFTP?”
Although more secure and easier to automate, SFTP presents many of the same issues and risks. Mistakes like uploading the wrong config files to remote servers or granting access to infected machines can bring down entire servers or completely compromise your business-critical data and files. And don’t ignore that SFTP configuration adds another layer of complexity that must be deployed and managed on an ongoing basis. (In an upcoming post, I’ll look at Managed File Transfer products, the 1990s answer to these problems.)
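As one taste of that extra configuration layer, here is a typical sshd_config fragment for locking a group of users down to chrooted, SFTP-only access. The group name is hypothetical; these are standard OpenSSH directives, and every line is something an administrator must get right and keep right on every server.

```
# Restrict members of the "migrators" group to SFTP only,
# chrooted to their own home directories.
Match Group migrators
    ChrootDirectory %h
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

A wrong ChrootDirectory ownership or a typo in the Match block can lock users out entirely, and none of this addresses permission mapping or external sharing any better than FTP does.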
In today’s cloud-based, digitally transformed world, a simple, fast, secure, and fully automated mechanism for file migration, sync, and data movement is a necessity for any organization. Stop babysitting FTP transfers and take real control of your data movement.