Big File Transfer

Picture a rural health system sending large, high-res brain scans from a stroke patient to a city hospital for analysis. Now picture a field engineer sharing 3-D AutoCAD files of a new oil field with decision-makers at corporate. Imagine an enterprise migrating stored data from an old data center to a new one. Today’s collaborative workforce, in almost every industry, depends on big file transfer capabilities.

When it comes to many professional pursuits, you’ve enjoyed the advantages of “rolling your own.” But big file transfers, despite your good intentions to save money, aren’t something to do by yourself. When you take an honest look at the simplicity (and security) of today’s managed file transfer solutions, it makes no sense to go solo. DIY file transfer systems put a big strain on your already overextended IT department, and pose big challenges for regulatory compliance due to the security risks involved.

DIY File Transfer Systems for Larger Files Make No Sense

Without clear policies in place, employees usually send large files over email or use a consumer file transfer application. With Microsoft Exchange, for instance, you have a 10MB default attachment size limit; Gmail has a 25MB limit. MIME encoding bloats the file even more, which confuses your users (“How can my 20MB attachment be too big when the limit says 25?” they’ll ask). Even if a large attachment makes it out of your network, the recipient’s server may reject it, resulting in delivery failure. Plus, when you’re backing up your email servers, these large attachments require a lot of storage space.
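
The attachment bloat is simple base64 arithmetic. As a minimal sketch (plain Python, standard library only), MIME’s base64 encoding turns every 3 bytes of a file into 4 ASCII characters, so a “20MB” file is already over a 25MB cap before it ever leaves the outbox:

```python
import base64
import math

def b64_size(n: int) -> int:
    """Approximate size of n raw bytes after base64 (MIME) encoding."""
    # Every 3 input bytes become 4 output characters (with padding).
    return 4 * math.ceil(n / 3)

# Sanity-check the formula against the real encoder on a small buffer.
assert b64_size(1000) == len(base64.b64encode(bytes(1000)))

mb = 1024 * 1024
print(round(b64_size(20 * mb) / mb, 1))  # 26.7 -- past a 25MB limit
```

Roughly a 33 percent penalty, before message headers and line breaks add their own overhead.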

When delivery fails, your senders usually turn to your worst enemy: the consumer file transfer app, such as Dropbox, which doesn’t actually transfer the file to a recipient. Instead it stores the file in the cloud, and the recipient gets a link. The safety of your file, therefore, depends on third-party cloud security. If employees use several different apps, you’ve not only lost control of data security in transit; you have no idea where sensitive data lives.

Why Not Automate With FTP?

Many IT departments develop ad hoc file transfer scripts to meet the varying needs of their customers. Your IT department may even be using legacy scripts created by employees who’ve since left the company. It’s also likely they’re not updating these scripts to satisfy the demands associated with transferring a high volume of large files, both in and out of the enterprise. With usernames, IP addresses and libraries constantly in flux, it’s inconvenient for IT to tweak automation scripts all the time. When it comes to large file transfers, there’s no reason for your IT department to be coordinating a process they should have delegated a long time ago.

The inconvenience of updating automation scripts isn’t FTP’s biggest hindrance. Big file transfers via FTP can be incredibly slow depending on your network’s upload speeds. And when a transfer is interrupted by a firewall or a session timeout, FTP has no built-in support for automatically resuming it.
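
Resume logic is exactly the kind of thing a DIY script ends up hand-rolling. As a hedged sketch using Python’s standard ftplib (the helper names here are hypothetical, not any product’s API), a script must track how many bytes landed locally and pass that offset as the FTP REST restart point:

```python
import os
from ftplib import FTP

def resume_offset(local_path: str) -> int:
    """How many bytes we already have -- the REST restart point."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(ftp: FTP, remote: str, local: str) -> None:
    """Resume an interrupted download; FTP won't do this for you."""
    offset = resume_offset(local)
    with open(local, "ab") as fh:  # append from where the transfer broke
        ftp.retrbinary(f"RETR {remote}", fh.write, rest=offset)
```

And that still leaves retry scheduling, integrity checks, and failure alerting for your scripts to reinvent.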

Also, opening additional ports means increasing your network security risk, and FTP itself just isn’t secure: it sends credentials and file contents in plaintext. A Ponemon Institute study shared by Computer Weekly revealed that 44 percent of IT leaders felt they had no ability to control user access to sensitive documents. Additionally, six in 10 admitted sending sensitive files to the incorrect recipients. Imagine a large healthcare organization sharing a patient’s file from a picture-archiving and communication system (PACS) over FTP and delivering it to the wrong recipient or exposing it to a data breach. Or imagine a critical patient file arriving hours late over FTP while a patient anxiously awaits scan results. A managed file-transfer solution, automated from within the PACS workflow, would eliminate these problems.

Rolling Your Own Just Isn’t Worth It

Rolling your own is usually about two things: thriftiness and craftsmanship. In terms of the latter, though, few IT decision-makers can create a file transfer system that’s as fine-tuned as a managed solution. And when you face significant costs related to lost files, process management or regulatory fines, rolling your own becomes pretty expensive.

Going with an MFT solution will take a lot of work and worries off your plate.


You might say that the entire point of a Managed File Transfer (MFT) system is to do exactly that: provide centralized management and control. For example, let’s say that your company is subject to the Payment Card Industry Data Security Standard (PCI DSS). Requirement 4 of PCI DSS is to “encrypt transmission of cardholder data and sensitive information across public networks,” such as the Internet. Let’s also say that you frequently need to transmit cardholder data to partner companies, such as vendors who will be fulfilling requests.

One option is to simply allow someone within your company to email that information, or to have an automated process do so. You’ll need to ensure that everyone remembers to encrypt those emails — you did remember to get digital certificates for everyone, correct? — every single time. If someone forgets, you’ve created the potential for a data breach, and it’s not going to look very good for your company on the evening news.

Another option is to automate the file transfer using an MFT solution. That solution can be centrally configured to always apply PGP-based encryption to the file, to always require an FTP-over-SSL connection with the vendors’ FTP servers, and to always require 256-bit AES encryption. You don’t have to remember those details beyond the initial configuration — it’s centrally configured. Even if your users need to manually transfer something ad hoc — perhaps an additional emergency order during the Christmas rush — your MFT solution will “know the rules” and act accordingly. Your users’ lives become easier, your data stays protected, and everyone sleeps more soundly at night. This central control is often referred to as policy-based configuration because it’s typically configured in one spot and enforced — not just applied — across your entire MFT infrastructure, regardless of how many physical servers and clients you are running.
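
In spirit, a policy like the PCI DSS one above is just data consulted before every transfer. The following Python sketch uses entirely hypothetical names (no vendor’s actual API) to show the idea: define the rules once, centrally, and every transfer — scheduled or ad hoc — is checked against them:

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the policy object itself is immutable
class TransferPolicy:
    require_pgp: bool    # PGP-encrypt the file payload
    require_ftps: bool   # require FTP-over-SSL to the partner's server
    cipher: str          # required transport cipher

# Defined once, in one spot -- the essence of policy-based configuration.
CARDHOLDER_POLICY = TransferPolicy(require_pgp=True,
                                   require_ftps=True,
                                   cipher="AES-256")

def transfer_allowed(uses_pgp: bool, uses_ftps: bool, cipher: str,
                     policy: TransferPolicy = CARDHOLDER_POLICY) -> bool:
    """Every transfer, manual or automated, consults the policy first."""
    return ((uses_pgp or not policy.require_pgp)
            and (uses_ftps or not policy.require_ftps)
            and cipher == policy.cipher)

print(transfer_allowed(True, True, "AES-256"))    # True
print(transfer_allowed(False, True, "AES-256"))   # False -- forgot to encrypt
```

The point isn’t the code; it’s that nobody has to remember the rules at send time.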

What’s the difference between enforced and applied? Making a configuration change is applying it. That doesn’t, of course, stop someone else from coming along behind you and applying a new configuration. Policies, by contrast, are configured separately and protected by a unique set of permissions that govern who can modify them — they’re not wide open to the day-to-day administrators who maintain your servers. In many cases, a review/approve workflow must be followed to change a policy. Once set, the policies are continually applied to manageable elements such as MFT client software and MFT servers. A server administrator can’t simply reconfigure a server, because the policy prevents it. The MFT solution ensures that your entire MFT infrastructure stays properly configured all the time.
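
That permission boundary is what turns “applied” into “enforced.” As another hypothetical sketch (again, no real product’s API), the policy store only accepts changes from its designated owners, so a day-to-day server admin’s reconfiguration attempt is simply rejected:

```python
class PolicyStore:
    """Toy model of an enforced policy: only owners may modify it."""

    def __init__(self, owners):
        self._owners = set(owners)
        self._settings = {"encryption": "AES-256"}

    def get(self, key):
        return self._settings[key]

    def set(self, user, key, value):
        # Enforcement: changes from anyone outside the owner set are refused.
        if user not in self._owners:
            raise PermissionError(f"{user} may not modify policy")
        self._settings[key] = value

store = PolicyStore(owners={"security-team"})
store.set("security-team", "encryption", "AES-256")   # allowed: policy owner
try:
    store.set("server-admin", "encryption", "none")   # rejected: enforced
except PermissionError as err:
    print(err)
```

A real MFT product layers review/approve workflows and audit trails on top, but the enforcement principle is the same.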

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!