Big File Transfer

Picture a rural health system sending large, high-res brain scans from a stroke patient to a city hospital for analysis. Now picture a field engineer sharing 3-D AutoCAD files of a new oil field with decision-makers at corporate. Imagine an enterprise migrating stored data from an old data center to a new one. Today’s collaborative workforce, in almost every industry, depends on big file transfer capabilities.

When it comes to many professional indulgences, you’ve enjoyed the advantages of “rolling your own.” But big file transfers, despite your good intentions to save money, aren’t something to do by yourself. When you take an honest look at the simplicity (and security) of today’s managed file transfer solutions, it makes no sense to go solo. DIY file transfer systems put a big strain on your already overextended IT department and pose big challenges for regulatory compliance due to the security risks involved.

DIY File Transfer Systems for Larger Files Make No Sense

Without clear policies in place, employees usually send large files over email or turn to a consumer file transfer application. With Microsoft Exchange, for instance, the default attachment size limit is 10 MB; Gmail caps attachments at 25 MB. MIME encoding bloats the file even more, which confuses your users (“How can my 20 MB attachment be too big when the limit says 25?” they’ll ask). Even if the message gets out of your network, the recipient’s server may reject the attachment, resulting in delivery failure. Plus, when you’re backing up your email servers, these large attachments require a lot of storage space.
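The size math is easy to sanity-check. Here’s a minimal Python sketch (the 20 MB figure is just a hypothetical attachment, not a measurement from any mail system) showing why a file that looks under the limit on disk can exceed it once MIME base64-encodes it for transport:

```python
import base64

# Minimal sketch: MIME uses base64 for binary attachments, which expands the
# payload by roughly a third (4 output bytes for every 3 input bytes), before
# counting headers and line breaks.
raw_size_mb = 20                       # hypothetical 20 MB attachment
encoded_size_mb = raw_size_mb * 4 / 3  # ~26.7 MB once encoded

print(f"{raw_size_mb} MB file -> ~{encoded_size_mb:.1f} MB after base64 encoding")

# Quick sanity check on an in-memory buffer: 3,000,000 bytes become 4,000,000.
sample = b"\x00" * 3_000_000
print(len(base64.standard_b64encode(sample)))
```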

When delivery fails, your senders usually open up your worst enemy: a consumer file transfer app such as Dropbox, which doesn’t actually transfer the file to a recipient. Instead it stores the file in the cloud, and the recipient gets a link. The safety of your file, therefore, depends on third-party cloud security. If employees use several different apps, you’ve not only lost control of data security in transit; you have no idea where sensitive data lives.

Why Not Automate With FTP?

Many IT departments develop ad hoc file transfer scripts to meet the varying needs of their customers. Your IT department may even be using legacy scripts created by employees who’ve since left the company. It’s also likely no one is updating these scripts to satisfy the demands of transferring a high volume of large files, both into and out of the enterprise. With usernames, IP addresses and libraries constantly in flux, it’s inconvenient for IT to tweak automation scripts all the time. When it comes to large file transfers, there’s no reason for your IT department to be coordinating a process it should have delegated a long time ago.

The inconvenience of updating automation scripts isn’t FTP’s biggest hindrance. Big file transfers via FTP can be incredibly slow, depending on your network’s upload speeds. And when a transfer hits a firewall or a session times out, FTP offers no automated way to resume it.
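To make that concrete, here’s a minimal sketch of the kind of ad hoc script this is describing (the host, account, password and file names are entirely hypothetical): hard-coded credentials, no encryption on the wire, and nothing that retries or resumes if the session drops.

```python
from ftplib import FTP

FTP_HOST = "203.0.113.10"    # hypothetical partner server; edit by hand when it changes
FTP_USER = "nightly_export"  # breaks silently when the account is renamed or retired
FTP_PASS = "p@ssw0rd"        # plain-text credential sitting in the script

def push_report(local_path: str, remote_name: str) -> None:
    ftp = FTP(FTP_HOST)
    ftp.login(FTP_USER, FTP_PASS)                  # no transport encryption
    with open(local_path, "rb") as fh:
        ftp.storbinary(f"STOR {remote_name}", fh)  # a timeout mid-file means starting over
    ftp.quit()

push_report("export.csv", "export.csv")
```

Every username, address or directory change means hand-editing scripts like this on every server where a copy lives.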

Also, opening additional ports means increasing your network security risk, and FTP transfer itself just isn’t secure. A Ponemon Institute study shared by Computer Weekly revealed that 44 percent of IT leaders felt they had no ability to control user access to sensitive documents. Additionally, six in 10 admitted sending sensitive files to the incorrect recipients. Imagine a large healthcare organization sharing a patient’s file from a picture-archiving and communication system (PACS) over FTP and delivering it to the wrong recipient or exposing it to a data breach. Or imagine a critical patient file arriving hours later over FTP while a patient anxiously awaits scan results. A managed file-transfer solution, automated from within the PACS workflow, would eliminate these problems.

Rolling Your Own Just Isn’t Worth It

Rolling your own is usually about two things: thriftiness and craftsmanship. In terms of the latter, though, few IT decision-makers can create a file transfer system that’s as fine-tuned as a managed solution. And when you face significant costs related to lost files, process management or regulatory fines, rolling your own becomes pretty expensive.

Going with an MFT solution will take a lot of work and worries off your plate.

Secure and Compliant FTP

The responsibility for safeguarding sensitive company information and transferring it securely falls on already stretched-thin IT departments. Luckily, IT has many file transfer options available: email, FTP, USB drives and EFSS services like Dropbox, to name a few. Yet none are as secure or cost-effective as managed file transfer (MFT).

Simple & Secure File Transfer: 5 Ways to Make it Work for You

MFT gives IT teams the agility they need to respond faster to business needs, while reducing the time and resources required for file transfer operations. Here are five ways MFT makes IT better at their job:

  1. Secure and reliable transfers lift the burden from IT professionals. MFT provides a single-source solution with built-in security and encryption capabilities. This means all file transfers – whether they are process-to-process, person-to-process or process-to-person – are guaranteed to be protected.
  2. Out-of-the-box solutions free up valuable time and space. An MFT system offers out-of-the-box solutions that can easily be integrated into an existing IT infrastructure. Implementing a turn-key solution means that file transfer can be managed by less experienced IT administrators.
  3. Streamlined automation improves IT productivity. Many file transfers are initiated on a recurring basis. IT teams can get bogged down confirming transfers to meet SLAs.  The automation that comes with MFT promptly pushes data to the right person at the right time. This means that the IT team doesn’t have to think twice and can remain focused on other tasks.
  4. It’s IT friendly and eliminates errors. MFT incorporates admin, end-user access, analytics and reporting, and automation and workflow. This helps IT teams avoid tedious manual tasks that can lead to errors. Not to mention protection against a security breach via integration with capabilities like encryption and data loss prevention.
  5. Predictable reporting improves visibility and offers support for IT professionals. For regulated businesses (banks, hospitals, etc.), in-depth reporting is a critical need for file transfer systems. An MFT system incorporates reporting capabilities that ensure firms adhere to strict compliance regulations and are able to provide accurate data in the case of an audit – and fast.

Since businesses run on data, the transfer of data is the heart of today’s organizations – and with a solid MFT system, IT teams know that all data is protected while in transit and at rest.

>> Check out “Simple & Secure File Transfer: 5 Ways to Make it Work for You” to learn more about how we help IT teams with managed file transfer.

 

There is so much to absorb at RSA Conference. The largest gathering of security vendors, solution providers and practitioners in the U.S. certainly didn’t disappoint, as the Moscone Center was buzzing with security education and, of course, lots of thought-provoking conversations.

Many of the people I spoke with shared similar concerns: data breach risk, tighter compliance and auditing requirements, and a lack of visibility and control over the tools people inside their organizations are using to share files and data. IT leaders are feeling pressure (and rightfully so) to regain control over how people share files with other people. It was also great to hear so many people talking about migrating to public and private clouds to take advantage of benefits such as quick provisioning and elasticity.

My favorite conversations at conferences are usually the ones I have with current customers… and RSA was no exception. Quite frankly, the key insights I learn from talking with customers help me do my job better. Many thanks to the dozen or so Ipswitch customers who stopped by our booth and shared stories of how they consolidated and replaced the homegrown file transfer tools and scripts, assorted vendor products, and manual processes they had been relying on with an Ipswitch MFT solution. The result: more efficient business processes, a simpler way to demonstrate compliance, and consistently enforced security policies for all their file transfer and file sharing activities.

It’s no secret that more and more companies are turning to the cloud to benefit from all that it has to offer. Subscribing to a cloud service can offer conveniences over deploying software on-premises, including faster deployment, budgeting flexibility, built-in elasticity, near-perfect uptime and a significantly lighter load on IT resources.

Managed File Transfer (MFT) is certainly not being left behind in this cloud revolution.  According to Gartner, adoption of MFT Cloud Services is growing rapidly and now accounts for approximately 10% of the overall MFT market.  While both on-premises and cloud markets will continue to grow about 20% annually, cloud services will become a bigger piece of the MFT pie.

Here’s a nifty graph from the Ponemon Institute’s recently published “The Security of Cloud Infrastructure” report summarizing key cloud drivers from the perspective of both IT/Security and Compliance respondents. It’s interesting to see that many people believe cloud services will improve security and compliance efforts compared with doing it themselves on-premises with their own resources.

So, how do you feel about cloud security? Are you comfortable with your organization’s data being moved into the cloud? What cloud security measures would make you feel better?

Looking back at 2011, we saw more and more employees using consumer-grade (and often personally owned) file sharing technologies such as USB drives, smartphones, personal email accounts, and file sharing websites to move sensitive company information.  We’ve learned that employees will “do what they need to do” to be productive and get their job done… And if IT doesn’t provide them with the right tools, they will find their own.

2011 was also a record-breaking year for data breaches. Coincidence? Perhaps. But there is no denying that the increased use of non-sanctioned technology in the workplace has created a security loophole in many organizations. It will become increasingly important for organizations to mitigate this risk to avoid a failed security or compliance audit or, worse, a data breach.

Ipswitch can help your organization meet the security, usability and visibility requirements for file sharing. For example, our Ad hoc Transfer module for MOVEit DMZ enables organizations to enforce consistent policies and processes around person-to-person file transfers – email encryption, attachment offloading, secure messaging, eDiscovery, and more. It not only gives companies unparalleled governance, but it also allows end users to send information to anyone in a fast, easy, secure, visible, and well-managed way.

We will be talking a lot more about the topic of person-to-person file sharing in 2012, so stay tuned….

Information flows into, within and out of organizations faster and in greater volumes than ever before.  Complicating matters is the growing number of vendor systems, applications and platforms that make up your company’s business infrastructure and touch even your most sensitive and mission-critical information.

If you don’t have visibility into the data and files that are flowing between systems, applications and people — both inside and beyond the company firewall — things can go haywire very quickly:

  • Lost files, security breaches and compliance violations
  • Broken SLAs and other processes that are dependent on files
  • No file lifecycle tracking as data flows between applications, systems and people
  • Damaged partner and customer relationships
  • Lost opportunities

Relying on the reporting capabilities of each individual system has proven to be risky and inefficient. Chances are, you’re swimming in a sea of not-very-useful-or-actionable data and static reports that are already a week behind what’s actually happening in your company this very instant.

In today’s blog video, Frank Kenney shares his thoughts on why having one consolidated view is critical and why organizations are having such a hard time achieving visibility.

[youtube]http://www.youtube.com/watch?v=ow3l1AetI_Q[/youtube]

When it comes to your file transfers, many questions exist. Do you have the total visibility your business requires? How do your customers gain visibility into their file transfers? Do you have all the information you need to meet your service level agreements (SLAs) and to provide transparency into integration and file transfers? Let Ipswitch help you answer these questions and overcome your visibility challenges.

This morning I was asked if I recommended using transport encryption or file encryption to protect company files and data.

My answer:  “Use both of them, together!”

For starters, here’s a real quick summary of both encryption types:

  • Transport encryption (“data-in-transit”) protects the file as it travels over protocols such as FTPS (SSL), SFTP (SSH) and HTTPS.  Leading solutions use encryption strengths up to 256-bit.
  • File encryption (“data-at-rest”) encrypts an individual file so that if it ever ended up in someone else’s possession, they couldn’t open it or see the contents.  PGP is commonly used to encrypt files.

I believe that using both together provides a double layer of protection. The transport encryption protects the files while they are moving, and the PGP encryption protects the file itself – especially important after it’s been moved and is sitting on a server, laptop, USB drive, smartphone or anywhere else.
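Here’s a minimal sketch of what that double layer can look like in practice, assuming the gpg command-line tool and the paramiko SSH library are available; the host, recipient key and file names are purely illustrative, not a reference to any particular product.

```python
import subprocess
import paramiko

# Layer 1 - file encryption (data-at-rest): PGP-encrypt the file so it stays
# protected wherever it lands after the transfer.
subprocess.run(
    ["gpg", "--encrypt", "--recipient", "records@hospital.example",
     "--output", "scan.dcm.gpg", "scan.dcm"],
    check=True,
)

# Layer 2 - transport encryption (data-in-transit): move the already-encrypted
# file over SFTP (SSH), so it is also protected on the wire.
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()                     # trust hosts already in known_hosts
ssh.connect("transfer.hospital.example", username="radiology")
sftp = ssh.open_sftp()
sftp.put("scan.dcm.gpg", "/incoming/scan.dcm.gpg")
sftp.close()
ssh.close()
```

Even if the encrypted file is later copied off the destination server, it remains unreadable without the recipient’s private key.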

Here’s an analogy: think of transport encryption as an armored truck transporting money from, say, a retail store to a bank. 99.999% of the time that armored Brinks truck will securely transport your delivery without incident. But adding a second layer of protection – say you put the money in a safe before putting it in the truck – dramatically reduces the chance of compromise, both during and after transport.

One last piece of advice: ensure that your organization has stopped using the FTP protocol for transferring any type of confidential, private or sensitive information. Although it’s an amazing accomplishment that FTP is still functional after 40 years, please realize that FTP does not provide any encryption or guaranteed delivery – not to mention that tactically deployed FTP servers scattered throughout your organization lack the visibility, management and enforcement capabilities that modern Managed File Transfer solutions provide.

“My company still relies heavily on FTP.  I know we should be using something more secure, but I don’t know where to begin.”

Sound familiar?

The easy answer is that you should migrate away from antiquated FTP software because it could be putting your company’s data at risk – unsecured data is obviously an enormous liability. Not only does FTP pose a real security threat, but it also lacks many of the management and enforcement capabilities that modern Managed File Transfer solutions offer.

No, it won’t be as daunting a task as you think. Here are a few steps to help you get started:

  • Identify the various tools that are being used to transfer information in, out, and around your organization (see the port-check sketch after this list for one way to start finding one-off FTP instances). This would include not only all the one-off FTP instances, but also email attachments, file sharing websites, smartphones, EDI, etc. Chances are, you’ll be surprised to learn some of the methods employees are using to share and move files and data.
  • Map out existing processes for file and data interactions.  Include person-to-person, person-to-server, business-to-business and system-to-system scenarios.  Make sure you really understand the business processes that consume and rely on data.
  • Take inventory of the places where files live.  Servers, employee computers, network directories, SharePoint, ordering systems, CRM software, etc.  After all, it’s harder to protect information that you don’t even know exists.
  • Think about how much your company depends on the secure and reliable transfer of files and data.  What would the effects be of a data breach?  How much does revenue or profitability depend on the underlying business process and the data that feeds them?
  • Determine who has access to sensitive company information.  Then think about who really needs access (and who doesn’t) to the various types of information.  If you’re not already controlling access to company information, it should be part of your near-term plan.   Not everybody in your company should have access to everything.
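As promised in the first bullet, here’s one crude but quick way to start the inventory: check which machines still answer on the standard FTP control port. This is a minimal sketch (the host list is a placeholder; a real pass would pull candidates from your asset inventory or network scans), and an open port 21 is only a lead to investigate, not proof of active use.

```python
import socket

CANDIDATE_HOSTS = ["203.0.113.10", "203.0.113.11", "files.internal.example"]  # placeholders

def ftp_port_open(host: str, timeout: float = 2.0) -> bool:
    """Return True if the host accepts a TCP connection on port 21 (FTP control)."""
    try:
        with socket.create_connection((host, 21), timeout=timeout):
            return True
    except OSError:
        return False

for host in CANDIDATE_HOSTS:
    status = "FTP port open" if ftp_port_open(host) else "no FTP response"
    print(f"{host}: {status}")
```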

Modern managed file transfer solutions deliver not only the security you know your business requires, but also the ability to better govern and control your data, along with visibility and auditing capabilities across all of your organization’s data interactions – files, events, people, policies and processes.

So what are you waiting for?

 

Many customers today expect ‘WAN acceleration’ technology (sometimes referred to as WAN Optimization) as part of their MFT vendor’s solution offering. In general this is a useful addition to the MFT feature set, and can certainly reduce file transfer times in a wide variety of scenarios. However, customers should have realistic expectations of what these acceleration technologies can offer, and be cognizant of the limitations and constraints imposed by the carrier network itself.

Customers should question any absolute, unequivocal claims an MFT vendor makes regarding performance improvements achieved using their particular approach.  A claim of “7x” or “30x” improvement without any documented caveats is simply not credible. The key point is that observed performance enhancements in the WAN are probabilistic, not deterministic. A file transfer occurring multiple times between the same endpoints will in all likelihood produce different latency measurements depending on a large number of factors:

  • Time of day
  • Day of week
  • Physical media traversed
  • Design of intervening switch fabrics and router queues
  • SLAs with the carrier
  • End-to-end QoS provisioning (if any)
  • Burstiness (jitter) of co-mingled traffic, etc.

Techniques for improving WAN performance vary by vendor: data caching, compression, truncation, protocol optimization (usually proprietary, as an enhancement to TCP at the transport layer), traffic shaping, and de-duplication, just to name a few. Customers should ask many questions and perform their own “real world” tests to ensure they are in fact receiving the transfer performance improvements they expect, under conditions that are common to their WAN environment.
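One simple way to run such a test is to repeat the same transfer many times and look at the spread rather than a single headline number. Here’s a minimal sketch, assuming a test file is reachable over HTTPS at the far end of the link in question (the URL and run count are placeholders):

```python
import statistics
import time
import urllib.request

TEST_URL = "https://files.remote-site.example/testfile-100MB.bin"  # placeholder test file
RUNS = 10

timings = []
for _ in range(RUNS):
    start = time.monotonic()
    with urllib.request.urlopen(TEST_URL) as resp:
        while resp.read(1024 * 1024):   # drain the response in 1 MB chunks
            pass
    timings.append(time.monotonic() - start)

print(f"median: {statistics.median(timings):.1f}s  "
      f"min: {min(timings):.1f}s  max: {max(timings):.1f}s  "
      f"stdev: {statistics.stdev(timings):.1f}s")
```

If the spread between the best and worst runs is large at different times of day, a single “Nx faster” figure from any vendor should be treated as a best case, not a guarantee.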

Over the last few weeks, we’ve been putting the final touches on our next generation of services that will be delivered via the cloud. As with any product or service release, there comes a fair amount of planning, including ensuring that one has the best insight into competitors, forecasts and, of course, customers. We’ve worked closely with industry analysts, our end users and prospects, and our own internal resources to best understand how and where we should position our cloud services. In presentation after presentation and in conversation after conversation, we were presented with market slides showing the enormous growth and opportunity within the overall software as a service (SaaS) market. The natural reaction is to get excited about all the money we can make in this space; before we did, I issued a strong warning to our team:

“In very much the same way that software is analogous to infrastructure, software as a service is not analogous to infrastructure as a service. That includes integration as a service. The profile of the consumer of SaaS will more than likely expect that things like integration, interoperability, transformation and governance will be part of the service subscription.”

In a nutshell what I was saying was… do not look at forecasts for SaaS and assume that the opportunities for IaaS follow the same trends. If users create content by using services that are delivered via the cloud, they have a reasonable expectation that this content can be shared with other services delivered via the cloud (not necessarily by the same vendor). For example, creating content via salesforce.com and sharing that content with gooddata.com should be as simple as granting the necessary permissions. After all, my Facebook, Twitter and Google+ information is shared by clicking a few buttons. Make no mistake, integration and interoperability are nontrivial, but part of the expectation of using cloud services is that the consumer is shielded from these complexities. As more and more cloud service platforms and providers build in integration and governance technologies the need for a separate IaaS provider will likely diminish.

Don’t get me wrong, I still believe that there is a place for technologies such as managed file transfer and business-to-business integration and collaboration; I definitely believe that Ipswitch will play a significant role in the evolution of those markets. Expect the role of Ipswitch to evolve as well; not only will we provide the best mechanisms for moving content of any size, but we will also govern (or let you govern) that movement and the entire experience around it. This is the centerpiece of Ipswitch’s Cloud strategy.

Last week I ranted a bit about the importance of governing your cloud vendors.  At about the same time, Ipswitch’s Frank Kenney participated in a panel discussion on cloud security at the Interop conference in Las Vegas.

As you know, there is great debate over whether cloud services are secure enough for businesses to use.  I believe that the cloud model will quickly evolve and prove itself to a point where security is deemed no riskier than doing business with solely on-premises tools.

I also believe that member-driven organizations such as the Cloud Security Alliance – which focus on providing security assurance within Cloud Computing – will help us get there.

At the Interop discussion, Frank Kenney spoke about the safety of the cloud. Here’s what he had to say:

“Cloud customers have the obligation to assess the risk of allowing data to be stored in a cloud based on how valuable it is to the customers…. The cloud is as secure as you want it to be.

Cloud services can provide value if performance and service-level agreements align with what customers need.  If not, customers shouldn’t buy them.  It’s not ‘the sky is falling’.  Assign risks appropriately.  Security is just one of many things you have to do.”

I, like many others, have received security notifications about the Epsilon data breach. In the last 48 hours I have been sent email warnings from eight companies that I trusted with my personal information – banks, retailers and hotels.

These companies entrusted my private contact information to Epsilon, a third-party email marketing company… and that information has now been compromised by hackers. Awesome.

Details of this massive breach are still rolling in, but so far the list of affected companies is known to include: Ameriprise Financial; Best Buy; Brookstone; Capital One; Citibank; Disney Destinations; Hilton; Home Shopping Network; JPMorgan Chase; Kroger; LL Bean Visa Card; Marriott; QVC; Robert Half; Red Roof Inn; Ritz-Carlton; Target; The College Board; TiVo; US Bank; Walgreens; 1-800-FLOWERS. And there are likely many more that we haven’t heard about yet.

The Epsilon e-mail breach is a warning about the data security standards employed by third-party service providers, as well as a not-so-subtle reminder to organizations to require strong contractual obligations related to security practices from every business partner and third-party provider they do business with. As we learned with Epsilon, the privacy – and trust – of your customers may depend on it.

Lastly, be on the lookout for scam emails in your inbox. The Epsilon breach is an example of how hackers can now match your name and email address to companies that you interact with, so get ready for the onslaught of emails trying to trick you into handing over your online usernames and passwords. I suggest not clicking links embedded in emails; instead, always go to the company website directly and log on from its homepage. Check out this informative article on The Last Watchdog for more on spear phishing risks, as well as some commentary by Ipswitch’s Frank Kenney on data breaches and customer notifications.