This morning I was asked if I recommended using transport encryption or file encryption to protect company files and data.

My answer:  “Use both of them, together!”

For starters, here’s a real quick summary of both encryption types:

  • Transport encryption (“data-in-transit”) protects the file as it travels over protocols such as FTPS (SSL/TLS), SFTP (SSH) and HTTPS.  Leading solutions use key strengths up to 256 bits (e.g., AES-256).
  • File encryption (“data-at-rest”) encrypts an individual file so that if it ever ended up in someone else’s possession, they couldn’t open it or see the contents.  PGP is commonly used to encrypt files.

I believe that using both together provides a double layer of protection.  Transport encryption protects the files while they are moving, and PGP protects the file itself – especially important after it’s been moved and is sitting on a server, laptop, USB drive, smartphone or anywhere else.

Here’s an analogy:  Think of transport encryption as an armored truck transporting money from, say, a retail store to a bank.  99.999% of the time that armored Brinks truck will deliver your money securely and without incident.  But adding a second layer of protection – putting the money in a safe before loading it onto the truck – dramatically reduces the chance of compromise, both during and after transport.
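To make the “safe inside the armored truck” idea concrete, here’s a toy Python sketch. The XOR one-time pad below is only a stand-in for real file encryption such as PGP (don’t use it in production), and the transport layer is described only in comments; the point is that the file remains ciphertext even after transport encryption ends.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR one-time pad -- a stand-in for real file encryption such as PGP."""
    return bytes(b ^ k for b, k in zip(data, key))

payload = b"payroll.csv contents"
key = secrets.token_bytes(len(payload))   # pad must be as long as the data, never reused

at_rest = xor_cipher(payload, key)        # layer 1: file encryption (the "safe")
# Layer 2 -- transport encryption (the "armored truck", e.g. SFTP or FTPS) --
# would wrap `at_rest` only while it moves. Once the transfer completes, the
# file itself is still ciphertext:
assert at_rest != payload
assert xor_cipher(at_rest, key) == payload  # XOR is its own inverse
```

Even if an attacker copies the file off the destination server, they hold only ciphertext; that is the extra protection the second layer buys.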

One last piece of advice:  Ensure that your organization has stopped using the FTP protocol for transferring any type of confidential, private or sensitive information.  Although it’s an amazing accomplishment that FTP is still functional after 40 years, please realize that FTP provides neither encryption nor guaranteed delivery – not to mention that tactically deployed FTP servers scattered throughout your organization lack the visibility, management and enforcement capabilities that modern Managed File Transfer solutions offer.

“My company still relies heavily on FTP.  I know we should be using something more secure, but I don’t know where to begin.”

Sound familiar?

The easy answer is that you should migrate away from antiquated FTP software because it could be putting your company’s data at risk – unsecured data is obviously an enormous liability.  Not only does FTP pose a real security threat, but it also lacks many of the management and enforcement capabilities that modern Managed File Transfer solutions offer.

No, it won’t be as daunting a task as you think.  Here are a few steps to help you get started:

  • Identify the various tools that are being used to transfer information in, out, and around your organization.  This would include not only all the one-off FTP instances, but also email attachments, file sharing websites, smartphones, EDI, etc.  Chances are, you’ll be surprised to learn some of the methods employees are using to share and move files and data.
  • Map out existing processes for file and data interactions.  Include person-to-person, person-to-server, business-to-business and system-to-system scenarios.  Make sure you really understand the business processes that consume and rely on data.
  • Take inventory of the places where files live.  Servers, employee computers, network directories, SharePoint, ordering systems, CRM software, etc.  After all, it’s harder to protect information that you don’t even know exists.
  • Think about how much your company depends on the secure and reliable transfer of files and data.  What would the effects be of a data breach?  How much does revenue or profitability depend on the underlying business process and the data that feeds them?
  • Determine who has access to sensitive company information.  Then think about who really needs access (and who doesn’t) to the various types of information.  If you’re not already controlling access to company information, it should be part of your near-term plan.   Not everybody in your company should have access to everything.
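The first discovery step can even be partially automated. This sketch is my own illustration, not a product feature: it walks a directory tree of scripts and configs and flags files that reference plain-FTP endpoints, which is one quick way to surface one-off FTP usage.

```python
import re
from pathlib import Path

# \b keeps "sftp://" and "ftps://" from being flagged; only plain ftp:// matches.
FTP_URL = re.compile(r"\bftp://[^\s\"']+", re.IGNORECASE)

def find_ftp_endpoints(root: str) -> dict:
    """Map each file under `root` to any plain-FTP URLs found inside it."""
    hits = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip it
        urls = FTP_URL.findall(text)
        if urls:
            hits[str(path)] = urls
    return hits
```

Point it at directories holding batch scripts, cron jobs and application configs; email attachments, file sharing sites and other ad-hoc channels will still need a manual survey.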

Modern managed file transfer solutions deliver not only the security you know your business requires, but also the ability to better govern and control your data – as well as visibility and auditing capabilities across all of your organization’s data interactions, including files, events, people, policies and processes.

So what are you waiting for?

 

Many customers today expect ‘WAN acceleration’ technology (sometimes referred to as WAN Optimization) as part of their MFT vendor’s solution offering. In general this is a useful addition to the MFT feature set, and can certainly reduce file transfer times in a wide variety of scenarios. However, customers should have realistic expectations of what these acceleration technologies can offer, and be cognizant of the limitations and constraints imposed by the carrier network itself.

Customers should question any absolute, unequivocal claims an MFT vendor makes regarding performance improvements achieved using their particular approach.  A claim of “7x” or “30x” improvement without any documented caveats is simply not credible. The key point is that observed performance enhancements in the WAN are probabilistic, not deterministic. A file transfer occurring multiple times between the same endpoints will in all likelihood produce different latency measurements depending on a large number of factors:

  • Time of day
  • Day of week
  • Physical media traversed
  • Design of intervening switch fabrics and router queues
  • SLAs with the carrier
  • End-to-end QoS provisioning (if any)
  • Burstiness (jitter) of co-mingled traffic, etc.
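That variability is easy to see even in a toy model. The simulation below is purely illustrative – the latency and congestion figures are made-up parameters, not measurements – but it shows why repeated “transfers” between the same endpoints produce a wide spread of timings rather than one deterministic number.

```python
import random
import statistics

def simulated_transfer_ms(base=40.0, jitter=15.0, congestion_prob=0.2):
    """One simulated WAN transfer: fixed path latency, random jitter, and an
    occasional queueing spike caused by bursty commingled traffic."""
    latency = base + random.uniform(0, jitter)
    if random.random() < congestion_prob:
        latency += random.uniform(50, 200)  # congestion adds a large, rare delay
    return latency

random.seed(7)  # reproducible illustration
samples = [simulated_transfer_ms() for _ in range(1000)]
print(f"min={min(samples):.1f}ms  median={statistics.median(samples):.1f}ms  "
      f"max={max(samples):.1f}ms")
```

A vendor quoting a single “7x faster” figure is, in effect, quoting one draw from a distribution like this one.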

Techniques for improving WAN performance vary by vendor: data caching, compression, truncation, protocol optimization (usually proprietary, as an enhancement to TCP at the transport layer), traffic shaping, and de-duplication, just to name a few. Customers should ask many questions and perform their own “real world” tests to ensure they are in fact receiving the transfer performance improvements they expect, under conditions that are common to their WAN environment.
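To illustrate one of those techniques, here’s a minimal de-duplication sketch. It is a simplification – real WAN optimizers typically use variable-size, content-defined chunking – but the core idea is the same: identical chunks are stored and transmitted only once, and a manifest of hashes rebuilds the original stream.

```python
import hashlib

def dedup_chunks(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, storing each distinct chunk once --
    the core idea behind de-duplication in WAN acceleration."""
    store = {}      # chunk hash -> chunk bytes (sent/stored only once)
    manifest = []   # ordered hashes needed to rebuild the original stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        manifest.append(digest)
    return store, manifest

def rebuild(store, manifest):
    return b"".join(store[d] for d in manifest)

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # lots of repeated content
store, manifest = dedup_chunks(data)
assert rebuild(store, manifest) == data
assert len(store) == 2 and len(manifest) == 4    # 4 chunks, only 2 distinct
```

Notice that the savings depend entirely on how repetitive the data is – which is exactly why any single “Nx improvement” claim needs caveats.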

Over the last few weeks, we’ve been putting the final touches on our next generation of services that will be delivered via the cloud. As with any product or service release, a fair amount of planning goes into it, including ensuring that one has the best insight into competitors, forecasts and, of course, customers. We’ve worked closely with industry analysts, our end-users and prospects, and our own internal resources to best understand how and where we should position our cloud services. In presentation after presentation and conversation after conversation, we were shown market slides depicting the enormous growth and opportunity within the overall software-as-a-service (SaaS) market. The natural reaction is to get excited about all the money we can make in this space; before we did, I issued a strong warning to our team:

“In very much the same way that software is analogous to infrastructure, software as a service is not analogous to infrastructure as a service. That includes integration as a service. The profile of the consumer of SaaS will more than likely expect that things like integration, interoperability, transformation and governance will be part of the service subscription.”

In a nutshell what I was saying was… do not look at forecasts for SaaS and assume that the opportunities for IaaS follow the same trends. If users create content by using services that are delivered via the cloud, they have a reasonable expectation that this content can be shared with other services delivered via the cloud (not necessarily by the same vendor). For example, creating content via salesforce.com and sharing that content with gooddata.com should be as simple as granting the necessary permissions. After all, my Facebook, Twitter and Google+ information is shared by clicking a few buttons. Make no mistake, integration and interoperability are nontrivial, but part of the expectation of using cloud services is that the consumer is shielded from these complexities. As more and more cloud service platforms and providers build in integration and governance technologies the need for a separate IaaS provider will likely diminish.

Don’t get me wrong, I still believe that there is a place for technologies such as managed file transfer and business-to-business integration and collaboration; I definitely believe that Ipswitch will play a significant role in the evolution of those markets. Expect the role of Ipswitch to evolve as well; not only will we provide the best mechanisms for moving content of any size, but we will also govern (or let you govern) that movement and the entire experience around it. This is the centerpiece of Ipswitch’s Cloud strategy.

Last week I ranted a bit about the importance of governing your cloud vendors.  At about the same time, Ipswitch’s Frank Kenney participated in a panel discussion on cloud security at the Interop conference in Las Vegas.

As you know, there is great debate over whether cloud services are secure enough for businesses to use.  I believe that the cloud model will quickly evolve and prove itself to the point where its security is deemed no riskier than relying solely on on-premises tools.

I also believe that member-driven organizations such as the Cloud Security Alliance – which focus on providing security assurance within Cloud Computing – will help us get there.

At the Interop discussion, Frank Kenney spoke about the safety of the cloud.  Here’s what he had to say:

“Cloud customers have the obligation to assess the risk of allowing data to be stored in a cloud based on how valuable it is to the customers…. The cloud is as secure as you want it to be.

Cloud services can provide value if performance and service-level agreements align with what customers need.  If not, customers shouldn’t buy them.  It’s not ‘the sky is falling’.  Assign risks appropriately.  Security is just one of many things you have to do.”

During the past year, we shared news of our expanded partner program and new partner web portal, reinforcing our commitment to the channel.

Today, we’re very excited to share news that our suite of MOVEit solutions will now be made available for sale through North American distributor Tech Data.

“Adding MOVEit to their portfolio ensures that our partners will have a strategic offering to meet the evolving needs of their customers,” said Gary Shottes, president, Ipswitch File Transfer.

“Businesses of all sizes are looking to VARs to support their security and compliance needs, and Tech Data and Ipswitch are working together to ensure that VARs have access to the support they need to add the MOVEit solutions to their offerings,” said Stacy Nethercoat, vice president at Tech Data.

Our channel partners will continue to be a critical component of the Ipswitch File Transfer worldwide sales team, providing customers with advisory and consultative solutions.  Please do visit our partner webpage to find a local Distributor or Reseller.

Let’s do a news recap of yesterday. Some tax legislation was passed, lame-duck Congress, celebrity mishaps, missteps and gossip as usual. Oh, and there was also notification of a few data breaches, most notably McDonald’s, the University of Wisconsin and the Gawker website (the folks who bought a prototype of the iPhone 4 after it was lost by an Apple engineer). Unlike the “it’s been two weeks and it’s still in the news” WikiLeaks data breach, expect McDonald’s, UW and Gawker to melt into the ether of public consciousness along with Jersey Shore, AOL and two-dollar-a-gallon gas.

Lately, we are seeing more companies and institutions admitting to data breaches. Passwords get hacked, and ATM cards, identities and cell phones are stolen all the time. Expect to hear about more breaches as companies move ahead of legislation that would force them to disclose security breaches, and expect the media to pick up on the stories and run wild with them. This forces the public to look closer at the type of data breach, the type of data that was stolen, and what the company or institution did to cause the breach.

 For example:

  • The McDonald’s breach was about third-party contractors and not enough governance around customer e-mail;
  • The UW breach was about unauthorized access to databases over a two-year period – again, not enough governance around data storage and access;
  • The Gawker breach was about outdated encryption mechanisms and a rogue organization purposely trying to embarrass that community.

Of these three things, the Gawker breach is most troubling because of the organized and intentional motivations of a rogue organization. This is why the FBI is involved. For the past year I’ve been telling you to classify your data, assign risk to your data and mitigate that risk appropriately. Old news.

The new news is this: even a breach involving low-risk information can damage your brand, and damage to the brand can be costly to repair. So when classifying risk, be sure to consider not just the loss of the data itself but also the nature of a media hell-bent on reporting any and all data breaches.

This just in… I’m getting that watch I always wanted for Christmas because I compromised that space in the attic where we hide all the gifts. Happy holidays!

The Ziff Davis survey on Managed File Transfer did a nice job highlighting the aspects of currently deployed file transfer methods that people think need the most improvement.

Checking in at #1 and #2 on the “improvements needed to my existing file transfer methods” list are SPEED and SECURITY.  This only fuels the age-old debate of productivity versus security… but that’s a topic for another day!  Needless to say, it’s not surprising that about half of survey respondents say they need faster file transfers, and roughly the same number say they require stronger security.

Other items on the “improvements” wish list include:  reliability, capacity, scalability, central management, workflow integration, IT infrastructure integration and compliance.

It’s validating to see in the graphic that areas where MFT solutions excel today closely map to those aspects of existing file transfer methods that people say require the most improvement — Reliability, speed, security, up-time and capacity round out the top five.  Efficiency is a common theme with all these items, driven largely by time-sensitive business-critical processes and even SLAs depending on fast and highly available file transfer processes and workflows.

The last point I want to make about the “needs improvement” survey results is that no solution (MFT or other) will magically make a company compliant.  There is no holy grail to achieving regulatory, regional, industry or corporate compliance.  Rather, compliance is the end result of a strategically implemented, documented and monitored initiative that encompasses the entire arsenal of company-sanctioned policies, tools, and of course processes and employee actions.

Coming soon:  I’ve got a few more musings about the survey that focus on deployment challenges as well as the business benefits of MFT.


Ziff Davis recently published a study on Managed File Transfer that heralds MFT solutions as “the unsung security and compliance solution”.  Eric Lundquist sets the stage nicely:

“Everyone is talking about the need to collaborate more effectively and put employees closer to customers in a real time business environment.

But until you can assure the security, privacy, and compliance requirements of data transfer, the collaborative enterprise is just a good idea.  MFT is one of those enabling technologies designed to make it a reality.”

The study found that security concerns about current file transfer methods include the usual suspects: encryption, viruses, user authentication, backup, hacking, enforcing security policies, managing external users, auditing, reporting and defining security policies.

Not surprisingly, data from the study shows that many of the very security concerns people had with their organization’s current file transfer methods are actually strengths of today’s MFT solutions.

Keep in mind that many organizations still rely on homegrown scripts and point-to-point solutions, oftentimes using the unencrypted FTP protocol for transport – and with very little visibility, management or policy enforcement.  In addition to being time-consuming and expensive to manage and maintain (and commonly built by developers who left the company years ago), many existing file transfer methods are insecure and introduce risk and inefficiency into an organization.

Plus, many companies haven’t even begun to crack the person-to-person nut of file transfer beyond relying on corporate email, unsanctioned personal email or file sharing websites, and even sneakernet!

In my next post, we’ll take a closer look at some of the areas where the study identified MFT solutions as being superior to many commonly used methods for file transfer.

The real highlights for me at last week’s SecureWorld Expo were the attendees who visited Ipswitch’s tradeshow booth.  From global enterprises to small business owners, public utilities to brand name consumer products companies, the people I met described challenging business problems and showed genuine interest in managing and protecting their data.

A couple of visitors jump to mind:

  • The ex-Secret Service agent (Electronic Crimes Task Force), now an independent consultant, who came straight to SecureWorld after flying cross-country to attend another security conference in Atlanta.  Her curiosity about managed file transfer solutions, and her breadth of knowledge about encryption methods and sources of risk I had never even considered, gave us lots to talk about.
  • The Chief ISO from the CA Dept of Water Resources, one of at least 10 people I met from local environmental agencies or private utilities.   I had no idea that the business of managing natural resources was so data intensive!  They have a huge amount of traffic between and among state and county agencies, and send hundreds if not thousands of files per week to private businesses, citizen groups, and individual consumers.  Many of these files contain sensitive information, making it an ideal scenario for Ipswitch’s managed file transfer solutions that can handle high volume data files sent programmatically to a wide number of recipients.

These two booth visitors highlight two days’ worth of insightful conversations I had with customers, prospects and fellow vendors.  Needless to say, I’m very excited to dive into the MFT space and look forward to sharing more insights.

I had the pleasure of attending the SecureWorld Expo last week in Santa Clara, CA, right in the heart of Silicon Valley.   Although it was a relatively small show, the audience was feisty!   And as the first tradeshow I’ve attended as an Ipswitch employee, and my first security-themed show, there was a ton to learn.

The range of exhibitors and their offerings was impressive and instructive.  Attendees (and this reporter) had the opportunity to learn about end point security, patch management, threat management appliances, disaster recovery, identity management, and much more.

Here are a few vendors that caught my eye:

  • ESET – whose live, 2-inch long cockroaches drew cringes and well-earned attention to their anti-virus solutions set;
  • Websense – whose DLP solution is a great complement to Ipswitch’s managed file transfer products, as it automatically identifies content that likely contains data that is sensitive and needs to be secured;
  • Veracode – which is changing the game in application security testing with its SaaS static testing and analysis offerings.

I was tapped to be a panelist for a breakout session entitled “Data Protection – Walking the Thin Line Between Employee Productivity and Security”.  It was a great subject that my fellow panelists handled very well, demonstrating their deep knowledge about security solutions and how they fit (or don’t) within corporate cultures.  I look forward to exploring these questions with Ipswitch’s customers and at other tradeshows in the coming months.

The most insightful conversations I had at the show were with attendees who visited our booth.  More  on those conversations soon….

Tonight I’m blogging from the PCI Council Community Meeting here in Orlando, FL.  Tomorrow we’ll be talking about the new changes in version 2.0 of the PCI DSS audit requirements (set to go into effect in 2011), but tonight was the welcome reception for the 1000 attendees here at the Buena Vista Palace Hotel.

Participation in the PCI Council Community Meeting conference is on the rise.  Two years ago there were about 500 attendees from 300 participating organizations – now the numbers have roughly doubled.  There are probably two major factors behind this.

One factor is the de facto status of PCI DSS as one of the gold standards of information security.  When five competing credit card companies came together in 2004 to publicly agree on a single security standard there was much rejoicing throughout the industry.  And the standard has held up: though major releases have come every two years, the original twelve categories and most of the subcategories remain essentially unchanged from the original.

The second factor is the ever-widening circle of companies that fall under the scope of PCI compliance.  Originally it was large credit card processors and retailers, but in recent years even companies that only handle a few dozen credit card transactions a year have had to take notice.  And as the scope widens, there are more people who want their voices to be heard in the decision-making process, which is where this week’s conference comes in.

I’ll be posting a few more items about this conference in the next few days – please stay tuned.