
CJEU Rejects Safe Harbor Rules for User Data Transfer

If you’ve been following the news, the CJEU has just rejected the safe harbor rules put into place 15 years ago. The ruling could leave many global companies in a tough spot, specifically those that rely on the free transfer of data between the EU and US. Companies likely to be affected include not only US social media sites, but also US cloud file sharing services like Dropbox (and the customers who use those services to store EU citizens’ personal data), global retailers with buyers in the EU, and any US business that manages the personal data of EU citizens.

User Privacy Impacts ‘Business As Usual’

Although the changes are not immediately in effect, the demands of user privacy will likely impact ‘business as usual’. The ruling is an obvious backlash against NSA surveillance of citizens’ online activities without their knowledge or consent. But the cost to global businesses is that it will now be harder to provide services and move data between the US and Europe.

“If the Safe Harbor rules in place since 2000 are done away with, each country in the European Union could potentially set its own privacy rules and regulations, creating enormous barriers to U.S. firms doing business there.” – USA Today, Europe’s top court rejects ‘Safe Harbor’ ruling

CISOs at global companies are now scrambling to find ways to comply with the new ruling. It goes without saying that user privacy is extremely important and should be a fundamental right, but this ruling affects more than Facebook and Google, who may have anticipated and already addressed the issue within their organizations. It will most likely change how companies need to handle data flows between the two continents. About half the world’s data is exchanged between Europe and the US, and rejecting safe harbor means drastic changes for small and medium businesses alike.

When I spoke with my colleague Alessandro Porro in London this morning about the news, he had the following to say:

“The strike-down of the Safe Harbor agreement by the Court of Justice of the European Union (CJEU) adds a large amount of uncertainty and risk to any enterprise whose business involves data movement between the EU and US. Safe Harbor was found not to meet the requirements of the Data Protection Directive. Whilst the EU’s general approach to data protection has been agreed, the actual regulation is still in consultation, so there could be the flexibility to include clear guidance for these firms. However, it would be fair to assume that this could impact the target adoption date, which is currently the end of the year. Businesses should start working immediately to audit their data sharing practices, including use of US cloud sharing services like Dropbox, so that they understand exactly where they stand and are ready to act when further guidance is issued.”

Tough for Tech But Win for User Rights

On the other side of this, advocates of user privacy as a fundamental right are cheering a huge win. Edward Snowden was quick to tweet about the ruling from his new Twitter handle.

In either case, it will be interesting to see how the tech industry reacts to this. Companies will need to start getting a little more creative about how they share data between the US and EU.

What is your company doing to adjust to the new rules?

Related Articles

Practical Guide to Control and Compliance

How ready is your organization for an evolving regulatory landscape and growing security risks?

>> Engage with us next month during the Ipswitch Innovate 2015 User Summit, a two-day (October 21-22) online event for IT pros to learn from each other and our product experts.


I recently attended CIOboston, a CIOsynergy event headlined as “A New Dimension to Problem Solving Within the Office of the CIO”. We talked about paradigm shifts propelled by technologies like the cloud, the new engagement models needed between business and IT, and the changing world of expectations, to name a few topics. But before getting to all this, our moderator Ty Harmon of 2THEEDGE posed a simple question to the 50 or so attending CIOs and senior IT heads: “What are your challenges?”

Here are the answers that I have assembled. I think there is value in seeing what was/is top of mind for IT leaders in raw form:

  • How do we make the right choices between capital and expense?  Service offerings are growing and additive – the spend never ends.
  • How do we integrate multiple cloud vendors to provide business value?
  • User expectations are being set by the likes of Google and Amazon for great UX, 7X24 support, etc. – but it is my IT staff that is expected to deliver all that on our budget. The business does not want to see the price tag – but they want the same experience that is available at home from these giants.
  • IT needs to run like a business but this takes a lot of doing. It matters how we talk and collaborate. We have to deliver business results that must be measurable.
  • Adoption of the cloud is a challenge. How do we assess what is out there? It is not easy to do apples-to-apples comparisons and security is a big concern.
  • How do we go from private to public cloud? Current skill sets are limited.
  • We are constrained by vendors that are not keeping up with the new technologies! One piece of critical software may want an earlier version of Internet Explorer to run; another may use an obsolete version of SQL Server, etc. This clutter prevents IT departments from moving forward.
  • Business complexity is a challenge. IT is asked to automate – but we must push back to first simplify business processes.
  • “Shadow IT” is an issue. A part of the business goes for a “shiny object” rather than focusing on what is the problem that really needs to be solved. They do so without involving IT. Then IT is expected to step in and make it all work, integrate with other software and support it.
  • Proving ROI is a challenge.
  • Balancing performance, scalability and security is tough.
  • How do you choose old vs. new, flexibility vs. security? It isn’t easy.
  • How do we support more and more devices?
  • How do you fill security holes that are in the cloud?
  • How do you manage user expectations and find the right balance in supporting them when you have limited resources?

Many heads nodded as these challenges were voiced. But all agreed that these are exciting times, and that IT will push forward through them and be recognized as the true business enabler that it is. What are your thoughts—were you nodding your head at these questions?

Earlier this week we published new survey findings around IT frustrations with manual file transfer, with the vast majority of respondents equating the process with sitting in traffic. TechRadar Pro reporter Juan Martinez wrote a story about the findings, and I had the chance to catch up with him via email on a few questions he had.

Relatively new to the managed file transfer (MFT) space, Juan wanted to understand why file transfer can be so challenging for today’s organizations, and where the technology is headed. These two great questions get at what’s driving file transfer today, and I thought the content of our email exchange would be interesting to anyone curious about the future of MFT.

So why is file transfer a challenge? For a lot of reasons – but mainly because it’s becoming increasingly complex, with end-user adoption of EFSS (enterprise-grade file sync & share) solutions that not only create data security issues, but also result in additional systems for IT to manage and support. And at the high end, MFT can get absorbed into major IT undertakings that require an immense investment and a consultative implementation rather than an out-of-the-box solution.

The challenge in getting managed file transfer right is balancing the needs of collaborative file sharing against file-based, system-to-system integration. End users demand simple file sharing solutions that are quick to get started with, while IT demands compliance with corporate and regulatory security standards. It is easy to focus on one end of this while ignoring the other. Ipswitch understands that there are multiple file transfer scenarios and that organizations today are looking to centralize and consolidate their file transfer systems into one secure, ready-to-use solution. One major area to look into is complete visibility and control over file transfer processes; this is becoming increasingly important as compliance mandates proliferate and become more encompassing.

And so, in our view, MFT is clearly evolving, with the market moving toward secure, manageable, and scalable systems at the core. But MFT is more than just file transfer. Ipswitch sees the need for tightly integrated transfer automation around the system core that allows IT to manage the exchange of any volume of transfers, while efficiently processing files to prepare them for the next step in a business process.

We understand that transfers happen in the context of B2B relationships, and we envision a system that wraps every exchange in metadata about the partners and workflows served by exchange events. We imagine a system in which a broad range of end-user and system-to-system workflows can be accommodated, with client tools that synchronize across partners, empower mobile workers, automate local and remote transfer processes, and intelligently control the flow of content between partners.

As I noted to Juan, Ipswitch has work going on today in all of these areas, and our customers should expect great things in the years to come.

To read Juan’s story, click here: IT Professionals are dissatisfied with file transfer processes, concerned about security

In part one, we heard from Stewart Bond of Info-Tech Research Group on his predictions for the Managed File Transfer (MFT) market. Next up we have Terri McClure, Senior Analyst at ESG (@esganalysttmac), and her thoughts on the IT trends for 2014.

Changing Role of IT: Over the last year there has been a notable increase in the number of end users and LOB managers choosing their own work platforms, a result of growing consumerization and BYOD trends. In addition, cloud-based solutions like online file sharing applications make it incredibly easy for employees to purchase and deploy on their own with just a few clicks over the internet. As a result, IT no longer plays a command-and-control role, and many IT professionals are struggling with how to deal with these changes in order to keep control over, and secure, company data. Some have tried to block unauthorized “rogue” application usage, only to find employees traveling to their local Starbucks or using personal hotspots to bypass company VPNs and networks. Now, more and more IT teams are embracing the change and proactively playing an advisory role, helping employees be productive while simultaneously steering them toward a solution that will meet corporate needs around privacy and security.

Increase in Enterprise File Sharing: Corporate file sharing application usage is expanding throughout organizations and crossing organizational boundaries. In 2012, ESG research indicated that the majority of online file sharing and collaboration application usage was limited to departments or groups, but over the last year we’ve seen more and more organizations using sync-and-share applications to collaborate not only across departments, but with external users like contractors, partners, and clients as well. To ease IT concerns around sharing corporate data, many vendors have responded by adding granular permission controls and including simple data loss prevention and digital rights management functionality.

Security: Security is still top of mind, but flexibility and the ability to integrate with existing IT systems and tools are increasingly important to IT. Security features like end-to-end encryption, antivirus, and remote wipe are still among the most requested sync-and-share features, but as solutions mature, a certain level of security is becoming table stakes for enterprise IT. Customers are increasingly interested in the ability to integrate solutions with their existing storage through hybrid or private cloud online file sharing deployments, and want increased flexibility with other existing tools (content management, backup, data analytics, mobile application management, etc.).

There is certainly not a lack of perspectives on the IT trends in the year ahead but I’m interested in what the readers think! Leave your thoughts below and feel free to keep this discussion going on Twitter with me @Cheri29.

Possibly not. The Internet’s venerable File Transfer Protocol (FTP) is usually supported by Managed File Transfer (MFT) systems, which can typically use FTP as one of the ways in which data is physically moved from place to place. However, MFT essentially wraps a significant management and automation layer around FTP. Consider some of the things an MFT solution might provide above and beyond FTP itself—even if FTP was, in fact, being used for the actual transfer of data:

  • Most MFT solutions will offer a secure, encrypted variant of FTP as well as numerous other more‐secure file transfer options. Remember that FTP by itself doesn’t offer any form of transport-level encryption (although you could obviously encrypt the file data itself before sending, and decrypt it upon receipt; doing so involves logistical complications like sharing passwords or certificates; a minimal sketch of this pre-encryption approach follows this list).
  • MFT solutions often provide guaranteed delivery, meaning they use file transfer protocols that give the sender a confirmation that the file was, in fact, correctly received by the recipient. This can be important in a number of business situations.
  • MFT solutions can provide automation for transfers, automatically transferring files that are placed into a given folder, transferring files at a certain time of day, and so forth.
  • MFT servers can also provide set‐up and clean‐up automation. For example, successfully‐transferred files might be securely wiped from the MFT server’s storage to help prevent unauthorized disclosure or additional transfers.
  • MFT servers may provide application programming interfaces (APIs) that make file transfer easier to integrate into your internal line‐of‐business applications.
  • MFT solutions commonly provide detailed audit logs of transfer activity, which can be useful for troubleshooting, security, compliance, and many other business purposes.
  • Enterprise‐class MFT solutions may provide options for automated failover and high availability, helping to ensure that your critical file transfers take place even in the event of certain kinds of software or hardware failures.
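
To make the first bullet’s parenthetical concrete, here is a minimal sketch of pre-encrypting a file before sending it over plain FTP. This is not Ipswitch or MFT product code: the Python `cryptography` package, the host name, the credentials, and the file names are all assumptions for illustration.

```python
# Minimal sketch (assumed tooling): encrypt the payload before a plain-FTP upload.
# The `cryptography` package, server, credentials and file names are illustrative.
from ftplib import FTP
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # must be shared with the recipient out-of-band
cipher = Fernet(key)

with open("report.csv", "rb") as src:
    ciphertext = cipher.encrypt(src.read())   # symmetric, authenticated encryption

with open("report.csv.enc", "wb") as dst:
    dst.write(ciphertext)

# The FTP session itself is still unencrypted; only the file contents are protected.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="demo", passwd="secret")
    with open("report.csv.enc", "rb") as payload:
        ftp.storbinary("STOR report.csv.enc", payload)
```

The recipient decrypts with the same shared key (`Fernet(key).decrypt(ciphertext)`), which is exactly the key-distribution headache the bullet alludes to and one of the things an MFT layer manages for you.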

In short, FTP isn’t a bad file transfer protocol—although it doesn’t offer encryption. MFT isn’t a file transfer protocol at all; it’s a set of management services that wrap around file transfer protocols—like FTP, although that’s not the only choice—to provide better security, manageability, accountability, and automation.

In today’s business, FTP is rarely “enough.” Aside from its general lack of security—which can be partially addressed by using protocols such as SFTP or FTPS instead—FTP simply lacks manageability, integration, and accountability. Many businesses feel that they simply need to “get a file from one place to another,” but in reality they also need to:

  • Make sure the file isn’t disclosed to anyone else
  • Ensure, in a provable way, that the file got to its destination
  • Get the file from, or deliver a file to, other business systems (integration)
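
As a point of reference for the SFTP option mentioned above, the sketch below swaps plain FTP for SFTP using the third-party paramiko library. The host, port, credentials, and paths are placeholders, and note that this only secures the transport; it provides none of the management capabilities in the list above.

```python
# Minimal sketch (assumed tooling): an SFTP upload with the third-party `paramiko`
# library. Host, credentials and paths are placeholders. This encrypts the session,
# but offers no delivery confirmation, auditing, or automation on its own.
import paramiko

transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="demo", password="secret")

sftp = paramiko.SFTPClient.from_transport(transport)
try:
    sftp.put("report.csv", "/inbound/report.csv")   # local path -> remote path
finally:
    sftp.close()
    transport.close()
```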

In some cases, the business might even need to translate or transform a file before sending it or after receiving it. For example, a file received in XML format may need to be translated to several CSV files before being fed to other business systems or databases—and an MFT solution can provide the functionality needed to make that happen.
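
As a rough illustration of that kind of transformation (outside of any particular MFT product), the sketch below flattens a hypothetical orders XML file into CSV using only the Python standard library. The element and field names are invented for the example; an MFT workflow would typically run a step like this automatically after a file arrives.

```python
# Minimal sketch: flatten a hypothetical <orders><order id="..."> XML file into CSV.
# Element and attribute names are invented; real feeds will differ.
import csv
import xml.etree.ElementTree as ET

tree = ET.parse("orders.xml")
rows = [
    {
        "order_id": order.get("id"),
        "customer": order.findtext("customer", default=""),
        "total": order.findtext("total", default="0"),
    }
    for order in tree.getroot().findall("order")
]

with open("orders.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["order_id", "customer", "total"])
    writer.writeheader()
    writer.writerows(rows)
```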

Many organizations tend to look at MFT first for its security capabilities, which often revolve around a few basic themes:

  • Protecting data in‐transit (encryption)
  • Ensuring that only authorized individuals can access the MFT system (authorization and authentication)
  • Tracking transfer activity (auditing)
  • Reducing the spread of data (securely wiping temporary files after transfers are complete, and controlling the number of times a file can be transferred)

These are all things that a simple FTP server can’t provide. Having satisfied their security requirements, organizations then begin to take advantage of the manageability capabilities of MFT systems, including centralized control, tracking, automation, and so forth—again, features that an FTP server alone simply can’t give you.

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!

Definitely not. To begin with, there are numerous kinds of encryption—some of which can actually be broken quite easily. One of the earlier common forms of encryption (around 1996) relied on encryption keys that were 40 bits in length; surprisingly, many technologies and products continue to use this older, weaker form of encryption. Although a 40-bit key space contains roughly a trillion possible keys, relatively little computing power is needed to break the encryption—a modern home computer can do so in just a few days, and a powerful supercomputer can do so in a few minutes.
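
The scale of that weakness is easy to sanity-check with back-of-the-envelope arithmetic. The guess rate below is an assumed figure for a home PC of that era, not a number from the original text.

```python
# Back-of-the-envelope check on brute-forcing a 40-bit key space.
# The keys-per-second rate is an assumption, not a measured benchmark.
keyspace = 2 ** 40                 # 1,099,511,627,776 possible keys
keys_per_second = 10_000_000       # assumed rate for an ordinary desktop PC

worst_case_days = keyspace / keys_per_second / 86_400
print(f"{keyspace:,} keys; worst case about {worst_case_days:.1f} days")
# -> 1,099,511,627,776 keys; worst case about 1.3 days
```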

So all encryption is definitely not the same. That said, the field of cryptography has become incredibly complex and technical in the past few years, and it has become very difficult for business people and even information technology professionals to fully understand the various differences. There are different encryption algorithms—DES, AES, and so forth—as well as encryption keys of differing lengths. Rather than try to become a cryptographic expert, your business would do well to look at higher‐level performance standards.
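
For contrast with those 40-bit keys, here is what a modern, FIPS-approved algorithm looks like in code: a minimal AES-256-GCM sketch using the third-party Python `cryptography` package. The package choice and key handling are illustrative assumptions, and FIPS compliance in practice also depends on using a validated cryptographic module (see below), not merely the right algorithm.

```python
# Minimal sketch (assumed tooling): AES-256 in GCM mode via the `cryptography` package.
# AES is the algorithm specified by FIPS 197; this sketch says nothing about whether
# the module running it is FIPS 140-2 validated.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key, vs. the 40-bit keys above
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # never reuse a nonce with the same key

ciphertext = aesgcm.encrypt(nonce, b"contents of quarterly-results.xlsx", None)
assert aesgcm.decrypt(nonce, ciphertext, None).startswith(b"contents")
```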

One such standard comes under the US Federal Information Processing Standards. FIPS specifications are managed by the National Institute of Standards and Technology (NIST); FIPS 140‐2 is the standard that specifically applies to data encryption, and it is managed by NIST’s Computer Security Division. In fact, FIPS 140‐2 is accepted by both the US and Canadian governments, and is used by almost all US government agencies, including the National Security Agency (NSA), and by many foreign ones. Although not mandated for private commercial use, the general feeling in the industry is that “if it’s good enough for the paranoid folks at the NSA, it’s good enough for us too.”

FIPS 140‐2 specifies the encryption algorithms and key strengths that a cryptography package must support in order to become certified. The standard also specifies testing criteria, and FIPS 140‐2 certified products are those products that have passed the specified tests. Vendors of cryptography products can submit their products to the FIPS Cryptographic Module Validation Program (CMVP), which validates that the product meets the FIPS specification. The validation program is administered by NIST‐certified independent labs, which not only examine the source code of the product but also its design documents and related materials—before subjecting the product to a battery of confirmation tests.

In fact, there’s another facet—in addition to encryption algorithm and key strength—that further demonstrates how all encryption isn’t the same: back doors. Encryption is implemented by computer programs, and those programs are written by human beings— who sometimes can’t resist including an “Easter egg,” back door, or other surprise in the code. These additions can weaken the strength of security‐related code by making it easier to recover encryption keys, crack encryption, and so forth. Part of the CMVP process is an examination of the program source code to ensure that no such back doors exist in the code—further validating the strength and security of the encryption technology.

So the practical upshot is this: All encryption is not the same, and rather than become an expert on encryption, you should simply look for products that have earned FIPS 140‐2 certification. Doing so ensures that you’re getting the “best of breed” for modern cryptography practices, and that you’re avoiding back doors, Easter eggs, and other unwanted inclusions in the code.

You can go a bit further. Cryptographic modules are certified under FIPS 140‐2, but the encryption algorithms themselves are specified by their own standards, such as FIPS 197 (the Advanced Encryption Standard) and FIPS 180 (the Secure Hash Standard, which covers SHA‐1 and related hash algorithms), with keyed hashes (HMAC) covered by FIPS 198. By selecting a product that uses certified cryptography, you’re assured of getting the most powerful, most secure encryption currently available.

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!

On Wednesday, November 3 and Thursday, November 4, Ipswitch File Transfer will be exhibiting and speaking at SecureWorld Expo, the leading regional security conference that brings together the security leaders, experts, senior executives, and policy makers who are shaping the very face of security.

The “Exhibits and Open Sessions Registration” for SecureWorld Expo is complimentary and it gives you access to the expo floor, the keynote presentations, and open industry expert panels. Plus, you’ll get to hear the luncheon keynote from L. Frank Kenney, “The Data Breaches You Don’t See Hurt You The Most,” and the industry expert panel “Data Protection: Walking the Thin Line Between Employee Productivity and Security.”

Here are the details:

What: SecureWorld Expo – Dallas

Where: Plano Convention Centre, Plano, TX

When: November 3, 2010 and November 4, 2010

Why: Meet the Ipswitch File Transfer team, learn about our solutions (from WS_FTP to MessageWay), listen in on L. Frank Kenney’s luncheon keynote, and keep up to date on the latest in the security world!

Plus, if you visit us and mention this blog post, you’ll receive a Starbucks gift card – on the spot!

See you in Dallas!

“Every so often, you have to SYH (shake your head) at the acronyms created by technology companies.”
– Shane O’Neill, Publisher of CIO magazine and CIO.com

O’Neill has a great point. I remember back in my freelance days I was in some meetings where project managers would reach into a box of Alpha-Bits, grab a handful, toss them on the table and produce the newest acronyms for their latest projects.

Just the other day I was working on a post and came across an acronym I was unfamiliar with. I Googled it, I hit Wikipedia and eventually I figured it out, but it took me much longer than I thought it would take.

Who knew there would be so many definitions for three little letters?

O’Neill poses a lighthearted but interesting question in his article “Ten Ridiculous New Tech Acronyms.” O’Neill asks if it is “any surprise that acronyms have taken over our lives? They fit perfectly in our fast-paced, multi-tasking society. Why say something in words if you can say it in letters?”

When you consider our industry, O’Neill says that the tech acronyms “can be inscrutable, unintentionally funny, accidentally crass, or just goofy. In total, they add up to a big steaming bowl of alphabet soup.”

Here’s an OMG look at some new LOL acronyms: “Ten Ridiculous New Tech Acronyms”

I’ve been asked at least a dozen times over the last month, “What are the benefits of a cloud-based hosted subscription versus an on-premises software deployment?”

“Though this be madness, yet there is method in’t.” ~ Hamlet

There are many benefits of going SaaS, just like there are benefits of deploying on-premises.  It all comes down to the problems you are trying to solve, budgeting preferences, and IT resource availability and expertise.  Here are some benefits of going the hosted route.

  • Fast and easy deployment:  SaaS solutions are often available instantly, providing an amazingly fast time-to-value.  You don’t need to install any software/hardware yourself and there are no complicated firewall or security configurations to work through.
  • Budgeting flexibility & lower up front cost:  Hosted subscriptions are treated as an “operating expense” with no capital investment spent on software/hardware.  Pay-as-you-go subscription plans often lead to quicker purchase decisions because there is no need to get CapEx budget sign-off.
  • Less taxing on your IT resources:  SaaS solutions require significantly less effort to deploy and maintain.  There are no ongoing software upgrades, patches or backups for you to worry about, and no complex security/compliance configurations to be responsible for internally.  Plus, there is no underlying infrastructure to assemble and maintain.
  • Built-in scalability:  The elasticity and high bandwidth of SaaS solutions easily handle spikes in usage and grow as organizational needs expand.
  • Near perfect uptime:  Hosted services are often run in a highly available, load-balanced, automatic failover configuration to ensure even the strictest network and application uptime requirements and SLAs are met.

I’d like to also quickly mention that we’ve had numerous customers initially deploy our MOVEit DMZ Hosted Service as a way to get their Managed File Transfer solution up and running quickly, while they continue to work towards an on-premises deployment.

The growth of SaaS can’t be denied…. The question is, whether ’tis SaaS right for your organization?

Did you kill the web?

Let’s check your alibi. Think of how you spent your morning. Normally, I’d share my morning with you here, what websites I’ve visited and what apps I’ve used, but my boss reads my blog posts, and if she knew how much time I spent on … well, let’s let Chris Anderson illustrate the point I’m trying to make:

“You wake up and check your email on your bedside iPad — that’s one app. During breakfast you browse Facebook, Twitter, and The New York Times — three more apps. On the way to the office, you listen to a podcast on your smartphone. Another app. At work, you scroll through RSS feeds in a reader and have Skype and IM conversations. More apps. At the end of the day, you come home, make dinner while listening to Pandora, play some games on Xbox Live, and watch a movie on Netflix’s streaming service. You’ve spent the day on the Internet — but not on the Web. And you are not alone.”

Chris Anderson and Michael Wolff, in an article on Wired.com titled “The Web Is Dead. Long Live the Internet”, present a compelling argument for the demise of the World Wide Web and how “simpler, sleeker services”, like apps, “are less about the searching and more about the getting.”

Peer to peer file transfers are among the suspects at the crime scene:

“The applications that account for more of the Internet’s traffic include peer-to-peer file transfers, email, company VPNs, the machine-to-machine communications of APIs, Skype calls, World of Warcraft and other online games, Xbox Live, iTunes, voice-over-IP phones, iChat, and Netflix movie streaming. Many of the newer Net applications are closed, often proprietary, networks.”

This is one of the most interesting articles I’ve read in a while. Give it a read, and feel free to share your thoughts on whether or not you’re placing any yellow crime scene tape over your PC.

MOVEit Crypto, the encryption component used to secure data and settings in MOVEit DMZ and MOVEit Central in mission-critical, Internet-exposed applications, has been revalidated under FIPS 140-2 and has been issued certificate #1363. The certificate should be available on the Cryptographic Module Validation Program (CMVP) website (nist.gov) in one to two weeks.

The changes in MOVEit Crypto that required the revalidation were mainly related to the introduction of “SHA-2” hashes such as SHA-256. As you may already be aware, the use of unkeyed SHA-1 hashes will be disallowed in U.S. government applications by the end of the year. (Weaker hashes such as MD5 and non-cryptographic-quality integrity checks such as CRC are already disallowed.) Fortunately, existing MOVEit products use keyed SHA-1 hashes (not the unkeyed hashes that will soon be disallowed), so existing MOVEit products running the older version of MOVEit Crypto will still be allowed in U.S. government applications well beyond the end of the year.
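
To make the keyed-versus-unkeyed distinction concrete, here is a minimal sketch using only the Python standard library. The key and message values are placeholders, and this illustrates the general concept rather than anything about MOVEit Crypto’s internals.

```python
# Unkeyed hash vs. keyed hash (HMAC). Values are placeholders; this is a concept
# illustration only, not MOVEit Crypto code.
import hashlib
import hmac

message = b"file-integrity-check"
key = b"shared-secret-key"

unkeyed_sha1 = hashlib.sha1(message).hexdigest()                   # anyone can recompute this
keyed_sha1 = hmac.new(key, message, hashlib.sha1).hexdigest()      # requires the secret key
keyed_sha256 = hmac.new(key, message, hashlib.sha256).hexdigest()  # SHA-2 family equivalent

print(unkeyed_sha1, keyed_sha1, keyed_sha256, sep="\n")
```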

Ipswitch File Transfer is going (more) global. We’re thrilled to announce the expansion of our already successful Ipswitch FT Partner Program.  It now boasts a number of new benefits for our global partners, including a new Elite Partner Level and a deal registration program.

The Elite Level expansion was created for those partners looking for even greater association and support from Ipswitch File Transfer.  A new deal registration program has also been introduced, which will incent resellers with additional discount points for registering qualified net new sales opportunities on the FT Partner Portal.

Read more: “Going Global: Ipswitch File Transfer Expands Partner Program”