Ipswitch has launched a new version of our WS_FTP Server solution.  Customers can now deploy WS_FTP Server in a failover configuration.

WS_FTP Server can now be configured to support automatic, unattended failover, enabling your organization to easily achieve high availability for your file transfer processes.  Not only will you increase system uptime, reliability, and performance, but you will now be able to provide uninterrupted access to file transfer users – all critical for helping your company deliver exceptional business performance and meet service level agreements around availability.

Take a quick minute and watch Ipswitch’s Jonathan Lampe share his thoughts on our new failover capability for WS_FTP Server:

I’ve been back on the road visiting file transfer customers and there’s growing concern out there about the ability to track and predict failure against defined service level agreements (SLAs).  In general, I’m seeing most SLAs in our industry cleave to one or more of the following requirements:

1) Application Availability:  Did our service meet the 99.xxx% goal we set?  Most companies I’ve seen track this in minutes per month and year, and some track this by visibility to key customers.  For example, if the file transfer service was unexpectedly down at 3am but only 15 customers would have noticed, can we count it as an outage for only those 15?

2) Round-trip Response Time:  Does our service reliably return results from incoming submissions within X time?  This is big at data centers that self-identify as “item processors” or have an “EDI/transmissions” group.  This can also be further specified by class of customer or work (e.g., higher priority transactions) and time of day.

3) Expected Data Within Defined Transfer Window:  Did we receive (or send) the “right” files during the transmissions window from X:XX to Y:YY?  This one can be harder than it looks.  First, “right files” definitions often have dependencies on control or summary files, plus specific file formats, names and sizes.  Then there is the additional challenge of predicting which bundles are “running late” and the question of whether to fire warning alerts with 30 minutes or 15 minutes to go (a quick sketch of this check – and of the availability math in item 1 – follows this list).
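To make the first and third requirements concrete, here is a minimal Python sketch – my own illustration, not anything shipped with WS_FTP Server – that turns an availability goal into a monthly downtime budget and flags a transmissions window that is “running late”.  The file names, window and thresholds are all hypothetical:

from datetime import datetime

def downtime_budget_minutes(availability_pct, days=30):
    """Translate a 99.xxx% availability goal into minutes of allowed downtime."""
    return days * 24 * 60 * (1 - availability_pct / 100)

def window_warning(expected, received, window_end, now, thresholds=(15, 30)):
    """Warn if expected files are still missing as the transfer window closes."""
    missing = sorted(set(expected) - set(received))
    minutes_left = (window_end - now).total_seconds() / 60
    for threshold in sorted(thresholds):  # report the tightest threshold crossed
        if missing and minutes_left <= threshold:
            return "%d file(s) still missing with %.0f min to go: %s" % (
                len(missing), minutes_left, missing)
    return None

# 99.9% allows about 43 minutes of downtime in a 30-day month.
print("%.1f minutes/month" % downtime_budget_minutes(99.9))

# Hypothetical window ending at 06:00, with a control file still missing at 05:40.
print(window_warning(
    expected=["summary.ctl", "batch_001.edi"],
    received=["batch_001.edi"],
    window_end=datetime(2011, 1, 10, 6, 0),
    now=datetime(2011, 1, 10, 5, 40),
))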

Even with these common requirements in the field, the nature of SLAs continues to evolve.  As we see additional trends develop, we’ll continue to note them – please expect more information in the coming months.

There are many reasons why organizations have shifted their approach to file transfer away from a purely tactical point solution (often driven by the new or immediate need of a single business unit) toward a strategic capability that is now an important part of an organization’s overall business operation.

Jonathan Lampe recently published an insightful article on CIO titled “The Evolution of File Transfer in 2011: From Tactical to Strategic”.  Jonathan makes a compelling case that the increased focus on (and backlash from) data breaches and compliance regulations has played a big role in this evolution.

As Jonathan points out, the grace period for lapses in personal data protection is thankfully over!  Managed File Transfer technology is increasingly being leveraged as a strategic tool, not only to facilitate the secure transfer of files, but also to provide much-needed visibility, management and enforcement of company data, both within an organization and between external partners and customers.  And it does so with auditing and reporting capabilities that satisfy even the most strictly governed environments – not to mention support for person-to-person transfer, transformation and application integration.

Some highlights of what to expect with the MFT evolution in 2011:

“First, there will be the ongoing challenge to present interfaces and metaphors that are relevant to today’s end users – the days of an FTP client on every desktop are long ago.

“Second, there will be increased pressure to more closely integrate with enterprise middleware, authentication and monitoring/control technology.

“Finally, there will be the ongoing need to present and manage more information about the flows of data, all within the context of tightening regulations around data privacy.”

Take a quick read of the CIO article – it’s well worth 5 minutes of your time.

If your file transfer solution could look into the future and predict 3 things for you, what would they be?

To kick this off, here’s a list of predictive needs I often hear from customers:

1) Am I about to miss my service levels, and which ones are about to cost me the most?

2) If I grow X% next year, or take on a new body of traffic Y, what do I need to plan for in terms of system capacity, staffing and related technology?  (A back-of-the-envelope sketch follows this list.)

3) Can I test a new transmissions proposal as if the test items were really coming from real people during real transmissions windows…all without affecting production?
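Item 2, at its simplest, is a compound-growth projection.  Here is a back-of-the-envelope Python sketch with entirely made-up numbers (real traffic rarely grows this evenly):

def projected_daily_transfers(current, growth_pct, years=1):
    """Project transfer volume, assuming growth compounds evenly year over year."""
    return current * (1 + growth_pct / 100.0) ** years

# e.g., 50,000 transfers/day growing 25% per year for two years
print(projected_daily_transfers(50000, growth_pct=25, years=2))  # 78125.0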

Would these be your top 3 predictors as well?  We’d love to know either way.

I just returned from the PCI Security Standards Council meeting.  It was great to spend a couple of days talking tech and trends with other security experts.

The hottest trend this year in the payment security industry is “tokenization”.  This technology lifts credit card numbers out of data sets and replaces them with unique one-way tokens (e.g., “234cew23”).  The original credit card numbers are stored in a “secure token vault” and may only be retrieved by authorized people and processes that present another set of credentials (preferably two-factor credentials).
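To make the idea concrete, here is a toy Python sketch of tokenization.  This is my own illustration, not a PCI-certified implementation – the TokenVault class, the sample card number and the credential check are all made up:

import secrets

class TokenVault:
    """Toy token vault: swaps card numbers (PANs) for opaque random tokens.
    A real vault needs hardened storage, key management and access control."""

    def __init__(self):
        self._token_to_pan = {}  # token -> original card number
        self._pan_to_token = {}  # reuse one token per card number

    def tokenize(self, pan):
        if pan not in self._pan_to_token:
            token = secrets.token_hex(4)  # short opaque value, in the spirit of "234cew23"
            self._pan_to_token[pan] = token
            self._token_to_pan[token] = pan
        return self._pan_to_token[pan]

    def detokenize(self, token, authorized=False):
        # Stand-in for the (preferably two-factor) credential check.
        if not authorized:
            raise PermissionError("caller may not retrieve card numbers")
        return self._token_to_pan[token]

vault = TokenVault()
record = {"name": "J. Smith", "card": "4111111111111111", "exp": "12/13"}
record["card"] = vault.tokenize(record["card"])  # the data set no longer holds the PAN
print(record)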

Businesses find tokenization compelling because PCI requirements state that data sets with credit card numbers must be treated with more care than data sets without that information (e.g., just your name, expiration date, etc.).  The higher degree of care often translates into full encryption, good key management, regular key rotation and a host of other security controls.  All these extra controls cost money, so if businesses can ratchet down the sensitivity of their data with tokenization, they can enjoy cost savings by not having to implement (or audit) those controls.

Anyone buying in at this stage would be an early adopter: the Council has not yet endorsed the use of this technology.  However, the Council has formed a working group to come up with specific guidance (e.g., are hashes OK and, if so, which ones; are unique IDs OK; etc.), so some level of future acceptance seems likely.  So far the working group has only provided a definition of the technology (essentially, the one I provided above), but a draft recommendation from the Council with specifics is expected around the new year.

GT News, an association for financial professionals, just posted an article on managed file transfer titled “Data: Transferring the Burden Under PCI DSS”, written by Jonathan Lampe, VP of Product Management at Ipswitch.

“When evaluating for data security technology, a company should look at four categories: confidentiality, integrity, availability, and auditing. These headlines are designed to assist in assessing whether a data technology or process is likely to provide one-time compliance for the purposes of PCI DSS.”

This article is a very informative read for people living with (or coping with) PCI DSS compliance and looking for a detailed application of MFT solutions to the 12 PCI DSS requirements.  It’s also a good read for people who simply want to know more about MFT and learn about Jonathan’s framework for evaluating data security technologies.
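Of the four categories in Jonathan’s framework, integrity is perhaps the easiest to picture in code: verify a checksum when a file arrives.  Here is a minimal sketch (mine, not the article’s), with hypothetical file and digest names:

import hashlib

def sha256_of(path):
    """Hash the file in chunks so large transfers don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_intact(path, sender_digest):
    """Compare the hash computed on arrival with the one the sender published."""
    return sha256_of(path) == sender_digest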