In the early years of IT, data was stored on paper tapes

What did an IT position look like in the ’70s, ’80s and ’90s? Far fewer mobile endpoints, for one thing. Compared with today, the history of information technology reveals some surprising differences in day-to-day tasks and the technology that was available. IT support has come a long way, folks.

How Far Back?

IT has been around almost as long as humans. If you think about it, hieroglyphics are just a script devs don’t use anymore. Mechanical devices such as the slide rule, the Difference Engine, Blaise Pascal’s Pascaline and other mechanical computers qualify as IT, too. But this particular journey begins well into the 20th century.

The 1970s: Mainly Mainframes

Computers of this era were mostly mainframes and minicomputers, and a history of information technology wouldn’t be complete without mentioning them. IT job roles included manually running user batch tasks, performing printer backups, conducting system upgrades via lengthy procedures, keeping terminals stocked with paper and swapping out blown tubes. IT staff was relegated mainly to basements and other clean rooms that housed the big iron. System interconnectivity was minimal at the time, so people had to bridge those gaps themselves. This was the motivation behind the Internet (or the ARPANET, as it was known then).

The 1980s: Say Hello to the PC

This decade saw the growth of the minicomputer (think DEC VAX computers) and the introduction of the PC. Sysadmins crawled out of the basement and into the hallways and computer rooms of schools, libraries and businesses that needed them onsite. The typical IT roles at this time consisted of installing and maintaining file and print servers to automate data storage, retrieval and printing. Other business roles included installing and upgrading DOS on PCs.

If you worked in a school, you saw the introduction of the Apple II, Commodore 64 and, eventually, the IBM PC. But the personal computer was more expensive, deemed for business use and not deployed in schools very much. It was the Apple II that propelled the education market forward and, if you worked support at a school in the ’80s, you knew all about floppy disks, daisy wheel printers and RS-232 cables.

The 1990s: Cubicles, Windows and the Internet

This generation of IT worked in cubicles (think “Tron” or “Office Space”), often sharing that space alongside the users they supported. Most employees were using PCs with Windows by this time, and IT support was focused on networking, network maintenance, PC email support, Windows and Microsoft Office installations — and adding memory or graphics cards for those who needed them.

Toward the end of the decade, the Web’s contribution to Internet connectivity became arguably the most requested computing resource among growing businesses. Although there was no Facebook, Twitter or LinkedIn yet (Friendster would kick off that trend in 2002), employers still worried about productivity and often limited Web access. Oh, and if you could go ahead and add modems to PCs, run phone lines for those who needed dial-up access and Internet-enable the business LAN, that would be great.

Today’s IT: Welcome to Apple, Patch Tuesday and BYOD

Today’s IT job roles have included the rebirth of Mac support, the introduction of social media (and the blocking of its access at work), constant security patches (Patch Tuesday on Windows, for instance), the advent of BYOD and DevOps automation.

The continued consumerization of IT (essentially now BYOD) meant that IT pros had “that kind” of job where friends and family would ask for help without pause. The one common thread through the years? The growth of automation in the IT role — something that will continue to define tomorrow’s helpdesk.

Image source: Wikimedia Commons

I was asked recently to speak on the “What IT skills/roles should reside in the Business” panel at the Premier CIO Forum in Boston, a well-attended and engaging event supported by SIM (the Society for Information Management). It was an impressive roster of IT executives from across the New England region.

“New technology is now requiring IT and the Business to be extraordinary dancing partners” was the introduction to our panel session, moderated by Sharon Kaiser, CIO of ABIOMED, Inc. My fellow panelists for analyzing the IT/Business “dance” (who should lead, the right steps to follow, the expected pace) were Matthew Ferm, Managing Partner of Harvard Partners, and Hunter Smith, former CIO of Acadian Asset Management. It was a lively discussion with a very participative audience. Here are the highlights:

  • Speed, flexibility and leadership are key for today’s IT. Shadow IT, where pockets of a Business go off on their own to buy, say, cloud services or a product, is usually a response to an unresponsive IT department. The trouble with such approaches is that they often silo IT, and many times the Business will come back needing to integrate a hastily purchased product, or even to get it to run. The lesson: a deep partnership between IT and the Business, continually optimized, is needed. If IT is truly enabling, it will be viewed not just as a gatekeeper but as a partner.
  • To engage well, you need skills in IT and the Business that complement each other. Thus Business Analysis (BA) as a position residing in the Business is very helpful. It ensures requirements are vetted, understood and relatively fixed, and that there is ownership for what IT will be asked to do. But IT also needs BA skills on its side, even if BA is not a job title there. Most importantly, IT must understand business processes deeply, so that the value of a project is understood and valid input can be given on process simplification where warranted. For this to be a true partnership, the BA role in the Business must understand technology and how IT works.
  • Security, Disaster Recovery, and responsibility for LAN/WAN/server environments and access should all reside with IT. Some roles, such as project management (PM), can sit in either IT or the Business, since good PM is driven by data rather than persuasion or vested interest. Other roles, such as QA/Testing, need to go beyond IT verifying that a technology hit the requirements: the Business must also test the actual use cases with a process workflow, so that base assumptions and expected value are actually vetted.

These discussions showed that regardless of company size, the audience had similar experiences: rapidly increasing need for a close, agile relationship between IT and the Business, a huge technology wave of possibilities, and opportunity for re-thinking roles and responsibilities. One must experiment and evolve, as well as establish a strong communications and shared-goal mentality with the Business. I ended by noting, “If you treat IT as a commodity that is what you will get. If you treat it as the leading edge of your Business, you will have a weapon like no other.” The audience very much agreed.

File transfer project blueprint
To complete a successful file transfer project, you need to put a plan in place.

Cutting over to any new software is daunting, but by following a proven methodology – a blueprint, if you will – you can pave the way for success.

The biggest issue I see come up is wanting to move everything over “as-is” into a new solution in a short amount of time. It’s completely understandable – usually file transfer is just one aspect of a very busy administrator’s day – but to be successful, it’s paramount to set the expectation that you need to put a plan in place. In this first of a two-part post, I cover the first two steps of a proven four-step plan for ensuring a smooth implementation.

  1. Research and Preparation – Moving one or several processes over to a new system requires some strategy and thought. First, research which processes will be transitioned over to the new file transfer solution. Make sure to meet with key stakeholders as you come up with the list. It’s a good idea to focus on some small to medium processes to move over first. At the same time, this is an opportunity for some spring cleaning – to eliminate unused processes and make other processes more efficient. Because this is a tedious exercise, it should be done well in advance of the actual implementation. The most successful implementations I’ve seen are those done in phases instead of via a large cut-over that is bound to be stressful and problematic. Whether you stage it by business unit or by specific process, breaking down the implementation into smaller chunks will lead to a successful and seamless implementation.
  2. Implementation and Testing – Once the preparation is done, implementing is typically a straightforward process. It’s good to be familiar with the product and also to have someone on the project team knowledgeable about current processes. When that’s not the case, you need to figure out the relevant processes and translate them into the new product. With custom scripts this can be quite daunting, which is why it’s helpful to use a product that includes integration points and scripts to make things easier. It is crucial to test the system before putting it into production or making changes, to avoid any SLA (Service Level Agreement) violations. Most partners will provide test files to ensure a successful test. Both the partner and the administrator should be aware of the tests, to make sure no test files are processed that could disrupt a production business process. Normally during a test, files are transferred or received, both parties acknowledge successful receipt, and they agree on what should happen after a successful transmission – for example, the file is archived or deleted.
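The acknowledgement step above is often backed by a checksum exchange: both sides compute a digest of the file and compare before archiving or deleting anything. A minimal sketch in Python (the function names and the idea of comparing local copies are illustrative, not taken from any specific product):

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(sent_path, received_path):
    """Both parties compare digests to confirm the file arrived intact."""
    return sha256_of(sent_path) == sha256_of(received_path)
```

In practice the digest travels out-of-band (email, an API call, or the MFT product’s own receipt) alongside the acknowledgement, and the archive-or-delete step runs only once it matches.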

In my next post, I’ll cover steps three and four of this proven methodology.

NHBC Logo
“Ipswitch FT’s secure MOVEit solution gives us full visibility and management of file transfers, and enables us to avoid fines of up to £250,000 for non-compliance…” – Wayne Watson, information security manager, NHBC

The National House-Building Council (NHBC), the UK’s leading home warranty and insurance provider, has greatly expanded its use of MOVEit to ensure the organization adheres to file transfer best practices while meeting compliance with internal standards and external regulators, including the Financial Conduct Authority (FCA).

Securing Builders’ Drawings, Architectural Designs, Legal Files and More
Secure, managed file transfer (MFT) is a high priority for NHBC. In the past six months alone, the company has doubled the number of employees successfully using MOVEit, with over 200 active users now securing file transfers. Its business straddles the heavily regulated insurance and building sectors, and daily activities demand a constant flow of secure, confidential, copyrighted and personal documents and communications. These include builders’ drawings, architectural designs, legal files and more, sent between internal departments and on to external stakeholders such as solicitors, lawyers, builders, architects and homeowners.

No More File Sharing Via USB drives, Email Attachments, or Unsecured Apps
By using Ipswitch File Transfer’s MOVEit system as a compliance solution, NHBC now meets strict ISO 27000 internal security standards and exceeds compliance and regulation requirements such as those set by the FCA and the Data Protection Act (DPA). Previously, NHBC employees had to encrypt and share files via SD cards, USB drives, CD-Rs, email attachments and an assortment of unsecured web-based file sharing apps. But a tremendous shift in attitudes in recent years has led to more organizations like NHBC integrating MFT platforms, making unsecured email attachments and portable media things of the past.

Wayne Watson, information security manager for NHBC, said: “Ipswitch FT’s secure MOVEit solution gives us full visibility and management of file transfers, and enables us to avoid fines of up to £250,000 for non-compliance, as well as maintaining our company’s 75-year trusted reputation.”

Let’s start to examine the impact of end-to-end visibility and ways it can be put to work for your organization.  For starters, let’s dig into correlation.

Correlation involves identifying related actions and events as a file moves through a series of business processes (including what happens after a file is moved, renamed, or deleted), and using that information to make business decisions.  Correlation can also associate file transfer metadata with downstream processes such as whether a product was shipped or an invoice was paid after an order was received from a customer.
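As a toy illustration of that idea, correlation amounts to joining file-transfer events with downstream business records on a shared key. The field names below are invented for the example, not drawn from any Ipswitch product:

```python
def correlate(transfer_events, shipments):
    """Join transfer events to downstream shipment records by order ID,
    flagging orders whose file arrived but which were never shipped."""
    shipped = {s["order_id"] for s in shipments}
    return [
        {**e, "shipped": e["order_id"] in shipped}
        for e in transfer_events
        if e["event"] == "file_received"
    ]

events = [
    {"order_id": "PO-1001", "event": "file_received"},
    {"order_id": "PO-1002", "event": "file_received"},
    {"order_id": "PO-1002", "event": "file_deleted"},
]
shipments = [{"order_id": "PO-1001", "status": "shipped"}]

for row in correlate(events, shipments):
    print(row["order_id"], "shipped" if row["shipped"] else "NOT shipped")
```

An order like PO-1002, whose file arrived but was never shipped, is exactly the kind of broken downstream process that correlation surfaces before the customer calls.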

Ipswitch’s Frank Kenney shares some thoughts in the video below on why correlation is an especially important part of visibility and how it enables you to really understand not only file transfers, but also the applications, processes, purchase orders and other items in your infrastructure that tie back to customers, SLAs and revenue.

[youtube]http://www.youtube.com/watch?v=ZOSoT95oFUg[/youtube]
Correlation enables users to easily view all the events related to the transfer and consumption of a single file or set of files, including subsequent applications and resulting business processes.  For example, they can track a file through a complete workflow and throughout its entire lifecycle, even if it was shared with a customer or business partner  – critical insight that can impact the quality and timeliness of work, service level agreements, not to mention revenue and profitability.

Information flows into, within and out of organizations faster and in greater volumes than ever before.  Complicating matters is the growing number of vendor systems, applications and platforms that make up your company’s business infrastructure and touch even your most sensitive and mission-critical information.

If you don’t have visibility into the data and files that are flowing between systems, applications and people — both inside and beyond the company firewall — things can go haywire very quickly.

  • Lost files, security breaches and compliance violations
  • Broken SLAs and other processes that are dependent on files
  • No file lifecycle tracking as data flows between applications, systems and people
  • Damaged partner and customer relationships
  • Lost opportunities

Relying on the reporting capabilities of each individual system has proven to be risky and inefficient.  Chances are, you’re swimming in a sea of not-very-useful-or-actionable data and static reports that are already a week behind with what’s actually happening in your company this very instant.

In today’s blog video, Frank Kenney shares his thoughts on why having one consolidated view is critical and why organizations are having such a hard time achieving visibility.

[youtube]http://www.youtube.com/watch?v=ow3l1AetI_Q[/youtube]

When it comes to your file transfers, many questions exist. Do you have the total visibility your business requires? How do your customers gain visibility into their file transfers? Do you have all the information you need to meet your service level agreements (SLAs) and enable transparency about integration and file transfers? Let Ipswitch help you answer these questions and overcome your visibility challenges.

You’re going to be hearing more and more about “VISIBILITY” from Ipswitch, so I’d like to quickly start this blog post with our definition of visibility in the context of files and data flowing into, within and out of your company:

Visibility:  “Unobstructed vision into all data interactions, including files, events, people, policies and processes”

Fast, easy access to critical file and data transfer information is a must-have – it’s critical to the success of your business.  Whether it’s tracking and reporting on SLAs, analyzing file transfer metrics to identify bottlenecks and improve efficiency, or providing customers and partners with easy self-service access to the file transfer information they require – as well as countless other business objectives – unobstructed visibility is imperative.

Having one consolidated view into all of the systems and processes involved in your organization’s file and data transfers will deliver tremendous business value and a competitive edge. Please take a couple of minutes to watch Ipswitch’s Frank Kenney share his perspective on why visibility is important.

[youtube]http://www.youtube.com/watch?v=qsxzweLBRGA&feature=channel_video_title[/youtube]

“My company still relies heavily on FTP.  I know we should be using something more secure, but I don’t know where to begin.”

Sound familiar?

The easy answer is that you should migrate away from antiquated FTP software because it could be putting your company’s data at risk; unsecured data is an enormous liability. Not only does FTP pose a real security threat, but it also lacks many of the management and enforcement capabilities that modern Managed File Transfer solutions offer.

No, it won’t be as daunting a task as you think. Here are a few steps to help you get started:

  • Identify the various tools that are being used to transfer information in, out, and around your organization.  This would include not only all the one-off FTP instances, but also email attachments, file sharing websites, smartphones, EDI, etc.  Chances are, you’ll be surprised to learn some of the methods employees are using to share and move files and data.
  • Map out existing processes for file and data interactions.  Include person-to-person, person-to-server, business-to-business and system-to-system scenarios.  Make sure you really understand the business processes that consume and rely on data.
  • Take inventory of the places where files live.  Servers, employee computers, network directories, SharePoint, ordering systems, CRM software, etc.  After all, it’s harder to protect information that you don’t even know exists.
  • Think about how much your company depends on the secure and reliable transfer of files and data.  What would the effects be of a data breach?  How much does revenue or profitability depend on the underlying business process and the data that feeds them?
  • Determine who has access to sensitive company information.  Then think about who really needs access (and who doesn’t) to the various types of information.  If you’re not already controlling access to company information, it should be part of your near-term plan.   Not everybody in your company should have access to everything.
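The “take inventory of the places where files live” step can start with a simple scan. Here is a minimal sketch that tallies file counts and total bytes under each top-level directory of a share; the function name is hypothetical, and a real inventory would also cover endpoints and application stores, not just one filesystem root:

```python
import os

def inventory(root):
    """Tally (file count, total bytes) under each immediate subdirectory
    of `root`, so you can see where data actually accumulates."""
    totals = {}
    for dirpath, _dirs, files in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        top = rel.split(os.sep)[0] if rel != "." else "."
        count, size = totals.get(top, (0, 0))
        for name in files:
            try:
                size += os.path.getsize(os.path.join(dirpath, name))
                count += 1
            except OSError:
                pass  # file vanished or unreadable; skip it
        totals[top] = (count, size)
    return totals
```

Pointing this at a hypothetical path such as `inventory("/mnt/shares")` gives a first, rough map of where files live — a starting point for the access-control questions in the last bullet.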

Modern managed file transfer solutions deliver not only the security you know your business requires, but also the ability to better govern and control your data, as well as visibility and auditing capabilities into all of your organization’s data interactions, including files, events, people, policies and processes.

So what are you waiting for?


As George Hulme recently wrote, the vision of Senator Richard Blumenthal’s data breach legislation is simple enough:  Protect individuals’ personally identifiable information from data theft, and penalize firms that don’t adequately secure their customers’ information.

Clearly, there’s a need for organizations to better secure confidential and private customer information.  It seems that a week rarely passes without a new high-profile data breach in the news.  In fact, 2011 is trending to be the worst-ever year for data breaches.  And that is despite many U.S. states introducing legislation that expands the scope of state laws, sets stricter requirements related to notification of data breaches involving personal information, and increases penalties for those responsible for breaches.

The need to protect customer data is unanimously shared by honest people worldwide. The issue is how to effectively govern and enforce the various data protection requirements and laws.

I agree with Senator Blumenthal’s concept of establishing “appropriate minimum security plans.” But color me skeptical of the government’s ability to appropriately monitor and enforce those plans, especially after witnessing its mighty struggles to effectively govern the dozens of state laws already on the books.

My skepticism is shared by many, including Mark Rasch, director of cybersecurity and privacy consulting at Computer Sciences Corporation:  “The devil is in the details with these laws.  We’ve had regulations, from Gramm-Leach-Bliley to HIPAA, that purport to help protect consumer data.  Companies are already victims in these attacks, so why are we penalizing them after a breach?  I think that’s because it’s easier to issue fines than it is to track down the criminals and go after them.”

In my opinion, business leaders need to prioritize their own internal efforts to properly protect sensitive information rather than wait on the government to catch up.  First order of business is to identify where confidential files and data live in your organization and ensure visibility of that info (after all, how can you protect what you don’t know about?).  Fortunately, there are technology solutions available to help organizations better manage and govern their critical files and data as they are being moved and consumed both internally and with business partners and across people, systems and various business applications.

You might say that the entire point of a Managed File Transfer (MFT) system is to do exactly that: provide centralized management and control. For example, let’s say that your company is subject to the Payment Card Industry Data Security Standard (PCI DSS). Requirement 4 of PCI DSS is to “encrypt transmission of cardholder data and sensitive information across public networks,” such as the Internet. Let’s also say that you frequently need to transmit cardholder data to partner companies, such as vendors who will be fulfilling requests.

One option is to simply allow someone within your company to email that information, or to have an automated process do so. You’ll need to ensure that everyone remembers to encrypt those emails — you did remember to get digital certificates for everyone, correct? — every single time. If someone forgets, you’ve created the potential for a data breach, and it’s not going to look very good for your company on the evening news.

Another option is to automate the file transfer using an MFT solution. That solution can be centrally configured to always apply PGP‐based encryption to the file, to always require an FTP‐over‐SSL connection with the vendors’ FTP servers, and to always require 256‐bit AES encryption. You don’t have to remember those details beyond the initial configuration — it’s centrally configured. Even if your users need to manually transfer something ad‐hoc — perhaps an additional emergency order during the Christmas rush — your MFT solution will “know the rules” and act accordingly. Your users’ lives become easier, your data stays protected, and everyone sleeps more soundly at night. This central control is often referred to as policy-based configuration because it’s typically configured in one spot and enforced — not just applied — to your entire MFT infrastructure, regardless of how many physical servers and clients you are running.

What’s the difference between enforced and applied? Making a configuration change is applying it. That doesn’t, of course, stop someone else from coming along behind you and applying a new configuration. The idea with policies is that they’re configured on their own and protected by a unique set of permissions that govern who can modify them — they’re not just wide‐open to the day‐to‐day administrators who maintain your servers. In many cases, a review/approve workflow may have to be followed to make a change to a policy. Once set, the policies are continually applied to manageable elements such as MFT client software and MFT servers. A server administrator can’t simply re-configure a server, because the policy prevents it. The MFT solution ensures that your entire MFT infrastructure stays properly configured all the time.
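A rough sketch of the “enforced, not just applied” idea: the central policy validates every transfer job’s configuration before a connection is ever attempted, so a locally misconfigured job fails fast instead of leaking data. All names and fields here are hypothetical, not an actual MFT product’s API:

```python
class TransferPolicy:
    """Central policy: PGP on the payload, FTPS transport, AES-256 cipher."""
    REQUIRED = {"pgp_encrypted": True, "protocol": "FTPS", "cipher": "AES-256"}

    def check(self, job):
        """Return a list of violations; an empty list means the job complies."""
        return [
            f"{key}: expected {want!r}, got {job.get(key)!r}"
            for key, want in self.REQUIRED.items()
            if job.get(key) != want
        ]

policy = TransferPolicy()
compliant = {"pgp_encrypted": True, "protocol": "FTPS", "cipher": "AES-256"}
rogue = {"pgp_encrypted": False, "protocol": "FTP", "cipher": "AES-256"}
assert policy.check(compliant) == []
assert len(policy.check(rogue)) == 2  # plaintext payload and plaintext transport
```

The key design point is that `check` runs on the server side, under the policy’s own permissions — an administrator reconfiguring one job can’t route around it.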

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!

Possibly not. The Internet’s venerable File Transfer Protocol (FTP) is usually supported by Managed File Transfer (MFT) systems, which can typically use FTP as one of the ways in which data is physically moved from place to place. However, MFT essentially wraps a significant management and automation layer around FTP. Consider some of the things an MFT solution might provide above and beyond FTP itself—even if FTP was, in fact, being used for the actual transfer of data:

  • Most MFT solutions will offer a secure, encrypted variant of FTP as well as numerous other more‐secure file transfer options. Remember that FTP by itself doesn’t offer any form of transport level encryption (although you could obviously encrypt the file data itself before sending, and decrypt it upon receipt; doing so involves logistical complications like sharing passwords or certificates).
  • MFT solutions often provide guaranteed delivery, meaning they use file transfer protocols that give the sender a confirmation that the file was, in fact, correctly received by the recipient. This can be important in a number of business situations.
  • MFT solutions can provide automation for transfers, automatically transferring files that are placed into a given folder, transferring files at a certain time of day, and so forth.
  • MFT servers can also provide set‐up and clean‐up automation. For example, successfully‐transferred files might be securely wiped from the MFT server’s storage to help prevent unauthorized disclosure or additional transfers.
  • MFT servers may provide application programming interfaces (APIs) that make file transfer easier to integrate into your internal line‐of‐business applications.
  • MFT solutions commonly provide detailed audit logs of transfer activity, which can be useful for troubleshooting, security, compliance, and many other business purposes.
  • Enterprise‐class MFT solutions may provide options for automated failover and high availability, helping to ensure that your critical file transfers take place even in the event of certain kinds of software or hardware failures.
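The watch-folder automation described above — transfer whatever lands in a given folder — reduces to a polling loop. A sketch only, with invented names; a real MFT server adds retries, file locking, scheduling, and audit logging around this core:

```python
import os
import shutil

def poll_once(drop_dir, outbox, transfer=shutil.move):
    """One polling pass: hand each file found in drop_dir to `transfer`
    (here a simple move into `outbox` stands in for an actual upload)."""
    handled = []
    for name in sorted(os.listdir(drop_dir)):
        src = os.path.join(drop_dir, name)
        if os.path.isfile(src):
            transfer(src, os.path.join(outbox, name))
            handled.append(name)
    return handled
```

Calling `poll_once` on a timer gives the “transfer files placed into a given folder” behavior; swapping the `transfer` callable for an SFTP upload, and adding a secure wipe of `outbox`, gives the set-up/clean-up automation from the fourth bullet.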

In short, FTP isn’t a bad file transfer protocol—although it doesn’t offer encryption. MFT isn’t a file transfer protocol at all; it’s a set of management services that wrap around file transfer protocols—like FTP, although that’s not the only choice—to provide better security, manageability, accountability, and automation.

In today’s business, FTP is rarely “enough.” Aside from its general lack of security—which can be partially addressed by using protocols such as SFTP or FTPS instead—FTP simply lacks manageability, integration, and accountability. Many businesses feel that they simply need to “get a file from one place to another,” but in reality they also need to:

  • Make sure the file isn’t disclosed to anyone else
  • Ensure, in a provable way, that the file got to its destination
  • Get the file from, or deliver a file to, other business systems (integration)

In some cases, the business might even need to translate or transform a file before sending it or after receiving it. For example, a file received in XML format may need to be translated to several CSV files before being fed to other business systems or databases—and an MFT solution can provide the functionality needed to make that happen.
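In outline, that XML-to-CSV translation might look like the sketch below. The element names (`orders`, `order`, `sku`, `qty`) are invented for illustration, and a production transform would typically split the output into several CSV files as the text describes:

```python
import csv
import io
import xml.etree.ElementTree as ET

def orders_xml_to_csv(xml_text):
    """Flatten <order> elements from a hypothetical XML feed into CSV rows."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["id", "sku", "qty"])
    for order in root.iter("order"):
        writer.writerow([order.get("id"),
                         order.findtext("sku"),
                         order.findtext("qty")])
    return out.getvalue()

sample = "<orders><order id='1'><sku>AB-9</sku><qty>3</qty></order></orders>"
print(orders_xml_to_csv(sample))
```

In an MFT workflow this transform would run as a post-receive step, with the resulting CSV handed off to the downstream database load.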

Many organizations tend to look at MFT first for its security capabilities, which often revolve around a few basic themes:

  • Protecting data in‐transit (encryption)
  • Ensuring that only authorized individuals can access the MFT system (authorization and authentication)
  • Tracking transfer activity (auditing)
  • Reducing the spread of data (securely wiping temporary files after transfers are complete, and controlling the number of times a file can be transferred)
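The “securely wiping temporary files” item in the last bullet is, at its simplest, overwrite-then-delete. A sketch of a single random-overwrite pass; real products may do multiple passes and must contend with filesystem caveats (journaling, copy-on-write, SSD wear-leveling) that this ignores:

```python
import os

def secure_wipe(path):
    """Overwrite a file with random bytes, flush to disk, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))
        f.flush()
        os.fsync(f.fileno())
    os.remove(path)
```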

These are all things that a simple FTP server can’t provide. Having satisfied their security requirements, organizations then begin to take advantage of the manageability capabilities of MFT systems, including centralized control, tracking, automation, and so forth—again, features that an FTP server alone simply can’t give you.

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!

Definitely not. To begin with, there are numerous kinds of encryption—some of which can actually be broken quite easily. One of the earlier common forms of encryption (around 1996) relied on encryption keys that were 40 bits in length; surprisingly, many technologies and products continue to use this older, weaker form of encryption. Although there are nearly a trillion possible encryption keys using this form of encryption, relatively little computing power is needed to break the encryption—a modern home computer can do so in just a few days, and a powerful supercomputer can do so in a few minutes.
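The arithmetic behind those claims is easy to check: a 40-bit key gives 2^40 (about 1.1 trillion) possibilities, and dividing by an attack rate gives the worst-case search time. The rates below are rough assumptions for illustration, not benchmarks:

```python
keyspace = 2 ** 40                    # ~1.1 trillion possible 40-bit keys
home_pc_rate = 5_000_000              # assumed keys/second for a home computer
supercomputer_rate = 10_000_000_000   # assumed keys/second for a supercomputer

days = keyspace / home_pc_rate / 86_400
minutes = keyspace / supercomputer_rate / 60
print(f"home PC: {days:.1f} days, supercomputer: {minutes:.1f} minutes")
```

Under these assumptions the full search takes roughly two and a half days on the home machine and under two minutes on the supercomputer, consistent with the figures above.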

So all encryption is definitely not the same. That said, the field of cryptography has become incredibly complex and technical in the past few years, and it has become very difficult for business people and even information technology professionals to fully understand the various differences. There are different encryption algorithms—DES, AES, and so forth—as well as encryption keys of differing lengths. Rather than try to become a cryptographic expert, your business would do well to look at higher‐level performance standards.

One such standard comes under the US Federal Information Processing Standards. FIPS specifications are managed by the National Institute of Standards and Technology (NIST); FIPS 140‐2 is the standard that specifically applies to data encryption, and it is managed by NIST’s Computer Security Division. In fact, FIPS 140‐2 is accepted by both the US and Canadian governments, and is used by almost all US government agencies, including the National Security Agency (NSA), and by many foreign ones. Although not mandated for private commercial use, the general feeling in the industry is that “if it’s good enough for the paranoid folks at the NSA, it’s good enough for us too.”

FIPS 140‐2 specifies the encryption algorithms and key strengths that a cryptography package must support in order to become certified. The standard also specifies testing criteria, and FIPS 140‐2 certified products are those that have passed the specified tests. Vendors of cryptography products can submit their products to the FIPS Cryptographic Module Validation Program (CMVP), which validates that the product meets the FIPS specification. Testing under the program is performed by accredited independent labs, which not only examine the source code of the product but also its design documents and related materials — before subjecting the product to a battery of confirmation tests.

In fact, there’s another facet—in addition to encryption algorithm and key strength—that further demonstrates how all encryption isn’t the same: back doors. Encryption is implemented by computer programs, and those programs are written by human beings— who sometimes can’t resist including an “Easter egg,” back door, or other surprise in the code. These additions can weaken the strength of security‐related code by making it easier to recover encryption keys, crack encryption, and so forth. Part of the CMVP process is an examination of the program source code to ensure that no such back doors exist in the code—further validating the strength and security of the encryption technology.

So the practical upshot is this: All encryption is not the same, and rather than become an expert on encryption, you should simply look for products that have earned FIPS 140‐2 certification. Doing so ensures that you’re getting the “best of breed” for modern cryptography practices, and that you’re avoiding back doors, Easter eggs, and other unwanted inclusions in the code.

You can go a bit further. Cryptographic modules are certified under FIPS 140‐2, but the underlying algorithms are covered by their own standards: FIPS 197 (the Advanced Encryption Standard) and FIPS 180 (the Secure Hash Standard, covering SHA‐1 and related algorithms), with HMAC specified in FIPS 198. By selecting a product that utilizes certified cryptography, you’re assured of getting the most powerful, most secure encryption currently available.

– From The Tips and Tricks Guide to Managed File Transfer by Don Jones

To read more, check out the full eBook or stay tuned for more file transfer tips and tricks!