Ipswitch surveyed IT professionals across the globe and it turns out that data security and compliance are top challenges for IT teams in 2016.

How We Did It

Ipswitch polled 555 IT team members who work at companies across the globe with more than 500 employees. We surveyed IT pros globally, partnering with Vanson Bourne in Europe, between October and November 2015 to learn about their file transfer habits and goals.

Demographics

255 in the US and 300 in Europe (100 each in the UK, France and Germany)

Totals by industry:

  • Banking/finance 15%
  • Government 15%
  • Healthcare 16%
  • Manufacturing 10%
  • Insurance 6%
  • Retail 6%
  • Other (includes Technology, Consulting, Utilities/Energy, Construction, & others) 32%

2016 State of Data Security and Compliance Infographic


Ipswitch’s FTPS server gave the Broncos the defense they needed for protecting data in motion.

Data Security a Huge Issue for NFL Teams

After a season of highs and lows, the Denver Broncos are headed to Super Bowl 50 to face the Carolina Panthers. But teamwork, dedication and hard work aren’t the only things that contributed to the Broncos’ surge to the NFL’s championship game.

The amount of data generated by an NFL team is staggering. Besides statistics, plays, strategies and a crunch of information that would make some quarterbacks’ heads hurt, the business of running a professional sports team requires videos, photos and graphics to be distributed to special events, marketing and fans relations partners.

Because of email and private network restrictions, all of this data used to be downloaded to discs, thumb drives or hard drives. They would then be delivered by hand to players, coaches and other important members of the Broncos team.

WS_FTP is Broncos’ Choice for an FTPS Server

The franchise’s use of Ipswitch WS_FTP Server, an FTPS (file transfer protocol secure) server, gave it a great defense for protecting data in motion. That data includes plays, high-definition videos, graphics and more, delivered to players, coaches and business partners. You could argue file transfer capabilities didn’t directly get the Broncos to the biggest game in football, but they certainly didn’t hurt.

The old manual process was time-consuming, inefficient and a huge data security risk. Ipswitch’s WS_FTP Server came to the rescue the same way Brock Osweiler saved the day – or at least didn’t blow it – this past season when quarterback Peyton Manning missed some of the action with an injured foot.

Unlike Osweiler, who subbed for Manning only temporarily, WS_FTP Server was a permanent solution to the Broncos’ file transfer woes. WS_FTP Server is secure enough to keep confidential team information out of the wrong hands – some would unfairly imply out of the New England Patriots’ hands. It’s also powerful enough to handle the influx and growth of data, and it gives the team the visibility and control needed to perform at its best.

Another key quality of WS_FTP Server is uninterrupted service: a failover configuration increases uptime, availability and performance consistency. Unlike the Microsoft Surface tablets that failed the Patriots during the recent AFC Championship, WS_FTP Server won’t go down or leave the Broncos’ files in limbo, unprotected and undelivered.

NFL Becoming a Technology-Driven Business

The NFL’s need for quality IT service goes beyond devices displaying plays and diagrams. File transfer played a role in the way football went from throwing a pigskin down a grassy field to being a technology-driven business.

With just a username and password, partners can complete file transfers in a few clicks. So before the Broncos head to Santa Clara for the big game, the team can rest easy knowing its files are secure and accessible by all the players, coaches, team executives and business professionals keeping the team running smoothly.

Read the Ipswitch File Transfer Case Study: Denver Broncos

We’ll find out Sunday if the Broncos and Manning can beat the tough Panthers, if the commercials will make us laugh and if Beyoncé and Coldplay will dazzle with their halftime show. But one thing the Broncos and all Ipswitch customers will always be assured of is the success, security and compliance of the WS_FTP Server file transfer solution.

 

Did you know your mobile phone and wearables are just as appealing to hackers as your online bank account? No one is impervious to increasingly sophisticated mobile device hacking. Case in point, James Clapper, the U.S. director of national intelligence (DNI), had his phone hacked last month with calls rerouted to the Free Palestine Movement. And in October 2015, CIA director John Brennan’s mobile device fell victim to the activity of a group of “pot-smoking teenagers.” Bottom line? Not even next-gen hardware is completely safe.

So long as support enforces two-factor authentication and staff doesn’t access free Wi-Fi hotspots (especially when handling business data), a mobile phone should be safe, right? Nope. As noted by Dialed In and Wired, determined hackers do a lot more with your mobile and wearable technology than you may realize.

Mobile Phones: Hackers’ Best Friend

Any iPhone newer than the 4 comes with a high-quality accelerometer, or “tilt sensor.” If hackers access this sensor and you leave the phone on your desk, it is possible for them to both detect and analyze the vibration of your computer keyboard and determine what you’re typing, with 80 percent accuracy. So, say you type in the address of your favorite banking web portal and then your login credentials; hackers now have total access.

App developers have wised up to hackers targeting microphones and made it much more difficult to gain access without getting caught. Enterprising criminals, however, have discovered a way to tap a phone’s gyroscope and detect sound waves through it while you play Angry Birds or any other orientation-based program. So, next time you talk about finances with your significant other while three-starring a new level in your go-to mobile game, you may also be giving hackers the information they need to steal from you.

Targeting RFID Chips

In an effort to make retail purchases easier and more secure, many credit cards come equipped with RFID chips. Smartphones, meanwhile, include near-field communication (NFC) technology that allows them to transmit and receive that RFID data. The risk here is that hackers who manage to compromise your phone can leverage malware to read the information from a card’s RFID chip if you’re storing it in a nearby wallet or card-carrying mobile case. Then they can make a physical copy. You’re defrauded and don’t even know it.

“Say Cheese”

Mobile cameras have also come under scrutiny, since hacking this feature lets attackers take snaps of you or your family whenever and wherever they want. Despite improvements in basic phone security, though, it’s still possible for malicious users to take control of your camera. It goes like this: Operating systems like Android now mandate that a preview of any new photograph must be displayed on-screen, but don’t specify the size of this image. As a result, cybercriminals can take surreptitious photographs and then send them to anyone at any location.

MDM Leads to Risk

A large number of smartphones contain weak mobile device management (MDM) tools installed by carriers. And although reaching these tools in a target phone requires close proximity and the use of rogue base stations or femtocells, the risk is substantial. Attackers could take total control of your device.

Fit or Foul?

Mobile phones aren’t the only technology at risk; wearables are also open to attack. What can hackers do to these devices? Back in March 2015, wearable maker Fitbit was notified by researchers that its devices could be hacked in fewer than 10 seconds. While initial reports focused on logical changes such as altering steps taken or distance walked, as noted by The Hacker News, it wasn’t long before hackers discovered a way to inject malware that could potentially spread to all synced devices.

Potentially Lethal Consequences

Security flaws in wireless-enabled pacemakers could allow hackers to take control of (and then stop) this critical device as well. In September 2015, a team from the University of Southern Alabama managed to access a functioning pacemaker and “kill” a medical mannequin attached to the device.

Medical devices such as insulin pumps and implantable defibrillators have notoriously weak security — a lack of encryption and weak or default passwords, in particular — making them easy for cybercriminals to take control of. The result? A fatal drug overdose or a shock delivered to perfectly healthy patients without warning.

Be Diligent About Mobile Security

The lion’s share of existing security issues stems from poor app development for mobile and wearable devices. Mobile device developers prioritize speed over security and eschew critical features such as encrypted commands, limited application sessions and disabling repeat requests. And while recognizing these flaws is the first step to improving mobile safety, users need to be aware of today’s risk factors. Right now, hackers can do far more with a mobile or wearable than the user may realize.

What can you do to make the IT job interview go well?

You’ve landed an IT job interview. That’s the good news. Now you have the interview itself, and let’s be honest, it’s never fun. Most candidates don’t enjoy putting on a show about the software and protocols they’re familiar with. Even actors aren’t in love with auditioning. The “social” aspect of recruitment isn’t something you should need to ace for an admin position, but it has to be done.

If the job is a really good one — the technical work that’ll challenge your current support acumen (and compensate you well for the weekend maintenance) — you probably have a bit of an imposter complex even just applying. When the “ideal candidate” is an infosec wizard, how dare you present yourself? But hey, you believe you can do it, and the pay is great. So read that magazine and wait to be met.

Find Strengths in Technical Weaknesses

What can you do to make the IT job interview go well? Some things should be no-brainers, but there’s a reason think pieces keep pounding them into your head (present article excluded). Don’t be “creepy” with company research, advises InformationWeek, and don’t dress for the beach unless an offbeat SMB suggests otherwise. Do pay attention to the job description, though (don’t ask questions it already answered), and learn enough about the employer to imply a healthy interest.

Ultimately, play to your strengths. Lawyers have a saying: If the facts are against you, argue the law; if the law is against you, argue the facts. If you don’t have hands-on experience in data center migration, stress your credentials in bandwidth control during this process. Show that you know what’s involved in secure file transfers even if you haven’t managed them offsite. If your formal credentials are thin, play up your experience in the network trenches during the Super Bowl traffic spike.

Be Mindful of the Interviewers Who Don’t Work in IT

With luck, your interview with an IT rep will find some common ground. There may be scripts you’re both comfortable reading or security issues you should both be following. This gives you the chance to talk like a human as well as discuss what the job will involve. One of the bigger challenges of an IT job interview, however, is that you may also meet someone from the business side. This person knows only vaguely what network monitoring tools are and is probably a bit intimidated by the idea of bandwidth or network latency. In other words, they probably feel like the imposter, interviewing someone for a seat in ops they don’t fully understand.

But one thing you definitely don’t want to do is remind the interviewer of their own uncertainties. Talk confidently about the work, without going so deep into the technical weeds that the interviewer isn’t sure what you’re saying. Although this shorthand may demonstrate fluency in a multi-vendor environment, it can also suggest you can’t communicate well with the other departments.

You’re a Social Animal

For better or worse, a job interview is a social interaction. Some sysadmins and IT pros would gladly trade the spotlight for wrestling with a wonky script or normalizing office bandwidth.

Nonetheless, this can produce a disconnect. As one IT candidate reported by Dice.com said when asked to describe the ideal work environment, “I just want a job where I can go in a room, do my work and be left alone.”

That candidate probably speaks for many admins, developers and other overworked helpdesk staff, but he didn’t get the job. Business people (including those who work for nonprofits and government) tend to celebrate charisma, and for good reason: The job is all about meeting client needs, which means talking to the customer to understand what they really want.

The good news? Your competition is other techies, probably just as geeky at heart.

The bottom line is that if you’re comfortable with your qualifications for the job — even if it is pushing your limits — that confidence will show through and help you navigate the rocky spots. And who knows, you may be just who they’re looking for.

Football is no longer simply a game played on grass or turf — it's now awash in tech.

Things on the gridiron have changed. Once the province of paper-based play analysis, complicated hand signals and rules reliant on the eyes and ears of human refs, football is now awash in tech. Just take a look at the broken Surface tablets from last week’s AFC Championship. With the Panthers and Broncos squaring up for Super Bowl 50 next week, here’s a look at the NFL technology, and the IT teams behind it, that helps elevate the sport while keeping its time-honored traditions intact.

It starts at Art McNally GameDay Central, located at NFL Headquarters in New York City. From here, Game Operations staff are tasked with prepping every communication and broadcast system before gametime while checking for radio frequency conflicts and handling failures prior to air. From a corporate standpoint, the GameDay crew is analogous to CIOs and their admin staff; they get the “big picture,” ensuring sysadmins on the ground have the information necessary to get their jobs done.

Clean Frequencies

Key to Game Ops is keeping radio frequencies clean. As the number of licensed frequency allocations approved by the Federal Communications Commission (FCC) continues to grow, fewer clear channels exist for team officials and their support staff to use. With this in mind, operations must make sure both teams, their coaches and all TV network crews use the right portion of the spectrum for headsets, microphones and any Wi-Fi connections to prevent accidental “jamming,” which often leads to signal loss at a critical moment.

Operations staff are also responsible for ferreting out any “not-so-accidental” frequency interruptions; the New England Patriots’ “Headsetgate” comes to mind, especially since the team regularly shows potential as a Super Bowl contender. Did they really tamper with headsets? Maybe, maybe not — there have been a number of accusations over the past few years — but what matters for Super Bowl 50 is that Game Ops staff are up to the challenge of tracking down any technical issues regardless of origin or intent.

‘Instant’ Replay

Game Ops staff are also responsible for overseeing the use of NFL Instant Replay technology, which got its start in 1986, was removed in 1992 and then reimplemented in 1999. GameDay teams use the league’s proprietary NFL Vision software to analyze replays and communicate with both the stadium’s replay official and the referee before he goes under the hood — both of which shorten the length of a replay review. Think of it like analytics; the NFL is investing in software that can capture relevant data, serve it up to experts and empower users in (or on) the field.

On the Ground

Crews in the stadium during Super Bowl 50 are responsible for managing a few new pieces of hardware, including the Microsoft Surface tablets used to analyze defensive and offensive formations. Because these tablets have no Internet access and their software cannot be altered, the league is currently testing a “video review” feature that may be implemented in future seasons.

Not everything works perfectly, though. As noted by Geek Wire, a problem during the December 8, 2015, matchup between Dallas and Washington forced these tablets out of service and left coaches with pen-and-paper diagrams. And on January 24, 2016, in the AFC Championship game, the Patriots suffered significant tablet malfunctions, causing more than a few frustrations on the sidelines, especially since the Denver Broncos weren’t required to give up their still-working tablets under the NFL’s “equity rule.” February’s onsite IT crew will need to monitor not only the performance of the Sideline Viewing System, but also its connection to each team’s tablets. System monitoring comes to mind here: Small precursor events in still-picture capture or tablet connections could act as warning signs for larger problems, if caught early enough.

Real-Time Stats

There’s also a need for data aggregation as the league moves toward full adoption of M2M systems like Zebra player tracking. Using RFID chips in each player’s shoulder pads, it is now possible to track their movements in-game in real time, then provide “next-generation stats” for fans. The larger value, however, comes in the form of actionable data obtained by recording and mining the information collected by these sensors. NFL technology professionals are tasked with not only ensuring this data stream is uninterrupted, but also making something useful out of the final product — a curated version of player data that trainers can use to improve Super Bowl performance.
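To illustrate the kind of curation involved, here is a minimal, hypothetical sketch in Python. The Sample structure, the yardage coordinates and the sample readings are invented for illustration and are not the league’s actual tracking format; the point is simply that raw sensor readings only become useful once they’re reduced to stats a trainer can act on.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Sample:
    """One hypothetical RFID reading: field position in yards, time in seconds."""
    x: float
    y: float
    t: float

def player_stats(samples):
    """Reduce raw tracking samples to a couple of 'next-gen'-style stats."""
    ordered = sorted(samples, key=lambda s: s.t)
    distance, top_speed = 0.0, 0.0
    for prev, cur in zip(ordered, ordered[1:]):
        step = hypot(cur.x - prev.x, cur.y - prev.y)   # yards moved between readings
        dt = cur.t - prev.t
        distance += step
        if dt > 0:
            top_speed = max(top_speed, step / dt)
    return {"distance_yards": round(distance, 1),
            "top_speed_yd_per_s": round(top_speed, 1)}

print(player_stats([Sample(0, 0, 0.0), Sample(5, 0, 0.5), Sample(12, 3, 1.0)]))
# {'distance_yards': 12.6, 'top_speed_yd_per_s': 15.2}
```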

Data Encryption

NFL teams need to transfer highly sensitive files containing details regarding trades, playbooks and player contracts. In the past, the Denver Broncos used thumb drives and CDs to physically pass around large data files, including those containing high-res video and image files. It was a manual and unstructured process that proved to be a time waster, lacking even basic security controls. Email was not an option because most IT teams limit the size of email attachments.

In order to secure their data in motion and move it without hassle, regardless of the size, the Broncos picked Ipswitch WS_FTP software for secure data transfer internally between departments, and externally with partners.

A New Career?

Interested in working support for the NFL? It’s possible: While the Cleveland Browns are hiring an intern, the Washington Redskins need help at the helpdesk and the Seattle Seahawks are looking for a CRM data analyst. Interestingly, the job descriptions read like standard tech sector advertisements; NFL clubs have become enterprises in and of themselves, requiring multiple teams of backend IT personnel in addition to those on the ground during regular and postseason play.

Even the NFL is not all glitz and glory for IT. In fact, the league’s mandate is similar to most tech firms: Keep systems up and running while collecting and curating actionable data. Ultimately it’s a team effort — the work of many, not the power of one, moves the chains and snatches victory from the jaws of overtime.


This Thursday, January 28th is Data Privacy Day (aka Data Protection Day in Europe). The purpose of Data Privacy Day is to raise awareness and promote privacy and data protection best practices. To honor Data Privacy Day, here are some ways you can protect personal healthcare information (PHI) in motion, an area of focus for healthcare IT teams.

Personal Healthcare Info is a Hacker’s Dream

PHI is considered the most sought-after data by cyber criminals in 2016. Hackers are moving away from other forms of cyber crime, such as targeting bank accounts, and focusing more on PHI due to the amount of data contained within it. Valuable data within PHI includes Social Security numbers, insurance policy info, credit card info and more.

The lack of a consistent approach to data security throughout the healthcare industry also makes healthcare data easier to obtain. The easier it is to steal, the more lucrative the data becomes to hackers. The healthcare industry has had less time than others to adapt to growing security vulnerabilities, and online criminals were quick to take notice.

GDPR and the End of Safe Harbor

It’s not news that governments around the globe are doing their part to promote data privacy, legislating the protection of personal data and reinforcing it with significant penalties for non-compliance. The recent agreement on the European General Data Protection Regulation (GDPR) is just the latest example.

What is changing, however, is the rapid growth in data integration across the open Internet between hospitals, service providers like payment processors, insurance companies, government agencies, cloud applications and health information exchanges.  The borderless enterprise is a fact of life.

Using Encryption to Meet Data Privacy Regulations

It’s well known that a security strategy focused on perimeter defense is not good enough. For one, healthcare data must move outside the trusted network. Encryption is the best means to limit access to protected data, since only those with the encryption key can read it. But there are other factors to look at when considering technology to protect data in motion, particularly when compliance with HIPAA or other governmental data privacy regulations is an issue.

Briefly, when evaluating ciphers for file encryption, such as AES as described in FIPS 197, it’s important to consider key size (e.g., 128, 192 or 256 bits), which affects security. It’s also worth considering products with FIPS 140-2 certified ciphers, accredited for use by the US government, as an added measure of confidence.
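To make the key-size point concrete, here is a minimal, hypothetical sketch of AES-256-GCM file encryption using Python’s third-party cryptography library (the function names and the .enc suffix are illustrative, and a production tool would stream large files rather than read them into memory):

```python
# pip install cryptography  (third-party library, assumed available)
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(path, key):
    """Encrypt a file with AES-GCM and write nonce + ciphertext next to it."""
    aesgcm = AESGCM(key)          # key of 16, 24 or 32 bytes = AES-128/192/256 (FIPS 197)
    nonce = os.urandom(12)        # must be unique per encryption; stored with the ciphertext
    with open(path, "rb") as f:
        ciphertext = aesgcm.encrypt(nonce, f.read(), None)
    with open(path + ".enc", "wb") as f:
        f.write(nonce + ciphertext)
    return path + ".enc"

def decrypt_file(path, key):
    """Read nonce + ciphertext back and return the decrypted bytes."""
    with open(path, "rb") as f:
        blob = f.read()
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)

key = AESGCM.generate_key(bit_length=256)   # the 256-bit option discussed above
```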

Here are several other things to consider to protect data in motion and ensure compliance:

  • End-to-end encryption: Encrypting files both in transit and at rest protects data from access on trusted servers by malware or malicious agents with access to the trusted network
  • Visibility for audit: Reports and dashboards to provide centralized access to all transfer activity across the organization can reduce audit time and improve compliance
  • Integration with organizational user directories: LDAP or SAML 2 integration to user directories or identity provider solutions not only improves access control and reduces administrative tasks, but can also provide single sign-on capability and multi-factor authentication
  • Integration with other IT controls: While data integration extends beyond perimeter defense systems, consider integrating with data scanning systems. Antivirus protects your network from malware in incoming files, and Data Loss Prevention (DLP) stops protected data from leaving.
  • End-point access to data integration services: More constituents than ever participate in data exchange. Each has unique needs and likely requires one or more of the following services:
    • Secure file transfer from any device or platform
    • Access status of data movement to manage Service Level Agreements (SLAs)
    • Schedule or monitor pre-defined automated transfer activities
  • Access control: With the growing number of participants, including those outside the company, it’s more important than ever to carefully manage access with role-based security, ensuring each participant has appropriate access to the required data and services.
  • File transfer automation: Automation can eliminate misdirected transfers by employees and external access to the trusted network. Using a file transfer automation tool can also significantly reduce IT administration time and the backlog of business integration process enhancement requests.

Become Privacy Safe Starting with This Webinar

Protecting PHI within the healthcare system doesn’t have to make it painful for hospital administrators or doctors to access PHI appropriately, but it does mean having the right technology and good training in place. And in honor of Data Privacy Day, don’t you want to tell your customers that their data is safe? You’ll be one step closer by signing up for tomorrow’s live webinar.

Learn how you can implement health data privacy controls to secure your healthcare data >> Register Here

For more on this topic, register to hear David Lacey, a former CISO, security expert and drafter of the original text behind ISO 27001, speak about implementing HIPAA and other healthcare security controls with a managed file transfer solution.

As confirmed by PricewaterhouseCoopers, attacks against small and midsized businesses (SMBs) increased by 64 percent between 2013 and 2014. Why? Low price, high reward.

Attackers can break through millions of poorly defended SMBs through automation, gaining access to a treasure trove of data. Small-business vulnerability assessments can identify your weaknesses, but they take time away from daily operations. Is a security vulnerability assessment really worth the resources? These five questions will help you decide.

What Does It Entail?

A vulnerability assessment identifies precious assets as well as how attackers could steal them from you. Not surprisingly, 2014’s most common attack vectors were:

  • Software exploit (53 percent).
  • User interaction, such as opening a malicious email attachment or clicking through an unsafe URL (44 percent).
  • Web application vulnerability, like SQL injection, XSS or remote file inclusion (33 percent).
  • Use of stolen credentials (33 percent).
  • DDoS (10 percent).

It’s impossible to patch every vulnerability. “You can scan and patch 24/7, 365 days a year,” says Forrester security researcher Kelley Mak, “and still not take out a significant chunk.” The key is to identify vulnerabilities that will result in the most damage to your bottom line.

How Frequently Should We Assess?

Frequency depends on what kind of data you store and what kind of business you operate. If you can say yes to the following, you should assess more often:

  • You’ve never run a security vulnerability assessment before, or it’s been a while. In either case, establish a baseline with frequent assessments for a year or so. Then dial back the frequency.
  • You’re subject to regulatory compliance. If you’re just checking boxes, you’re only getting a limited security picture. Compliance is a baseline, not an effective defensive posture.
  • You’re a contractor for a government agency or valuable enterprise target. Cybercriminals love to use SMB vendors to break into higher-value targets. If one of your employees’ stolen authentication creds cost an enterprise millions of dollars, you’d kiss your contract goodbye.

Can Ops Do It?

Give another sysadmin the SANS 20 recommended list of security controls. If they can understand the controls, evaluate the business against them and remediate all associated issues, let them handle it.

Already too busy to take on the project? Bring in a specialist. Keep expenses down by getting an initial third-party assessment, drafting an action plan and joining the entire ops team in implementing it.

What Does a Top-Notch Third-Party Assessment Look Like?

Before you hire someone, ask them to explain how they conduct a security vulnerability assessment. According to Robbie Higgins, CISO of AbbVie and author for SearchMidmarketSecurity, their services should include:

  • Information and infrastructure evaluation. The consultant should look at your information systems, stored data, hardware and software. Critical systems like billing, HR, CRM, legal and IP repositories are vital, but you should also focus on minor systems accessible by your own vendors.
  • Current threat landscape. In addition to knowing today’s common exploits and malware trends, your consultant should tell you what types of data attackers are after as of late and what kinds of organizations they’re currently targeting.
  • Awareness of internal soft spots. Attacks don’t always happen because employees are disgruntled. Simple incorrect data entry can expose you to an SQL injection.
  • Estimated impact. Your vendor should explain the degree to which each security vulnerability would affect data integrity, confidentiality and availability of your network resources.
  • Risk assessment. A good vendor combines weaknesses, threat landscape and potential impact to extrapolate your risks in priority order.
  • An action plan. Again, save on security consultation by letting your team execute this roadmap.

Is It Worth It?

Assessments and remediation could cost you in short-term payroll or a third-party consultant’s fee. But if they prevent a data breach that could shut down your business, almost any price is worthwhile.

It’s a fact of IT life that technology has a finite lifespan, and it’s tough to manage change in technology. Procuring new software and hardware is only half the battle. The other half falls under what happens next and runs the gamut from integration to accessibility to security. This part gets tricky.

Need help? Here are 7 of the most common challenges you’ll face when you manage change during a technology transition, and how to deal with them.

1) Cultural Pushback

IT pros think about the nuts and bolts of new technology implementation from beginning to end, including how to manage change. Front-line workers care how a new CRM or analytics tool is going to affect their daily job. IT teams need to communicate why a switchover is happening, the business benefits behind it, and what great things it means for the user. Your best bet is to get them prepared, over-communicate and stay on schedule. Make sure employees and executives alike have had every opportunity to learn what to expect when the transition goes live.

2) Handling Hype

When you manage change in technology you need to manage any hype attached to it. Look at artificial intelligence (AI) solutions. Given their cultural appeal, many users have extremely high expectations and are often disappointed with the end results. And with respect to the current direction of AI development, according to Hackaday, it’s unlikely that devices will ever live up to expectations. Instead, a “new definition of intelligence” may be required.

In another example, consider the benefits and drawbacks of implementing a new OS such as Windows 10. Some users may want to upgrade to a new OS right away, but we know that an OS switch requires a plethora of testing, such as application compatibility testing, and that some of the most important updates for a new OS take at least a few months to arrive.

So what does this mean for IT pros during a tech transition? It means being clear about exactly what new tech will (and won’t) deliver, and communicating this to everyone.

3) Failure Can Happen

Things don’t always go as planned. In some cases new technology can actually make things worse. A recent article from The Independent notes that particulate filters introduced to curb NO2 emissions from vehicles actually had the opposite effect. The same goes for IT. If you are working on a new implementation that is unproven or risky, start small and consider it an A/B test outside the DMZ instead of a big bomb you have to somehow justify blowing up.

4) Risky ROI

While companies love to talk about ROI and technology going hand-in-hand, software-driven revenue is “mostly fiction,” according to Information Week. Bottom line? The more a solution costs to build or buy, the more you’ll need to invest in organizational redesign and retraining. In other words, technology does not operate in a vacuum.

5) Prepare for People

What happens when technology doesn’t work as intended? Employees and executives will come looking for answers. The fastest way to lose their confidence is by clamming up and refusing to talk about what happened or what’s coming next. It may not be worth breaking down the granular backend for them. Being prepared with a high-level explanation and potential timeline for restoration goes a long way toward instilling patience.

6) Lost in Translation

It’s easy for even simple messages to get garbled on their way up the management chain. Before, during and after the implementation of new technology, clarity is your watchword. Short, basic responses in everyday language to tech-oriented questions have the lowest chance of changing form from one ear to the next. You also don’t need to share every detail. Just tell your users what they need to know. Providing too much information can be harmful and lead to confusion, even if they think they understand.

7) It’s Not Fair

Guess what? Even when things are beyond your control, you’re still shouldering the blame. And because new technology implementation never goes exactly as planned, it’s good to have a backup plan. Say you’re rolling out IPv6 support for your website but things aren’t going well; you need an IPv4 reserve in your back pocket to ensure file transfers and page-load times don’t increase your bounce rate or tick off internal staff.
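As a minimal sketch of what that reserve might look like in practice (the host name and port below are placeholders), a client can try the IPv6 path first and quietly fall back to IPv4 when the v6 route isn’t reachable:

```python
import socket

def connect_with_fallback(host, port, timeout=3.0):
    """Try IPv6 first, then fall back to IPv4 if no v6 route works."""
    last_error = OSError("no address families available")
    for family in (socket.AF_INET6, socket.AF_INET):
        try:
            infos = socket.getaddrinfo(host, port, family, socket.SOCK_STREAM)
        except socket.gaierror as exc:        # no records for this family
            last_error = exc
            continue
        for _family, _type, _proto, _name, addr in infos:
            try:
                return socket.create_connection(addr[:2], timeout=timeout)
            except OSError as exc:            # unreachable, refused, timed out...
                last_error = exc
    raise last_error

conn = connect_with_fallback("www.example.com", 443)
print("Connected via", conn.family)           # AF_INET6 if v6 worked, else AF_INET
conn.close()
```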

Unfortunately, “it’s not my fault” doesn’t apply in IT, however often you feel you could say it. On the hook for managing change in technology? Chances are you’ll face at least one of these seven challenges on the road to effective implementation.

Web security consists of multiple moving parts that can move in opposite directions. As a result, actions or technologies that improve one aspect of security may weaken another. Some enhancements might end up compromising your overall Web security.

An entanglement of just this sort builds even more complexity around the issue of government monitoring. Should there be limits on how much Web traffic merits encryption? Should law enforcement have “back door” access to encrypted activity? More to the point, what are the security implications of these policies or standards with respect to your department?

This concern isn’t about government traffic monitoring in general, however strong (and mixed) many people’s feelings may be about the government monitoring personal content. Your questions relating to encryption are narrower and less ideological, in a sense, because they carry profound implications for your company’s Web security.

A Double-Edged Sword

Online encryption wars are not new; as Cat Zakrzewski reports at TechCrunch, the debate goes back two decades. With so many growing more concerned about Web security, though, the issue has new urgency. In a nutshell: It is widely agreed in cybersecurity that encryption — particularly end-to-end encryption — is one of the most powerful tools in your infosec toolbox. For thieves, stolen data is a worthless jumble if they can’t read it. That’s the point of encryption.

End-to-end encryption provides a layer of protection to data over its full journey, from sender to recipient. Wherever thieves may intercept it along the way, all they can steal is gibberish. Law enforcement’s concern about this depth of encryption, however, is that anyone can use it — from terrorists to common criminals, both of whom have particularly strong reason to avoid being overheard. Moreover, new categories of malware, such as ransomware, work by encrypting the victim’s data so the blackmailer can demand payment before decrypting it to make it usable again.

For Whom the Key Works

This problem is difficult, but not unusual: If lockboxes are available, cybercriminals can use them to protect their own nefarious secrets. The legal response, then, is to require that all lawfully sold lockboxes come with a universal passkey available to the police, who can open them. There’s your back-door access.

But that’s where things get complicated. If a universal passkey for back-door access exists, it could potentially fall into the hands of unauthorized users — who can use it to read any encrypted message they intercept. Your personal mail, your bank’s account records, whatever they get access to.

(The NSA and its affiliates abroad can build their own encryption engines without this vulnerability, but such high-powered technology isn’t cheap — beyond the means of most criminals, terrorists and the like, of course.)

More Keys, More Endpoints

A special passkey available to law enforcement would presumably be very closely held, and not the sort of thing bad actors are likely to get their hands on by compromising an FBI clerk’s computer. But the primary concern in cybersecurity is that the software mods needed to provide a back door would make encryption less robust. This means encryption will be less effective for all uses, even the most legitimate ones.

In essence, a lock that two different keys can open is inherently easier for a burglar to pick. According to Reuters, White House cybersecurity coordinator Michael Daniel acknowledged he knew no one in the security community who agreed with him that a back door wouldn’t compromise encryption.

Crucially, this problem is independent of any concern about the governmental misuse of back-door decryption technology. Even if no government agency ever used the back door to decrypt a message, its existence makes it possible for a third party to reverse-engineer the key, or exploit a subtle bug in the backdoor functionality — thus enabling them to read the once-encrypted messages.

Encryption isn’t an absolute security protection; nothing is. But it is one of the most powerful security tools available, and your team is rightfully concerned about the risks of compromising it.

The International Organization for Standardization (ISO) is a non-governmental entity of 162 standardizing bodies from multiple industries. By creating sets of standards across different markets, it promotes quality, operational efficiency and customer satisfaction.

Businesses seek ISO certification to signal their commitment to excellence. As a midsized IT service team implementing ISO standards, you can reshape quality management, operations and even company culture.

Choosing the Right Certification

The first step is to decide which sets of standards apply to your area of specialization. Most sysadmins focus on three sets of standards: 20000, 22301 and 27001.

  • ISO 20000 helps organizations develop service-management standards. It standardizes how the helpdesk provides technical support to customers as well as how it assesses its service delivery.
  • ISO 22301 consists of business continuity standards designed to address how you’d handle significant external disruptions, like natural disasters or acts of terrorism. These standards are especially relevant for hospital databases, emergency services, transportation and financial institutions — anywhere big service interruptions could spell a catastrophe.
  • ISO 27001 standardizes infosec management within the organization both to reduce the likelihood of costly data breaches and to protect customers and intellectual property. In support of ISO 27001, ISO 27005 offers concrete guidelines for security risk management.

Decisions, Decisions

Deciding which ISO compliance challenge to tackle first depends on a few different things. If your helpdesk is already working within a framework like ITIL — with a customer-oriented, documented menu of services — ISO 20000 certification will be an easy win that can motivate the team to then tackle a bigger challenge, like security. If you’re particularly concerned about security and want to start there, try combining ISO 22301 and ISO 27001 under a risk-management umbrella. Set up a single risk assessment/risk treatment framework to address both standards at once.

Getting Started

ISO compliance is not about checking off boxes indicating you’ve reached a minimum standard. It’s about developing effective processes to improve performance. With ISO 22301 and 27001, you’ll document existing risks, evaluate them and decide whether to accept or reduce them. With ISO 20000, you’ll document current service offerings and helpdesk procedures like ticket management and identify ways to reduce time to resolution.

Prioritizing

ISO compliance looks a little different to every organization, and IT finds its own balance between risk prevention and acceptance. For instance, if a given risk is low and fixing it would be inexpensive, accept the risk, document it and don’t throw money at preventing it. Whichever standard you start with, though, keep a few principles in mind:

  • Focus on your most critical business processes. Identify what your organization can least afford to lose — financial transactions processing, for example. On subsequent assessments, you can dig deeper into less crucial operations.
  • Identify which vulnerabilities endanger those processes. Without an effective ticketing hierarchy at the helpdesk, a sysadmin could wind up troubleshooting an employee’s flickering monitor while an entire building loses network connectivity.
  • Avoid assessing every process or asset at first. Instead of looking at all in-house IP addresses for ISO 27001, focus on the equipment supporting your most important functions. Again, you can dig deeper after standardizing the way you manage information.
  • Don’t chase irrelevant items. Lars Neupart, founder and CEO of Neupart Information Security Management, finds that ISO 27005 threat catalogs look like someone copied them from a whiteboard without bothering to organize them. Therefore, don’t assume every listed item applies to every situation. As Neupart puts it: “Not everything burns.”
  • Put findings in terms that management can understand. When you’re asking management to pay for implementing new helpdesk services or security solutions, keep your business assessments non-technical. Put information in numerical terms, such as estimating the hourly cost of downtime or the percentage decline in quarterly revenue after a data breach (a quick back-of-the-envelope sketch follows this list).
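As a purely illustrative, back-of-the-envelope sketch (every figure below is hypothetical), the downtime number management sees might be built like this:

```python
# All figures are hypothetical, for illustration only
annual_revenue = 40_000_000          # $40M per year
business_hours_per_year = 2_000      # ~8 hours x 250 working days
revenue_per_hour = annual_revenue / business_hours_per_year   # $20,000/hour

outage_hours = 6
recovery_payroll_per_hour = 3_500    # staff idled or redirected to recovery

downtime_cost = outage_hours * (revenue_per_hour + recovery_payroll_per_hour)
print(f"Estimated cost of a {outage_hours}-hour outage: ${downtime_cost:,.0f}")
# Estimated cost of a 6-hour outage: $141,000
```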

So, How Much Is This Going to Cost?

Bonnie del Conte is president of CONNSTEP, Inc., a Connecticut-based company that assists companies in implementing ISO across a range of industries. She says the biggest expenses related to ISO certification are payroll, the creation of assessment documentation and systems (e.g., documentation for periodic assessments, including both paper and software) and new-employee training programs. Firms like hers stipulate consulting fees in addition to the actual certification audit. At the same time, hiring a consultant can reduce the time intervals for standards implementation and audit completion — and prevent mistakes.

Why It’s Worth It

The ultimate goal of ISO certification is to generate measurable value and improvement within IT. It’s about how proactive, progressive awareness and problem-solving prevents disasters, improves service and makes operations more efficient. Its greatest intangible benefit, says del Conte, is often a better relationship between IT and management. “Companies find improved communication from management,” del Conte says, “due to more transparency about expectations and the role that everyone has in satisfying customer expectations.”

Don’t try to become the perfect IT service team or address every security vulnerability the first time around. Hit the most important points and then progressively look deeper with every assessment cycle. As your operations improve, so will IT’s culture and its relationship with the business side. If ISO certification helps you prove that IT is way more than a cost center, it’s worth the investment.

Scripting is a popular and powerful choice to automate repeatable file transfer tasks. It can be horrifying, though, to discover just how many scripts are relied on for the successful operation of your infrastructure. This and the time they take to execute are sure to raise the ire of managers and executives. Here are some alternative file-based automation methods to reduce time and errors in common tasks that can benefit from file encryption, deployment redirection, scheduling and more.

DevOps and Automation

It used to be that deployment meant occasionally copying files to a set of physical servers that sat in a datacenter, maybe even on premises. Often this was done via FTP and terminal sessions directly with server names or addresses. If you were smart, you created scripts with the files, their locations and servers hardcoded into them.

Automated scripts are an excellent first step, but they have limitations. When a new server is added, an old one is upgraded or replaced, or virtualization changes names and addresses, the result is script failure. Changing OS platforms also means a single set of scripts won’t work across all of your servers. Scripts can be error-prone, too, and slow if they’re not compiled ahead of time.
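For a sense of why such scripts age badly, here is a deliberately brittle, hypothetical example of the hardcoded deployment script described above (the hostnames, paths and credentials are invented). Add a third web server, move a directory or mandate encrypted transfers, and it silently stops matching reality:

```python
# deploy.py -- the kind of script that quietly accumulates in a datacenter
from ftplib import FTP

SERVERS = ["web01.corp.local", "web02.corp.local"]      # hardcoded: web03 never gets the release
FILES = {
    "app.war": "/opt/tomcat/webapps",                    # hardcoded paths: break on an OS change
    "config.xml": "/etc/myapp",
}

for host in SERVERS:
    ftp = FTP(host)
    ftp.login("deploy", "P@ssw0rd")                      # hardcoded credentials, plain-text FTP
    for filename, remote_dir in FILES.items():
        ftp.cwd(remote_dir)
        with open(filename, "rb") as fh:
            ftp.storbinary(f"STOR {filename}", fh)       # no error handling: one failure, partial deploy
    ftp.quit()
```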

With the emergence of agile and DevOps practices, there’s no time to manage these ever-changing environments, so the simplest route is to not do it at all. But because you still need to deploy software somewhere, API-based systems help you achieve the automation required without hardcoding the details. The end product is a much more efficient file transfer process.

SLAs: Performance and Security

File-based automation reduces the time you spend scripting by handling tasks such as encryption or redirecting files to the right servers on arrival. The consistency and predictability of this process ensures you meet your service-level agreements (SLAs), which stipulate repeatability and the removal of errors. But in order to achieve the performance metrics you need when working in an agile and nimble organization, you need more than that.

An enterprise-grade managed file transfer solution enables you to transfer files reliably, securely and quickly. Look for a solution that offers an event-driven workflow wherein processes are kicked off either according to a schedule or on-demand based on a trigger. Additionally, file transfer workloads need to happen in parallel, simultaneously deploying across your environments to limit the time it takes to deploy changes.
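As an illustration of that parallel, trigger-driven pattern, here is a minimal Python sketch (hostnames, credentials and file names are placeholders; a managed file transfer product would add retries, scheduling, logging and credential management) that fans one release artifact out to several servers over FTPS at the same time:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
from ftplib import FTP_TLS

TARGETS = ["app01.example.com", "app02.example.com", "app03.example.com"]

def push(host, local_path, remote_name):
    """Upload one file to one target over FTPS (explicit TLS)."""
    ftps = FTP_TLS(host)
    ftps.login("deploy", "secret")     # in practice, pull credentials from a vault or directory
    ftps.prot_p()                      # encrypt the data channel, not just the login
    with open(local_path, "rb") as fh:
        ftps.storbinary(f"STOR {remote_name}", fh)
    ftps.quit()
    return host

# The "event" here is simply running the script; a scheduler or folder watcher could trigger it
with ThreadPoolExecutor(max_workers=len(TARGETS)) as pool:
    futures = [pool.submit(push, host, "release.tar.gz", "release.tar.gz") for host in TARGETS]
    for future in as_completed(futures):
        print("Deployed to", future.result())
```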

For peace of mind (yours, specifically), your file transfer solution needs to be hardened. Make sure it uses encryption for all file transfers and integrates with your enterprise identity management to control access to all of your environments. Ultimately, this helps you to conform to the requirements of the most regulated markets (health care and financial, for instance) — as well as local legislation. Your security control should be automated as well, through the use of policy enforcement with secure solutions for authentication (think RADIUS or LDAP).

Finally, you need to know the status of all transfer operations at a glance. With DevOps, constant process monitoring and measuring will lead to further improvements and the removal of bottlenecks. Ensure you have the proper level of reporting and visualization into your file transfers, including those completed, those that may have failed and those that are ongoing.

Moving Files To and From Anywhere

You may need to encrypt and move a file that was just extracted from a business system and is now sitting on a shared drive inside the trusted network. Maybe you need to move an encrypted file sitting on an FTP server in your business partner’s data center. You need the flexibility to encrypt, rename, process and transfer files from any server and deliver them wherever you need them.

Don’t Forget the Cloud

Whether you’re still working on-premises or you’ve already moved many of your systems to the cloud, your file transfer processes should work across both. The reality is that most organizations will continue to keep data living on both, likely settling on a hybrid on-premises and public-cloud mix for security and control purposes. Just as the cloud promises to transparently move user workloads across servers in both environments, your file transfer and deployment solution should do the same. In the end, good management will treat you like the hero you are.

Managing Remote Employees

Just 24 percent of workers do their best work in the office during business hours, according to “The Geek Gap” co-author Minda Zetlin, writing for Inc. In fact, telework is so appealing that nearly half of them would give up certain perks for a remote-work option, and 30 percent would take a pay cut.

Additional data from FlexJobs suggests managing remote employees can save businesses $11,000 annually for each person (you read that right), and that’s for everyone who works at least half-time from home. As if that weren’t enough, many telecommuters claim they’re more productive than their cubicle-inhabiting counterparts, and they’re also happier with their jobs.

For the IT department, managing remote employees poses two major challenges: secure connection and personal device usage. And when employees are offsite, success requires consistent communication and the clear definition of roles and responsibilities. IT departments that not only support but also empower remote work become big contributors to the company’s bottom line.

Security Concerns of Managing Remote Employees

The biggest challenges when managing remote employees, according to Microsoft technical solutions pro Robert Kiilsgaard, aren’t training or application troubleshooting; they’re actually login issues and secure connectivity. “As much as 30 percent of help-desk volume is related to just resetting passwords,” he says. “This is a huge time sink for the help desk, and a complete loss in productivity for the remote associate.”

Specializing in enterprise architecture and IT transformation, Kiilsgaard recommends an Identity-as-a-Service (IDaaS) solution, which allows you to manage granular access policies, provide single sign-on (SSO) functionality and facilitate self-service password resets. “If you provide a self-service portal for the end user, you have successfully eliminated that call volume. That doesn’t mean you’ve lowered your cost, but you have lowered your Level 1 service desk ticket queue workload and improved the customer-satisfaction part of your business.”

Accessing Applications

When managing remote employees, many organizations offer a patchwork of tools for application access, including virtual private networks (VPN), virtual desktops and third-party Software-as-a-Service (SaaS) sites. Barring current security concerns, Kiilsgaard also recommends offering a single portal to access all business applications.

If a single access point is a concern for you, there are also reputable applications that manage passwords via single sign-on. The user only needs to know one password and the application handles the rest. This is inherently safer, since the user doesn’t need to know the passwords to any of the business applications or services. It also avoids the cost of deploying, monitoring and managing VPNs and tunneling technologies.

Employees aren’t always savvy about avoiding public Wi-Fi when accessing applications, though, leaving them even more vulnerable to man-in-the-middle attacks. They also have to be trained on the risks of using mobile devices to access applications. These include:

  • Enabling remote wipe for lost and stolen devices
  • The responsible use of company data, including storage on personal devices and transmission in email
  • Using only authorized applications when collaborating, sharing and performing work on sensitive data, so third parties don’t gain access to this content

What If He/She’s IT?

Jeremy Cucco, deputy CIO for the University of Puget Sound in Tacoma, Washington, says some of the best IT teams he’s managed during his career have either worked from a remote location or included members who performed their jobs remotely. Unfortunately, not every IT position is conducive to telecommuting, and it’s important to make sure these roles are managed with this in mind.

“Functional or business analysts often require face-to-face interaction, and server and LAN administrators may need to work locally on machines rather than remotely,” Cucco says. “Allowing software developers and systems administrators to work remotely has often involved either frank discussions with onsite personnel or a documented policy indicating which positions will and will not be allowed to telework.”

Today’s most in-demand employees — those at the support desk among them — want employers that offer a remote-work option. For this reason, employers who accommodate telework gain a significant competitive advantage. “Telework does require a level of personal maturity,” Cucco says. “However, denying that privilege to all based on the limitations of a few is not an acceptable answer in today’s workplace.”

Making It Happen

With smart access policies, ongoing training and clear communication, the IT department can make itself a powerful partner in managing remote employees, whether its members work in-house or develop solutions from an offsite location. It’s a contribution that increases productivity, unleashes innovation through collaboration and builds the workforce of tomorrow.