Application monitoring can help troubleshoot bandwidth bandits and other disruptions (credit: Jerry John | Flickr)

Cloud computing is a ready-made revolution for SMBs. Forget about server downtime; elastic computing and API-driven development are perfect for smaller organizations with project funding in the mere thousands of dollars.

All that agility is allowing information architects to think big — smartphone connectivity, IoT, lambda architecture — with existing app performance monitoring standards becoming more Web and socially aware.

Perfect world, right? Well, maybe a “perfectable” world. While developers are doing the elastic, agile thing — leveraging the power of pre-built tools through IFTTT or Zapier and getting Big Data tools from GitHub — they’re making assumptions about available bandwidth. They may even add Twilio to the mix so the company can SMS you in the middle of the night when their app hangs.
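
That middle-of-the-night text is easy to wire up. As a rough illustration (a sketch, not a recommendation of any particular stack), here is a minimal example using Twilio's Python helper library; the health-check URL, credentials and phone numbers are placeholders you would replace with your own:

    # Minimal sketch: text the on-call phone when an app health check fails.
    # Assumes the Twilio Python helper library (pip install twilio) and the
    # requests library; the URL, credentials and numbers are placeholders.
    import requests
    from twilio.rest import Client

    HEALTH_URL = "https://app.example.com/health"   # hypothetical endpoint
    ACCOUNT_SID = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    AUTH_TOKEN = "your_auth_token"

    def check_and_alert():
        try:
            ok = requests.get(HEALTH_URL, timeout=10).status_code == 200
        except requests.RequestException:
            ok = False
        if not ok:
            client = Client(ACCOUNT_SID, AUTH_TOKEN)
            client.messages.create(
                to="+15555550100",        # on-call engineer
                from_="+15555550101",     # your Twilio number
                body=f"ALERT: {HEALTH_URL} is not responding",
            )

    if __name__ == "__main__":
        check_and_alert()

Run it from cron every few minutes and the 3 a.m. text takes care of itself.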

App Performance: ‘It’s Spinning and Spinning’

“I can’t do anything. It just keeps spinning,” you’re thinking. Classic Ajax loader. Users from a different era prefer freezing metaphors, but those are just as obvious, and they don’t encompass today’s issues: “My email won’t send,” “My daily sales dashboard won’t load” and, now, “the whole neighborhood’s smart meters are offline.”

A new set of network demands is rounding the corner, foreshadowing a greater need for application performance monitoring: SIEM, Big Data, IoT, compliance and consumer privacy audits. It is the slow death of offline archiving. And for each, file sizes are on the rise and apps are increasingly server-enabled — often with heavy WAN demands.

Open Source, DIY and Buy-a-Bigger-Toolbox

Presented with bandwidth concerns, some support specialists (or DIY-minded developers, as is often the SMB way) will turn to open-source tools like Cacti to see what they can learn. And they may learn a lot, but often the problem lies deeper inside an app’s environment. As one support specialist (known as “crankysysadmin” on Reddit) explained, “It isn’t that easy. There are so many factors that affect performance. It gets even more tricky in a virtualized environment with shared storage and multiple operating systems and complex networking.”

Another admin in the Reddit thread agreed: In terms of app performance monitoring, he responded, “there’s no one-size-fits-all answer. What type of application are we talking? Database? SAP? Exchange? XenApp? Is it a specific workflow that is ‘slow’? What do you consider ‘fast’ for that same workflow?”
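
Fair points. Still, for the DIY-minded, the arithmetic that pollers like Cacti perform under the hood is worth knowing: sample an interface’s octet counter twice and convert the delta to bits per second. A minimal sketch follows; poll_if_in_octets is a hypothetical stand-in for however you actually read IF-MIB::ifInOctets (pysnmp, snmpget, a vendor API), and the link speed is an assumption you would set per interface:

    # Sketch of the counter-delta math behind bandwidth utilization graphs.
    # poll_if_in_octets() is a hypothetical placeholder for a real SNMP read
    # of IF-MIB::ifInOctets on the given interface.
    import time

    def poll_if_in_octets(host: str, if_index: int) -> int:
        raise NotImplementedError("replace with a real SNMP read")

    def utilization(host: str, if_index: int, interval_s: int = 60,
                    link_mbps: float = 1000.0):
        first = poll_if_in_octets(host, if_index)
        time.sleep(interval_s)
        second = poll_if_in_octets(host, if_index)
        delta = (second - first) % 2**32     # tolerate 32-bit counter wrap
        mbps = delta * 8 / interval_s / 1_000_000
        return mbps, 100.0 * mbps / link_mbps  # throughput, % of link

It won’t untangle a virtualized storage stack, as the admins above note, but it will tell you quickly whether the pipe itself is saturated.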

Event-Driven Heads-Up for App Hangs and Headaches

App usage spikes have many possible causes, which is precisely why a commercial app monitoring tool that is easy to use in a pinch can ultimately pay for itself. Depending on your site’s update policies, the types of applications supported, the regulatory environment, SLAs and cloud vendor resources, you’ll sooner or later be faced with:

  • Massive updates pushed or pulled unexpectedly.
  • Surprise bandwidth-sucking desktop apps.
  • Runaway developer apps.
  • App developer design patterns tilted toward real-time event processing.
  • Movement toward the more elastic management of in-house resources.
  • Management of bandwidth usage by cloud service providers.
  • A need to integrate configuration management with monitoring.
  • Increased support of operational intelligence, allowing for real-time event monitoring as described by Information Age.
  • Monitoring to develop application-dependent situation awareness.

The last of these, situation awareness, deserves emphasis. Consider the impact of moving monthly reports to hourly, or a BI dashboard suddenly rolled out to distributor reps. Situational awareness at the app level can ward off resource spikes and sags, or even server downtime.

Identify What’s Mission-Critical

Whether the monitoring answer is open source or commercial depends partly on whether your apps are considered mission-critical. For some organizations, VoIP and Exchange have been those applications. The SLA expectation for telephony, for example, is driven by the high reliability of legacy on-premises phone systems, which rarely failed. SLAs for VoIP are often held to the same standard.

And what’s mission-critical is probably critical for job security. If the CEO relies on a deck hosted in SharePoint for a briefing at a major conference, and he can’t connect at the right moment — well, you may wish you had a bigger IT staff to hide behind.




CJEU Rejects Safe Harbor Rules for User Data Transfer

If you’ve been following the news, the CJEU has just rejected the safe harbor rules put into place 15 years ago. The ruling could leave many global companies in a tough spot, specifically companies that rely on the free transfer of data between the EU and US. Companies likely to be affected include not only US social media sites, but also US cloud file share sites like Dropbox (and their customers who use those services to store EU citizens’ personal data), global retailers with buyers in the EU, and any US business that manages personal data of EU citizens.

User Privacy Impacts ‘Business As Usual’

Although the changes are not immediately in effect, the demands of user privacy will likely impact ‘business as usual’. The ruling is an obvious backlash to NSA surveillance of citizens’ online activities without their knowledge or consent. But the cost to global businesses is that it’s going to be harder to provide services and data between the US and Europe.

“If the Safe Harbor rules in place since 2000 are done away with, each country in the European Union could potentially set its own privacy rules and regulations, creating enormous barriers to U.S. firms doing business there.” – USA Today, Europe’s top court rejects ‘Safe Harbor’ ruling

CISOs at global companies are now scrambling to find ways to comply with the new ruling. It goes without saying that user privacy is extremely important and should be a fundamental right, but this ruling affects more than Facebook and Google, who may have anticipated and already addressed the issue within their organizations. It will most likely change how companies handle data flows between the two continents. About half the world’s data is exchanged between Europe and the US, and rejecting safe harbor means drastic changes for small and medium-sized businesses alike.

When I spoke with my colleague Alessandro Porro in London this morning about the news, he had the following to say:

“The strike down of the Safe Harbor agreement by the Court of Justice of the European Union (CJEU) adds a large amount of uncertainty and risk to any enterprise whose business involves data movement between the EU and US. Safe Harbor was found not to meet the requirements of the Data Protection Directive. Whilst the EU’s general approach to data protection has been agreed, the actual regulation is still in consultation, and so there could be the flexibility to include clear guidance for these firms. However, it would be fair to assume that this could impact the target adoption date, which is currently the end of the year. Businesses should start working immediately to audit their data sharing practices, including use of US cloud sharing services like Dropbox, so that they understand exactly where they stand and are ready to act when further guidance is issued.”

Tough for Tech But Win for User Rights

On the other side of this, advocates of user privacy as a fundamental right are cheering a huge win. Edward Snowden was quick to tweet about the ruling from his new Twitter handle.

In either case, it will be interesting to see how the tech industry reacts to this. Companies will need to start getting a little more creative about how they share data between the US and EU.

What is your company doing to adjust to the new rules?


It takes two to make a company. Sounds like a cliché, you say? Or is it a bold prediction from the retiring Cisco CEO? In his last keynote as CEO, John Chambers envisioned that many large companies of the future will have only two employees – a CEO and a CIO. All other functions will be outsourced. Did Cisco Live exhibit trends that could support this type of structure? Let’s take a look at three trends that I observed:

Trend #1: Cloudy with no chance of rain

The writing is clearly on the wall for custom, underutilized hardware. Whether it is in networking, computing or storage, customers are embracing (demanding?) the flexibility to purchase resources on demand; that is, modular, interoperable software that can be deployed on commodity hardware. The hardware can be replaced at any time, with products from any vendor, without the applications missing a heartbeat. Workloads can run anywhere without the worry of downtime (no rainy days!). This can be in a private cloud (racks of commodity equipment) or elastic capacity bought in the public cloud. More importantly, these applications and workloads are increasingly directly related to the business function provided by the company. Any other supporting functions like sales, CRM, accounting and collaboration tools are outsourced. Clouds, public or private, are changing how business is run. Businesses will continue to buy capacity in an elastic manner to adapt to changing business needs. And all they will need is a CIO to make sure it all keeps working seamlessly. At Cisco Live, this trend was clearly evident in Cisco’s Application Centric Infrastructure.

Trend #2: Connected Everything

On my way to Cisco Live, I traveled with my personal administrative assistant. She told me when my airport shuttle was arriving, which terminal and gate my flight was leaving from, and which train I needed to take to get to the hotel. And she never missed a beat. Her (his?) name is “Google Now” and s/he is helped by an army of robots connecting everything imaginable. Needless to say, I didn’t need to visit the “Connected Everything” exhibits at Cisco Live to be convinced that tomorrow’s traveling CEOs may not need a dedicated human assistant. This function is being automated where possible and outsourced where not.

Trend #3: Eyes on Everything

So, who stands guard over all of this outsourced and automated infrastructure to make sure critical business functions are not impacted? Infrastructure monitoring was the third prominent theme at Cisco Live. More specifically, it came with the mindset of a “Single Pane of Glass”. When business-critical needs are met by a diverse set of resources, the need for a single pane of glass is even greater. Automation and correlation of the raw data from multiple sources, translated into meaningful business metrics, would allow both the CEO and the CIO to make decisions in real time when necessary and to generate analysis supporting those decisions.
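
To make that correlation idea concrete, here is a toy sketch of the roll-up step; the source names, services and samples are invented for illustration:

    # Toy "single pane of glass" roll-up: raw health samples from several
    # hypothetical monitoring sources are merged per business service into
    # one availability figure. All names and data here are invented.
    from collections import defaultdict

    raw_samples = [
        {"source": "netflow",   "service": "ordering", "ok": True},
        {"source": "synthetic", "service": "ordering", "ok": False},
        {"source": "app_logs",  "service": "billing",  "ok": True},
    ]

    def roll_up(samples):
        by_service = defaultdict(list)
        for s in samples:
            by_service[s["service"]].append(s["ok"])
        # One number per service, regardless of which tool saw the sample.
        return {svc: 100.0 * sum(oks) / len(oks)
                for svc, oks in by_service.items()}

    print(roll_up(raw_samples))   # {'ordering': 50.0, 'billing': 100.0}

The real work, of course, is in normalizing what each source means by a sample; the roll-up itself is the easy part.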

Of course, we may sound overly imaginative with the notion of a two-person company. But a two-function company (a business function and a technology function) is certainly within the realm of possibility. And if Chambers is right, we will see companies reorganizing around these two disciplines.

Our own single pane of glass just got better. Check out the new WhatsUp Gold version 16.3.


In just a few days we’ll be listening to “Auld Lang Syne” and watching the ball drop in Times Square. As we plan for 2015, I found myself reading Gartner’s Top 10 Strategic Technology Trends for 2015, and I want to share a few thoughts based on two of them:

  • Cloud/Client Computing: For businesses, Cloud/Client Computing has an additional component beyond Gartner’s omni-portable linkage between the cloud’s compute/data/management and client devices. Apps for the business cannot be viewed in isolation. Beyond data synchronization, IT will also have to address the integration layer between public cloud and private cloud, and between cloud and on-premises applications, for rich sharing and use of data within business workflows.
  • Risk-Based Security and Self-Protection: We seem to have reached a tipping point that Gartner alludes to: security can no longer be fully managed by IT. There are just too many threats, and the paradigm shift of applications themselves pre-empting some of these threats will be welcome. Gartner correctly views this as part of a multifaceted approach. We believe that monitoring of how threats spread will lead to new dynamic response methodologies, perhaps bot-implemented, going well beyond today’s analysis of threat signatures. Stopping threats rather than dealing with their consequences is something for IT to look forward to.

Speaking of stopping threats, are you constantly on edge about the safety of your stored and transferred files? Using the right file transfer system is paramount in securing files and sensitive data. The MOVEit Managed File Transfer System is designed specifically to give control over sensitive data to the IT department, to ensure better security throughout the entire file transfer life cycle. Download our white paper entitled Security Throughout The File Transfer Life-Cycle to learn more.

As we head into 2015, what will the New Year have in store for IT? Only time will tell!

The evaluations are complete and the decision has been made: a move to the cloud is in the best interest of your organization. Transferring workloads to the cloud to free up or discard costly on-premises resources in favor of the fast deployment and flexibility of an elastic environment has overwhelming appeal, but now what? Despite the many advantages of a cloud environment, there are still pitfalls to navigate in order to ensure a positive engagement and user experience. To that end, I would offer two pieces of advice to colleagues looking to move their organization from a strictly on-premises environment to the cloud.

First, pick the right provider. While this may seem like an obvious and simplistic statement, I can’t stress enough how important it is, and I would caution that many cloud migrations have met an untimely demise due to a less-than-adequate partner. When evaluating service providers, there are certain non-negotiable items you must account for. Chief among them are security, reliability and responsiveness. Like it or not, there is an element of control you are ceding in this relationship, so top-notch support and trust are paramount. You want a secure, integrated, centrally managed and easy-to-use environment with service level agreements (SLAs) that commit to minimum standards of availability and performance, especially at peak demand. Timely responses to change requests, backup needs and security patches are also key considerations.

Second, choose the right workloads. The cloud can be a powerful and efficient tool for your business, but that does not mean every application is best suited to reside in a cloud environment. When developing your integration strategy, keep in mind that low- to medium-security workloads, those without stringent latency requirements, and those that are elastic with variable traffic will work well. Some workloads need data to be frequently pulled in-house for use by other systems and are perhaps best left in-house. High-security and compliance monitoring needs are also better suited to on-premises deployment. Keep integration requirements in mind, as some workloads tied to proprietary hardware are not good candidates for public clouds but may be fine for a private or hybrid environment.

The cloud can transform your organization if you manage it correctly, but it takes due diligence on your part to ensure that the move goes as planned. By doing your research ahead of time and developing a list of key considerations for your business, you can ensure that the process will be both smooth and successful.


According to a recent Ponemon Institute report, 72 percent of the 600 IT professionals surveyed believed their cloud service providers would fail to inform them of a data breach involving the theft of confidential business data, and 71 percent believed the same for customer data.

Healthcare organizations have been hesitant to relinquish any perceived control over their information, and yet the investments and resources required to securely store and manage files on-premises have become a burden most facilities can no longer shoulder. IT teams lack the bandwidth and expertise to manage the growing volume and traffic of Protected Health Information (PHI). The move to the cloud has become inevitable given the increasing complexity and burden of managing compliance processes.

Moreover, since the HIPAA Omnibus Final Rule took effect in September 2013, compliance with the Health Insurance Portability and Accountability Act of 1996 (HIPAA) has never been more demanding. With security breaches occurring at an alarming rate and federal regulations expanding, the push toward compliance has driven businesses large and small to explore the requirements – and the options available – for achieving and maintaining HIPAA compliance.

Cloud-based solutions provide significant value for the healthcare industry, giving organizations superior security and control when managing sensitive health data, especially PHI. In speaking with our customers in organizations required to adhere to HIPAA regulations, we hear that a cloud-based managed file transfer (MFT) solution offers numerous advantages: industrial-grade security, lower risk, reduced time and resources needed to achieve and maintain HIPAA compliance, higher reliability and availability backed by service level agreements, and cost savings as IT staff is freed up to focus on other operational tasks.

The benefits of cloud provide a compelling reason for organizations to move to a managed cloud environment; here are a few best practices to keep in mind:

  • Invest in partners that are well-equipped to manage the breadth of HIPAA standards, and who are able to provide the tools needed to demonstrate compliance to your auditors;
  • Make sure to look for partners that provide a packaged HIPAA compliant environment that satisfies electronic protected health information (ePHI)-related legal obligations in HIPAA/HITECH legislation; and
  • Recognize from the start that your HIPAA compliance will usually involve a hybrid solution that combines both cloud and on-premise elements. A combination can provide the enabling “fabric” that will make it possible to do business moving forward.

To read more on this topic, check out my full article in HITECH Answers.

Recent news from Intralinks is just the latest instance in which the security of Enterprise File Sync and Share (EFSS) vendors like Box and Dropbox has been questioned. The EFSS rival reports that generating links to share documents can put sensitive data at risk through several basic flaws. Even when a link is generated to be accessible only by trusted recipients, it turns out that it can actually be viewed by third parties (aka not the people you want accessing it). Intralinks said it discovered the vulnerability as part of Google AdWords research.

While the companies scramble to address the issue and patch the flaw (at the time of publishing, Dropbox had issued a fix), the news presents an opportunity to once again distinguish EFSS from managed file transfer. We spend a lot of time talking about this with customers and prospective clients, and we recently developed a white paper and blog post on the topic. Check them out and let us know what you think.
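
For the curious, the usual defense against guessable or over-shared links is to sign each URL and give it an expiry, so a leaked copy stops working. Here is a minimal sketch of that general pattern, not any vendor’s actual scheme; the secret, host and paths are placeholders:

    # Sketch of HMAC-signed, expiring download links. This illustrates the
    # general pattern only; SECRET, the host and the paths are placeholders.
    import hashlib, hmac, time

    SECRET = b"server-side-secret-rotate-me"

    def make_link(path: str, ttl_s: int = 3600) -> str:
        expires = int(time.time()) + ttl_s
        msg = f"{path}|{expires}".encode()
        sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
        return f"https://files.example.com{path}?expires={expires}&sig={sig}"

    def verify_link(path: str, expires: int, sig: str) -> bool:
        if time.time() > expires:
            return False                     # link has expired
        msg = f"{path}|{expires}".encode()
        expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, sig)  # constant-time compare

Managed file transfer systems go further, layering authentication, audit trails and policy on top of the transport itself.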

I recently attended CIOboston, a CIOsynergy event headlined as “A New Dimension to Problem Solving Within the Office of the CIO”. We talked about paradigm shifts propelled by technologies like the cloud, the necessary new engagement models for business and IT, and the changing world of expectations, to name a few topics. But before getting to all this, our moderator Ty Harmon of 2THEEDGE posed a simple question to the 50 or so attending CIOs and senior IT heads: “What are your challenges?”

Here are the answers that I have assembled. I think there is value in seeing what was/is top of mind for IT leaders in raw form:

  • How do we make the right choices between capital and expense?  Service offerings are growing and additive – the spend never ends.
  • How do we integrate multiple cloud vendors to provide business value?
  • User expectations are being set by the likes of Google and Amazon for great UX, 7X24 support, etc. – but it is my IT staff that is expected to deliver all that on our budget. The business does not want to see the price tag – but they want the same experience that is available at home from these giants.
  • IT needs to run like a business but this takes a lot of doing. It matters how we talk and collaborate. We have to deliver business results that must be measurable.
  • Adoption of the cloud is a challenge. How do we assess what is out there? It is not easy to do apples-to-apples comparisons and security is a big concern.
  • How do we go from private to public cloud? Current skill sets are limited.
  • We are constrained by vendors that are not keeping up with the new technologies! One piece of critical software may want an earlier version of Internet Explorer to run; another may use an obsolete version of SQL Server, etc. This clutter prevents IT departments from moving forward.
  • Business complexity is a challenge. IT is asked to automate – but we must push back to first simplify business processes.
  • “Shadow IT” is an issue. A part of the business goes for a “shiny object” rather than focusing on what is the problem that really needs to be solved. They do so without involving IT. Then IT is expected to step in and make it all work, integrate with other software and support it.
  • Proving ROI is a challenge.
  • Balancing performance, scalability and security is tough.
  • How do you choose old vs. new, flexibility vs. security? It isn’t easy.
  • How do we support more and more devices?
  • How do you fill security holes that are in the cloud?
  • How do you manage user expectations and find the balance in supporting them when you have limited resources?

Many heads nodded as these challenges were spoken of. But all agreed that these are exciting times, and IT will push forward through them and be recognized as the true business enabler that it is. What are your thoughts? Were you nodding your head at these questions?


Earlier this week we published new survey findings around IT frustrations with manual file transfer, with the vast majority of respondents equating the process with sitting in traffic. TechRadar Pro reporter Juan Martinez wrote a story about the findings, and I had the chance to catch up with him via email on a few questions he had.

Relatively new to the managed file transfer (MFT) space, Juan wanted to understand why file transfer can be so challenging for today’s organizations, and where the technology is headed. These two great questions get at what’s driving file transfer today, and I thought the content of our email exchange would be interesting to anyone curious about the future of MFT.

So why is file transfer a challenge? For a lot of reasons – but mainly because it’s becoming increasingly complex, with end-user adoption of EFSS (enterprise file sync and share) solutions that not only create data security issues, but also result in additional systems for IT to manage and support. And at the high end, MFT can get absorbed into major IT undertakings that require an immense investment and a consultative implementation – rather than an out-of-the-box solution.

The challenge in getting managed file transfer right is balancing the needs of collaborative file sharing against file-based system-to-system integration. End users demand simple file sharing solutions that are quick to get started with, while IT demands compliance with corporate and regulatory security standards. It is easy to focus on one end of this spectrum while ignoring the other. Ipswitch understands that there are multiple file transfer scenarios and that organizations today are looking to centralize and consolidate their file transfer systems into one secure, ready-to-use solution. One major area to look into is complete visibility and control over file transfer processes – this is becoming increasingly important as compliance mandates proliferate and become more encompassing.

And so, in our view, MFT is clearly evolving, with the market moving toward secure, manageable and scalable systems at the core. But MFT is more than just file transfer. Ipswitch sees the need for tightly integrated transfer automation around the system core that allows IT to manage the exchange of any volume of transfers, while efficiently processing files to prepare them for the next step in a business process.

We understand that transfers happen in the context of B2B relationships, and we envision a system that wraps every exchange in metadata about the partners and workflows served by exchange events. We imagine a system in which a broad range of end-user and system-to-system workflows can be accommodated, with client tools that synchronize across partners, empower mobile workers, automate local and remote transfer processes, and intelligently control the flow of content between partners.

As I noted to Juan, Ipswitch has work going on today in all of these areas, and our customers should expect great things in the years to come.

To read Juan’s story, click here: IT Professionals are dissatisfied with file transfer processes, concerned about security

Let’s face it: For many companies that handle payment card data, the search for a safe and secure way to store and transfer information in the cloud hasn’t always led to a feeling of full-blown confidence. And the reality of so many new breaches doesn’t help.

While the road to PCI compliance can seem long and daunting, it is possible – and with the right guidance, can be easier than you thought. So, for those feeling like pulling their hair out, worry not!

Check out this article in Retail Online Integration in which I outline four actions that are important to making PCI compliance a tangible and achievable reality:

  • Understand the difference between PCI compliance and certification,
  • Get the business involved,
  • Develop a plan, and
  • Make education a priority.

Retailers and other companies required to be PCI compliant: we’d love to hear from you – please share your experience or questions.