Information security isn’t what it used to be — firewalls, although necessary, are no longer enough to prevent a data breach. The problem for IT is that the old methods of keeping data secure cannot stop intruders who, for instance, use sophisticated phishing attacks on unaware employees.

Ashok Sankar, director of cybersecurity at Raytheon-Websense, said in Computer Weekly that cybercriminals are determined to breach company security walls, no matter how long it may take them. But these concerns can’t pose a roadblock to innovations in, say, the cloud, and impede businesses in their efforts to access new markets and gain a competitive advantage.

RSA president Amit Yoran agrees, according to SC Magazine, calling infosecurity fundamentally broken. Firewalls and policed network perimeters are just things that make you “feel safe” but don’t address real security problems.

The evolution of security is widely discussed in the technology community:

Traditional approaches to security are making us more vulnerable to attack, suggests Yoran. It’s time to rethink security to become less reactive and more resilient.

Measure Your Detection Deficit

Teach employees to use all of their mobile devices, cloud applications and business innovations securely. “This means understanding their needs, explaining to them the security implications and coming to a consensus on what can and what cannot be done,” says Sankar. “If employees want flexibility, they must understand the responsibilities that go with that.”

Stop measuring security strength by the number of attacks a system has endured and stopped. Instead, track the time that elapses between a data breach and the moment the intruder is detected and contained — otherwise known as the detection deficit.
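The detection deficit is simple to compute once you log when a breach began and when it was contained. A minimal Python sketch, using hypothetical incident records (the field names, dates and figures are illustrative, not from any real dataset):

```python
from datetime import datetime

# Hypothetical incident records: breach start vs. detection/containment times.
incidents = [
    {"breached": datetime(2015, 3, 1, 8, 0), "contained": datetime(2015, 4, 12, 17, 30)},
    {"breached": datetime(2015, 6, 10, 2, 15), "contained": datetime(2015, 6, 25, 9, 0)},
]

def detection_deficit_days(incident):
    """Days elapsed between the breach and its detection/containment."""
    delta = incident["contained"] - incident["breached"]
    return delta.total_seconds() / 86400  # seconds per day

deficits = [detection_deficit_days(i) for i in incidents]
avg = sum(deficits) / len(deficits)
print(f"Average detection deficit: {avg:.1f} days")
```

Trending this average downward over time is the measure of progress the article argues for, rather than a count of blocked attacks.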

Firewalls Aren’t Impervious to Breaches

Firewalls also do little to contain an intrusion once it reaches the business level. To best protect your organization’s assets, prepare for an advanced persistent threat (APT), an attack that is typically targeted, patient and malicious.

Assess Your Loopholes and Know What to Protect

The first step is to prioritize. Align your security goals with those of business executives to determine which assets are most sensitive. “It is now imperative to develop a layered security approach that will amp up the security arsenal with a 360-degree visibility into all corners of the network,” warned Chloe Green, security reporter for Information Age.

Ultimately, you need to improve how you monitor for and detect a data breach, which can exploit loopholes in your security system that lockdown protocols cannot close once malware has been installed. Close those endpoints and you’ll be better able to protect your most important information.

What Absolutely Needs Securing?

According to a report by the privacy and data-protection team at Baker & Hostetler LLP, 36 percent of incidents arose from employee negligence — only 22 percent came from external theft.

Informing your employees not only about what information they must protect but also how they should protect it will eliminate the majority of your post-breach data-loss risk.

Preparing for an APT Prepares You for the Worst

If you’re going to contain the scope of a potential APT, a firewall won’t be enough. End-to-end encryption for data in motion and comprehensive monitoring of all inbound and outbound network traffic have to be top priorities. End-to-end encryption protects data being transferred or shared between endpoints, whether people or systems. Pair your traditional security solutions with advanced detection and real-time analytics, configured to flag malicious activity before it causes actual damage. By learning the traffic patterns of each IP-based device that connects to the network, you’ll be able to isolate a problem immediately if it occurs.
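One way to read “identifying patterns” per IP-based device is a per-device traffic baseline: learn each device’s normal volume and flag large deviations. A minimal Python sketch under that assumption (the IP addresses, sample volumes and three-sigma threshold are all illustrative, not a prescription):

```python
from statistics import mean, stdev

# Hypothetical per-device outbound traffic samples (MB/hour), keyed by IP.
baseline = {
    "10.0.0.5": [12.0, 15.0, 11.0, 14.0, 13.0],   # a typical workstation
    "10.0.0.9": [80.0, 95.0, 88.0, 90.0, 85.0],   # a busy file server
}

def is_anomalous(ip, observed_mb, sigmas=3.0):
    """Flag traffic more than `sigmas` standard deviations above the device's baseline."""
    samples = baseline[ip]
    return observed_mb > mean(samples) + sigmas * stdev(samples)

# A workstation suddenly pushing 400 MB/hour outbound stands out immediately,
# while normal fluctuation on the file server does not.
print(is_anomalous("10.0.0.5", 400.0))  # True
print(is_anomalous("10.0.0.9", 92.0))   # False
```

Real detection products model far more than volume (destinations, ports, timing), but the principle is the same: the baseline is per device, so exfiltration that would be invisible in aggregate traffic is obvious against one machine’s history.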

Security measures can help you minimize the looming threat of a data breach. It’s no longer practical — let alone sustainable — to approach problems with the idea that they can all be prevented once they touch your network.

As a marketer, I’m always interested in hearing what people have to say about promotions. Does a buy-one, get-one offer motivate you? How about a % off discount? Or what about a prize? Does it really make a difference? Do you care? 

With our latest major product release in May, WhatsUp Gold v12, I breathed a sigh of relief. Finally, new features and new functionality to brag about. But once again the pressure was on to get more people to ‘try’ and ‘buy.’ I got it! Let’s do another gimmick: “Is your network healthy? Try our software and be entered to win a Wii Fit” and “Buy and be entered to win one of twelve 1-terabyte hard drives.” So I ask you: does this type of promotion motivate you? Or are you of the mindset, “I just want the product; give it to me at the best price”?

We’re currently running a promotion on our website for a “free copy of VoIP Monitor.” What’s the catch? Well, you need to buy a new license of WhatsUp Gold (v12). That’s good, I guess, for those who are already ready to purchase and just happen to have VoIP, but does it motivate people to “try” the software first?

I’d be interested in hearing from IT professionals.

The SaaS web-based application delivery model provides corporations with a hosted set of business-centric applications without the need to purchase, maintain or customize the application to fit their unique needs.

Many organizations have adopted this model for sales, procurement, CRM and human resources applications, for example. Unlike the traditional software acquisition model, in which a corporation invests in an application and is required to build the infrastructure to support it, SaaS requires negligible upfront investment beyond user training. Application maintenance, upgrades and development are the SaaS provider’s responsibility. This is a very attractive value model for many companies.

When SaaS web-based applications are being evaluated and purchased by a corporation, the IT and network management functions are usually not included in the planning, evaluation and decision process, as IT is perceived as a roadblock. Most frequently, this effort is driven by the business unit or department accessing the application.

This lack of cooperation can cause problems for IT and network management after the application is brought online: they discover the application is deployed and being accessed only after the fact, usually when users complain about application performance because the application is bandwidth-intensive or the existing network infrastructure is near capacity. Another factor to consider: because all SaaS applications are accessed over the Internet, outside the managed corporate network infrastructure, they are subject to any number of issues, including forwarding delays, connection reliability and traffic contention.

Business units evaluating SaaS as an option need to include IT and network management to allow for resource planning and monitoring of end-to-end SaaS specific application traffic to ensure that availability and performance expectations are achieved.
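End-to-end monitoring of SaaS traffic can start as simply as timing requests against each provider’s endpoint and alerting when responses are slow or missing. A minimal Python sketch, assuming hypothetical health-check URLs and an arbitrary two-second performance target (neither comes from any real vendor):

```python
import time
import urllib.request

# Hypothetical SaaS endpoints a business unit depends on (illustrative URLs).
ENDPOINTS = [
    "https://crm.example.com/health",
    "https://hr.example.com/health",
]
THRESHOLD_SECONDS = 2.0  # assumed performance target, not a vendor SLA

def check_endpoint(url, timeout=5.0):
    """Return (reachable, elapsed_seconds) for one SaaS endpoint."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True, time.monotonic() - start
    except OSError:  # DNS failure, refused connection, timeout, HTTP error
        return False, time.monotonic() - start

if __name__ == "__main__":
    for url in ENDPOINTS:
        ok, elapsed = check_endpoint(url)
        status = "OK" if ok and elapsed <= THRESHOLD_SECONDS else "ALERT"
        print(f"{status} {url} {elapsed:.2f}s")
```

Run from inside the corporate network on a schedule, even a probe this crude separates “the SaaS provider is down” from “our own link is saturated” before users start calling the help desk.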

We recently wrote an entry about a blogger who predicted the death of IT.

Au contraire, my friends, IT is a strategic asset and in most cases a competitive differentiator, and there are plenty of examples from the Fortune 500 all the way down to small businesses. Within the context of IT, it all comes down to predictable and reliable delivery of information to both internal and external constituencies. The foundation of predictable and reliable delivery of this actionable information is the network infrastructure. Unfortunately, many IT executives view the network as a utility rather than an asset, and while this is certainly understandable, is it the right view?

The real test is when the network or a server goes down. How is it viewed then? The new reality is that downtime is money, no ifs, ands or buts.

IT and networking professionals need to highlight the criticality of the network as a business asset and not just a utility.

Who is more demanding, the IT employee or the IT manager? The general consensus is that the IT employees are. A recent survey shows that 18- to 31-year-old IT professionals are more demanding than previous generations and become disillusioned faster. Many millennials expect to start at the top and receive higher base pay. As a result, many IT organizations are seeing increased turnover as these employees find out that they actually have to work their way up, just as previous generations did.

As baby-boomers retire and leave an expertise gap, the next generation can learn a lot from these “experts” not only in gaining IT experience, but also in how to understand and manage themselves and the work environment they find themselves in. My advice is find a mentor and soak everything up like a sponge. Eventually, you will be the older generation and be facing the same dilemma with the next generation of “millennials”.

Virtualization continues to spread beyond its mainframe roots to encompass the network.

Cisco announced a new management blade for its 6500 router last month that evidently kicks butt: it doubles throughput, eliminates antiquated redundancy protocols and enables failover 20 times faster. It also eliminates the need for complex network architecture designs, multi-homing and multiple IP addresses for redundant standby equipment.

It still requires a significant hardware investment, but if you already have a redundant architecture in place that has proven slower, a $32K investment may be worth investigating to double throughput and cut failover times by 20x.

Expensive, but still cool stuff.

As Americans, we rarely live without the goods and services we desire. For years, we have heard experts like Alan Greenspan warn that when countries outside the United States see rapidly growing living standards, their currencies will appreciate against ours in parallel. For those of you who are Warren Buffett fans, he was right again: in 2002, Buffett began to purchase foreign currencies, betting on their appreciation versus the dollar. Even a wealthy country like the United States will eventually run out of steam after its citizens spend excessive dollars on foreign-produced goods.

Further, once countries like China acquire dollars through the $700 billion US trade deficit, they will begin to dump those dollars once they suspect their value is in jeopardy. A country like China, which holds over $1 trillion, wields great power in the interconnected global economy; if it begins dumping dollars in favor of euros, it will push the dollar to even lower levels.

For lots of software companies, the weakening dollar and weakening economy will lead to headaches in meeting growth and profitability objectives. Ironically, these companies tend to be US-market-centric, with a bias toward, or a misunderstanding of, what being a global company means. In contrast, international software companies find unseen benefits in dark clouds, and the current economic situation is a perfect example. Global markets offer growth, risk mitigation and excitement, but you need to choose wisely. It is unlikely that a market like Australia, with a population of 20 million people, will entirely offset a slowed US economy, but it is probable that a select list of markets will offer enough opportunity to supplement the decline in US markets.

Further, being global does not mean you need local representation in every country, though that decision depends heavily on product characteristics like price, complexity and training requirements, to name a few. For those who build easy-to-use products like WhatsUp Gold, the Internet and e-commerce offer great opportunities for revenue and profitability enhancement.

I had them all with me tonight: Information Week, CIO, Redmond, eWeek and Network World. Not surprisingly, the world of technology continues to be a world of contradictions, and that’s a good thing. It has been that way ever since we all witnessed the battles on the airwaves 10 years ago between Cisco, promoting the paperless office, and Xerox and Canon, promoting the highest-volume, fastest paper output. We in technology have been taking sides that contradict a colleague’s comment or belief ever since, continuing our debates over coffee, beer and long, inescapable flights.

Enter the time machine: here we are, 10-plus years later, in a world of new contradictions.

The jury is still out on SOA. Proponents argue that a distributed, nimble set of web services replacing large monolithic systems is the future, claiming that economies of scale, reuse and distributed, network-friendly design offer huge benefits for adopters. Yet early poll and interview results are uncovering hosts of problems that are frustrating the heck out of CIOs and their subordinates. Application performance, application reliability and security are all open questions in need of a solution.

The jury is in on virtualization, and the ruling looks like guilty for all. I am committed to virtualization like nobody’s business. Yet I am reading tonight that early adopters have already begun to uncover the pitfalls, and that it would be best to look to third-party management companies when deploying a virtualized environment. I can’t say this is a contradiction in pure form, but it fits my blog, so give me a little break here.

Last, the story that made my night: MIT on Thursday morning will unveil a new working forum whose goal is to expand Kerberos to wireless technologies. Isn’t Kerberos dead? I guess Kerberos really does have three heads, I mean lives.

I often debate with friends in the IT industry on the merits of chasing business in the SMB market. Many of them agree that growth in the SMB sector continues to be strong, but they argue that redesigning existing products for SMBs has proven to be a less-than-profitable venture. I disagree with the entire rationale, for reasons that might not be obvious.

The reason most such ventures have been unprofitable has little to do with the market and more to do with the philosophy of the 99.9 percent of companies that try to push complexity down to SMBs in new packaging. SMBs thrive on easy-to-use yet effective solutions for complex environments. Vendors who serve that market with dedication are profitable. Those who just repackage complexity under a simple title fail and go back to selling to the enterprise.

When will the vendors learn?

I have been out on the road the past few weeks, but I am glad to be back. I was reading about the latest data theft at Boeing today. A disgruntled employee, intending to hurt his employer, placed sensitive data on a thumb drive with the hope of leaking it to a local Seattle newspaper. As you probably guessed, this man is unlikely to receive any employee awards or merits. What really caught my eye in this story was the ‘potential’ financial impact had the newspaper not done the right thing: a whopping $5 billion to $15 billion loss was possible. If you’re like me, you’re wondering what the heck the data said. Did it unveil the material makeup of the new Dreamliner, or was it indicative of bad business practices?

One of my favorite security lecturers is Bruce Schneier. If you ever have the chance to listen to or speak with Bruce, you’ll be entertained and well educated by the end. In reviewing this data breach, Schneier brings up valid points about practicality. “If a company hires an untrustworthy employee, there is almost nothing it can do to prevent theft,” Schneier argues. “What’s done in African mines is they do full-body cavity strip searches every time they leave. That works,” Schneier says.

I’ll talk more about USB thumb drives in a future entry, but in the meantime, check out RedCannon Security. I can’t validate whether it works yet, but these guys caught my eye as a needed innovation in the security space. RedCannon says it can restrict the types of USB drives that are plugged into computers, monitor what data is pulled from a hard drive, and remotely destroy content if the thumb drive is inserted into an Internet-connected computer. As an extra safeguard, RedCannon says its products can set USB devices to stop working when they are not inserted into a computer connected to the Internet.

I admittedly stole the headline from the direct and welcome article written by Art Wittman, editor for Network Computing, on the topic of security concerns with virtualization.

Well-balanced security professionals know how organizations can dynamically balance the parallel goals of IT performance with security concerns. Still, the “negatites” (negative, often academic technologists) roam in the background preferring we all revert to manual calculations and typewriters in the name of uptight and non-attainable security standards.

The latest attack from the “negatites” focuses on the progressive world of virtualization. Virtualization is a technology I personally believe in and I believe it will be one of the more “game-changing” technologies this generation of IT buyers and sellers will participate in.


If you’re wondering why confusion is mutating from one area of the network to the next, sit down with a large cup of coffee and read a week’s worth of hypotheses by varying authors regarding the shape of converging networks and the resulting change in the network administrator’s role.

Last night I read the follow-up posting from a GigaOM article debating the merits of having a network engineer/administrator on the staff of a Web 2.0 company. The post’s author, Allan Leinwand, says a Web 2.0er recently told him that connecting to the Internet was like connecting to the electrical grid – you don’t need an electrical engineering degree for the latter and you don’t need a network engineer for the former.

I call this absolute nonsense and commend the Network World online staff for correctly pointing out, “the very servers that you read NW blogs on are at a co-location facility. Had we just allowed the third party to manage our systems, we may not have discovered a peering issue that was causing a decrease in site performance. Maybe these Web 2.0ers need to think a little more deeply about the plumbing side of the Web instead of just trying to impress their peers with the latest AJAX script.”

The world of Web 2.0 is exciting, but it’s my strong opinion that it creates new challenges and opportunities for the network administrator, not the elimination of his or her job. The comparison of Web 2.0 to an electrical grid is accurate on the surface, but it does not capture the enormous complexity Web 2.0 introduces — after all, I can’t recall another technology initiative that promised connectivity to every device in the network.

I think we better hold on to the network administrators.