Hybrid IT – The New World Order?

Over the last year or so, there has been a lot of publicity around cloud computing—both public and private. Most of the publicity seems to be centered around how businesses need to move away from their physical on-premises systems to a “pure” cloud-based architecture. However, is this really the case?

Cloud-based networks have many benefits, but so do physical on-premises networks. Which one is better? The answer appears to be both. For instance, public cloud solutions are very flexible and allow administrators to spin applications up and down to match business needs, which helps you right-size costs. On the other hand, because of the architecture, IT personnel who have shifted their networks from a physical on-premises environment to a cloud environment have reported several unexpected performance problems. Notable security data breaches have occurred as well: in 2017 alone, 2.6 billion cloud data records were breached. In addition, compliance reporting can still be painful.
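To make that elasticity concrete, here is a minimal sketch of scheduled scaling, assuming AWS EC2 Auto Scaling and the boto3 library (the group name, schedule, and sizes are hypothetical placeholders; other providers offer equivalent mechanisms):

import boto3

autoscaling = boto3.client("autoscaling")

# Scale a (hypothetical) "app-asg" group up for business hours...
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="app-asg",
    ScheduledActionName="business-hours-up",
    Recurrence="0 8 * * MON-FRI",   # 08:00 UTC on weekdays
    MinSize=4,
    MaxSize=12,
    DesiredCapacity=6,
)

# ...and back down overnight so you stop paying for idle capacity.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="app-asg",
    ScheduledActionName="overnight-down",
    Recurrence="0 20 * * MON-FRI",  # 20:00 UTC on weekdays
    MinSize=1,
    MaxSize=4,
    DesiredCapacity=1,
)

Scheduled actions like these are the simplest form of right-sizing; demand-based scaling policies can refine them further.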

The performance, security, and compliance issues above stem from one fundamental misconception. While moving workloads to the cloud may be as simple as a “lift and shift” approach, moving everything else (like network monitoring, performance monitoring, security architectures, troubleshooting activities, and compliance activities) is not that simple. The key to a successful deployment is knowing ahead of time where the issues lie and addressing them upfront—before they become a service-impacting problem. If employees and customers cannot access applications, the result will be a negative impression.

Let’s look a little deeper into the security concern. Some would argue that most, if not all, of those cloud breaches and performance problems were due to user error. Whether or not that is true, it is beside the point: any exposure of sensitive personally identifiable information (PII) will still result in government fines, lawsuits, a tarnished brand, and probable job loss for the individuals responsible.

In actuality, there probably is a “security skills” shortage for businesses moving to the public cloud, as cloud security best practices take even experts years to master. However, there are also fundamental architectural differences between cloud and on-premises environments. This means that security must be a forethought, not an afterthought.

As an example, inline security tools are not an option for most public cloud solutions. Public cloud vendors do not give customers access to their network and system layers to deploy inline security tools (e.g., an intrusion prevention system (IPS), data loss prevention (DLP), or a web application firewall (WAF)), as this would create a security risk to their own network. So, if you plan to deploy inline security protection, understand that it won’t be the “bump-in-the-wire” configuration you are used to with on-premises devices, like a typical IPS.

The inability to deploy inline tools obviously creates a risk to your cloud instance that you will need to address. So, how do you secure your environment now? One option to mitigate the threat is a hybrid architecture that keeps your existing security tools on the physical premises to inspect high-risk data (or even general data if you want). Depending on your risk plan, this may provide the protection you need and reduce business risk to an acceptable level. Note that most cloud vendors charge you to export data; however, the bandwidth costs can be limited by transferring only the relevant data to the on-premises tools, as sketched below.
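As one sketch of what “export only the relevant data” could look like, the following assumes a Linux capture host running Python with Scapy, and uses a BPF filter so that only traffic you classify as high risk is written out for transfer to the on-premises tools (the ports and spool path are hypothetical):

from scapy.all import sniff, wrpcap

# BPF filter: only the database and web traffic we consider high risk.
# Everything else stays in the cloud and never incurs egress charges.
HIGH_RISK = "tcp port 1433 or tcp port 3306 or tcp port 443"

def export(pkt):
    # Append matched packets to a capture file that a separate job
    # ships to the on-premises security tools (hypothetical path).
    wrpcap("/var/spool/export/high_risk.pcap", pkt, append=True)

# Requires root / CAP_NET_RAW to capture on the interface.
sniff(filter=HIGH_RISK, prn=export, store=False)

Because only matched packets ever leave the cloud, the egress bill tracks your risk plan rather than your total traffic volume.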

With regard to performance concerns, these are not isolated observations. According to research from Dimensional Data in late 2017, at least half of the companies surveyed experienced application performance problems. Additionally, 88% of companies surveyed experienced some sort of issue with their cloud environment due to a lack of visibility into what is, and is not, happening within that environment.

Sanjit Ganguli of Gartner Research conducted a separate poll at the December 2017 Gartner Data Center Conference and found that 62% of respondents were not satisfied with the monitoring data they get from their cloud vendor now that they have moved to the cloud. In addition, 53% said that they were blind to what happens in their cloud network.

This lack of visibility is understandable. During and after migration to the cloud, you will not have clear visibility into the network layer. Cloud service providers will only give you information about the cloud network and some parts of the operating system: summarized metadata on cloud-centric resources (network, compute, storage), such as high-level metrics (e.g., CPU performance, memory consumption) and some log data.
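To see how summarized this metadata really is, here is a minimal sketch assuming AWS CloudWatch via boto3 (the instance ID is a placeholder); what comes back is five-minute statistics, not packets:

from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch")
end = datetime.now(timezone.utc)

stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    StartTime=end - timedelta(hours=1),
    EndTime=end,
    Period=300,                       # 5-minute summaries -- note the coarseness
    Statistics=["Average", "Maximum"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Average"], point["Maximum"])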

What the cloud providers and other cloud tools do not provide is network packet data. This data is absolutely necessary for security forensics and for troubleshooting using root cause analysis. Data loss prevention (DLP) tools and most application performance management (APM) tools depend upon packet data for problem analysis. Typical cloud tools provide limited data that is often time-delayed, which can dramatically impact tool effectiveness. For instance, tactical data loses 70% of its performance monitoring value after 30 minutes.

Cloud providers also do not provide user experience data or the ability to watch conversations. Specifically, this means that you cannot accurately gauge customer quality of experience from provider-delivered data. And while the flow data provided lets you see who the talkers are, it does not contain the details of the conversation.
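A simplified sketch of the fields a NetFlow/IPFIX-style record typically carries shows the gap (this is an illustrative Python structure, not any vendor’s actual schema):

from dataclasses import dataclass

@dataclass
class FlowRecord:
    """The 'who talked to whom' summary a flow export typically carries."""
    src_ip: str
    dst_ip: str
    src_port: int
    dst_port: int
    protocol: int   # e.g., 6 = TCP
    bytes: int
    packets: int
    # Deliberately absent: payloads, transaction timings, response codes --
    # the conversation details needed for forensics and root cause analysis
    # exist only in the packets themselves.

record = FlowRecord("10.0.0.5", "10.0.1.9", 55213, 443, 6, 18_042, 42)
print(record)  # identifies the talkers, says nothing about what they said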

An easy remedy for this issue is to add cloud-based monitoring data sensors (also called virtual taps) to your cloud network. These sensors replicate copies of the desired data packets and send them to your troubleshooting, security, and/or performance tools, giving those tools the data they need to perform their functions. One key factor, though, is that the data sensors need the ability to scale automatically: as cloud instances get spun up, the sensors’ capacity needs to scale with them.
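Commercial virtual taps vary by vendor, but as one illustration of packet replication in a public cloud, here is a minimal boto3 sketch using AWS VPC Traffic Mirroring to copy packets from an application instance to a monitoring tool (all resource IDs are hypothetical placeholders, and the mirror target and filter must be created beforehand):

import boto3

ec2 = boto3.client("ec2")

# Copy packets from an application instance's network interface to a
# monitoring tool.
session = ec2.create_traffic_mirror_session(
    NetworkInterfaceId="eni-0aaaaaaaaaaaaaaaa",    # source interface to tap
    TrafficMirrorTargetId="tmt-0bbbbbbbbbbbbbbbb", # where the copies are sent
    TrafficMirrorFilterId="tmf-0cccccccccccccccc", # which traffic to copy
    SessionNumber=1,
)
print(session["TrafficMirrorSession"]["TrafficMirrorSessionId"])

To meet the scaling requirement, session creation like this would need to be automated (for example, triggered whenever a new instance launches) so that freshly spun-up instances are tapped from their first packet.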

One common misconception is that everything in your physical network has a cloud equivalent. This is not the case. You are moving from an environment where you have full control to one where you have limited control. The situation is akin to moving from owning a house to renting one: you may still be living in a house, but you are now subject to someone else’s rules, and you pay them for the privilege. This often means smaller cost savings than originally planned.

Here is a summary of hybrid IT/cloud advantages and disadvantages:

Advantages: elastic capacity that can be spun up and down to right-size costs; reduced on-premises hardware footprint; existing on-premises security tools can still inspect high-risk data in a hybrid design.

Disadvantages: limited visibility into the network layer; no inline security tools in the public cloud; data egress charges; time-delayed, summarized monitoring data; less control than an owned environment.

What you need to thoroughly understand is what you are migrating and why. While this sounds simple, it represents a fundamental stumbling block for IT. Business operation is not just about spinning up apps as fast as possible. The cloud solution you choose to deploy, and how you choose to deploy it, will dictate your data visibility, how you can access that data, and your long-term costs, i.e., your total cost of ownership (TCO). Many gloss this over and make quick assumptions because they need to make decisions fast, yet this decision will make or break the project’s success.
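Even a back-of-the-envelope TCO model forces the right questions. The sketch below compares a three-year on-premises spend against a cloud spend that includes data egress; every figure is a made-up input to be replaced with your own:

YEARS = 3

def on_prem_tco(hardware, annual_maintenance, annual_ops):
    # One-time hardware purchase plus recurring support and staff costs.
    return hardware + YEARS * (annual_maintenance + annual_ops)

def cloud_tco(monthly_compute, monthly_storage, monthly_egress_gb,
              egress_price_per_gb=0.09):  # assumed rate; varies by provider
    # Recurring charges only, but egress fees scale with how much data
    # you pull back out (e.g., to on-premises monitoring tools).
    monthly = (monthly_compute + monthly_storage
               + monthly_egress_gb * egress_price_per_gb)
    return YEARS * 12 * monthly

print("On-prem:", on_prem_tco(hardware=250_000, annual_maintenance=30_000, annual_ops=60_000))
print("Cloud  :", cloud_tco(monthly_compute=9_000, monthly_storage=1_500, monthly_egress_gb=20_000))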

In the end, maybe a hybrid IT environment will give you the best of both worlds. Further information and specifics can be found in these resources: How to Avoid the Seven Deadly Sins of Hybrid IT Migration and Top Four Considerations When Migrating to Public Cloud.

Author - Keith Bromley, Sr. Manager, Solutions, Keysight

I am a senior product management and marketing professional with over 25 years of high-tech software and hardware experience, responsible for thought leadership, product management, and marketing activities for network monitoring, network security, VoIP, and unified communications (UC) for enterprise and carrier solutions, including system security and standards-compliance responsibilities. I am a subject matter expert on unified communications, diagnostics and monitoring, VoIP, SIP, system management, and optical, wireless, and wireline infrastructure and network monitoring. I have written over 60 industry whitepapers and given over 25 public presentations covering diagnostics and monitoring, network security, VoIP drivers, SIP, unified communications, and ROI and TCO for network solutions. Prior to Keysight/Ixia, I worked for several national and international telecommunications companies, including ShoreTel, MirGroup, NEC, Cisco Systems, Metro-Optix, DSC, and Ericsson. I hold a Bachelor of Science in Electrical Engineering.
