Microsegmentation: Security for Any Data Center

Microsegmentation is a relatively new concept in data security. Rather than using traditional firewalls that fence in a large section of the data center at the perimeter, it relies on small units that are isolated from each other and from everything else. It works in any data center, physical or virtual, although how you apply it may vary by data center type.

The concept behind microsegmentation is that each individual service, server, or database (physical or virtual infrastructure) running in a data center can be kept small and isolated. Each isolated unit of compute has one purpose, or a small number of purposes, and those purposes are all the unit cares about. Further, it only opens the specific connections to other machines that those purposes require. So, if a unit B's only purpose is to pass data from machine A to machine C over port 8080, then it has exactly one inbound rule (A to B on 8080) and one outbound rule (B to C on 8080). That unit only accepts traffic on those ports, only listens to or speaks with those specific machines, and is otherwise isolated. This limits risk: if a unit is compromised, the blast radius ends at its very limited, specific connections to other machines, rather than the entire data center being at risk via east-west escalation once an attacker is inside the perimeter. Microsegmentation effectively puts a perimeter around each individual unit.
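To make that concrete, here is a minimal sketch in Python of what unit B's policy amounts to. The unit names, the Flow structure, and the check_flow helper are illustrative, not taken from any particular product: a default-deny allowlist that contains exactly those two flows and nothing else.

```python
# Minimal sketch of a per-unit flow allowlist. Unit names, ports, and the
# check_flow helper are illustrative, not from any specific product.
from typing import NamedTuple

class Flow(NamedTuple):
    src: str   # source unit or address
    dst: str   # destination unit or address
    port: int  # TCP/UDP port

# Unit B's entire policy: one inbound flow and one outbound flow.
ALLOWED_FLOWS = {
    Flow(src="A", dst="B", port=8080),  # inbound: A -> B on 8080
    Flow(src="B", dst="C", port=8080),  # outbound: B -> C on 8080
}

def check_flow(src: str, dst: str, port: int) -> bool:
    """Default-deny: a connection is permitted only if it matches the allowlist."""
    return Flow(src, dst, port) in ALLOWED_FLOWS

# Anything else is dropped, so a compromise of B cannot reach the rest of the network.
assert check_flow("A", "B", 8080) is True
assert check_flow("B", "D", 443) is False
```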

The biggest benefit of microsegmentation is security. Isolating a machine from everything else sharply limits the risk of exposure by reducing attack surface. For example, say you have a unit that runs a website, collects email addresses for marketing, and has a small database. It connects to nothing else; it has no reason or need to communicate with anything else related to the company. Apart from the internet and the database where it stores the emails, this unit is completely isolated from the rest of the company's infrastructure. If someone pops the server, they get access to a list of email addresses that people freely gave to a marketing site, and nothing outside that isolated segment. It's not good, and it's still a problem, but they don't have access to every other server behind the firewall.
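A hypothetical segment definition for that marketing-site unit might look like the following. The names and structure are purely illustrative, but they capture the point: two rules, and no path to anything else.

```python
# Hypothetical segment definition for the marketing-site unit described above.
# Names, ports, and structure are illustrative only.
MARKETING_SITE_SEGMENT = {
    "unit": "marketing-web",
    "inbound": [
        {"src": "internet", "port": 443},       # visitors reach the site over HTTPS
    ],
    "outbound": [
        {"dst": "marketing-db", "port": 5432},  # writes signups to its own small database
    ],
    # No rules for anything else: the rest of the company's
    # infrastructure is unreachable from this unit by default.
}
```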

Another benefit of microsegmentation systems is a robust ability to monitor each unit for normal behavior. Anything abnormal generates an alert and is blocked by default, so any breach can be stopped quickly. Most microsegmentation technologies also have what's referred to as a listening mode, where the system learns the normal traffic patterns for each unit and helps you build a map of the network segments that are necessary and normal.
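As a rough illustration, listening mode boils down to something like the sketch below: learn which flows are normal for a unit, then alert on and block anything outside that baseline. The flow format and threshold are assumptions, not any vendor's actual implementation.

```python
# Rough sketch of "listening mode": learn the normal set of flows for a unit,
# then flag (and by default block) anything outside that baseline.
from collections import Counter

def learn_baseline(observed_flows, min_count=10):
    """Keep only flows seen often enough during listening mode to count as normal."""
    counts = Counter(observed_flows)          # observed_flows: iterable of (src, dst, port)
    return {flow for flow, n in counts.items() if n >= min_count}

def handle_flow(flow, baseline):
    """Default-deny anything outside the learned baseline and raise an alert."""
    if flow in baseline:
        return "allow"
    print(f"ALERT: unexpected flow {flow}, blocking")
    return "block"
```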

The main drawback of microsegmentation is that it can make development more difficult. It shuts things down when you're not using them, and as you work through the steps of microsegmenting you can accidentally cut off access to something you actually need. It also means moving more information and control over the network through APIs, and less through traditional, manual channels. That is a cloud-oriented mentality, and it should be happening anyway: businesses should be looking at driving operations through APIs and using more virtual, automated processes.

There are multiple approaches to microsegmentation. In the automated approach, segment mapping is done by monitoring traffic; in the planned approach, segment mapping is designed up front and created manually. There are appliance-driven products and SaaS solutions for both. The automated approach tends to work better, because plans need continual updates; a completely non-automated approach is almost moot, since the plan becomes stale immediately. While it would be more secure if all traffic were pre-planned, that creates a bottleneck in development, and you lose the continuous monitoring of what normal looks like that you get with the automated approach. So even if you plan out an approach, you will generally still want some automation to keep things up to date and to monitor.
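The automated approach, stripped to its essence, is something like the sketch below: derive each unit's segment map from observed traffic and re-derive it as traffic changes. The input format and function names are assumptions for illustration.

```python
# Sketch of the automated approach: derive a segment map from observed traffic
# instead of drawing it by hand. Input format and names are assumptions.
from collections import defaultdict

def build_segment_map(flow_log):
    """flow_log: iterable of (src_unit, dst_unit, port) tuples from monitoring."""
    segment_map = defaultdict(set)
    for src, dst, port in flow_log:
        segment_map[src].add((dst, port))   # each unit's observed outbound edges
    return segment_map

# Re-running this on fresh traffic keeps the map from going stale,
# which is the main weakness of a purely pen-and-paper plan.
```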

An appliance-driven approach works at the layer of actual networking between systems and processes. A software agent or SaaS approach, by contrast, places an agent on every unit you have, whether that unit is a container, a server, or a VM: whatever your smallest individual unit is. This gives you much more precision, and it is also portable, because each agent is responsible for its own unit. If you move a unit to another cloud provider or data center, the rules move with it, because they are enforced at the unit level instead of the network level.
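As one hedged example of what agent-style enforcement can look like on a Linux unit, the sketch below translates a unit's own allowlist into local iptables rules. The addresses and rule set are placeholders, and real products use their own enforcement mechanisms; the point is only that the policy lives on the unit itself.

```python
# Sketch of agent-style enforcement: the unit translates its own allowlist into
# local firewall rules (iptables shown here as one possibility), so the policy
# travels with the unit. Addresses and ports are placeholders.
import subprocess

INBOUND_ALLOW = [("10.0.0.5", 8080)]  # (source address, port) pairs this unit accepts

def apply_local_rules(rules):
    # Default-deny inbound traffic.
    subprocess.run(["iptables", "-P", "INPUT", "DROP"], check=True)
    # Let replies to this unit's own outbound connections back in.
    subprocess.run(["iptables", "-A", "INPUT", "-m", "conntrack",
                    "--ctstate", "ESTABLISHED,RELATED", "-j", "ACCEPT"], check=True)
    # Punch holes only for the explicitly allowed flows.
    for src, port in rules:
        subprocess.run(
            ["iptables", "-A", "INPUT", "-p", "tcp",
             "-s", src, "--dport", str(port), "-j", "ACCEPT"],
            check=True,
        )

# Because the rules are written on the unit itself, they follow it to a new
# data center or cloud provider without touching the surrounding network.
```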

There are disadvantages to the software agent approach. The agent generally has high-level access on each unit, and that access itself creates attack surface for would-be attackers. The agent can also add CPU, RAM, and network load to each unit. And the agent has to be installed on every unit.

The best way to get started with microsegmentation is to first figure out whether you can actually microsegment. There are network designs where microsegmentation will not function well, so it depends on how you are currently networked and how you operate. Even so, you should always be moving toward a microsegmentation strategy, even a manual one: reducing attack surface, reducing connections between systems by limiting the number of entry and exit points each unit has, and isolating and segmenting as much as possible.
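One practical way to begin that reduction is to inventory what each unit actually listens on and connects to today. The sketch below uses the psutil library for that audit; it assumes psutil is installed on the unit and may need elevated privileges to see every process.

```python
# Inventory a unit's current entry points (listening ports) and exit points
# (established outbound connections) before deciding what to restrict.
import psutil

def current_entry_exit_points():
    listening, outbound = set(), set()
    for conn in psutil.net_connections(kind="inet"):
        if conn.status == psutil.CONN_LISTEN and conn.laddr:
            listening.add(conn.laddr.port)                  # potential entry points
        elif conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            outbound.add((conn.raddr.ip, conn.raddr.port))  # current exit points
    return listening, outbound

ports, peers = current_entry_exit_points()
print("Listening ports:", sorted(ports))
print("Outbound peers:", sorted(peers))
```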

Whether you're a startup or a large enterprise, if you're just getting started you will likely need to bring in a vendor. An expert can help you map out and monitor what is going on in your network and what normal looks like. Once you've figured that out, they can help you determine how to segment your network and units.

If you're using a cloud provider, all of the infrastructure is already driven by commands and code, so you can take the appliance approach: spin up a new instance in the virtual network, let it monitor east-west traffic, and have it write new rules and modify the surrounding infrastructure. If you're in a physical data center, you're more limited to the agent approach. Installing an agent on each individual unit of compute can be daunting if you have to do it on 40,000 servers, so you may want to look for automation that can help with the installs.
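As a hedged illustration of the code-driven cloud case, if your units happen to be AWS EC2 instances, segment rules can be expressed as security group entries written from code. The group ID and CIDR range below are placeholders.

```python
# Hedged example of writing a segment rule as code in AWS: allow only the
# app tier (10.0.1.0/24) to reach this unit on port 8080. The security group
# ID and CIDR are placeholders.
import boto3

ec2 = boto3.client("ec2")

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",   # placeholder security group ID
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 8080,
        "ToPort": 8080,
        "IpRanges": [{"CidrIp": "10.0.1.0/24", "Description": "app tier only"}],
    }],
)
```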

Microsegmentation adds a layer of security to your data by limiting connections and access, thereby reducing the attack surface. It may not be suitable for every network today, but moving toward microsegmentation's capabilities is a step in the right direction.

About the Author

PWV Consultants is a boutique group of industry leaders and influencers from the digital tech, security and design industries that acts as a trusted technical partner for many Fortune 500 companies, high-visibility startups, universities, defense agencies, and NGOs. It was founded by 20-year software engineering veterans who have founded or co-founded several companies. PWV experts act as trusted advisors and mentors to numerous early-stage startups, and have held the titles of software and software security executive, consultant, and professor. PWV's expert consulting and advisory work spans several high-impact industries in finance, media, medical tech, and defense contracting. PWV's founding experts also authored the highly influential precursor HAZL (jADE) programming language.
