Thursday, December 19, 2019

Encryption & Visibility: A Q&A with Kurt Neumann, Part 2 of 2 – TLS 1.3, Deep Packet Inspection and Network Traffic Analysis

Kurt Neumann is a Security Architect for Cisco Threat Analytics and the Co-founder of the private cloud solutions company Antsle, Inc. He recently joined Cybersecurity Insiders as a panelist for our webinar The Importance of Network Traffic Analysis (NTA) for SOCs. Our discussion touched on the rise in encryption and the new Transport Layer Security protocol, TLS 1.3, and what it means for cybersecurity. It’s a topic that comes up often, so we invited Kurt back for an interview to explore the issue in more depth.

This is Part 2 of that interview. In Part 1, we discussed the new TLS 1.3 standard and its impact on conventional cybersecurity strategies and tools. If you missed Part 1, you can read it here.

Key Takeaways

Part 1:

  • We are heading toward full-encryption environments, including both defensive and malicious use of encryption
  • The new TLS 1.3 protocol will significantly strengthen data security and privacy
  • But TLS 1.3 will also take away some of the tools we’ve relied on for traffic visibility

Part 2:

  • There are some workarounds for these tools, and there are new strategies, like network traffic analysis with deep packet inspection (DPI), that can help retain visibility in encrypted environments
  • Adapting to expanded encryption and TLS 1.3 won’t be easy for cybersecurity specialists, but it is doable, and adaptation needs to start now

Part 2

Holger Schulze, Cybersecurity Insiders (HS): From our discussion so far, it looks like there are significant challenges to conventional defensive decryption strategies, regardless of whether they are in-band, out-of-band or endpoint-based.

Kurt Neumann (KN): Correct. If you’re an end user, this is good from a privacy standpoint. After all, one of IETF’s top goals with TLS 1.3 was to make sure transmitted information couldn’t be tampered with, forged, or read by anyone other than the sender and receiver.

But if you’re trying to protect a network, it’s a double-edged sword. What you gain in data security you lose in visibility, and it’s going to be hard to protect an environment because of that loss. Compensating for or working around this loss is possible, but the next couple of years are going to hurt. Changes will have to be made.

And the changes are going to have to be made fast. It looks like 1.3 is propagating faster than prior versions. Big players like Google, Microsoft, Mozilla, Apple and others have already implemented it, even if for now they’re being lenient with legacy 1.2 clients. That leniency won’t last long, and if you’re not prepared, you’re going to see a lot of refused connections.

HS: So Internet service and content providers are ahead of enterprises right now?

KN: For now, yes, but enterprises have also been implementing TLS 1.3 at a surprising pace. I remember an EMA survey back in January found that a majority of enterprises were either already in the process of enabling TLS 1.3 for internal traffic and/or inbound connections, or were planning to do so within the next six months. That matches what I’m seeing in the field.

I think it goes to show that, whatever the hurdles and visibility challenges, everyone knows their network either is compromised now or will be at some point, and anything that helps ensure threat actors get only garbled data for their efforts is worth the pain.

But there is one important tool that can minimize that pain, and that’s Network Traffic Analysis. NTA can provide visibility into encrypted traffic – without decryption – under TLS 1.3 or any prior version.

HS: Yes, I was struck by that during our webinar discussion. Can you go a little deeper on NTA and how it can help with visibility, especially in the context of TLS 1.3?

KN: Sure. NTA has a big role to play here because it doesn’t rely on decryption to spot threats. NTA uses machine learning to model network behavior. Then it uses the model to detect abnormal behaviors that indicate possible malicious activity.

It’s like someone spotting an abandoned package at an airport, or a surveillance camera catching someone prowling a bank’s perimeter. You don’t have to know what’s in the package or the prowler’s head to know you’ve got a possible threat that needs to be checked out. NTA can trigger that alert, and deliver the contextual data you need to assess it.
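To make the behavioral-modeling idea concrete, here is a minimal sketch of the kind of baseline-and-deviation logic an NTA engine applies to flow metadata. The flow records, feature choices, and 3-sigma threshold are all illustrative assumptions, not any vendor's actual algorithm; production systems use far richer models.

```python
import statistics

# Hypothetical flow records: (bytes_sent, packet_count, duration_seconds).
# In a real NTA deployment these would come from the telemetry pipeline.
BASELINE_FLOWS = [
    (1_200, 10, 0.4), (1_350, 12, 0.5), (1_100, 9, 0.3),
    (1_500, 14, 0.6), (1_250, 11, 0.4), (1_400, 13, 0.5),
]

def fit_baseline(flows):
    """Learn a per-feature mean/stdev model of 'normal' traffic."""
    columns = list(zip(*flows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in columns]

def is_anomalous(flow, model, threshold=3.0):
    """Flag a flow if any feature deviates more than `threshold` sigmas."""
    return any(
        abs(value - mean) / stdev > threshold
        for value, (mean, stdev) in zip(flow, model)
    )

model = fit_baseline(BASELINE_FLOWS)
print(is_anomalous((1_300, 11, 0.5), model))    # ordinary-looking flow: False
print(is_anomalous((250_000, 4, 90.0), model))  # e.g. slow exfiltration: True
```

Note that nothing here reads packet contents: the alert is triggered purely by how the traffic behaves, which is why the approach survives encryption.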

Interestingly, NTA derives this context from a technology usually paired with traditional inspection systems: deep packet inspection. In a classic inspection deployment, DPI is used to collect raw telemetry data, pass it through decryption, and then mine packet contents as well as packet characteristics and context.

If you’re using a really good DPI engine, this classification and contextual metadata alone can provide a solid foundation for machine-learning based threat profiling and behavioral analytics.
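One well-known example of classification metadata that survives TLS 1.3 is JA3-style client fingerprinting: the ClientHello is still sent in the clear, and hashing its advertised parameters identifies the client software without any decryption. The sketch below assumes the handshake fields have already been parsed by a DPI engine; the sample values are illustrative, not a real client's handshake.

```python
import hashlib

def ja3_fingerprint(version, ciphers, extensions, curves, point_formats):
    """Compute a JA3-style MD5 fingerprint from ClientHello metadata.

    JA3 joins five fields with commas (values within a field are
    dash-joined) and hashes the result. All inputs come from the
    unencrypted ClientHello -- no decryption is involved.
    """
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(fields).encode("ascii")).hexdigest()

# Illustrative parameter values only.
fp = ja3_fingerprint(771, [4865, 4866, 49195], [0, 11, 10], [29, 23], [0])
print(fp)  # a stable 32-hex-character fingerprint for this client profile
```

Because the fingerprint is stable per client implementation, it can be matched against known-malware profiles even when every byte of the payload is encrypted.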

HS: So with TLS 1.3 and the general rise in encryption, do you think DPI-based NTA is now a must-have for everyone?

KN: I don’t know about that, but I think it’s definitely something everyone should at least take a look at. I’ve heard some people respond to TLS 1.3 by saying “I have endpoint defenses that can show me IP and ports, so I’m good.”

But endpoint security covers managed devices like desktops, and those make up maybe 50-60% of a current network, often less! There’s a huge range of other devices that can be infected, as our DC camera hijackers remind us. Devices and network edges just aren’t fixed anymore. So focusing on a network strategy just makes sense.

For example, I was recently working on a project for a very large bank. Like many other large organizations, they are so big they don’t even know their own network and devices. And they have a lot of proprietary protocols.

So, they have this core need to know what’s going on from point A to point B, but they have a huge mapping and scaling challenge. Employing a network strategy as a complement to endpoint tools was a necessity.

But traditional header-level monitoring or network flow monitoring is not enough. If you can’t see the payload, you can’t see the API call, you can’t see the data in motion. You can’t see the malware in motion.

With DPI-based NTA, you can regain visibility into encrypted and evasive traffic. You can use traffic patterns to identify apps. You can use statistical models and machine learning to detect complex protocols, like RC4-encrypted BitTorrent. You can use domain-fronting detection and classification to make evasive traffic visible. You can classify traffic-spoofing apps and detect tunneling traffic and mining pool traffic. The list goes on and on.
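As one small illustration of pattern-based detection, automated traffic such as C2 beacons or mining-pool keepalives tends to have far more regular timing than human-driven browsing. The heuristic below is a simplified sketch under assumed thresholds (`max_cv`, `min_events` are tunable guesses, not industry standards), not a production detector.

```python
import statistics

def looks_like_beaconing(timestamps, max_cv=0.1, min_events=5):
    """Heuristic: near-constant inter-arrival times suggest automated
    check-ins (C2 beacons, mining-pool keepalives) rather than a human.

    `max_cv` is the maximum coefficient of variation (stdev / mean) of
    the gaps between connection events -- an assumed tunable cutoff.
    """
    if len(timestamps) < min_events:
        return False
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_gap = statistics.mean(gaps)
    if mean_gap <= 0:
        return False
    return statistics.stdev(gaps) / mean_gap < max_cv

# A bot phoning home every ~60 seconds vs. bursty human browsing.
print(looks_like_beaconing([0, 60.1, 119.9, 180.2, 240.0, 299.8]))  # True
print(looks_like_beaconing([0, 2.1, 2.4, 95.0, 96.2, 400.0]))       # False
```

Again, only connection timestamps are consulted; the detection works identically whether the payload is TLS 1.2, TLS 1.3, or a proprietary protocol.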

So, yes, I think NTA and DPI are indispensable for the encrypted age.

But you know, I think we’d all do well, too, to keep in mind the young scammer couple who surveilled the streets of DC from their Bucharest apartment. Their story is a reminder that IoT traffic is among the most at-risk today, and every visibility strategy has to address it.

And even more importantly, it’s a reminder that 95% of major recent breaches started with a spear phishing email. In the end, hammering away at employee training on how to spot and handle suspicious emails may do more to keep us safe than the most advanced cybersecurity stack.

HS: True indeed. This has been very informative. Thank you very much for taking the time to discuss these encryption issues with us.

This concludes Part 2 of the interview. Again, if you missed Part 1, you can read it here.

 
