My traffic is encrypted, NDR won’t see anything… Wanna bet?
In a world of constant and increasingly sophisticated cyber-attacks, encrypting data flows, whether sensitive or not, has become essential. Encryption secures those flows by transforming plaintext into ciphertext, preventing the data from being read by anyone but its intended recipient.
There is a common misconception that traffic encryption defeats detection, because encrypted flows are so complex to evaluate. Gatewatcher’s new Beacon Hunter technology addresses this challenge.
Read this article or watch this 15-minute webinar to learn how to reveal the invisible in your network traffic.
What decryption capabilities are required to deal with the increasing amount of encrypted traffic?
Since 2018, we have seen a significant rise in the encryption of network flows on Windows, Android, Chrome, Linux, and Mac, leading many to assume the end of network analysis.
Indeed, with the latest version of encryption, TLS 1.3, it looked like that end was coming, as many experts predicted. The reality was quite different: TLS 1.3 joined the many existing encrypted protocols (HTTPS, DNS, VoIP, etc.) and strengthened the pool of available encryption tools, without putting the challenges of network analysis behind us.
So, with encryption growing ever stronger, is there actually any need to decrypt network traffic to detect attacks?
To answer this question, let’s look at the concrete needs of the corporate world, which differ from those of the private sphere. For individuals, the challenge of encryption is to keep their activities confidential: maintaining private browsing, for example, or securing data during an online purchase. A company has the opposite need, because opacity is not an option: it must keep control over all activity on its network. It therefore operates quite differently from the private sphere: DoH is forbidden and DNS requests are monitored, for example, so that their history can be controlled and stored. In addition, as norms and laws evolve, companies’ need to monitor network activity keeps growing. A minimal illustration of such a policy follows.
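In practice, forbidding DoH often starts with flagging TLS connections whose Server Name Indication (SNI) points at a well-known public DoH resolver, so DNS resolution stays on the monitored corporate server. The sketch below is a hypothetical heuristic, not a product feature; the resolver list is a small sample and the function name is invented for illustration.

```python
# Minimal, illustrative heuristic: flag TLS connections to well-known public
# DoH resolvers so DNS resolution stays on the monitored corporate server.
# The resolver list is a small sample, not an exhaustive blocklist.
KNOWN_DOH_HOSTS = {
    "dns.google",          # Google Public DNS over HTTPS
    "cloudflare-dns.com",  # Cloudflare 1.1.1.1
    "dns.quad9.net",       # Quad9
}

def is_doh_candidate(tls_sni: str | None, dst_port: int) -> bool:
    """Heuristic: HTTPS (port 443) whose SNI names a known DoH resolver."""
    return dst_port == 443 and tls_sni in KNOWN_DOH_HOSTS

print(is_doh_candidate("cloudflare-dns.com", 443))  # True -> alert or block
print(is_doh_candidate("example.com", 443))         # False
```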
From this observation we can conclude that decrypting network traffic is not necessarily essential, even if it would be preferable from a detection standpoint. Keep in mind, however, that decryption takes different forms depending on the sphere in which it is applied (private or business). Moreover, the approach to decrypting and analysing the information will differ, depending on the capabilities of the protocols in use.
Can we, then, decrypt everything? A large number of technologies are in fact involved in decryption: Network Packet Broker (NPB), SSL TAP, agent/EDR, WAF, load balancer, firewall, proxy…
However, each of them has its own specificities and a variable effectiveness depending on the environment in which it is deployed. A proxy, for example, will intercept some requests but will in some cases be unable to decrypt them because TLS 1.3 is in use. It all depends on where it is positioned.
Regardless of the technologies used, there is no guarantee that 100% of the content will be decrypted, especially as some vendors use techniques, such as certificate pinning, that prevent the interception of certificates. And even in an ideal world where everything were decryptable, this would be of little help in remediating a large-scale attack, because the content of the data (the payload) would itself remain encrypted by the attacker.
Furthermore, data encryption is not necessarily a dead end: other resources are available today for analysing the network and detecting attacks without the content ever being fully decrypted.
Metadata, a new asset in the detection of encrypted flows
Another key parameter must be taken into account alongside any decryption strategy: metadata. But what exactly are we talking about, and how does it affect detection?
In an encrypted flow, metadata refers to all the information exchanged around the payload, without the payload itself ever being made accessible. It feeds the statistical analysis and machine-learning algorithms, which are always in search of additional data to refine their results.
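To make this concrete, here is a minimal sketch of the kind of flow record an NDR sensor can build without touching the payload. The field names are illustrative assumptions, not Gatewatcher’s actual schema.

```python
# Illustrative flow-metadata record: everything here is observable on the
# wire without decrypting the payload. Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class FlowMetadata:
    src_ip: str
    dst_ip: str
    dst_port: int
    tls_sni: str | None          # ClientHello server name; usually still
                                 # visible in TLS 1.3 when ECH is not used
    bytes_sent: int = 0
    bytes_received: int = 0
    packet_times: list[float] = field(default_factory=list)  # timestamps

    @property
    def duration(self) -> float:
        """Flow duration in seconds, one of many payload-free features."""
        if len(self.packet_times) < 2:
            return 0.0
        return max(self.packet_times) - min(self.packet_times)
```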
Let’s imagine we are in the French series “The Bureau”: the intelligence service is tapping the phones of a group of people. The group, however, uses encrypted communications that do not allow the service to understand what is being said. Yet the data headers and metadata already reveal a certain amount of information, without any access to the payload.
Among this information, it is possible to know:
- The usernames, and therefore the identities of the potential participants in the group
- The number of users
- The number of calls and the peaks in calling activity
- The duration of the calls
- The volume of data exchanged
We may lose the data, but we gain metadata, which feeds directly into threat intelligence. By studying the statistics and the temporal distribution of call peaks with machine learning and artificial intelligence, we can capture essential information about the participants’ conversations.
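As a purely hypothetical illustration (the records and names below are invented), a few lines of analysis over such call metadata already recover the group’s size, its call volume and its peaks of activity:

```python
# Toy call-metadata records: (caller, callee, start hour, duration in seconds).
# No payload is needed for any of the statistics computed below.
from collections import Counter

calls = [
    ("alpha", "bravo",   21, 240),
    ("alpha", "charlie", 21, 180),
    ("bravo", "charlie", 22, 600),
    ("alpha", "bravo",   21, 120),
]

participants = {p for c in calls for p in c[:2]}        # who talks to whom
per_hour = Counter(start for _, _, start, _ in calls)   # activity histogram
peak_hour, peak_count = per_hour.most_common(1)[0]

print(f"{len(participants)} participants, {len(calls)} calls")
print(f"peak activity at {peak_hour}:00 ({peak_count} calls)")
print(f"total airtime: {sum(d for *_, d in calls)} s")
```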
A rich compendium of information, metadata is thus ideally suited to the investigation and analysis of network flows.
Beacon Hunter – Gatewatcher’s innovative approach to fighting targeted threats
When it comes to detecting targeted APT attacks, the main challenge is to identify the framework being used. But beware of false positives! At this stage, only human analysis, which is very costly in time and resources, is capable of weeding them out. A tool that performed these tasks automatically would therefore mark a real step forward in the ability to detect targeted threats.
C&C (Command & Control) servers are increasingly used today and enable some of the most damaging attacks, especially when driven by frameworks such as Cobalt Strike or Brute Ratel. A hacker starts by infecting a computer, via a phishing email, a vulnerability in a browser plugin, previously infected software, etc. The implanted beacon then imitates legitimate traffic (by modifying its TLS headers) to hide from current detection capabilities while executing the commands of the C&C server. This enables the attacker to install additional software, use obfuscation [1], or encrypt and over-encrypt the exchanged data. The offender now has complete control over the victim’s computer and can execute any malicious code, which may spread to other machines. In this way, a cybercriminal can end up taking full control of the company’s network.
Gatewatcher’s new detection engine, Beacon Hunter, analyses this metadata to detect specific statistical distributions. It identifies connections to C&C servers even under encryption and obfuscation, and detects command execution so that future events can be anticipated, again without requiring access to the payload.
It also detects suspicious periodic connections, as well as the randomness a hacker may introduce to mask them. Concretely, rather than connecting every 60 seconds, the attacker will connect every 10 to 20 seconds, creating a shifting rhythm of communication that is harder to spot on the network. Simple human analysis is no longer enough, and this is where the integration of the Beacon Hunter engine into our NDR platform comes into its own: it becomes possible to have a global view of the periodic (or non-periodic) connections within the network. The sketch below illustrates the underlying idea.
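As a minimal sketch of that idea (not Gatewatcher’s actual algorithm), one payload-free signal is the dispersion of inter-arrival times between a host’s outbound connections: even a jittered 10–20 second beacon keeps its intervals far more regular than ordinary user traffic. The function name and scoring below are illustrative assumptions.

```python
# Illustrative beaconing heuristic: score how regular a host's outbound
# connection intervals are. A fixed 60 s sleep gives near-zero dispersion;
# a jittered 10-20 s sleep is still far more regular than human traffic.
import random
from statistics import mean, stdev

def beacon_score(timestamps: list[float]) -> float:
    """Return a 0..1 score; higher means more beacon-like.

    Uses the coefficient of variation (CV) of inter-arrival times:
    CV near 0 -> metronome-like beacon, CV >= 1 -> irregular traffic.
    """
    if len(timestamps) < 5:                    # too few connections to judge
        return 0.0
    ts = sorted(timestamps)
    deltas = [b - a for a, b in zip(ts, ts[1:])]
    mu = mean(deltas)
    if mu <= 0:
        return 0.0
    return max(0.0, 1.0 - stdev(deltas) / mu)  # 1 - CV, clamped at 0

# A jittered beacon (uniform 10-20 s sleeps) still scores high (~0.8).
t, beacon = 0.0, []
for _ in range(50):
    t += random.uniform(10, 20)
    beacon.append(t)
print(f"jittered beacon score: {beacon_score(beacon):.2f}")
```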
Whatever the configuration, the options enabled (obfuscation, encryption) and the Cobalt Strike or Brute Ratel version, existing or upcoming, Beacon Hunter and AIONIQ allow Gatewatcher to detect these threats without generating false positives. Beacon Hunter and AIONIQ are thus the most effective solution for detecting cyber threats on the network in any type of context, from large corporations to small and medium-sized businesses.
[1] Evading statistical methods by playing with randomness, e.g. adding random data to defeat certain tools present on the endpoint.
Author: Philippe Gillet, CTO Gatewatcher