Network telemetry provides security analysts with visibility into the actors present on a network, allowing better tracking of adversaries as they enter and move through the network.
To be effective, the telemetry needs to be collected from edge devices into data stores where it can be correlated at scale with other signals.
This blog shows how to leverage Logstash running on Ubuntu Linux to stream NetFlow/IPFIX telemetry to Azure Sentinel for SIEM+SOAR and to Azure Data Explorer (Kusto) for long-term storage.
Install Logstash
Start by installing Logstash:
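The exact commands depend on the Elastic release in use; a typical Ubuntu install from the Elastic APT repository looks like the following (the 8.x repository shown here is an assumption — substitute the version you need):

```shell
# Add Elastic's signing key and APT repository (8.x assumed here)
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic.list

# Install Logstash
sudo apt-get update && sudo apt-get install logstash
```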
Verify the installation by writing a few log lines to the console:
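A quick way to do this is an inline pipeline that echoes stdin back to stdout:

```shell
# Type a few lines; each one should come back as a structured event.
# Stop with Ctrl-C.
sudo /usr/share/logstash/bin/logstash -e 'input { stdin { } } output { stdout { } }'
```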
Send logs to Azure Sentinel
Install the Azure Log Analytics plugin:
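Microsoft distributes a Logstash output plugin for Log Analytics; it installs with Logstash's plugin manager:

```shell
sudo /usr/share/logstash/bin/logstash-plugin install microsoft-logstash-output-azure-loganalytics
```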
Store the Log Analytics workspace key in the Logstash key store. The workspace key can be found in Azure Portal under Azure Sentinel > Settings > Workspace settings > Agents management > Primary key. While there, also write down the Workspace ID (workspace_id below).
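With an APT install, the key store lives under the /etc/logstash settings directory. The secret name WorkspaceKey below is an arbitrary label of our choosing:

```shell
# Create the key store if it doesn't exist yet
sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create

# Add the workspace key under the name WorkspaceKey
sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add WorkspaceKey
```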
The command prompts for the key.
Create the configuration file /etc/logstash/generator-to-sentinel.conf:
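A minimal pipeline along these lines matches the behavior described below (the generator/stdin inputs and the output options are a sketch based on the plugin's documented settings; Log Analytics appends the _CL suffix to the table name itself):

```conf
input {
  # Emit 10 rows at startup...
  generator {
    count => 10
    message => "Hello from Logstash"
  }
  # ...then forward anything typed on the console
  stdin { }
}
output {
  microsoft-logstash-output-azure-loganalytics {
    workspace_id => "<workspace_id>"
    workspace_key => "${WorkspaceKey}"    # resolved from the Logstash key store
    custom_log_table_name => "TestLogstash"
  }
}
```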
This will create a table called TestLogstash_CL in Azure Sentinel.
Run the pipeline:
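For example:

```shell
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/generator-to-sentinel.conf
```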
The pipeline starts by generating 10 rows and then waits for user input; each line typed on the console is sent as an additional row.
Send logs to Azure Data Explorer (Kusto)
Install the Azure Data Explorer plugin:
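The plugin is published as logstash-output-kusto and installs the same way as the Log Analytics one:

```shell
sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-kusto
```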
Create an AAD application in Azure Portal under Azure Active Directory > App registrations > New registration. Write down the Application ID (app_id below) and Tenant ID (app_tenant below).
Create a new key for that app under Certificates & secrets > Client secrets > New client secret and store it in the Logstash key store:
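As before, the secret name AppKey is an arbitrary label of our choosing:

```shell
sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add AppKey
```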
The command prompts for the key.
In Kusto, grant the AAD app ingestor access and initialize the table:
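The control commands look along these lines (MyDatabase, TestLogstash, the column set, and the mapping name TestMapping are placeholders for illustration):

```kusto
// Grant the AAD app permission to ingest into the database
.add database MyDatabase ingestors ('aadapp=<app_id>;<app_tenant>')

// Create the destination table
.create table TestLogstash (timestamp: datetime, message: string, host: string)

// Create a JSON ingestion mapping from Logstash event fields to columns
.create table TestLogstash ingestion json mapping 'TestMapping'
    '[{"column":"timestamp","Properties":{"Path":"$.@timestamp"}},{"column":"message","Properties":{"Path":"$.message"}},{"column":"host","Properties":{"Path":"$.host"}}]'
```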
Create the configuration file /etc/logstash/generator-to-kusto.conf:
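A sketch based on the plugin's documented settings — the cluster URL, database, table, and mapping names are placeholders, and path points at local staging files the plugin batches before ingestion:

```conf
input {
  generator {
    count => 10
    message => "Hello from Logstash"
  }
  stdin { }
}
output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"   # local staging files
    ingest_url => "https://ingest-<cluster>.<region>.kusto.windows.net"
    app_id => "<app_id>"
    app_key => "${AppKey}"        # resolved from the Logstash key store
    app_tenant => "<app_tenant>"
    database => "MyDatabase"
    table => "TestLogstash"
    json_mapping => "TestMapping"
  }
}
```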
Run the pipeline:
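For example:

```shell
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/generator-to-kusto.conf
```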
Receive logs from Netflow
Netflow logs are collected by Filebeat, which forwards them to Logstash. For simplicity, the output of Logstash is written to the console below; it can be replaced with the Log Analytics and Kusto outputs as needed.
Setup Filebeat
Install Filebeat:
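Filebeat ships from the same Elastic APT repository added earlier for Logstash:

```shell
sudo apt-get install filebeat
```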
Have Filebeat listen for NetFlow UDP traffic on localhost:2055:
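Assuming Filebeat's NetFlow module is used, enable it with `sudo filebeat modules enable netflow`, then point it at the desired host and port in /etc/filebeat/modules.d/netflow.yml:

```yaml
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: localhost
      netflow_port: 2055
```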
Redirect the output of Filebeat from ElasticSearch to Logstash. In /etc/filebeat/filebeat.yml:
Comment out the section output.elasticsearch
Uncomment the section output.logstash
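After the two changes, the output section of /etc/filebeat/filebeat.yml looks like this (5044 is the conventional Beats port, which the Logstash configuration below must match):

```yaml
# output.elasticsearch:
#   hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]
```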
Run Logstash
Create the configuration file /etc/logstash/filebeat-to-stdout.conf:
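A minimal sketch: listen for Beats traffic on port 5044 and echo every event to the console:

```conf
input {
  beats {
    port => 5044
  }
}
output {
  stdout { }
}
```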
Run Logstash:
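For example:

```shell
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/filebeat-to-stdout.conf
```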
Run Filebeat
In another terminal, run Filebeat:
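The -e flag keeps Filebeat in the foreground and logs to stderr, which is convenient for testing:

```shell
sudo filebeat -e
```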
Generate mock NetFlow traffic
For quick testing, nflow-generator can be used to generate local NetFlow traffic.
In a third terminal, install and run nflow-generator:
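One way to do this, assuming the nerdalert/nflow-generator project on GitHub and a local Go toolchain (a prebuilt Docker image is also available):

```shell
# Build from source
git clone https://github.com/nerdalert/nflow-generator.git
cd nflow-generator
go build

# Send mock NetFlow traffic to the local collector on UDP 2055
./nflow-generator -t localhost -p 2055
```

Logstash should start printing decoded NetFlow events to the console shortly afterwards.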
Beyond Netflow
Logstash and its companion Filebeat are not limited to forwarding Netflow/IPFIX: they also support a wide variety of other inputs relevant to network security, including Zeek, Suricata, and Snort. See the Filebeat modules documentation for details on how to configure them.