Routing multiple syslog types to different pipelines and destinations

What is the best way to handle multiple syslog message types that come in but need to be sent to different destinations and formatted differently per destination? Right now I am using Logstash: on the input side I run a mutate to add a tag to the incoming message, and later I look for that tag to determine where to send it. On the output side I have different Logstash instances that read, format, and send to each destination.

Here is the example scenario. You have devices sending via syslog on port 514 (F5s, Cisco ASAs, Cisco routers, Cisco switches, storage gear, Palo Altos, etc.). All the syslog messages are different, so we look at the raw message and build a mutate based on it, e.g. matching %ASA for Cisco firewalls. Those messages then need to be sent to different destinations, let's say ELK and a monitoring platform. The ELK messages need to be converted to JSON key/value pairs, ideally using the ECS schema. The monitoring platform needs the messages in CEF format. This is done per device group (i.e., F5s, ASAs, etc.), and even then we may need to filter or add fields based on the traffic after identifying which device the message came from.
Palo Alto logs, for example: once you identify the message as coming from a Palo Alto device, it is further divided into message types like Malware, URL, and System, which have different formats and fields.
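
For example, the same ASA deny event would need to go out in two shapes, roughly like this (values and field names are just illustrative; the ECS names come from the public ECS schema):

to elk (ECS JSON): {"event":{"code":"106023"},"source":{"ip":"10.1.1.1"},"destination":{"ip":"8.8.8.8"}}
to monitor (CEF): CEF:0|Cisco|ASA|9.x|106023|Deny|6|src=10.1.1.1 dst=8.8.8.8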

Yep, the Routing → Data Routes page is where you want to go. You can set up routes based on any part of the individual event data.

Sample: (screenshot of a Data Routes setup, with one route per device type, e.g. a route filtering Cisco ASA events to a cisco_asa pipeline)
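
The route filters are just JavaScript expressions evaluated against each event, along the lines of (match strings illustrative):

cisco_asa route: _raw.includes('%ASA-')
palo_alto route: _raw.includes('Palo Alto Networks')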

From that example, would the cisco_asa route then have two pipelines or something to send to the different destinations in different formats? Also, I assume below that route I would have filters for Palo traffic, Palo system, Palo URL, F5, etc.? Then would there be pipelines like ASA-ELK, ASA-monitor, F5-ELK, F5-monitor, etc. for the different output formats?

When it's all in place, would I have something like this:

asa_syslog > cisco-asa-json > elk
asa_syslog > cisco-asa-cef > monitor
palo_traffic_syslog > palo_traffic-json > elk
palo_traffic_syslog > palo_traffic-cef > monitor
palo_malware_syslog > palo_malware-json > elk
palo_malware_syslog > palo_malware-cef > monitor
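
I assume the filters behind those routes would be something like this (just guessing at the match strings; PAN-OS carries the log type as a field in the comma-separated message):

asa_syslog: _raw.includes('%ASA-')
palo_traffic_syslog: _raw.includes(',TRAFFIC,')
palo_malware_syslog: _raw.includes(',THREAT,')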

I was thinking, is there a way to have a single asa_syslog (or per device type) route that identifies the device and then sends the message to another pipeline that parses the messages, breaks out the different events, and sends them to the different outputs? Something like below.

palo_syslog > palo_malware-json --\
            > palo_url-json ------+--> elk
            > palo_system-json --/
            > palo_malware-cef --\
            > palo_url-cef -------+--> monitor
            > palo_system-cef ---/

Taking the asa_syslog as an example: 2 routes, each with a filter as above. The 1st route is NOT set to Final; the 2nd route IS set to Final. The first would send to the JSON pipeline and on to ELK. The second would send to the CEF pipeline and on to the monitoring platform.
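
Something like this in the routes table (names illustrative):

Route 1: filter: _raw.includes('%ASA-') | pipeline: cisco-asa-json | destination: elk | Final: No
Route 2: filter: _raw.includes('%ASA-') | pipeline: cisco-asa-cef | destination: monitor | Final: Yes

Because route 1 is not Final, each matching ASA event is cloned and continues down to route 2, so both destinations get a copy.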

Would agree with Jon :slight_smile: and add that you have to create different pipelines for each route to shape your data. Or, if you want general formatting for specific destinations, you can use a post-processing pipeline. :goat:
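
E.g. (a sketch; check the exact function names in your Stream version): attach a post-processing pipeline to each destination, so the per-route pipelines only normalize fields and the destination-specific formatting happens once, on the way out:

monitor destination > Post-Processing Pipeline: cef_out (a single CEF Serializer function)
elk destination > Post-Processing Pipeline: ecs_out (field renames to ECS)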

In my opinion Packs are the best way. I would suggest you download the Pack for Palo Alto and study how it separates the messages. Using a Pack is the most elegant way to handle this: you don't end up with so many routes under the Routing tab, because most of them live inside the Pack. The same method used in the PA Pack can be applied to anything else, even sources where there is no Pack; it's easy to follow the example and build your own.
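
E.g. a route can point at a Pack instead of a plain pipeline (the Pack id below is illustrative; check the exact name in the Pack Dispensary):

palo_syslog route: filter: _raw.includes('Palo Alto Networks') | pipeline: pack:PaloAltoNetworks | destination: elk

All the Malware/URL/System separation then happens on the Pack's internal routes.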

Agree with Jon - we take a slightly different approach (more similar to a Splunk input) where we define a list of hosts in a CSV file and use it for a lookup:
__inputId.startsWith('syslog:in_syslog:') && C.Lookup('syslog-nginx_hosts.csv', 'host').match(host)
From there you can send it to the pipeline or Pack of your choice. To send duplicate feeds, copy the route and create a new one below it, as Jon said, and leave the Final toggle off on everything but the last route.
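
The lookup file itself is trivial, e.g. (hostnames illustrative):

syslog-nginx_hosts.csv:
host
nginx-01.example.com
nginx-02.example.com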

brandonf,

That sounds interesting. In the CSV do you also define the pipeline to use per host, and read that as well to set the pipeline destination?

I am not sure it's possible to send to a pipeline based on the CSV, though. Perhaps there is a filter expression that can do that?