Is it possible to route events to S3 (via a tee or something) and then process them?

Is it possible to stream logs and events to Cribl Stream, send a copy of that event to S3 (or similar) for archive, and then process/parse/enrich the logs before sending them to another destination? The use cases I’m thinking of are as follows:

  • (Primary Use Case) If a security incident occurs and we need to present evidence in court, how do we protect the forensic integrity of an event/log/etc. if the event is being changed in flight before going to an archive or analysis tool?
  • If we needed to replay a series of logs for some reason or another, it might be helpful to do so from the source events, unmodified by Cribl.
  • The downstream, long-term storage solution may also serve as a repository of data for other teams to use. Providing unmodified events for those teams may be preferred in that case.


2 UpGoats

This is very possible. When you send data through Routes, you can flag each Route as Final or not (by default, the Final flag is set to Yes). When set to Yes, the event is consumed by that Route. If you turn off the Final flag, the event is passed down through the rest of the Routes to be evaluated.

Here is how I would set this up:

  • 1st Route, with an identifying filter, sends the data to your archive destination using the Passthru pipeline (or any other pipeline that does not modify the event). Ensure the Final flag is set to No.
  • 2nd Route, with the same identifying filter, sends the data to your analysis destination using the pipeline/Pack that manipulates the data as you need it for your analysis, detections, etc. Ensure the Final flag is set to Yes so the event is consumed.
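As a rough illustration, the two Routes above might look something like this in Cribl Stream's routes config. This is only a sketch: the destination names (`s3_archive`, `siem_analysis`), the filter expression, and the pipeline names are placeholders, and the exact YAML schema can vary by Cribl version, so treat the UI as the source of truth.

```yaml
# Illustrative sketch of a Routes config (field names/placeholders are assumptions)
routes:
  - name: archive_copy
    filter: sourcetype=='firewall'   # placeholder filter expression
    pipeline: passthru               # no event modification
    output: s3_archive               # placeholder S3 destination
    final: false                     # let the event continue to later Routes
  - name: analysis_copy
    filter: sourcetype=='firewall'   # same identifying filter
    pipeline: firewall_enrich        # placeholder pipeline/Pack that parses/enriches
    output: siem_analysis            # placeholder analysis destination
    final: true                      # consume the event here
```

The key design point is the ordering plus the `final` flag: the archive Route fires first and deliberately does not consume the event, so the analysis Route still sees it.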

With this setup, an unaltered copy of the event goes to your archive destination, while the processed version goes to your analysis destination.

3 UpGoats