Data flow between stages

I need some input.

We want to have a separate data flow between different stages within Cribl.
We receive data in our prod environment and want an extra data flow to the QS stage, to test parsing changes etc.
One way we know of is to configure a second output within the “Output Router”.
Are there any other ways? Maybe within Routes or Pipelines?

Thank you for your input

If you are looking to have data come into Cribl and then send it out in multiple formats to various destinations, you are most likely looking for a post-processing Pipeline. That way you can work on the data in a consistent way and then format it appropriately for each downstream system.
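Conceptually, the flow looks like this (not Cribl config, just a sketch; the function names and event shape below are invented for illustration):

```javascript
// Conceptual sketch: one shared pipeline does the consistent work,
// then a per-destination post-processing step handles formatting.
// All names and the event shape are hypothetical.
const sharedPipeline = (event) => ({ ...event, parsed: true }); // consistent processing

const formatForDestA = (event) => JSON.stringify(event);                 // e.g. JSON sink
const formatForDestB = (event) => `${event.time},${event.message}`;      // e.g. CSV sink

const event = { time: 1700000000, message: 'hello' };
const processed = sharedPipeline(event);

const toDestA = formatForDestA(processed);
const toDestB = formatForDestB(processed);
console.log(toDestA, '|', toDestB);
```

The point is that parsing happens once, and only the final formatting differs per destination.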

Also check out the recent blog post:


Thank you,
Is there any way to configure it via Pipelines only, without destinations?
Ideally we could deploy it without any restart of the workers.

You could configure two (or more) Routes with the same filter. The first route(s) should have the Final flag disabled; the last route should have the flag set to Final. This way you can copy the same data stream to multiple destinations without having to change the destinations frequently.
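A hypothetical sketch of what that Routes configuration could look like (all route, pipeline, and output names are made up; field names follow Cribl's routes layout as I understand it):

```yaml
# Sketch only - two routes share the same filter expression.
routes:
  - name: copy_to_qs            # non-final: the event stream continues
    filter: sourcetype=='web'
    pipeline: passthru
    output: qs_stage_output     # extra destination toward the QS stage
    final: false
  - name: prod_main             # final: normal prod delivery stops here
    filter: sourcetype=='web'
    pipeline: prod_parsing
    output: prod_output
    final: true
```

Because only the first route's Final flag is off, every matching event is delivered to both outputs.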

A trick here is to also append ... && Math.random() < 0.1 to your route filter to sample the events that match and get forwarded to the destination. You could also look into the Sample, Dynamic Sampling, Suppression, and Drop Functions if you want to reduce the volume in the pipeline.
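Since a route filter is just a JS expression evaluated per event, the sampling trick can be demonstrated outside Cribl; here is a small sketch (the base filter and event shape are invented):

```javascript
// Hypothetical base filter, standing in for whatever your route matches on.
const matchesFilter = (event) => event.sourcetype === 'web';

// Same idea as a route filter of: sourcetype=='web' && Math.random() < 0.1
const sampledFilter = (event) => matchesFilter(event) && Math.random() < 0.1;

// Simulate 100k matching events and count how many the route would take.
const total = 100000;
let forwarded = 0;
for (let i = 0; i < total; i++) {
  if (sampledFilter({ sourcetype: 'web' })) forwarded++;
}
console.log(`forwarded ~${(100 * forwarded / total).toFixed(1)}% of matching events`);
```

Roughly 10% of matching events pass the filter, so you get a representative trickle into the QS stage instead of the full prod volume.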

Thank you for your thoughts and inputs.
We’ll test the last one from @bdalpe, which seems to be the best way for us.