I thought I could use concurrent_bounded_queue to pass data between pipeline filters and to limit the number of products generated. It seems to be a bad idea, because the pipeline deadlocks, and I am not sure why.
I would expect a pop on the queue passed to the next filter to wake up the thread waiting on the push (I limit the queue's capacity), but this never seems to happen.
My motivation for using the queue is twofold: first, I pass around objects from a third-party API that may be shared_ptrs; second, it allows me to rely on RAII instead of matching every new with a delete.
I have attached an example implementation consisting of three filters (an input filter, a transform filter, and an output filter).
I hope the problem lies only in my execution of the idea of combining concurrent_bounded_queue and pipeline.