Is there a way to have a model provide output statistics on how long it takes a dispatcher to successfully send a task sequence to an available operator? Even more useful would be pulling out the average wait time before a task sequence was sent to an available operator, broken down by the dispatcher's center port. That would let me gauge the impact that putting priorities on certain tasks has on creating bottlenecks at other points in the system.
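To make the idea concrete, here is roughly the kind of logic I have in mind, as a loose FlexScript sketch rather than anything I expect to work as-is. The label names ("queuedTime", "centerPort") and the assumption that the task sequence is available as param(1) in whatever trigger this code would live in are just placeholders:

// A) Wherever the task sequence first reaches the dispatcher,
//    stamp the current model time and note the incoming center port:
treenode ts = param(1);                  // assuming the task sequence is param(1) here
setlabelnum(ts, "queuedTime", time());   // time it started waiting at the dispatcher
setlabelnum(ts, "centerPort", 1);        // placeholder: record which center port it arrived on

// B) Wherever the dispatcher actually hands the task sequence to an
//    available operator, compute how long it waited:
treenode doneTs = param(1);              // again assuming the task sequence is param(1)
double waitTime = time() - getlabelnum(doneTs, "queuedTime");
double fromPort = getlabelnum(doneTs, "centerPort");
// ...then record waitTime per fromPort however is convenient
// (output console, a Global Table, averages kept on the dispatcher, etc.)

If the dispatcher already tracks something like this in its standard statistics, pointing me at that would be even better.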
I've attached a dummy model that is personnel-constrained (one operator and three processors with fast arrival rates).