I am trying to make a processor read 10 inputs from a global table and combine them (process time 1 + process time 2 + ... + process time 10) to generate a total process time for the station.
This way, I understand that I can better capture the real behavior of my process.
The problem is that I don't really know how to tell my processor to read rows 1 to 10 from a global table and combine their results as a sum.
As you can see in the global table ProcessTime, my cycle-time info will always be a distribution. That's why I intend to analyze it task by task instead of using a single generic cycle time for the station.
1) Is it possible to read these distributions from a global table and combine their results into a process cycle time for my station? How?
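To make question (1) clearer, here is a small Python sketch of the logic I have in mind for the processor's process-time code: look up each task's parameters in the table, sample each task's distribution, and sum the ten samples. The table layout (one row per task, columns for mean and stdev) is just my assumption for illustration, not FlexSim syntax.

```python
import random

# Hypothetical stand-in for the global table "ProcessTime":
# one row per task, columns are (mean, stdev) in seconds.
process_time_table = [
    (3.0, 0.5), (2.0, 0.3), (4.0, 0.8), (1.5, 0.2), (2.5, 0.4),
    (3.5, 0.6), (2.0, 0.3), (1.0, 0.1), (2.2, 0.4), (3.8, 0.7),
]

def total_process_time(table):
    """Sample each task's normal distribution once and sum the results."""
    return sum(random.normalvariate(mean, stdev) for mean, stdev in table)

# One total cycle time for the station, combining all ten tasks.
cycle_time = total_process_time(process_time_table)
```

The key point is the loop-and-sum over the table rows; I just don't know how to express that inside the processor's process time field.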
2) I am using a normal distribution for this case, but what I really need is a way to limit the maximum and minimum values of this distribution, something like a truncated normal curve, "Tnormal". The only difference from the normal curve is that it accepts two more parameters besides the average and stdev, so I can limit the maximum and minimum values to better fit a real process.
Ex: if the average of a process is 3 seconds, the operator might take 10 seconds to finish due to a distraction, but he can never perform under 1.5 seconds. A plain normal curve wouldn't be enough to represent this; it should be something like Tnormal(3.0, 2.0, 1.5, 10.0).
How can I represent this kind of distribution inside FlexSim?
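To show exactly what behaviour I mean by Tnormal, here is a Python sketch using simple rejection sampling (redraw until the sample lands inside [min, max]). The Tnormal(mean, stdev, min, max) signature is just my own notation, not an existing FlexSim function as far as I know:

```python
import random

def tnormal(mean, stdev, low, high):
    """Truncated normal: redraw until the sample falls in [low, high].

    Simple rejection sampling; this is efficient as long as
    [low, high] covers a reasonable share of the distribution's mass.
    """
    while True:
        x = random.normalvariate(mean, stdev)
        if low <= x <= high:
            return x

# The example from above: mean 3.0 s, stdev 2.0 s,
# never below 1.5 s and never above 10.0 s.
sample = tnormal(3.0, 2.0, 1.5, 10.0)
```

Every sample is guaranteed to stay between the two bounds, which is the behaviour I need for the operator's task times.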
I'll attach a dummy model to help with understanding.
I appreciate the help, as always.