question

Kathryn McNeal asked Logan Gold answered

Processing a batch of the same items without having to use a separator after

batchingbyparts.fsm

In the attached model, three parts are being created: A, B, and C. Batching is performed in the queues, and then the OnEndCollecting trigger sets the batch size for the combiner (this could be an issue if two batches complete at the same time). What I want is for a batch of parts (all A's, all B's, or all C's) to be processed at the same time on the combiner, and then to exit the combiner as separate parts. Previously we have modeled this behavior with three separate combiners that pack the items into a tote, a processor that processes the tote, and a separator that recovers the individual items. Is there a way to simplify this, either entirely on the 3D side or by adding a process flow that ties into the 3D model?

FlexSim 17.0.2
batching
batchingbyparts.fsm (15.9 KiB)

Logan Gold answered

Here is how I would do it. I am including two example models. The first one (batchingbyparts.fsm) still uses the Combiner, but I have added some other Fixed Resources and logic. The second one (batchingbyparts-processflow.fsm) replaces the Combiner with a Processor and uses Process Flow logic to move all the flowitems. It also uses the Processor's own logic to handle the process time and to use an Operator during that time. As far as I can tell, neither one contradicts what you described in your comment, but let me know if I am mistaken.

First Example Model:

I have Q_A, Q_B, and Q_C set a label named "batchSize" on the first item of a batch in the Queues' OnEndCollecting trigger. The value stored in batchSize will be lower than the Queue's configured batch size if a batch is released early with fewer flowitems.
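A minimal FlexScript sketch of that OnEndCollecting trigger (the commands are standard FlexSim commands, but treat this as an illustration of the idea rather than the exact code in the attached model):

```
// Queue OnEndCollecting trigger (sketch)
// At this point the whole batch is still in the Queue, so content(current)
// is the number of flowitems actually collected, which may be lower than
// the configured batch size if the batch was released early.
treenode firstItem = first(current);
setlabelnum(firstItem, "batchSize", content(current));
```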

Next, I added an intermediate Queue between the three Queues and the Combiner and named it LoadingQueue. This Queue uses its OnEntry trigger to set its batch size based on the batchSize label on the first item of a batch. I also use the OnEndCollecting trigger to set the Combiner's component list to reflect the current batch size.
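Those two LoadingQueue triggers could look roughly like this in FlexScript. Note that the variable names "batchsize" and "componentlist" are assumptions on my part, so confirm them by inspecting the Queue and Combiner in the tree before copying anything:

```
// LoadingQueue OnEntry trigger (sketch)
// When the first item of a new batch arrives, adopt its batchSize label
// as this Queue's batch size ("batchsize" variable name is an assumption).
if (content(current) == 1)
    setvarnum(current, "batchsize", getlabelnum(item, "batchSize"));

// LoadingQueue OnEndCollecting trigger (sketch)
// Tell the Combiner to expect the current batch quantity on input port 2.
// The "componentlist" variable name is likewise an assumption.
treenode combiner = outobject(current, 1);
treenode complist = getvarnode(combiner, "componentlist");
setnodenum(first(complist), getlabelnum(first(current), "batchSize"));
```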

I added a Source called PalletSource that only ever creates one pallet when the model first starts and places it in the PalletQueue Queue. PalletQueue is the first input port on the Combiner so the pallet will always be the container flowitem for the Combiner. When the combined pallet and boxes are separated later in the model, the pallet will always come back to PalletQueue and await the next batch to enter the Combiner.

In the Combiner, I set the Combine Mode to Pack since you indicated you wanted to use an Operator for the process time; it is easier to handle one combined item with an Operator than several loose items. I also set the process time to "Use Operator(s) for Process".

Finally, the added Separator is set up with a Process Time of 0 so no additional delay time is introduced. And then I just made sure the output ports were set up to work with the Separator's default Send To Port logic in the Flow tab - container (pallet) goes to output port 1 and contents (boxes) go to output port 2.

Second Example Model:

I removed any logic from Q_A, Q_B, and Q_C except the batching logic. I have the Process Flow use an Event-Triggered Source to create a token every time one of the Queues triggers their OnEndCollecting trigger. Labels are set on the token to keep track of the batch size and originating Queue. The token is then pushed to a local list.

Also in the Process Flow, I have a Schedule Source that creates one token when the model starts, which in turn creates a pallet in the PalletQueue Queue. That token then pulls from the list and uses an Assign Labels activity to copy the batch size and originating Queue from the pulled token. It also goes through a Sub Flow to create an array label named flowitems, which holds references to all the boxes in the batch. That label is used to move the referenced flowitems from their Queue onto the pallet, and then the pallet is moved to the LoadingQueue Queue.

The combined pallet and boxes go through the Processor normally (without needing Process Flow), with an Operator used during the process time. When the Processor finishes processing, a Wait for Event activity fires in the Process Flow so the "pallet" token continues with the rest of its activities. Those activities move the flowitems off the pallet and into the Q_Debug Queue, move the pallet back to PalletQueue, and remove the flowitems label/array so it can be reused with the next batch.



Jeff Nordgren answered Kathryn McNeal commented

@Kathryn McNeal,

You are wanting the Combiner to do something it is not designed to do. Attached is your model with the changes that I've made to it.

Instead of using a Combiner, I use a Processor. On the Flow tab of the Processor's Properties window, in the Input area, I checked the Pull Strategy check box and chose the option "Round Robin if Available". This option cycles through the input ports, and if one of the Queues is ready to send its batch, the Processor pulls the flowitems from that Queue.

On each Queue, I've checked the "Flush contents between batches" check box so that the Queue can only hold one batch at a time. In the OnReset trigger of the Queues, I close the output port of the Queue.

In the OnExit trigger of the Queues, I send a message back to the Queue to check to see if the Queue has been emptied. If so, I close the output port of the Queue.

In the OnEndCollecting trigger, I send a message back to the Queue to check to see if the Processor is available. If it is, I set the "maxcontent" of the Processor to the batch size of the Queue and open the output port of the Queue. If the Processor is not available, I resend the message to check the Processor status again in 5 time units. Of course you could change that to whatever makes sense in your situation.
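Put together, the trigger wiring described above might look like this in FlexScript. This is a sketch rather than the exact code in the attached model: the output-port numbering and the content(processor) == 0 availability test are my assumptions.

```
// Queue OnReset trigger (sketch): start each run with the output closed
closeoutput(current);

// Queue OnEndCollecting trigger (sketch): ask this Queue to check the Processor
senddelayedmessage(current, 0, current, 1); // msgparam(1) == 1: processor check

// Queue OnExit trigger (sketch): ask this Queue to check whether it is empty
senddelayedmessage(current, 0, current, 2); // msgparam(1) == 2: empty check

// Queue OnMessage trigger (sketch)
treenode processor = outobject(current, 1); // assumes Processor is on output port 1
if (msgparam(1) == 1) {
    if (content(processor) == 0) { // "available" test is an assumption
        setvarnum(processor, "maxcontent", content(current));
        openoutput(current);
    } else {
        senddelayedmessage(current, 5, current, 1); // retry in 5 time units
    }
} else if (msgparam(1) == 2) {
    if (content(current) == 0)
        closeoutput(current);
}
```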

That's pretty much it. Take a look at the model and see if it does what you are wanting it to do. Would something like this work for you?

batchingbyparts-jn1.fsm



Kathryn McNeal commented:

batchingbyparts-v2.fsm

This doesn't appear to work once you add an operator (one operator should be able to process, or at least set up, the entire batch). Is that something that can be resolved?

I have also adjusted the model to reflect our reality more accurately. We have to set some kind of max wait time, since it could take an unacceptably long time to collect enough items for a full batch (at which point we would run the partial batch). We also can't flush the contents between batches, since those are items we have on the floor: if there are 20 items of type C in the queue that arrived from previous processes, some of them will have to wait, since it will take 4 batches to process them all. I also changed the pull strategy to "Longest Waiting if Ready" since that is more realistic for us. The model now seems to be processing multiple types of items at the same time.

