After running a rather large optimization on a relatively simple model, the model size with all the optimizer data in it rose to 200+ MB, making the model incredibly slow and unwieldy - no surprise there. I managed to open the file again and start a new experiment/optimization to reset the data, but that removed less than half of the added size.
I went on to delete the Experimenter from the model toolbox (which also removed the node in the Tools folder of the tree). I also deleted the OptQuest tree node. But the model size after saving was still around 70 MB. Is there a place in the model tree structure where more data could be orphaned and needs to be deleted?
This is a fairly simple model with no 3D objects imported. The file shouldn't be bigger than a couple of MB at most.