I've come across a really unexpected correlation this morning that I'm hoping someone has an explanation for. I have a large flowgraph with many QT GUI blocks because I'm debugging a design: mostly Time Sinks and Constellation plots, with a couple of Frequency Sinks thrown in. The number of points in some of the time sinks is rather large, on the order of 30k, which lets me see several packets of data at once.
What I noticed this morning, while debugging a BPSK loopback BER tester, is that when I disable a number of Constellation plots (each fed by an RRC filter just to make the plot pretty), the errors go away. The system then works exactly as you'd expect a simulation with no noise or channel effects to work: perfectly. When I re-enable those GUI blocks, the system consistently loses packet synchronization within the first minute. Nothing in the data stream changes between these tests.
So the question is: is there a known limit on the number or size of QT GUI plots? Like I said, I have a lot of them, and some are plotting a large number of points. Could this be causing buffer overruns into data spaces, or something scary like that?