Hi folks,

I've run into an odd situation.  I have a custom USB peripheral device which 
generates real-time data.  I monitor this device using a PyQt5 app that I 
wrote.  Periodically I want to capture some of this data in files.  Because of 
a transient OS bug which apparently involves a corner case in the Linux serial 
port driver (!), I cannot guarantee that I can transmit commands to the device 
to shut off data transmission while I'm saving a file.

When my setup is reduced to one-way communication like this, the PyQt5 file 
selector dialog pauses for an unusually long time.  My data does get saved to 
disk, but then my application segfaults and I have to restart the whole 
device.  In time-critical data acquisition, this won't be good.

My hypothesis is that the Qt event queue is getting choked with incoming data 
during the disk write, and/or while the QFileDialog is up.  I can't prove 
this, but I am contemplating a workaround.  I plan to separate my data 
analysis and annotation windows completely from the live data-acquisition 
application.

I just modified my program so that, when I click a Record button, the 
acquisition program streams the data to a file in /tmp.  When I click a Stop 
button, the /tmp file closes without crashing the live data application.  I 
can repeat this as often as I want.  So far, so good.
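
For reference, the wiring looks roughly like this (a minimal sketch, not my 
actual code; the device-read callback and the file naming are placeholders):

    import sys
    import tempfile

    from PyQt5.QtWidgets import (QApplication, QHBoxLayout, QPushButton,
                                 QWidget)

    class AcquisitionWindow(QWidget):
        def __init__(self):
            super().__init__()
            self._outfile = None
            record = QPushButton("Record")
            stop = QPushButton("Stop")
            record.clicked.connect(self.start_recording)
            stop.clicked.connect(self.stop_recording)
            layout = QHBoxLayout(self)
            layout.addWidget(record)
            layout.addWidget(stop)

        def start_recording(self):
            if self._outfile is None:
                # delete=False so the analysis app can open the file later.
                self._outfile = tempfile.NamedTemporaryFile(
                    mode="wb", dir="/tmp", suffix=".dat", delete=False)

        def stop_recording(self):
            if self._outfile is not None:
                self._outfile.close()
                self._outfile = None

        def on_device_data(self, chunk):
            # Called from the acquisition loop; append while recording.
            if self._outfile is not None:
                self._outfile.write(chunk)

    if __name__ == "__main__":
        app = QApplication(sys.argv)
        win = AcquisitionWindow()
        win.show()
        sys.exit(app.exec_())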

My CPU has a dozen cores.  I think that all I need to do now is start a 
second Python interpreter running an analysis-only application.  Then, when 
the QFileDialog from the analysis app is on the screen, there's no chance it 
can disrupt the event queue in the live-acquisition app.

I have used multiprocessing before, when I wrote some parallelized code.  
That program required significant communication between processes, and 
multiprocessing is overkill for my purpose here.  I don't need any 
communication between the spawning (live data) program and the spawned 
program.  In fact, to the extent that the live data program has to pay 
attention to anything besides the data stream, I think that could be bad.

I have been investigating the subprocess module.  I'm looking for something 
which behaves like subprocess.run(["python3", "my_program.py"]) (run() needs 
an argument list unless shell=True), but which does not "Wait for command to 
complete, then return a CompletedProcess instance."

Any suggestions are appreciated.  Thanks!