Change rate of stream reads on U6 | LabJack

wardchriss

I'm trying to sample several channels at 1000 Hz each, with the goal of saving the stream as well as displaying it live for annotation at runtime. The problem I am running into is that while I can control the internal stream clock and the samples per packet, I can't find a setting to change the delay between updates from the U6 to the computer. It seems to default to sending packets back about once per second (ideally I'd prefer to have data transmitted back to the computer every 0.1-0.2 seconds).

Is there a way to do this with LabJackPython and the U6?


edit: just saw some other posts; it looks like the answer I need is in the example

LabJack Support

Are you using the UD library on Windows, or doing low level communication on Linux/Mac?


I can't find a setting to change the delay between receiving updates between the computer and the U6.

If you are using UD, the library has a background thread that transfers stream data from the U6 buffer to the UD buffer (computer RAM), and your program then reads from the UD buffer. So are you actually asking about the rate of updates between your program and the UD buffer? That is controlled by how many scans you request in your LJ_ioGET_STREAM_DATA read, and is also affected by which wait mode you use for stream reads.


wardchriss

I'm using the Exodriver on Linux (Raspberry Pi 4B running Raspbian) and trying to handle the data transfer using low-level communication via LabJackPython.

The example seems to be helping (running the data stream in a separate thread without initially converting it). It seems this has increased the frequency of reads received by my program.

My goal is to have my program obtain the data stream from the LabJack and simultaneously display the current data values (updated several times per second, for display and rudimentary signal processing to trigger an event) while also saving the stream to a file on the computer.

I am also curious whether I should be concerned about the lag between the signal at the LabJack and the computer when triggering an event. I am looking to detect when a signal ceases to cross a threshold for 3 seconds and use that as the trigger for an event; would performance be better if the U6 did this (and can the U6 do this)? Is there a good example of how to synchronize the internal clock of the U6 with a computer?

LabJack Support

The streaming functionality in LabJackPython automatically controls the amount of data streamData reads, which affects timing because the call waits until that much data is available. There is no streamConfig or streamData setting to control the amount of data read, but it can be tweaked through the packetsPerRequest U6 member variable just after configuration. The number of samples read per streamData call is packetsPerRequest*25 when the scan frequency is > 25.

So, for 0.20 seconds of data to be read per streamData call, configuration for a 3-channel stream at a 1000 Hz scan frequency looks like:

d.streamConfig(NumChannels=3, ChannelNumbers=[0, 1, 2], ChannelOptions=[0, 0, 0], SettlingFactor=1, ResolutionIndex=1, ScanFrequency=1000)

# 0.20 seconds of samples = 0.20 * SampleFreq, where
# SampleFreq = NumChannels * ScanFrequency
numSamples = int(0.20 * (1000 * 3))

# Packets to read per streamData call. Each packet holds 25 samples.
d.packetsPerRequest = int(numSamples / 25)
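As a sanity check on that arithmetic, here is a small helper (hypothetical, not part of LabJackPython) that computes a packetsPerRequest value for a desired read interval:

```python
def packets_per_request(read_interval_s, num_channels, scan_frequency):
    """Packets the host should read per streamData call so each call
    returns roughly read_interval_s worth of data.
    Each low-level stream packet carries 25 samples."""
    samples = read_interval_s * num_channels * scan_frequency
    return max(1, int(samples / 25))

# 0.20 s of a 3-channel, 1000 Hz stream -> 600 samples -> 24 packets
print(packets_per_request(0.20, 3, 1000))
```

With 0.20 s, 3 channels, and 1000 Hz this gives 24 packets, matching the configuration above.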

You can detect a 3 second trigger with stream mode. In general there is some lag since you are reading buffered samples, but this can be minimized by reading samples from the device at a rate that keeps up with the ScanFrequency. Minimize delays between streamData calls.
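A host-side sketch of that trigger logic (the function and its crossing definition are illustrative, not a LabJackPython API):

```python
def find_trigger(samples, scan_frequency, threshold, quiet_seconds=3.0):
    """Return the index of the first sample at which the signal has not
    crossed `threshold` for `quiet_seconds`, or None if that never happens.
    A "crossing" here means consecutive samples land on opposite sides
    of the threshold."""
    quiet_samples = int(quiet_seconds * scan_frequency)
    last_cross = 0
    for i in range(1, len(samples)):
        if (samples[i - 1] - threshold) * (samples[i] - threshold) < 0:
            last_cross = i
        if i - last_cross >= quiet_samples:
            return i
    return None
```

In practice you would feed each streamData chunk's samples into logic like this as they arrive, carrying the last-crossing position across chunks.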

We do not have an example on synchronizing times. You can get the computer time when you start stream mode and use that as the stream's start time. Then you can calculate scan times from the start time (computer-time based) and ScanFrequency, where 1/ScanFrequency is the time between scans:

Scan#, Time
1, StartTime
2, StartTime + (1/ScanFrequency)
3, StartTime + 2*(1/ScanFrequency)
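That table translates directly into code; here is a minimal sketch, assuming the start time comes from time.time() captured when the stream is started:

```python
import time

def scan_times(start_time, num_scans, scan_frequency):
    """Computer-time timestamps for consecutive scans, given the
    computer time captured when the stream was started."""
    period = 1.0 / scan_frequency
    return [start_time + i * period for i in range(num_scans)]

# e.g. start = time.time() right after starting the stream, then
# scan_times(start, 3, 1000) spaces scans 1 ms apart
```

Note this assumes the U6's stream clock and the computer clock run at exactly the same rate; over long runs the two can drift apart.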

Note that each scan contains your channel readings. For example, using the above stream configuration and its first returned streamData result (r):

Scan 1 = r["AIN0"][0], r["AIN1"][0], r["AIN2"][0]
Scan 2 = r["AIN0"][1], r["AIN1"][1], r["AIN2"][1]
Scan 3 = r["AIN0"][2], r["AIN1"][2], r["AIN2"][2]
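To iterate those scans in order, you can zip the channel lists from the result dict (the r below is a stand-in for one decoded streamData result, not live data):

```python
# Stand-in for one decoded streamData result: 3 channels, 3 scans.
r = {"AIN0": [0.1, 0.2, 0.3],
     "AIN1": [1.1, 1.2, 1.3],
     "AIN2": [2.1, 2.2, 2.3]}

# Each tuple is one scan: the AIN0, AIN1, AIN2 readings taken together.
scans = list(zip(r["AIN0"], r["AIN1"], r["AIN2"]))
print(scans[0])  # first scan: (0.1, 1.1, 2.1)
```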