
5.5 - AIBurst

After receiving an AIBurst command, the LabJack collects 4 channels at the specified data rate and stores the data in its buffer. This continues until the buffer is full, at which time the LabJack starts sending the data to the host. Data is sent to the host 1 scan at a time while checking for a command from the host. If a command is received, the burst operation is canceled and the new command is executed normally. If the LED is enabled, it blinks at 4 Hz while waiting for a trigger, is off during acquisition, blinks at about 8 Hz during data delivery, and is set on when done or stopped.
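The host-side flow described above (send one command, then read one 8-byte report per scan) can be sketched generically. `send_command` and `read_scan` here are hypothetical stand-ins for whatever HID write/read routines your driver provides:

```python
def collect_burst(send_command, read_scan, num_scans):
    """Drive one AIBurst transfer: issue the 8-byte command, then
    read one 8-byte response per scan until the buffer is drained.

    send_command/read_scan are placeholders for the host's HID I/O;
    a real implementation would also watch for errors mid-stream."""
    send_command()
    return [read_scan() for _ in range(num_scans)]
```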

Table 5.5-1.

Command  
Byte # Description
0 Bit 7: X
  Bits 6-4: PGA for 1st Channel
  Bits 3-0: MUX command for 1st Channel.
1 Bit 7: X
  Bits 6-4: PGA for 2nd Channel
  Bits 3-0: MUX command for 2nd Channel.
2 Bit 7: X
  Bits 6-4: PGA for 3rd Channel
  Bits 3-0: MUX command for 3rd Channel.
3 Bit 7: X
  Bits 6-4: PGA for 4th Channel
  Bits 3-0: MUX command for 4th Channel.
4 Bits 7-5: Number of scans (000 = 1024, halving each step down to 110 = 16)
  Bits 4-3: IO to trigger burst on
  Bit 2: State to trigger on
  Bit 1: Update IO
  Bit 0: LED State
5 Bits 7-4: 1010 (Start Burst)
  Bits 3-0: Bits for IO3 through IO0 States
6 Bit 7: Feature Reports
  Bit 6: Trigger On
  Bits 5-0: AIINTMSB (High 6 bits of sample interval)
7 AIINTLSB (Low 8 bits of sample interval; 733 = min => ~8192 Hz)
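A helper that packs the command bytes above might look like the sketch below. This is not LabJackPython's API; the function name is invented, and the scan-count encoding assumes each increment of bits 7-5 halves the count from 1024 (000 = 1024 ... 110 = 16). With the PGA/MUX and interval values from the LabJackPython example below, it reproduces that example's "Writing" packet.

```python
def build_aiburst_command(pga_mux, num_scans, sample_interval, led_on=True):
    """Pack the 8-byte AIBurst command (hypothetical helper).

    pga_mux:         four bytes, each (PGA code << 4) | MUX code
    num_scans:       a power of two; encoded in bits 7-5 of byte 4
    sample_interval: 733..16383, split across bytes 6-7 (AIINTMSB/AIINTLSB)
    """
    scan_code = 11 - num_scans.bit_length()      # 1024 -> 0b000, 16 -> 0b110
    cmd = [b & 0x7F for b in pga_mux]            # bytes 0-3: bit 7 is X
    cmd.append((scan_code << 5) | (1 if led_on else 0))  # byte 4: scans + LED
    cmd.append(0xA0)                             # byte 5: 1010 = start burst, IO states 0
    cmd.append((sample_interval >> 8) & 0x3F)    # byte 6: AIINTMSB (high 6 bits)
    cmd.append(sample_interval & 0xFF)           # byte 7: AIINTLSB
    return cmd
```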
   
   
Response  
Byte # Description
0 Bit 7-6: 10 (binary)
  Bit 5: Buffer Overflow (if Backlog = 11111) or Checksum Error (if Backlog = 0)
  Bit 4: PGA Overvoltage
  Bits 3-0: Bits for IO3 through IO0 State
1 Bits 7-5: Iteration Counter
  Bits 4-0: Backlog/256
2 Bits 7-4: Most Significant Bits from 1st Channel
  Bits 3-0: Most Significant Bits from 2nd Channel
3 Least Significant Byte from 1st Channel
4 Least Significant Byte from 2nd Channel
5 Bits 7-4: Most Significant Bits from 3rd Channel
  Bits 3-0: Most Significant Bits from 4th Channel
6 Least Significant Byte from 3rd Channel
7 Least Significant Byte from 4th Channel
  • PGA Gain Setting – (Differential Only) 0b000 = 1, 0b001 = 2, 0b010 = 4, 0b011 = 5, 0b100 = 8, 0b101 = 10, 0b110 = 16, 0b111 = 20
  • Mux Settings – 0b0000 = 0-1 (Differential), 0b0001 = 2-3 (Differential), 0b0010 = 4-5 (Differential), 0b0011 = 6-7 (Differential). Single-Ended readings = 0b1000 + AI Number.
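Unpacking one response per the table above can be sketched as follows. The function name is invented, and the ±10 V single-ended conversion (counts × 20/4096 − 10) is an assumption, chosen because it reproduces the voltages in the LabJackPython example below:

```python
def decode_aiburst_response(resp):
    """Unpack one 8-byte AIBurst response (hypothetical helper).

    Each channel is 12 bits: a nibble from byte 2 or 5 plus a full
    LSB byte; status fields come from bytes 0 and 1."""
    counts = [
        ((resp[2] >> 4) << 8) | resp[3],    # 1st channel
        ((resp[2] & 0x0F) << 8) | resp[4],  # 2nd channel
        ((resp[5] >> 4) << 8) | resp[6],    # 3rd channel
        ((resp[5] & 0x0F) << 8) | resp[7],  # 4th channel
    ]
    return {
        # assumed single-ended scaling: 0..4095 counts -> -10..+10 V
        "volts": [c * 20.0 / 4096.0 - 10.0 for c in counts],
        "iteration": (resp[1] >> 5) & 0x07,   # bits 7-5 of byte 1
        "backlog": (resp[1] & 0x1F) * 256,    # bits 4-0, in units of 256
        "pga_overvoltage": bool(resp[0] & 0x10),
        "io_states": resp[0] & 0x0F,          # IO3 through IO0
    }
```

Feeding it the first "Received" packet from the example yields the first element of each channel list in the returned dictionary.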

LabJackPython Example

  
>>> import u12
>>> d = u12.U12(debug=True)
open called
Writing: [0x0, 0x0, 0x0, 0x0, 0x0, 0x57, 0x0, 0x0]
Received: [0x57, 0x0, 0x0, 0x0, 0xff, 0xff, 0x0, 0x0]
>>> d.rawAIBurst()
Writing: [0x8, 0x9, 0xa, 0xb, 0xe1, 0xa0, 0xa, 0x98]
Received: [0x80, 0x0, 0x99, 0x8, 0x2a, 0x99, 0x2c, 0x6]
Received: [0x80, 0x20, 0x99, 0xc, 0x2a, 0x99, 0x2c, 0x4]
Received: [0x80, 0x40, 0x99, 0xc, 0x2c, 0x99, 0x2a, 0x6]
Received: [0x80, 0x60, 0x99, 0xc, 0x2a, 0x99, 0x2c, 0x4]
Received: [0x80, 0x80, 0x99, 0xc, 0x2c, 0x99, 0x2c, 0x6]
Received: [0x80, 0xa0, 0x99, 0x0, 0x2a, 0x99, 0x2c, 0x4]
Received: [0x80, 0xc0, 0x99, 0xc, 0x2a, 0x99, 0x2c, 0x6]
Received: [0x80, 0x0, 0x99, 0xc, 0x2a, 0x99, 0x2c, 0x6]
{
  'Channel0': [1.2890625, 1.30859375, ..., 1.30859375], 
  'Channel3': [1.279296875, 1.26953125, ..., 1.279296875], 
  'Channel2': [1.46484375, 1.46484375, ..., 1.46484375], 
  'PGAOvervoltages': [False, False, ..., False], 
  'IO3toIO0States': 
    [
      <BitField object: [ IO3 = Low (0), IO2 = Low (0), 
                          IO1 = Low (0), IO0 = Low (0) ] >, 
      <BitField object: [ IO3 = Low (0), IO2 = Low (0), 
                          IO1 = Low (0), IO0 = Low (0) ] >, 
    ..., 
      <BitField object: [ IO3 = Low (0), IO2 = Low (0), 
                          IO1 = Low (0), IO0 = Low (0) ] >], 
  'BufferOverflowOrChecksumErrors': [False, False, ..., False], 
  'Channel1': [1.455078125, ..., 1.455078125], 
  'IterationCounters': [0, 1, 2, 3, 4, 5, 6, 0], 
  'Backlogs': [0, 0, 0, 0, 0, 0, 0, 0]
}
  

2 comments

It would be awfully helpful to include a description of the AIINT bytes here.  733 ~ 8192Hz is certainly enough to figure it out, but a quick formula would save some debug cycles.  Great otherwise - Thanks!

I looked at the Windows driver source template, and found that the allowable range for Sample Interval is 733 to 16383, and the formula for actual scan rate is:

scanRate = 6000000.0F/((float)(sampleInt * numChannels))
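Inverting the formula from the comment above gives a quick way to pick a sample interval for a desired scan rate. This sketch (the function name is invented) clamps to the 733–16383 range quoted in the same comment:

```python
def sample_interval_for(scan_rate_hz, num_channels=4):
    """Invert scanRate = 6000000 / (sampleInt * numChannels),
    clamping the result to the allowable 733..16383 range."""
    si = int(round(6000000.0 / (scan_rate_hz * num_channels)))
    return max(733, min(16383, si))
```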