atwright
2005-01-13 23:10:58 UTC
I am currently using an NI DAQPad-6015 and an SC-2345 Signal Conditioning Box with SCC-A103 conditioning units, connected to a laptop running Windows XP and LabVIEW 7.1. To read the analog inputs, I set up a DAQmx task for 16 NRSE voltage channels in continuous sampling mode at 5000 samples per second per channel. The program collects and displays data live; then, when prompted, it triggers outside devices through digital outputs (a separate DAQmx task) and begins writing the analog input data to a file while still displaying a live data feed. I have been looking for an efficient way to clear the AI task's buffer before writing to the file, so that the delay between the digital signal change and the start of data collection is minimized. The best solution I have found still results in an 8 ms delay (which is still a little long for our purposes). Here is my current method:

- Data is read one final time for the live feed, along with a couple of other commands (digital changes).

Then, in a sequence structure:
- The task is stopped.
- The data remaining in the buffer is read by setting the number of samples per channel to -1.
- The task is stopped.
- The task is started.
- The digital signals are sent out.

Finally:
- Recording of the AIs begins.

Note: All of the commands (read, stop, start) are the DAQmx subVIs in LabVIEW. If there is going to be a delay, it is important that the AI task is started before the digital signals are sent, so that we can record the DO change and measure the delay for correction. (A rough text-API sketch of this sequence is given below.)

I realize that the method above seems awkward and inefficient, but it's the best I've found through much trial and error. Is there a DAQmx subVI that would clear the buffer more quickly or easily? Or some way to improve the rate at which the digital signals are sent, to decrease this delay? Or is this just the intrinsic delay for changing a DO with LabVIEW in our setup? Any help would be greatly appreciated.
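For readers not looking at the block diagram: the sketch below is a rough, hedged equivalent of the steps described above, written against the NI-DAQmx ANSI C API (the text-based counterpart of the DAQmx subVIs), since the original VI is LabVIEW G code and cannot be pasted as text. The device, channel, and line names ("Dev1/ai0:15", "Dev1/port0/line0:7"), the read chunk size, and the DO pattern are placeholders for illustration only, and error checking is omitted; the stop/read(-1)/stop/start/write-DO ordering mirrors the sequence structure described in the post.

```c
/* Sketch of the described flush-then-trigger sequence using the NI-DAQmx
 * ANSI C API.  Names such as "Dev1/ai0:15" and "Dev1/port0/line0:7" are
 * assumed placeholders; error checking is omitted for brevity. */
#include <stdio.h>
#include <NIDAQmx.h>

#define NUM_CHANNELS     16
#define SAMPLES_PER_READ 500                      /* per channel, per read */
#define FLUSH_BUF_SAMPS  (NUM_CHANNELS * 5000)    /* room for the backlog  */

int main(void)
{
    TaskHandle aiTask = 0, doTask = 0;
    float64    data[FLUSH_BUF_SAMPS];
    int32      read = 0, written = 0;
    uInt8      lines[8] = {1, 1, 1, 1, 1, 1, 1, 1};   /* example DO pattern */

    /* AI task: 16 NRSE channels, continuous sampling, 5000 S/s per channel */
    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0:15", "",
                             DAQmx_Val_NRSE, -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(aiTask, "", 5000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, SAMPLES_PER_READ);

    /* DO task used to trigger the external devices */
    DAQmxCreateTask("", &doTask);
    DAQmxCreateDOChan(doTask, "Dev1/port0/line0:7", "",
                      DAQmx_Val_ChanForAllLines);

    DAQmxStartTask(aiTask);

    /* ... live-display loop: repeatedly read SAMPLES_PER_READ per channel ... */

    /* --- Flush-then-trigger sequence described in the post --- */
    DAQmxStopTask(aiTask);
    /* Read whatever remains in the buffer; DAQmx_Val_Auto (-1) means
       "all available samples", the equivalent of wiring -1 to the
       "number of samples per channel" input of DAQmx Read. */
    DAQmxReadAnalogF64(aiTask, DAQmx_Val_Auto, 1.0,
                       DAQmx_Val_GroupByChannel, data,
                       FLUSH_BUF_SAMPS, &read, NULL);
    DAQmxStopTask(aiTask);
    DAQmxStartTask(aiTask);          /* AI restarts before the DO edge so the
                                        digital change is captured in the data */

    /* Send the digital change that triggers the outside devices. */
    DAQmxWriteDigitalLines(doTask, 1, 1, 1.0,
                           DAQmx_Val_GroupByChannel, lines, &written, NULL);

    /* ... from here on, each read is written to file while still displayed ... */

    DAQmxClearTask(doTask);
    DAQmxClearTask(aiTask);
    return 0;
}
```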