Let's start with a high-level overview of the easy way to write a quick MATLAB script for making and fetching measurements from a remote instrument:
- Set up and open a connection to the instrument. One of the steps here is setting the size of the input buffer for reading measurement data.
- Configure the instrument for the measurement.
- Trigger the measurement.
- Fetch the measurement data from the instrument using the query() method and store it in one huge string.
- Convert the string of data to a numerical array format, such as double, for post-processing and analysis.
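As a rough sketch, the easy method above might look like the following. The VISA resource string is a placeholder, and FETC? stands in for whatever fetch query your instrument uses; the configure and trigger steps are elided:

```matlab
%Sketch of the easy method: read everything as one huge ASCII string.
%The VISA resource string below is a placeholder, not a real address.
obj1 = visa('agilent', 'USB0::0x0957::0x1907::MY00000000::INSTR');
obj1.InputBufferSize = 25e6;      %one huge input buffer
fopen(obj1);
%...configure the instrument and trigger the measurement here...
dataStr = query(obj1, 'FETC?');   %one huge string of ASCII readings
dataDouble = str2num(dataStr);    %one huge double array
fclose(obj1);
```

Note that str2num() handles the comma-separated reading list directly, but it requires yet another large contiguous allocation on top of the input buffer and the string.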
This method is easy to implement but can lead to problems when dealing with large amounts of data, for a couple of reasons. The first is that it does not use memory very efficiently. Notice that between the input buffer allocation, the long string the data is read into, and the final numerical array, there are three large chunks of memory being allocated for one set of data. This can lead to "Out of memory" errors, especially on Windows machines, where RAM is fragmented between a number of processes and MATLAB needs contiguous memory to store a numerical array. Another reason is that the various protocol layers used in an IO operation can be unstable when reading a large contiguous chunk of data.
A better and still easy-to-implement way to fetch large amounts of data is to format it as binary data (versus ASCII) over the IO, break the data into smaller blocks, and reassemble the blocks in MATLAB. Let's demonstrate with example MATLAB code. For this demonstration we will make 1M double-format timing measurements on a 2 GHz oscillator using the Agilent 53230A universal counter. Below we will look only at the code portions that involve reading the measurement data. If you would like a full version of the "Lots_of_Data.m" script used here, just email me.
%The Lots_of_Data script makes 1M timestamp measurements on
%a 2 GHz oscillator using the 53230A. The measurements are read
%from the instrument in binary form, and are read and handled
%in 500-reading blocks.

%Set the input buffer size in bytes. We want to read 500
%readings per block, and each reading is 23 bytes including
%the comma. 'obj1' is the handle to the instrument connection.
buffer = 23 * 500;
set(obj1, 'InputBufferSize', buffer);
%Perform measurement settings and trigger
%Here we send readings to 53230A output buffer
%fread() will be used to read all measurements as binary data
%instead of ASCII characters to cut down on overhead
%Get the prescale value first; convert the bytes to a string
%and then to a double
prescale = str2double(char(fread(obj1,5))');
%allocate double array for faster performance
arrayDouble = zeros(1,1e6);
%Read blocks of 500 readings from the 53230A at a time.
%Convert the binary values to doubles and add them to the
%array of doubles. This method conserves memory versus
%reading all the data in at once
for i = 1:2000   %1M readings / 500 readings per block
    tempBin = fread(obj1,(23*500));
    %replace the commas with spaces so we can convert the
    %string to a double array
    tempChar = char(tempBin)';
    tempChar = strrep(tempChar, ',', ' ');
    %convert the string of values to an array of doubles
    if i ~= 1
        arrayDouble((((i-1)*500)+1):(i*500)) = strread(tempChar);
    else
        arrayDouble(1:500) = strread(tempChar);
    end
end
In the above example the fread() method was used to fetch the data in binary form instead of as ASCII data. This reduces the size of the data being passed over the IO. Also, the 1M readings were collected in blocks of 500 that were later reassembled into a numerical array for post-processing and analysis. This means we only had to allocate one large chunk of memory instead of three, as we did in the first example.
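If your instrument and connection support IEEE 488.2 definite-length binary blocks, a related variant worth sketching uses the Instrument Control Toolbox's binblockread() to pull the readings directly into a double array, skipping the string-conversion step entirely. The SCPI commands below are illustrative of the common FORMat pattern; check the 53230A programming guide for the exact syntax and byte-order behavior on your instrument:

```matlab
%Sketch (assumes the instrument can output readings as an
%IEEE 488.2 binary block of 64-bit floats; SCPI commands are
%illustrative, not verified against the 53230A manual)
fprintf(obj1, 'FORM:DATA REAL,64');       %request binary output format
fprintf(obj1, 'FETC?');                   %request the readings
readings = binblockread(obj1, 'double');  %doubles, no string step
%depending on the instrument's byte order, swapbytes() may be
%needed on the result
```

The trade-off is that this reads the whole block in one IO operation, so the block-by-block memory savings shown above would have to be recreated by fetching the readings in smaller batches.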
If you know of a better, more efficient way to fetch large amounts of measurement data and store it in MATLAB, please share it with a comment below or an email to me. If you are working with extremely large sets of data and get out-of-memory errors, check out MATLAB's video on optimizing memory usage from the link below. You can also track your MATLAB memory usage for optimization purposes using the free Memory Monitor program (link below).