Read lots of rows (records) into a data structure (array)
Us IBM i blokes love reading things from files. Hashtag internet fact.
But sometimes we just want to read a certain number of records, or rows in SQL speak. For example, when building a page-by-page subfile with, say, 15 lines per page, we might want to work through an entire file, but read it in 15-row chunks.
In native I/O style RPG, we would probably set a counter and increment it by one for each row read, stopping at 15. But when we use SQL for the I/O, it does all that counting for us 🙂
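For comparison, the native I/O version of that counting dance looks something like this (just a sketch, with made-up file and field names):

dcl-f someIncrediblyInterestingFile usage(*input);
dcl-s rowCount int(10) inz(0);

// read up to 15 records, counting as we go
read someIncrediblyInterestingFile;
dow not %eof(someIncrediblyInterestingFile) and rowCount < 15;
  rowCount += 1;
  // do stuff with the record fields
  read someIncrediblyInterestingFile;
enddo;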
This means that in most cases we can just wrap the read in a DOW loop and fetch until the cows come home, i.e. until there is nothing more to fetch.
Let’s look at a quick and dirty example, reading 15 rows at a time into a multiple-occurrence data structure with 15 occurrences.
Using SQL for Single Page Subfiles
// read the items
exec sql fetch from someIncrediblyInterestingFile
         for 15 rows
         into :data_structure;
// now write the sfl1 recs
for x1 = 1 to 15;
  %occur(data_structure) = x1;
  // do stuff with the data_structure fields
endfor;
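For that snippet to fly, it assumes a multiple-occurrence data structure and a cursor that has already been declared and opened, something along these lines (the field and file names are made up, and in real code you would loop to SQLERRD(3), the number of rows actually fetched, rather than a flat 15):

dcl-ds data_structure occurs(15);
  itemCode char(10);
  itemDesc char(50);
end-ds;

exec sql declare someIncrediblyInterestingFile cursor for
  select itemCode, itemDesc
  from itemMaster;

exec sql open someIncrediblyInterestingFile;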
Hi,
The way I usually do it is:
exec sql open mycursor;
dou nbrrecs < 500;
  exec sql fetch mycursor for 500 rows into :array;
  nbrrecs = sqlerrd(3);
  for x1 = 1 to nbrrecs;
    // do stuff with occurrence x1
  endfor;
enddo;

exec sql close mycursor;

This way I fixed an sqlrpgle program that fetched one record at a time: its run time went from 52 hours to 26 minutes.

Btw: the max number of rows to read is 32767. But there is also a maximum size (in bytes) for the array or data structure. This number depends on the release of your OS.

Cheers,
Rene Simons
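To round out Rene's snippet, the definitions behind it would look something like this (names made up; the SQLERRD array comes for free with the SQLCA that the SQL precompiler pulls in):

dcl-s nbrrecs int(10);
dcl-s x1 int(10);

dcl-ds array occurs(500);
  field1 char(10);
  field2 packed(9:2);
end-ds;

exec sql declare mycursor cursor for
  select field1, field2
  from someBigFile;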