Big data handling
•Iterator to handle data that are too big to read into memory at once.
–GPhys::IO.each_along_dims_write – the result is also written to a file (since the result of an operation is often big, too). Another type of iterator is planned but has yet to be implemented.
•Example:
–Without the iterator:
–  gp = GPhys::IO.open(infile, varname)
–  ofile = NetCDF.create(ofilename)
–  out = gp.mean(0)              # now, the entire result is in memory
–  GPhys::IO.write( ofile, out )
–  ofile.close
–With the iterator, looping over the last dimension:
–  gp = GPhys::IO.open(infile, varname)
–  ofile = NetCDF.create(ofilename)
–  GPhys::IO.each_along_dims_write( gp, ofile, -1 ){ |in_sub|
–    [ in_sub.mean(0) ]          # written to ofile at each iteration
–  }
–  ofile.close
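–A variation (a minimal sketch, assuming the input variable has a dimension named "time"; infile, varname, and ofilename are as above): the loop dimension may also be given by name rather than by position:
–  require "numru/gphys"         # GPhys and NetCDF live under the NumRu module
–  include NumRu
–  gp = GPhys::IO.open(infile, varname)
–  ofile = NetCDF.create(ofilename)
–  GPhys::IO.each_along_dims_write( gp, ofile, "time" ){ |in_sub|
–    [ in_sub.mean(0) ]          # each chunk's result is written to ofile in turn
–  }
–  ofile.close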