Add a few more tuneups to the Windows h4h5tools files:
-- Update example test script and move it to the root Windows directory
-- Disable test generator code by default on all configurations
-- Add a note about Visual Studio .NET in the documentation.
Purpose: Revise the example project to work in the new directory structure. Note that the example project isn't actually run by any of the scripts; this mirrors the Unix behavior.
Remove comments from reconfigure, and regenerate configure
Edit h4toh5eostest.c so that it does not cause a compiler error when --with-hdfeos2 is not given
Purpose: Create new Visual Studio 2005 project files
Description:
This is a rewrite of the Windows project files to use the latest source code and include HDF-EOS support. We have dropped support for Visual Studio 6.0 and will only support Visual Studio 2005 in the future.
By default, h4h5tools is built without HDF-EOS support, but it can easily be enabled. Note that the documentation hasn't been updated yet -- I'll get to that soon.
Tested:
Build only; VS2005 on Windows XP
Reject an HDF-EOS2 file if any of its EOS2 data sets has no fields
If an EOS data set has no fields, H4toH5_sds() is never reached, so neither H4toH5all_dimscale() nor any of the EOS handling functions is called. The conversion would succeed, but the converted file would lose dimension information because no HDF-EOS2 API was ever invoked. To avoid silently losing information, h4toh5 explicitly rejects such files.
Perforce @305
Add -nc4strict option, more comments
- if -nc4strict is given, reject any HDF4 vdata or SDS that would become an HDF5 dataset that netCDF-4 cannot read
- more comments
- add two test cases that are not yet handled correctly; however, these cases should not occur in real-world files
Perforce @289 ~ @297
Copy attributes when h4toh5 creates a new dataset for dimensional scales.
An HDF-EOS2 Swath may have dimension maps. If they exist, h4toh5 creates interpolated datasets so that netCDF4 can find the corresponding dimensional scales. For example, the HDF-EOS2 Swath geolocation field "Latitude" may have several attributes such as "long_name" and "unit". If there are dimension maps between "Latitude" and a datafield, a dimension scale like "Latitude_0:2_0:2" is created. This dimensional scale should have the s...
Handle HDF4 vdata that is used for an EOS attribute or a 1d field
If a vdata is part of EOS, don't use the generic h4toh5 routine;
- to correctly copy the EOS attribute
- to correctly convert the EOS 1d field
Since the generic h4toh5 routine is not used, the new code generates HDF4_OBJECT_TYPE = 'vdata' and HDF4_OBJECT_NAME itself.
Update expected results.
Perforce @258 ~ @262
Make hard links for EOS swath datafields.
- update expected results
Each SWATH has two groups, Datafield and Geolocation field, which share the same dimensions. Due to a restriction of netCDF4, each dimension must be present in both groups, so a hard link is created for each dimension so that it can be read from either Datafield or Geolocation field.
Expected results were regenerated. All expected results (except two cases: swath_tll, swath_tll02 which have too...
Handle EOS swath and fix many EOS grid bugs
- make EOS swath conform to CF convention
- EOS test-suite including about 30 grid and swath files and expected results
- for grid: both 2d and 1d projections with different order (YX or XY)
- for swath: various settings on dimensional maps
- fix a bug where accumulated error messages were not printed to stderr
It works in various cases, and it did not crash on 48 real NSIDC files. (I'm still checking correctness.)
Perforce @207 ~ @250
HDF-EOS2 grid dimensional scales almost conform to CF convention.
For 1-d dimscales, described in section 5.1 of the CF convention document:
* "lon" and "lat" are both dimensions and variables
For 2-d dimscales, described in section 5.2 of the CF convention document:
* "lon" and "lat" are variables, but not dimensions
* "XDim" and "YDim" are dimensions, but not variables
The fact that "XDim" and "YDim" are not variables may violate CF convention. However, we didn't want to pretend "XDim" and "YDim" have mea...
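The two layouts above can be sketched in netCDF CDL-style notation; the variable names follow the convention sections cited, while the sizes and the "temp" variable are illustrative.

```
// Section 5.1: one-dimensional coordinate variables
dimensions:  lon = 360 ;  lat = 180 ;
variables:   float lon(lon) ;  float lat(lat) ;
             float temp(lat, lon) ;

// Section 5.2: two-dimensional latitude/longitude
dimensions:  XDim = 360 ;  YDim = 180 ;
variables:   float lon(YDim, XDim) ;  float lat(YDim, XDim) ;
             float temp(YDim, XDim) ;
               temp:coordinates = "lon lat" ;
// XDim and YDim are dimensions only -- no corresponding variables.
```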
Let test drivers for lib and h4toh5 run.
- remove -q argument for h5diff
- don't add AR_RnGd.hdf and MOD10A1.hdf cases because they will be removed from testh4toh5.sh.in too.
Make XDim and YDim coordinate variables
XDim and YDim are both dimensions and variables. Because of the "This is a netCDF dimension but not a netCDF variable." attribute value, they were not recognized as variables. This changelist fixes that problem.
Prevent nc_close(), which is part of netCDF4, from causing a fatal segmentation fault
nc_close() caused a segmentation fault while freeing a heap object. Under a debugger, the object appeared to be related to the "coordinates" attribute. The dataspace of "coordinates" was changed from simple to scalar; after this change, nc_close() no longer crashes.
One thing I forgot to mention in the previous commit is that the actual EOS2 files are not tested anymore because the expected ...
Support sinusoidal dimensional scales for HDF-EOS2 grid dataset.
- create XDim, YDim as 1-dimensional array
- create lon, lat variables which are 2-dimensional arrays
- the grid dataset uses XDim and YDim as dimensions and has an attribute "coordinates" = "lon lat"
see #5.2 Two-Dimensional Latitude, Longitude, Coordinate Variables
http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.1/cf-conventions.html
To implement this, control flow in H4toH5all_dimscale has been changed. Previ...
Add fake dimension map (EOS Swath) during conversion
Swath data may not have any dimension maps; this case was not handled correctly and caused an error.
To handle this case without unnecessarily complicated branches, add fake dimension maps. For example, AR_Ocean has the Ocean_product_quality_flag datafield and the Latitude, Longitude, and Time geofields. Each field defines the following dimensions:
- Ocean_product_quality_flag: DataTrack_lo, DataXTrack_lo, Qual_dim
- Time: ...
Fix faults occurring when an empty string is passed.
- an empty string was recognized as one element, which is incorrect
- Fix
  - Before:
        for (i = j = 0; j <= len; ++j)
            if (j == len || s[j] == ',')
  - After:
        for (i = j = 0; j <= len; ++j)
            if ((j == len && len) || s[j] == ',')
Add a test-suite for h4toh5 with -eos -nc4 option, and so on
- create a test-suite for h4toh5 with -eos -nc4 option
- AR_RnGd.hdf, MOD10A1.hdf are added (both have grid dataset)
- modify code that causes warnings about the "const" modifier
- make it compile with HDF5 1.6
- remove "-q" from h5diff for lib test cases
- add H5_USE_HDFEOS2 substitution rule for test driver
To build with HDF5 1.8, the configure invocation would be something like:
./configure \
--prefix=/mnt/scr1/cholee/h4h5tools...
Make test run pass successfully.
- remove -q option from h5diff
- do not add DIMENSION_LIST, REFERENCE_LIST, NAME, or CLASS attributes to the ds dataset by default