cholee committed 657846458c6
Copy attributes when h4toh5 creates new datasets for dimensional scales.

An HDF-EOS2 Swath may have dimension maps. If they exist, h4toh5 creates interpolated datasets so that netCDF-4 can find the corresponding dimensional scales. For example, the HDF-EOS2 Swath geolocation field "Latitude" may carry attributes such as "long_name" and "unit". If there are dimension maps between "Latitude" and a data field, a dimensional scale such as "Latitude_0:2_0:2" is created. That scale should carry the same attributes as the original "Latitude"; this changelist implements that.

The implementation is not straightforward because the HDF-EOS2 API does not support reading those attributes. (An HDF-EOS2 developer confirmed that HDF-EOS2 does not support per-datafield attributes.) Since the attributes are not accessible through the HDF-EOS2 API, h4toh5 first accumulates the necessary information and then copies the attributes in a final stage. Copying attributes on the fly would be hard to implement because the traversal order is not guaranteed:

  Initialize the EOS2 conversion routine.
  The general h4toh5 routine traverses vgroups, SDSs, vdatas, and so on;
  the EOS2 conversion routine is called for each dataset:
    if the dataset belongs to an HDF-EOS2 geolocation field:
      mark it as a source: < source, [ ... ] >
    else if the dataset belongs to an HDF-EOS2 data field:
      for each generated dimensional scale, save < ... , [ target, ... ] >
  Finalize the EOS2 conversion routine:
    for each accumulated pair < source, [ target1, target2, ... ] >:
      copy all attributes of source to each target

Update affected expected results.

Perforce @284
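The accumulate-then-finalize bookkeeping described above can be sketched as follows. This is an illustrative model, not the actual h4toh5 C code: datasets and attributes are plain dicts, and the names `pending`, `mark_source`, `add_target`, and `finalize` are hypothetical.

```python
# Phase 1 state: source name -> its attributes and the generated scales
# ("targets") that must eventually receive copies of those attributes.
pending = {}

def _entry(source):
    """Get or create the bookkeeping entry for a source field."""
    return pending.setdefault(source, {"attrs": {}, "targets": []})

def mark_source(name, attrs):
    """A geolocation field (e.g. "Latitude") was visited; remember its attributes."""
    _entry(name)["attrs"].update(attrs)

def add_target(source, scale_name):
    """A data field caused a dimensional scale (e.g. "Latitude_0:2_0:2")
    to be generated for `source`; remember it as a copy target."""
    _entry(source)["targets"].append(scale_name)

def finalize(datasets):
    """Phase 2: copy every accumulated source attribute to every target.
    Runs once, after traversal, so traversal order never matters."""
    for source, info in pending.items():
        for target in info["targets"]:
            datasets.setdefault(target, {}).update(info["attrs"])

# Order-independence: the target may be recorded before the source's
# attributes are known, as the commit message notes.
add_target("Latitude", "Latitude_0:2_0:2")
mark_source("Latitude", {"long_name": "latitude", "unit": "degrees_north"})

datasets = {}
finalize(datasets)
print(datasets["Latitude_0:2_0:2"])
```

Deferring the copy to a single finalize pass is what removes the dependence on traversal order: by the time `finalize` runs, every `< source, [ target... ] >` pair is complete regardless of which side was encountered first.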