hdf5_der (Dana Robinson) — public repository

Commits

Commit 77dfc1aa37f
John Mainzer committed 4aa32be53a5 on 09 Aug 2016
[svn-r30274] Interim checkin of work directed at cleaning up the serialization / file
close issue in which the contents of persistent free space manager metadata
can change after the initial flush of the file on file close.

This code largely resolves the issue on file close, but does not attempt
to resolve the serialization issue.
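
For context, a minimal sketch of the scenario in question, assuming the
H5Pset_file_space_strategy() call from the HDF5 1.10 series (this
development branch may expose the property differently): create a file
whose free space managers persist in the file, then close it, which
drives the flush-on-close path this checkin targets.  The file name and
values below are made up for illustration.

    /* Minimal sketch, not from this checkin: persistent free space
     * managers plus a file close.  Error checking omitted for brevity. */
    #include "hdf5.h"

    int main(void)
    {
        hid_t fcpl = H5Pcreate(H5P_FILE_CREATE);

        /* Persist the free space managers in the file, tracking
         * free space in sections of 1 byte or larger. */
        H5Pset_file_space_strategy(fcpl, H5F_FSPACE_STRATEGY_FSM_AGGR,
                                   1 /* persist */, 1 /* threshold */);

        hid_t file = H5Fcreate("fsm_test.h5", H5F_ACC_TRUNC, fcpl,
                               H5P_DEFAULT);

        /* ... create and delete objects so the FSMs have content ... */

        /* Close flushes and settles the persistent FSMs; this is where
         * their metadata could change after the initial flush. */
        H5Fclose(file);
        H5Pclose(fcpl);
        return 0;
    }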

The code has the following known issues:

1) LOTS of debugging code / comments / old code still in place but 
   commented out.  This must be cleaned up before final checkin.

2) The existing code does not handle the case in which the file
   has an alignment other than 1 (see the sketch after this list).

3) Current code does not settle persistent free space managers
   correctly if the file is flushed before file close (this
   flush-then-close sequence also appears in the sketch after
   this list).

4) An odd h5ls bug appears on at least one system in the parallel
   build and test.  The bug can be turned on and off by switching
   between the following configure calls on my test system.  This
   configure call causes the bug to appear:

    /mnt/home0/mainzer/bench/cache_image/configure \
        --enable-funcstack \
        --enable-build-mode=debug \
        --disable-shared \
        --enable-parallel \
        --prefix=/mnt/home0/mainzer/bench/cache_image/install

   This configure call makes it go away:

    /mnt/home0/mainzer/bench/cache_image/configure \
        --enable-funcstack \
        --enable-build-mode=debug \
        --disable-shared \
        --enable-parallel \
        --with-szlib=/opt/szip \
        --with-zlib=/opt/zlib \
        --prefix=/mnt/home0/mainzer/bench/cache_image/install

   Note that the only difference is the inclusion of the --with-szlib and
   --with-zlib lines.  Unfortunately, the bug does not appear on Jelly
   with either configure call (suitably modified for the environment).

   Output from the bug when it appears is as follows:

=============================================================================
Testing h5ls -v tvldtypes1.h5                                          PASSED
Testing h5ls -v tdatareg.h5                                            PASSED
Testing h5ls -w80 -d tdset_idx.h5                                     *FAILED*
    Expected result differs from actual result
    *** ./testfiles/tdset_idx.ls        2016-08-09 02:40:32.765819183 -0500
    --- ./testfiles/tdset_idx.out       2016-08-09 02:40:35.069819210 -0500
    ***************
    *** 12,27 ****
              (19,8) 8, 9
      dset_filter              Dataset {20, 10}
          Data:
    !         (0,0) 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1,
    !         (2,2) 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3,
    !         (4,4) 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5,
    !         (6,6) 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7,
    !         (8,8) 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
    !         (11,0) 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1,
    !         (13,2) 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3,
    !         (15,4) 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5,
    !         (17,6) 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7,
    !         (19,8) 8, 9
      dset_fixed               Dataset {20, 10}
          Data:
              (0,0) 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1,
    --- 12,18 ----
              (19,8) 8, 9
      dset_filter              Dataset {20, 10}
          Data:
    !         (0,0)         Unable to print data.
      dset_fixed               Dataset {20, 10}
          Data:
              (0,0) 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1,
    ***************
    *** 34,36 ****
    --- 25,31 ----
              (15,4) 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5,
              (17,6) 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7,
              (19,8) 8, 9
    + H5tools-DIAG: Error detected in HDF5:tools (1.9.234) thread 0:
    +   #000: h5tools_dump.c line 1633 in h5tools_dump_simple_dset(): H5Dread failed
    +     major: Failure in tools library
    +     minor: error in function
h5ls tests failed with 1 errors.
Command exited with non-zero status 1
0.68user 0.42system 0:03.14elapsed 35%CPU (0avgtext+0avgdata 8856maxresident)k
0inputs+6152outputs (0major+743247minor)pagefaults 0swaps

Finished testing testh5ls.sh
=============================================================================

   Normal output is as follows:

=============================================================================
Testing h5ls -v tvldtypes1.h5                                          PASSED
Testing h5ls -v tdatareg.h5                                            PASSED
Testing h5ls -w80 -d tdset_idx.h5                                      PASSED
All h5ls tests passed.
0.66user 0.43system 0:03.10elapsed 35%CPU (0avgtext+0avgdata 8872maxresident)k
0inputs+6096outputs (0major+742118minor)pagefaults 0swaps

Finished testing testh5ls.sh
=============================================================================
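
As a hypothetical illustration of the two untested paths flagged in
known issues 2 and 3 above, the following sketch sets a file alignment
other than 1 and flushes before close.  H5Pset_alignment() and
H5Fflush() are stock HDF5 API; the file name and values are made up,
and the fcpl from the earlier sketch would be needed for the free
space managers to actually persist.

    /* Sketch of the paths from known issues 2 and 3: alignment != 1
     * and an explicit flush before close.  Values are arbitrary. */
    #include "hdf5.h"

    int main(void)
    {
        hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

        /* Issue 2: align all allocations of 1 byte or more on
         * 4 KiB boundaries (i.e., an alignment other than 1). */
        H5Pset_alignment(fapl, 1 /* threshold */, 4096 /* alignment */);

        hid_t file = H5Fcreate("align_test.h5", H5F_ACC_TRUNC,
                               H5P_DEFAULT, fapl);

        /* Issue 3: flush before close -- the path on which the
         * persistent FSMs are not yet settled correctly. */
        H5Fflush(file, H5F_SCOPE_GLOBAL);

        H5Fclose(file);
        H5Pclose(fapl);
        return 0;
    }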

Tested on Jelly   (serial/debug, serial/debug/check-vfd, parallel/debug)
          Mercury (serial/debug, serial/debug/check-vfd, parallel/debug)

All tests pass with the exception of the above-mentioned h5ls test failure
on Mercury in the parallel/debug build with the first configure call above.

