diff -pruN 1.6.1-1/CHANGELOG.rst 1.6.4-1/CHANGELOG.rst
--- 1.6.1-1/CHANGELOG.rst	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/CHANGELOG.rst	2025-08-05 06:26:40.000000000 +0000
@@ -1,50 +1,77 @@
 Change Log
 ----------
 
+Version 1.6.4 (August 5th, 2025):
+
+- Cleanup: pyupgrade --py39-plus
+  By `Kurt Schwehr <https://github.com/schwehr>`_
+- Add better error messages when operating on a closed file (:issue:`274`, :pull:`275`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+
+Version 1.6.3 (June 30th, 2025):
+
+- fix invalid string format specifier, match raises/warns with messages in test suite,
+  remove tests for h5py < 3.7, fix sphinx issue and pr roles in CHANGELOG.rst (:issue:`269`, :pull:`270`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+
+Version 1.6.2 (June 26th, 2025):
+
+- Codespell fixes (:pull:`261`).
+  By `Kurt Schwehr <https://github.com/schwehr>`_
+- Fix hsds/h5pyd test fixture spinup issues (:pull:`265`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+- Fix and add circular referrer tests for Python 3.14 and update CI matrix (:pull:`264`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+- Avoid opening h5pyd file to check if there is a preexisting file,
+  instead remap mode "a" -> "r+", resort to "w" if file doesn't exist (:issue:`262`, :pull:`266`).
+  By `Jonas Grönberg <https://github.com/JonasGronberg>`_ and `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+- Reduce CI time by installing available scientific-python-nightly-wheels and using pip cache (:pull:`267`).
+  By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
+
 Version 1.6.1 (March 7th, 2025):
 
 - Let Variable.chunks return None for scalar variables, independent of what the underlying
-  h5ds object returns ({pull}`259`).
+  h5ds object returns (:pull:`259`).
   By `Rickard Holmberg <https://github.com/rho-novatron>`_
 
 Version 1.6.0 (March 7th, 2025):
 
-- Allow specifying `h5netcdf.File(driver="h5pyd")` to force the use of h5pyd ({issue}`255`, {pull}`256`).
+- Allow specifying `h5netcdf.File(driver="h5pyd")` to force the use of h5pyd (:issue:`255`, :pull:`256`).
   By `Rickard Holmberg <https://github.com/rho-novatron>`_
-- Add pytest-mypy-plugins for xarray nightly test ({pull}`257`).
+- Add pytest-mypy-plugins for xarray nightly test (:pull:`257`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 Version 1.5.0 (January 26th, 2025):
 
-- Update CI to new versions (Python 3.13, 3.14 alpha), remove numpy 1 from h5pyd runs ({pull}`250`).
+- Update CI to new versions (Python 3.13, 3.14 alpha), remove numpy 1 from h5pyd runs (:pull:`250`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Update CI and reinstate h5pyd/hsds test runs ({pull}`247`).
+- Update CI and reinstate h5pyd/hsds test runs (:pull:`247`).
   By `John Readey  <https://github.com/jreadey>`_
 - Allow ``zlib`` to be used as an alias for ``gzip`` for enhanced compatibility with h5netcdf's API and xarray.
   By `Mark Harfouche <https://github.com/hmaarrfk>`_
 
 Version 1.4.1 (November 13th, 2024):
 
-- Add CI run for hdf5 1.10.6, fix complex tests, fix enum/user type tests ({pull}`244`).
+- Add CI run for hdf5 1.10.6, fix complex tests, fix enum/user type tests (:pull:`244`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 
 Version 1.4.0 (October 7th, 2024):
 
-- Add UserType class, add EnumType ({pull}`229`).
+- Add UserType class, add EnumType (:pull:`229`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Refactor fillvalue and dtype handling for user types, enhance sanity checks and tests ({pull}`230`).
+- Refactor fillvalue and dtype handling for user types, enhance sanity checks and tests (:pull:`230`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- Add VLType and CompoundType, commit complex compound type to file. Align with nc-complex ({pull}`227`).
+- Add VLType and CompoundType, commit complex compound type to file. Align with nc-complex (:pull:`227`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 - Update h5pyd testing.
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
-- CI and lint maintenance ({pull}`235`).
+- CI and lint maintenance (:pull:`235`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 - Support wrapping an h5py ``File`` object. Closing the h5netcdf file object
-  does not close the h5py file ({pull}`238`).
+  does not close the h5py file (:pull:`238`).
   By `Thomas Kluyver <https://github.com/takluyver>`_
-- CI and lint maintenance (format README.rst, use more f-strings, change Python 3.9 to 3.10 in CI) ({pull}`239`).
+- CI and lint maintenance (format README.rst, use more f-strings, change Python 3.9 to 3.10 in CI) (:pull:`239`).
   By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_
 
 Version 1.3.0 (November 7th, 2023):
diff -pruN 1.6.1-1/PKG-INFO 1.6.4-1/PKG-INFO
--- 1.6.1-1/PKG-INFO	2025-03-07 14:51:19.708416700 +0000
+++ 1.6.4-1/PKG-INFO	2025-08-05 06:26:52.712796200 +0000
@@ -1,6 +1,6 @@
-Metadata-Version: 2.2
+Metadata-Version: 2.4
 Name: h5netcdf
-Version: 1.6.1
+Version: 1.6.4
 Summary: netCDF4 via h5py
 Author-email: Stephan Hoyer <shoyer@gmail.com>, Kai Mühlbauer <kmuehlbauer@wradlib.org>
 Maintainer-email: h5netcdf developers <devteam@h5netcdf.org>
@@ -56,6 +56,7 @@ Requires-Dist: packaging
 Provides-Extra: test
 Requires-Dist: netCDF4; extra == "test"
 Requires-Dist: pytest; extra == "test"
+Dynamic: license-file
 
 h5netcdf
 ========
@@ -318,7 +319,7 @@ The following describes the behavior of
 for a few key versions:
 
 - Version 0.12.0 and earlier, the ``track_order`` parameter`order was missing
-  and thus order tracking was implicitely set to ``False``.
+  and thus order tracking was implicitly set to ``False``.
 - Version 0.13.0 enabled order tracking by setting the parameter
   ``track_order`` to ``True`` by default without deprecation.
 - Versions 0.13.1 to 1.0.2 set ``track_order`` to ``False`` due to a bug in a
diff -pruN 1.6.1-1/README.rst 1.6.4-1/README.rst
--- 1.6.1-1/README.rst	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/README.rst	2025-08-05 06:26:40.000000000 +0000
@@ -259,7 +259,7 @@ The following describes the behavior of
 for a few key versions:
 
 - Version 0.12.0 and earlier, the ``track_order`` parameter`order was missing
-  and thus order tracking was implicitely set to ``False``.
+  and thus order tracking was implicitly set to ``False``.
 - Version 0.13.0 enabled order tracking by setting the parameter
   ``track_order`` to ``True`` by default without deprecation.
 - Versions 0.13.1 to 1.0.2 set ``track_order`` to ``False`` due to a bug in a
diff -pruN 1.6.1-1/debian/changelog 1.6.4-1/debian/changelog
--- 1.6.1-1/debian/changelog	2025-04-04 10:06:33.000000000 +0000
+++ 1.6.4-1/debian/changelog	2025-08-27 23:30:22.000000000 +0000
@@ -1,3 +1,10 @@
+python-h5netcdf (1.6.4-1) unstable; urgency=medium
+
+  * Team upload.
+  * New upstream release
+
+ -- Drew Parsons <dparsons@debian.org>  Thu, 28 Aug 2025 01:30:22 +0200
+
 python-h5netcdf (1.6.1-1) unstable; urgency=medium
 
   * Team upload.
diff -pruN 1.6.1-1/doc/conf.py 1.6.4-1/doc/conf.py
--- 1.6.1-1/doc/conf.py	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/doc/conf.py	2025-08-05 06:26:40.000000000 +0000
@@ -117,7 +117,7 @@ napoleon_type_aliases = {
     "Path": "~~pathlib.Path",
 }
 
-# handle release substition
+# handle release substitution
 url = "https://github.com/h5netcdf"
 
 # get version
diff -pruN 1.6.1-1/doc/devguide.rst 1.6.4-1/doc/devguide.rst
--- 1.6.1-1/doc/devguide.rst	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/doc/devguide.rst	2025-08-05 06:26:40.000000000 +0000
@@ -19,6 +19,8 @@ Contributors
 - `Frédéric Laliberté <https://github.com/laliberte>`_
 - `Ghislain Vaillant <https://github.com/ghisvail>`_
 - `John Readey <https://github.com/jreadey>`_
+- `Jonas Grönberg <https://github.com/JonasGronberg>`_
+- `Kurt Schwehr <https://github.com/schwehr>`_
 - `Lion Krischer <https://github.com/krischer>`_
 - `Mark Harfouche <https://github.com/hmaarrfk>`_
 - `Martin Raspaud <https://github.com/mraspaud>`_
diff -pruN 1.6.1-1/doc/index.rst 1.6.4-1/doc/index.rst
--- 1.6.1-1/doc/index.rst	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/doc/index.rst	2025-08-05 06:26:40.000000000 +0000
@@ -50,7 +50,7 @@ by Stephan Hoyer. The first `official` `
 `xarray issue tracker`_ only one day later.
 
 The library evolved constantly over the years (fixing bugs and adding enhancements)
-and gained contributions from 19 other :ref:`contributors` so far. The library is widely used,
+and gained contributions from 21 other :ref:`contributors` so far. The library is widely used,
 especially as backend within `xarray`_.
 
 Early 2020 Kai Mühlbauer started to add contributions and after some time he volunteered
diff -pruN 1.6.1-1/h5netcdf/_version.py 1.6.4-1/h5netcdf/_version.py
--- 1.6.1-1/h5netcdf/_version.py	2025-03-07 14:51:19.000000000 +0000
+++ 1.6.4-1/h5netcdf/_version.py	2025-08-05 06:26:52.000000000 +0000
@@ -17,5 +17,5 @@ __version__: str
 __version_tuple__: VERSION_TUPLE
 version_tuple: VERSION_TUPLE
 
-__version__ = version = '1.6.1'
-__version_tuple__ = version_tuple = (1, 6, 1)
+__version__ = version = '1.6.4'
+__version_tuple__ = version_tuple = (1, 6, 4)
diff -pruN 1.6.1-1/h5netcdf/core.py 1.6.4-1/h5netcdf/core.py
--- 1.6.1-1/h5netcdf/core.py	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/h5netcdf/core.py	2025-08-05 06:26:40.000000000 +0000
@@ -677,7 +677,7 @@ def _unlabeled_dimension_mix(h5py_datase
     if not dimlist:
         status = "nodim"
     else:
-        dimset = set([len(j) for j in dimlist])
+        dimset = {len(j) for j in dimlist}
         # either all dimensions have exactly one scale
         # or all dimensions have no scale
         if dimset ^ {0} == set():
@@ -788,7 +788,7 @@ def _check_fillvalue(group, fillvalue, d
             # 1. we need to warn the user that writing enums with default values
             # which are defined in the enum dict will mask those values
             if (h5fillvalue or 0) in dtype.enum_dict.values():
-                reverse = dict((v, k) for k, v in dtype.enum_dict.items())
+                reverse = {v: k for k, v in dtype.enum_dict.items()}
                 msg = (
                     f"Creating variable with default fill_value {h5fillvalue or 0!r}"
                     f" which IS defined in enum type {dtype!r}."
@@ -982,16 +982,16 @@ class Group(Mapping):
         for k, v in self._all_dimensions.maps[0].items():
             if k in value:
                 if v != value[k]:
-                    raise ValueError(f"cannot modify existing dimension {k:!r}")
+                    raise ValueError(f"cannot modify existing dimension {k!r}")
             else:
                 raise ValueError(
-                    f"new dimensions do not include existing dimension {k:!r}"
+                    f"new dimensions do not include existing dimension {k!r}"
                 )
         self._dimensions.update(value)
 
     def _create_child_group(self, name):
         if name in self:
-            raise ValueError(f"unable to create group {name:!r} (name already exists)")
+            raise ValueError(f"unable to create group {name!r} (name already exists)")
         kwargs = {}
         kwargs.update(track_order=self._track_order)
 
@@ -1035,7 +1035,7 @@ class Group(Mapping):
     ):
         if name in self:
             raise ValueError(
-                f"unable to create variable {name:!r} (name already exists)"
+                f"unable to create variable {name!r} (name already exists)"
             )
         if data is not None:
             data = np.asarray(data)
@@ -1271,10 +1271,8 @@ class Group(Mapping):
         return item
 
     def __iter__(self):
-        for name in self.groups:
-            yield name
-        for name in self.variables:
-            yield name
+        yield from self.groups
+        yield from self.variables
 
     def __len__(self):
         return len(self.variables) + len(self.groups)
@@ -1519,32 +1517,51 @@ class File(Group):
                             "No module named 'h5pyd'. h5pyd is required for "
                             f"opening urls: {path}"
                         )
+                    self._preexisting_file = mode in {"r", "r+", "a"}
+                    # remap "a" -> "r+" to check file existence
+                    # fallback to "w" if not
+                    _mode = mode
+                    if mode == "a":
+                        mode = "r+"
+                    self._h5py = h5pyd
                     try:
-                        with h5pyd.File(path, "r", **kwargs) as f:  # noqa
-                            pass
-                        self._preexisting_file = True
+                        self.__h5file = self._h5py.File(
+                            path, mode, track_order=track_order, **kwargs
+                        )
+                        self._preexisting_file = mode != "w"
                     except OSError:
-                        self._preexisting_file = False
-                    self._h5py = h5pyd
-                    self._h5file = self._h5py.File(
-                        path, mode, track_order=track_order, **kwargs
-                    )
+                        # if file does not exist, create it
+                        if _mode == "a":
+                            mode = "w"
+                            self.__h5file = self._h5py.File(
+                                path, mode, track_order=track_order, **kwargs
+                            )
+                            self._preexisting_file = False
+                            msg = (
+                                "Append mode for h5pyd now probes with 'r+' first and "
+                                "only falls back to 'w' if the file is missing.\n"
+                                "To silence this warning use 'r+' (open-existing) or 'w' "
+                                "(create-new) directly."
+                            )
+                            warnings.warn(msg, UserWarning, stacklevel=2)
+                        else:
+                            raise
                 else:
                     self._preexisting_file = os.path.exists(path) and mode != "w"
                     self._h5py = h5py
-                    self._h5file = self._h5py.File(
+                    self.__h5file = self._h5py.File(
                         path, mode, track_order=track_order, **kwargs
                     )
             elif isinstance(path, h5py.File):
                 self._preexisting_file = mode in {"r", "r+", "a"}
                 self._h5py = h5py
-                self._h5file = path
+                self.__h5file = path
                 # h5py File passed in: let the caller decide when to close it
                 self._close_h5file = False
             else:  # file-like object
                 self._preexisting_file = mode in {"r", "r+", "a"}
                 self._h5py = h5py
-                self._h5file = self._h5py.File(
+                self.__h5file = self._h5py.File(
                     path, mode, track_order=track_order, **kwargs
                 )
         except Exception:
@@ -1553,6 +1570,7 @@ class File(Group):
         else:
             self._closed = False
 
+        self._filename = self._h5file.filename
         self._mode = mode
         self._writable = mode != "r"
         self._root_ref = weakref.ref(self)
@@ -1677,12 +1695,18 @@ class File(Group):
 
     sync = flush
 
+    @property
+    def _h5file(self):
+        if self._closed:
+            raise ValueError(f"I/O operation on {self}: {self._filename!r}")
+        return self.__h5file
+
     def close(self):
         if not self._closed:
             self.flush()
             if self._close_h5file:
                 self._h5file.close()
-            self._h5file = None
+            self.__h5file = None
             self._closed = True
 
     __del__ = close
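The core.py hunk above implements the better closed-file errors from the 1.6.4 changelog entry (issue 274): the handle moves behind a name-mangled attribute and a property raises a descriptive ValueError once the file is closed, rather than an opaque failure on a None handle. A minimal standalone sketch of the pattern, with `DemoFile` and `_Handle` as hypothetical stand-ins (the real message also embeds the repr of the File object):

```python
# Closed-file guard via a property over a name-mangled handle (sketch).

class _Handle:
    filename = "testfile.nc"
    def close(self):
        pass

class DemoFile:
    def __init__(self):
        self.__h5file = _Handle()            # real handle, name-mangled
        self._filename = self.__h5file.filename
        self._closed = False

    @property
    def _h5file(self):
        # Every I/O path goes through this property, so use-after-close
        # fails loudly with the filename instead of AttributeError on None.
        if self._closed:
            raise ValueError(f"I/O operation on closed file: {self._filename!r}")
        return self.__h5file

    def close(self):
        if not self._closed:
            self._h5file.close()
            self.__h5file = None
            self._closed = True

f = DemoFile()
f.close()
# any further access to f._h5file now raises
# ValueError: I/O operation on closed file: 'testfile.nc'
```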
diff -pruN 1.6.1-1/h5netcdf/dimensions.py 1.6.4-1/h5netcdf/dimensions.py
--- 1.6.1-1/h5netcdf/dimensions.py	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/h5netcdf/dimensions.py	2025-08-05 06:26:40.000000000 +0000
@@ -22,7 +22,7 @@ class Dimensions(MutableMapping):
         if not self._group._root._writable:
             raise RuntimeError("H5NetCDF: Write to read only")
         if name in self._objects:
-            raise ValueError(f"dimension {name:!r} already exists")
+            raise ValueError(f"dimension {name!r} already exists")
 
         self._objects[name] = Dimension(self._group, name, size, create_h5ds=True)
 
diff -pruN 1.6.1-1/h5netcdf/tests/conftest.py 1.6.4-1/h5netcdf/tests/conftest.py
--- 1.6.1-1/h5netcdf/tests/conftest.py	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/h5netcdf/tests/conftest.py	2025-08-05 06:26:40.000000000 +0000
@@ -1,5 +1,6 @@
 import os
 import tempfile
+import time
 from pathlib import Path
 from shutil import rmtree
 
@@ -16,50 +17,67 @@ except ImportError:
 
 @pytest.fixture(scope="session")
 def hsds_up():
-    """Provide HDF Highly Scalabale Data Service (HSDS) for h5pyd testing."""
-    if with_reqd_pkgs:
-        root_dir = Path(tempfile.mkdtemp(prefix="tmp-hsds-root-"))
-        bucket_name = "pytest"
-        os.environ["BUCKET_NAME"] = bucket_name
-        os.mkdir(
-            f"{root_dir}/{bucket_name}"
-        )  # need to create a directory for our bucket
-
-        hs_username = "h5netcdf-pytest"
-        hs_password = "TestEarlyTestEverything"
-
-        kwargs = {}
-        kwargs["username"] = hs_username
-        kwargs["password"] = hs_password
-        kwargs["root_dir"] = str(root_dir)
-        kwargs["logfile"] = f"{root_dir}/hsds.log"
-        kwargs["log_level"] = "DEBUG"
-        kwargs["host"] = "localhost"
-        kwargs["sn_port"] = 5101
+    """Provide HDF Highly Scalable Data Service (HSDS) for h5pyd testing."""
+    if not with_reqd_pkgs:
+        pytest.skip("Required packages h5pyd and hsds not available")
+
+    root_dir = Path(tempfile.mkdtemp(prefix="tmp-hsds-root-"))
+    bucket_name = "pytest"
+    os.environ["BUCKET_NAME"] = bucket_name
+    # need to create a directory for our bucket
+    (root_dir / bucket_name).mkdir()
+
+    kwargs = {
+        "username": "h5netcdf-pytest",
+        "password": "TestEarlyTestEverything",
+        "root_dir": str(root_dir),
+        "logfile": str(root_dir / "hsds.log"),
+        "log_level": "DEBUG",
+        "host": "localhost",
+        "sn_port": 5101,
+    }
+
+    os.environ.update(
+        {
+            "BUCKET_NAME": bucket_name,
+            "HS_USERNAME": kwargs["username"],
+            "HS_PASSWORD": kwargs["password"],
+            "HS_USE_HTTPS": "False",
+        }
+    )
+
+    hsds = HsdsApp(**kwargs)
+
+    try:
+        hsds.run()
+        timeout = time.time() + 60
+        while not hsds.ready:
+            if time.time() > timeout:
+                raise TimeoutError("HSDS server did not become ready in time")
+            time.sleep(1)
+
+        os.environ["HS_ENDPOINT"] = hsds.endpoint
+        # make folders expected by pytest
+        Folder("/home/", mode="w")
+        Folder("/home/h5netcdf-pytest/", mode="w")
+
+        yield True
+
+    except Exception as err:
+        log_path = kwargs["logfile"]
+        if os.path.exists(log_path):
+            with open(log_path) as f:
+                print("\n=== HSDS Log ===")
+                print(f.read())
+        else:
+            print(f"HSDS log not found at: {log_path}")
+        raise err
 
+    finally:
         try:
-            hsds = HsdsApp(**kwargs)
-
-            hsds.run()
-            is_up = hsds.ready
-
-            if is_up:
-                os.environ["HS_ENDPOINT"] = hsds.endpoint
-                os.environ["HS_USERNAME"] = hs_username
-                os.environ["HS_PASSWORD"] = hs_password
-                # make folders expected by pytest
-                # pytest/home/h5netcdf-pytest
-                # Folder("/pytest/", mode='w')
-                Folder("/home/", mode="w")
-                Folder("/home/h5netcdf-pytest/", mode="w")
+            hsds.check_processes()
+            hsds.stop()
         except Exception:
-            is_up = False
-
-        yield is_up
-        hsds.check_processes()  # this will capture hsds log output
-        hsds.stop()
-
-        rmtree(root_dir, ignore_errors=True)
+            pass
 
-    else:
-        yield False
+    rmtree(root_dir, ignore_errors=True)
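The conftest.py rewrite above replaces a single `hsds.ready` check with a poll-until-deadline loop before yielding the fixture. The pattern can be sketched generically as follows; `FakeService` and `wait_until_ready` are hypothetical stand-ins for `HsdsApp` and the inline loop:

```python
# Poll a service's ready flag with a hard deadline (sketch of the
# fixture's spin-up loop; names here are illustrative only).
import time

class FakeService:
    """Stand-in that becomes ready a short while after run()."""
    def __init__(self, ready_after=0.1):
        self._t0 = None
        self._ready_after = ready_after
    def run(self):
        self._t0 = time.monotonic()
    @property
    def ready(self):
        return time.monotonic() - self._t0 >= self._ready_after

def wait_until_ready(service, timeout=60.0, interval=0.05):
    service.run()
    deadline = time.monotonic() + timeout
    while not service.ready:
        if time.monotonic() > deadline:
            raise TimeoutError("service did not become ready in time")
        time.sleep(interval)
    return True

wait_until_ready(FakeService(), timeout=5.0)   # returns once ready
```

Raising TimeoutError (rather than yielding False) lets the fixture's `except` branch dump the HSDS log before the failure propagates, which is exactly the structure of the rewritten fixture.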
diff -pruN 1.6.1-1/h5netcdf/tests/test_h5netcdf.py 1.6.4-1/h5netcdf/tests/test_h5netcdf.py
--- 1.6.1-1/h5netcdf/tests/test_h5netcdf.py	2025-03-07 14:51:10.000000000 +0000
+++ 1.6.4-1/h5netcdf/tests/test_h5netcdf.py	2025-08-05 06:26:40.000000000 +0000
@@ -3,7 +3,9 @@ import io
 import random
 import re
 import string
+import sys
 import tempfile
+import weakref
 from os import environ as env
 
 import h5py
@@ -11,7 +13,7 @@ import netCDF4
 import numpy as np
 import pytest
 from packaging import version
-from pytest import raises
+from pytest import raises, warns
 
 import h5netcdf
 from h5netcdf import legacyapi
@@ -30,6 +32,7 @@ except ImportError:
 
 
 remote_h5 = ("http:", "hdf5:")
+python_version = version.parse(".".join(map(str, sys.version_info[:3])))
 
 
 @pytest.fixture
@@ -37,26 +40,30 @@ def tmp_local_netcdf(tmpdir):
     return str(tmpdir.join("testfile.nc"))
 
 
+@pytest.fixture()
+def setup_h5pyd_config(hsds_up):
+    env["HS_ENDPOINT"] = "http://127.0.0.1:5101"
+    env["HS_USERNAME"] = "h5netcdf-pytest"
+    env["HS_PASSWORD"] = "TestEarlyTestEverything"
+    env["HS_USE_HTTPS"] = "False"
+
+
 @pytest.fixture(params=["testfile.nc", "hdf5://testfile"])
-def tmp_local_or_remote_netcdf(request, tmpdir, hsds_up):
-    if request.param.startswith(remote_h5):
-        if without_h5pyd:
-            pytest.skip("h5pyd package not available")
-        elif not hsds_up:
-            pytest.skip("HSDS service not running")
-        rnd = "".join(random.choice(string.ascii_uppercase) for _ in range(5))
-        return (
-            "hdf5://"
-            + "home"
-            + "/"
-            + env["HS_USERNAME"]
-            + "/"
-            + "testfile"
-            + rnd
-            + ".nc"
-        )
+def tmp_local_or_remote_netcdf(request, tmpdir):
+    param = request.param
+    if param.startswith(remote_h5):
+        try:
+            hsds_up = request.getfixturevalue("hsds_up")
+        except pytest.skip.Exception:
+            pytest.skip("HSDS not available")
+
+        if not hsds_up:
+            pytest.skip("HSDS fixture returned False (not running)")
+
+        rnd = "".join(random.choices(string.ascii_uppercase, k=5))
+        return f"hdf5://home/{env['HS_USERNAME']}/testfile{rnd}.nc"
     else:
-        return str(tmpdir.join(request.param))
+        return str(tmpdir.join(param))
 
 
 @pytest.fixture(params=[True, False])
@@ -157,7 +164,10 @@ def write_legacy_netcdf(tmp_netcdf, writ
     v = ds.createVariable("foo_unlimited", float, ("x", "unlimited"))
     v[...] = 1
 
-    with raises((h5netcdf.CompatibilityError, TypeError)):
+    with raises(
+        (h5netcdf.CompatibilityError, TypeError),
+        match=r"(?i)(boolean dtypes are not a supported NetCDF feature|illegal primitive data type)",
+    ):
         ds.createVariable("boolean", np.bool_, ("x"))
 
     g = ds.createGroup("subgroup")
@@ -250,28 +260,32 @@ def read_legacy_netcdf(tmp_netcdf, read_
     if write_module is not netCDF4:
         # skip for now: https://github.com/Unidata/netcdf4-python/issues/388
         assert ds.other_attr == "yes"
-    with pytest.raises(AttributeError):
+    with raises(AttributeError, match="not found"):
         ds.does_not_exist
-    assert set(ds.dimensions) == set(
-        ["x", "y", "z", "empty", "string3", "mismatched_dim", "unlimited"]
-    )
-    assert set(ds.variables) == set(
-        [
-            "enum_var",
-            "foo",
-            "y",
-            "z",
-            "intscalar",
-            "scalar",
-            "var_len_str",
-            "mismatched_dim",
-            "foo_unlimited",
-        ]
-    )
+    assert set(ds.dimensions) == {
+        "x",
+        "y",
+        "z",
+        "empty",
+        "string3",
+        "mismatched_dim",
+        "unlimited",
+    }
+    assert set(ds.variables) == {
+        "enum_var",
+        "foo",
+        "y",
+        "z",
+        "intscalar",
+        "scalar",
+        "var_len_str",
+        "mismatched_dim",
+        "foo_unlimited",
+    }
 
-    assert set(ds.enumtypes) == set(["enum_t"])
+    assert set(ds.enumtypes) == {"enum_t"}
 
-    assert set(ds.groups) == set(["subgroup"])
+    assert set(ds.groups) == {"subgroup"}
     assert ds.parent is None
     v = ds.variables["foo"]
     assert array_equal(v, np.ones((4, 5)))
@@ -372,27 +386,31 @@ def read_h5netcdf(tmp_netcdf, write_modu
     if write_module is not netCDF4:
         # skip for now: https://github.com/Unidata/netcdf4-python/issues/388
         assert ds.attrs["other_attr"] == "yes"
-    assert set(ds.dimensions) == set(
-        ["x", "y", "z", "empty", "string3", "mismatched_dim", "unlimited"]
-    )
-    variables = set(
-        [
-            "enum_var",
-            "foo",
-            "z",
-            "intscalar",
-            "scalar",
-            "var_len_str",
-            "mismatched_dim",
-            "foo_unlimited",
-        ]
-    )
+    assert set(ds.dimensions) == {
+        "x",
+        "y",
+        "z",
+        "empty",
+        "string3",
+        "mismatched_dim",
+        "unlimited",
+    }
+    variables = {
+        "enum_var",
+        "foo",
+        "z",
+        "intscalar",
+        "scalar",
+        "var_len_str",
+        "mismatched_dim",
+        "foo_unlimited",
+    }
     # fix current failure of hsds/h5pyd
     if not remote_file:
-        variables |= set(["y"])
+        variables |= {"y"}
     assert set(ds.variables) == variables
 
-    assert set(ds.groups) == set(["subgroup"])
+    assert set(ds.groups) == {"subgroup"}
     assert ds.parent is None
 
     v = ds["foo"]
@@ -646,25 +664,27 @@ def test_optional_netcdf4_attrs(tmp_loca
 def test_error_handling(tmp_local_or_remote_netcdf):
     with h5netcdf.File(tmp_local_or_remote_netcdf, "w") as ds:
         ds.dimensions["x"] = 1
-        with raises(ValueError):
+        with raises(ValueError, match="already exists"):
             ds.dimensions["x"] = 2
-        with raises(ValueError):
+        with raises(ValueError, match="cannot modify existing dimension"):
             ds.dimensions = {"x": 2}
-        with raises(ValueError):
+        with raises(
+            ValueError, match="new dimensions do not include existing dimension"
+        ):
             ds.dimensions = {"y": 3}
         ds.create_variable("x", ("x",), dtype=float)
-        with raises(ValueError):
+        with raises(ValueError, match="unable to create variable"):
             ds.create_variable("x", ("x",), dtype=float)
-        with raises(ValueError):
+        with raises(ValueError, match="name parameter cannot be an empty string"):
             ds.create_variable("y/", ("x",), dtype=float)
         ds.create_group("subgroup")
-        with raises(ValueError):
+        with raises(ValueError, match="unable to create group"):
             ds.create_group("subgroup")
 
 
 def test_decode_string_error(tmp_local_or_remote_netcdf):
     write_h5netcdf(tmp_local_or_remote_netcdf)
-    with pytest.raises(TypeError):
+    with raises(TypeError, match="keyword argument is not allowed"):
         with h5netcdf.legacyapi.Dataset(
             tmp_local_or_remote_netcdf, "r", decode_vlen_strings=True
         ) as ds:
@@ -731,10 +751,10 @@ def test_invalid_netcdf4(tmp_local_or_re
                 check_invalid_netcdf4(var, i)
 
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
-        with raises(ValueError):
+        with raises(ValueError, match="has no dimension scale associated"):
             ds["bar"].variables["foo1"].dimensions
 
-    with raises(ValueError):
+    with raises(ValueError, match="unknown value"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r", phony_dims="srt") as ds:
             pass
 
@@ -799,7 +819,7 @@ def test_invalid_netcdf4_mixed(tmp_local
             check_invalid_netcdf4_mixed(var, 3)
 
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
-        with raises(ValueError):
+        with raises(ValueError, match="has no dimension scale associated with"):
             ds.variables["foo1"].dimensions
 
 
@@ -817,12 +837,12 @@ def test_invalid_netcdf_malformed_dimens
         f["z"].make_scale()
         f["foo1"].dims[0].attach_scale(f["x"])
 
-    with raises(ValueError):
+    with raises(ValueError, match="has mixing of labeled and unlabeled dimensions"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
             assert ds
             print(ds)
 
-    with raises(ValueError):
+    with raises(ValueError, match="has mixing of labeled and unlabeled dimensions"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "r", phony_dims="sort") as ds:
             assert ds
             print(ds)
@@ -833,13 +853,13 @@ def test_hierarchical_access_auto_create
     ds.create_variable("/foo/bar", data=1)
     g = ds.create_group("foo/baz")
     g.create_variable("/foo/hello", data=2)
-    assert set(ds) == set(["foo"])
-    assert set(ds["foo"]) == set(["bar", "baz", "hello"])
+    assert set(ds) == {"foo"}
+    assert set(ds["foo"]) == {"bar", "baz", "hello"}
     ds.close()
 
     ds = h5netcdf.File(tmp_local_or_remote_netcdf, "r")
-    assert set(ds) == set(["foo"])
-    assert set(ds["foo"]) == set(["bar", "baz", "hello"])
+    assert set(ds) == {"foo"}
+    assert set(ds["foo"]) == {"bar", "baz", "hello"}
     ds.close()
 
 
@@ -936,14 +956,17 @@ def test_invalid_netcdf_error(tmp_local_
         f.create_variable(
             "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
         )
-        with pytest.raises(h5netcdf.CompatibilityError):
+        with raises(
+            h5netcdf.CompatibilityError,
+            match="scale-offset filters are not a supported NetCDF feature",
+        ):
             f.create_variable("scaleoffset", data=[1], dimensions=("x",), scaleoffset=0)
 
 
 def test_invalid_netcdf_okay(tmp_local_or_remote_netcdf):
     if tmp_local_or_remote_netcdf.startswith(remote_h5):
         pytest.skip("h5pyd does not support NumPy complex dtype yet")
-    with pytest.warns(UserWarning, match="invalid netcdf features"):
+    with warns(UserWarning, match="invalid netcdf features"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "w", invalid_netcdf=True) as f:
             f.create_variable(
                 "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
@@ -965,7 +988,7 @@ def test_invalid_netcdf_overwrite_valid(
     # https://github.com/h5netcdf/h5netcdf/issues/165
     with netCDF4.Dataset(tmp_local_netcdf, mode="w"):
         pass
-    with pytest.warns(UserWarning):
+    with warns(UserWarning, match="You are writing invalid netcdf features"):
         with h5netcdf.File(tmp_local_netcdf, "a", invalid_netcdf=True) as f:
             f.create_variable(
                 "lzf_compressed", data=[1], dimensions=("x"), compression="lzf"
@@ -994,7 +1017,7 @@ def test_reopen_file_different_dimension
 
 
 def test_invalid_then_valid_no_ncproperties(tmp_local_or_remote_netcdf):
-    with pytest.warns(UserWarning, match="invalid netcdf features"):
+    with warns(UserWarning, match="invalid netcdf features"):
         with h5netcdf.File(tmp_local_or_remote_netcdf, "w", invalid_netcdf=True):
             pass
     with h5netcdf.File(tmp_local_or_remote_netcdf, "a"):
@@ -1012,11 +1035,8 @@ def test_creating_and_resizing_unlimited
         f.dimensions["z"] = None
         f.resize_dimension("z", 20)
 
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="is not unlimited and thus cannot be resized"):
             f.resize_dimension("y", 20)
-        assert e.value.args[0] == (
-            "Dimension 'y' is not unlimited and thus cannot be resized."
-        )
 
     h5 = get_hdf5_module(tmp_local_or_remote_netcdf)
     # Assert some behavior observed by using the C netCDF bindings.
@@ -1042,11 +1062,10 @@ def test_creating_variables_with_unlimit
 
         # Trying to create a variable while the current size of the dimension
         # is still zero will fail.
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="Shape tuple is incompatible with data"):
             f.create_variable(
                 "dummy2", data=np.array([[1, 2], [3, 4]]), dimensions=("x", "y")
             )
-        assert e.value.args[0] == "Shape tuple is incompatible with data"
 
         # Creating a coordinate variable
         f.create_variable("x", dimensions=("x",), dtype=np.int64)
@@ -1071,7 +1090,7 @@ def test_creating_variables_with_unlimit
             # We don't expect any errors. This is effectively a void context manager
             expected_errors = memoryview(b"")
         else:
-            expected_errors = pytest.raises(TypeError)
+            expected_errors = raises(TypeError, match="Can't broadcast")
         with expected_errors as e:
             f.variables["dummy3"][:] = np.ones((5, 2))
         if not tmp_local_or_remote_netcdf.startswith(remote_h5):
@@ -1108,11 +1127,10 @@ def test_writing_to_an_unlimited_dimensi
         f.dimensions["z"] = None
 
         # Cannot create it without first resizing it.
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="Shape tuple is incompatible with data"):
             f.create_variable(
                 "dummy1", data=np.array([[1, 2, 3]]), dimensions=("x", "y")
             )
-            assert e.value.args[0] == "Shape tuple is incompatible with data"
 
         # Without data.
         f.create_variable("dummy1", dimensions=("x", "y"), dtype=np.int64)
@@ -1141,7 +1159,9 @@ def test_writing_to_an_unlimited_dimensi
 
         # broadcast writing
         if tmp_local_or_remote_netcdf.startswith(remote_h5):
-            expected_errors = pytest.raises(OSError)
+            expected_errors = raises(
+                OSError, match="Got asyncio.IncompleteReadError during binary read"
+            )
         else:
             # We don't expect any errors. This is effectively a void context manager
             expected_errors = memoryview(b"")
@@ -1300,6 +1320,24 @@ def test_overwrite_existing_file(tmp_loc
         assert ds.attrs._h5attrs.get("_NCProperties", False)
 
 
+def test_overwrite_existing_remote_file(tmp_local_or_remote_netcdf):
+    # create file with legacyapi
+    with legacyapi.Dataset(tmp_local_or_remote_netcdf, "w") as ds:
+        ds.createDimension("x", 10)
+
+    # check attribute
+    with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
+        assert ds.attrs._h5attrs.get("_NCProperties", False)
+
+    # overwrite file with new api
+    with h5netcdf.File(tmp_local_or_remote_netcdf, "w") as ds:
+        ds.dimensions["x"] = 10
+
+    # check attribute
+    with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
+        assert ds.attrs._h5attrs.get("_NCProperties", False)
+
+
 def test_scales_on_append(tmp_local_netcdf):
     # create file with _NCProperties attribute
     with netCDF4.Dataset(tmp_local_netcdf, "w") as ds:
@@ -1555,7 +1593,38 @@ def test_no_circular_references(tmp_loca
         refs = gc.get_referrers(ds)
         for ref in refs:
             print(ref)
-        assert len(refs) == 1
+        if python_version >= version.parse("3.14"):
+            assert len(refs) == 0
+        else:
+            assert len(refs) == 1
+
+
+def test_no_circular_references_py314(tmp_local_or_remote_netcdf):
+    # https://github.com/h5py/h5py/issues/2019
+    with h5netcdf.File(tmp_local_or_remote_netcdf, "w") as ds:
+        ds.dimensions["x"] = 2
+        ds.dimensions["y"] = 2
+
+    # clean up everything
+    gc.collect()
+    gc.garbage.clear()
+
+    # use weakref to hold on object
+    file_ref = None
+    with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
+        file_ref = weakref.ref(ds)
+
+    # clean up
+    gc.collect()
+
+    # check garbage list
+    if file_ref() is not None:
+        print("Uncollectable object:", file_ref())
+        print("Potential GC garbage:")
+        for obj in gc.garbage:
+            print(repr(obj))
+
+    assert file_ref() is None or repr(file_ref()) == "<Closed h5netcdf.File>"
 
 
 def test_expanded_variables_netcdf4(tmp_local_netcdf, netcdf_write_module):
@@ -1693,7 +1762,7 @@ def test_track_order_specification(tmp_l
     # While netcdf4-c has historically only allowed track_order to be True
     # There doesn't seem to be a good reason for this
     # https://github.com/Unidata/netcdf-c/issues/2054 historically, h5netcdf
-    # has not specified this parameter (leaving it implicitely as False)
+    # has not specified this parameter (leaving it implicitly as False)
     # We want to make sure we allow both here
     with h5netcdf.File(tmp_local_netcdf, "w", track_order=False):
         pass
@@ -1705,10 +1774,10 @@ def test_track_order_specification(tmp_l
 # This should always work with the default file opening settings
 # https://github.com/h5netcdf/h5netcdf/issues/136#issuecomment-1017457067
 def test_more_than_7_attr_creation(tmp_local_netcdf):
-    with h5netcdf.File(tmp_local_netcdf, "w") as h5file:
+    with h5netcdf.File(tmp_local_netcdf, "w") as _h5file:
         for i in range(100):
-            h5file.attrs[f"key{i}"] = i
-            h5file.attrs[f"key{i}"] = 0
+            _h5file.attrs[f"key{i}"] = i
+            _h5file.attrs[f"key{i}"] = 0
 
 
 # Add a test that is supposed to fail in relation to issue #136
@@ -1717,18 +1786,10 @@ def test_more_than_7_attr_creation(tmp_l
 # https://github.com/h5netcdf/h5netcdf/issues/136#issuecomment-1017457067
 @pytest.mark.parametrize("track_order", [False, True])
 def test_more_than_7_attr_creation_track_order(tmp_local_netcdf, track_order):
-    h5py_version = version.parse(h5py.__version__)
-    if track_order and h5py_version < version.parse("3.7.0"):
-        expected_errors = pytest.raises(KeyError)
-    else:
-        # We don't expect any errors. This is effectively a void context manager
-        expected_errors = memoryview(b"")
-
-    with h5netcdf.File(tmp_local_netcdf, "w", track_order=track_order) as h5file:
-        with expected_errors:
-            for i in range(100):
-                h5file.attrs[f"key{i}"] = i
-                h5file.attrs[f"key{i}"] = 0
+    with h5netcdf.File(tmp_local_netcdf, "w", track_order=track_order) as _h5file:
+        for i in range(100):
+            _h5file.attrs[f"key{i}"] = i
+            _h5file.attrs[f"key{i}"] = 0
 
 
 def test_group_names(tmp_local_netcdf):
@@ -1815,18 +1876,11 @@ def test_bool_slicing_length_one_dim(tmp
         data = ds["hello"][bool_slice, :]
         np.testing.assert_equal(data, np.zeros((1, 2)))
 
-    # should raise for h5py >= 3.0.0 and h5py < 3.7.0
+    # regression test
     # https://github.com/h5py/h5py/pull/2079
     # https://github.com/h5netcdf/h5netcdf/pull/125/
     with h5netcdf.File(tmp_local_netcdf, "r") as ds:
-        h5py_version = version.parse(h5py.__version__)
-        if version.parse("3.0.0") <= h5py_version < version.parse("3.7.0"):
-            error = "Indexing arrays must have integer dtypes"
-            with pytest.raises(TypeError) as e:
-                ds["hello"][bool_slice, :]
-            assert error == str(e.value)
-        else:
-            ds["hello"][bool_slice, :]
+        ds["hello"][bool_slice, :]
 
 
 def test_fancy_indexing(tmp_local_or_remote_netcdf):
@@ -2247,38 +2301,36 @@ def test_user_type_errors_new_api(tmp_lo
             enum_type = ds.create_enumtype(np.uint8, "enum_t", enum_dict1)
 
             if tmp_local_or_remote_netcdf.startswith(remote_h5):
-                testcontext = pytest.raises(RuntimeError, match="Conflict")
+                testcontext = raises(RuntimeError, match="Conflict")
             else:
-                testcontext = pytest.raises(
-                    (KeyError, TypeError), match="name already exists"
-                )
+                testcontext = raises((KeyError, TypeError), match="name already exists")
             with testcontext:
                 ds.create_enumtype(np.uint8, "enum_t", enum_dict2)
 
             enum_type2 = g.create_enumtype(np.uint8, "enum_t2", enum_dict2)
             g.create_enumtype(np.uint8, "enum_t", enum_dict2)
-            with pytest.raises(TypeError, match="Please provide h5netcdf user type"):
+            with raises(TypeError, match="Please provide h5netcdf user type"):
                 ds.create_variable(
                     "enum_var1",
                     ("enum_dim",),
                     dtype=enum_type._h5ds,
                     fillvalue=enum_dict1["missing"],
                 )
-            with pytest.raises(TypeError, match="is not committed into current file"):
+            with raises(TypeError, match="is not committed into current file"):
                 ds.create_variable(
                     "enum_var2",
                     ("enum_dim",),
                     dtype=enum_type_ext,
                     fillvalue=enum_dict1["missing"],
                 )
-            with pytest.raises(TypeError, match="is not accessible in current group"):
+            with raises(TypeError, match="is not accessible in current group"):
                 ds.create_variable(
                     "enum_var3",
                     ("enum_dim",),
                     dtype=enum_type2,
                     fillvalue=enum_dict2["missing"],
                 )
-            with pytest.raises(TypeError, match="Another dtype with same name"):
+            with raises(TypeError, match="Another dtype with same name"):
                 g.create_variable(
                     "enum_var4",
                     ("enum_dim",),
@@ -2297,38 +2349,36 @@ def test_user_type_errors_legacyapi(tmp_
             g = ds.createGroup("subgroup")
             enum_type = ds.createEnumType(np.uint8, "enum_t", enum_dict1)
             if tmp_local_or_remote_netcdf.startswith(remote_h5):
-                testcontext = pytest.raises(RuntimeError, match="Conflict")
+                testcontext = raises(RuntimeError, match="Conflict")
             else:
-                testcontext = pytest.raises(
-                    (KeyError, TypeError), match="name already exists"
-                )
+                testcontext = raises((KeyError, TypeError), match="name already exists")
             with testcontext:
                 ds.createEnumType(np.uint8, "enum_t", enum_dict1)
 
             enum_type2 = g.createEnumType(np.uint8, "enum_t2", enum_dict2)
             g.create_enumtype(np.uint8, "enum_t", enum_dict2)
-            with pytest.raises(TypeError, match="Please provide h5netcdf user type"):
+            with raises(TypeError, match="Please provide h5netcdf user type"):
                 ds.createVariable(
                     "enum_var1",
                     enum_type._h5ds,
                     ("enum_dim",),
                     fill_value=enum_dict1["missing"],
                 )
-            with pytest.raises(TypeError, match="is not committed into current file"):
+            with raises(TypeError, match="is not committed into current file"):
                 ds.createVariable(
                     "enum_var2",
                     enum_type_ext,
                     ("enum_dim",),
                     fill_value=enum_dict1["missing"],
                 )
-            with pytest.raises(TypeError, match="is not accessible in current group"):
+            with raises(TypeError, match="is not accessible in current group"):
                 ds.createVariable(
                     "enum_var3",
                     enum_type2,
                     ("enum_dim",),
                     fill_value=enum_dict2["missing"],
                 )
-            with pytest.raises(TypeError, match="Another dtype with same name"):
+            with raises(TypeError, match="Another dtype with same name"):
                 g.createVariable(
                     "enum_var4",
                     enum_type,
@@ -2346,7 +2396,7 @@ def test_enum_type_errors_new_api(tmp_lo
         enum_type2 = ds.create_enumtype(np.uint8, "enum_t2", enum_dict2)
 
         # 1.
-        with pytest.warns(UserWarning, match="default fill_value 0 which IS defined"):
+        with warns(UserWarning, match="default fill_value 0 which IS defined"):
             ds.create_variable(
                 "enum_var1",
                 ("enum_dim",),
@@ -2354,18 +2404,14 @@ def test_enum_type_errors_new_api(tmp_lo
             )
         # 2. is for legacyapi only
         # 3.
-        with pytest.warns(
-            UserWarning, match="default fill_value 0 which IS NOT defined"
-        ):
+        with warns(UserWarning, match="default fill_value 0 which IS NOT defined"):
             ds.create_variable(
                 "enum_var2",
                 ("enum_dim",),
                 dtype=enum_type,
             )
         # 4.
-        with pytest.warns(
-            UserWarning, match="with specified fill_value 0 which IS NOT"
-        ):
+        with warns(UserWarning, match="with specified fill_value 0 which IS NOT"):
             ds.create_variable(
                 "enum_var3",
                 ("enum_dim",),
@@ -2373,9 +2419,7 @@ def test_enum_type_errors_new_api(tmp_lo
                 fillvalue=0,
             )
         # 5.
-        with pytest.raises(
-            ValueError, match="with specified fill_value 100 which IS NOT"
-        ):
+        with raises(ValueError, match="with specified fill_value 100 which IS NOT"):
             ds.create_variable(
                 "enum_var4",
                 ("enum_dim",),
@@ -2393,14 +2437,14 @@ def test_enum_type_errors_legacyapi(tmp_
         enum_type2 = ds.createEnumType(np.uint8, "enum_t2", enum_dict2)
 
         # 1.
-        with pytest.warns(UserWarning, match="default fill_value 255 which IS defined"):
+        with warns(UserWarning, match="default fill_value 255 which IS defined"):
             ds.createVariable(
                 "enum_var1",
                 enum_type2,
                 ("enum_dim",),
             )
         # 2.
-        with pytest.raises(ValueError, match="default fill_value 255 which IS NOT"):
+        with raises(ValueError, match="default fill_value 255 which IS NOT"):
             ds.createVariable(
                 "enum_var2",
                 enum_type,
@@ -2408,9 +2452,7 @@ def test_enum_type_errors_legacyapi(tmp_
             )
         # 3. is only for new api
         # 4.
-        with pytest.warns(
-            UserWarning, match="interpreted as '_UNDEFINED' by netcdf-c."
-        ):
+        with warns(UserWarning, match="interpreted as '_UNDEFINED' by netcdf-c."):
             ds.createVariable(
                 "enum_var3",
                 enum_type,
@@ -2418,9 +2460,7 @@ def test_enum_type_errors_legacyapi(tmp_
                 fill_value=0,
             )
         # 5.
-        with pytest.raises(
-            ValueError, match="with specified fill_value 100 which IS NOT"
-        ):
+        with raises(ValueError, match="with specified fill_value 100 which IS NOT"):
             ds.createVariable("enum_var4", enum_type, ("enum_dim",), fill_value=100)
 
 
@@ -2438,9 +2478,8 @@ def test_enum_type(tmp_local_or_remote_n
             "enum_var", ("enum_dim",), dtype=enum_type, fillvalue=enum_dict["missing"]
         )
         v[0:3] = [1, 2, 3]
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="assign illegal value"):
             v[3] = 5
-        assert "assign illegal value(s)" in e.value.args[0]
 
     # check, if new API can read them
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
@@ -2481,9 +2520,8 @@ def test_enum_type(tmp_local_or_remote_n
             "enum_var", enum_type, ("enum_dim",), fill_value=enum_dict["missing"]
         )
         v[0:3] = [1, 2, 3]
-        with pytest.raises(ValueError) as e:
+        with raises(ValueError, match="assign illegal value"):
             v[3] = 5
-        assert "assign illegal value(s)" in e.value.args[0]
 
     # check, if new API can read them
     with h5netcdf.File(tmp_local_or_remote_netcdf, "r") as ds:
@@ -2525,9 +2563,7 @@ def test_enum_type(tmp_local_or_remote_n
                 "enum_var", enum_type, ("enum_dim",), fill_value=enum_dict["missing"]
             )
             v[0:3] = [1, 2, 3]
-            with pytest.raises(
-                ValueError, match="assign illegal value to Enum variable"
-            ):
+            with raises(ValueError, match="assign illegal value to Enum variable"):
                 v[3] = 5
 
         # check, if new API can read them
@@ -2714,14 +2750,14 @@ def test_complex_type_creation_errors(tm
 
     with legacyapi.Dataset(tmp_local_netcdf, "w") as ds:
         ds.createDimension("x", size=len(complex_array))
-        with pytest.raises(TypeError, match="data type 'c4' not understood"):
+        with raises(TypeError, match="data type 'c4' not understood"):
             ds.createVariable("data", "c4", ("x",))
 
     if "complex256" not in np.sctypeDict:
         pytest.skip("numpy 'complex256' dtype not available")
     with legacyapi.Dataset(tmp_local_netcdf, "w") as ds:
         ds.createDimension("x", size=len(complex_array))
-        with pytest.raises(
+        with raises(
             TypeError,
             match="Currently only 'complex64' and 'complex128' dtypes are allowed.",
         ):
@@ -2773,3 +2809,32 @@ def test_h5pyd_nonchunked_scalars(hsds_u
         assert ds["foo"]._h5ds.chunks == (1,)
         # However, since it is a scalar dataset, we should not expose the chunking
         assert ds["foo"].chunks is None
+
+
+def test_h5pyd_append(hsds_up):
+    if without_h5pyd:
+        pytest.skip("h5pyd package not available")
+    elif not hsds_up:
+        pytest.skip("HSDS service not running")
+    rnd = "".join(random.choice(string.ascii_uppercase) for _ in range(5))
+    fname = f"hdf5://testfile{rnd}.nc"
+
+    with warns(UserWarning, match="Append mode for h5pyd"):
+        with h5netcdf.File(fname, "a", driver="h5pyd") as ds:
+            assert not ds._preexisting_file
+
+    with h5netcdf.File(fname, "a", driver="h5pyd") as ds:
+        assert ds._preexisting_file
+
+
+def test_raise_on_closed_file(tmp_local_netcdf):
+    f = h5netcdf.File(tmp_local_netcdf, "w")
+    f.dimensions = {"x": 5}
+    v = f.create_variable("hello", ("x",), float)
+    v[:] = np.ones(5)
+    f.close()
+    with raises(
+        ValueError,
+        match=f"I/O operation on <Closed h5netcdf.File>: '{tmp_local_netcdf}'",
+    ):
+        print(v[:])
diff -pruN 1.6.1-1/h5netcdf.egg-info/PKG-INFO 1.6.4-1/h5netcdf.egg-info/PKG-INFO
--- 1.6.1-1/h5netcdf.egg-info/PKG-INFO	2025-03-07 14:51:19.000000000 +0000
+++ 1.6.4-1/h5netcdf.egg-info/PKG-INFO	2025-08-05 06:26:52.000000000 +0000
@@ -1,6 +1,6 @@
-Metadata-Version: 2.2
+Metadata-Version: 2.4
 Name: h5netcdf
-Version: 1.6.1
+Version: 1.6.4
 Summary: netCDF4 via h5py
 Author-email: Stephan Hoyer <shoyer@gmail.com>, Kai Mühlbauer <kmuehlbauer@wradlib.org>
 Maintainer-email: h5netcdf developers <devteam@h5netcdf.org>
@@ -56,6 +56,7 @@ Requires-Dist: packaging
 Provides-Extra: test
 Requires-Dist: netCDF4; extra == "test"
 Requires-Dist: pytest; extra == "test"
+Dynamic: license-file
 
 h5netcdf
 ========
@@ -318,7 +319,7 @@ The following describes the behavior of
 for a few key versions:
 
 - Version 0.12.0 and earlier, the ``track_order`` parameter was missing
-  and thus order tracking was implicitely set to ``False``.
+  and thus order tracking was implicitly set to ``False``.
 - Version 0.13.0 enabled order tracking by setting the parameter
   ``track_order`` to ``True`` by default without deprecation.
 - Versions 0.13.1 to 1.0.2 set ``track_order`` to ``False`` due to a bug in a
