Package: src:extra-data
Version: 1.7.0-5
Severity: serious
Tags: ftbfs trixie sid

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build from source:

--------------------------------------------------------------------------------
[...]
 debian/rules clean
dh clean --with python3 --buildsystem=pybuild
   debian/rules override_dh_auto_clean
make[1]: Entering directory '/<<PKGBUILDDIR>>'
dh_auto_clean
dh_auto_clean: warning: Use of debian/compat is deprecated and will be removed in debhelper (>= 14~).
I: pybuild base:311: python3.12 setup.py clean 
running clean
removing '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_extra-data/build' (and 
everything under it)
'build/bdist.linux-x86_64' does not exist -- can't clean it
'build/scripts-3.12' does not exist -- can't clean it
I: pybuild base:311: python3.13 setup.py clean 
running clean
removing '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_extra-data/build' (and 
everything under it)

[... snipped ...]

            import numpy.char as char
            return char.chararray
    
>       raise AttributeError("module {!r} has no attribute "
                             "{!r}".format(__name__, attr))
E       AttributeError: module 'numpy' has no attribute 'product'

/usr/lib/python3/dist-packages/numpy/__init__.py:414: AttributeError
_________________________ test_get_dask_array_jungfrau _________________________

mock_jungfrau_run = '/tmp/tmpwu0ic0ad'

    def test_get_dask_array_jungfrau(mock_jungfrau_run):
        run = RunDirectory(mock_jungfrau_run)
        jf = JUNGFRAU(run)
        assert jf.detector_name == 'SPB_IRDA_JF4M'
    
>       arr = jf.get_dask_array('data.adc')

extra_data/tests/test_components.py:246: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
extra_data/components.py:1230: in get_dask_array
    arr = super().get_dask_array(key, fill_value=fill_value, astype=astype)
extra_data/components.py:263: in get_dask_array
    mod_arr = self.data.get_dask_array(source, key, labelled=True)
extra_data/reader.py:514: in get_dask_array
    return self._get_key_data(source, key).dask_array(labelled=labelled)
extra_data/keydata.py:303: in dask_array
    while np.product(chunk_shape) * itemsize > limit and chunk_dim0 > 1:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

attr = 'product'

    def __getattr__(attr):
        # Warn for expired attributes
        import warnings
    
        if attr == "linalg":
            import numpy.linalg as linalg
            return linalg
        elif attr == "fft":
            import numpy.fft as fft
            return fft
        elif attr == "dtypes":
            import numpy.dtypes as dtypes
            return dtypes
        elif attr == "random":
            import numpy.random as random
            return random
        elif attr == "polynomial":
            import numpy.polynomial as polynomial
            return polynomial
        elif attr == "ma":
            import numpy.ma as ma
            return ma
        elif attr == "ctypeslib":
            import numpy.ctypeslib as ctypeslib
            return ctypeslib
        elif attr == "exceptions":
            import numpy.exceptions as exceptions
            return exceptions
        elif attr == "testing":
            import numpy.testing as testing
            return testing
        elif attr == "matlib":
            import numpy.matlib as matlib
            return matlib
        elif attr == "f2py":
            import numpy.f2py as f2py
            return f2py
        elif attr == "typing":
            import numpy.typing as typing
            return typing
        elif attr == "rec":
            import numpy.rec as rec
            return rec
        elif attr == "char":
            import numpy.char as char
            return char
        elif attr == "array_api":
            raise AttributeError("`numpy.array_api` is not available from "
                                 "numpy 2.0 onwards", name=None)
        elif attr == "core":
            import numpy.core as core
            return core
        elif attr == "strings":
            import numpy.strings as strings
            return strings
        elif attr == "distutils":
            if 'distutils' in __numpy_submodules__:
                import numpy.distutils as distutils
                return distutils
            else:
                raise AttributeError("`numpy.distutils` is not available from "
                                     "Python 3.12 onwards", name=None)
    
        if attr in __future_scalars__:
            # And future warnings for those that will change, but also give
            # the AttributeError
            warnings.warn(
                f"In the future `np.{attr}` will be defined as the "
                "corresponding NumPy scalar.", FutureWarning, stacklevel=2)
    
        if attr in __former_attrs__:
            raise AttributeError(__former_attrs__[attr], name=None)
    
        if attr in __expired_attributes__:
            raise AttributeError(
                f"`np.{attr}` was removed in the NumPy 2.0 release. "
                f"{__expired_attributes__[attr]}",
                name=None
            )
    
        if attr == "chararray":
            warnings.warn(
                "`np.chararray` is deprecated and will be removed from "
                "the main namespace in the future. Use an array with a string "
                "or bytes dtype instead.", DeprecationWarning, stacklevel=2)
            import numpy.char as char
            return char.chararray
    
>       raise AttributeError("module {!r} has no attribute "
                             "{!r}".format(__name__, attr))
E       AttributeError: module 'numpy' has no attribute 'product'

/usr/lib/python3/dist-packages/numpy/__init__.py:414: AttributeError
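
For reference, this failure (and the snipped one above it, plus several of the "AttributeError" entries in the summary below) appears to come from np.product, which NumPy 2.0 removed from the top-level namespace; np.prod has long been the documented spelling and should be a drop-in replacement at extra_data/keydata.py:303. A minimal sketch with made-up stand-in values for the names visible in the traceback (only the condition appears there, so nothing else is implied about the surrounding code):

    import numpy as np

    # Stand-in values for the names visible in the traceback; the surrounding
    # loop body is not shown there, so this only demonstrates the condition.
    chunk_shape = (512, 16, 512, 1024)
    itemsize = np.dtype(np.uint16).itemsize
    limit = 2 * 1024 ** 3          # assumed chunk-size limit in bytes

    # Before (fails on NumPy 2.0): np.product(chunk_shape) * itemsize > limit
    too_big = np.prod(chunk_shape) * itemsize > limit
    print(too_big)
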
_______________________ test_write_virtual_cxi_jungfrau ________________________

mock_jungfrau_run = '/tmp/tmpwu0ic0ad'
tmpdir = local('/tmp/pytest-of-buildd/pytest-1/test_write_virtual_cxi_jungfra0')

    def test_write_virtual_cxi_jungfrau(mock_jungfrau_run, tmpdir):
        run = RunDirectory(mock_jungfrau_run)
        det = JUNGFRAU(run)
    
        test_file = osp.join(str(tmpdir), 'test.cxi')
>       det.write_virtual_cxi(test_file)

extra_data/tests/test_components.py:371: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
extra_data/components.py:1267: in write_virtual_cxi
    JUNGFRAUCXIWriter(self).write(filename, fillvalues=fillvalues)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <extra_data.write_cxi.JUNGFRAUCXIWriter object at 0x7f1608620550>
filename = '/tmp/pytest-of-buildd/pytest-1/test_write_virtual_cxi_jungfra0/test.cxi'
fillvalues = None

    def write(self, filename, fillvalues=None):
        """
        Write the file on disc to filename.
    
        Parameters
        ----------
        filename: str
          Path of the file to be written.
        fillvalues: dict, optional
          Keys are datasets names (one of: data, gain, mask) and associated
          fill value for missing data. defaults are:
            - data: nan (proc, float32) or 0 (raw, uint16)
            - gain: 0 (uint8)
            - mask: 0xffffffff (uint32)
        """
        pulse_ids = self.collect_pulse_ids()
        experiment_ids = np.core.defchararray.add(np.core.defchararray.add(
            self.train_ids_perframe.astype(str), ':'), pulse_ids.astype(str))
    
        layouts = self.collect_data()
    
        data_label = self.image_label
        _fillvalues = {
            # Data can be uint16 (raw) or float32 (proc)
            data_label: np.nan if layouts[data_label].dtype.kind == 'f' else 0,
            'gain': 0,
            'mask': 0xffffffff
        }
        if fillvalues:
            _fillvalues.update(fillvalues)
        # Enforce that fill values are compatible with array dtype
        _fillvalues[data_label] = layouts[data_label].dtype.type(
            _fillvalues[data_label])
        if 'gain' in layouts:
            _fillvalues['gain'] = layouts['gain'].dtype.type(
                _fillvalues['gain'])
        if 'mask' in layouts:
>           _fillvalues['mask'] = layouts['mask'].dtype.type(
                _fillvalues['mask'])
E           OverflowError: Python integer 4294967295 out of bounds for uint16

extra_data/write_cxi.py:261: OverflowError
----------------------------- Captured stdout call -----------------------------
 ### Source: SPB_IRDA_JF4M/DET/JNGFR07:daqOutput, ModNo: 7, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR01:daqOutput, ModNo: 1, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR08:daqOutput, ModNo: 8, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR05:daqOutput, ModNo: 5, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR07:daqOutput, ModNo: 7, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR01:daqOutput, ModNo: 1, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR08:daqOutput, ModNo: 8, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR05:daqOutput, ModNo: 5, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR07:daqOutput, ModNo: 7, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR01:daqOutput, ModNo: 1, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR08:daqOutput, ModNo: 8, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR05:daqOutput, ModNo: 5, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR07:daqOutput, ModNo: 7, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR01:daqOutput, ModNo: 1, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR08:daqOutput, ModNo: 8, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR05:daqOutput, ModNo: 5, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.memoryCell
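
The OverflowError above is a NumPy 2.0 behaviour change: converting an out-of-range Python integer to a fixed-width dtype (here np.uint16(0xffffffff), since the mock run's data.mask is uint16) now raises instead of silently truncating. One possible way to keep an "all bits set" default while respecting the actual mask dtype, written as a hypothetical helper rather than a patch:

    import numpy as np

    def default_mask_fill(mask_dtype):
        """Hypothetical helper: an 'all bits set' fill value that fits mask_dtype.

        NumPy 2.0 raises OverflowError for e.g. np.uint16(0xffffffff), so the
        default is clamped to the dtype's own maximum for integer mask dtypes.
        """
        mask_dtype = np.dtype(mask_dtype)
        if mask_dtype.kind in 'iu':
            return mask_dtype.type(np.iinfo(mask_dtype).max)
        return mask_dtype.type(0xffffffff)

    print(default_mask_fill(np.uint16))   # 65535
    print(default_mask_fill(np.uint32))   # 4294967295
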
_________________ test_write_virtual_cxi_jungfrau_some_modules _________________

mock_jungfrau_run = '/tmp/tmpwu0ic0ad'
tmpdir = local('/tmp/pytest-of-buildd/pytest-1/test_write_virtual_cxi_jungfra1')

    def test_write_virtual_cxi_jungfrau_some_modules(mock_jungfrau_run, tmpdir):
        run = RunDirectory(mock_jungfrau_run)
        det = JUNGFRAU(run, modules=[2, 3, 4, 6])
    
        test_file = osp.join(str(tmpdir), 'test.cxi')
>       det.write_virtual_cxi(test_file)

extra_data/tests/test_components.py:404: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
extra_data/components.py:1267: in write_virtual_cxi
    JUNGFRAUCXIWriter(self).write(filename, fillvalues=fillvalues)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <extra_data.write_cxi.JUNGFRAUCXIWriter object at 0x7f160879f890>
filename = '/tmp/pytest-of-buildd/pytest-1/test_write_virtual_cxi_jungfra1/test.cxi'
fillvalues = None

    def write(self, filename, fillvalues=None):
        """
        Write the file on disc to filename.
    
        Parameters
        ----------
        filename: str
          Path of the file to be written.
        fillvalues: dict, optional
          Keys are datasets names (one of: data, gain, mask) and associated
          fill value for missing data. defaults are:
            - data: nan (proc, float32) or 0 (raw, uint16)
            - gain: 0 (uint8)
            - mask: 0xffffffff (uint32)
        """
        pulse_ids = self.collect_pulse_ids()
        experiment_ids = np.core.defchararray.add(np.core.defchararray.add(
            self.train_ids_perframe.astype(str), ':'), pulse_ids.astype(str))
    
        layouts = self.collect_data()
    
        data_label = self.image_label
        _fillvalues = {
            # Data can be uint16 (raw) or float32 (proc)
            data_label: np.nan if layouts[data_label].dtype.kind == 'f' else 0,
            'gain': 0,
            'mask': 0xffffffff
        }
        if fillvalues:
            _fillvalues.update(fillvalues)
        # Enforce that fill values are compatible with array dtype
        _fillvalues[data_label] = layouts[data_label].dtype.type(
            _fillvalues[data_label])
        if 'gain' in layouts:
            _fillvalues['gain'] = layouts['gain'].dtype.type(
                _fillvalues['gain'])
        if 'mask' in layouts:
>           _fillvalues['mask'] = layouts['mask'].dtype.type(
                _fillvalues['mask'])
E           OverflowError: Python integer 4294967295 out of bounds for uint16

extra_data/write_cxi.py:261: OverflowError
----------------------------- Captured stdout call -----------------------------
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.adc
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.gain
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.mask
 ### Source: SPB_IRDA_JF4M/DET/JNGFR02:daqOutput, ModNo: 2, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR04:daqOutput, ModNo: 4, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR06:daqOutput, ModNo: 6, Key: data.memoryCell
 ### Source: SPB_IRDA_JF4M/DET/JNGFR03:daqOutput, ModNo: 3, Key: data.memoryCell
____________________________ test_stackview_squeeze ____________________________

    def test_stackview_squeeze():
        # Squeeze not dropping stacking dim
        data = {0: np.zeros((1, 4)), 1: np.zeros((1, 4))}
        sv = StackView(data, 2, (1, 4), data[0], 0, stack_axis=0)
        assert sv.shape == (2, 1, 4)
        assert sv.squeeze().shape == (2, 4)
    
        # Squeeze dropping stacking dim
        data = {0: np.zeros((1, 4))}
        sv = StackView(data, 1, (1, 4), data[0].dtype, 0, stack_axis=0)
        assert sv.shape == (1, 1, 4)
        assert sv.squeeze().shape == (4,)
    
        assert sv.squeeze(axis=0).shape == (1, 4)
        assert sv.squeeze(axis=-2).shape == (1, 4)
    
>       with pytest.raises(np.AxisError):

extra_data/tests/test_stacking.py:167: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

attr = 'AxisError'

    def __getattr__(attr):
        # Warn for expired attributes
        import warnings
    
        if attr == "linalg":
            import numpy.linalg as linalg
            return linalg
        elif attr == "fft":
            import numpy.fft as fft
            return fft
        elif attr == "dtypes":
            import numpy.dtypes as dtypes
            return dtypes
        elif attr == "random":
            import numpy.random as random
            return random
        elif attr == "polynomial":
            import numpy.polynomial as polynomial
            return polynomial
        elif attr == "ma":
            import numpy.ma as ma
            return ma
        elif attr == "ctypeslib":
            import numpy.ctypeslib as ctypeslib
            return ctypeslib
        elif attr == "exceptions":
            import numpy.exceptions as exceptions
            return exceptions
        elif attr == "testing":
            import numpy.testing as testing
            return testing
        elif attr == "matlib":
            import numpy.matlib as matlib
            return matlib
        elif attr == "f2py":
            import numpy.f2py as f2py
            return f2py
        elif attr == "typing":
            import numpy.typing as typing
            return typing
        elif attr == "rec":
            import numpy.rec as rec
            return rec
        elif attr == "char":
            import numpy.char as char
            return char
        elif attr == "array_api":
            raise AttributeError("`numpy.array_api` is not available from "
                                 "numpy 2.0 onwards", name=None)
        elif attr == "core":
            import numpy.core as core
            return core
        elif attr == "strings":
            import numpy.strings as strings
            return strings
        elif attr == "distutils":
            if 'distutils' in __numpy_submodules__:
                import numpy.distutils as distutils
                return distutils
            else:
                raise AttributeError("`numpy.distutils` is not available from "
                                     "Python 3.12 onwards", name=None)
    
        if attr in __future_scalars__:
            # And future warnings for those that will change, but also give
            # the AttributeError
            warnings.warn(
                f"In the future `np.{attr}` will be defined as the "
                "corresponding NumPy scalar.", FutureWarning, stacklevel=2)
    
        if attr in __former_attrs__:
            raise AttributeError(__former_attrs__[attr], name=None)
    
        if attr in __expired_attributes__:
            raise AttributeError(
                f"`np.{attr}` was removed in the NumPy 2.0 release. "
                f"{__expired_attributes__[attr]}",
                name=None
            )
    
        if attr == "chararray":
            warnings.warn(
                "`np.chararray` is deprecated and will be removed from "
                "the main namespace in the future. Use an array with a string "
                "or bytes dtype instead.", DeprecationWarning, stacklevel=2)
            import numpy.char as char
            return char.chararray
    
>       raise AttributeError("module {!r} has no attribute "
                             "{!r}".format(__name__, attr))
E       AttributeError: module 'numpy' has no attribute 'AxisError'

/usr/lib/python3/dist-packages/numpy/__init__.py:414: AttributeError
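
np.AxisError was likewise removed from the top-level namespace in NumPy 2.0; it is available as numpy.exceptions.AxisError since NumPy 1.25. A sketch of a version-tolerant spelling that the test could use:

    import numpy as np

    # numpy.exceptions exists since NumPy 1.25; fall back for older releases.
    try:
        from numpy.exceptions import AxisError
    except ImportError:
        AxisError = np.AxisError

    # test_stackview_squeeze can then expect AxisError instead of np.AxisError:
    try:
        np.zeros((1, 4)).squeeze(axis=5)
    except AxisError as exc:
        print("caught:", exc)
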
=============================== warnings summary ===============================
extra_data/tests/cli/test_make_virtual_cxi.py: 10 warnings
extra_data/tests/test_components.py: 20 warnings
extra_data/tests/test_bad_trains.py: 2 warnings
  
/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_extra-data/build/extra_data/write_cxi.py:240:
 DeprecationWarning: numpy.core is deprecated and has been renamed to 
numpy._core. The numpy._core namespace contains private NumPy internals and its 
use is discouraged, as NumPy internals can change without warning in any 
release. In practice, most real-world usage of numpy.core is to access 
functionality in the public NumPy API. If that is the case, use the public 
NumPy API. If not, you are using NumPy internals. If you would still like to 
access an internal attribute, use numpy._core.defchararray.
    experiment_ids = np.core.defchararray.add(np.core.defchararray.add(
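
For the np.core.defchararray usage flagged here (also visible in the write_virtual_cxi tracebacks above), the public np.char.add API provides the same element-wise string concatenation. A small sketch with made-up inputs standing in for train_ids_perframe and pulse_ids:

    import numpy as np

    # Made-up inputs standing in for self.train_ids_perframe and pulse_ids.
    train_ids = np.array([10001, 10001, 10002], dtype=np.uint64)
    pulse_ids = np.array([0, 1, 0], dtype=np.uint64)

    # np.char.add is the public equivalent of np.core.defchararray.add.
    experiment_ids = np.char.add(
        np.char.add(train_ids.astype(str), ':'), pulse_ids.astype(str))

    print(experiment_ids)   # ['10001:0' '10001:1' '10002:0']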

extra_data/tests/test_utils.py::test_cbf_conversion[0.5]
extra_data/tests/test_utils.py::test_cbf_conversion[1.0]
  
/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_extra-data/build/extra_data/utils.py:156:
 UserWarning: The numpy_to_cbf and hdf5_to_cbf functions are deprecated and 
likely to be removed. If you are using either of them, please contact 
da-supp...@xfel.eu .
    cbf_out = numpy_to_cbf(images, index=index)

extra_data/tests/cli/test_make_virtual_cxi.py: 25 warnings
extra_data/tests/test_bad_trains.py: 6 warnings
extra_data/tests/test_components.py: 102 warnings
extra_data/tests/test_keydata.py: 1 warning
extra_data/tests/test_lsxfel.py: 1 warning
extra_data/tests/test_reader_mockdata.py: 4 warnings
extra_data/tests/test_streamer.py: 1 warning
  
/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_extra-data/build/extra_data/tests/mockdata/base.py:150:
 DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for 
removal in a future version. Use timezone-aware objects to represent datetimes 
in UTC: datetime.datetime.now(datetime.UTC).
    now = datetime.utcnow().replace(microsecond=0)
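
The datetime.utcnow() deprecation in the test mock data is unrelated to NumPy but will also need attention eventually. A sketch of the replacement suggested by the warning itself; note that datetime.now(timezone.utc) is timezone-aware, so strip the tzinfo if the old naive value matters downstream:

    from datetime import datetime, timezone

    # Timezone-aware replacement for datetime.utcnow().replace(microsecond=0).
    now = datetime.now(timezone.utc).replace(microsecond=0)

    # If downstream formatting expects the old naive value, drop the tzinfo:
    naive_now = now.replace(tzinfo=None)
    print(now.isoformat(), naive_now.isoformat())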

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED extra_data/tests/test_components.py::test_get_array_pulse_id[0.5] - Ty...
FAILED extra_data/tests/test_components.py::test_get_array_with_cell_ids[0.5]
FAILED extra_data/tests/test_components.py::test_get_array_pulse_id_reduced_data[0.5]
FAILED extra_data/tests/test_components.py::test_get_dask_array[0.5] - Attrib...
FAILED extra_data/tests/test_components.py::test_get_dask_array_reduced_data[0.5]
FAILED extra_data/tests/test_components.py::test_iterate_pulse_id[0.5] - Type...
FAILED extra_data/tests/test_reader_mockdata.py::test_read_fxe_raw_run[0.5]
FAILED extra_data/tests/test_reader_mockdata.py::test_read_spb_proc_run[0.5]
FAILED extra_data/tests/test_reader_mockdata.py::test_run_get_dask_array[0.5]
FAILED extra_data/tests/test_reader_mockdata.py::test_run_get_dask_array_labelled[0.5]
FAILED extra_data/tests/test_components.py::test_get_array_pulse_id[1.0] - Ty...
FAILED extra_data/tests/test_components.py::test_get_array_with_cell_ids[1.0]
FAILED extra_data/tests/test_components.py::test_get_array_pulse_id_reduced_data[1.0]
FAILED extra_data/tests/test_components.py::test_get_dask_array[1.0] - Attrib...
FAILED extra_data/tests/test_components.py::test_get_dask_array_reduced_data[1.0]
FAILED extra_data/tests/test_components.py::test_iterate_pulse_id[1.0] - Type...
FAILED extra_data/tests/test_reader_mockdata.py::test_read_fxe_raw_run[1.0]
FAILED extra_data/tests/test_reader_mockdata.py::test_read_spb_proc_run[1.0]
FAILED extra_data/tests/test_reader_mockdata.py::test_run_get_dask_array[1.0]
FAILED extra_data/tests/test_reader_mockdata.py::test_run_get_dask_array_labelled[1.0]
FAILED extra_data/tests/cli/test_make_virtual_cxi.py::test_make_virtual_cxi_jungfrau
FAILED extra_data/tests/test_bad_trains.py::test_dask_array - AttributeError:...
FAILED extra_data/tests/test_components.py::test_get_array_lpd_parallelgain_select_pulses
FAILED extra_data/tests/test_components.py::test_get_dask_array_lpd_parallelgain
FAILED extra_data/tests/test_components.py::test_get_dask_array_jungfrau - At...
FAILED extra_data/tests/test_components.py::test_write_virtual_cxi_jungfrau
FAILED extra_data/tests/test_components.py::test_write_virtual_cxi_jungfrau_some_modules
FAILED extra_data/tests/test_stacking.py::test_stackview_squeeze - AttributeE...
=========== 28 failed, 240 passed, 174 warnings in 85.64s (0:01:25) ============
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_extra-data/build; python3.13 -m pytest 
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.13" returned exit code 13
make[1]: *** [debian/rules:17: override_dh_auto_test] Error 255
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
make: *** [debian/rules:7: binary] Error 2
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends, not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202502/

About the archive rebuild: The build was performed on AWS virtual machines,
using sbuild and a reduced chroot containing only build-essential packages.

If you cannot reproduce the bug, please contact me privately; I am willing
to provide ssh access to a virtual machine where the bug is fully
reproducible.

If this is really a bug in one of the build dependencies, please reassign
the bug and add an "affects" on src:extra-data, so that it remains visible
on the BTS web page for this package.

Thanks.
