On Fri, Jan 24, 2025 at 03:15:27PM +0100, Thomas Huth wrote:
> Move the mipsel replay tests from tests/avocado/replay_kernel.py to
> the functional framework. Since the functional tests should be run per
> target, we cannot stick all replay tests in one file. Thus let's add
> these tests to the file where we already use the same asset.
Are the replay tests liable to impact running time much? The test
timeouts are per-file, which could motivate a separate
test_mipsel_malta_replay.py file?

>
> Signed-off-by: Thomas Huth <th...@redhat.com>
> ---
>  tests/avocado/replay_kernel.py        | 54 ---------------------------
>  tests/functional/meson.build          |  1 +
>  tests/functional/test_mipsel_malta.py | 30 +++++++++++++--
>  3 files changed, 28 insertions(+), 57 deletions(-)
> diff --git a/tests/functional/meson.build b/tests/functional/meson.build
> index b7719ab85f..7d233213c1 100644
> --- a/tests/functional/meson.build
> +++ b/tests/functional/meson.build
> @@ -35,6 +35,7 @@ test_timeouts = {
>    'arm_sx1' : 360,
>    'intel_iommu': 300,
>    'mips_malta' : 120,
> +  'mipsel_malta' : 500,

snip

> +
> +    @skipLongRuntime()
> +    def test_replay_mips_malta32el_nanomips_4k(self):
> +        self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_4K)
> +
> +    @skipLongRuntime()
> +    def test_replay_mips_malta32el_nanomips_16k_up(self):
> +        self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_16K)
> +
> +    @skipLongRuntime()
> +    def test_replay_mips_malta32el_nanomips_64k_dbg(self):
> +        self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_64K)

I guess that answers my own question. I'd still think a separate replay
test file per target is nicer, so that the malta tests executed by
default keep the short timeout in meson.

With regards,
Daniel
-- 
|: https://berrange.com      -o-    https://www.flickr.com/photos/dberrange :|
|: https://libvirt.org         -o-            https://fstop138.berrange.com :|
|: https://entangle-photo.org    -o-    https://www.instagram.com/dberrange :|
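
For illustration, a minimal sketch of what the suggested split might look
like: a dedicated tests/functional/test_mipsel_malta_replay.py that reuses
the method and asset names visible in the quoted patch. The file name, the
class name, the import path and base class, and a separate
'mipsel_malta_replay' timeout key in meson.build are assumptions here, not
part of the posted patch.

    # Hypothetical tests/functional/test_mipsel_malta_replay.py -- sketch only.
    # The asset definitions and the do_test_replay_mips_malta32el_nanomips()
    # helper would move over from test_mipsel_malta.py; they are omitted here.
    # The qemu_test import path, the LinuxKernelTest base class and the
    # skipLongRuntime location are assumptions.

    from qemu_test import LinuxKernelTest, skipLongRuntime


    class MipselMaltaReplay(LinuxKernelTest):

        # ASSET_KERNEL_4K / ASSET_KERNEL_16K / ASSET_KERNEL_64K and
        # do_test_replay_mips_malta32el_nanomips() would live here,
        # moved over from test_mipsel_malta.py.

        @skipLongRuntime()
        def test_replay_mips_malta32el_nanomips_4k(self):
            self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_4K)

        @skipLongRuntime()
        def test_replay_mips_malta32el_nanomips_16k_up(self):
            self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_16K)

        @skipLongRuntime()
        def test_replay_mips_malta32el_nanomips_64k_dbg(self):
            self.do_test_replay_mips_malta32el_nanomips(self.ASSET_KERNEL_64K)


    if __name__ == '__main__':
        LinuxKernelTest.main()

The meson.build hunk would then add the long timeout under the new key
(e.g. 'mipsel_malta_replay' : 500) and leave 'mipsel_malta' at its current
short value.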