Conversation


@dianaAntic dianaAntic commented Nov 23, 2023

This PR focuses on adding functionality in the new integrationtest_nanotimingrc_nanorc module (which lives in the same directory as integrationtest_nanorc). The purpose of the new module is to run a nanotimingrc session in parallel with a nanorc session in the same terminal window. Previously we had to start nanotimingrc and then, in a separate terminal, run nanorc, after making sure env.sh had been sourced in each, which was quite tedious. This PR automates that process: it kicks off both sessions with the desired RC commands (boot, conf, etc.) and allows any integration test (like the one I'm adding to the timing repo) to use this module to run the test.

One example where this will be useful is the hsilibs repo which has an integration test that requires both nanotimingrc and nanorc to be running.

Here's a clip of a test (which will be included in the timing repo under integtest) demonstrating how the two RCs run in parallel using the new module from this PR:

demo.2.mov

    },
    entry_points={
        "pytest11": [
            "parallel_rc_plugin = integrationtest.integrationtest_nanotimingrc_nanorc",
Author

From my research, it seems that when running an integration test, one of these entry points will always need to be disabled, since they contain conflicting code. For example, we'd need to run pytest -p no:parallel_rc my_integtest.py to disable my plugin when wanting to use the integrationtest_nanorc module. As far as I'm aware, there isn't a way to make the registration of the entry point conditional. Is this true?
If it would be inconvenient for everyone running an integration test to have to add -p no:parallel_rc, then one solution is to just comment it out?
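For what it's worth, one possible workaround is to drop the pytest11 entry point and register the plugin only on demand from a conftest.py. This is just a sketch, assuming the module works as a conftest-level plugin; the environment-variable name is invented:

```python
# conftest.py (sketch): register the plugin only when explicitly requested,
# instead of unconditionally via a pytest11 entry point.
import os

# USE_PARALLEL_RC is a hypothetical opt-in variable, not something in this PR.
if os.environ.get("USE_PARALLEL_RC") == "1":
    pytest_plugins = ["integrationtest.integrationtest_nanotimingrc_nanorc"]
```

With this approach, no plugin conflicts by default and no one needs to remember -p no:parallel_rc.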

Author

I've actually just had a thought: I could make an entirely separate set of integration timing tests that includes all my changes, but this would result in a lot of repeated code, particularly in the python/integrationtest/dro_map_gen.py file.

    fp.write(data)
    fp.flush()
    fp.close()
    return filename
Author

@dianaAntic dianaAntic Nov 24, 2023

I've also made changes to python/integrationtest/dro_map_gen.py to generate the DRO map, and added some files to create default daq_conf_json and timing_config files, so that the user running the test can either supply their own config files or use the default ones.

As it stands, all that needs to be supplied is a connections.xml file indicating the hardware to be tested on.

Is this process of hard-coding the defaults a good idea, or would it be preferable to implement something similar to what's in integrationtest_nanorc (the conf_dict code) and have the hard-coded values live in the test, as in the case of https://github.com/DUNE-DAQ/hsilibs/blob/develop/integtest/iceberg_real_hsi_test.py?
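For comparison, the conf_dict-style alternative mentioned above boils down to a defaults-with-override pattern, roughly like this. All names and default values here are invented for illustration, not taken from the PR:

```python
# Hypothetical defaults; in the conf_dict approach the real values
# would live in the test itself, as in integrationtest_nanorc.
DEFAULT_TIMING_CONF = {
    "connections_file": "connections.xml",
    "firmware_type": "pdi",
}

def build_timing_conf(overrides=None):
    """Start from the hard-coded defaults and apply test-specific overrides."""
    conf = dict(DEFAULT_TIMING_CONF)
    conf.update(overrides or {})
    return conf
```

The trade-off is that the module stays generic while each test carries only the values it actually changes.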

@dianaAntic dianaAntic marked this pull request as ready for review November 24, 2023 17:22
@bieryAtFnal
Contributor

I'm somewhat surprised by the addition of files and functions to help with daqconf config files and DROMap files, since my sense is that we already have existing ways of generating those.
However, after poking around a bit, I wonder if we should not bother with those concerns now, and instead do that later, once things are merged to develop. In that way, I can run tests that use the integrationtest_nanotimingrc_nanorc.py module, and I can verify that my suggestions work before I advertise them.

@dianaAntic
Author

> I'm somewhat surprised by the addition of files and functions to help with daqconf config files and DROMap files, since my sense is that we already have existing ways of generating those. However, after poking around a bit, I wonder if we should not bother with those concerns now, and instead do that later, once things are merged to develop. In that way, I can run tests that use the integrationtest_nanotimingrc_nanorc.py module, and I can verify that my suggestions work before I advertise them.

Oh okay, maybe I missed that! I thought it would be useful to have these default JSON snippets in their respective functions, but I see that they clog up the functions, and I can easily implement them in my test instead. I'll remove them from the PR and have it ready for a final review tomorrow!

@eflumerf
Member

This PR may be obsolete, but the concepts introduced may still be relevant. I'm leaning towards closing it and possibly opening an issue if there is still missing functionality.

@bieryAtFnal
Contributor

I think that it would be great to discuss how, and if, we want to proceed with this idea. I'm going to assign this PR to v5.5.0, but all I really would like to see is a discussion on that timescale, and we can decide independently on the appropriate timescale for whatever work we decide we want to do.
