
add copy of data to binary output folders for running all tests #111

Open · wants to merge 1 commit into base: main

Conversation

@pslack commented Aug 29, 2023

When I ran "rtneural_tests all", it failed because the data was not on the relative path of the executing test. This PR copies the data both to /tests and to the root of the binary output directory, in case the tests are run from that directory as well.

@jatinchowdhury18 (Owner)

Thanks for the contribution!

That said, I was actually planning to rework the way that the tests access the relevant data. Basically, the idea would be to define the RTNeural root directory as a compile definition:

target_compile_definitions(${example_name}
    PRIVATE
        RTNEURAL_ROOT_DIR="${CMAKE_SOURCE_DIR}"
)

Then the tests (and examples and benchmarks) could always find the files they need relative to the provided path.
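
As a rough sketch of that idea (the foreach and the example/benchmark target names here are hypothetical, purely to illustrate attaching the same definition to each consumer of the data files):

# Attach the same root-directory definition to every target that needs the data files.
# Target names other than rtneural_tests are placeholders for illustration.
foreach(data_consumer IN ITEMS rtneural_tests rtneural_examples rtneural_benchmarks)
    target_compile_definitions(${data_consumer}
        PRIVATE
            RTNEURAL_ROOT_DIR="${CMAKE_SOURCE_DIR}"
    )
endforeach()

The test code could then build absolute paths by prepending RTNEURAL_ROOT_DIR to each data file's path relative to the repository root.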

Anyway, I haven't gotten around to implementing this yet, but I think that might be the most robust solution.

What do you think? I believe that solution would fix the problem that you ran into, but maybe there's something that I'm not seeing.

@pslack (Author) commented Aug 29, 2023

I would have done it exactly the way you suggested; however, it was unclear why the final binary was also copied for the tests, so I chose this approach instead to serve both binaries and to get something working quickly.

Also, I would use a different initial setup and pattern for the tests: in the top-level CMakeLists.txt, begin to collect the tests in this pattern:

# Initialize the CTest framework once, at the top level
include(CTest)
enable_testing()

# Add the tests directory
add_subdirectory(tests)
# Register the test command
add_test(NAME "RTNeural accuracy tests" COMMAND rtneural_tests all)

This then collects all of the tests, and you can, for example, run all of them using
make test
as part of the normal build procedure.

As another benefit, you can use this to require that the tests pass when considering pull requests.

I can make the changes and submit another PR?

Kind Regards
P

@jatinchowdhury18 (Owner) commented Aug 30, 2023

Yeah, using CTest is a good suggestion as well. I believe CTest also provides a way to run tests in parallel, which could help speed up the CI pipelines in the future.

I would just add to your suggestion that we should only do include(CTest) and enable_testing() if the BUILD_TESTS CMake option is set.
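
Something along these lines in the top-level CMakeLists.txt, as a sketch (the default value shown here is just a placeholder):

# Only pull in the CTest machinery when tests are requested
option(BUILD_TESTS "Build RTNeural tests" OFF)

if(BUILD_TESTS)
    include(CTest)
    enable_testing()
    add_subdirectory(tests)
endif()

The suite could then be run from the build directory with ctest, and in parallel with ctest --parallel <N>, which should help with CI times.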

If you'd like to make a PR with both changes, that would be fantastic! (or you could update this PR, your choice)

The intention behind copying the test binaries to a dedicated folder was to avoid confusion due to CMake placing the binaries in different locations depending on the CMake generator or build configuration being used. (Although I suppose this would be less of an issue when using CTest.)

Thanks!

@pslack (Author) commented Aug 30, 2023

Hi,

You're most welcome, this is an awesome project!

I'll confess that my original idea was simply the "easiest": I saw there were more involved, fine-grained test configurations in the GitHub Actions workflows, and I wanted to contribute something that didn't break everything else. The first thing I do when looking at a library is run its tests, so I saw that as a possible contribution when I ran into trouble.

I've done a bit more digging to plan and understand this further. Aside from the "make test" command that runs the whole gamut, one can use the ctest command for more fine-grained selection and configuration, and this removes the problems or confusion with paths, because CMake tracks all of that.

To describe some ctest command capabilities in more detail:

cd to your root CMake binary output directory, then:

ctest --> runs all of the compiled tests defined by add_test

ctest -N --> lists, without running them, all of the tests that were built and registered with add_test (as well as tests defined by any other CMake subdirectories included in the build)

ctest --show-only='json-v1' --> dumps a JSON description of all the tests without running them, including each test's COMMAND, NAME, WORKING_DIRECTORY, and any custom properties that may be defined. This could be useful for other kinds of manipulation outside of CMake, for example in the GitHub Actions workflows.

ctest -R "your test name here" --> runs the specific test(s) whose NAME matches the given regex

If the JSON dump is useful, there is one more relevant command: after defining a test in CMakeLists.txt, one can add custom properties to it, so for example we could define the data directory in this manner:

set_tests_properties(test_name other_test_name ... PROPERTIES MY_PROP_NAME "MY PROP VALUE")

This gets added to the test's JSON output and can be used for other purposes.
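
Putting those pieces together, a sketch might look like this (the DATA_DIR property name and the test_data path are hypothetical, just to show the shape of it):

add_test(NAME "RTNeural accuracy tests" COMMAND rtneural_tests all)
set_tests_properties("RTNeural accuracy tests"
    PROPERTIES
        # Run the test from the repo root so relative data paths resolve
        WORKING_DIRECTORY "${CMAKE_SOURCE_DIR}"
        # Custom property; shows up under "properties" in ctest --show-only='json-v1'
        DATA_DIR "${CMAKE_SOURCE_DIR}/test_data"
)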

The other change I would consider is to extract constants from the literals that are used in code (literal file names, for example); this is something my old boss used to insist on in our Java work, and it made sense. The advantage of using constants instead of literals is that they can be changed en masse from one place, and it gives you access to tools such as "find usages" and coverage, instead of having only text search available to find or manipulate them.
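
Applied to the build scripts here (rather than the C++ sources), a sketch of that idea might be (the variable name and path are hypothetical):

# Name the path literal once, in one place
set(RTNEURAL_TEST_DATA_DIR "${CMAKE_SOURCE_DIR}/test_data" CACHE PATH "Directory containing the test data files")

# Reuse the named value wherever it is needed
target_compile_definitions(rtneural_tests
    PRIVATE
        RTNEURAL_TEST_DATA_DIR="${RTNEURAL_TEST_DATA_DIR}"
)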

Kind Regards

P

@jatinchowdhury18 (Owner)

So I ended up setting up CTest with individual test commands (not the all command), and that seems to be working well (#112). I haven't tried doing the JSON dump yet, but that sounds quite interesting as well!
