Is your feature request related to a problem? Please describe.
More developers are using dev containers for RAPIDS development, and most of RAPIDS can be built from the provided containers. However, when changes need to be tested for their impact on the Java bindings, we have to use a completely separate process.
This would also enable experimenting with cuDF / RAPIDS changes while running Spark workloads.
Describe the solution you'd like
Add the necessary dependencies and scripts to the dev containers for building and testing the cuDF JNI (Java) bindings.
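For illustration, a minimal sketch of what the container additions might look like, assuming an Ubuntu-based image and the Maven-driven build in cuDF's `java/` directory (the package names and JDK version are assumptions, not a final design):

```bash
# Install a JDK and Maven (assumes an Ubuntu/Debian base image;
# the JDK version cuDF targets may differ).
sudo apt-get update
sudo apt-get install -y openjdk-11-jdk maven

# Build and test the JNI bindings; cuDF's java/ directory is Maven-driven.
cd ~/cudf/java
mvn clean package
```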
Describe alternatives you've considered
- Use the documented process for building libcudf for Spark, then install a JDK and Maven and build the Java bindings manually.
- Use Spark-RAPIDS containers for testing.

Both of these require me to commit my changes from the dev container and then check out the branch inside the Spark environment in order to build and test. An integrated environment would be more productive.
Additional context
I have successfully installed a JDK and Maven in a dev container with cuDF, but was unable to build the Java bindings because of a CMake error:
```
[exec] CMake Error at /home/coder/cudf/java/target/cmake-build/_deps/rapids-cmake-src/rapids-cmake/find/package.cmake:125 (find_package):
[exec]   By not providing "Findcudf.cmake" in CMAKE_MODULE_PATH this project has
[exec]   asked CMake to find a package configuration file provided by "cudf", but
[exec]   CMake did not find one.
```
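This error indicates that `find_package(cudf)` ran in config mode and could not locate a cudf package configuration file from an installed or already-built libcudf. A hedged workaround sketch, assuming the container already has a libcudf build tree that exports a package config and that the Maven-invoked CMake inherits the shell environment (the build directory path here is an assumption):

```bash
# Point CMake at an existing libcudf build tree so find_package(cudf)
# can locate cudf's package configuration file. CMake consults the
# CMAKE_PREFIX_PATH environment variable during find_package searches.
export CMAKE_PREFIX_PATH=/home/coder/cudf/cpp/build:$CMAKE_PREFIX_PATH

cd ~/cudf/java
mvn clean package
```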