Default Compilation Specification for Spark Extensions
This is the default compilation specification used for Spark extensions when they are compiled with compile_package_jars().
Usage

    spark_default_compilation_spec(
      pkg = infer_active_package_name(),
      locations = NULL
    )

Arguments
| pkg | The package containing Spark extensions to be compiled. |
| locations | Additional locations to scan. By default, the directories … |
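A minimal sketch of typical usage from an extension package's build script, assuming a working Scala toolchain and the Spark versions targeted by the default spec; the package name `mysparkpkg` is hypothetical:

```r
library(sparklyr)

# Build the default compilation spec for a hypothetical extension
# package; when run from the package's own directory, pkg can be
# omitted and infer_active_package_name() supplies it.
spec <- spark_default_compilation_spec(pkg = "mysparkpkg")

# Compile the package's Scala sources into JARs, one per
# Spark/Scala combination described by the spec.
compile_package_jars(spec = spec)
```

Passing the result to compile_package_jars() is optional; when spec is left NULL, compile_package_jars() falls back to this same default specification.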