Default Compilation Specification for Spark Extensions

R/spark_compile.R

spark_default_compilation_spec

Description

This is the default compilation specification used by compile_package_jars when compiling Spark extensions.

Usage

spark_default_compilation_spec(
  pkg = infer_active_package_name(),
  locations = NULL
)

Arguments

pkg        The package containing the Spark extensions to be compiled.
locations  Additional locations to scan for Scala installations. By default, the directories /opt/scala and /usr/local/scala will be scanned.
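A minimal usage sketch, assuming sparklyr is installed, the working directory is the root of a package containing Scala sources, and a Scala compiler is available in one of the scanned locations. The extra path "~/scala" passed via locations is a hypothetical example, not a default.

```r
library(sparklyr)

# Build the default compilation specification for the active package.
# locations adds a hypothetical extra directory to scan for Scala
# installations, on top of the defaults (/opt/scala, /usr/local/scala).
spec <- spark_default_compilation_spec(
  locations = "~/scala"
)

# Compile the package's Scala sources into JARs using that specification.
compile_package_jars(spec = spec)
```

Calling spark_default_compilation_spec with no arguments is the common case; passing locations is only needed when Scala is installed somewhere non-standard.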