spark_compilation_spec(spark_version = NULL, spark_home = NULL, scalac_path = NULL, scala_filter = NULL, jar_name = NULL)
spark_home: The path to a Spark installation. This can be left unset if
spark_version is supplied; in such a case, sparklyr will attempt to discover
the associated Spark installation.
scalac_path: The path to the scalac compiler to be used during compilation of
your Spark extension. Note that you should ensure the version of scalac
selected matches the version of scalac used with the version of Spark you are
compiling against.
scala_filter: An optional function used to filter which scala files are used
during compilation. This can be useful if you have auxiliary files that should
only be included with certain versions of Spark.
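As an illustration, a scala_filter might exclude auxiliary sources intended for a different Spark line. This sketch assumes a hypothetical naming convention in which Spark 3-only files carry a "spark3" marker in their file names:

```r
# Hypothetical scala_filter: keep only sources compatible with Spark 2.x.
# Assumes Spark 3-only auxiliary files are named with a "spark3" marker.
filter_spark2_sources <- function(scala_files) {
  scala_files[!grepl("spark3", basename(scala_files))]
}
```

The filter receives the candidate scala file paths and returns the subset that should be compiled for the targeted Spark version.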
For use with compile_package_jars. The Spark compilation specification is used
when compiling Spark extension Java Archives, and defines which versions of
Spark, as well as which versions of Scala, should be used for compilation.
Most Spark extensions won't need to define their own compilation specification,
and can instead rely on the default behavior of compile_package_jars.
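When a custom specification is needed, it can be constructed and passed to compile_package_jars. A minimal sketch follows; the Spark version, scalac path, and jar name are illustrative assumptions, not values from this documentation:

```r
library(sparklyr)

# Hypothetical custom specification: compile against Spark 2.4 using a
# locally installed scalac (the path is an assumption for illustration).
spec <- spark_compilation_spec(
  spark_version = "2.4.0",
  scalac_path   = "/usr/local/bin/scalac",
  jar_name      = "myextension-2.4.jar"
)

# Compile the extension JARs for this specification only.
compile_package_jars(spec = spec)
```

Supplying a single specification like this restricts compilation to one Spark/Scala combination, whereas the default behavior builds against each supported combination.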