Install versions of Spark for use with local Spark connections, i.e. spark_connect(master = "local").
spark_install_find(
  version = NULL,
  hadoop_version = NULL,
  installed_only = TRUE,
  latest = FALSE,
  hint = FALSE
)

spark_install(
  version = NULL,
  hadoop_version = NULL,
  reset = TRUE,
  logging = "INFO",
  verbose = interactive()
)

spark_uninstall(version, hadoop_version)

spark_install_dir()

spark_install_tar(tarfile)

spark_installed_versions()

spark_available_versions(
  show_hadoop = FALSE,
  show_minor = FALSE,
  show_future = FALSE
)
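A minimal sketch of a typical install workflow using the functions above. It assumes sparklyr is installed and the machine has internet access; the version string "3.5" is illustrative, not a recommendation.

```r
library(sparklyr)

# List the Spark versions available for download
spark_available_versions()

# Install a specific Spark version (version string is illustrative)
spark_install(version = "3.5")

# Confirm which versions are now installed locally
spark_installed_versions()
```

spark_install() downloads the distribution into spark_install_dir(), so repeated calls with the same version are effectively no-ops.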
version: Version of Spark to install. See spark_available_versions() for a list of supported versions.

hadoop_version: Version of Hadoop to install. See spark_available_versions(show_hadoop = TRUE) for a list of supported versions.

installed_only: Search only the locally installed versions?

latest: Check for the latest version?

hint: On failure, should the installation code be provided?

reset: Attempts to reset settings to defaults.

logging: Logging level to configure the install. Supported options: "WARN", "INFO".

verbose: Report information as Spark is downloaded and installed.

tarfile: Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where ### references the Spark and Hadoop versions respectively.

show_hadoop: Show Hadoop distributions?

show_minor: Show minor Spark versions?

show_future: Show future versions which have not yet been released?
Returns a list with information about the installed version.
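For machines without internet access, a previously downloaded distribution can be installed from a local tarball and then used for a local connection. This is a sketch: the file path and version string are illustrative.

```r
library(sparklyr)

# Install from a local tarball whose name follows the
# spark-###-bin-(hadoop)?### pattern (path is illustrative)
spark_install_tar("~/downloads/spark-3.5.0-bin-hadoop3.tgz")

# Connect locally using the installed version (illustrative)
sc <- spark_connect(master = "local", version = "3.5")

spark_disconnect(sc)
```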