spark_install.Rd
Source: R/install_spark.R, R/install_spark_versions.R
Install versions of Spark for use with local Spark connections (i.e. spark_connect(master = "local")).
spark_install_find(
  version = NULL,
  hadoop_version = NULL,
  installed_only = TRUE,
  latest = FALSE,
  hint = FALSE
)

spark_install(
  version = NULL,
  hadoop_version = NULL,
  reset = TRUE,
  logging = "INFO",
  verbose = interactive()
)

spark_uninstall(version, hadoop_version)

spark_install_dir()

spark_install_tar(tarfile)

spark_installed_versions()

spark_available_versions(
  show_hadoop = FALSE,
  show_minor = FALSE,
  show_future = FALSE
)
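As a rough sketch of a typical workflow, the calls below list the available versions, install one locally, and connect with a local master. The version string "3.5" is only an illustrative assumption; pick a version actually returned by spark_available_versions().

library(sparklyr)

# List the Spark versions that can be installed
spark_available_versions()

# Install a local copy of Spark ("3.5" is an illustrative assumption)
spark_install(version = "3.5")

# Confirm what is installed locally
spark_installed_versions()

# Use the local installation and disconnect when done
sc <- spark_connect(master = "local", version = "3.5")
spark_disconnect(sc)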
Argument | Description
---|---
version | Version of Spark to install. See spark_available_versions() for a list of supported versions.
hadoop_version | Version of Hadoop to install. See spark_available_versions(show_hadoop = TRUE) for a list of supported versions.
installed_only | Search only the locally installed versions?
latest | Check for the latest version?
hint | On failure, should the installation code be provided?
reset | Attempts to reset settings to defaults.
logging | Logging level to configure during install. Supported options: "WARN", "INFO".
verbose | Report information as Spark is downloaded and installed.
tarfile | Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where ### references the Spark and Hadoop versions respectively (see the sketch following this table).
show_hadoop | Show Hadoop distributions?
show_minor | Show minor Spark versions?
show_future | Should future versions which have not been released be shown?
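For instance, to browse Hadoop builds, install from a previously downloaded tarball, or remove an installation, something like the following could be used. The file path and version numbers shown are purely hypothetical.

# Include Hadoop builds and minor releases in the listing
spark_available_versions(show_hadoop = TRUE, show_minor = TRUE)

# Directory where local Spark installations are kept
spark_install_dir()

# Install from a tarball downloaded separately (hypothetical path;
# the file name must match spark-###-bin-(hadoop)?###)
spark_install_tar("~/Downloads/spark-3.5.1-bin-hadoop3.tgz")

# Remove an installation that is no longer needed
# (version numbers here are illustrative assumptions)
spark_uninstall(version = "3.5.1", hadoop_version = "3")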
List with information about the installed version.
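A minimal sketch of inspecting that return value; str() is used here because the element names of the list are not documented on this page, and the version "3.5" is an illustrative assumption.

# Locate a locally installed version (installed_only = TRUE by default)
info <- spark_install_find(version = "3.5")

# Inspect the components of the returned list
str(info)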