Manage Spark Connections
These routines allow you to manage your connections to Spark.
```r
spark_connect(master = "local", spark_home = Sys.getenv("SPARK_HOME"),
  method = c("shell", "livy", "databricks", "test"), app_name = "sparklyr",
  version = NULL, hadoop_version = NULL, config = spark_config(),
  extensions = sparklyr::registered_extensions(), ...)

spark_connection_is_open(sc)

spark_disconnect(sc, ...)

spark_disconnect_all()
```

Arguments
| Argument | Description |
|---|---|
| `master` | Spark cluster URL to connect to. Use `"local"` to connect to a local instance of Spark installed via `spark_install`. |
| `spark_home` | The path to a Spark installation. Defaults to the path provided by the `SPARK_HOME` environment variable. |
| `method` | The method used to connect to Spark. Currently, only `"shell"` connections are supported. |
| `app_name` | The application name to be used while running in the Spark cluster. |
| `version` | The version of Spark to use. Only applicable to `"local"` Spark connections. |
| `hadoop_version` | The version of Hadoop to use. Only applicable to `"local"` Spark connections. |
| `config` | Custom configuration for the generated Spark connection. See `spark_config` for details. |
| `extensions` | Extension packages to enable for this connection. By default, all packages enabled through the use of `sparklyr::register_extension` will be passed. |
| `...` | Optional arguments; currently unused. |
| `sc` | A `spark_connection`. |
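Although `"shell"` is the default, the usage signature also lists a `"livy"` method for connecting to a remote cluster over HTTP. A minimal sketch (the Livy URL here is a hypothetical placeholder, not a value from this page):

```r
library(sparklyr)

# Connect through a Livy HTTP endpoint rather than the local shell method.
# "http://livy-server:8998" is a placeholder for a real Livy server URL.
sc <- spark_connect(
  master = "http://livy-server:8998",
  method = "livy"
)

spark_disconnect(sc)
```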
Examples

```r
sc <- spark_connect(master = "local")
spark_connection_is_open(sc)
#> [1] TRUE
spark_disconnect(sc)
```
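The `config` argument accepts a modified `spark_config()` object, which is how per-connection settings such as driver memory are passed. A sketch (assuming a local Spark installation is available via `spark_install()`; the memory value is illustrative):

```r
library(sparklyr)

# Start from the default configuration and override one setting.
conf <- spark_config()
conf$`sparklyr.shell.driver-memory` <- "2G"  # driver memory for shell connections

sc <- spark_connect(master = "local", config = conf, app_name = "my_app")

# ... work with the connection ...

# Close every open connection, including sc.
spark_disconnect_all()
```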