Error connecting to Spark from Jupyter using Apache Toree SparkR kernel

I am trying to connect to Spark 2.1.0 from Jupyter using the Apache Toree SparkR kernel. The kernel loads correctly, but when I try to execute a cell, an error appears and repeats endlessly.

Connecting to Spark using the Scala and Python kernels works great, and connecting to Spark from R in RStudio works fine too.
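Any cell triggers the failure; the code itself does not matter. A minimal example of the kind of SparkR cell I run (purely illustrative):

```r
# Any SparkR cell fails the same way; this one is just illustrative.
df <- createDataFrame(faithful)  # faithful is a built-in R data set
head(df)
```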

Error log:

```
Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union

Warning message:
In rm(".sparkRcon", envir = .sparkREnv) : object '.sparkRcon' not found
[1] "ExistingPort:" "43101"
Error in value[[3L]](cond) :
  Failed to connect JVM: Error in socketConnection(host = hostname, port = port, server = FALSE, : argument "timeout" is missing, with no default
Calls: sparkR.connect ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
17/05/04 11:04:12 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process exited: 1
```
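If I read the trace correctly, Toree finds an already-running backend ("ExistingPort:" 43101) and the failure happens when SparkR's internal connectBackend() attaches to it. A minimal sketch of what the error message suggests, run directly in R rather than through Toree (this assumes SparkR 2.1.0 on the library path and a backend already listening on that port, both assumptions on my part):

```r
library(SparkR)  # SparkR 2.1.0

port <- 43101  # the "ExistingPort" reported in the log above

# In SparkR 2.1.0, connectBackend(hostname, port, timeout) no longer has a
# default for `timeout`, so a two-argument call aborts inside
# socketConnection() with exactly the message in the log:
try(SparkR:::connectBackend("localhost", port))

# Supplying a timeout explicitly connects fine (6000 seconds appears to be
# what SparkR itself uses via spark.r.backendConnectionTimeout):
con <- SparkR:::connectBackend("localhost", port, timeout = 6000)
```

So my guess is that Toree's sparkR.connect calls connectBackend() without the timeout argument that SparkR 2.1.0 requires. Is there a way to fix or work around this from the notebook side?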
