Scala Spark RDD current number of partitions

How do you determine the number of partitions of an arbitrary RDD in Scala?

I know that PySpark RDD has getNumPartitions defined in its API, but I can't find an equivalent on the Scala side.

+3
scala apache-spark


2 answers


It should be rdd.partitions.length.
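A minimal spark-shell sketch, assuming the usual pre-built SparkContext sc (the RDD contents and partition count are illustrative):

```scala
// In spark-shell, sc is the pre-built SparkContext.
// parallelize takes an explicit number of slices (partitions).
val rdd = sc.parallelize(1 to 100, numSlices = 4)

// partitions returns Array[Partition]; its length is the partition count.
val n: Int = rdd.partitions.length  // 4 here
```

Note that from Spark 1.6.0 onward, the Scala RDD API also exposes getNumPartitions, which is just a convenience for partitions.length.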



+4


At least in Spark 1.6.1, this works:

rdd.getNumPartitions

(Note: in Scala it is declared without parentheses, so call it as rdd.getNumPartitions, not rdd.getNumPartitions().)

+4








