Scala Spark RDD: current number of partitions

How do you determine the number of partitions of an arbitrary RDD in Scala?

I know that PySpark's RDD has getNumPartitions defined in its API, but I can't find an equivalent on the Scala side.

+3



2 answers


It should be `rdd.partitions.length`.
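For context, a minimal sketch (assuming a local SparkContext; the app name and partition count are arbitrary):

```scala
// Minimal sketch, assuming spark-core is on the classpath.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local[*]").setAppName("partition-count")
val sc = new SparkContext(conf)

// Explicitly request 4 partitions when creating the RDD.
val rdd = sc.parallelize(1 to 100, numSlices = 4)

println(rdd.partitions.length) // 4

sc.stop()
```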



+4




Since Spark 1.6.x, getNumPartitions is also available on the Scala RDD. Note that in Scala it is declared without parentheses:

`rdd.getNumPartitions`

+4



