Scala Spark RDD: current number of partitions
How do you determine the number of partitions of an arbitrary RDD in Scala?
I know that a PySpark RDD has getNumPartitions in its API, but I can't find an equivalent on the Scala side.
+3
x89a10
2 answers
It should be rdd.partitions.length.
+4
Gábor Bakos
At least since Spark 1.6.1 this works:
rdd.getNumPartitions
(Note that in Scala getNumPartitions is defined without parentheses, so it is called without them.)
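
A minimal sketch showing both approaches, assuming a local Spark installation; the app name, master URL, and partition count below are illustrative:

import org.apache.spark.{SparkConf, SparkContext}

object PartitionCount {
  def main(args: Array[String]): Unit = {
    // Hypothetical local setup for demonstration purposes.
    val conf = new SparkConf().setAppName("partition-count").setMaster("local[4]")
    val sc = new SparkContext(conf)

    // Explicitly request 8 partitions when creating the RDD.
    val rdd = sc.parallelize(1 to 100, numSlices = 8)

    // Both expressions report the current number of partitions (8 here).
    println(rdd.partitions.length)
    println(rdd.getNumPartitions) // available since Spark 1.6

    sc.stop()
  }
}

Both return the same value; getNumPartitions is simply a convenience method on RDD, matching the PySpark API name.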
+4
echo