StackOverflowError when running an iterative Spark program (Java)

I am trying to implement a hierarchical agglomerative clustering algorithm in Spark.

When I run it in local mode with a large input, a java.lang.StackOverflowError is always thrown.

It has been suggested (e.g. in "Disable super long serialization chain") that this happens because the serialization chain grows too long, so I added a checkpoint every few iterations, but the error still occurs. Has anyone else run into this problem?
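
The error itself is not specific to Spark's internals: any sufficiently deep object graph overflows the stack when Java's default serialization recurses through it. Below is a minimal, Spark-free sketch of the mechanism (the `Node` class here is a hypothetical stand-in for an RDD holding a reference to its parent; the depths are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LineageDemo {
    // A node pointing at its parent, like an RDD pointing at its dependency.
    static class Node implements Serializable {
        final Node parent;
        Node(Node parent) { this.parent = parent; }
    }

    // Build a parent chain of the given depth, like N iterations of RDD transformations.
    static Node chain(int depth) {
        Node n = null;
        for (int i = 0; i < depth; i++) n = new Node(n);
        return n;
    }

    // Try to serialize the graph; default serialization recurses once per parent link.
    static boolean serializes(Node head) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(head);
            return true;
        } catch (StackOverflowError | IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("depth 100:    " + serializes(chain(100)));      // shallow chain serializes fine
        System.out.println("depth 500000: " + serializes(chain(500_000)));  // deep chain overflows the stack
    }
}
```

This is why the failure only appears once the input (and hence the number of loop iterations) is large.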

My code:

JavaRDD<LocalCluster> run(JavaRDD<Document> documents)
{
    @SuppressWarnings("resource")
    JavaSparkContext sc = new JavaSparkContext(documents.context());
    Broadcast<Double> mergeCriterionBroadcast = sc.broadcast(mergeCriterion);
    Accumulable<Long, Long> clusterIdAccumulator = sc.accumulable(documents.map(doc -> doc.getId()).max(Comparator.naturalOrder()), new LongAccumulableParam());

    // Clusters.
    JavaRDD<LocalCluster> clusters = documents.map(document -> 
    {
        return LocalCluster.of(document.getId(), Arrays.asList(document));          
    }).cache();

    // Calculate clusters pair-wise similarity, initially, each document forms a cluster.
    JavaPairRDD<Tuple2<LocalCluster, LocalCluster>, Double> similarities = clusters.cartesian(clusters).filter(clusterPair -> (clusterPair._1().getId() < clusterPair._2().getId()))
    .mapToPair(clusterPair ->
    {
        return new Tuple2<Tuple2<LocalCluster, LocalCluster>, Double>(clusterPair, LocalCluster.getSimilarity(clusterPair._1(), clusterPair._2()));
    })
    .filter(tuple -> (tuple._2() >= mergeCriterionBroadcast.value())).cache();

    // Merge the most similar two clusters.
    long count = similarities.count();
    int loops = 0;
    while (count > 0)
    {
        System.out.println("Count: " + count);
        Tuple2<Tuple2<LocalCluster, LocalCluster>, Double> mostSimilar = similarities.max(SerializableComparator.serialize((a, b) -> Double.compare(a._2(), b._2())));

        Broadcast<Tuple2<Long, Long>> MOST_SIMILAR = sc.broadcast(new Tuple2<Long, Long>(mostSimilar._1()._1().getId(), mostSimilar._1()._2().getId()));

        clusterIdAccumulator.add(1L);
        LocalCluster newCluster = LocalCluster.merge(mostSimilar._1()._1(), mostSimilar._1()._2(), clusterIdAccumulator.value());

        JavaRDD<LocalCluster> newClusterRDD = sc.parallelize(Arrays.asList(newCluster));
        JavaRDD<LocalCluster> filteredClusters = clusters.filter(cluster -> (cluster.getId() != MOST_SIMILAR.value()._1() && cluster.getId() != MOST_SIMILAR.value()._2()));

        JavaPairRDD<Tuple2<LocalCluster, LocalCluster>, Double> newSimilarities = filteredClusters.cartesian(newClusterRDD)
        .mapToPair(clusterPair ->
        {
            return new Tuple2<Tuple2<LocalCluster, LocalCluster>, Double>(clusterPair, LocalCluster.getSimilarity(clusterPair._1(), clusterPair._2()));
        })
        .filter(tuple -> (tuple._2() >= mergeCriterionBroadcast.value()));

        clusters = filteredClusters.union(newClusterRDD).coalesce(2).cache();
        similarities = similarities.filter(tuple -> 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._2()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._2()))
                .union(newSimilarities).coalesce(4).cache();
        if ((loops++) >= 2)
        {
            clusters.checkpoint();
            similarities.checkpoint();
            loops = 0;
        }

        count = similarities.count();
    }

    return clusters.filter(cluster -> cluster.getDocuments().size() > 1);
}


The first part of the stack trace:

15/08/05 10:11:19 ERROR TaskSetManager: Failed to serialize task 4078, not attempting to retry it.

java.io.IOException: java.lang.StackOverflowError
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1242)
at org.apache.spark.rdd.CoalescedRDDPartition.writeObject(CoalescedRDD.scala:45)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
at org.apache.spark.rdd.UnionPartition$$anonfun$writeObject$1.apply$mcV$sp(UnionRDD.scala:55)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239)
at org.apache.spark.rdd.UnionPartition.writeObject(UnionRDD.scala:52)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)
at org.apache.spark.rdd.CoalescedRDDPartition$$anonfun$writeObject$1.apply$mcV$sp(CoalescedRDD.scala:48)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1239)
at org.apache.spark.rdd.CoalescedRDDPartition.writeObject(CoalescedRDD.scala:45)
... (the UnionPartition.writeObject / CoalescedRDDPartition.writeObject frames above repeat, alternating, until the stack overflows) ...


Another part of the stack trace:

Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)


There are many warnings like this before the error:

15/08/05 11:19:56 WARN TaskSetManager: Stage 519 contains a task of very large size (226 KB). The maximum recommended task size is 100 KB.


1 answer


Finally, I found the problem: how checkpoint is used.

In Spark, checkpoint lets you truncate an RDD's lineage, but the checkpoint is only materialized once the RDD has been computed by an action (e.g. collect, count).

In my program, no action is ever called on the clusters RDD, so no checkpoint is ever written; the serialization chain of clusters keeps growing and eventually causes the StackOverflowError. The fix is to call count() after checkpoint():

        clusters = filteredClusters.union(newClusterRDD).coalesce(2).cache();
        similarities = similarities.filter(tuple -> 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._1().getId() != MOST_SIMILAR.value()._2()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._1()) && 
                (tuple._1()._2().getId() != MOST_SIMILAR.value()._2()))
                .union(newSimilarities).coalesce(4).cache();
        if ((loops++) >= 50)
        {
            clusters.checkpoint();
            clusters.count();            // action: materializes the checkpoint and truncates the lineage
            similarities.checkpoint();   // materialized by the count() at the end of the loop body
            loops = 0;
        }
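
The reason the action matters can be seen with the same Spark-free analogy as before: once a chained structure is materialized into a flat collection (which is, roughly, what a completed checkpoint does to an RDD's lineage), default serialization no longer has to recurse through the chain. A hypothetical sketch, with the `Node` chain again standing in for RDD lineage:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

public class TruncateDemo {
    static class Node implements Serializable {
        final Node parent;
        final int value;
        Node(Node parent, int value) { this.parent = parent; this.value = value; }
    }

    // "Materialize" the chain: walk it iteratively and keep only the data,
    // dropping the parent links -- analogous to checkpoint() followed by an action.
    static List<Integer> materialize(Node head) {
        List<Integer> values = new ArrayList<>();
        for (Node n = head; n != null; n = n.parent) values.add(n.value);
        return values;
    }

    static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (StackOverflowError | IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        Node deep = null;
        for (int i = 0; i < 500_000; i++) deep = new Node(deep, i);
        System.out.println("deep chain: " + serializes(deep));              // recursion overflows the stack
        System.out.println("flat copy:  " + serializes(new ArrayList<>(materialize(deep))));  // serializes fine
    }
}
```

The flat copy holds the same data but has no recursive structure, so writing it out uses constant stack depth.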
