Destroying Spark UDFs Explicitly

I have a long-running SparkContext in which I define a UDF that captures a lot of data in its closure. I noticed that the memory used by this UDF is not released until the SparkContext terminates. I would like to find a way to reclaim this memory before the application exits. I've tried the following with no luck:
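The underlying issue can be sketched in plain Python, without Spark: a UDF is just a function object, and its closure keeps any captured data alive for as long as something (such as a UDF registry) holds a reference to the function. All names below are illustrative, not part of any Spark API:

```python
# Sketch: a closure capturing a large object, as a registered UDF would.
def make_udf():
    # Large captured data, standing in for the data the real UDF holds.
    big_lookup = {i: str(i) for i in range(100_000)}

    def udf(x):
        return big_lookup.get(x, "missing")

    return udf

udf = make_udf()

# The captured dict lives in the function's closure cells, so it cannot
# be garbage-collected while the function object itself is reachable.
captured = udf.__closure__[0].cell_contents
print(len(captured))   # the 100_000-entry dict is still alive
print(udf(5))
```

This is why re-registering a new function under the same name does not necessarily help: if the old function object is still referenced anywhere, its closure (and the captured data) stays in memory.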

  • Registering another UDF under the same name, hoping to trick Spark into freeing the memory used by the first.

  • Looking at Spark's ContextCleaner, but all of its relevant methods are private and not available to users.

  • Looking at the UDFRegistration class, which has methods for registering UDFs, but none for unregistering them.

Is there a way to remove or destroy UDFs in Spark while the SparkContext is still alive?
