Viewing the internal content of a Spark DataFrame

When debugging a Spark program, I can pause execution and inspect the stack frame to see all of the DataFrame's metadata: things like the split input, the logical plan, basic RDD metadata, and so on. But I can't see the *contents* of the DataFrame, because the data lives in a different JVM, either on another node or even on the same node (on a local learning cluster). So my question is: does anyone have a troubleshooting technique for viewing the contents of a DataFrame's partitions in a way that can be driven from the driver program under a debugger?
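One common approach is to evaluate a small *action* on the DataFrame from the debugger's expression window: actions such as `show`, `take`, and `collect` pull rows back into the driver JVM, where the debugger can see them. A minimal sketch below, assuming a local SparkSession; the DataFrame `df` and its columns are illustrative, not from the question:

```scala
import org.apache.spark.sql.SparkSession

object DebugPeek {
  def main(args: Array[String]): Unit = {
    // Local cluster, as in the "local learning cluster" scenario
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("debug-peek")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, "a"), (2, "b"), (3, "c")).toDF("id", "value")

    // Any of these can be typed into a debugger's "evaluate expression"
    // window at a breakpoint, because they are actions that materialize
    // rows in the driver JVM:
    df.show(20, truncate = false)          // pretty-prints up to 20 rows
    val rows = df.take(5)                  // Array[Row], held locally
    println(df.limit(5).collect().mkString("\n"))

    // The plans already visible in the frame metadata, printed explicitly:
    df.explain(extended = true)

    spark.stop()
  }
}
```

Keep the row counts small (`take(n)` / `limit(n)`) so the evaluation doesn't try to collect a large dataset into the driver. Note that evaluating an action in the debugger does trigger real job execution on the cluster.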
