SparkSQL: Ignoring invalid JSON files

I am loading a bunch of JSON files using SparkSQL, but some of them have problems.

I would like to continue processing the other files while ignoring the bad ones. How can I do this?

I tried wrapping the call in a try-catch, but it still fails. Example:

try {
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._

    val jsonFiles=sqlContext.jsonFile("/requests.loading")
} catch {
    case _: Throwable => // Catching all exceptions and not doing anything with them
}


It fails with:

14/11/20 01:20:44 INFO scheduler.TaskSetManager: Starting task 3065.0 in stage 1.0 (TID 6150, HDdata2, NODE_LOCAL, 1246 bytes)
14/11/20 01:20:44 WARN scheduler.TaskSetManager: Lost task 3027.1 in stage 1.0 (TID 6130, HDdata2): com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input: was expecting closing quote for a string value
 at [Source: java.io.StringReader@753ab9f1; line: 1, column: 1805]






1 answer


If you are using Spark 1.2, Spark SQL will handle these broken JSON records for you. Here's an example:



// requests.loading has some broken records
val jsonFiles = sqlContext.jsonFile("/requests.loading")
// The schema of jsonFiles has an extra column called "_corrupt_record",
// which holds the original text of every broken JSON record
// jsonFiles.printSchema()
// Register jsonFiles as a table
jsonFiles.registerTempTable("jsonTable")
// Query all normal records
sqlContext.sql("SELECT * FROM jsonTable WHERE _corrupt_record IS NULL")
// Query all broken JSON records
sqlContext.sql("SELECT _corrupt_record FROM jsonTable WHERE _corrupt_record IS NOT NULL")


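If you are on an older Spark version and the whole job fails as in the stack trace above, a rough workaround (a sketch, not part of the original answer, assuming one JSON object per line, which is what jsonFile expects anyway, and that jackson-databind is on the classpath) is to pre-filter the raw lines yourself before handing them to Spark SQL:

    import com.fasterxml.jackson.databind.ObjectMapper

    // Read the files as plain text and drop every line Jackson cannot parse
    val rawLines = sc.textFile("/requests.loading")
    val validLines = rawLines.mapPartitions { lines =>
      val mapper = new ObjectMapper() // one parser per partition; ObjectMapper is not serializable
      lines.filter { line =>
        try { mapper.readTree(line); true }
        catch { case _: Exception => false }
      }
    }
    // Hand only the parseable lines to Spark SQL
    val jsonFiles = sqlContext.jsonRDD(validLines)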








