How do I join on a binary field?

In Scala / Spark, I am trying to do the following:

val portCalls_Ports = 
  portCalls.join(ports, portCalls("port_id") === ports("id"), "inner")


However, I am getting the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: 
     binary type expression port_id cannot be used in join conditions;


It is true that this is a binary type:

root
 |-- id: binary (nullable = false)
 |-- port_id: binary (nullable = false)
     .
     .
     .

+--------------------+--------------------+
|                  id|             port_id|
+--------------------+--------------------+
|[FB 89 A0 FF AA 0...|[B2 B2 84 B9 52 2...|


as is ports("id").

I am using the following libraries:

scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
  // Spark dependencies
  "org.apache.spark" %% "spark-hive" % "1.6.2",
  "org.apache.spark" %% "spark-mllib" % "1.6.2",
  // Third-party libraries
  "postgresql" % "postgresql" % "9.1-901-1.jdbc4",
  "net.sf.jopt-simple" % "jopt-simple" % "5.0.3"
)


Please note that I am using JDBC to read the database tables.

What is the best way to solve this problem?



1 answer


Prior to Spark 2.1.0, the best workaround I know of is to use the base64 function to convert the binary columns to strings and compare those:



import org.apache.spark.sql.functions._

val portCalls_Ports =
  portCalls.join(ports, base64(portCalls("port_id")) === base64(ports("id")), "inner")
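As a plain-Scala sanity check (outside Spark, using only java.util.Base64 with made-up byte values), this sketch illustrates why the workaround is safe: Base64 encoding is injective, so two binary keys compare equal as encoded strings exactly when their underlying bytes are equal.

```scala
import java.util.Base64

object Base64KeyCheck extends App {
  // Two byte arrays with identical contents (think: two copies of a port_id).
  // On arrays, == compares references, so it is false even for equal bytes.
  val a: Array[Byte] = Array(0xB2, 0xB2, 0x84).map(_.toByte)
  val b: Array[Byte] = Array(0xB2, 0xB2, 0x84).map(_.toByte)
  println(a == b)       // false: reference equality on arrays

  // Base64-encoding each array yields identical Strings, and String
  // equality is by value, so the encoded columns join correctly.
  val encA = Base64.getEncoder.encodeToString(a)
  val encB = Base64.getEncoder.encodeToString(b)
  println(encA == encB) // true
}
```

Per the version caveat above, upgrading to Spark 2.1.0 or later should allow the original join on the binary columns directly, making the base64 conversion unnecessary.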

