I am new to Spark and Kafka and want to set the Kafka parameters from a properties file.

What I am doing now is below

val topic = "mytopic"
val zkhosts = "localhost"
val zkports = "2181"

in my code and then pass these values to the function that creates the Kafka stream. Instead, I want to read them from a .properties file. Is there a way to do this? Any solution would be very helpful.



1 answer


Given this properties file in /tmp/sample.properties:

kafka.topic = mytopic
kafka.zkhost = localhost
kafka.zkports = 2181

We could use the plain Java Properties API to load the file:

import java.io.FileReader
import java.util.Properties

val configFile = new java.io.File("/tmp/sample.properties")
val reader = new FileReader(configFile)
val props = new Properties()
props.load(reader)
reader.close()
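
Once the file is loaded, the values can be read back out of the Properties object and handed to the stream creation call. A minimal sketch, assuming the receiver-based KafkaUtils.createStream API from the spark-streaming-kafka 0.8 connector (the one that takes a ZooKeeper quorum); the consumer group name, app name, and batch interval here are illustrative choices, not part of the question:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Values loaded above from /tmp/sample.properties
val topic = props.getProperty("kafka.topic")
val zkQuorum = props.getProperty("kafka.zkhost") + ":" + props.getProperty("kafka.zkports")

val ssc = new StreamingContext(new SparkConf().setAppName("KafkaFromProps"), Seconds(5))

// "my-consumer-group" and the single-thread topic map are illustrative choices
val stream = KafkaUtils.createStream(ssc, zkQuorum, "my-consumer-group", Map(topic -> 1))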

You can also use your favorite configuration library to load the properties file, just as you would in any other program.

For example, you can use the popular Typesafe Config library. There are several Scala wrappers around it, but in its plain form you can do something like:

import java.util.Properties
import com.typesafe.config.ConfigFactory

val configFile = new java.io.File("/tmp/sample.properties")
val kafkaConfig = ConfigFactory.parseFile(configFile)

val kafkaProperties = new Properties()
kafkaProperties.put("zookeeper.hosts", kafkaConfig.getString("kafka.zkhost"))
// Ascribing to java.lang.Integer boxes the Int so it fits Properties.put(Object, Object)
kafkaProperties.put("zookeeper.port", kafkaConfig.getInt("kafka.zkports"): java.lang.Integer)
kafkaProperties.put("kafka.topic", kafkaConfig.getString("kafka.topic"))


(There are many ways to make this nicer and more compact; I use the most common form here. One more compact variant is sketched below.)
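
For example, a sketch of a more compact variant (assuming you simply want every key in the file copied into the Properties object verbatim) converts the whole Config in a single pass:

import java.util.Properties
import com.typesafe.config.ConfigFactory
import scala.collection.JavaConverters._

val kafkaConfig = ConfigFactory.parseFile(new java.io.File("/tmp/sample.properties"))

val kafkaProperties = new Properties()
// entrySet() exposes every leaf key; unwrapped() turns each ConfigValue back into a plain value
kafkaConfig.entrySet().asScala.foreach { entry =>
  kafkaProperties.put(entry.getKey, entry.getValue.unwrapped().toString)
}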
