The "SessionTrigger" class must either be declared abstract or implement an abstract element

I am creating a Flink 1.2 tutorial and I want to run some simple window examples. One of them is session windows.

The code I want to run is the following:

import <package>.Session
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows
import org.apache.flink.streaming.api.windowing.triggers.PurgingTrigger
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow

import scala.util.Try

object SessionWindowExample {

  def main(args: Array[String]) {

    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val source = env.socketTextStream("localhost", 9000)

    //session map
    val values = source.map(value => {
      val columns = value.split(",")
      val endSignal = Try(Some(columns(2))).getOrElse(None)
      Session(columns(0), columns(1).toDouble, endSignal)
    })

    val keyValue = values.keyBy(_.sessionId)

    // create global window

    val sessionWindowStream = keyValue.
      window(GlobalWindows.create()).
      trigger(PurgingTrigger.of(new SessionTrigger[GlobalWindow]()))

    sessionWindowStream.sum("value").print()

    env.execute()
  }
}

      

As you will notice, I need to instantiate a new SessionTrigger object, which I am building based on this class:

import <package>.Session
import org.apache.flink.streaming.api.windowing.triggers.Trigger.TriggerContext
import org.apache.flink.streaming.api.windowing.triggers.{Trigger, TriggerResult}
import org.apache.flink.streaming.api.windowing.windows.Window

class SessionTrigger[W <: Window] extends Trigger[Session,W] {

  override def onElement(element: Session, timestamp: Long, window: W, ctx: TriggerContext): TriggerResult = {
    if(element.endSignal.isDefined) TriggerResult.FIRE
    else TriggerResult.CONTINUE
  }

  override def onProcessingTime(time: Long, window: W, ctx: TriggerContext): TriggerResult = {
    TriggerResult.CONTINUE
  }
  override def onEventTime(time: Long, window: W, ctx: TriggerContext): TriggerResult = {
    TriggerResult.CONTINUE
  }
}

      

However, IntelliJ continues to complain that: Class 'SessionTrigger' must either be declared abstract or implement abstract member 'clear(window: W, ctx: TriggerContext):void' in 'org.apache.flink.streaming.api.windowing.triggers.Trigger'.

I tried adding this to the class:

 override def clear(window: W, ctx: TriggerContext): Unit = ctx.deleteEventTimeTimer(4)

      

but it doesn't work: the code now compiles, but when I run the job I get the following error:

  03/27/2017 15:48:38   TriggerWindow(GlobalWindows(), ReducingStateDescriptor{serializer=co.uk.DRUK.flink.windowing.SessionWindowExample.SessionWindowExample$$anon$2$$anon$1@1aec64d0, reduceFunction=org.apache.flink.streaming.api.functions.aggregation.SumAggregator@1a052a00}, PurgingTrigger(co.uk.DRUK.flink.windowing.SessionWindowExample.SessionTrigger@f2f2cc1), WindowedStream.reduce(WindowedStream.java:276)) -> Sink: Unnamed(4/4) switched to CANCELED
  03/27/2017 15:48:38   Job execution switched to status FAILED.
  Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
  at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply$mcV$sp(JobManager.scala:900)
  at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
  at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:843)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
  at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
  at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
  at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
  at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
  at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
  Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
  at co.uk.DRUK.flink.windowing.SessionWindowExample.SessionWindowExample$$anonfun$1.apply(SessionWindowExample.scala:27)
  at co.uk.DRUK.flink.windowing.SessionWindowExample.SessionWindowExample$$anonfun$1.apply(SessionWindowExample.scala:24)
  at org.apache.flink.streaming.api.scala.DataStream$$anon$4.map(DataStream.scala:521)
  at org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:38)
  at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:185)
  at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:63)
  at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:272)
  at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
  at java.lang.Thread.run(Thread.java:745)

  Process finished with exit code 1

      

Does anyone know why?



1 answer


Well, the exception clearly says:

 Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
  at co.uk.DRUK.flink.windowing.SessionWindowExample.SessionWindowExample$$anonfun$1.apply(SessionWindowExample.scala:27)
  at co.uk.DRUK.flink.windowing.SessionWindowExample.SessionWindowExample$$anonfun$1.apply(SessionWindowExample.scala:24)

      

which obviously points to the following line of code:

  Session(columns(0), columns(1).toDouble, endSignal)

      



So, the next obvious thing is to log your columns and value after

  val columns = value.split(",")

I suspect that value just doesn't have a second comma-separated field, at least for some of the input lines: split(",") on a line without a comma returns a single-element array, so columns(1) throws.
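To make that concrete, here is a minimal sketch of a more defensive version of the map step. It reuses the source stream and the Session type from the question, switches map to flatMap so malformed lines can simply be dropped, and uses println only for illustration (a real job would use a proper logger); the assumed sessionId,value[,endSignal] format comes from how the question's code indexes the columns.

val values = source.flatMap { line =>
  val columns = line.split(",")
  if (columns.length >= 2) {
    // the third column (the end signal) is optional
    val endSignal = if (columns.length > 2) Some(columns(2)) else None
    Seq(Session(columns(0), columns(1).toDouble, endSignal))
  } else {
    // malformed line: log it and emit nothing instead of failing the job
    println(s"Dropping malformed input line: '$line'")
    Seq.empty[Session]
  }
}

Logging the raw line like this should quickly show which input doesn't match the expected format; note that columns(1).toDouble can still throw if the second field isn't numeric, so you may want to guard that conversion as well.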
