2017-07-19 16:25:45,229 : DEBUG : main : NodeContainerEditPart :  :  : Spark Joiner 3:69 (CONFIGURED)
2017-07-19 16:26:36,869 : DEBUG : main : NodeContainerEditPart :  :  : Create Spark Context 3:9 (CONFIGURED)
2017-07-19 16:26:38,006 : DEBUG : main : ExecuteAction :  :  : Creating execution job for 1 node(s)...
2017-07-19 16:26:38,006 : DEBUG : main : NodeContainer :  :  : Setting dirty flag on Create Spark Context 3:9
2017-07-19 16:26:38,006 : DEBUG : main : NodeContainer :  :  : Setting dirty flag on wf_nfirs_dw_v1 3
2017-07-19 16:26:38,006 : DEBUG : main : NodeContainer :  :  : Create Spark Context 3:9 has new state: CONFIGURED_MARKEDFOREXEC
2017-07-19 16:26:38,006 : DEBUG : main : NodeContainer :  :  : Create Spark Context 3:9 has new state: CONFIGURED_QUEUED
2017-07-19 16:26:38,006 : DEBUG : KNIME-Workflow-Notifier : WorkflowEditor :  :  : Workflow event triggered: WorkflowEvent [type=WORKFLOW_DIRTY;node=3;old=null;new=null;timestamp=Jul 19, 2017 4:26:38 PM]
2017-07-19 16:26:38,006 : DEBUG : main : NodeContainer :  :  : wf_nfirs_dw_v1 3 has new state: EXECUTING
2017-07-19 16:26:38,006 : DEBUG : KNIME-WFM-Parent-Notifier : NodeContainer :  :  : ROOT has new state: EXECUTING
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : WorkflowManager : Create Spark Context : 3:9 : Create Spark Context 3:9 doBeforePreExecution
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : Create Spark Context 3:9 has new state: PREEXECUTE
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : WorkflowManager : Create Spark Context : 3:9 : Create Spark Context 3:9 doBeforeExecution
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : Create Spark Context 3:9 has new state: EXECUTING
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : WorkflowFileStoreHandlerRepository : Create Spark Context : 3:9 : Adding handler c8b9f119-61de-4386-8025-d1be0bfc96aa (Create Spark Context 3:9: ) - 22 in total
2017-07-19 16:26:38,006 : DEBUG : KNIME-Worker-117 : LocalNodeExecutionJob : Create Spark Context : 3:9 : Create Spark Context 3:9 Start execute
2017-07-19 16:26:38,006 : INFO : KNIME-Worker-117 : JobserverSparkContext : Create Spark Context : 3:9 : Spark context jobserver://10.10.10.100:8090/spark-job-server_rahul changed status from CONFIGURED to CONFIGURED
2017-07-19 16:26:38,008 : DEBUG : KNIME-Worker-117 : JobserverSparkContext : Create Spark Context : 3:9 : Checking if remote context exists. Name: spark-job-server_rahul
2017-07-19 16:26:38,017 : DEBUG : KNIME-Worker-117 : JobserverSparkContext : Create Spark Context : 3:9 : Remote context does not exist. Name: spark-job-server_rahul
2017-07-19 16:26:38,017 : DEBUG : KNIME-Worker-117 : JobserverSparkContext : Create Spark Context : 3:9 : Creating new remote Spark context. Name: spark-job-server_rahul
2017-07-19 16:26:47,673 : INFO : KNIME-Worker-117 : JobserverSparkContext : Create Spark Context : 3:9 : Spark context jobserver://10.10.10.100:8090/spark-job-server_rahul changed status from CONFIGURED to CONFIGURED
2017-07-19 16:26:47,673 : DEBUG : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : reset
2017-07-19 16:26:47,673 : DEBUG : KNIME-Worker-117 : SparkNodeModel : Create Spark Context : 3:9 : In reset() of SparkNodeModel. Calling deleteRDDs.
2017-07-19 16:26:47,673 : ERROR : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : Execute failed: Yarn application has already ended! It might have been killed or unable to launch application master. (for details see View > Open KNIME log)
2017-07-19 16:26:47,673 : DEBUG : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : Execute failed: Yarn application has already ended! It might have been killed or unable to launch application master.
(for details see View > Open KNIME log)
com.knime.bigdata.spark.core.context.jobserver.request.RestoredThrowable: org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
	at spark.jobserver.context.DefaultSparkContextFactory$$anon$1.<init>(SparkContextFactory.scala:53)
	at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:53)
	at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:48)
	at spark.jobserver.context.SparkContextFactory$class.makeContext(SparkContextFactory.scala:37)
	at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:48)
	at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:378)
	at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:122)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
	at ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
	at ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)
	at ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)
	at ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
	at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
	at ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
	at akka.actor.ActorCell.invoke(ActorCell.scala:456)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
	at akka.dispatch.Mailbox.run(Mailbox.scala:219)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2017-07-19 16:26:47,673 : DEBUG : KNIME-Worker-117 : WorkflowManager : Create Spark Context : 3:9 : Create Spark Context 3:9 doBeforePostExecution
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : Create Spark Context 3:9 has new state: POSTEXECUTE
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : WorkflowManager : Create Spark Context : 3:9 : Create Spark Context 3:9 doAfterExecute - failure
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : reset
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : SparkNodeModel : Create Spark Context : 3:9 : In reset() of SparkNodeModel. Calling deleteRDDs.
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : clean output ports.
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : WorkflowFileStoreHandlerRepository : Create Spark Context : 3:9 : Removing handler c8b9f119-61de-4386-8025-d1be0bfc96aa (Create Spark Context 3:9: ) - 21 remaining
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : Create Spark Context 3:9 has new state: IDLE
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : SparkContextCreatorNodeModel : Create Spark Context : 3:9 : Reconfiguring old context with same ID.
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : Create Spark Context : Create Spark Context : 3:9 : Configure succeeded. (Create Spark Context)
2017-07-19 16:26:47,674 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : Create Spark Context 3:9 has new state: CONFIGURED
2017-07-19 16:26:47,677 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:66 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,678 : DEBUG : KNIME-Node-Usage-Writer : NodeTimer$GlobalNodeStats :  :  : Successfully wrote node usage stats to file: D:\Users\rghadge\workspace\.metadata\knime\nodeusage_3.0.json
2017-07-19 16:26:47,679 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:68 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,680 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:71 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,682 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:72 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,684 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:73 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,686 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:74 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,687 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:75 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,689 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:76 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,691 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:83 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,693 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:82 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,694 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:85 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,696 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:86 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,698 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:87 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,699 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:92 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,701 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:93 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,703 : DEBUG : KNIME-Worker-117 : Hive to Spark : Hive to Spark : 3:94 : Configure succeeded. (Hive to Spark)
2017-07-19 16:26:47,703 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:69 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,704 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:77 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,704 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:78 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,704 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:79 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,705 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:80 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,705 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:81 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,705 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:84 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,706 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:88 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,706 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:89 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,706 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:90 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,707 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:91 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,707 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:96 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,708 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:97 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,708 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:98 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,709 : DEBUG : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:99 : Configure succeeded. (Spark Joiner)
2017-07-19 16:26:47,709 : WARN : KNIME-Worker-117 : Spark Joiner : Spark Joiner : 3:104 : Spark context of first input incompatible with Spark context of second input
2017-07-19 16:26:47,709 : DEBUG : KNIME-Worker-117 : NodeContainer : Create Spark Context : 3:9 : wf_nfirs_dw_v1 3 has new state: IDLE
2017-07-19 16:26:47,709 : DEBUG : KNIME-WFM-Parent-Notifier : NodeContainer :  :  : ROOT has new state: IDLE