Blog for the Open Source JavaHotel project

Tuesday, February 4, 2020

HDP 3.1 and Spark job

I spent several sleepless nights trying to solve a problem with a Spark/HBase application. The application kept dying with this nasty error stack:
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
	at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
	at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
	... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
	at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
	at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
	at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
Usually, problems like this point to a versioning conflict. But how could that happen if, at runtime, the application depends only on the client libraries provided by the HDP cluster?
Finally, after browsing through the source code and comparing different versions of the libraries, I crawled out of the swamp.
The culprit was an incompatible library in the HBase client directory, netty-all-4.0.52.Final.jar. This library calls the deprecated transfered method which, in turn, calls a non-existent transferred method in the Spark class. The newer netty-all-4.1.17.Final.jar calls the correct transferred method.
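
A quick way to see the conflict is to list the netty jars shipped in both client directories (the exact jar versions may differ between HDP builds, so treat the numbers above as an example):

ls /usr/hdp/current/hbase-client/lib/netty-all-*.jar
ls /usr/hdp/current/spark2-client/jars/netty-all-*.jar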

The solution was dazzlingly simple: reverse the order of the classpath entries in the spark-submit command and give precedence to the correct libraries in the Spark client jars directory (a fuller example follows the list below).
spark-submit ... --conf "spark.driver.extraClassPath=$CONF:$LIB" ...
  • wrong: LIB=/usr/hdp/current/hbase-client/lib/*:/usr/hdp/current/spark2-client/jars/*
  • correct: LIB=/usr/hdp/current/spark2-client/jars/*:/usr/hdp/current/hbase-client/lib/*
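
For context, here is a minimal, hypothetical sketch of the full invocation; the master, the executor classpath setting, the class name, the application jar and the CONF value are placeholders and not part of the original command:

# CONF, com.example.MyHBaseJob and myapp.jar are placeholders for your own setup
export CONF=/etc/hbase/conf
export LIB="/usr/hdp/current/spark2-client/jars/*:/usr/hdp/current/hbase-client/lib/*"
spark-submit --master yarn \
  --conf "spark.driver.extraClassPath=$CONF:$LIB" \
  --conf "spark.executor.extraClassPath=$CONF:$LIB" \
  --class com.example.MyHBaseJob \
  myapp.jar

The asterisks are left unexpanded on purpose: the JVM itself expands directory/* classpath wildcards, and whichever directory comes first wins when duplicate classes are resolved.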
