Blog for the Open Source JavaHotel project

Saturday, 29 February 2020

Simple RestService library

Motivation
A REST API is the medium of choice for communication between loosely coupled applications. There are plenty of REST API implementations, but I was looking for a solution as simple as possible, with minimal external dependencies or prerequisites. Finally, I ended up with a compact library built around the HttpServer already shipped with the Java JDK.
Links
IntelliJ IDEA project, source code and javadoc.
Sample project utilizing the RestService project.
Another project with a RestService dependency.
Highlights
  • Very lightweight, no external dependencies, just the Java JDK.
  • Can be dockerized, sample Dockerfile.
  • Add-ons making the developer's life easier:
    • Validating and extracting URL query parameters, including type checking.
    • Uploading data.
    • CORS relaxation.
    • Sending data with the proper HTTP response code.
  • "Dynamic" and "static" REST API calls. "Dynamic" means that the specification of a particular REST API endpoint can be defined after the request reaches the server but before the request is handled, thus allowing different custom logic depending on the URL path.
Usage
The service class should extend the RestHelper.RestServiceHelper abstract class and implement two methods:
  • getParams: delivers the REST API call specification (see below), including the URL query parameter definitions. The method is called after the REST API request is accepted by the HTTP server but before the call is validated and executed.
  • servicehandle: the custom logic serving the particular REST API endpoint. The method should conclude the handling of the request with a proper "produceresponse" call. "servicehandle" can read the URL query parameters and utilize several helper methods. A sketch follows this list.
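A minimal sketch of such a service class, assuming hypothetical method signatures (the exact RestHelper.RestServiceHelper API may differ; consult the javadoc linked above):

import com.sun.net.httpserver.HttpExchange;

// Hypothetical sketch: the real signatures of getParams, servicehandle,
// getStringParam and produceresponse may differ from what is shown here.
public class HelloService extends RestHelper.RestServiceHelper {

    @Override
    public RestParams getParams(HttpExchange exchange) {
        // Called after the request is accepted but before it is validated
        // and handled, so the specification can depend on the URL path
        // (the "dynamic" REST API call mentioned above).
        RestParams params = new RestParams("GET");
        params.addParam("name", RestParams.STRING);
        return params;
    }

    @Override
    public void servicehandle(HttpExchange exchange) {
        // Custom endpoint logic; conclude with a produceresponse call.
        String name = getStringParam(exchange, "name");
        produceresponse(exchange, "Hello " + name, 200);
    }
}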
REST API specification
The REST API endpoint specification is defined through the RestParams class. The specification consists of:
  • The HTTP request method: GET, POST, PUT etc.
  • The list of allowed URL query parameters. Three parameter types are supported: BOOLEAN, INT and STRING (meaning any other).
  • Whether CORS should be relaxed for this particular endpoint.
  • The response content type (TEXT, JSON or not specified), the Content-Type header.
  • The list of methods allowed in the response header, Access-Control-Allow-Methods.
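A hypothetical way of building such a specification (the constructor and setter names below are assumptions for illustration, not the verified RestParams API):

// Hypothetical sketch: the real RestParams constructors and setters may differ.
RestParams spec = new RestParams("GET");        // HTTP request method
spec.addParam("count", RestParams.INT);         // typed URL query parameter
spec.addParam("verbose", RestParams.BOOLEAN);   // BOOLEAN query parameter
spec.setCORSRelaxed(true);                      // relax CORS for this endpoint
spec.setResponseContent(RestParams.JSON);       // Content-Type of the response
spec.setAllowedMethods("GET, POST");            // Access-Control-Allow-Methods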
Main
The main class should extend the RestStart abstract class.
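A minimal entry point could then look like this (the start helper and its parameters are assumptions, not the verified RestStart contract; see the sample projects linked above for real usage):

// Hypothetical sketch: RestStart's actual API may differ.
public class Main extends RestStart {

    public static void main(String[] args) {
        // Assumed helper: bind the JDK HttpServer to a port and route
        // the /hello path to our service class.
        new Main().start(8080, "/hello", new HelloService());
    }
}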


Tuesday, 4 February 2020

HDP 3.1 and Spark job

I spent several sleepless nights trying to solve a problem with a Spark/HBase application. The application kept dying with a nasty error stack:
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
    at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
    ... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
    at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
    at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
    at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
Problems like that usually point to a versioning conflict. But how could that happen if the application at runtime depends only on the client libraries provided by the HDP cluster?
Finally, after browsing through the source code and comparing different versions of the libraries, I crawled out of the swamp.
The culprit was an incompatible library in the HBase client directory: netty-all-4.0.52.Final.jar. This old Netty calls the deprecated transfered method in the Spark class which, in turn, calls a transferred method that the old Netty does not declare. The newer netty-all-4.1.17.Final.jar calls the correct transferred method directly.
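The delegation at the heart of the error looks roughly like this (paraphrased and simplified from Spark's org.apache.spark.network.util.AbstractFileRegion, compiled against Netty 4.1):

import io.netty.channel.FileRegion;
import io.netty.util.AbstractReferenceCounted;

// Paraphrased/simplified: the deprecated Netty 4.0-era transfered() delegates
// to transferred(), which the FileRegion interface declares only as of
// Netty 4.1. With netty-all-4.0.52.Final.jar first on the classpath, the
// delegated call fails at runtime with the NoSuchMethodError seen above.
public abstract class AbstractFileRegion extends AbstractReferenceCounted
        implements FileRegion {

    @Override
    @SuppressWarnings("deprecation")
    public long transfered() {
        return transferred(); // AbstractFileRegion.java:28 in the stack trace
    }
}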

The solution was dazzlingly simple: just reverse the order of the classpath entries in the spark-submit command and give precedence to the correct libraries in the Spark client jars.
spark-submit ... --conf "spark.driver.extraClassPath=$CONF:$LIB" ...
  • wrong: LIB=/usr/hdp/current/hbase-client/lib/*:/usr/hdp/current/spark2-client/jars/*
  • correct: LIB=/usr/hdp/current/spark2-client/jars/*:/usr/hdp/current/hbase-client/lib/*