Monday, September 12, 2016

Enabling 2nd level caching in Hibernate/WildFly

standalone.xml


Specify the hibernate-infinispan module and its version in the cache-container configuration:



<cache-container name="hibernate" default-cache="local-query" module="org.hibernate.infinispan:5.2.1.Final">
                <local-cache name="entity">
                    <transaction mode="NON_XA"/>
                    <eviction strategy="LRU" max-entries="10000"/>
                    <expiration max-idle="100000"/>
                </local-cache>
                <local-cache name="local-query">
                    <eviction strategy="LRU" max-entries="10000"/>
                    <expiration max-idle="100000"/>
                </local-cache>
                <local-cache name="timestamps"/>
            </cache-container>


persistence.xml


<persistence-unit name="apidb-persistence-unit" transaction-type="JTA">
    <description>Forge Persistence Unit</description>
    <provider>org.hibernate.ejb.HibernatePersistence</provider>
    <jta-data-source>java:jboss/datasources/lobDS</jta-data-source>
    <shared-cache-mode>ALL</shared-cache-mode>
    <properties>
        <property name="hibernate.cache.use_second_level_cache" value="true"/>
    </properties>
</persistence-unit>
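Note that a shared-cache-mode of ALL caches every entity. If only selected entities should be cached, a sketch of the usual alternative (standard JPA 2.x behaviour):

```xml
<!-- cache only entities explicitly annotated with @javax.persistence.Cacheable -->
<shared-cache-mode>ENABLE_SELECTIVE</shared-cache-mode>
```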

Wednesday, September 7, 2016

De/serializing LocalDateTime Using Jackson

java.time.LocalDateTime, which represents a date-time as year-month-day-hour-minute-second, was introduced in JDK 8.
By default, Jackson serializes an object containing a LocalDateTime as an array of comma-separated numeric fields. To get ISO-formatted strings instead, register the JavaTimeModule:


import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;


import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer;
 
ObjectMapper mapper = new ObjectMapper();


JavaTimeModule javaTimeModule = new JavaTimeModule();

javaTimeModule.addDeserializer(LocalDateTime.class, new LocalDateTimeDeserializer(DateTimeFormatter.ISO_DATE_TIME));

mapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);

mapper.registerModule(javaTimeModule);


You also need to add the following Maven dependency:

<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.6.5</version>
</dependency>
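As a quick sanity check of the format involved, the JDK's own DateTimeFormatter.ISO_DATE_TIME shows what the registered deserializer expects to read back (a minimal sketch; the class name IsoDateTimeDemo is just for illustration):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class IsoDateTimeDemo {

    // Render a LocalDateTime the way the registered deserializer expects to parse it.
    static String toIso(LocalDateTime dateTime) {
        return dateTime.format(DateTimeFormatter.ISO_DATE_TIME);
    }

    public static void main(String[] args) {
        LocalDateTime dateTime = LocalDateTime.of(2016, 9, 7, 10, 30, 15);
        String text = toIso(dateTime);
        System.out.println(text); // 2016-09-07T10:30:15
        // and it round-trips through the same formatter:
        System.out.println(LocalDateTime.parse(text, DateTimeFormatter.ISO_DATE_TIME).equals(dateTime)); // true
    }
}
```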

Tuesday, August 30, 2016

Getting KeycloakSecurityContext From ServletRequest

When REST APIs are protected with Keycloak authentication, we might need to access the security context in the backend to read user information.

In a servlet Filter implementation:


import java.io.IOException;

import javax.inject.Inject;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

import org.apache.commons.codec.binary.Base64;
import org.json.JSONObject;
import org.keycloak.KeycloakSecurityContext;

public class ApplicationFilter implements Filter {

    // URL-safe Base64 decoder for the JWT segments
    private Base64 decoder = new Base64(true);

    @Inject
    ServletRequestHolder requestHolder;

    @Override
    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        try {
            checkAuthentication(req);
            ....
            //TODO
    }

    /**
     * Get the logged-in user info from the request.
     * @param request the incoming servlet request
     */
    private void checkAuthentication(ServletRequest request) {
        HttpServletRequest httpServletRequest = (HttpServletRequest) request;
        KeycloakSecurityContext kContext = (KeycloakSecurityContext) httpServletRequest
                .getAttribute(KeycloakSecurityContext.class.getName());
        String bearerToken;

        if (kContext != null) {
            bearerToken = kContext.getTokenString();
            String[] jwtToken = bearerToken.split("\\.");
            byte[] decodedClaims = decoder.decode(jwtToken[1]);
            JSONObject jsonObj = new JSONObject(new String(decodedClaims));
            .....
            //TODO
        }
    }
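The payload-decoding step in checkAuthentication can be reproduced with only the JDK's java.util.Base64 (in place of commons-codec); a minimal sketch with a made-up token:

```java
import java.util.Base64;

public class JwtPayloadDecoder {

    // Decode the middle (payload) segment of a JWT with the URL-safe Base64 decoder.
    static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        byte[] decoded = Base64.getUrlDecoder().decode(parts[1]);
        return new String(decoded);
    }

    public static void main(String[] args) {
        // Build a fake token: header and signature segments are dummies,
        // the payload is {"user":"bob"} encoded without padding.
        String payload = Base64.getUrlEncoder().withoutPadding()
                .encodeToString("{\"user\":\"bob\"}".getBytes());
        String jwt = "x." + payload + ".y";
        System.out.println(decodePayload(jwt)); // prints {"user":"bob"}
    }
}
```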

Thursday, August 11, 2016

Supporting Hibernate precision/scale for BigDecimal with MySQL

When we use BigDecimal in our POJOs, we define precision/scale values for it so the database column allocates enough digits to represent the values. In our setup, MySQL would not honour a scale of more than 2 digits, so BigDecimal columns were limited to 19/2.

@Column(name = "metre" ,precision = 19, scale = 10)
private BigDecimal metre = BigDecimal.ZERO;


To overcome this issue, we can convert the BigDecimal to a Double when storing values. JPA provides the @Convert annotation, which, together with an AttributeConverter, converts between datatypes.

Eg: 

@Convert(converter = BigDecimalConverter.class)
private  BigDecimal metre = BigDecimal.ZERO;


We need to write a custom converter for this.

Converter


import java.math.BigDecimal;

import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter
public class BigDecimalConverter implements AttributeConverter<BigDecimal, Double> {

    @Override
    public Double convertToDatabaseColumn(BigDecimal bigDecimalValue) {
        if (bigDecimalValue == null) {
            return null;
        }
        return bigDecimalValue.doubleValue();
    }

    @Override
    public BigDecimal convertToEntityAttribute(Double doubleValue) {
        if (doubleValue == null) {
            return null;
        }
        return BigDecimal.valueOf(doubleValue);
    }
}
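A quick way to see what the converter does to a value end to end: roundTrip below mirrors convertToDatabaseColumn followed by convertToEntityAttribute (a sketch for illustration, not part of the converter itself; values beyond double's roughly 15 significant digits may lose precision):

```java
import java.math.BigDecimal;

public class BigDecimalDoubleRoundTrip {

    // What the converter does on write (doubleValue) and on read (valueOf).
    static BigDecimal roundTrip(BigDecimal value) {
        double stored = value.doubleValue();
        return BigDecimal.valueOf(stored);
    }

    public static void main(String[] args) {
        System.out.println(roundTrip(new BigDecimal("123.4567890123")));
    }
}
```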

Thursday, July 7, 2016

JAX-RS Custom Exception Mapper

When the service (backend) throws a custom exception, we can catch it with a javax.ws.rs.ext.ExceptionMapper and send a relevant message to the client (caller).

import java.util.HashMap;
import java.util.Map;

import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

/**
 * Maps server-side exceptions to responses returned to the client.
 */
@Provider
@Produces(MediaType.APPLICATION_JSON)
public class CustomJaxRsExceptionMapper implements ExceptionMapper<ServiceException> {

    @Override
    public Response toResponse(ServiceException e) {
        // ServiceException is the application's own exception type
        Map<String, String> map = new HashMap<>();
        map.put("Exception", e.getMessage());
        return Response.status(500).entity(map).build();
    }
}
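The response body the mapper produces is just a one-entry map keyed by "Exception", which JAX-RS then renders as JSON. A minimal stdlib sketch of that construction (the ErrorPayload class and the sample exception are hypothetical, for illustration only):

```java
import java.util.HashMap;
import java.util.Map;

public class ErrorPayload {

    // Build the JSON-ready error body the way the mapper does.
    static Map<String, String> fromException(Exception e) {
        Map<String, String> body = new HashMap<>();
        body.put("Exception", e.getMessage());
        return body;
    }

    public static void main(String[] args) {
        System.out.println(fromException(new IllegalStateException("stock not found")));
        // prints {Exception=stock not found}
    }
}
```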

Wednesday, July 6, 2016

org.apache.kafka.common.errors.ApiException: The session timeout is not within an acceptable range.

To overcome the above issue with the Kafka 0.9.x consumer, set the session.timeout.ms value in the consumer properties to be greater than the group.min.session.timeout.ms property in server.properties.
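For example (the concrete values are placeholders; only the relationship between the two properties matters):

```properties
# server.properties
group.min.session.timeout.ms=6000

# consumer properties -- must be greater than the broker's minimum above
session.timeout.ms=30000
```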

Custom Serializer/Deserializer for Apache Kafka v 0.9.x

If we want to publish Java beans to Apache Kafka 0.9.x, we might need to write our own custom Serializer/Deserializer for them.
These custom (de)serializers are registered via the consumer/producer properties.


import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectInputStream;
import java.io.ObjectOutput;
import java.io.ObjectOutputStream;
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

import com.model.base.RawFile;


public class RawFileSerializer implements Serializer<RawFile>, Deserializer<RawFile> {

    @Override
    public byte[] serialize(String topic, RawFile file) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutput out = new ObjectOutputStream(bos)) {
            out.writeObject(file);
        } catch (IOException e) {
            throw new RuntimeException("Unable to serialize RawFile", e);
        }
        return bos.toByteArray();
    }

    @Override
    public RawFile deserialize(String topic, byte[] fileContent) {
        try (ObjectInput in = new ObjectInputStream(new ByteArrayInputStream(fileContent))) {
            return (RawFile) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException("Unable to deserialize RawFile", e);
        }
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // no configuration needed
    }

    @Override
    public void close() {
        // nothing to close
    }
}
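Assuming RawFileSerializer is compiled into a package such as com.model.base (substitute whatever package you actually used), it can be wired into the producer and consumer configuration by its fully-qualified name, just like the built-in serializers:

```properties
# producer properties
value.serializer=com.model.base.RawFileSerializer

# consumer properties
value.deserializer=com.model.base.RawFileSerializer
```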

Tuesday, July 5, 2016

RESTEASY003875: Unable to find a constructor that takes a String param or a valueOf() or fromString() method

The error [1] occurs when we use complex types in our GET query parameters. JAX-RS understands simple data types out of the box; for complex types, we need to provide our own string conversion via a ParamConverterProvider.



import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import java.math.BigDecimal;
import javax.money.Monetary;
import javax.ws.rs.ext.ParamConverter;
import javax.ws.rs.ext.ParamConverterProvider;
import javax.ws.rs.ext.Provider;

import org.javamoney.moneta.Money;

@Provider
public class MoneyConverterProvider implements ParamConverterProvider {

    private final MoneyConverter converter = new MoneyConverter();

    @Override
    @SuppressWarnings("unchecked")
    public <T> ParamConverter<T> getConverter(Class<T> rawType, Type genericType, Annotation[] annotations) {
        if (!rawType.equals(Money.class)) return null;
        return (ParamConverter<T>) converter;
    }

    public class MoneyConverter implements ParamConverter<Money> {

        @Override
        public Money fromString(String value) {
            if (value == null || value.isEmpty()) return null;
            return Money.of(new BigDecimal(value), Monetary.getCurrency("AUD"));
        }

        @Override
        public String toString(Money value) {
            if (value == null) return "";
            return value.toString();
        }
    }
}

[1]


0:42:46,435 ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 86) MSC000001: Failed to start service jboss.undertow.deployment.default-server.default-host./apidb: org.jboss.msc.service.StartException in service jboss.undertow.deployment.default-server.default-host./apidb: java.lang.RuntimeException: RESTEASY003875: Unable to find a constructor that takes a String param or a valueOf() or fromString() method for javax.ws.rs.QueryParam("money") on public javax.ws.rs.core.Response com.rest.autogen.SalesTransactionEndpoint.getItems(java.lang.Integer,java.lang.Integer,java.lang.String,java.lang.Long,java.lang.String, org.javamoney.moneta.Money) for basetype: org.javamoney.moneta.Money

at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:85)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at org.jboss.threads.JBossThread.run(JBossThread.java:320)
Caused by: java.lang.RuntimeException: RESTEASY003875: Unable to find a constructor that takes a String param or a valueOf() or fromString() method for javax.ws.rs.QueryParam("money") on public javax.ws.rs.core.Response com.rest.autogen.SalesTransactionEndpoint.getItems(java.lang.Integer,java.lang.Integer,java.lang.String,java.lang.Long,java.lang.String,org.javamoney.moneta.Money) for basetype: org.javamoney.moneta.Money
at org.jboss.resteasy.core.StringParameterInjector.initialize(StringParameterInjector.java:220)
at org.jboss.resteasy.core.StringParameterInjector.(StringParameterInjector.java:64)
at org.jboss.resteasy.core.QueryParamInjector.(QueryParamInjector.java:30)
at org.jboss.resteasy.core.InjectorFactoryImpl.createParameterExtractor(InjectorFactoryImpl.java:86)
at org.jboss.resteasy.cdi.CdiInjectorFactory.createParameterExtractor(CdiInjectorFactory.java:52)
at org.jboss.resteasy.core.MethodInjectorImpl.(MethodInjectorImpl.java:44)
at org.jboss.resteasy.core.InjectorFactoryImpl.createMethodInjector(InjectorFactoryImpl.java:77)
at org.jboss.resteasy.cdi.CdiInjectorFactory.createMethodInjector(CdiInjectorFactory.java:58)
at org.jboss.resteasy.core.ResourceMethodInvoker.(ResourceMethodInvoker.java:99)
at org.jboss.resteasy.core.ResourceMethodRegistry.processMethod(ResourceMethodRegistry.java:281)
at org.jboss.resteasy.core.ResourceMethodRegistry.register(ResourceMethodRegistry.java:252)
at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:222)
at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:194)
at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:180)
at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:157)
at org.jboss.resteasy.core.ResourceMethodRegistry.addPerRequestResource(ResourceMethodRegistry.java:76)
at org.jboss.resteasy.spi.ResteasyDeployment.registration(ResteasyDeployment.java:434)
at org.jboss.resteasy.spi.ResteasyDeployment.start(ResteasyDeployment.java:245)
at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.init(ServletContainerDispatcher.java:113)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.init(HttpServletDispatcher.java:36)
at io.undertow.servlet.core.LifecyleInterceptorInvocation.proceed(LifecyleInterceptorInvocation.java:117)
at org.wildfly.extension.undertow.security.RunAsLifecycleInterceptor.init(RunAsLifecycleInterceptor.java:78)
at io.undertow.servlet.core.LifecyleInterceptorInvocation.proceed(LifecyleInterceptorInvocation.java:103)
at io.undertow.servlet.core.ManagedServlet$DefaultInstanceStrategy.start(ManagedServlet.java:231)
at io.undertow.servlet.core.ManagedServlet.createServlet(ManagedServlet.java:132)
at io.undertow.servlet.core.DeploymentManagerImpl.start(DeploymentManagerImpl.java:526)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService.startContext(UndertowDeploymentService.java:101)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:82)

Friday, July 1, 2016

Sample Kafka Producer and Consumer

Applied to: Apache Kafka Version: 0.9.X


Producer 



import java.io.IOException;
import java.time.LocalDateTime;
import java.util.Properties;
import java.util.concurrent.TimeUnit;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Producer {

    private void generateMessages() throws IOException {
        String topic = "myTopic";
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("client.id", "test");

        KafkaProducer<String, String> producer = null;
        try {
            producer = new KafkaProducer<>(props);
            producer.send(new ProducerRecord<String, String>(topic, "test msg"));
        } catch (Throwable e) {
            e.printStackTrace();
        } finally {
            if (producer != null) {
                producer.close(100, TimeUnit.MILLISECONDS);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Producer producer = new Producer();
        producer.generateMessages();
    }
}
Consumer


import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class Listener {

    public void start() {
        String topic = "myTopic";
        List<String> topics = Arrays.asList(topic);

        Properties props = new Properties();
        props.put("bootstrap.servers", "aukk1.leightonobrien.com:9092");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("session.timeout.ms", "30000");
        props.put("heartbeat.interval.ms", "10000");
        props.put("auto.offset.reset", "earliest");
        props.put("group.id", "test");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("fetch.min.bytes", 1);
        props.put("receive.buffer.bytes", "10000");
        props.put("max.partition.fetch.bytes", "10000");
        props.put("request.timeout.ms", "40000");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        try {
            consumer.subscribe(topics);
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(3000);
                System.out.println("polling msges : " + records.count());
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("kafka record : " + record.value());
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            consumer.close();
        }
    }

    public static void main(String[] args) {
        Listener listener = new Listener();
        listener.start();
    }
}

Wednesday, March 30, 2016

The session timeout is not within an acceptable range : Kafka v0.9.0.x

To overcome the above issue [1] in Kafka 0.9.0.x, check that the following properties are consistent:

  1. group.max.session.timeout.ms in server.properties is greater than session.timeout.ms in the consumer properties.
  2. group.min.session.timeout.ms in server.properties is less than session.timeout.ms in the consumer properties.
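A consistent set of values, for example (placeholders, shown only to illustrate the ordering):

```properties
# server.properties
group.min.session.timeout.ms=6000
group.max.session.timeout.ms=30000

# consumer properties -- must lie between the two broker bounds above
session.timeout.ms=10000
```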

[1]

org.apache.kafka.common.errors.ApiException: The session timeout is not within an acceptable range.