
JCache, why and how?

By Matti Tahvonen · Jul 21, 2015

Caching is needed in many different kinds of solutions to optimize the application in various ways. Discover what JCache is and why, where, and how you could use it.

Caching libraries are almost as common in the Java ecosystem as mosquitoes are in Northern Europe. Ehcache, Hazelcast, Infinispan, GridGain, Apache Ignite, JCS… Caching is needed in many different kinds of solutions to optimize the application in various ways. The simplest libraries are just in-memory object maps with simple eviction rules, while the most advanced caching libraries have efficient cross-JVM features and configurable options to write the cache state to disk, making them cluster-ready persistence options in their own right and a good basis for memory-heavy computation and big data style programming.

Most caching solutions are based on map-like data structures, and the JCache API (JSR 107) tries to standardize the most common use cases. If you have advanced needs, you will probably have to use some implementation-specific features, just like with JPA, but the standard will definitely make it easier to swap between caching libraries in the future. It also makes it easier for developers to move from one project to another, even when those projects use different caching libraries.
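
To illustrate the portability promise, here is a minimal sketch of bootstrapping and using a cache purely through the standard API. The Hazelcast provider class name in the comment is only an example of how a specific implementation could be requested; normally the zero-argument lookup simply picks up whichever JSR 107 implementation is on the classpath.

import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;

public class ProviderSwapSketch {

    public static void main(String[] args) {
        // Picks up whichever JSR 107 implementation happens to be on the
        // classpath; no vendor classes are referenced in application code.
        CacheManager cacheManager = Caching.getCachingProvider().getCacheManager();

        Cache<String, String> cache = cacheManager.createCache("demo",
                new MutableConfiguration<String, String>()
                        .setTypes(String.class, String.class));

        cache.put("greeting", "Hello JCache");
        System.out.println(cache.get("greeting"));

        // With several providers present, a specific one can be requested by
        // its fully qualified class name, for example:
        // Caching.getCachingProvider("com.hazelcast.cache.HazelcastCachingProvider")
    }
}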

“Simple” usage

Although the JCache API doesn’t let you adjust every implementation-specific feature, the standard API is pretty versatile and should cover the most common use cases. The following code snippet gives an overview of creating and configuring a cache and using it to cache an expensive service method call.

private void listEntriesJavaSEStyle(String filter) {
    final String name = "myCache";

    Cache<String, List> cache = Caching.getCache(name, String.class,
            List.class);
    if (cache == null) {
        final CachingProvider cachingProvider = Caching.getCachingProvider();
        final CacheManager mgr = cachingProvider.getCacheManager();
        MutableConfiguration<String, List> config = new MutableConfiguration<>();
        config.setTypes(String.class, List.class);
        config.setStoreByValue(true);
        config.setExpiryPolicyFactory(AccessedExpiryPolicy.factoryOf(
                Duration.ONE_MINUTE));
        Notification.show("Creating cache",
                Notification.Type.WARNING_MESSAGE);
        cache = mgr.createCache(name, config);
    }

    // first look up from cache, if not found, go to service and cache value
    List cached = cache.get(filter);
    if (cached != null) {
        Notification.show("Cache hit!");
        entryList.setBeans(cached);
    } else {
        Notification.show("Cache missed :-(");
        List<PhoneBookEntry> entries = service.getEntries(filter);
        cache.put(filter, entries);
        entryList.setBeans(entries);
    }
}

The code snippet uses the filter string as the cache key and stores the result list from the backend in the cache if it is not already there. In serious usage, you’d naturally move the cache configuration away from your application logic and save the cache reference in a local field.

The cache configuration is the most probable place where you’ll still need to resort to implementation-specific code or other configuration options. The standard API has, for example, quite limited eviction rule configuration possibilities, supporting only simple expiration-time-based rules.
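
When the standard options run out, the spec’s unwrap method offers an escape hatch to the underlying implementation. Below is a rough sketch of the pattern; the VendorCache type is purely hypothetical and stands in for whatever native class your implementation actually exposes.

import javax.cache.Cache;

public class UnwrapSketch {

    // Purely hypothetical vendor interface, standing in for the native
    // cache type of whatever implementation you happen to use.
    interface VendorCache {
        void setMaxEntries(int maxEntries);
    }

    static void tuneEviction(Cache<String, ?> cache) {
        // unwrap() is part of the JCache spec; it throws an
        // IllegalArgumentException if the implementation cannot provide
        // the requested type.
        VendorCache vendorCache = cache.unwrap(VendorCache.class);
        vendorCache.setMaxEntries(10_000);
    }
}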

The API of the actual Cache object is straightforward for developers with experience of java.util.Map or proprietary caching APIs. I’d expect it to cover most caching use cases as such, and the need to escape to implementation-specific APIs during actual usage should be rare.
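
For illustration, the core operations look roughly like this (a sketch against the standard javax.cache.Cache interface, assuming a cache with String keys and values has already been created):

import javax.cache.Cache;

public class BasicCacheUsageSketch {

    static void demo(Cache<String, String> cache) {
        // Familiar map-style operations...
        cache.put("fi", "Finland");
        String value = cache.get("fi");
        boolean removed = cache.remove("fi");

        // ...plus atomic convenience operations in the spirit of
        // ConcurrentMap.
        cache.putIfAbsent("se", "Sweden");
        String previous = cache.getAndPut("se", "Sverige");
        boolean replaced = cache.replace("se", "Sverige", "Sweden");
    }
}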

The JCache API also supports cache mutation events and entry processors. When using those, you should, however, note that JCache doesn’t specify that they run in the same JVM process as your own code. Using them for things that don’t relate directly to the cache may therefore be trickier than you think, especially in a clustered environment. For example, modifying your Vaadin UI(s) from a CacheEntryListener might be unstable with certain implementations.
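
To make the entry processor part concrete, here is a minimal sketch of an atomic in-cache update. This is only an illustration: some implementations may additionally require the processor to be Serializable so it can be shipped to the node that owns the entry.

import javax.cache.Cache;

public class EntryProcessorSketch {

    // Atomically appends a suffix to a cached value. The processor runs
    // wherever the implementation decides to execute it, which in a
    // clustered setup is not necessarily the calling JVM.
    static void appendSuffix(Cache<String, String> cache, String key, String suffix) {
        cache.invoke(key, (entry, arguments) -> {
            if (entry.exists()) {
                entry.setValue(entry.getValue() + arguments[0]);
            }
            return null;
        }, suffix);
    }
}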

Managed bean goodies

The javax.cache.annotation package contains a lot of handy-looking annotations. With them, you can implement common caching logic for managed beans (EJB, CDI, Spring…) just by adding a single annotation to a method. For example, the caching of the possibly expensive backend call above can be implemented like this:

@CacheResult
public List<PhoneBookEntry> getEntries(String filter) {
    // The very same business logic as in the version without caching
}

The CacheResult annotation is all that is needed: the container handles everything, including setting up the cache, checking it for an existing value, and storing the result if no cached value was found. The cache name can also be defined explicitly in the annotation, or at class level using the CacheDefaults annotation, as sketched below. I’d say that is really simple caching, and that’s why I used quotation marks in the previous sub-title.
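
For example, a hypothetical service with a class-level cache name could look roughly like this (a sketch using the standard javax.cache.annotation types; the class name, the cache name and the helper method are made up for illustration, and PhoneBookEntry refers to the same bean as in the earlier snippet):

import java.util.Collections;
import java.util.List;

import javax.cache.annotation.CacheDefaults;
import javax.cache.annotation.CacheRemoveAll;
import javax.cache.annotation.CacheResult;

@CacheDefaults(cacheName = "phoneBookCache")
public class CachedPhoneBookService {

    @CacheResult
    public List<PhoneBookEntry> getEntries(String filter) {
        // Expensive backend call; results are cached per filter value,
        // because the method parameters form the cache key by default.
        return expensiveBackendLookup(filter);
    }

    @CacheRemoveAll
    public void refresh() {
        // Clears the whole cache, so the next lookup hits the backend again.
    }

    private List<PhoneBookEntry> expensiveBackendLookup(String filter) {
        // Placeholder for the real data access code.
        return Collections.emptyList();
    }
}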

Super handy, but the downside is that the annotations are not that well supported yet. Those who are using Spring 4.1 or newer can enjoy these goodies today. Also, the latest Payara (a commercially supported build of GlassFish) has built-in support for JCache, including the annotations, using Hazelcast as the implementation. The good folks at Tomitribe have also put together a nice CDI extension library that allows you to use these annotations in any CDI managed bean today.

Summary

If you are already using a specific caching library in your project, I don’t see that much value in eagerly switching to the JCache API today. Also, if your cache library is used through another library, for example as a second-level cache for your JPA implementation, JCache doesn’t bring any advantage to you. But you should definitely start using it in new projects with basic caching requirements, and especially in projects where you are planning to try different libraries. All major caching libraries already support JSR 107.

All in all, I don’t think it has really been an issue that the JCache API took ages to standardize. Portability of applications has been one of the main goals, but it hasn’t been that big of an issue in this area: the usage of various caching libraries has always been rather similar (a java.util.Map-like API), and it has never been a huge task to swap between them. The largest differences have been in the way caches are configured. Cache setup can now be defined with a common API as well, but I’d guess this is an area where many users will still have to fall back to implementation-specific features. In cache configuration, with feature-packed implementations, a common standard just cannot cover everything.

In the future, the actual cache usage will mostly go through the exact same API, independently of the library in use. This will increase healthy competition among caching solutions, but the largest benefit of JSR 107 will be the added productivity of Java developers, who can use the very same caching API in each and every project they work on.

The declarative, annotation-based “caching hints” for managed beans are a feature I expect to become really popular. You can find example usages of both the declarative approach and the raw Java API in my example project, which uses a CDI-based service class with a Vaadin UI. As the implementation, I used Hazelcast, the first library to publish support for the final JCache API, together with the handy JCache-CDI library by Tomitribe.

Check out the example of JCache with Vaadin on GitHub

Matti Tahvonen has a long history in Vaadin R&D: developing the core framework from the dark ages of pure JS client side to the GWT era and creating a number of official and unofficial Vaadin add-ons. His current responsibility is to keep you up to date with the latest and greatest Vaadin-related technologies. You can follow him on Twitter – @MattiTahvonen