I keep reading discussions about the performance of Seam applications. These discussions generally center on the overhead of the interception techniques Seam uses. While this is a valid concern in certain scenarios (see the excellent forum discussion started by Tobias Hill), many are too quick to blame Seam for their performance issues. If a page takes many seconds, or even minutes, to load, your application is in most cases more likely to blame than Seam.

In my experience, most performance issues stem from data access. Improperly tuned queries (a common culprit) and not using the second-level cache of your ORM provider when appropriate can lead to some serious performance implications in your application. While second-level caching is nothing new, here I will describe why it is important to a Seam application and how you can improve performance using Hibernate’s second-level cache provider.

Before I go any further, note that second-level caching is not the only caching solution you have available if you are using Seam. Seam provides a multi-layer caching solution that allows you to cache page fragments and objects easily while abstracting away the details. You can read all about Seam’s multi-layer caching solution in Chapter 34 of Seam Framework: Experience the Evolution of Java EE.
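
To give you a quick taste, caching an arbitrary object with Seam's CacheProvider component looks roughly like the following. This is only a sketch: the newsProvider component and the fetchHeadlinesFromRemoteService() helper are made up for illustration, and the region and key names are entirely up to you.

@Name("newsProvider")
public class NewsProvider
{
  // Seam injects the configured cache provider (Ehcache, JBoss Cache, ...)
  @In CacheProvider cacheProvider;

  public String getHeadlines()
  {
    // check the "news" cache region before doing the expensive work
    String headlines = (String) cacheProvider.get("news", "headlines");
    if (headlines == null)
    {
      headlines = fetchHeadlinesFromRemoteService(); // slow call we want to avoid
      cacheProvider.put("news", "headlines", headlines);
    }
    return headlines;
  }
  // ... ...
}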

Loading Reference Data

Seam provides an elegant solution to the common problem of associating entities based on a dropdown selection. Take the familiar Seam booking example: we are booking a Hotel and need to enter credit card information. The type of credit card will likely be a dropdown, but that selection needs to be associated with a CreditCardType entity.

@Entity
public class CreditCardType implements Serializable
{
  @Id
  private Long providerId;
  private String description;
  // ... ...
}

Our Booking class then needs a reference to the CreditCardType class.

@Entity
public class Booking implements Serializable
{
  @Id
  private Long id;
  // ... ...
  @ManyToOne
  private CreditCardType creditCardType;
  // ... ...
}

To make this task simple, Seam provides the <s:convertEntity /> tag (backed by its built-in entityConverter component), which ensures that the selected value is converted back into an entity instance for association with your object.

<h:selectOneMenu id="creditCardType" value="#{booking.creditCardType}"
    required="true">
  <s:selectItems noSelectionLabel="" var="type"
    value="#{creditCardTypes}"
    label="#{type.description}" />
  <s:convertEntity />
</h:selectOneMenu>

As you can see, this is quite simple, but we need to load the creditCardTypes into the conversation context in order to associate the selected instance with our entity. This is because the creditCardTypes need to be managed instances in the conversation-scoped persistence context. We can accomplish this with a @Factory method on a conversation-scoped component.

@Name("bookingAction")
@Scope(CONVERSATION)
public class BookingAction implements Serializable {
  // ... ...
  @In private EntityManager entityManager;

  @Factory("creditCardTypes")
  public List<CreditCardType> loadCreditCardTypes()
  {
    return entityManager.createQuery("select c from " +
      "CreditCardType as c order by c.description").getResultList();
  }
  // ... ...
}

Great, so now we can load our entities into the context and associate them using a dropdown. So what’s the catch? The factory method only executes once, right? Only once per conversation: the query that loads the CreditCardType instances into the conversation context executes every time a new conversation requests the dropdown list, and that repeated database hit can make the initial page load lag.

This may not be a problem in this simple case as we only have this one dropdown, but what if we have many dropdowns on the screen? Even further, what if this dropdown list is used by several conversations? Doesn’t it seem wasteful to hit the database every time we need it? We can avoid the database hit and still achieve the same benefits by using second-level caching.

Second-level caching with Hibernate

Second-level caching is intended for data that is read-mostly. It allows you to store entity and query data in memory so that it can be retrieved without the overhead of going back to the database. You can configure the cache expiration policy, which determines when the cached data is refreshed (every hour, every day, and so on), according to the requirements for each entity. An entity like CreditCardType is certainly read-mostly, so it is a good candidate for the second-level cache.

With Hibernate, caching an entity is as simple as adding the @Cache annotation.

@Entity
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
public class CreditCardType implements Serializable {
  // ... ...
}

We then need to include the jars necessary for a second-level cache provider. I tend to use Ehcache as I find it simple to use and it is fully supported by Seam’s multi-layered caching solution.

Once you include the appropriate jars, you must configure Hibernate to use second-level caching. In your persistence.xml file, add the following properties for your persistence-unit definition.

<persistence-unit name="myBookingDS">
  ... ...
  <properties>
    <property name="hibernate.cache.provider_class"
      value="org.hibernate.cache.EhCacheProvider" />
    <property name="hibernate.cache.use_second_level_cache"
      value="true" />
    <property name="hibernate.cache.use_query_cache"
      value="true" />
    ... ...
  </properties>
</persistence-unit>

The hibernate.cache.provider_class should be specific to the cache provider you are using. Hibernate supports a number of implementations as described in the reference documentation.
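
With Ehcache, the expiration policy mentioned earlier is configured per cache region in an ehcache.xml file on the classpath. The snippet below is only an illustration: it assumes the CreditCardType entity lives in a com.example.booking package (by default, an entity's region name is its fully qualified class name), and the sizes and time-to-live values should be tuned to your own data.

<ehcache>
  <defaultCache maxElementsInMemory="10000" eternal="false"
    timeToLiveSeconds="120" overflowToDisk="false" />

  <!-- region for our read-mostly reference data -->
  <cache name="com.example.booking.CreditCardType"
    maxElementsInMemory="100" eternal="false"
    timeToLiveSeconds="3600" overflowToDisk="false" />

  <!-- region used by Hibernate's query cache -->
  <cache name="org.hibernate.cache.StandardQueryCache"
    maxElementsInMemory="100" eternal="false"
    timeToLiveSeconds="3600" overflowToDisk="false" />
</ehcache>

Note that when the query cache is enabled, Hibernate also uses the org.hibernate.cache.UpdateTimestampsCache region to track table modifications, so you may want to define that region as well and give it a long (or eternal) lifetime.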

Notice that we also set hibernate.cache.use_query_cache to true. This allows us to take the caching a step further by caching the query itself and not just the entities. In order to cache the query, we can take two approaches: use the Hibernate Session API or the Hibernate @NamedQuery annotation. Let’s look at the Hibernate Session API approach first. Our factory method above changes to the following:

@Name("bookingAction")
@Scope(CONVERSATION)
public class BookingAction implements Serializable {
  // ... ...
  @In private EntityManager entityManager;

  @Factory("creditCardTypes")
  public List<CreditCardType> loadCreditCardTypes()
  {
    // unwrap the Hibernate Session from the JPA EntityManager
    Session session = (Session) entityManager.getDelegate();

    Query query = session.createQuery("select c from " +
      "CreditCardType as c order by c.description");
    // mark the query results as cacheable in the query cache
    query.setCacheable(true);

    return query.list();
  }
  // ... ...
}

Now you will notice in the logs that once the creditCardTypes have been loaded, even a new conversation does not cause a database call the next time these entities are requested. The query results and the entities are served directly from the in-memory second-level cache.
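
If you want to see this for yourself, the following standard Hibernate properties (added to the persistence-unit shown earlier) make cache hits and misses easy to spot:

<property name="hibernate.show_sql" value="true" />
<property name="hibernate.generate_statistics" value="true" />

With statistics enabled you can inspect second-level cache and query cache hit counts through the Statistics API of the SessionFactory, and with show_sql you can simply watch for the select to disappear when a new conversation requests the list.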

The other approach is to use the Hibernate @NamedQuery annotation, which gives you the option to mark your query as cacheable.

@Entity
@NamedQuery(name="getCreditCardTypes",
  query="select c from CreditCardType as c " +
      "order by c.description",
  cacheable=true)
public class CreditCardType implements Serializable
{
  @Id
  private Long providerId;
  private String description;
  // ... ...
}

The named query can then be executed through the createNamedQuery() method of the EntityManager API.
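
A sketch of the factory method using the named query defined above (the rest of the component is unchanged):

  @Factory("creditCardTypes")
  public List<CreditCardType> loadCreditCardTypes()
  {
    return entityManager
      .createNamedQuery("getCreditCardTypes")
      .getResultList();
  }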

While we are only showing one scenario here, there are many cases where second-level caching can be applied in your application.

No silver bullet

By no means am I claiming here that second-level caching is the solution for every scenario. Performance tuning is somewhat of an art. It is definitely handy to know the various potential hot spots when tuning an application, but a solution that works in one case may not work in others. Simply read up on the various approaches and techniques to tune your application so that you can apply each technique when the time is right.