Monday, July 29, 2013

Nested Hibernate Filters


Hibernate offers a feature that is not part of the JPA specification: filters.
When you want to load information from the database but need to filter the data, you can use HQL, @Where or filters.
HQL is a good fit when the filter applies only to a specific query. If you always want to filter the class, you can add @Where to the class. When you sometimes want to filter and sometimes not, or you need to parameterize the filtering, you use @Filter.
It feels like the filtering features were an afterthought in Hibernate: both @Where and @Filter conditions are written against the database column names rather than the entity field names. The reason is that this makes it easy to append the condition to the generated SQL where clause, without having to parse the mapped classes.
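For reference, this is roughly what a filter definition looks like. The following is only a minimal sketch with a hypothetical Invoice entity, a dateFilter and a creation_date column (these names are assumptions, not taken from a real mapping); note that the condition references the column name, not the field name:
import java.util.Date;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

import org.hibernate.annotations.Filter;
import org.hibernate.annotations.FilterDef;
import org.hibernate.annotations.ParamDef;

@Entity
@FilterDef(name = "dateFilter", parameters = @ParamDef(name = "fromDate", type = "date"))
@Filter(name = "dateFilter", condition = "creation_date >= :fromDate")
public class Invoice {

    @Id
    private Long id;

    // the filter condition above uses the column name, not the field name "creationDate"
    @Temporal(TemporalType.DATE)
    @Column(name = "creation_date")
    private Date creationDate;
}
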
Another oversight is nested filters. Hibernate does not care how many times you enable a filter; a single disable will turn it off in any case. Also, if you change the parameter of a filter that is already enabled, Hibernate will simply overwrite the existing value without complaint.

Take the following case:

Enable filter, date: 1/1/2000
Load data
Enable filter, date: 1/1/1999
Load data
Disable filter
Load data
Disable filter

In this scenario we have two problems. Once the first disable is called there is no enabled filter, so the third load will run with no filter at all. And even if the filter had not been closed, we would still have the problem that the filter date is wrong, since it is set to 1999 and not 2000.
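To make the scenario concrete, here is a sketch of the problem using the plain Session API (it assumes the hypothetical dateFilter and Invoice entity from the sketch above, plus a session factory and the two dates):
import java.util.Date;
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public class NestedFilterProblemDemo {

    public void demonstrate(SessionFactory sessionFactory, Date date2000, Date date1999) {
        Session session = sessionFactory.getCurrentSession();

        session.enableFilter("dateFilter").setParameter("fromDate", date2000);
        List<?> first = session.createQuery("from Invoice").list();   // filtered by 1/1/2000

        // re-enabling the same filter simply overwrites the parameter
        session.enableFilter("dateFilter").setParameter("fromDate", date1999);
        List<?> second = session.createQuery("from Invoice").list();  // filtered by 1/1/1999

        session.disableFilter("dateFilter");
        // problem 1: the filter is now completely off, the 1/1/2000 state was lost
        List<?> third = session.createQuery("from Invoice").list();   // no filter at all

        session.disableFilter("dateFilter");   // the second disable is simply ignored
    }
}
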
To solve this we need to implement a stack of filter states per filter name and store it in a thread local (so that our generic DAO stays thread safe).

FilterInfo Class

The first class holds the parameters of the filter and the nested count (how many times enable has been called for the same filter with the same parameters):
import java.util.HashMap;
import java.util.Map;

public class FilterInfo {
       private int nestedCount;
       private Map<String, Object> parameters = new HashMap<String, Object>();

       public FilterInfo(Map<String, Object> parameters) {
              this.parameters = parameters;
       }

       /**
        * Compares the stored parameters with the parameters of a new enable request.
        * Returns false if a parameter that exists in both maps has a different value.
        */
       public boolean areParametersTheSame(Map<String, Object> currentParameters) {
              if (parameters.size() > 0) {
                     // check the old parameters, see if any of them changed
                     for (String paramName : parameters.keySet()) {
                            for (String currentParamName : currentParameters.keySet()) {
                                   if (paramName.equals(currentParamName)) {
                                          Object oldValue = parameters.get(paramName);
                                          Object newValue = currentParameters.get(currentParamName);
                                          if (!oldValue.equals(newValue)) {
                                                 return false;
                                          }
                                   }
                            }
                     }
              }
              return true;
       }

       public int decNestedCount() {
              nestedCount--;
              return nestedCount;
       }

       public int incNestedCount() {
              nestedCount++;
              return nestedCount;
       }

       public Map<String, Object> getParameters() {
              return parameters;
       }

       public int getNestedCount() {
              return nestedCount;
       }

}

FilterStack Class

This class stores a stack of filter states per filter name, so that we can re-enable the filter with the correct parameters once a nested filter is disabled.
import java.util.Map;
import java.util.Stack;

public class FilterStack {
       private String filterName;
       private Stack<FilterInfo> filterStack = new Stack<FilterInfo>();

       public FilterStack(String filterName) {
              this.filterName = filterName;
       }

       public FilterInfo setParameters(Map<String, Object> parameters) {
              boolean areParametersTheSame = false;
              FilterInfo filterInfo = null;
              if (filterStack.size() > 0) {
                     filterInfo = filterStack.peek();
                     areParametersTheSame = filterInfo.areParametersTheSame(parameters);
              }
              if (!areParametersTheSame) {
                     // the parameters changed (or this is the first enable): push a new filter state
                     filterInfo = new FilterInfo(parameters);
                     filterStack.push(filterInfo);
              }
              filterInfo.incNestedCount();
              return filterInfo;
       }

       public FilterInfo disableFilter() {
              if (filterStack.size() == 0) {
                     throw new InfraException("you are closing a filter that is not open: " + filterName);
              }
              FilterInfo peek = filterStack.peek();
              if (peek != null) {
                     int decNestedCount = peek.decNestedCount();
                     if (decNestedCount <= 0) {
                            // this filter state is fully closed, fall back to the previous state (if any)
                            filterStack.pop();
                            return filterStack.size() > 0 ? filterStack.peek() : null;
                     }
              }
              return peek;
       }

}

NestedFilterInfo Class

The last class is stored in a thread local, and holds the information for all the open filters.
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class NestedFilterInfo {
       private Map<String, FilterStack> filtersByName = new HashMap<String, FilterStack>();

       public FilterInfo enableFilter(String filterName, Map<String, Object> parameters) {
              // get current filter information
              FilterStack filterStack = filtersByName.get(filterName);
              if (filterStack == null) {
                     filterStack = new FilterStack(filterName);
                     filtersByName.put(filterName, filterStack);
              }

              // check if filter parameters have changed.
              return filterStack.setParameters(parameters);
       }

       public FilterInfo disableFilter(String filterName) {
              FilterStack filterStack = filtersByName.get(filterName);
              if (filterStack == null) {
                     throw new InfraException("you are closing a filter that was never opened: " + filterName);
              }
              return filterStack.disableFilter();
       }
      
       public Set<String> getFilterNames() {
              return filtersByName.keySet();
       }

       public void clear() {
              filtersByName.clear();           
       }
}


GenericDao Class

The generic DAO class, which actually opens and closes the filters, uses these classes to manage all open filters. The method to enable a filter looks like this:
public void enableFilter(String filterName, Map<String, Object> mapParameters) {
       NestedFilterInfo filterStack = getFilterStack();
       FilterInfo enableFilter = filterStack.enableFilter(filterName, mapParameters);
       sessionEnableFilter(filterName, enableFilter.getParameters());
}

private void sessionEnableFilter(String filterName, Map<String, Object> mapParameters) {
       Filter filter = session().enableFilter(filterName);
       Object value;
       if (mapParameters != null) {
              for (Map.Entry<String, Object> entryParameters : mapParameters.entrySet()) {
                     value = entryParameters.getValue();
                     filter.setParameter(entryParameters.getKey(), value);
              }
       }
}
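
The getFilterStack method is not shown here; a minimal sketch of how it could be implemented, assuming a ThreadLocal that holds one NestedFilterInfo per thread (which is what keeps the DAO thread safe):
// inside the generic DAO class (a sketch, not the original implementation)
private static final ThreadLocal<NestedFilterInfo> NESTED_FILTERS =
        new ThreadLocal<NestedFilterInfo>() {
            @Override
            protected NestedFilterInfo initialValue() {
                return new NestedFilterInfo();
            }
        };

private NestedFilterInfo getFilterStack() {
    return NESTED_FILTERS.get();
}

// clear the per-thread state when the unit of work ends (for example after the transaction),
// so that a pooled thread does not start its next request with stale filters
public void clearFilters() {
    getFilterStack().clear();
}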

Enabling the filter is the easy part. For every enable call where the parameters have not changed, we increment the counter on the FilterInfo class. If the parameters have changed, we create a new FilterInfo and push it onto the stack in the FilterStack.
The magic happens in the disable filter:
public void disableFilter(String filterName) {
       NestedFilterInfo filterStack = getFilterStack();
       FilterInfo filterInfo = filterStack.disableFilter(filterName);
       if (filterInfo == null || filterInfo.getNestedCount() <= 0) {
              session().disableFilter(filterName);
       } else {
              sessionEnableFilter(filterName, filterInfo.getParameters());
       }
}

Here we call disable and get back a FilterInfo. If the nested count is down to zero, then we have unwound the stack and disable has been called as many times as enable, so we close the filter on the session. If not, then either the nested count just went down by one, or we got back the previous filter state that was still open. In either case we update the session filter with those parameters, and in this way the filter is reverted to the previous one.
This solution supports both nested filters with the same parameters (only after the last disable will the filter really be closed) and nested filters with different parameters (the session is updated with the new parameters, and on disable the previous parameters are restored to the session).
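Going back to the scenario from the beginning, here is a usage sketch of the DAO methods above (the dateFilter name, the dao instance and the two dates are the same assumptions used earlier):
// dao is the generic DAO from above; date2000 and date1999 are the two filter dates (assumptions)
public void loadWithNestedFilters(GenericDao dao, Date date2000, Date date1999) {
    Map<String, Object> params2000 = new HashMap<String, Object>();
    params2000.put("fromDate", date2000);
    dao.enableFilter("dateFilter", params2000);
    // load data - filtered by 1/1/2000

    Map<String, Object> params1999 = new HashMap<String, Object>();
    params1999.put("fromDate", date1999);
    dao.enableFilter("dateFilter", params1999);     // a new FilterInfo is pushed on the stack
    // load data - filtered by 1/1/1999

    dao.disableFilter("dateFilter");                // pops 1999, re-enables the filter with 1/1/2000
    // load data - filtered by 1/1/2000 again

    dao.disableFilter("dateFilter");                // stack is empty, the session filter is really disabled
    // load data - no filter
}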








Tuesday, July 16, 2013


Merge Update Save and Filters


In SQL there are three major types of actions: insert, update and delete.
Hibernate has several methods that build on these three actions. It is important to understand the difference between them and when to use each:
- delete()
- persist()
- save()
- update()
- saveOrUpdate()
- merge()

Delete

The easiest is delete. It deletes the entity from the database. The object passed to the delete method can be either transient or persistent; since Hibernate uses the id for the delete, it does not make a difference.
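A minimal sketch (Person is a hypothetical entity and personId an existing id; only the id is used to build the delete statement):
// a sketch assuming a hypothetical Person entity with an id property
public void deleteDetached(Session session, Long personId) {
    Person detached = new Person();
    detached.setId(personId);   // only the id matters for the delete
    session.delete(detached);   // same effect as deleting a loaded (persistent) instance
}
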

Persist, Save

Both of these will generate an insert into the database. save() returns the id that was generated, whereas persist() returns void. persist() should be used unless you need the id immediately: save() may run the insert right away in order to return the identifier, while persist() is free to delay the insert until the latest possible moment (usually the flush).
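A small sketch of the difference, assuming a hypothetical Person entity with a generated id:
// save versus persist (Person and its constructor are assumptions)
public void saveVersusPersist(Session session) {
    Person p1 = new Person("Alice");
    Serializable generatedId = session.save(p1);   // save returns the generated id (java.io.Serializable)

    Person p2 = new Person("Bob");
    session.persist(p2);                           // returns void; the insert can be delayed until flush/commit
}
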

Update

Update is the tricky part of Hibernate and has the most options. The Hibernate documentation says the following about persistent objects:
Transactional persistent instances (i.e. objects loaded, saved, created or queried by the Session) can be manipulated by the application, and any changes to persistent state will be persisted when the Session is flushed.
This means that as long as the transaction and session are open, any change to a persistent object will be written to the database when the session is flushed (usually on commit, but flush can also be called explicitly).
In the case of detached objects, hibernate has the methods of update and merge.
Update is the simple one: it persists a detached object to the database. If you do not know whether you need an insert or an update, you can use the saveOrUpdate method. It checks the id of the object; if there is no id an insert is generated, otherwise an update is generated.
If you call update on a detached object while another instance with the same id is already associated with the session, Hibernate will throw an exception. This is because Hibernate wants to let you know that two different sets of changes exist for the same object (one in the session and one detached), and the way it lets you know is with a NonUniqueObjectException.
A simple way to solve this, if you want to ignore the previous changes and persist the latest ones, is to call evict on the session to remove the previous object. Then the update will succeed.
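A sketch of the exception scenario and the evict workaround, assuming a hypothetical Person entity and a modified detached copy of a row that was also loaded into the session:
// Person, personId and the modified detached instance are assumptions
public void updateDetachedCopy(Session session, Long personId, Person detached) {
    Person attached = (Person) session.get(Person.class, personId);   // now associated with the session

    // session.update(detached) would throw NonUniqueObjectException at this point,
    // because an instance with the same id is already in the session
    session.evict(attached);     // remove the previously loaded instance from the session
    session.update(detached);    // now the detached changes win and will be persisted
}
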

Merge

Merge knows how to overcome the problem of update, though as we will see merge is quite sophisticated and needs to be used with caution. The Hibernate documentation describes it as follows:
- if there is a persistent instance with the same identifier currently associated with the session, copy the state of the given object onto the persistent instance
- if there is no persistent instance currently associated with the session, try to load it from the database, or create a new persistent instance
- the persistent instance is returned
- the given instance does not become associated with the session, it remains detached

Things you must take into consideration:

Returned Object

Take care to keep the result of the method, since the object passed as the parameter remains detached. Hibernate copies your changes onto the managed instance, but it will not attach your object to the session since another one already exists; the managed instance is the one returned to you.
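A sketch of this pitfall, again with a hypothetical Person entity:
// only the instance returned by merge is tracked by the session
public void mergeDetached(Session session, Person detached) {
    Person managed = (Person) session.merge(detached);

    managed.setName("tracked change");    // 'managed' is in the session, this change is persisted on flush
    detached.setName("lost change");      // 'detached' stays detached, this change is NOT persisted
}
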

Select issues

The next issue is that if your object is not in the session, Hibernate will generate a select statement to fetch it. This can be a performance hit you did not plan for if all you wanted was an update. It is especially true when you have associations marked as eager, or in certain cases of *ToOne mappings (Hibernate will load them even with the lazy annotation if the foreign key does not reference the id/primary key); in those cases Hibernate will load the whole graph.
Another problem with the select generated by merge is filters. If you use filters to load your data, but the filters are not enabled when you call merge, Hibernate will generate the select for the merge without the filters. This will cause an inconsistency in the data.

Another problem with filters is the root object. Hibernate filters are applied only on connections (joins) to other objects. So suppose you have an object (A) that points to an object (B) that holds a collection (of C's). Loading A where B is lazy will cause Hibernate not to apply the filter on B, only on C. The simple solution is to add a join in the HQL, and then the filter will be applied (see the sketch below). But in the case of merge the original HQL is not run; a standard load is executed (with eager fetching where needed) and the filter will not affect the object B.
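A sketch of the HQL workaround, assuming hypothetical entities A and B with a filter defined on B (as noted above, merge bypasses this HQL and runs its own load):
// the explicit join in the HQL is what makes Hibernate apply the filter to B
public List<?> loadWithFilterOnB(Session session) {
    return session
            .createQuery("select a from A a join fetch a.b")
            .list();
}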