
Monday, March 7, 2011

Caching isn't perfect!

Caching in applications is a double-edged sword. If it is not used properly, it can have unintended side effects.
In my current project, we were using Ehcache as the cache manager in Hibernate. Since the database interactions were minimal, we had not paid much attention to how good or effective the caching was. At the time of UAT, the team noticed that the application's caching strategy was not effective, so we started looking into it. As one would expect, we have certain values that will never change and a few that can change frequently. Since the application defined the same strategy for all the entities, it was not effective, which is understandable.

The JEE application space is crowded; there are many pluggable components that can be used or wired together in many different ways. As the number of components increases, the permutations and combinations grow exponentially. In other words, nowadays there is no universal solution: the right solution depends on the components used, their versions, the application server they run on, and so on.

Background:
We use Hibernate as the O/R mapping layer along with the Spring framework. The Hibernate components are deployed as a .har file in the JBoss application server. To access Hibernate sessions from the application layer, the app uses org.springframework.orm.hibernate3.HibernateTemplate, a convenience class provided by Spring, with Ehcache as the Hibernate cache provider.

Even though the above configurations were set according to the Ehcache/Hibernate documentation (http://ehcache.org/documentation/hibernate.html), the application was not caching any of the entities defined in ehcache.xml.



ehcache.xml
        <cache name="com.sample.data.Person"
           maxElementsInMemory="10"
           eternal="false"
           overflowToDisk="true"
           timeToIdleSeconds="300"
           timeToLiveSeconds="600"
           memoryStoreEvictionPolicy="LFU" />
Person.hbm.xml
……..
<cache usage="read-write" />
……..
……..


When Hibernate's show_sql option was turned on, it was clear that the app went to the DB every single time it executed the query.

Then, after doing a little research, I enabled second-level caching and query caching in Hibernate. Since the app was using a .har file, these changes had to be made in jboss-service.xml.

jboss-service.xml
    <attribute name="QueryCacheEnabled">true</attribute>
    <attribute name="SecondLevelCacheEnabled">true</attribute>
I was quite confident that caching would work fine now, but it turned out to be false confidence.
Now what? All the combinations of property values were tested, and finally I started following the call path using server-side debugging.

Finally, I arrived at the Eureka moment!
It seems the HibernateDaoSupport class does not pick up some of the properties set through the config files. The test results showed that we need to manually set cacheQueries to true on the HibernateTemplate (getHibernateTemplate().setCacheQueries(true)).
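If the template is wired up through Spring, the same flag can also be set declaratively instead of in DAO code. This is a sketch; it assumes a sessionFactory bean is already defined elsewhere in the context:

```xml
<!-- Sketch: cacheQueries corresponds to HibernateTemplate.setCacheQueries(true).
     The sessionFactory bean is assumed to exist elsewhere in the context. -->
<bean id="hibernateTemplate"
      class="org.springframework.orm.hibernate3.HibernateTemplate">
    <property name="sessionFactory" ref="sessionFactory" />
    <property name="cacheQueries" value="true" />
</bean>
```

Either way, the flag must be set on every template instance that is expected to use the query cache.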

Now, from the logs, it is very clear when the app is hitting the cache and when it is going to the DB.

2011-01-17 12:32:23,752 DEBUG [net.sf.ehcache.Cache] org.hibernate.cache.UpdateTimestampsCache cache - Miss
2011-01-17 12:32:23,752 DEBUG [net.sf.ehcache.hibernate.EhCache] Element for key PERSON is null
2011-01-17 12:32:23,752 DEBUG [org.hibernate.cache.StandardQueryCache] returning cached query results

A few important things to watch:

1) It is very important to understand the defaultCache element in ehcache.xml.
If a separate cache entry is not defined for an entity in ehcache.xml, Ehcache will use the defaultCache settings.

              <defaultCache maxElementsInMemory="1000"
                  eternal="false"
                  timeToIdleSeconds="30"
                  timeToLiveSeconds="30"
                  overflowToDisk="true"
                  diskPersistent="false"
                  diskExpiryThreadIntervalSeconds="120" />

2) Another thing: be judicious about the "eternal" attribute. If it is set to "true", the cached elements never expire and the timeToLive/timeToIdle settings are ignored.
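That said, an eternal cache is exactly what you want for reference data that never changes while the application is up. A sketch (the entity name com.sample.data.Country is illustrative, not from our actual model):

```xml
<!-- Illustrative entry: reference data that never changes while the app is up.
     With eternal="true", timeToIdleSeconds/timeToLiveSeconds would be ignored. -->
<cache name="com.sample.data.Country"
       maxElementsInMemory="500"
       eternal="true"
       overflowToDisk="false"
       memoryStoreEvictionPolicy="LFU" />
```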

Finally, I should acknowledge that the following reference helped me a lot in arriving at a working solution. Thanks to everyone sharing their knowledge.
 
http://docs.jboss.org/jbossas/javadoc/4.0.5/hibernate-int/org/jboss/hibernate/jmx/Hibernate.html 


 

Tuesday, December 28, 2010

Running loop requests using SoapUI

One of my friends told me a story about a guy who earns more than $300 per hour with his expertise in Microsoft Excel. It was a little surprising to me, but when you think about it, delivering something quick and meaningful with what we already have is more important than saying, "If we had this and that, we could have provided such and such." This week we had a quick requirement to get data from a secured web service. It was very urgent for the business unit, to resolve a reporting discrepancy with the client. Basically, we needed to hit a web service and take a value from the response, 1000 times: something like a data-driven load test.

Somehow SoapUI came to mind, and I started experimenting with it and found that SoapUI Pro has features that would be beneficial to us. Here is the experiment.

Needs:
1. We should be able to parameterize a value in the request (say, customer ID) by reading it from an external file.
2. We should be able to read one field value from the response (say, account balance) and write it to a file against the input value.
3. We should be able to issue the request in a loop (as many times as there are customer IDs in the external file).
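Expressed as plain Java, the whole flow we are after is just this (a conceptual sketch, not SoapUI code; fetchAccountBalance is a hypothetical stand-in for the secured web-service call):

```java
import java.util.ArrayList;
import java.util.List;

public class DataDrivenLoopSketch {
    // Hypothetical stand-in for the secured web-service call; in SoapUI
    // this is the SOAP request step.
    static String fetchAccountBalance(String customerId) {
        return "balance-of-" + customerId;
    }

    // What the DataSource -> request -> PropertyTransfer -> DataSink loop
    // does, expressed as plain code: one output row per input customer ID.
    static List<String> runLoop(List<String> customerIds) {
        List<String> sinkRows = new ArrayList<>();
        for (String id : customerIds) {               // DataSource Loop
            String balance = fetchAccountBalance(id); // request/response
            sinkRows.add(id + "," + balance);         // DataSink row
        }
        return sinkRows;
    }
}
```

SoapUI Pro gives us this same loop declaratively: the DataSource supplies the IDs, the request step plays the role of fetchAccountBalance, and the DataSink collects the rows.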


Solution:

Step 1:
Create a new project, giving the WSDL location. Identify the request, right-click, and choose "Add to TestCase". It will automatically create a test suite for you.

Fig: 1  

Now the project will look like Fig 1.

Step 2:

Now we need to think about our needs. We should be able to read a value from a file. For that, SoapUI has a concept called a DataSource, and we can add another test step for it: right-click on Test Steps, click Add Step, and choose DataSource.

Fig : 2

Now you can see a new step is added to the test case:

Fig : 3
But the DataSource step should happen before the stockPurchaseRequest step, so move it one step up.
Now double-click on the DataSource, and the editor opens in the right panel.
Fig : 4   
Now create a new property (say, customerId) and then choose the appropriate DataSource (see Fig. 4).

Step 3: 
Now we have to hook this property to the request. To do that, double-click the request; the editor opens in the right panel. Then click on the Form tab, choose the request parameter you want to populate from the DataSource, click "Get Data", and choose the DataSource property (see Fig. 5).

Fig : 5

Step 4:

Now we need to add a DataSink, which is the file where we store the values from the response. It is similar to the DataSource creation: add a new property (say, PaymentType), then choose the DataSink file location and type.


Fig : 6

Now we need to transfer the response values to the DataSink. For that, execute the request and get the response, change the response editor to "Outline", right-click on the field you want to export, click "Transfer To", and point it to the DataSink property we created in the previous step.

Fig : 7
It will automatically create a PropertyTransfer step for you.

Fig : 8
Step 5:

Now, finally, we need to run these steps for each value in the DataSource. Add and configure a DataSource Loop step in the Test Steps.


Now we are done with the setup. Add some values to the DataSource and run the test case. It will run as many times as there are data rows in the DataSource, and it will create a new DataSink file at the configured location.

One important thing to note is that we can transfer any number of parameters from the DataSource to the request, and from the response to the DataSink as well.

Monday, November 22, 2010

Jibx issue with primitives when made "Optional"

Last week I was struggling with an issue in JiBX. It was a very simple change as a code fix, yet it took me one whole day. We had to make certain fields optional in a request object. This object is converted into XML using JiBX and then submitted to an external web service. We made one of the boolean values optional. Since we default the value in the object to false, this change should not have affected anything big. Because of this confidence, we started testing with the client, but it errored out saying a required field was missing. OK, we made the field optional, but since a primitive boolean always carries a value (either true or false), we expected it to always be present in the submitted request.
JiBX mapping:
<mapping class="com.sample.model.Account"
             name="client-account"
             flexible="true"
             ordered="false">
        <value name="test-account"
               style="element"
               get-method="isTestAccount"
               set-method="setTestAccount"
               usage="optional"/>
        <value name="status"
               style="element"
               get-method="getStatus"
               set-method="setStatus"
               usage="optional"  />
        <value name="payment-type"
               style="element"
               get-method="getPaymentType"
               set-method="setPaymentType"
               usage="optional"  />
        <value name="notes"
               style="element"
               get-method="getNotes"
               set-method="setNotes"
               usage="optional"  />
    </mapping>

Marshalling Result :

test-account  = false
<?xml version="1.0" encoding="UTF-8"?><client-account><status>Good</status><payment-type>Pre-Pay</payment-type><notes>unlimited</notes></client-account>

test-account  = true
<?xml version="1.0" encoding="UTF-8"?><client-account><test-account>true</test-account><status>Good</status><payment-type>Pre-Pay</payment-type><notes>unlimited</notes></client-account>

Here it is very clear that JiBX does not emit the element when the value is false. It took hours to arrive at this conclusion due to the complexity of the testing procedure.

Finally, we found the answer from Dennis Sosnoski (here):
When you set optional='true' on a primitive value, JiBX checks if the value is equal to the default and if so does not include the value when marshalling. If you'd like to have effectively a three-state representation - yes/no/not present - you can do so by using a Boolean value in place of the boolean. Alternatively, you can use a test-method to check if the value is present when marshalling.
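Following that advice, either of the sketches below would fix our mapping. Both the hasTestAccount test method and the Boolean-returning getTestAccount getter are assumed changes, not code from our actual model:

```xml
<!-- Option 1: keep the primitive, but tell JiBX explicitly when the value
     is present (hasTestAccount is a hypothetical boolean method). -->
<value name="test-account"
       style="element"
       get-method="isTestAccount"
       set-method="setTestAccount"
       test-method="hasTestAccount"
       usage="optional"/>

<!-- Option 2: change the property to java.lang.Boolean so that
     "not present" (null) is distinct from false. -->
<value name="test-account"
       style="element"
       get-method="getTestAccount"
       set-method="setTestAccount"
       usage="optional"/>
```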

It is worth knowing, especially if you committed a change like this as "simple and easy" to your manager :-)

Update: We had another occurrence of this problem with a primitive integer. The field was optional, so when the value was 0 (zero), JiBX omitted that element in the marshalled XML. Earlier we were thinking only about boolean types.





Sunday, September 26, 2010

Stateful Quartz job

Quartz does a good job of scheduling. We have a Quartz job for processing a file, scheduled to run every 5 minutes. Recently we came across a problem. Normally the file processing takes less than the trigger interval, so there is only one job running at any point in time. Due to large file sizes, the file processing started taking more than 5 minutes, which means a second job would be triggered before the first one finished. The second one would grab the same file again and create many issues: duplicate entries, file permission problems, etc.
We started thinking about the "only one should run at any point in time" requirement. Luckily, there is a simple and smart way to achieve this: declare the job as "stateful".

If a job is stateful and a trigger attempts to 'fire' the job while it is already executing, the trigger will block (wait) until the previous execution completes.

You 'mark' a Job as stateful by having it implement the StatefulJob interface, rather than the Job interface.


  public class SampleJob extends QuartzJobBean implements StatefulJob {

      protected void executeInternal(final JobExecutionContext ctx)
              throws JobExecutionException {
          // Do the job here
      }
  }
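The guarantee can be illustrated with a toy model in plain java.util.concurrent code. This is not Quartz itself; it only mimics the "wait for the previous execution to finish" behaviour that StatefulJob gives you:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SerializedJobDemo {
    private final Object monitor = new Object();
    private final AtomicInteger running = new AtomicInteger();
    private final AtomicInteger maxObserved = new AtomicInteger();

    // Each "trigger firing" waits for the previous execution to finish,
    // which is what marking a Quartz job stateful buys you.
    public void execute() {
        synchronized (monitor) {
            maxObserved.accumulateAndGet(running.incrementAndGet(), Math::max);
            try { Thread.sleep(20); } catch (InterruptedException ignored) { }
            running.decrementAndGet();
        }
    }

    // Fire the job from several threads at once and report the maximum
    // number of overlapping executions observed (1 means no overlap).
    public int fireConcurrently(int triggers) {
        ExecutorService pool = Executors.newFixedThreadPool(triggers);
        for (int i = 0; i < triggers; i++) {
            pool.submit(this::execute);
        }
        pool.shutdown();
        try { pool.awaitTermination(10, TimeUnit.SECONDS); } catch (InterruptedException ignored) { }
        return maxObserved.get();
    }
}
```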

Sunday, August 29, 2010

XFire workaround when not all WSDL operations are in the client interface

One of the major challenges in a project is when we start integration testing with the client. No matter how many precautions we take, there will be some surprises. One of our clients changed their WSDL at the last moment. The change was very simple: they combined their multiple web services into a single WSDL, so in effect we care about only a few of the WSDL operations. But XFire expects all the WSDL operations to be present in the client interface. How can we solve this? Finally, we solved it by deploying a local copy of their WSDL with the unnecessary operations removed. The XFire client then thinks it is looking at the real WSDL, and the interface has all the operations defined. To do this, we created a .war directory, put the local WSDL in it, and deployed it; the XFire client then points to this WSDL.

Original client WSDL (other details omitted):

<wsdl:operation name="Sample1">
         <soap:operation soapAction="http:xx/sample1" style="document"/>
         <wsdl:input>
            <soap:body use="literal"/>
         </wsdl:input>
         <wsdl:output>
            <soap:body use="literal"/>
         </wsdl:output>
         <wsdl:fault name="CommunicationExceptionFault">
            <soap:fault name="CommunicationExceptionFault" use="literal"/>
         </wsdl:fault>
         <wsdl:fault name="InvalidOperationExceptionFault">
            <soap:fault name="InvalidOperationExceptionFault" use="literal"/>
         </wsdl:fault>
      </wsdl:operation>
   </wsdl:binding>
<wsdl:operation name="sample2">
         <soap:operation soapAction="http:xx/sample2" style="document"/>
         <wsdl:input>
            <soap:body use="literal"/>
         </wsdl:input>
         <wsdl:output>
            <soap:body use="literal"/>
         </wsdl:output>
         <wsdl:fault name="CommunicationExceptionFault">
            <soap:fault name="CommunicationExceptionFault" use="literal"/>
         </wsdl:fault>
         <wsdl:fault name="InvalidOperationExceptionFault">
            <soap:fault name="InvalidOperationExceptionFault" use="literal"/>
         </wsdl:fault>
      </wsdl:operation>
   </wsdl:binding>
Local copy (we are only interested in operation Sample1):
localClient.wsdl
<wsdl:operation name="Sample1">
         <soap:operation soapAction="http:xx/sample1" style="document"/>
         <wsdl:input>
            <soap:body use="literal"/>
         </wsdl:input>
         <wsdl:output>
            <soap:body use="literal"/>
         </wsdl:output>
         <wsdl:fault name="CommunicationExceptionFault">
            <soap:fault name="CommunicationExceptionFault" use="literal"/>
         </wsdl:fault>
         <wsdl:fault name="InvalidOperationExceptionFault">
            <soap:fault name="InvalidOperationExceptionFault" use="literal"/>
         </wsdl:fault>
      </wsdl:operation>
   </wsdl:binding>

And the web service client is defined as:
<bean id="sampleClient" parent="abstractsampleClient">
        <property name="serviceInterface"
                  value="com.testsample.SampleController" />
        <property name="wsdlDocumentUrl"
                  value="http://localhost/mock-client-wsdl/localClient.wsdl" />
    </bean>

Note: the localClient.wsdl file is created in a directory named mock-client-wsdl.war and deployed on the server, so that it acts as an external .wsdl file.

Consider this:
 1) Keeping a local copy of the client WSDL is not always a good idea.
 2) Because of this complexity, the application's maintainability will suffer without proper documentation.

However, this is the simplest workaround to make it work.

Sunday, August 22, 2010

Sharing Properties in Spring and Ant

Sharing properties among the different modules of an application is a big topic of discussion. It is very important for changing properties per environment during production deployment, and also for effectively testing the application by pointing it at, or injecting, a mock implementation. One of the many advantages of the Spring framework is this testing flexibility: we can even test against production using mock objects. To get the full benefit, we use the PropertyPlaceholderConfigurer class from the Spring framework.

1. Define sample.properties
jdbc.driver=org.hsqldb.jdbcDriver
2. Set up PropertyPlaceholderConfigurer
  <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
      <property name="locations" value="classpath:bin/sample.properties" />
  </bean>

3. Use them as follows:
 <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
     <property name="driverClassName"><value>${jdbc.driver}</value></property>
     ...
     ...
 </bean>

This class resolves placeholders in bean property values of context definitions. It pulls values from a properties file into bean definitions. The default placeholder syntax follows the Ant / Log4J / JSP EL style: ${...}.

If you want to check against multiple properties files, specify multiple resources via the "locations" setting. You can also define multiple PropertyPlaceholderConfigurers, each with its own placeholder syntax. Default property values can be defined via "properties", to make overriding definitions in properties files optional.

A configurer will also check against system properties (e.g. "user.dir") if it cannot resolve a placeholder with any of the specified properties; this can be customized via "systemPropertiesMode". Property values can be converted after reading them in by overriding the PropertyResourceConfigurer.convertPropertyValue(java.lang.String) method; for example, encrypted values can be detected and decrypted before processing.
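The placeholder substitution itself can be modelled in a few lines of plain Java. This is only a sketch of the ${...} resolution and system-property fallback described above, not Spring's actual implementation:

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderSketch {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    // Minimal model of what PropertyPlaceholderConfigurer does to a single
    // bean property value: replace each ${key} with the entry from the
    // properties file, fall back to system properties, and leave the
    // placeholder untouched when the key is unknown.
    static String resolve(String value, Properties props) {
        Matcher m = PLACEHOLDER.matcher(value);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String key = m.group(1);
            String replacement = props.getProperty(key, System.getProperty(key, m.group(0)));
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```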

Another benefit is that the same file can be used in the Ant build file as well:

<property file="sample.properties"/>
...
<target name="browse">
    <java classname="org.hsqldb.util.DatabaseManager" fork="yes" failonerror="true">
        <classpath refid="classpath"/>
        <arg value="-url"/>
        <arg value="${jdbc.url}"/>
    </java>
</target>
 

Tuesday, August 17, 2010

Soap Message Logging in Xfire

Anyone who has even a little application-support experience knows the importance of appropriate logging in the code. If something is not working, it is hell at 2 AM trying to figure out what happened to a request inside the application. That is one of the reasons people from production support make faces when they hear that the dev team is using a new framework. Most frameworks, in order to keep performance good, use very little logging. Some frameworks give the option to the developer: if the developer wants extensive logging, he has to configure it. XFire is one of them. In my current application, it was important to see the entire SOAP request that came in and the SOAP response that went out of the system.

Since XFire is StAX-based, it never caches the whole message in memory, but luckily there are options to configure it to log the whole incoming/outgoing messages.

First we need to add a DOMInHandler to the request flow. It reads the incoming stream into a DOM document and resets the stream to a W3CDOMStreamReader. Then we use a LoggingHandler to log the message.

The same approach can be used for outgoing messages, using DOMOutHandler and LoggingHandler.

// Tell XFire to cache a DOM document for the various in/out flows
service.addInHandler(new org.codehaus.xfire.util.dom.DOMInHandler());
service.addOutHandler(new org.codehaus.xfire.util.dom.DOMOutHandler());

// Add a logging handler to each flow
service.addInHandler(new org.codehaus.xfire.util.LoggingHandler());
service.addOutHandler(new org.codehaus.xfire.util.LoggingHandler());

In XML config it looks like this:
&lt;bean id="sampleWS"<br />
          class="org.codehaus.xfire.spring.remoting.XFireExporter"<br />
          abstract="true"><br />
        &lt;property name="style" value="document" /><br />
        &lt;property name="serviceFactory" ref="bindingServiceFactory" /><br />
        &lt;property name="namespace" value="http://sample.com" /><br />
        &lt;property name="xfire" ref="xfire" /><br />
        &lt;property name="inHandlers"><br />
            &lt;list><br />
                &lt;ref bean="inHandler" /><br />
                &lt;ref bean="loggingHandler" /><br />
            &lt;/list><br />
        &lt;/property><br />
        &lt;property name="outHandlers"><br />
            &lt;list><br />
                &lt;ref bean="outHandler" /><br />
                &lt;ref bean="loggingHandler" /><br />
            &lt;/list><br />
        &lt;/property><br />
       <br />
    &lt;/bean><br />
<br />
<br />
 &lt;bean id="inHandler"<br />
          class="org.codehaus.xfire.util.dom.DOMInHandler"<br />
          scope="prototype" /><br />
<br />
    &lt;bean id="outHandler"<br />
          class="org.codehaus.xfire.util.dom.DOMOutHandler"<br />
          scope="prototype" /><br />
<br />
    &lt;bean id="loggingHandler"<br />
          class="org.codehaus.xfire.util.LoggingHandler"<br />
          scope="prototype" />