Tuesday, December 27, 2011

Debugging with Maven

For Non-Forked Process
The way to debug any Maven-based project is to use the "mvnDebug" command-line tool instead of "mvn". So, if I want to debug a Tomcat-based project, I would do:
mvnDebug tomcat:run

which enables debugging on the JVM and waits for a debugger to connect:

By default, mvnDebug listens on JDWP port 8000.

The next step is to connect to this VM through the debugger in Eclipse -
Go to Run->Debug Configurations->Remote Java Application and create a new "launch configuration" along these lines:



Click on Debug, and Eclipse should connect to the Maven VM; Tomcat should start up at this point.


For Forked Process
The above approach will, however, not work for forked processes like JUnit tests - by default, Maven forks a new JVM to run tests.
There are two workarounds for debugging JUnit tests:
The first is to prevent forking of JUnit tests altogether, using the forkMode parameter:
mvnDebug test -Dtest=GtdProjectDaoIntegrationTest -DforkMode=never

The second workaround is to use the "maven.surefire.debug" property:
mvn -Dmaven.surefire.debug test -Dtest=GtdProjectDaoIntegrationTest

This would, by default, start the test JVM with the debugger listening on port 5005. A variation of this is to explicitly specify the port on which the debugger is to listen, along with additional JDWP options:
mvn -Dmaven.surefire.debug="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE" test


References
http://maven.apache.org/plugins/maven-surefire-plugin/examples/debugging.html

Monday, December 19, 2011

Concurrency - Executors and Spring Integration

This is a follow-up to a previous blog entry.

Thread Pool/Executors Based Implementation
A better approach than the raw thread version is a thread pool based one, where an appropriate thread pool size is defined based on the system where the task is running - Number of CPUs / (1 - Blocking Coefficient of Task). Venkat Subramaniam's book has more details:




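As a rough sketch of that heuristic (the class name and the 0.9 blocking coefficient below are my own illustrative choices, not something prescribed by the book):

```java
// Sketch of the pool-size heuristic: Number of CPUs / (1 - blocking coefficient).
// The blocking coefficient is the fraction of time a task spends blocked:
// near 0 for purely CPU-bound work, approaching 1 for heavily IO-bound work.
public class PoolSizeHeuristic {
    public static int poolSize(double blockingCoefficient) {
        int cpus = Runtime.getRuntime().availableProcessors();
        return (int) (cpus / (1 - blockingCoefficient));
    }

    public static void main(String[] args) {
        // e.g. for an IO-heavy task, a blocking coefficient of 0.9
        // suggests a pool of roughly 10x the number of CPUs
        System.out.println(poolSize(0.9));
    }
}
```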
First, I defined a custom task to generate a report part given a report part request, implemented as a Callable:
public class ReportPartRequestCallable implements Callable<ReportPart> {
    private final ReportRequestPart reportRequestPart;
    private final ReportPartGenerator reportPartGenerator;

    public ReportPartRequestCallable(ReportRequestPart reportRequestPart, ReportPartGenerator reportPartGenerator) {
        this.reportRequestPart = reportRequestPart;
        this.reportPartGenerator = reportPartGenerator;
    }

    @Override
    public ReportPart call() {
        return this.reportPartGenerator.generateReportPart(reportRequestPart);
    }
}

public class ExecutorsBasedReportGenerator implements ReportGenerator {
    private static final Logger logger = LoggerFactory.getLogger(ExecutorsBasedReportGenerator.class);

    private ReportPartGenerator reportPartGenerator;

    private ExecutorService executors = Executors.newFixedThreadPool(10);

    @Override
    public Report generateReport(ReportRequest reportRequest) {
        List<Callable<ReportPart>> tasks = new ArrayList<Callable<ReportPart>>();
        List<ReportRequestPart> reportRequestParts = reportRequest.getRequestParts();
        for (ReportRequestPart reportRequestPart : reportRequestParts) {
            tasks.add(new ReportPartRequestCallable(reportRequestPart, reportPartGenerator));
        }

        List<Future<ReportPart>> responseForReportPartList;
        List<ReportPart> reportParts = new ArrayList<ReportPart>();
        try {
            responseForReportPartList = executors.invokeAll(tasks);
            for (Future<ReportPart> reportPartFuture : responseForReportPartList) {
                reportParts.add(reportPartFuture.get());
            }

        } catch (Exception e) {
            logger.error(e.getMessage(), e);
            throw new RuntimeException(e);
        }
        return new Report(reportParts);
    }

 ......
}

Here a thread pool of size 10 is created using the Executors.newFixedThreadPool(10) call; a Callable task is generated for each of the report request parts and handed over to the thread pool through the ExecutorService abstraction -
responseForReportPartList = executors.invokeAll(tasks);
This call returns a list of Futures, each of which supports a get() method that blocks until the response is available.
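To see the invokeAll/Future interplay in isolation, here is a minimal, self-contained sketch - the squaring task and class name are made up for illustration, standing in for report part generation:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class InvokeAllSketch {
    public static List<Integer> squareAll(List<Integer> inputs) throws Exception {
        ExecutorService executor = Executors.newFixedThreadPool(4);
        try {
            List<Callable<Integer>> tasks = new ArrayList<Callable<Integer>>();
            for (final int n : inputs) {
                tasks.add(new Callable<Integer>() {
                    @Override
                    public Integer call() {
                        return n * n;
                    }
                });
            }
            // invokeAll blocks until all tasks complete and returns one Future
            // per task, in the same order as the task list
            List<Future<Integer>> futures = executor.invokeAll(tasks);
            List<Integer> results = new ArrayList<Integer>();
            for (Future<Integer> future : futures) {
                results.add(future.get()); // get() blocks until this task's result is available
            }
            return results;
        } finally {
            executor.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(squareAll(Arrays.asList(1, 2, 3))); // [1, 4, 9]
    }
}
```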

This is clearly a much better implementation compared to the raw thread version - under load, the number of threads is constrained to a manageable number.


Spring Integration Based Implementation
The approach that I personally like the most is using Spring Integration. The reason is that with Spring Integration I focus on the components doing the different tasks and leave it up to Spring Integration to wire the flow together, using an XML-based or annotation-based configuration. Here I will be using an XML-based configuration:

The components in my case are:
1. The component to generate the report part, given the report part request, which I had shown earlier.
2. A component to split the report request to report request parts:
public class DefaultReportRequestSplitter implements ReportRequestSplitter {
    @Override
    public List<ReportRequestPart> split(ReportRequest reportRequest) {
        return reportRequest.getRequestParts();
    }
}

3. A component to assemble/aggregate the report parts into a whole report:
public class DefaultReportAggregator implements ReportAggregator{

    @Override
    public Report aggregate(List<ReportPart> reportParts) {
        return new Report(reportParts);
    }

}

And that is all the Java code required with Spring Integration; the rest is wiring - here I have used a Spring Integration configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<beans ....

    <int:channel id="report.partsChannel"/>
    <int:channel id="report.reportChannel"/>
    <int:channel id="report.partReportChannel">
        <int:queue capacity="50"/>
    </int:channel>  
    <int:channel id="report.joinPartsChannel"/>


    <int:splitter id="splitter" ref="reportsPartSplitter" method="split"
            input-channel="report.partsChannel" output-channel="report.partReportChannel"/>

    <task:executor id="reportPartGeneratorExecutor" pool-size="10" queue-capacity="50"/>

    <int:service-activator id="reportsPartServiceActivator" ref="reportPartReportGenerator" method="generateReportPart"
            input-channel="report.partReportChannel" output-channel="report.joinPartsChannel">
        <int:poller task-executor="reportPartGeneratorExecutor" fixed-delay="500"/>
    </int:service-activator>

    <int:aggregator ref="reportAggregator" method="aggregate" 
            input-channel="report.joinPartsChannel" output-channel="report.reportChannel" ></int:aggregator> 

    <int:gateway id="reportGeneratorGateway" service-interface="org.bk.sisample.springintegration.ReportGeneratorGateway" 
           default-request-channel="report.partsChannel" default-reply-channel="report.reportChannel"/>
    
    <bean name="reportsPartSplitter" class="org.bk.sisample.springintegration.processors.DefaultReportRequestSplitter"></bean>
    <bean name="reportPartReportGenerator" class="org.bk.sisample.processors.DummyReportPartGenerator"/>
    <bean name="reportAggregator" class="org.bk.sisample.springintegration.processors.DefaultReportAggregator"/>
    <bean name="reportGenerator" class="org.bk.sisample.springintegration.SpringIntegrationBasedReportGenerator"/>

</beans>

SpringSource Tool Suite provides a great way of visualizing this file:
This matches perfectly with my original view of the user flow:

In the Spring Integration version of the code, I have defined the different components to handle the different parts of the flow:
1. A splitter to convert a report request to report request parts:
<int:splitter id="splitter" ref="reportsPartSplitter" method="split" 
        input-channel="report.partsChannel" output-channel="report.partReportChannel"/>

2. A service activator component to generate a report part from a report part request:

<int:service-activator id="reportsPartServiceActivator"  ref="reportPartReportGenerator" method="generateReportPart" 
            input-channel="report.partReportChannel" output-channel="report.joinPartsChannel">
    <int:poller task-executor="reportPartGeneratorExecutor" fixed-delay="500">
    </int:poller>
 </int:service-activator>
3. An aggregator to join the report parts back into a whole report - it is intelligent enough to correlate the split messages back to the original request, without any explicit coding required for it:
<int:aggregator ref="reportAggregator" method="aggregate" 
            input-channel="report.joinPartsChannel" output-channel="report.reportChannel" ></int:aggregator> 


What is interesting in this code is that, as in the Executors-based sample, the number of threads servicing each of these components is completely configurable in the XML file - by using appropriate channels to connect the different components, and by using task executors with the thread pool size set as an attribute of the executor.

In this code, I have defined a queue channel where the report request parts come in:

<int:channel id="report.partReportChannel">
        <int:queue capacity="50"/>
    </int:channel>  


This channel is serviced by the service activator component, using a task executor with a thread pool of size 10 and a queue capacity of 50:

<task:executor id="reportPartGeneratorExecutor" pool-size="10" queue-capacity="50" />
    
 <int:service-activator id="reportsPartServiceActivator"  ref="reportPartReportGenerator" method="generateReportPart" 
            input-channel="report.partReportChannel" output-channel="report.joinPartsChannel">
    <int:poller task-executor="reportPartGeneratorExecutor" fixed-delay="500">
    </int:poller>
 </int:service-activator>


All this through configuration!


The entire codebase for this sample is available at this github location: https://github.com/bijukunjummen/si-sample

Saturday, December 17, 2011

Concurrency - Sequential and Raw Thread

I worked on a project a while back, where the report flow was along these lines:



  1. User would request for a report
  2. The report request would be translated into smaller parts/sections
  3. The report for each part, based on the type of the part/section would be generated by a report generator
  4. The constituent report parts would be reassembled into a final report and given back to the user

My objective is to show how I progressed from a bad implementation to a fairly good implementation:

Some of the basic building blocks are best demonstrated through unit tests.
This is a test helper which generates a sample report request, with constituent report request parts:
public class FixtureGenerator {
    public static ReportRequest generateReportRequest(){
        List<ReportRequestPart> requestParts = new ArrayList<ReportRequestPart>();
        Map<String, String> attributes = new HashMap<String, String>();
        attributes.put("user","user");
        Context context = new Context(attributes );
    
        ReportRequestPart part1 = new ReportRequestPart(Section.HEADER, context);
        ReportRequestPart part2 = new ReportRequestPart(Section.SECTION1, context);
        ReportRequestPart part3 = new ReportRequestPart(Section.SECTION2, context);
        ReportRequestPart part4 = new ReportRequestPart(Section.SECTION3, context);
        ReportRequestPart part5 = new ReportRequestPart(Section.FOOTER, context);   
        
        requestParts.add(part1);        
        requestParts.add(part2);
        requestParts.add(part3);
        requestParts.add(part4);
        requestParts.add(part5);
        
        ReportRequest reportRequest  = new ReportRequest(requestParts );
        return reportRequest;
    }

}
And the test for the report generation:
@Test
public void testSequentialReportGeneratorTime() {
    long startTime = System.currentTimeMillis();
    Report report = this.reportGenerator.generateReport(FixtureGenerator.generateReportRequest());
    long timeForReport = System.currentTimeMillis() - startTime;
    assertThat(report.getSectionReports().size(), is(5));
    logger.error(String.format("Sequential Report Generator : %s ms", timeForReport));
}

The component which generates a part of the report is a dummy implementation, with a 2-second delay to simulate an IO-intensive call:
public class DummyReportPartGenerator implements ReportPartGenerator {

    @Override
    public ReportPart generateReportPart(ReportRequestPart reportRequestPart) {
        try {
            // Deliberately introduce a delay
            Thread.sleep(2000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return new ReportPart(reportRequestPart.getSection(), "Report for " + reportRequestPart.getSection());
    }
}

Sequential Implementation
Given this base set of classes, my first naive sequential implementation is the following:
public class SequentialReportGenerator implements ReportGenerator {
    private ReportPartGenerator reportPartGenerator;

    @Override
    public Report generateReport(ReportRequest reportRequest) {
        List<ReportRequestPart> reportRequestParts = reportRequest.getRequestParts();
        List<ReportPart> reportSections = new ArrayList<ReportPart>();
        for (ReportRequestPart reportRequestPart : reportRequestParts) {
            reportSections.add(reportPartGenerator.generateReportPart(reportRequestPart));
        }
        return new Report(reportSections);
    }

    ......
}

Obviously, for a report request with 5 parts, each part taking 2 seconds to be fulfilled, this report takes about 10 seconds to be returned to the user.

It begs to be made concurrent.

Raw Thread Based Implementation
The first concurrent implementation, not good but better than the sequential one, is the following: a thread is spawned for every report request part, the main thread waits on the report parts to be generated (using the Thread.join() method), and the pieces are aggregated as they come in.

public class RawThreadBasedReportGenerator implements ReportGenerator {
    private static final Logger logger = LoggerFactory.getLogger(RawThreadBasedReportGenerator.class);

    private ReportPartGenerator reportPartGenerator;

    @Override
    public Report generateReport(ReportRequest reportRequest) {
        List<ReportRequestPart> reportRequestParts = reportRequest.getRequestParts();
        List<Thread> threads = new ArrayList<Thread>();
        List<ReportPartRequestRunnable> runnablesList = new ArrayList<ReportPartRequestRunnable>();
        for (ReportRequestPart reportRequestPart : reportRequestParts) {
            ReportPartRequestRunnable reportPartRequestRunnable = new ReportPartRequestRunnable(reportRequestPart, reportPartGenerator);
            runnablesList.add(reportPartRequestRunnable);
            Thread thread = new Thread(reportPartRequestRunnable);
            threads.add(thread);
            thread.start();
        }

        for (Thread thread : threads) {
            try {
                thread.join();
            } catch (InterruptedException e) {
                logger.error(e.getMessage(), e);
            }
        }

        List<ReportPart> reportParts = new ArrayList<ReportPart>();

        for (ReportPartRequestRunnable reportPartRequestRunnable : runnablesList) {
            reportParts.add(reportPartRequestRunnable.getReportPart());
        }

        return new Report(reportParts);

    }    
    .....
}

The danger with this approach is that a new thread is created for every report part, so in a real-world scenario, if 100 simultaneous requests come in, with each request spawning 5 threads, this can potentially end up creating 500 costly threads in the VM!

So thread creation has to be constrained in some way. I will go through two more approaches where threads are controlled, in the next blog entry.

Friday, December 16, 2011

jquery maphilight

jquery maphilight is a fantastic jquery plugin to overlay an image with highlights, based on information from an imagemap.
I recently used it for one of my work projects and it worked out beautifully - highly recommended for overlay highlights.

Saturday, December 3, 2011

SimpleDateFormat and TimeZone

Recently I was stumped by a simple concept - I needed to transform a timestamp in the Europe/London timezone to a yyyyMMdd format. So I had code along these lines to do this:
SimpleDateFormat formatter = new SimpleDateFormat("yyyyMMdd");
Calendar date = Calendar.getInstance(TimeZone.getTimeZone("Europe/London"));
date.set(Calendar.YEAR, 2011);
date.set(Calendar.MONTH, 10);
date.set(Calendar.DAY_OF_MONTH, 15);
date.set(Calendar.HOUR_OF_DAY, 3);
int aDateName = Integer.valueOf(formatter.format(date.getTime()));
System.out.println(aDateName);

I was expecting it to print 20111115 as the output.

However, the output was 20111114 (when executing from the US EST timezone). This is because I am transforming the Calendar to a Date using the getTime() API, and a java.util.Date is simply an instant in time with no timezone information of its own. The workaround is to set the timezone at the point where the date is printed back to a string, by setting the timeZone attribute of SimpleDateFormat; otherwise it formats based on the default timezone of the machine where the code runs.

This is what fixed the code for me:

.....
formatter.setTimeZone(TimeZone.getTimeZone("Europe/London"));
.....
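Putting the whole thing together (the class name here is just for illustration), a version that formats consistently regardless of the JVM's default timezone might look like:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;

public class LondonDateFormatter {
    public static String format(Calendar date) {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyyMMdd");
        // Without this line, the formatter falls back to the JVM's default
        // timezone, which can shift the date across midnight (e.g. 20111114 in US EST)
        formatter.setTimeZone(TimeZone.getTimeZone("Europe/London"));
        return formatter.format(date.getTime());
    }

    public static void main(String[] args) {
        Calendar date = Calendar.getInstance(TimeZone.getTimeZone("Europe/London"));
        date.set(Calendar.YEAR, 2011);
        date.set(Calendar.MONTH, 10); // Calendar months are 0-based: 10 = November
        date.set(Calendar.DAY_OF_MONTH, 15);
        date.set(Calendar.HOUR_OF_DAY, 3);
        System.out.println(format(date)); // 20111115, whatever the default timezone
    }
}
```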

Friday, November 11, 2011

Enabling Proxy server for CXF Client

If you ever have the need to enable a proxy server for a CXF-based web service client, this is the way to go about it:

Assuming a Spring based application, add this new namespace to the Spring configuration file:

 xmlns:http-conf="http://cxf.apache.org/transports/http/configuration"
....
 xsi:schemaLocation="...
    http://cxf.apache.org/transports/http/configuration
    http://cxf.apache.org/schemas/configuration/http-conf.xsd">
...



Now, if all outbound requests are via a proxy server then make the following entry in your Spring configuration:

 <http-conf:conduit name="*.http-conduit">
  <http-conf:client ProxyServer="${proxy.server}" ProxyServerPort="${proxy.port}" />  
 </http-conf:conduit>

If you need to disable the proxy server, just provide an empty proxy.server parameter.

If you need to enable it only for specific WS requests, then the conduit name has to be modified to reference the specific service's portType:
http-conf:conduit name="{http://apache.org/hello_world_soap_http}SoapPort.http-conduit"

More information here.

Wednesday, October 19, 2011

SOAP-Fault for a Contract First service using Spring-WS

This is a follow up to the simple webservice described in this post

Just to recap, my webservice was responsible for returning the details of a "member" given the identifier for the member. If a member was not found for the identifier, the service did not return a member.

For this request, which matches a member in the system:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:mem="http://bk.org/memberservice/">
   <soapenv:Header/>
   <soapenv:Body>
      <mem:MemberDetailsRequest xmlns:mem="http://bk.org/memberservice/">
         <mem:id>1</mem:id>
      </mem:MemberDetailsRequest>
   </soapenv:Body>
</soapenv:Envelope>

this is the response:

<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP-ENV:Header/>
   <SOAP-ENV:Body>
      <ns2:MemberDetailsResponse xmlns:ns2="http://bk.org/memberservice/">
         <ns2:memberdetail>
            <ns2:id>1</ns2:id>
            <ns2:name>john doe</ns2:name>
            <ns2:phone>111-111-1111</ns2:phone>
            <ns2:city>City</ns2:city>
            <ns2:state>State</ns2:state>
         </ns2:memberdetail>
      </ns2:MemberDetailsResponse>
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>


and for a case where there is no match, this is the response:

<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP-ENV:Header/>
   <SOAP-ENV:Body>
      <ns2:MemberDetailsResponse xmlns:ns2="http://bk.org/memberservice/"/>
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

Now, instead of giving a response of this form, I want my contract to explicitly declare a fault for the case where a member is not found.

There are a couple of different ways of doing it.

The first way is to simply throw a RuntimeException; this is trapped by the Spring-WS stack (using a SimpleFaultMessageResolver) and returned as a SOAP fault.
So now my endpoint implementation is:
...
@Endpoint
public class GetMemberDetailsEndpoint {
    @Autowired private MemberManager memberManager;

    @PayloadRoot(namespace = "http://bk.org/memberservice/", localPart = "MemberDetailsRequest")
    @ResponsePayload
    public MemberDetailsResponse getMemberDetails(@RequestPayload MemberDetailsRequest request) throws Exception {
        MemberDetail memberDetail = memberManager.findByMemberId(request.getId());
        if (memberDetail == null) {
            throw new RuntimeException("Member Not Found");
        }
        MemberDetailsResponse response = new MemberDetailsResponse(memberDetail);
        return response;
    }
...

and the response for the case where a member is not found is a clean SOAP fault, with the message set in the exception as the fault string:

<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP-ENV:Header/>
   <SOAP-ENV:Body>
      <SOAP-ENV:Fault>
         <faultcode>SOAP-ENV:Server</faultcode>
         <faultstring xml:lang="en">Member Not Found</faultstring>
      </SOAP-ENV:Fault>
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>


The second way is to define a custom exception class, and throw this custom exception from the endpoint:
@PayloadRoot(namespace = "http://bk.org/memberservice/", localPart = "MemberDetailsRequest")
@ResponsePayload
public MemberDetailsResponse getMemberDetails(@RequestPayload MemberDetailsRequest request) throws Exception {
    MemberDetail memberDetail = memberManager.findByMemberId(request.getId());
    if (memberDetail == null) {
        throw new MemberDetailsFault("Member Not Found");
    }
    MemberDetailsResponse response = new MemberDetailsResponse(memberDetail);
    return response;
}

and define a resolver which maps exceptions of this new type (MemberDetailsFault) to a more descriptive fault string:
<bean class="org.springframework.ws.soap.server.endpoint.SoapFaultMappingExceptionResolver">
    <property name="defaultFault" value="SERVER"/>
    <property name="exceptionMappings">
        <value>
            org.bk.memberservice.message.MemberDetailsFault=SERVER,No Member-message from application context
        </value>
    </property>
</bean>

The exception resolver is discovered by type, so there is no need to give the bean a name. With this in place, the SOAP fault will have the fault string based on the exceptionMappings entry:
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP-ENV:Header/>
   <SOAP-ENV:Body>
      <SOAP-ENV:Fault>
         <faultcode>SOAP-ENV:Server</faultcode>
         <faultstring xml:lang="en">No Member-message from application context</faultstring>
      </SOAP-ENV:Fault>
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

A third way is to annotate the custom exception class with a @SoapFault annotation, specifying the fault string to be returned as part of the fault:
@SoapFault(faultCode = FaultCode.SERVER, faultStringOrReason = "No Member Found - From @SoapFault annotation")
public class MemberDetailsFault extends RuntimeException {
    public MemberDetailsFault(String message) {
        super(message);
    }
}

A new resolver is required to interpret exceptions with this annotation as SOAP faults - SoapFaultAnnotationExceptionResolver:
<bean class="org.springframework.ws.soap.server.endpoint.SoapFaultAnnotationExceptionResolver"/>

With this in place, the response is:
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP-ENV:Header/>
   <SOAP-ENV:Body>
      <SOAP-ENV:Fault>
         <faultcode>SOAP-ENV:Server</faultcode>
         <faultstring xml:lang="en">No Member Found - From @SoapFault annotation</faultstring>
      </SOAP-ENV:Fault>
   </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

I personally prefer the third approach; it is a little more concise.

Sample available at this location: git://github.com/bijukunjummen/memberservice-contractfirst.git

Monday, October 3, 2011

Spring Custom Date Editor

Recently I had a need to convert a date specified in this sample format, "10/01/2010 01:05:20", into a java.util.Date, with the date specified in a Spring configuration file:

<bean name="aBean" p:id="1" class="org.bk.BeanClass" p:fromDate="01/10/2011 01:05:00" p:toDate="01/10/2011 02:05:00" p:counterId="1"/>

The time portion of the date was, however, being dropped by Spring.

The fix is simple: register a custom date editor which understands the appropriate date format:

<bean class="org.springframework.beans.factory.config.CustomEditorConfigurer">
    <property name="customEditors">
        <map>
            <entry key="java.util.Date">
                <bean class="org.springframework.beans.propertyeditors.CustomDateEditor">
                    <constructor-arg>
                        <bean class="java.text.SimpleDateFormat">
                            <constructor-arg><value>MM/dd/yyyy hh:mm:ss</value></constructor-arg>
                        </bean>
                    </constructor-arg>
                    <constructor-arg index="1"><value>true</value></constructor-arg>
                </bean>
            </entry>
        </map>
    </property>
</bean>

Thursday, September 29, 2011

Indianapolis Java User Group - Presentation on Clojure

I attended a session on Clojure at Indianapolis JUG yesterday. The session was conducted by Carin Meier.

I have wanted to look at Clojure for some time now, but have been put off by the different-looking LISP syntax. Instead, I have been learning Scala as my functional JVM-based language this year.

The session was great - it covered the basics of Clojure, but presented them in a fun way - and it has provided me with enough motivation to take a second look at Clojure.

The presentation slides are available at Carin Meier's Github account.


Saturday, September 10, 2011

Using SBT 0.10.1, Eclipsify for a new Scala project

Run sbt (0.10.1) in a new folder:

D:\samplescala>sbt

D:\samplescala>set SCRIPT_DIR=C:\util\sbt\

D:\samplescala>java -Dfile.encoding=UTF8 -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m -jar "C:\util\sbt\sbt
-launch.jar"
Getting net.java.dev.jna jna 3.2.3 ...
:: retrieving :: org.scala-tools.sbt#boot-app
        confs: [default]
        1 artifacts copied, 0 already retrieved (838kB/46ms)
Getting Scala 2.8.1 (for sbt)...
:: retrieving :: org.scala-tools.sbt#boot-scala
        confs: [default]
        3 artifacts copied, 0 already retrieved (15178kB/235ms)
Getting org.scala-tools.sbt sbt_2.8.1 0.10.1 ...
:: retrieving :: org.scala-tools.sbt#boot-app
        confs: [default]
        36 artifacts copied, 0 already retrieved (6414kB/965ms)
[info] Set current project to default-097978 (in build file:/D:/samplescala/)

The following folders will show up under the root folder:

Create the default source/test/resource structure:
mkdir src\main\resources
mkdir src\main\scala
mkdir src\main\java
mkdir src\test\resources
mkdir src\test\scala
mkdir src\test\java

Create a build configuration file build.sbt, and place it in the root of the project with the following content:

name:="samplescala"

version:="1.0"

scalaVersion := "2.9.0-1"

libraryDependencies ++= Seq(
    "junit" % "junit" % "4.8" % "test",
    "org.scalatest" % "scalatest_2.9.0" % "1.6.1"
)

defaultExcludes ~= (filter => filter || "*~")

Add one sample test in the file src\test\scala\gcdtests.scala, just to test out the sbt configuration:
package com.sample

object Gcd{
    def gcd(a:Int, b:Int):Int = if (b==0) a else gcd(b, a%b)
}


import org.scalatest.FlatSpec
import org.scalatest.matchers.ShouldMatchers


class GcdTests extends FlatSpec with ShouldMatchers{
    "GCD of 1440, 408" should  "be 24" in {
        Gcd.gcd(1440, 408) should equal (24)
    }
}


Invoke sbt and run test; if everything is configured correctly, the following output will be displayed:
D:\samplescala>sbt

D:\samplescala>set SCRIPT_DIR=C:\util\sbt\

D:\samplescala>java -Dfile.encoding=UTF8 -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m -jar "C:\util\sbt\sbt
-launch.jar"
[info] Set current project to default-097978 (in build file:/D:/samplescala/)
> test
[info] Updating {file:/D:/samplescala/}default-097978...
[info] Done updating.
[info] GcdTests:
[info] GCD of 1440, 408
[info] - should be 24
[info] Passed: : Total 1, Failed 0, Errors 0, Passed 1, Skipped 0
[success] Total time: 1 s, completed Sep 10, 2011 7:00:28 PM

Now, to import this project into Eclipse, create a build.sbt file with the following contents and place it in the project/plugins folder:
libraryDependencies <+= (libraryDependencies, sbtVersion) { (deps, version) => 
    "de.element34" %% "sbt-eclipsify" % "0.10.0-SNAPSHOT"
}

Restart sbt. If the plugin is configured correctly, eclipse will be a valid task in sbt; running it will create the .project and .classpath files for Eclipse:
> eclipse
[info] Starting eclipse
[info] written .project for samplescala
[info] written .classpath for samplescala
[info] * Don't forget to install the Scala IDE Plugin from http://www.scalaide.org/
[info] You may now import your projects in Eclipse

Assuming that Scala IDE for Eclipse is installed, import this new project into Eclipse:

Thursday, September 8, 2011

Simple Introduction to AOP - Session 5

This will be a wrap-up of the AOP intro, with an example that comprehensively exercises the concepts introduced in the previous sessions.

The use case is simple: I am going to define a custom annotation, @PerfLog, and I expect calls to methods annotated with it to be timed and logged.
Let me start by defining the annotation:

package org.bk.annotations;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
public @interface PerfLog {
    
}
Now to annotate some service methods with this annotation:

@Service
public class DefaultInventoryService implements InventoryService{
    
    private static Logger logger = LoggerFactory.getLogger(InventoryService.class);

    
    @Override
    public Inventory create(Inventory inventory) {
        logger.info("Create Inventory called");
        inventory.setId(1L);
        return inventory; 
    }

    @Override
    public List<Inventory> list() {
        return new ArrayList<Inventory>();
    }

    @Override
    @PerfLog
    public Inventory update(Inventory inventory) {
        return inventory;
    }

    @Override
    public boolean delete(Long id) {
        logger.info("Delete Inventory called");
        return true;
    }

    @Override
    @PerfLog
    public Inventory findByVin(String vin) {
        logger.info("find by vin called");
        return new Inventory("testmake", "testmodel","testtrim","testvin" );
    }

    @Override
    @PerfLog
    public Inventory compositeUpdateService(String vin, String newMake) {
        logger.info("composite Update Service called");
        Inventory inventory = findByVin(vin);
        inventory.setMake(newMake);
        update(inventory);
        return inventory;
    }
}

Here three methods of DefaultInventoryService have been annotated with the @PerfLog annotation - update, findByVin, and compositeUpdateService, which internally invokes findByVin and update.

Now for the Aspect which will intercept all calls to methods annotated with @PerfLog and log the time taken for the method call:

package org.bk.inventory.aspect;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Aspect
public class AuditAspect {

    private static Logger logger = LoggerFactory.getLogger(AuditAspect.class);

    @Pointcut("execution(@org.bk.annotations.PerfLog * *.*(..))")
    public void performanceTargets(){}
   

    @Around("performanceTargets()")
    public Object logPerformanceStats(ProceedingJoinPoint joinpoint) {
        try {
            long start = System.nanoTime();
            Object result = joinpoint.proceed();
            long end = System.nanoTime();
            logger.info(String.format("%s took %d ns", joinpoint.getSignature(), (end - start)));
            return result;
        } catch (Throwable e) {
            throw new RuntimeException(e);
        }
    }
}

Here the pointcut expression -
@Pointcut("execution(@org.bk.annotations.PerfLog * *.*(..))")
selects all methods annotated with @PerfLog annotation, and the aspect method logPerformanceStats logs the time taken by the method calls.

To test this:
package org.bk.inventory;

import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;

import org.bk.inventory.service.InventoryService;
import org.bk.inventory.types.Inventory;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:/testApplicationContextAOP.xml")
public class AuditAspectTest {

    @Autowired 
    InventoryService inventoryService;
        
    @Test
    public void testInventoryService() {
        Inventory inventory = this.inventoryService.create(new Inventory("testmake", "testmodel","testtrim","testvin" ));
        assertThat(inventory.getId(), is(1L));
        
        assertThat(this.inventoryService.delete(1L), is(true));
        assertThat(this.inventoryService.compositeUpdateService("vin","newmake").getMake(),is("newmake"));
    }

}

When this test is invoked the output is the following:
2011-09-08 20:54:03,521 org.bk.inventory.service.InventoryService - Create Inventory called
2011-09-08 20:54:03,536 org.bk.inventory.service.InventoryService - Delete Inventory called
2011-09-08 20:54:03,536 org.bk.inventory.service.InventoryService - composite Update Service called
2011-09-08 20:54:03,536 org.bk.inventory.service.InventoryService - find by vin called
2011-09-08 20:54:03,536 org.bk.inventory.aspect.AuditAspect - Inventory org.bk.inventory.service.DefaultInventoryService.findByVin(String) took 64893 ns
2011-09-08 20:54:03,536 org.bk.inventory.aspect.AuditAspect - Inventory org.bk.inventory.service.DefaultInventoryService.update(Inventory) took 1833 ns
2011-09-08 20:54:03,536 org.bk.inventory.aspect.AuditAspect - Inventory org.bk.inventory.service.DefaultInventoryService.compositeUpdateService(String, String) took 1371171 ns

The advice is correctly invoked for findByVin, update, and compositeUpdateService.

This sample is available at: git://github.com/bijukunjummen/AOP-Samples.git


Links to all sessions on AOP:
AOP Session 1 - Decorator Pattern using Java Dynamic Proxies
AOP Session 2 - Using Spring AOP - xml based configuration
AOP Session 3 - Using Spring AOP - @AspectJ based configuration - with/without compile time weaving
AOP Session 4 - Native AspectJ with compile time weaving
AOP Session 5 - Comprehensive Example

Friday, September 2, 2011

Simple Introduction to AOP - Session 4

Yet another way to define an aspect - this time using native aspectj notation.
package org.bk.inventory.aspect;

import org.bk.inventory.types.Inventory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public aspect AuditAspect {
    private static Logger logger = LoggerFactory.getLogger(AuditAspect.class);

    pointcut serviceMethods() : execution(* org.bk.inventory.service.*.*(..));

    pointcut serviceMethodsWithInventoryAsParam(Inventory inventory) : execution(* org.bk.inventory.service.*.*(Inventory)) && args(inventory);

    before() : serviceMethods() {
        logger.info("before method");
    }

    Object around() : serviceMethods() {
        long start = System.nanoTime();
        Object result = proceed();
        long end = System.nanoTime();
        logger.info(String.format("%s took %d ns", thisJoinPointStaticPart.getSignature(),
                (end - start)));
        return result;
    }

    Object around(Inventory inventory) : serviceMethodsWithInventoryAsParam(inventory) {
        Object result = proceed(inventory);
        logger.info(String.format("WITH PARAM: %s", inventory.toString()));
        return result;
    }
    after() : serviceMethods() {
        logger.info("after method");
    }
}


This maps to the previously defined @AspectJ notation.

Since this is a DSL specifically for defining aspects, it is not understood by the Java compiler. AspectJ provides a tool (ajc) to compile these native AspectJ files and to weave the aspects into the targeted pointcuts. Maven provides a plugin which seamlessly invokes ajc at the point of compilation:

   <plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.0</version>
    <dependencies>
     <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjrt</artifactId>
      <version>${aspectj.version}</version>
     </dependency>
     <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjtools</artifactId>
      <version>${aspectj.version}</version>
     </dependency>
    </dependencies>
    <executions>
     <execution>
      <goals>
       <goal>compile</goal>
       <goal>test-compile</goal>
      </goals>
     </execution>
    </executions>
    <configuration>
     <outxml>true</outxml>
     <aspectLibraries>
      <aspectLibrary>
       <groupId>org.springframework</groupId>
       <artifactId>spring-aspects</artifactId>
      </aspectLibrary>
     </aspectLibraries>
     <source>1.6</source>
     <target>1.6</target>
    </configuration>
   </plugin>



Links to all sessions on AOP:
AOP Session 1 - Decorator Pattern using Java Dynamic Proxies
AOP Session 2 - Using Spring AOP - xml based configuration
AOP Session 3 - Using Spring AOP - @AspectJ based configuration - with/without compile time weaving
AOP Session 4 - Native AspectJ with compile time weaving
AOP Session 5 - Comprehensive Example

Saturday, August 20, 2011

Simple Introduction to AOP - Session 3

Another way of defining an aspect is using @AspectJ annotations - which are natively understood by Spring:
package org.bk.inventory.aspect;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.After;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.Pointcut;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Aspect
public class AuditAspect {

    private static Logger logger = LoggerFactory.getLogger(AuditAspect.class);

    @Pointcut("execution(* org.bk.inventory.service.*.*(..))")
    public void serviceMethods(){
        //
    }

    
    @Before("serviceMethods()")
    public void beforeMethod() {
        logger.info("before method");
    }

    @Around("serviceMethods()")
    public Object aroundMethod(ProceedingJoinPoint joinpoint) {
        try {
            long start = System.nanoTime();
            Object result = joinpoint.proceed();
            long end = System.nanoTime();
            logger.info(String.format("%s took %d ns", joinpoint.getSignature(), (end - start)));
            return result;
        } catch (Throwable e) {
            throw new RuntimeException(e);
        }
    }
      
    @After("serviceMethods()")
    public void afterMethod() {
        logger.info("after method");
    }    
}


The @Aspect annotation on the class identifies it as an aspect definition. It starts by defining the pointcuts:
@Pointcut("execution(* org.bk.inventory.service.*.*(..))")
    public void serviceMethods(){}
The above identifies all the methods of all types in the org.bk.inventory.service package; the pointcut is referred to by the name of the method on which the annotation is placed - in this case "serviceMethods". Next, the advice is defined using the @Before("serviceMethods()"), @After("serviceMethods()") and @Around("serviceMethods()") annotations, and the specifics of what needs to happen are in the body of the methods carrying those annotations. Spring AOP natively understands the @AspectJ annotations, if this aspect is defined as a bean:
<bean id="auditAspect" class="org.bk.inventory.aspect.AuditAspect" />
Spring would create a dynamic proxy to apply the advice on all the target beans identified as part of the pointcut notation.

Links to all sessions on AOP:
AOP Session 1 - Decorator Pattern using Java Dynamic Proxies
AOP Session 2 - Using Spring AOP - xml based configuration
AOP Session 3 - Using Spring AOP - @AspectJ based configuration - with/without compile time weaving
AOP Session 4 - Native AspectJ with compile time weaving
AOP Session 5 - Comprehensive Example

Saturday, August 13, 2011

Simple Introduction to AOP - Session 2

Here, I will show how the cross-cutting concern that was introduced in the previous session can be implemented using Spring AOP. Spring offers multiple ways of implementing aspects: XML configuration based and @AspectJ based. In this specific example, I will use the XML configuration based way of defining the aspect.

Spring AOP works in the context of a Spring container, so the service implementation that was defined in the previous session needs to be a Spring bean; I am defining it using the @Service annotation:
@Service
public class DefaultInventoryService implements InventoryService{
...
}
Now, I want to record the time taken for each of the method calls of my DefaultInventoryService - I am first going to modularize this as an "advice":
package org.bk.inventory.aspect;

import org.aspectj.lang.ProceedingJoinPoint;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AuditAdvice {

    private static Logger logger = LoggerFactory.getLogger(AuditAdvice.class);

    public void beforeMethod() {
        logger.info("before method");
    }

    public void afterMethod() {
        logger.info("after method");
    }

    public Object aroundMethod(ProceedingJoinPoint joinpoint) {
        try {
            long start = System.nanoTime();
            Object result = joinpoint.proceed();
            long end = System.nanoTime();
            logger.info(String.format("%s took %d ns", joinpoint.getSignature(), (end - start)));
            return result;
        } catch (Throwable e) {
            throw new RuntimeException(e);
        }
    }
        
}

This advice is expected to capture the time taken by the methods in DefaultInventoryService. So now to wire this advice to the DefaultInventoryService spring bean:
 <bean id="auditAspect" class="org.bk.inventory.aspect.AuditAdvice" />

 <aop:config>
  <aop:aspect ref="auditAspect">
   <aop:pointcut id="serviceMethods" expression="execution(* org.bk.inventory.service.*.*(..))" />

   <aop:before pointcut-ref="serviceMethods" method="beforeMethod" />  
   <aop:around pointcut-ref="serviceMethods" method="aroundMethod" />
   <aop:after-returning pointcut-ref="serviceMethods" method="afterMethod" /> 
  </aop:aspect>
 </aop:config>

This works by first defining the "pointcut" - the places (in this example, the service methods) to which the cross-cutting concern (capturing the method execution time) is added. Here I have defined it using a pointcut expression -
execution(* org.bk.inventory.service.*.*(..))
, which is essentially selecting all methods of all the types in the org.bk.inventory.service package. Once the pointcut is defined, the aspect defines what needs to be done around the pointcut (the advice), using the expression:
<aop:around pointcut-ref="serviceMethods" method="aroundMethod" />
This basically says that around every method of any service type, the aroundMethod of the AuditAdvice defined earlier is executed. Now, if the service methods are executed, I would see the advice getting invoked during the method execution; the following is a sample output when the create method of DefaultInventoryService is called:
org.bk.inventory.service.InventoryService - Create Inventory called
org.bk.inventory.aspect.AuditAdvice - Inventory org.bk.inventory.service.InventoryService.create(Inventory) took 82492 ns
Spring's AOP implementation works by generating a dynamic proxy at runtime for all the target beans, based on the defined pointcut.
Links to all sessions on AOP:
AOP Session 1 - Decorator Pattern using Java Dynamic Proxies
AOP Session 2 - Using Spring AOP - xml based configuration
AOP Session 3 - Using Spring AOP - @AspectJ based configuration - with/without compile time weaving
AOP Session 4 - Native AspectJ with compile time weaving
AOP Session 5 - Comprehensive Example

Tuesday, August 9, 2011

A simple introduction to AOP - Session 1

Why use AOP? A simple way to answer this question is to show an implementation of a cross-cutting concern without using AOP.

Consider a simple service and its implementation:
public interface InventoryService {
    public Inventory create(Inventory inventory);
    public List<Inventory> list();
    public Inventory findByVin(String vin);
    public Inventory update(Inventory inventory);
    public boolean delete(Long id);
    public Inventory compositeUpdateService(String vin, String newMake);
}

and its default implementation:
public class DefaultInventoryService implements InventoryService{

    private static Logger logger = LoggerFactory.getLogger(InventoryService.class);
    
    @Override
    public Inventory create(Inventory inventory) {
        logger.info("Create Inventory called");
        inventory.setId(1L);
        return inventory; 
    }

    @Override
    public List<Inventory> list() {
        return new ArrayList<Inventory>();
    }

    @Override
    public Inventory update(Inventory inventory) {
        return inventory;
    }

    @Override
    public boolean delete(Long id) {
        logger.info("Delete Inventory called");
        return true;
    }
....

This is just one service. Assume that there are many more services in this project.

So now, if there were a requirement to record the time taken by each of the service methods, the option without AOP would be something along the following lines. Create a decorator for the service:
public class InventoryServiceDecorator implements InventoryService{
    private static Logger logger = LoggerFactory.getLogger(InventoryServiceDecorator.class);
    private InventoryService decorated;

    @Override
    public Inventory create(Inventory inventory) {
        logger.info("before method: create");
        long start = System.nanoTime();
        Inventory inventoryCreated = decorated.create(inventory);
        long end = System.nanoTime();
        logger.info(String.format("%s took %d ns", "create", (end-start)) );
        return inventoryCreated;
    }
This decorator essentially intercepts the call, records the time taken by the method, and delegates the call to the decorated object.

Imagine doing this for all the methods and all the services in the project. This is the scenario that AOP addresses; it provides a way for cross-cutting concerns (recording the time taken by service method calls, for example) to be modularized - to be packaged up separately without polluting the core of the classes.

To end the session, a different way to implement the decorator would be using the dynamic proxy feature of Java:


import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AuditProxy implements java.lang.reflect.InvocationHandler {
    
    private static Logger logger = LoggerFactory.getLogger(AuditProxy.class);
    private Object obj;

    public static Object newInstance(Object obj) {
        return java.lang.reflect.Proxy.newProxyInstance(obj.getClass().getClassLoader(), obj
                .getClass().getInterfaces(), new AuditProxy(obj));
    }

    private AuditProxy(Object obj) {
        this.obj = obj;
    }

    public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
        Object result;
        try {
            logger.info("before method " + m.getName());
            long start = System.nanoTime();
            result = m.invoke(obj, args);
            long end = System.nanoTime();
            logger.info(String.format("%s took %d ns", m.getName(), (end-start)) );
        } catch (InvocationTargetException e) {
            throw e.getTargetException();
        } catch (Exception e) {
            throw new RuntimeException("unexpected invocation exception: " + e.getMessage());
        } finally {
            logger.info("after method " + m.getName());
        }
        return result;
    }
}

So now, when creating an instance of InventoryService, I would create it through the AuditProxy dynamic proxy:
InventoryService inventoryService = (InventoryService)AuditProxy.newInstance(new DefaultInventoryService());

The overridden invoke method of java.lang.reflect.InvocationHandler intercepts all calls to an InventoryService created in this manner, and the cross-cutting concern of auditing the method call time is recorded there. This way the cross-cutting concern is modularized to one place (AuditProxy), but it still needs to be explicitly known by the clients of InventoryService at the point of instantiation.
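The same idea can be tried out in isolation; the following is a minimal, self-contained sketch of a timing proxy - the Greeter interface and TimingProxyDemo class are hypothetical names used purely for illustration:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class TimingProxyDemo {

    // hypothetical interface standing in for InventoryService
    public interface Greeter {
        String greet(String name);
    }

    // wraps any Greeter in a dynamic proxy that logs per-call timing
    public static Greeter timed(final Greeter target) {
        return (Greeter) Proxy.newProxyInstance(
                target.getClass().getClassLoader(),
                new Class<?>[] { Greeter.class },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
                        long start = System.nanoTime();
                        Object result = m.invoke(target, args);
                        long end = System.nanoTime();
                        System.out.println(m.getName() + " took " + (end - start) + " ns");
                        return result;
                    }
                });
    }

    public static void main(String[] args) {
        Greeter greeter = timed(new Greeter() {
            public String greet(String name) {
                return "Hello, " + name;
            }
        });
        System.out.println(greeter.greet("world")); // prints "Hello, world"
    }
}
```

The caller still goes through the proxy knowingly, which is exactly the limitation the Spring AOP sessions below remove.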

In the next few sessions, I will demonstrate how this can be more cleanly accomplished using Spring AOP, Spring AOP with @AspectJ, AspectJ, @AspectJ with compile time weaving and finish it off with a comprehensive example.

Links to all sessions on AOP:
AOP Session 1 - Decorator Pattern using Java Dynamic Proxies
AOP Session 2 - Using Spring AOP - xml based configuration
AOP Session 3 - Using Spring AOP - @AspectJ based configuration - with/without compile time weaving
AOP Session 4 - Native AspectJ with compile time weaving
AOP Session 5 - Comprehensive Example

Tuesday, July 26, 2011

"n" slots - continued..

A different Java implementation - closer to a good functional implementation, but not quite there:

public List<String> fillNSlots(int n){
        return fillNSlotsWithPrefix(0, n, "");
        
    }
    
    private List<String> fillNSlotsWithPrefix(int counter, int  n, String prefix){
        if (counter==n-1){
            return new ArrayList<String>(Arrays.asList(new String[]{prefix + "0", prefix + "1"})); 
        }else{
            List<String> result1 = fillNSlotsWithPrefix(counter+1, n , prefix + "0");
            result1.addAll(fillNSlotsWithPrefix(counter+1, n , prefix + "1"));
            return result1;
        }
    }

It is not quite there, as there is still a need to hold intermediate state in a temporary variable (result1 above), which could have been avoided had List supported an API that adds an element and returns itself, or returns a new list.
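One way to get closer in Java is a small helper that concatenates two lists and returns the result, so the recursive case collapses into a single expression - a sketch, with Slots and concat as illustrative names:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Slots {

    // returns a new list holding the elements of both arguments,
    // leaving the inputs untouched
    private static List<String> concat(List<String> a, List<String> b) {
        List<String> result = new ArrayList<String>(a);
        result.addAll(b);
        return result;
    }

    public static List<String> fillNSlots(int n) {
        return fillNSlotsWithPrefix(0, n, "");
    }

    private static List<String> fillNSlotsWithPrefix(int counter, int n, String prefix) {
        if (counter == n - 1) {
            return Arrays.asList(prefix + "0", prefix + "1");
        }
        // a single expression: no mutable temporary visible to the caller
        return concat(fillNSlotsWithPrefix(counter + 1, n, prefix + "0"),
                      fillNSlotsWithPrefix(counter + 1, n, prefix + "1"));
    }

    public static void main(String[] args) {
        System.out.println(fillNSlots(2)); // [00, 01, 10, 11]
    }
}
```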

The following is an equivalent implementation in scala:

def fillNSlots(n:Int):List[String] = {
        fillNSlotsWithPrefix(0,n,"")
    }

    def fillNSlotsWithPrefix(counter:Int, n: Int, prefix:String): List[String] = {
        if (counter==(n-1))
            (prefix+"0")::(prefix+"1")::List()
        else
            fillNSlotsWithPrefix(counter+1, n, prefix+"0"):::fillNSlotsWithPrefix(counter+1, n, prefix+"1")
    }

Sunday, July 24, 2011

"n" slots - different ways of filling it

The problem is simple - Given "n" slots, find all possible different configurations to fill the slots:
For "2" slots, the possible configurations are - [00, 01, 10, 11]
For "3" slots, the possible configurations are - [000, 001, 010, 011, 100, 101, 110, 111]
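As an aside, the configurations are simply the binary representations of 0 through 2^n - 1, so an iterative sketch is also possible (BinarySlots is an illustrative name, separate from the recursive solutions below):

```java
import java.util.ArrayList;
import java.util.List;

public class BinarySlots {

    // enumerates 0 .. 2^n - 1 and renders each number as an
    // n-character binary string, most significant bit first
    public static List<String> fillNSlots(int n) {
        List<String> result = new ArrayList<String>();
        for (int i = 0; i < (1 << n); i++) {
            StringBuilder sb = new StringBuilder();
            for (int bit = n - 1; bit >= 0; bit--) {
                sb.append((i >> bit) & 1);
            }
            result.add(sb.toString());
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fillNSlots(2)); // [00, 01, 10, 11]
    }
}
```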

Here is a test for any solution:

@Test
    public void test2Slots() {
        String[] slotsAnswer = {"00","01", "10", "11"};
        assertThat(fillNSlots(2), hasItems(slotsAnswer));
    }

    @Test
    public void test3Slots() {
        String[] slotsAnswer = {"000", "001", "010", "011", "100", "101", "110", "111"};
        assertThat(fillNSlots(3), hasItems(slotsAnswer));
    }

    @Test
    public void test5Slots() {
        String[] slotsAnswer = {"00000", "00001", "00010", "00011", "00100", "00101", "00110", "00111", "01000", "01001", "01010", "01011", "01100", "01101", "01110", "01111", "10000", "10001", "10010", "10011", "10100", "10101", "10110", "10111", "11000", "11001", "11010", "11011", "11100", "11101", "11110", "11111"};
        assertThat(fillNSlots(5), hasItems(slotsAnswer));
    }

I have a naive solution to start with:
public List<String> fillNSlots(int n){
        List<String> result = new ArrayList<String>();
        fillNSlotsWithPrefix(result, n, "");
        return result;
    }
    
    private void fillNSlotsWithPrefix(List<String> result, int n, String prefix){
        if (n==0){
            result.add(prefix);
        }else{
            fillNSlotsWithPrefix(result, n-1, prefix + "0");
            fillNSlotsWithPrefix(result, n-1, prefix + "1");
        }
    }
This solution works; however, state (the result ArrayList) is being passed along with each recursive call to fillNSlotsWithPrefix. A purely functional method would not have this side effect. I will fix this implementation to be more functional over the course of next week.

Monday, June 20, 2011

Writing Integration tests for Spring WS endpoints

Writing integration tests for Spring-WS endpoints is easy, based on this resource from the Spring-WS reference site. Spring-WS provides a MockWebServiceClient class to test Spring-WS endpoints.

My endpoint has the following signature:


@Endpoint
public class GetMemberDetailsEndpoint {

 @Resource private MemberManager memberManager;

 @PayloadRoot(namespace = "http://bk.org/memberservice/", localPart = "MemberDetailsRequest")
 @ResponsePayload
 public MemberDetailsResponse getMemberDetails(@RequestPayload MemberDetailsRequest request) throws Exception {
  MemberDetail memberDetail = memberManager.findByMemberId(request.getId());
                ......
 }

My endpoint has a memberManager bean dependency; I mock this using EasyMock first:
<bean name="memberManager" class="org.easymock.EasyMock" factory-method="createMock">
  <constructor-arg value="org.bk.memberservice.service.MemberManager"/>
 </bean>

This way Spring will wire it into the endpoint when starting up the bean factory. Next, record the EasyMock with appropriate expectations:
MemberDetail memberDetail = new MemberDetail("john doe", "111-111-1111", "City", "State");
        memberDetail.setId(1L);
        expect(memberManager.findByMemberId(1L)).andReturn(memberDetail);
        replay(memberManager);

Initialize the MockWebServiceClient and set up the test:
mockClient = MockWebServiceClient.createClient(applicationContext);
        Source requestPayload = new StringSource(
                "<mem:MemberDetailsRequest xmlns:mem=\"http://bk.org/memberservice/\">"
                        + "<mem:id>1</mem:id>" 
                        + "</mem:MemberDetailsRequest>");
        Source responsePayload = new StringSource(
                "<ns3:MemberDetailsResponse xmlns:ns3=\"http://bk.org/memberservice/\">"
          + "<memberDetail>" 
          + "<id>1</id>"
          + "<name>john doe</name>"
          + "<phone>111-111-1111</phone>"
          + "<city>City</city>"
          + "<state>State</state>"
          + "</memberDetail>"
      + "</ns3:MemberDetailsResponse>");

        mockClient.sendRequest(withPayload(requestPayload)).andExpect(payload(responsePayload));
        verify(this.memberManager);
This completes the test; MockWebServiceClient takes care of packaging up the raw XML request, dispatching it to the appropriate WS endpoint, getting the response, and validating it. An updated code sample with the integration test is available at: git://github.com/bijukunjummen/memberservice-contractfirst.git

Saturday, June 18, 2011

Supporting Spring-WS and Spring MVC integration in a project

Spring-WS and Spring MVC provide different front controller implementations as gateways to the webservice and MVC functionality respectively. The dispatcher servlet used by Spring-WS is:

org.springframework.ws.transport.http.MessageDispatcherServlet
and the one used by Spring MVC is :
org.springframework.web.servlet.DispatcherServlet
To have a combined Spring MVC and Spring-WS project, it is possible to configure these front controllers based on the URI pattern of the request, in the following way:
<servlet>
        <servlet-name>member-ws</servlet-name>
        <servlet-class>org.springframework.ws.transport.http.MessageDispatcherServlet</servlet-class>
        <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath:/META-INF/spring/applicationContext-ws.xml</param-value>
        </init-param>
    </servlet>
    
    <servlet>
        <servlet-name>member-web</servlet-name>
        <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
        <init-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>/WEB-INF/spring/webmvc-config.xml</param-value>
        </init-param>
    </servlet>    

    <servlet-mapping>
        <servlet-name>member-ws</servlet-name>
        <url-pattern>/services/*</url-pattern>
    </servlet-mapping>
    
    <servlet-mapping>
        <servlet-name>member-ws</servlet-name>
        <url-pattern>*.wsdl</url-pattern>
    </servlet-mapping>
    
    <servlet-mapping>
        <servlet-name>member-web</servlet-name>
        <url-pattern>/web/*</url-pattern>
    </servlet-mapping>    


In this specific instance, all requests to /web/* are handled by the Spring MVC DispatcherServlet, whereas all requests to /services/* are handled by the Spring-WS MessageDispatcherServlet. Further, each dispatcher servlet is configured with its own Spring configuration file: the one for Spring MVC loads up the controllers, and the one for Spring-WS loads up the webservice endpoints.

I am not sure if this is an optimal configuration, but it works for me in this project, available at: git://github.com/bijukunjummen/memberservice-contractfirst.git


An alternate way to hook up DispatcherServlet to handle Spring-WS requests is described here: http://static.springsource.org/spring-ws/sites/2.0/reference/html/server.html#d4e884

Sunday, June 12, 2011

IntelliJ IDEA for Scala/SBT projects

I have found it easiest to use IntelliJ IDEA with the Scala plugin for learning Scala. This is the way I normally spin up a Scala project:

1. Use sbt to first create a shell of a project:
D:\learn\shell-project>sbt

D:\learn\shell-project>set SCRIPT_DIR=C:\util\sbt\

D:\learn\shell-project>java -Xmx512M -jar "C:\util\sbt\sbt-launch-0.7.5.jar"
Project does not exist, create new project? (y/N/s) y
Name: shellproject
Organization: org.bk
......

2. Add eclipsify plugin - this is done by creating a sbt ProjectDefinition under project/plugins folder:
import sbt._

 class MySbtProjectPlugins(info: ProjectInfo) extends PluginDefinition(info) {
       lazy val eclipse = "de.element34" % "sbt-eclipsify" % "0.7.0"
 }


3. Add scalatest to the dependency - this is done by creating a file of the following type under project/build folder:

import sbt._
import de.element34.sbteclipsify._

class MySbtProject(info: ProjectInfo) extends DefaultProject(info) with Eclipsify {
    override def libraryDependencies = Set(
       "org.scalatest" % "scalatest" % "1.3" % "test->default"
    ) ++ super.libraryDependencies
}

4. reload sbt(run "reload" in the sbt console)

5. A new action "eclipse" should now be available. Run "eclipse"

6. That's it. Now a new project should be available for IntelliJ IDEA to import as a project.

Monday, June 6, 2011

Lucene Search on Numeric Long field

Something new to me - I had previously enabled Lucene search using pure text fields, but stumbled recently when trying to search on a Long field:
IndexWriterConfig indexConfig = new IndexWriterConfig(Version.LUCENE_30,  new StandardAnalyzer(Version.LUCENE_30));
IndexWriter indexWriter = new IndexWriter(directory, indexConfig );

Document doc = new Document();
doc.add(new NumericField("id", Store.YES, true).setLongValue(123L));
indexWriter.addDocument(doc);
indexWriter.close();
and to search on this field:
IndexSearcher is = new IndexSearcher(dir);
Query query = new TermQuery(new Term("id", NumericUtils.longToPrefixCoded(123L)));
TopDocs hits = is.search(query, 10);

Tuesday, May 31, 2011

Tower of Hanoi in Scala

Had a little fun with scala after 2 months, an implementation of Tower of Hanoi:
object Hanoi{
    def move(n:Int, fromTower:String, toTower:String, usingTower:String):Unit = {
        if (n==0) return
        move(n-1, fromTower, usingTower, toTower);
        println("Moving from " + fromTower + " to " + toTower)
        move(n-1, usingTower, toTower, fromTower);
    }
}
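The same recursion can be sketched in Java, collecting the moves into a list instead of printing them (a hypothetical variant for illustration); moving n disks always takes 2^n - 1 moves:

```java
import java.util.ArrayList;
import java.util.List;

public class Hanoi {

    // returns the sequence of moves to shift n disks from "from" to "to",
    // using "via" as the spare peg - same structure as the Scala version
    public static List<String> moves(int n, String from, String to, String via) {
        List<String> result = new ArrayList<String>();
        if (n == 0) {
            return result;
        }
        result.addAll(moves(n - 1, from, via, to));
        result.add("Moving from " + from + " to " + to);
        result.addAll(moves(n - 1, via, to, from));
        return result;
    }

    public static void main(String[] args) {
        // 2^3 - 1 = 7 moves for three disks
        System.out.println(moves(3, "A", "C", "B").size()); // prints 7
    }
}
```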

Thursday, May 26, 2011

Drools session at Indianapolis Java User Group

I attended a JBoss Drools session at the Indianapolis JUG yesterday evening. The session was presented by Ray Ploski, a Principal Solution Architect with Red Hat.
Ray went over the breadth of product offerings within the Drools Umbrella -

  • Drools Expert(the Rule engine)
  • Drools Guvnor(Rules Hosting/Management)
  • Drools Fusion(Complex Event Processing) 
  • Drools Planner(Planning)
  • Drools Flow(Process Flow)
All these product offerings are together referred to as the "Drools Business Logic Integration Platform". 
I had gone into the meeting with the mistaken assumption that Drools is just a rule engine and came out a little wiser. 

Some notes that I have from the meeting are the following:
  • Ray has hosted the presentation, samples at this location - http://bit.ly/cjug-drools-2011
  • Drools Flow is being merged into jBPM5 - with support for Graphical process flows tightly integrated with Drools Expert.
  • Rules and other artifacts can be versioned within Guvnor - internally it uses JCR as the versioning API, with Jackrabbit as the implementation
  • Java code can implement a pull based model to fetch rules from Guvnor - at a specified schedule, pull in the latest rules, thus reflecting rule changes without needing to restart the application
  • Drools Expert uses an MVEL dialect to express the condition (LHS) part of a rule
  • One of the largest Drools users has a million and a half rules defined!
PS: E-gineering hosted this meeting at their great new office location.

Wednesday, May 25, 2011

java.lang.ClassCastException: org.apache.cxf.transport.servlet.CXFServlet cannot be cast to javax.servlet.Servlet

I was getting this exception:
java.lang.ClassCastException: org.apache.cxf.transport.servlet.CXFServlet cannot be cast to javax.servlet.Servlet
        at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1116)
        at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:993)
        at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4350)
        at org.apache.catalina.core.StandardContext.start(StandardContext.java:4659)
        at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
        at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
when trying to start up Tomcat using maven command -
mvn tomcat:run
on this application -
git://github.com/bijukunjummen/memberservice-codefirst.git
It turns out that the issue is related to a dependency that CXF pulls in, which conflicts with Tomcat's version of the Servlet class. The fix is to change the Maven dependency from:
  <dependency>
   <groupId>org.apache.cxf</groupId>
   <artifactId>apache-cxf</artifactId>
   <version>${cxf.version}</version>
   <type>pom</type>
  </dependency>
to the specific set of CXF libs:
  <dependency>
   <groupId>org.apache.cxf</groupId>
   <artifactId>cxf-rt-frontend-jaxws</artifactId>
   <version>${cxf.version}</version>
  </dependency>
  <dependency>
   <groupId>org.apache.cxf</groupId>
   <artifactId>cxf-rt-transports-http</artifactId>
   <version>${cxf.version}</version>
  </dependency>
  <dependency>
   <groupId>org.apache.cxf</groupId>
   <artifactId>cxf-rt-ws-security</artifactId>
   <version>${cxf.version}</version>
  </dependency>
  <dependency>
      <groupId>org.apache.ws.security</groupId>
      <artifactId>wss4j</artifactId>
      <version>1.6.0</version>
  </dependency> 
Now
mvn tomcat:run
should run through just fine.

Sunday, May 22, 2011

Enabling WS-Security Username Token Profile for Apache CXF service

This article describes how to create a code first webservice. I am going to extend that sample to support the WS-Security UsernameToken profile, which lets callers of the service prove their identity by providing a username and a password.

To recap the previous article, it is very simple to expose a code first webservice using Apache CXF with Spring. Apache CXF provides a custom Spring namespace to easily configure the endpoint:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:aop="http://www.springframework.org/schema/aop"
 xmlns:context="http://www.springframework.org/schema/context"
 xmlns:jaxws="http://cxf.apache.org/jaxws"
 xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
        http://cxf.apache.org/jaxws http://cxf.apache.org/schemas/jaxws.xsd">

 <import resource="classpath:META-INF/cxf/cxf.xml"/>
 <import resource="classpath:META-INF/cxf/cxf-extension-soap.xml"/>
 

 <bean id="memberendpoint" class="org.bk.memberservice.endpoint.DefaultMemberEndpoint"/>
 
 <jaxws:endpoint address="/memberservice" id="memberservicehttp" implementor="#memberendpoint" >
 </jaxws:endpoint>

</beans>

This exposes the memberendpoint bean above as a fully configured webservice.

To secure this service using usernametoken, first implement a callback for CXF to invoke to validate the credentials passed by the user:

package org.bk.memberservice.endpoint;

import java.io.IOException;

import javax.security.auth.callback.Callback;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.callback.UnsupportedCallbackException;

import org.apache.ws.security.WSPasswordCallback;

public class UsernameTokenCallback implements CallbackHandler {

 @Override
 public void handle(Callback[] callbacks) throws IOException,
   UnsupportedCallbackException {
  Callback callback = callbacks[0];
  WSPasswordCallback pc = (WSPasswordCallback) callback;
  // Retrieve and set the real password, which CXF's validator will
  // check against the digest password sent by the caller
  pc.setPassword("test");
  System.out.println("Received cred: " + pc.getIdentifier() + " : " + pc.getPassword());
  // Alternatively, validate the credentials here and throw an
  // IOException if they are not valid:
  // boolean isValid = ...;
  // if (!isValid) {
  //  throw new IOException("Bad Credentials");
  // }
 }
}

To wire this callback handler into the service endpoint, CXF uses interceptors: the webservice call passes through the configured interceptors before being handed over to the service.


<jaxws:endpoint address="/memberservice" id="memberservicehttp"
  implementor="#memberendpoint">
  <jaxws:inInterceptors>
   <bean class="org.apache.cxf.ws.security.wss4j.WSS4JInInterceptor">
    <constructor-arg>
     <map>
      <entry key="action" value="UsernameToken" />
      <entry key="passwordType" value="PasswordDigest" />
      <entry key="passwordCallbackRef">
       <ref bean="usernameTokenCallback" />
      </entry>
     </map>
    </constructor-arg>
   </bean>
  </jaxws:inInterceptors>
 </jaxws:endpoint>

That's it! The endpoint is now secured using the UsernameToken profile. To test this, bring up the endpoint and use SOAP UI to send a normal request; it will fail with a message that a WS-Security header is required - this is the UsernameToken SOAP header expected as part of the request.

Fix this by adding the username and password, selecting "digest" as the password type, and the service invocation should just work.
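For reference, the digest that SOAP UI computes (and that WSS4J verifies against the clear-text password returned by the callback) is defined by the UsernameToken profile as Base64(SHA-1(nonce + created + password)). Here is a minimal sketch of that computation using only the JDK - the class name, nonce and timestamp below are illustrative, not part of the sample project:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class PasswordDigestSketch {

    // Password_Digest = Base64(SHA-1(nonce + created + password)),
    // per the OASIS UsernameToken Profile spec
    static String passwordDigest(byte[] nonce, String created, String password) {
        try {
            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            sha1.update(nonce);
            sha1.update(created.getBytes(StandardCharsets.UTF_8));
            sha1.update(password.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(sha1.digest());
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-1 is always available
        }
    }

    public static void main(String[] args) {
        // Illustrative nonce and Created timestamp, as they would appear
        // in the wsse:UsernameToken header of the request
        byte[] nonce = "somerandomnonce!".getBytes(StandardCharsets.UTF_8);
        String created = "2011-05-22T12:00:00Z";
        System.out.println(passwordDigest(nonce, created, "test"));
    }
}
```

This is why the callback sets the clear-text password rather than comparing it itself - WSS4J recomputes this digest from the incoming nonce and timestamp and matches it against the digest in the request.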
Updated Sample available at: git://github.com/bijukunjummen/memberservice-codefirst.git
Reference:
1. WS-Security reference in Wikipedia: http://en.wikipedia.org/wiki/WS-Security
2. WS-Security UsernameToken profile spec at OASIS: http://www.oasis-open.org/committees/download.php/16782/wss-v1.1-spec-os-UsernameTokenProfile.pdf
3. Apache CXF reference: http://cxf.apache.org/docs/ws-security.html
4. New changes as part of WSS4J - http://coheigea.blogspot.com/2011/02/wspasswordcallback-changes-in-wss4j-16.html


Thursday, May 19, 2011

Quick and Dirty fixtures for tests

This is a quick and dirty approach that I use to add fixtures for an entity test, say if I am testing an entity of the following type:

@Entity
@Table(name="gtdcontexts")
public class GtdContext {
    @Size(min = 1, max = 50)
    private String name;
...
In my test Spring context file, I have entries to instantiate beans of the type:
    <bean name="context1" class="org.bk.simplygtd.domain.GtdContext" p:name="context1"/>
    <bean name="context2" class="org.bk.simplygtd.domain.GtdContext" p:name="context2"/>
    <bean name="context3" class="org.bk.simplygtd.domain.GtdContext" p:name="context3"/>
    <bean name="context4" class="org.bk.simplygtd.domain.GtdContext" p:name="context4"/>
    <bean name="context5" class="org.bk.simplygtd.domain.GtdContext" p:name="context5"/>

Now in my test class, I autowire in a Map in the following way:
 
 @Autowired
 Map<String, GtdContext> gtdContextsMap;

Spring in this case autowires all the GtdContext instances into this Map, with the key being the bean name and the value the instance. Now that the map is populated, a test embedded database can be populated with these fixtures:
<jdbc:embedded-database type="H2" id="dataSource"></jdbc:embedded-database>

  for (Map.Entry<String, GtdContext> entry:gtdContextsMap.entrySet()){
   this.gtdContextDao.persist(entry.getValue());
  }

That is it! An embedded H2 database with some sample entries will be ready for the tests.
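What Spring does here can be pictured in plain Java: it collects every bean of the declared value type into the map, keyed by bean name. A rough stand-alone sketch, with GtdContext simplified to a stand-in class rather than the real JPA entity:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified stand-in for the GtdContext entity; illustrative only
class GtdContext {
    private final String name;
    GtdContext(String name) { this.name = name; }
    String getName() { return name; }
}

public class MapInjectionSketch {
    public static void main(String[] args) {
        // Roughly what Spring assembles for
        //   @Autowired Map<String, GtdContext> gtdContextsMap;
        // -- one entry per GtdContext bean, keyed by bean name
        Map<String, GtdContext> gtdContextsMap = new LinkedHashMap<>();
        for (int i = 1; i <= 5; i++) {
            gtdContextsMap.put("context" + i, new GtdContext("context" + i));
        }
        // The test then iterates the entries and persists each value;
        // here we just list them
        gtdContextsMap.forEach((beanName, ctx) ->
                System.out.println(beanName + " -> " + ctx.getName()));
    }
}
```

The nice property is that adding a sixth fixture is a one-line change to the Spring context file - no test code changes are needed.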

Saturday, May 14, 2011

Sample Contract First Service using Spring-WS 2.0

I have now updated the sample for this article to use Spring WS 2.0, which provides an annotation-based model for webservice endpoints. The amount of configuration required to define a webservice endpoint has been reduced considerably between Spring WS 1.x and Spring WS 2.0 - most of the routine boilerplate to define an endpoint has moved from configuration to annotations:
Whereas earlier the endpoint would have been defined by code with the following signature:
public class GetMemberDetailsEndpoint extends
  AbstractMarshallingPayloadEndpoint {
 private MemberManager memberManager;
 protected Object invokeInternal(Object requestObject) throws Exception {
  MemberDetailsRequest request = (MemberDetailsRequest) requestObject;
  MemberDetail memberDetail = memberManager.getMemberDetails(request
    .getId());
  MemberDetailsResponse response = new MemberDetailsResponse(memberDetail);
  return response;
 }

......
}
With Spring-WS 2.0 the endpoint can be more intuitively defined with the following signature:
@Endpoint
public class GetMemberDetailsEndpoint {

 @Autowired private MemberManager memberManager;

 @PayloadRoot(namespace = "http://bk.org/memberservice/", localPart = "MemberDetailsRequest")
 @ResponsePayload
 public MemberDetailsResponse getMemberDetails(@RequestPayload MemberDetailsRequest request) throws Exception {
  MemberDetail memberDetail = memberManager.getMemberDetails(request
    .getId());
  MemberDetailsResponse response = new MemberDetailsResponse(memberDetail);
  return response;

 }

.....

}
Updated code available at:
git://github.com/bijukunjummen/memberservice-contractfirst.git

Contract First and Code First Webservice Development

There are generally two styles for Webservice Development - Contract First and Code First.

Contract First is an approach where the developer starts from defining a contract for the webservice using a WSDL and then goes about generating/developing the codebase - typically using stacks that allow code to be generated from the wsdl.

Code First on the other hand is an approach where the developer starts by defining an interface, then generates the contract for the webservice using tools that allow translating an interface to wsdl.

I have used both approaches for different projects over time, but am not strongly opinionated about either. A purist approach would be to start with the contract, which I have personally advocated in the past; however, in some real world projects I have tended to go with a code first approach, for a few reasons:

  1. Good tool support is required to define a reasonably involved WSDL - Eclipse is not good enough to create a quality WSDL; I would recommend IBM RAD or XMLSpy for creating WSDLs
  2. A very deep knowledge of the XML Schema language is required to generate good constructs
  3. Development time is generally slower - creating a WSDL takes a good amount of time


The Code First approach on the other hand is quick. Let me illustrate by converting my DZone example, which is Contract First, to Code First using Apache CXF:

The contract for the service was:
<?xml version="1.0" encoding="UTF-8"?><wsdl:definitions xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:ms="http://bk.org/memberservice/" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" name="memberservice" targetNamespace="http://bk.org/memberservice/">
 <wsdl:types>
  <xsd:schema targetNamespace="http://bk.org/memberservice/" elementFormDefault="qualified">
   <xsd:complexType name="MemberDetailType">
    <xsd:sequence>
     <xsd:element name="name" type="xsd:string"/>
     <xsd:element name="phone" type="xsd:string"/>
     <xsd:element name="city" type="xsd:string"/>
     <xsd:element name="state" type="xsd:string"/>
    </xsd:sequence>
   </xsd:complexType>
   <xsd:element name="MemberDetailsRequest">
    <xsd:complexType>
     <xsd:sequence>
      <xsd:element name="id" type="xsd:string"/>
     </xsd:sequence>
    </xsd:complexType>
   </xsd:element>
   <xsd:element name="MemberDetailsResponse">
    <xsd:complexType>
     <xsd:sequence>
      <xsd:element name="memberdetail" type="ms:MemberDetailType"/>
     </xsd:sequence>
    </xsd:complexType>
   </xsd:element>
  </xsd:schema>
 </wsdl:types>
 <wsdl:message name="MemberDetailsRequest">
  <wsdl:part element="ms:MemberDetailsRequest" name="parameters"/>
 </wsdl:message>
 <wsdl:message name="MemberDetailsResponse">
  <wsdl:part element="ms:MemberDetailsResponse" name="parameters"/>
 </wsdl:message>
 <wsdl:portType name="memberservice">
  <wsdl:operation name="GetMemberDetails">
   <wsdl:input message="ms:MemberDetailsRequest"/>
   <wsdl:output message="ms:MemberDetailsResponse"/>
  </wsdl:operation>
 </wsdl:portType>
 <wsdl:binding name="memberserviceSOAP" type="ms:memberservice">
  <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/>
  <wsdl:operation name="GetMemberDetails">
   <soap:operation soapAction="http://bk.org/memberservice/GetMemberDetails"/>
   <wsdl:input>
    <soap:body use="literal"/>
   </wsdl:input>
   <wsdl:output>
    <soap:body use="literal"/>
   </wsdl:output>
  </wsdl:operation>
 </wsdl:binding>
 <wsdl:service name="memberservice">
  <wsdl:port binding="ms:memberserviceSOAP" name="memberserviceSOAP">
   <soap:address location="http://localhost:8081/memberservice/services/MemberDetailsRequest"/>
  </wsdl:port>
 </wsdl:service>
</wsdl:definitions>

Let us start by defining a Java interface that can sufficiently mimic this WSDL:
package org.bk.memberservice.endpoint;

import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebResult;
import javax.jws.WebService;

import org.bk.memberservice.message.MemberDetailsRequest;
import org.bk.memberservice.message.MemberDetailsResponse;

@WebService
public interface MemberEndpoint {
 @WebMethod(operationName = "MemberDetailsRequest")
 @WebResult(name = "MemberDetailsResponse", targetNamespace = "http://bk.org/memberservice/")
 MemberDetailsResponse getMemberDetails(
   @WebParam(name = "MemberDetailsRequest") MemberDetailsRequest memberDetailsRequest);
}

and the JAXB2 annotations on the request and response types:

package org.bk.memberservice.message;

import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlRootElement;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlRootElement(name = "MemberDetailsRequest", namespace="http://bk.org/memberservice/")
public class MemberDetailsRequest {

 public MemberDetailsRequest() {
 }

 public MemberDetailsRequest(String id) {
  this.id = id;
 }

 private String id;

 public String getId() {
  return id;
 }

 public void setId(String id) {
  this.id = id;
 }

}

package org.bk.memberservice.message;

import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

import org.bk.memberservice.types.MemberDetail;

@XmlAccessorType(XmlAccessType.FIELD)
@XmlRootElement(name = "MemberDetailsResponse", namespace="http://bk.org/memberservice/")
public class MemberDetailsResponse {

 public MemberDetailsResponse() {
 }

 @XmlElement(name="memberdetail", namespace="http://bk.org/memberservice/")
 private MemberDetail memberDetail;

 public MemberDetailsResponse(MemberDetail memberDetail) {
  this.memberDetail = memberDetail;
 }

 public MemberDetail getMemberDetail() {
  return memberDetail;
 }

 public void setMemberDetail(MemberDetail memberDetail) {
  this.memberDetail = memberDetail;
 }

}
This generates a WSDL fairly close to the manually created one, but not quite there - there does not seem to be a good way of controlling the namespace of the operation name, or the wrappers around the request and response XML structures.

So considering these factors, a rule that I have used is:

  • If a service is consumed only by a known internal client (say, for cases where the same team is both the producer and the consumer), then go for code first, for the speed of development
  • If a service is expected to be more widely used, then go for contract first, with careful WSDL design

Sample code used here is available at :
git://github.com/bijukunjummen/memberservice-codefirst.git

Thursday, May 5, 2011

Layered vs Big Ball of Mud


  • Separation of concerns - each layer encapsulates distinct functions, e.g. Presentation, Business Logic and Data Access in a traditional three-tiered architecture
  • If multiple presentation technologies need to be supported - say mobile and web based views - only the presentation tier is affected
  • If a different kind of data access pattern needs to be supported - say Memcache based caching - only the data tier is affected
  • Allows reuse - lower layers can be re-used by the layers above them
  • Testability - each layer can be tested independently
  • Maintainability - if the view breaks, look in the presentation tier; if the data seems wrong, look in the data access tier; if the business logic seems wrong, look in the business tier
  • Fosters developer specialization - e.g. UI developers focusing on the presentation layer
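The separation described above can be sketched with plain Java types - each tier depends only on the tier below it, so a tier can be swapped out (a mobile view for a web view) or stubbed in tests. All names here are illustrative:

```java
// Data access tier: the only place that knows how data is fetched
interface MemberDao {
    String findName(int id);
}

// Business tier: reusable by any presentation technology above it
class MemberService {
    private final MemberDao dao;
    MemberService(MemberDao dao) { this.dao = dao; }
    String greeting(int id) { return "Hello, " + dao.findName(id); }
}

// Presentation tier: a MobileView could reuse the same service unchanged
class WebView {
    private final MemberService service;
    WebView(MemberService service) { this.service = service; }
    String render(int id) { return "<p>" + service.greeting(id) + "</p>"; }
}

public class LayersDemo {
    public static void main(String[] args) {
        // A stubbed dao makes the upper layers testable in isolation
        MemberDao stubDao = id -> "bob";
        WebView view = new WebView(new MemberService(stubDao));
        System.out.println(view.render(1));
    }
}
```

In a big ball of mud, by contrast, the rendering, business rules and data access would be tangled in one place, and none of the swapping or stubbing above would be possible.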