Saturday, September 27, 2008

Building enterprise applications with Maven to use optional packages


Optional packages, introduced in Java 1.3, added an expanded set of jar-file manifest attributes that let an application jar declare its dependencies on other jars. Maven can generate a jar and build this manifest automatically for you, so you can leverage optional packages deployed in your environment (whether on the classpath or as jars deployed in an application container). If you are building an application EAR that depends on a slew of 3rd-party libraries, you can deploy those libraries as optional packages and cut down on the size of your EAR file across many applications. Reusing these 3rd-party jars across multiple web or enterprise application deployments becomes really handy when you want all those applications to upgrade together to a different version of a shared library. So how do you tell Maven to do this? Here's an example from a web application POM file. Within the build section, specify the following snippet -

<plugins>
  <plugin>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
      <archive>
        <manifest>
          <addExtensions>true</addExtensions>
        </manifest>
      </archive>
    </configuration>
  </plugin>
</plugins>


However, while building this WAR file, the dependencies will still get pulled in unless you specify another attribute on them. For example, if you have a dependency on AspectJ and AspectJ is deployed to your container, say WebLogic 10, as an optional package, then in the POM for your WAR file you would specify the following -

<dependency>
  <groupId>org.aspectj</groupId>
  <artifactId>aspectjrt</artifactId>
  <version>1.6.0</version>
  <!-- <scope>provided</scope> -->
  <optional>true</optional>
</dependency>

The element that matters in this snippet is "optional" - its presence keeps the jar from being packaged inside the WAR file. But omitting the jar is not the only thing that is important here; the "scope" element matters too. Maven follows some specific rules when generating the manifest based on the "scope" and "optional" tags, and you need to understand them. The Maven docs include a table that clearly explains the relationship between these two attributes. In the example above, while the optional element will exclude the jar from being added to the WAR file, setting the scope to "provided" would also cause the jar's extension name to be omitted from the MANIFEST - something you probably do not want if you are looking to leverage deployed optional packages.

A jar file deployed as an optional package essentially needs three attributes in its manifest file -

Extension-Name: aspectj
Specification-Version: 1.6.0
Implementation-Version: 1.6.0

The manifest Maven builds will put the extension name of the dependent jar in the "Extension-List" attribute of your WAR file's manifest, along with the implementation version. You have to make sure that both of these attributes match those of the deployed optional package (in this example, the aspectj jar); otherwise, you'll probably get a ClassNotFoundException. As long as the extension name and implementation version of the optional package match what Maven generates in the manifest of the dependent WAR, everything will deploy and work without a problem. Take a look at this article for WebLogic to understand the constraints on naming for optional packages.
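For illustration, the kind of entries involved (per the JAR file specification's optional-package mechanism) look roughly like the following in the WAR's manifest - treat the exact attribute names and values here as an assumption based on the aspectj example, and verify them against what your build actually produces:

Extension-List: aspectjrt
aspectjrt-Extension-Name: aspectj
aspectjrt-Specification-Version: 1.6.0
aspectjrt-Implementation-Version: 1.6.0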

While the above is tedious at first, it becomes a very efficient way of managing application dependencies in enterprise production systems once you get everything working and lined up. I actually created a Swing tool that takes an EAR file containing all the dependencies and spits the jars back out converted into optional packages that Maven understands. All you really have to do is take the version from the name of each jar and put it into the Specification-Version (as long as it contains only digits and dots, e.g. "1.1.2") and Implementation-Version attributes of the jar's manifest.
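A minimal sketch of that conversion step for a single jar, assuming the extension name and version have already been parsed out of the jar's file name (the class and method names here are hypothetical, not my actual tool):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Enumeration;
import java.util.jar.Attributes;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class OptionalPackageConverter
{
    // Copies a jar, adding the manifest attributes an optional package needs.
    // extensionName and version are assumed to come from the jar's file name.
    public static void convert(File sourceJar, File targetJar,
                               String extensionName, String version) throws IOException
    {
        JarFile in = new JarFile(sourceJar);
        try {
            Manifest manifest = in.getManifest();
            if (manifest == null) {
                manifest = new Manifest();
                manifest.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
            }
            Attributes main = manifest.getMainAttributes();
            main.put(Attributes.Name.EXTENSION_NAME, extensionName);
            main.put(Attributes.Name.SPECIFICATION_VERSION, version);
            main.put(Attributes.Name.IMPLEMENTATION_VERSION, version);

            JarOutputStream out = new JarOutputStream(new FileOutputStream(targetJar), manifest);
            try {
                byte[] buffer = new byte[8192];
                for (Enumeration<JarEntry> entries = in.entries(); entries.hasMoreElements();) {
                    JarEntry entry = entries.nextElement();
                    if ("META-INF/MANIFEST.MF".equalsIgnoreCase(entry.getName())) {
                        continue; // the updated manifest was already written by the constructor
                    }
                    out.putNextEntry(new JarEntry(entry.getName()));
                    InputStream is = in.getInputStream(entry);
                    for (int read; (read = is.read(buffer)) != -1;) {
                        out.write(buffer, 0, read);
                    }
                    is.close();
                    out.closeEntry();
                }
            } finally {
                out.close();
            }
        } finally {
            in.close();
        }
    }
}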

Though your initial effort to get everything set up will take a little bit of time, it will be well worth it. So give this operational efficiency a try.....

Thursday, August 7, 2008

Using Ubuntu for Java development in the enterprise


Ubuntu's been all the rage over the last 2-3 years, and it definitely seems to be catching on with a lot of folks as a possible OS of choice, given the widespread dissatisfaction with MS Vista. I decided to really give Ubuntu a spin to see whether I could develop applications and do my daily work in that environment - it's one thing to say Ubuntu is an alternative to Windows, it's another to actually get the tools for the job working on this OS.

The first thing I did was install Ubuntu using Wubi from within Windows XP. This is a really cool piece of software: to Windows, Ubuntu looks like just another installed program, yet it adds an entry to the Windows boot menu so Ubuntu can be booted when your PC is turned on. Ubuntu is installed within a file in the Windows file system (c:\ubuntu\disks\root.disk), and Linux sees this file as a real hard disk - so you're not running Ubuntu in a virtualized mode. The install didn't take too much time, about 20 minutes on my Toshiba M700.

While Ubuntu looks pretty good, the UI isn't exactly extraordinary, so I decided to jazz it up to look like Mac OS X. Take a look at this - on the left is what my desktop looks like now.

Now that the system looked spiffy, I decided to do some real Java work on it. With the Synaptic package manager I installed JDK 1.5, and I also downloaded MyEclipse 6.5. For the application server I decided to use WebLogic - one of the more common enterprise-wide application servers in use in production environments - and downloaded the Red Hat version of WebLogic 10 MP1. For the database, MySQL was the obvious choice, and Synaptic makes that an easy install.

Finally, after all that, I got a JEE 5 application deployed and running for a quick test of all components. Everything worked perfectly - in fact, WebLogic actually seems to run faster than it does on the Windows side, on the same hardware. On the whole, the OS also consumed less memory, leaving more for applications. Ubuntu seems a really viable option for Java developers, and I would definitely recommend giving it some thought, especially if your organization is a pure Java shop - lower cost to maintain, and closer to your production environments. MS applications like Office have substitutes (OpenOffice, Evolution), and you can also use virtualization software such as QEMU to run these MS apps seamlessly on Ubuntu. I really do think this has the potential to take off in many organizations.

PS: While I initially posted information about a 32-bit Ubuntu install, I later went on to install the 64-bit version since my laptop has an Intel Core 2 chip. I was able to run all the software I mentioned above and generally saw some performance improvement as well. The only glitch was that two of the Mac OS X options - the global menu and the startup screens - don't work or need to be compiled for a 64-bit system. On the software side, Acrobat Reader couldn't be installed, but Ubuntu has a PDF reader preinstalled, so that's not something you'll miss.

Thursday, July 17, 2008

Eclipse and Maven integration

If you are looking to use Maven on your next project and currently use Eclipse as your IDE, then I would suggest you also take a look at MyEclipse 6.5. This new version of MyEclipse has some nifty features that amount to more than a bunch of open source plugins bundled together (as in prior versions of MyEclipse) - especially the built-in integration for Maven.


MyEclipse 6.5 is built on Eclipse 3.3 and the Web Tools Project (WTP). The Maven support I am referring to isn't available in the regular Maven IDE plugin. What MyEclipse provides is the ability to create EJB and web projects structured around Maven, and these projects can be deployed to your application server of choice from within the IDE. While this makes life easy for a developer, one still needs to understand a bit more about how to structure these EJB and web projects to create the necessary enterprise artifact, or EAR file.
Using the MyEclipse project options, I went ahead and created two projects - FrmkConsoleServices, an EJB project, and FrmkWebConsole, a web project. Due to the lack of support for creating an enterprise project (essentially one that produces the EAR file), I created a third project - FrmkConsole - that brings the other two together and creates the EAR file containing the JAR and WAR. The POM for this EAR project looks like the snippet below, which I had to hand-code since MyEclipse has no capability to create such an enterprise project (a feature they need in their next version, asap):

<dependencies>
  <dependency>
    <groupId>com.mycompany.myappfrmk.console</groupId>
    <artifactId>console-svcs</artifactId>
    <type>ejb</type>
    <version>1.0</version>
  </dependency>
  <dependency>
    <groupId>com.mycompany.myappfrmk.console</groupId>
    <artifactId>web-console</artifactId>
    <type>war</type>
    <version>1.0</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-ear-plugin</artifactId>
      <version>2.3.1</version>
      <configuration>
        <generateApplicationXml>true</generateApplicationXml>
        <generatedDescriptorLocation>${basedir}/target</generatedDescriptorLocation>
        <finalName>FrmkConsole</finalName>
      </configuration>
    </plugin>
  </plugins>
</build>
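If you need to control how the modules end up in the generated application.xml (for instance the web module's context root), the maven-ear-plugin configuration can also take a modules section; a rough sketch, where the context root value is only an example:

<configuration>
  ...
  <modules>
    <webModule>
      <groupId>com.mycompany.myappfrmk.console</groupId>
      <artifactId>web-console</artifactId>
      <contextRoot>/console</contextRoot>
    </webModule>
  </modules>
</configuration>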

You will also need to control the order in which the various project artifacts are built - the WAR and JAR have to be built before the EAR - and that's the reason for creating a fourth project, FrmkConsoleRoot. It simply lists the modules that make up the project, and its POM really only contains the following, besides declaring its packaging to be of type "pom":

<modules>
  <module>../FrmkConsoleServices</module>
  <module>../FrmkWebConsole</module>
  <module>../FrmkConsole</module>
</modules>

Overall though, I was not particularly happy that I had to create four top-level projects just to produce an EAR file. This approach may work for groups of developers working separately on the web and service layers; however, the ideal structure here would be one top-level project with multiple modules under it - in this case three (the EAR, the WAR, the JAR). If you don't want to use MyEclipse, then I would go with the m2eclipse plugin. It is pretty good and lets you create projects from the available archetypes. For example, an archetype that helps in building an EAR file is "maven-archetype-j2ee-simple". Using m2eclipse, you can select this archetype by creating a new Maven project and choosing the "internal" catalog that holds these predefined template projects. You essentially want your project structured according to this archetype. At the command line, you can generate it using the following command:
mvn archetype:create -DgroupId=com.mycompany.myapp -DartifactId=my-webapp 
-DarchetypeArtifactId=maven-archetype-j2ee-simple -DarchetypeVersion=1.0

Well, one thing I can say after all that - it's not easy to find this information, but at least it's getting there. Once you have your first project set up, subsequent ones should be easy with Maven. Those of you who are used to Ant may find all this a little cumbersome initially, but Maven's archetypes are exactly what convention over configuration is all about. Give Maven a try, it's not all bad....:).

Thursday, July 10, 2008

Modeling Services using a tool


If you are looking to do SOA modeling, I would recommend StarUML. I think this is an awesome piece of open source software for UML-based modeling, comparable to the likes of IBM's Rational tools for object modeling with UML notation. What I found most interesting, though, was the flexibility it offered to tweak the tool and add another modeling profile - a service modeling profile - which I thought would be a good start in creating a "Service architecture map" in line with the methodology Steve Jones (CTO, Capgemini) talks about for service discovery. That methodology has been contributed to OASIS, and you can read about it here.

If you go with that approach, the template I created for StarUML should help you create the Service architecture map for your organization. The template can be downloaded here (Services Modeling template for StarUML). Unzip the zip file into the modules directory of your StarUML install. When you start StarUML, you should see a new option in the "New Project By Approach" dialog called "Service Modeling Approach". An example diagram using this template would look like this:


You can further extend this template if you wish - the documentation for StarUML is here.

Monday, June 30, 2008

Using custom spring annotations in the Web and App tier

I blogged earlier about why I thought Spring 2.5 annotations were really cool and well implemented. In that article I mentioned how one could use one's own custom annotations with Spring 2.5 to inject dependencies into web application classes as well as into the EJBs in the application tier. I went ahead and tried this out for the web and app tiers to see how challenging it really was. It required the web classes carrying my custom annotations to be instantiated through Spring. The Struts2 framework, which is gaining a lot of momentum these days (check out Matt Raible's stats on this here), integrates well with Spring, and though the documentation is not thorough, the link here gives a fairly good idea of how to go about Spring-enabling Struts2. The gist is that the object factory for all Struts actions becomes the StrutsSpringObjectFactory. The configuration of my web.xml really needed only this listener:

<listener>
  <listener-class>
    org.springframework.web.context.ContextLoaderListener
  </listener-class>
</listener>

I also had an applicationContext.xml (under WEB-INF) that scanned for my own custom annotations, which allowed my action classes to use them. With Struts2's native annotations you don't need to worry about the Struts XML configuration for result-to-JSP mappings either - it's all there in your action class. I even had advice around the actions to capture timing stats, implementing "separation of concerns" to the hilt!! Well, that took care of the web tier.
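A minimal sketch of what such an applicationContext.xml might look like - the base-package value is a made-up placeholder for wherever your annotated classes live:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context-2.5.xsd">

  <!-- pick up classes carrying Spring (or custom) annotations -->
  <context:component-scan base-package="com.mycompany.myapp.web"/>

</beans>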

My app tier was EJB 3.0 based, and I found that Spring ships an interceptor that helped my cause. My session facade had to be annotated with my custom Spring interceptor, as shown here:

@Stateless(name = "HelloWorldService")
@Interceptors(MySpringBeanAutowiringInterceptor.class)
public class HelloWorldService implements HelloWorld, Serializable

The custom Spring interceptor essentially lets me use a custom annotation (my own autowiring type) in the EJB. The code for the interceptor looks like this:

public class MySpringBeanAutowiringInterceptor extends SpringBeanAutowiringInterceptor
{
    @Override
    protected void configureBeanPostProcessor(AutowiredAnnotationBeanPostProcessor processor, Object target) {
        processor.setAutowiredAnnotationType(CustomAutowiring.class);
    }
}
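The CustomAutowiring type referenced above is just a plain annotation; a minimal sketch of it (the name matches the interceptor snippet, the retention and targets are what AutowiredAnnotationBeanPostProcessor expects, everything else is an assumption):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marks fields, setters or constructors that Spring should autowire, in place of @Autowired.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.METHOD, ElementType.CONSTRUCTOR})
public @interface CustomAutowiring {
}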

Now came the tricky part - I had to create a beanRefContext.xml for Spring and have it load up another context file that sets Spring up to search for my custom annotations, like so -

<bean class="org.springframework.context.support.ClassPathXmlApplicationContext">
  <constructor-arg type="java.lang.String" value="ejbApplicationContext.xml"/>
</bean>

In the ejbApplicationContext.xml, I set up the component-scan tag to search for my annotations. This worked, though I was initially at a loss as to why I needed to specify two context files to get dependencies injected by Spring. I found the reason on the Spring forums (from Juergen Hoeller) -

Essentially, your "beanRefContext.xml" file will typically define a ClassPathXmlApplicationContext which in turns loads your actual beans. The "beanRefContext.xml" mechanism allows multiple such shared ApplicationContexts to co-exist, differentiated by bean factory locators. In the default EJB3 case, simply define one such ApplicationContext which will be picked up automatically, regardless of name.

You can also read more on this in the Spring docs here. The other thing was making sure I had the correct set of libraries in my EAR. Those libraries were:

- spring.jar
- asm-2.2.3.jar
- asm-commons-2.2.3.jar
- aspectjrt.jar
- aspectjweaver.jar
- common-annotations.jar
- commons-logging.jar
- log4j-1.2.14.jar

Overall, I found that the web-side Struts2 integration with Spring seemed much cleaner than the EJB one. At this point I would still recommend this approach, but make sure you have a little time on your hands to get it working exactly the way you want. If you are interested, I can share further information - just ask.

I will continue to tweak and research the EJB implementation - but overall, this approach still makes for an interesting application design.

Sunday, June 8, 2008

Microsoft TechEd 2008: Oslo and WCF

This was the first time I attended a Microsoft TechEd event. While the sessions were interesting, there was a lot of emphasis on WCF (Windows Communication Foundation), and understandably so, I guess - it's essentially a platform (the WCF extensions) on a platform (.NET), and I say that because of the sweeping changes that adding the WCF implementation brings to the .NET platform. At its core, WCF implements the WS-* standards for web services, yet there are very few vendors (if any others) that have implemented WS-* to this degree and with this simplicity. That right there should be a wake-up call for Java folks. I was pretty impressed with what I saw of WCF and look forward to playing with it.

Microsoft's codename "Oslo" project could also turn out to be another jolt for the Java camp. "Oslo" sounds interesting as a concept: it is essentially a broad set of innovations touching many aspects of the application lifecycle, including languages, development tools, integration, application management, and more. The whole concept of "Oslo" is based around models. One might jump to the conclusion that this is Microsoft's MDD (Model Driven Design) methodology, but it's not really that. "Oslo" consists of three parts: a repository, a lifecycle manager, and a schema language. The repository maintains the various model artifacts, so models in the case of "Oslo" are anything in IT that has a representation through the schema and is maintained in the repository (e.g. applications, computers, processes, services, SLAs). The schema extensions help formalize definitions of the basic entities (infrastructure- and business-based) in a piece of software's lifecycle. What is really cool is that the lifecycle manager uses the repository and the schema extensions to manage the entities they represent. So, for example, if a component is to be deployed on a certain platform/environment across multiple machines, the lifecycle manager will know about this profile and deploy the component accordingly onto the defined platform/environment.

All this occurs through sophisticated visual integration with tools like Visual Studio, Visio, and Oslo's own visual editor - a way to surface the same information in the tools that people in different roles (analyst, developer, etc.) typically use. So Oslo really looks to me like a tool for the whole software development lifecycle - modeling a component/service/process, building it, and then moving and deploying it to an environment.

The key here is that there are so many bits and pieces to this colossus that only a company like Microsoft could pull the feat off - the open source world will probably have a hard time replicating such an effort and collaborating effectively. That would invariably give Microsoft a competitive edge. Another scary thought for the Java folks.

From what I saw at TechEd, Microsoft has some interesting technologies with a lot of potential. C# as a language is evolving faster and better than Java, and the same can be said of F# compared with something like Groovy on the Java side. In the services world too, WCF's ease of use and WS-* richness place Microsoft very well in that space. "Oslo" is still a mystery, but if it comes close to what Microsoft is talking about, I think the Java space could find itself in troubled waters very soon :(.

Monday, May 12, 2008

A Single Sign-on implementation

If you are looking to implement Single Sign-on, then a worthwhile library to look at is OpenSAML 2.0. The 2.0 library is well written and much more intuitive to use than its predecessor, OpenSAML 1.0. I recently implemented a SAML 1.1 solution using the 2.0 library on WebLogic 8.1. The library targets the Java 1.5 platform by default, but using Retroweaver one can run it on Java 1.4 (since WebLogic 8.1 does not work with Java 1.5). The SAML Browser/Artifact profile is pretty easy to understand and implement with the library, and good ol' Wikipedia gives an easy explanation of the steps needed to get the communication going. However, I would recommend reading through the OASIS SAMLBind document to get a better understanding of the implementation details.

I used XFire (v1.2.6) to implement the Artifact Resolution Service at the IdP, which the service provider invokes once it receives the artifact posted to it. Since the OpenSAML library works with the raw XML document, one needs to configure XFire to prevent marshalling of the incoming XML into Java POJOs. XFire has a cool way of working with raw XML documents, using a binding called Message Binding. So the XFire service interface looks like:
public XMLStreamReader invoke(Document samlRequest) throws Exception
and the OpenSAML library can then work with the DOM document using the XMLObjectBuilderFactory request factory.

A couple of quirks here - when returning the response, I had to return an object of type XMLStreamReader, and I used W3CDOMStreamReader to convert to that type. Also, to get SOAP faults reported correctly, the service interface had to declare the base "Exception" class in its throws clause rather than a specific exception type.
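Putting those pieces together, a rough sketch of such a service implementation might look like the following (check the package of W3CDOMStreamReader and the exact OpenSAML calls against your versions; buildResponse is a hypothetical placeholder for the artifact lookup and response marshalling):

import javax.xml.stream.XMLStreamReader;
import org.codehaus.xfire.util.stax.W3CDOMStreamReader;
import org.opensaml.xml.Configuration;
import org.opensaml.xml.XMLObject;
import org.opensaml.xml.io.Unmarshaller;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class ArtifactResolutionService
{
    // XFire hands us the raw SOAP body as a DOM Document (Message Binding).
    // Declaring "throws Exception" is what made SOAP faults come back correctly.
    public XMLStreamReader invoke(Document samlRequest) throws Exception
    {
        // Assumes DefaultBootstrap.bootstrap() has already been called at startup.
        Element root = samlRequest.getDocumentElement();

        // Let OpenSAML turn the DOM into its SAML object model.
        Unmarshaller unmarshaller =
                Configuration.getUnmarshallerFactory().getUnmarshaller(root);
        XMLObject request = unmarshaller.unmarshall(root);

        // Look up the assertion for the artifact and marshal the SAML response
        // back into a DOM Document - omitted here.
        Document samlResponse = buildResponse(request);

        // Return the response as a stream reader so XFire writes it out as-is.
        return new W3CDOMStreamReader(samlResponse.getDocumentElement());
    }

    private Document buildResponse(XMLObject request)
    {
        // Placeholder for the artifact lookup and response marshalling logic.
        throw new UnsupportedOperationException("not shown in this sketch");
    }
}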

While the implementation does add a bit of complexity through the use of XFire, it works out pretty well and allows one to use these two excellent libraries - XFire and OpenSAML 2.0 - together, leveraging the best of both.

The solution is on its way into production this week - fingers crossed....:).

Wednesday, April 30, 2008

SOA - how does one discover services??

Lately, in the SOA Yahoo group, I was involved in a discussion about what one would consider a successful SOA implementation. Given that SOA really has to be considered from a business perspective, there are very few case studies out there that provide details on the business specifics of an implementation; most only talk about the integration of various systems. Very disappointing, and frustrating for people trying to learn from them.

A successful implementation of business SOA really has to detail the process of service discovery (business services more specifically) and how the organization goes about building the Service architecture map that helps it zoom in on the services that provide strategic/tactical advantage versus those that are more utility services and could probably be bought. KPIs around these business services would then help the organization measure their benefits, in the strictest sense, in line with its strategies.

So, as I research SOA, those interesting case studies elude me - however, the best source of material I've come across has been from Steve Jones (CTO, Capgemini). In general, though, service discovery comes very close to the methodology of discovering domain objects in an enterprise-wide system. As Steve has eloquently put it - services tend to be built around functional nouns, so "Order Management" is a good example; it will have capabilities that are the verbs ("newOrder", "createDispatch", etc.) and it will have priorities for its operation, e.g. availability, response times, dispatch sizes.

It definitely feels like OO design; however, this discovery has to be top-down, with business folks heavily involved throughout. Most organizations struggle with that, but maybe there's light at the end of the tunnel as the industry gets a better understanding and feel for SOA and for what such undertakings really require of an organization.

Thursday, April 24, 2008

Liking Spring 2.5 (love annotations)

OK, I now feel comfortable adopting or suggesting the Spring framework for JEE projects. I am sure some people would say they reached that conclusion a long time ago; however, earlier I would have disagreed on a couple of counts with anyone who suggested that Spring was the best thing to happen to Java and a better alternative to the existing JEE spec. The things I despised about Spring earlier were:
  • Too many libraries to include
  • Too much in XML configuration, leading to some tricky runtime debugging errors
However, the Spring folks seem to have taken notice and have addressed these problems in Spring 2.5. So, at my first opportunity, I decided to code with Spring 2.5, leveraging it mainly for its annotation facility. A really good resource on customizing Spring's annotation features is on InfoQ. I also extended a couple of Spring classes to mimic resource-manager injection (a.k.a. autowiring), similar to JPA persistence manager injection in JEE 5. The code looks something like this:

@MyAnnotation
public class WiredBean
{
    @MyInjectionAnnotation(resourceManager = AResourceType.class)
    private Resource resource;

    public boolean isInjected()
    {
        return resource != null;
    }
}

Spring allows you to substitute your own annotations so that your code has no dependency on Spring classes. However, being able to inject (or autowire) objects based on your own annotation requires some coding - essentially the following two things:
  1. Create your own class that extends InstantiationAwareBeanPostProcessorAdapter (look at PersistenceAnnotationBeanPostProcessor for an example, or email me for mine)
  2. Your Spring XML configuration really needs only two entries (see the sketch at the end of this post):
    1. An entry to register your own AnnotationBeanPostProcessor
    2. An entry to scan for your own autowiring annotation, which lets you override the default behavior.
And you'll get code that looks like the above. Injecting your own manager objects into a JEE 5 application can be done using a Filter on the web side and interceptors on the service/EJB side. If your organization wants to avoid a direct dependency on Spring all over its JEE apps, custom annotations as described here allow for that separation.
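For reference, those two configuration entries might look roughly like this - the class and package names are placeholders, and the context namespace has to be declared in the usual Spring 2.5 way:

<!-- 1. register the custom bean post-processor that handles the custom injection annotation -->
<bean class="com.mycompany.spring.MyAnnotationBeanPostProcessor"/>

<!-- 2. scan for classes carrying the custom annotations -->
<context:component-scan base-package="com.mycompany.myapp"/>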