What exactly is the application container Apache Karaf?


Apache Karaf is a small OSGi-based runtime environment that provides a lightweight container capable of hosting various components and applications. Karaf offers numerous features familiar to those who use application containers based on Java EE or Spring.


These include support for the Java Authentication and Authorization Service (JAAS), different dependency injection frameworks such as OSGi Blueprint and Spring, support for the Java Persistence API (JPA) and the Java Transaction API (JTA), as well as clustering, monitoring, and cloud integration utilities.

This article is the first of a series covering topics on the development and operation of OSGi-based applications with Karaf.

Overview of Core Features

As already mentioned in the introduction, Karaf offers a comprehensive set of core features. These core features are structured along the areas of provisioning and deployment, logging, dynamic configuration, administration and management, and, last but not least, OSGi framework support.

Provisioning and Deployment

Karaf provides different options for application provisioning and deployment, among them the ability to deploy artifacts as feature sets. This enables the structuring of artifacts into larger deployment units, which is a great way to increase the re-use of existing functionality and thereby reduce the overall footprint of your application.

Even though feature deployment is a great way to structure OSGi-based applications, Karaf’s deployment options are not limited to this approach. The container also allows deploying so-called “Karaf archives” or, even more commonly used, web archives.

Logging

Additionally, Karaf provides broad support for logging frameworks by integrating the Pax Logging project. Pax Logging integrates many of today’s popular logging frameworks and closes the gap that emerged when the OSGi community decided to discontinue its logging service because numerous competing logging frameworks were already available and commonly used.

Dynamic Configuration

While it has been a challenge for other application containers to realize dynamic configuration capabilities, Karaf offers the opportunity to interact with configuration changes at runtime. In order to utilize dynamic configuration, an application needs to register itself (i.e. its service reference) with the management service. The management service is then able to pass any configuration change back to the application bundle.

Although this is a great way to make configuration changes without restarting the application, it is the application’s responsibility to react to these changes, e.g. by destroying and restarting an existing thread.
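
As a rough sketch, the following component implements the OSGi ManagedService interface, which is the standard way to receive such configuration callbacks; the PID "com.example.worker" and the "poolSize" property are hypothetical and only serve to illustrate the mechanism.

import java.util.Dictionary;
import java.util.Hashtable;

import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.service.cm.ConfigurationException;
import org.osgi.service.cm.ManagedService;

// Minimal sketch: the PID "com.example.worker" and the "poolSize" property are illustrative.
public class WorkerConfiguration implements ManagedService {

    private volatile int poolSize = 5; // default, used until a configuration arrives

    @Override
    public void updated(Dictionary<String, ?> properties) throws ConfigurationException {
        if (properties == null) {
            return; // no configuration available yet, keep the defaults
        }
        Object value = properties.get("poolSize");
        if (value != null) {
            // the application itself has to apply the change,
            // e.g. by tearing down and recreating its worker threads
            poolSize = Integer.parseInt(value.toString());
        }
    }

    // Registration, typically done in a BundleActivator or a blueprint definition:
    public void register(BundleContext context) {
        Dictionary<String, Object> props = new Hashtable<>();
        props.put(Constants.SERVICE_PID, "com.example.worker");
        context.registerService(ManagedService.class.getName(), this, props);
    }
}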

Administration and Management

Karaf offers several ways to administer and manage the container itself and all application artifacts executed within the container runtime. The most comprehensive administration interface is the extensible command-line interface, which is remotely accessible via a built-in SSH server. In addition to the command-line interface, Karaf also offers a comprehensive web UI that exposes all basic management features to your browser.

OSGi Framework Support

Finally, the container supports different OSGi frameworks. By default, Apache Felix is pre-configured, but Karaf also supports Eclipse Equinox, and it is theoretically possible to run it on any compliant OSGi environment.

History of Origins

The roots of Apache Karaf reach back to the kernel development of the Apache ServiceMix project. The aim of the ServiceMix subproject was to develop a simple-to-use command-line interface for the administration and management of OSGi artifacts.

The development team publicly announced in spring 2008 that it had completed the third milestone of the kernel project. It took only until September 18th, 2008, to release the first version of the kernel as GA.

Even though ServiceMix was a great home for the kernel project during its early days, the project moved under the Apache Felix umbrella in spring 2009 and was renamed “Karaf”. The developer community made that decision to increase the project’s visibility within the broader OSGi community and to grow the developer base.

Moving the project eventually paid off, and Karaf was promoted to a top-level Apache project. Since that time, the project has continued to improve the runtime environment and has added numerous features.

Summary

Given that Karaf has been around for almost eight years, has maintained a solid developer community with most founding members still on the team, and has grown a comprehensive feature set, it is fair to say that Karaf is a stable application container for hosting OSGi-based applications and more. The structure of the container supports a large variety of application areas, covering all areas of application development, and its adaptability serves as a foundation for many operational scenarios.

Converting Locales to Currencies with Java Using Spring

Successful commercial software applications have to deal with internationalization and localization so that the software can be distributed to other countries in the world, no matter whether it is a desktop application or an online service used via the Internet. Besides the translation of the language, it is also important to incorporate regional differences such as number and date formatting, or the display of the regional currency.


This article provides an introduction to Java’s Locale and Currency classes, shows how a Currency is retrieved in plain Java, and provides further detail on how to embed more advanced currency conversions in your application logic.

Java Locale

Regional characteristics are encoded in the Locale class of java.util. According to the API documentation, a Locale object

… represents a specific geographical, political, or cultural region.

Locales are commonly used to tailor information to the end user, such as date and number format representations. To achieve localization, a Locale object consists of a language, a script (e.g. Latin or Cyrillic), a country and some additional fields like variant or extension that contain additional formatting information.
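
As a quick illustration (the concrete languages, scripts and countries are arbitrary), Locale instances can be created with varying amounts of information:

import java.util.Locale;

public class LocaleExamples {

    public static void main(String[] args) {
        Locale germany = new Locale("de", "DE");   // language and country
        Locale englishOnly = new Locale("en");     // language only, no country information

        // Since Java 7, a Locale can also be assembled via the builder, including a script
        Locale serbianLatin = new Locale.Builder()
                .setLanguage("sr")
                .setScript("Latn")                 // Latin as opposed to Cyrillic
                .setRegion("RS")
                .build();

        System.out.println(germany + " / " + englishOnly + " / " + serbianLatin);
    }
}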

Java Currency

The Currency class of Java is a representation of the ISO 4217 currency code list. To retrieve a Currency object, you are supposed to call one of the static getInstance methods. One of those methods returns a Currency based on a Locale object.
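
For illustration, a small sketch of both getInstance variants (the concrete locale and currency code are arbitrary):

import java.util.Currency;
import java.util.Locale;

public class CurrencyLookup {

    public static void main(String[] args) {
        // lookup by Locale - works because Locale.US carries country information
        Currency usDollar = Currency.getInstance(Locale.US);
        System.out.println(usDollar.getCurrencyCode());    // USD
        System.out.println(usDollar.getSymbol(Locale.US)); // $

        // lookup by ISO 4217 currency code
        Currency euro = Currency.getInstance("EUR");
        System.out.println(euro.getDisplayName());         // e.g. "Euro"
    }
}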

Based on the method signature, it seems that any Locale object can be mapped to a Currency. However, it turns out that only Locale objects that fulfill a set of preconditions can actually be used to instantiate a Currency object.

Discovering the Relation between Locales and Currencies

First of all, a Currency and a Locale do not maintain a common reference that can be used to navigate between the two types. A Currency only maintains attributes that describe a currency in more detail, such as a currencyCode or a symbol. On the other side, Locale objects only hold information related to language or country settings. So there is no direct relation between both classes, other than that both are serializable.


However, since Currency offers an instantiation method that accepts a Locale, Java provides some application logic to derive a Currency from a Locale object.

Converting Locales into Currencies in Java

Locale objects can be initialized with a subset of all attributes. It is therefore possible to instantiate a Locale by passing, e.g., only the language information to the constructor. However, when we want to instantiate a Currency, the country information becomes important, since Java cannot interpret language-only Locale parameters and throws an IllegalArgumentException.
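
The following sketch reproduces that behavior with a language-only Locale:

import java.util.Currency;
import java.util.Locale;

public class LanguageOnlyLookup {

    public static void main(String[] args) {
        Locale englishOnly = new Locale("en"); // perfectly valid Locale, but no country set
        try {
            Currency currency = Currency.getInstance(englishOnly);
            System.out.println(currency.getCurrencyCode());
        } catch (IllegalArgumentException e) {
            // thrown because the locale's country is not a supported ISO 3166 country code
            System.out.println("No currency could be derived: " + e);
        }
    }
}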

The exception handling may be misleading, since we passed in a valid Locale object. Nevertheless, the application behavior makes sense, since currencies relate to a country rather than to a language. English is a language spoken in many countries such as Great Britain, the United States, Canada, New Zealand or Australia. However, each of these countries maintains its own currency. So even if the Locale object is totally valid, it does not contain enough information to derive the desired currency.

Advanced Conversions via Spring Converters

To avoid scenarios where your application does not provide detailed failure information, you might want to wrap the Java conversion logic in your own converter and provide a service for that. Since Spring 3.x, type conversion facilities have been part of spring-core. One option to wrap your conversion logic would therefore be to implement a Locale-to-Currency converter.
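
A minimal sketch of such a converter could look as follows; the class name and the error message are of course up to you:

import java.util.Currency;
import java.util.Locale;

import org.springframework.core.convert.converter.Converter;

public class LocaleToCurrencyConverter implements Converter<Locale, Currency> {

    @Override
    public Currency convert(Locale source) {
        if (source.getCountry().isEmpty()) {
            // fail with a descriptive message instead of the generic IllegalArgumentException
            throw new IllegalArgumentException("Locale '" + source
                    + "' carries no country information, so no currency can be derived");
        }
        return Currency.getInstance(source);
    }
}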

To expose the converter, you may want to implement a conversion service and add the converter to the service.
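
One way to do that, sketched here with spring-core's DefaultConversionService, is a small configuration class that registers the converter from above:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;

@Configuration
public class ConversionConfig {

    @Bean
    public ConversionService conversionService() {
        DefaultConversionService conversionService = new DefaultConversionService();
        conversionService.addConverter(new LocaleToCurrencyConverter());
        return conversionService;
    }
}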

Once the service is in place, you can simply add the conversion service to your application context and retrieve the Currency from your newly created service.
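
Retrieving the currency then boils down to a single call against the injected service; the surrounding service class below is only an illustrative consumer:

import java.util.Currency;
import java.util.Locale;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.ConversionService;
import org.springframework.stereotype.Service;

@Service
public class CurrencyService {

    @Autowired
    private ConversionService conversionService;

    public Currency currencyFor(Locale locale) {
        // delegates to the LocaleToCurrencyConverter registered above
        return conversionService.convert(locale, Currency.class);
    }
}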

The Good, the Bad and the Ugly

Overall, I would like to conclude that Java’s functionality to derive currency information from regional information is quite powerful and easy to use. However, from my perspective, the approach has two limitations.

First of all, there is definitely room to improve the built-in exception handling. It would be desirable if a failed conversion provided detailed information on the cause instead of the generic IllegalArgumentException.

The second limitation is related to the support of multiple currencies per country. Currently, Java supports only a single currency per country, which covers the reality in most countries. However, some countries, such as Serbia-Montenegro, maintain multiple legal tenders at the same time or have a secondary currency next to the country’s official one. Those more exotic cases are not built into the core framework.

Finally, if you face one of the exotic use cases and need to extend the base functionality, you might want to utilize the conversion facilities of your application framework. In this article, I utilized the conversion mechanism of Spring, but other frameworks provide equivalent support.

Java 8 Time API (JSR310), Hibernate and Spring-Data-JPA

Recently I spent some time porting one of my old applications to Java 8. Besides several nice enhancements to the overall language expressiveness (e.g., lambda expressions or the Stream API), Java 8 also offers a new Date-Time package, released under JSR 310. Since I had been working with joda-time for quite some time, I was really interested in what Java 8 had to offer.


First of all, although JSR310 is not a direct port of joda-time, it is very intuitive, and migrating from joda-time to the new java.time.* package started without any unexpected impediments. I was able to port my resources and REST services without any major problems, but when I finally reached the point of porting my persistence layer, consisting of spring-data-jpa and Hibernate with an underlying MySQL engine, I discovered that the transition is not as smooth as expected.

I had implemented my entities back in the day using java.util.Date and never bothered to touch this layer and adjust it to use joda-time or any other date library. Therefore I never ran into the issue of date and time serialization from Hibernate to the underlying persistence store. Since I had upgraded all my underlying dependencies to the latest versions, my expectation was that I could adjust my entities to use java.time.LocalDate and java.time.LocalDateTime without any further adjustments. Surprisingly – at least for me – this was not the case.

The following two sections give an overview of what I discovered and how I was able to overcome my serialization issues.

Serialization on DDL Generation

The first observation I made was during Hibernate’s DDL generation. Although some of you may point out that Hibernate’s DDL generation is by no means built to keep your production database schema up to date, and that there are better tools to manage your database schema changes, it is a great way to set up your test database. It is also a good indicator of Hibernate’s default type mapping, i.e. how the object-relational mapper (ORM) treats the LocalDate type.

Default Column Definition: LocalDate to TINYBLOB

The first change I made was adjusting the existing orderDate field and migrating it from java.util.Date to java.time.LocalDate. Since I wanted to reveal the default mapping behavior, I did not specify a column definition in my @Column JPA annotation. After the change, my @Entity object contained an id and a date field.
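
A sketch of the adjusted entity; the class name and table naming are illustrative, only the field types matter here:

import java.time.LocalDate;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class PurchaseOrder {

    @Id
    @GeneratedValue
    private Long id;

    @Column(name = "ORDER_DATE")   // no columnDefinition - rely on Hibernate's default mapping
    private LocalDate orderDate;

    // getters and setters omitted for brevity
}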

In order to generate the DDL on application startup, I enabled DDL generation on the HibernateJpaVendorAdapter. Surprisingly, I found out that Hibernate treats the LocalDate field as a binary object and translates it into a TINYBLOB. Having written quite a few SQL statements in my life, I can definitely confirm that date and time functions are quite common for extracting meaningful information from your data. Since the default mapping ends up as a binary object, it becomes necessary to translate the binary object into a date object with every statement execution that requires date and time functions. Additionally, any index over such a date may be wasted, since the binary-to-date translation may not operate directly on the index and will therefore not achieve the desired performance optimization.
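
For reference, a sketch of enabling DDL generation through Spring's HibernateJpaVendorAdapter; the entity package name is a placeholder:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.Database;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

@Configuration
public class JpaConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(true);        // let Hibernate create the schema on startup
        vendorAdapter.setDatabase(Database.MYSQL);

        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        factory.setPackagesToScan("com.example.orders"); // hypothetical entity package
        factory.setDataSource(dataSource);
        return factory;
    }
}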

Custom Column Definition: LocalDate to DATETIME

Since the default mapping on DDL generation did not result in the desired DATETIME fields, it becomes necessary to add a column definition to the JPA entity. Specifying the annotation attribute columnDefinition = “DATETIME” will force Hibernate or any other ORM to use the specified database type.
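
Applied to the entity sketched above, the field declaration changes to:

@Column(name = "ORDER_DATE", columnDefinition = "DATETIME") // forces a DATETIME column in MySQL
private LocalDate orderDate;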

The explicit definition of the column results in the right DATETIME field in the database. However, it is worth noting that specifying columnDefinitions will bind the application code more closely to the underlying RDBMS and may therefore introduce a bigger effort when migrating between different database systems. This statement is especially valid when using RDBMS-specific data types in your columnDefinition section.

Data Serialization on Query Execution

Although we are now able to map the LocalDate object to the correct database type on DDL generation, this does not imply that the ORM is already capable of serializing objects with the correct data type. To evaluate the statement execution, I created a simple test case that attempts to insert an object into the database. The following snippet shows the Hibernate-generated SQL statement with the relevant fields.
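
The original snippet is not reproduced here; as an illustration, for an entity like the one sketched above (without an explicit @Table name), the logged statement looks roughly like this, with the actual table and column names depending on your mapping:

insert into PurchaseOrder (ORDER_DATE, id) values (?, ?)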

When executing this statement, I ran into a DataIntegrityViolationException that eventually pointed to an attempt to insert an incorrect DateTime object into the ORDER_DATE column. The behavior was somewhat expected, since the columnDefinition has no direct impact on query or statement execution and therefore does not facilitate any mapping from LocalDate to Date.

In order to support JSR310 when using JPA and an underlying ORM, it is still necessary to convert from LocalDate to Date objects. If you require full control over your date conversion, you might want to consider writing your Spring @Converter yourself. Since I had less ambitious goals, I found a nice spring-data-jpa class called Jsr310JpaConverters that contained the mapping logic meeting my needs. To configure the converter, I simply added the conversion package to my entity manager package scan. The entity manager will pick up the converter classes and execute the conversion back to java.util.Date so that any JSR310 DateTime object can be directly used in your @Entity.
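
In the entity manager configuration sketched earlier, that boils down to one additional package in the scan; the converter package is derived from the Jsr310JpaConverters class itself (its exact location may vary by spring-data-jpa version), and the entity package remains a placeholder:

import org.springframework.data.jpa.convert.threeten.Jsr310JpaConverters;

// inside the entityManagerFactory() bean definition shown earlier:
factory.setPackagesToScan(
        Jsr310JpaConverters.class.getPackage().getName(), // spring-data-jpa's JSR-310 converters
        "com.example.orders");                            // hypothetical entity package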

Conclusion

The migration to Java 8 offers a lot more functionality, and it is worth considering an upgrade, not to mention the official support and maintenance cycles of each version. However, when you consider upgrading and adopting new features, make sure that the underlying dependencies already support those features. The spring-data-jpa and Hibernate example shows that certain components have a faster adoption rate, whereas others may need more time to implement newly provided features.

If you consider upgrading to Java 8, this article demonstrates some pitfalls in case you want to apply the new DateTime features to your JPA entities. I also hope that the article provides sufficient detail to decide whether the new DateTime functionality actually adds a lot of value to your entities, or whether the conversion should be executed at a different application layer. Personally, in my scenario, it was a good decision to migrate my entities, since I was able to apply the conversion class provided by spring-data.