The need for speed

For the past six weeks I’ve been developing a prototype that acts as a front end for a query service. The queries come in quite a few different shapes and sizes, and the prototype only implements a vertical slice through a few of them.
The architecture is quite simple: a web front end, some middleware, a proxy/gateway and a legacy data source. All of it is written in Java, with a J2EE server talking to the proxy (a Java application). The interesting part of the prototype was a pluggable middleware layer, so we could test a few different technologies (raw sockets, JMS and JavaSpaces) for performance.
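For the curious, the pluggable layer boils down to something like the sketch below. The names (QueryTransport and friends) are made up for illustration, not taken from the prototype; the idea is just that each middleware technology hides behind the same small interface, so swapping sockets for JMS or JavaSpaces doesn’t touch the web tier.

```java
import java.io.IOException;
import java.net.Socket;

// Hypothetical transport abstraction: one implementation per middleware
// technology under test (raw sockets, JMS, JavaSpaces).
interface QueryTransport {
    // Send a query to the proxy/gateway and block for the raw result.
    byte[] execute(byte[] query) throws IOException;
}

// Raw-socket variant: open a connection to the proxy/gateway, send the
// query bytes, and read back the full response.
class SocketQueryTransport implements QueryTransport {
    private final String host;
    private final int port;

    SocketQueryTransport(String host, int port) {
        this.host = host;
        this.port = port;
    }

    public byte[] execute(byte[] query) throws IOException {
        try (Socket socket = new Socket(host, port)) {
            socket.getOutputStream().write(query);
            socket.getOutputStream().flush();
            socket.shutdownOutput();                   // signal end of request
            return socket.getInputStream().readAllBytes();
        }
    }
}
```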
One of the interesting parts of this project is that the legacy data source is fast, very, very fast. A query returning a single result across a very, very large dataset returns in about 15-20ms; a larger result set takes closer to 60ms. Now, I’ll pose a question here (which is answered later, so no peeking): given the architecture, and given the technologies involved, what do you think the slowest point would be?
The implementation used Tomcat as the J2EE container, with JSPs producing the output (both an HTML view and a data-only view). Initial testing was done on my desktop, a P4 3.0GHz, and later on some whizzy Sun hardware.
We had simple monitoring of the transactions at the browser (using JMeter), at the servlet and at the proxy/gateway, which consisted of keeping the average transaction time at each point. JMeter provided a couple of other bits of information, but we only compared the averages.
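On the servlet side, the measurement point doesn’t need to be any more elaborate than the sketch below. This is a hypothetical illustration (the class name and logging details are mine, not the prototype’s): a servlet filter that times each request and keeps a running average.

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

// Illustrative timing filter: wraps each request, measures elapsed time,
// and tracks a running average across all transactions seen so far.
public class TimingFilter implements Filter {
    private final AtomicLong totalMillis = new AtomicLong();
    private final AtomicLong count = new AtomicLong();

    public void init(FilterConfig config) { }

    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        long start = System.currentTimeMillis();
        try {
            chain.doFilter(request, response);
        } finally {
            totalMillis.addAndGet(System.currentTimeMillis() - start);
            long n = count.incrementAndGet();
            if (n % 1000 == 0) {                       // report occasionally
                System.out.println("avg transaction time: "
                        + (totalMillis.get() / n) + "ms over " + n + " requests");
            }
        }
    }

    public void destroy() { }
}
```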
We tested everything from a “no-load” scenario (4 transactions/sec) up to “extreme load” (600 transactions/sec). No attempt was made during development to optimise for performance; we were only really interested in the comparative performance of the middleware options, though total performance would have become an issue if the average transaction time had approached 500ms.
Thankfully, that was never an issue. The final results indicated that the vast majority of the time was spent in the legacy system (remember, only 20ms). The web and middleware layers added just 10-15ms in total on the way back to the browser. This was fairly consistent under extreme load, a result that surprised us, and the performance testing experts who came in to independently measure the results.
How many people would have guessed that final result? Certainly not me. I’ve never thought that J2EE was slow, but to actually participate in the development of a system and produce real metrics was very interesting.
So, it’s entirely possible to produce well-performing J2EE systems. If yours doesn’t perform, don’t blame J2EE; blame your code.

Goodbye G00fy

On Tuesday night I bid farewell to my motorbike. I finally gave in to common sense, as my arm is too badly injured for me to control the bike at anything above walking speed.
I had a bunch of great times on the bike, and it was mechanically brilliant. It was (is) a 1992 Honda CBR600, basically bulletproof, and it’s going to a new owner who will love and care for it (and ride it more than I will). I bought it new, and managed a number of rides on the Phillip Island racetrack, which will remain one of the highlights of my life.
I don’t know what the quality of Honda engineering is these days, but I can’t fault the bikes from that era. All the owners I’ve ever met have raved about how you have to shoot the bikes to kill ’em. I hope that’s still the case.
I can’t help feeling that a little bit of my youth just left me when I sold the bike. I think I’ll have to find something equally dangerous, stupid and exciting to try to hold on to it. Growing old isn’t about growing up.