Resources

Open Source Performance Monitoring Tools, Tips and Tricks for Java

User: jarcher
Date: 4/25/2007 3:24 pm

On Friday I attended a session titled Open Source Performance Monitoring Tools, Tips and Tricks for Java, given by Matt Secoske, which covered a number of tools for profiling and monitoring the performance of Java applications. Some of the important points from his talk:

A. Plan for Performance
1) determine your performance goals
2) create testing scenarios
3) determine monitoring/profiling needs
4) integrate into development process (continuous performance testing)
5) integrate into the production environment

B. Performance questions to ask
1) what is your expected total number of clients?
2) what is your expected peak concurrent number of clients?
3) what are the most common task(s) these clients will be doing?
4) what is an acceptable response time?
5) how long will the data stay around?

C. Types of profiling tools that can/should be used
1) load testing/driving
2) logging / log analysis
3) contained profiling (profiler wraps application)
4) external profiling (JVMPI)
5) java.lang.instrument (the Instrumentation API; a minimal agent is sketched below)
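
To make the last category concrete, here is a minimal sketch of an agent entry point using java.lang.instrument. The PerfAgent class name, jar name, and manifest entry are assumptions for illustration, not from the talk:

    import java.lang.instrument.Instrumentation;

    // Illustrative agent (not from the talk). Package it in a jar whose
    // manifest declares "Premain-Class: PerfAgent", then start the JVM
    // with: java -javaagent:perfagent.jar ...
    public class PerfAgent {
        public static void premain(String agentArgs, Instrumentation inst) {
            // The Instrumentation handle is what instrumentation-based
            // profilers use to register class-file transformers and to
            // inspect what the JVM has loaded.
            System.out.println("PerfAgent attached; classes loaded so far: "
                    + inst.getAllLoadedClasses().length);
        }
    }

A real profiler would register a ClassFileTransformer here to weave timing code into selected classes at load time.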

Some tools that were suggested:
* JUnitPerf (decorates existing JUnit tests; great for benchmarks and continuous performance testing; see the sketch after this list) -- might be included in JUnit 4.0
* The Grinder (clusterable load generation for stress, load, capacity, and functional testing; includes a proxy for recording real user traffic; scriptable in Jython)
* Apache JMeter (stress, load, capacity, and functional testing, plug-in architecture for customization)
* Log files / analysis
* Logging tools: Log4J + Aspects == simple, transparent and targeted testing.
* Aspect-based logging tools: AspectJ, AspectWerkz, Java Interactive Profiler (JIP), Glassbox Inspector (a timing aspect is sketched after this list)
* JFluid / NetBeans Profiler
* Eclipse TPTP
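
As a rough sketch of the JUnitPerf decorator style mentioned above (JUnit 3.x API; SearchTest is a hypothetical existing functional test, and the time and user numbers are made up):

    import junit.framework.Test;
    import junit.framework.TestSuite;
    import com.clarkware.junitperf.LoadTest;
    import com.clarkware.junitperf.TimedTest;

    public class SearchPerfTest {
        public static Test suite() {
            // Reuse an existing functional test unchanged...
            Test functional = new TestSuite(SearchTest.class);
            // ...fail it if it takes longer than 2000 ms...
            Test timed = new TimedTest(functional, 2000);
            // ...while 10 simulated users run it concurrently.
            return new LoadTest(timed, 10);
        }
    }

Because the decorated suite is still a plain JUnit test, it can run in the same continuous build as everything else, which is what makes continuous performance testing cheap.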
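And one way the Log4J-plus-aspects combination can look, as an AspectJ sketch (the com.example.service package is an assumption; any pointcut would do):

    import org.apache.log4j.Logger;

    // Times every public method in an assumed service package, so the
    // application code itself stays free of timing calls.
    public aspect MethodTiming {
        private static final Logger LOG = Logger.getLogger("perf");

        pointcut monitored():
            execution(public * com.example.service.*.*(..));

        Object around(): monitored() {
            long start = System.nanoTime();
            try {
                return proceed();
            } finally {
                long elapsedMs = (System.nanoTime() - start) / 1000000;
                LOG.info(thisJoinPointStaticPart.getSignature()
                        + " took " + elapsedMs + " ms");
            }
        }
    }

The "targeted" part comes from the pointcut: tighten or widen it to monitor exactly the code paths under investigation.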

Some Tips and Tricks:
1) Put in just enough metrics to get the performance measurements you need.
2) The performance testing environment is not the production environment; don't assume results transfer directly.
3) Real-world data + real-world usage patterns + near-production environments == accurate benchmarks.
4) Keep a little monitoring in production (a sketch follows below).
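
A minimal sketch of what "keep a little monitoring in production" might look like: a wrapper that logs only calls exceeding a time budget, so overhead and log volume stay small. The PerfMonitor name and the 500 ms threshold are illustrative assumptions:

    import java.util.concurrent.Callable;
    import org.apache.log4j.Logger;

    public final class PerfMonitor {
        private static final Logger LOG = Logger.getLogger("perf");
        private static final long THRESHOLD_MS = 500; // assumed time budget

        private PerfMonitor() {}

        // Runs the task and logs a warning only when it is slower than
        // the budget, keeping production logs quiet in the happy case.
        public static <T> T timed(String label, Callable<T> task) throws Exception {
            long start = System.currentTimeMillis();
            try {
                return task.call();
            } finally {
                long elapsed = System.currentTimeMillis() - start;
                if (elapsed > THRESHOLD_MS) {
                    LOG.warn(label + " took " + elapsed + " ms");
                }
            }
        }
    }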

And his final thoughts:
1) Performance monitoring, like most things in software development, is an iterative process.
2) Initial set-up will take longer than expected, but it's worth it.
3) "Premature optimization is the root of all evil" (Knuth).
4) Know when and what to optimize; this comes from experience and profiling.
5) Make performance a part of your development process.

His slides can be found at: sccosoft.net
