Audit4j benchmarks measure the relative performance of various configurations of the Audit4j framework. They report Throughput (operations per second) and Average Running Time (time per operation) for a range of configurations and methodologies.
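The two metrics are essentially reciprocal views of the same timing data. A minimal stdlib sketch (plain Java, not JMH) of how they are computed; the `work()` method is a hypothetical stand-in for an audited operation:

```java
import java.util.concurrent.TimeUnit;

// Illustrative only (not JMH): how throughput and average running time
// are derived from the same elapsed-time measurement.
public class ThroughputVsAverage {
    static long counter = 0;

    // Hypothetical stand-in for one audited operation.
    static void work() {
        counter++;
    }

    public static void main(String[] args) {
        int ops = 1_000_000;
        long start = System.nanoTime();
        for (int i = 0; i < ops; i++) {
            work();
        }
        long elapsedNanos = System.nanoTime() - start;

        // Average running time: time per operation.
        double avgNanosPerOp = (double) elapsedNanos / ops;
        // Throughput: operations per second.
        double opsPerSecond = ops / (elapsedNanos / 1e9);

        System.out.printf("avg: %.1f ns/op, throughput: %.0f ops/s%n",
                avgNanosPerOp, opsPerSecond);
    }
}
```

JMH reports these as the `AverageTime` and `Throughput` benchmark modes, with the added rigor of warmup, forking, and dead-code-elimination safeguards that a hand-rolled loop like this lacks.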
The benchmarks are built on JMH. To run them, clone the Audit4j Benchmark project from GitHub, build it, and run the resulting microbenchmarks.jar with java -jar. For example, to run GeneralBenchmarks, which measures the performance of the general auditing service:
```
git clone https://github.com/audit4j/audit4j-benchmarks.git
mvn clean package && java -jar target/microbenchmarks.jar org.audit4j.benchmark.GeneralBenchmarks
```
- GeneralBenchmarks – Benchmarks on sending events by calling AuditManager.
- AnnotationEventBenchmarks – Benchmarks on annotation-driven auditing.
- DatabaseHandler_Embeded_Benchmarks – Benchmarks on the database plugin, using an embedded HSQL database.
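For orientation, a GeneralBenchmarks-style class might look roughly like the following sketch. The AuditManager/AuditEvent calls are assumptions based on the Audit4j API rather than the actual benchmark source, and the snippet requires the JMH dependency to compile:

```java
import java.util.concurrent.TimeUnit;

import org.audit4j.core.AuditManager;
import org.audit4j.core.dto.AuditEvent;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;

// Hypothetical sketch of a JMH benchmark submitting events to Audit4j.
// The exact AuditManager/AuditEvent signatures are assumptions.
@BenchmarkMode({Mode.Throughput, Mode.AverageTime})
@OutputTimeUnit(TimeUnit.MICROSECONDS)
public class GeneralBenchmarksSketch {

    @Benchmark
    public void auditEvent() {
        // Each invocation sends one audit event; JMH measures the cost.
        AuditManager.getInstance().audit(new AuditEvent("actor", "action"));
    }
}
```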
Sensible defaults are set as annotations on the benchmark classes themselves. In many cases they can be overridden by passing the appropriate command-line parameters. Refer to the JMH documentation for the full list of available options and their effect on benchmark runs.
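For example, warmup iterations, measurement iterations, and fork count can be overridden with JMH's standard flags (`-wi`, `-i`, `-f`; the values shown here are illustrative, not recommendations):

```shell
# Run GeneralBenchmarks with 5 warmup iterations, 10 measurement
# iterations, and 2 forks, overriding the annotated defaults.
java -jar target/microbenchmarks.jar org.audit4j.benchmark.GeneralBenchmarks \
    -wi 5 -i 10 -f 2
```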