Modules
The X Platform Performance Benchmark Suite consists of 7 modules, each testing a specific X Platform runtime component. This section provides detailed documentation for each module, including its benchmark programs, parameters, and guidance on interpreting results.
Module Overview
AEP - Canonical end-to-end Receive-Process-Send benchmark for a clustered microservice
Time - Overhead of the X Platform time API
Encoding (ADM) - Message encoding/decoding performance across different serializers
Link - Cluster replication link between primary and backup instances
Messaging (SMA) - Pub/sub messaging layer
Persistence - Message and transaction log persistence
Storage (ODS) - Object data store
Canonical Benchmark
The AEP Module contains the canonical end-to-end benchmark used for X Platform's official performance testing. This benchmark measures the complete Receive-Process-Send flow of a clustered microservice.
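As an illustration of what the canonical benchmark times, the sketch below measures the latency of a receive-process-send iteration and reports percentiles. It is a generic methodology sketch, not the suite's actual harness; `process` is a made-up stand-in for application logic.

```python
# Hypothetical sketch of receive-process-send latency measurement.
# Not the suite's harness: process() is an illustrative stand-in.
import time

def process(msg):
    # Stand-in for the application logic step of the flow.
    return msg + 1

def run(iterations=100_000):
    latencies = []
    for i in range(iterations):
        start = time.perf_counter_ns()   # "receive" timestamp
        result = process(i)              # process step
        _ = result                       # "send" step (discarded here)
        latencies.append(time.perf_counter_ns() - start)
    latencies.sort()
    return {
        "p50": latencies[len(latencies) // 2],
        "p99": latencies[int(len(latencies) * 0.99)],
    }
```

Reporting percentiles rather than an average matters for this kind of flow, since tail latency (p99 and above) is usually what end-to-end tests are judged on.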
For detailed results from the canonical benchmark, see the Canonical Benchmark Results page.
Component Benchmarks
The other six modules isolate and test individual X Platform components:
Low-Level Components
Time Module - Tests the overhead of X Platform's time API, which is critical for timestamping and latency measurement.
Serialization Module - Benchmarks message encoding/decoding across different serialization formats (Xbuf2, Protobuf, etc.).
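A serialization benchmark of this kind typically round-trips a representative message through each format and compares throughput. The sketch below uses `json` and `pickle` as stand-ins for formats like Xbuf2 or Protobuf, which are not used here; the message shape is invented for illustration.

```python
# Hedged serializer microbenchmark sketch: json and pickle stand in
# for the suite's real formats (Xbuf2, Protobuf, etc.).
import json
import pickle
import time

MESSAGE = {"id": 42, "symbol": "ABC", "qty": 100, "px": 12.5}

def bench(encode, decode, iterations=50_000):
    start = time.perf_counter()
    for _ in range(iterations):
        decode(encode(MESSAGE))          # full encode + decode round trip
    elapsed = time.perf_counter() - start
    return iterations / elapsed          # round trips per second

json_rate = bench(lambda m: json.dumps(m).encode(), json.loads)
pickle_rate = bench(pickle.dumps, pickle.loads)
```

Benchmarking the full round trip (rather than encode alone) matches what a service actually pays per message, since every received message must be decoded and every sent message encoded.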
Messaging and Replication
Link Module - Tests the cluster replication link that synchronizes state between primary and backup instances.
Messaging Module - Benchmarks the pub/sub messaging layer (SMA) used for inter-service communication.
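To give a feel for what a pub/sub messaging benchmark measures, the sketch below times per-message dispatch cost through a minimal in-process topic bus. This is purely illustrative: SMA operates between services over a transport, which this toy `Bus` class does not model.

```python
# Illustrative in-process pub/sub dispatch benchmark. The Bus class is
# invented for this sketch and does not model SMA's network transport.
import time
from collections import defaultdict

class Bus:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subs[topic].append(handler)

    def publish(self, topic, msg):
        # Fan the message out to every subscriber of the topic.
        for handler in self.subs[topic]:
            handler(msg)

def bench(iterations=100_000):
    bus = Bus()
    received = []
    bus.subscribe("orders", received.append)
    start = time.perf_counter()
    for i in range(iterations):
        bus.publish("orders", i)
    elapsed = time.perf_counter() - start
    return len(received), iterations / elapsed  # (delivered, msgs/sec)
```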
Persistence and Storage
Persistence Module - Tests message and transaction log persistence performance.
Storage Module - Benchmarks the object data store (ODS) used for state persistence.
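The core cost a transaction-log persistence benchmark measures is the latency of a durable append: write a record, then force it to stable storage. The sketch below times that write-plus-fsync cycle; record size and count are made up for illustration.

```python
# Rough sketch of a durable-append latency measurement: write a record,
# fsync, and time each cycle. Record format and sizes are illustrative.
import os
import tempfile
import time

def bench_fsync(records=200, size=256):
    payload = b"x" * size
    latencies = []
    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "wb") as log:
            for _ in range(records):
                start = time.perf_counter_ns()
                log.write(payload)
                log.flush()
                os.fsync(log.fileno())   # force the record to stable storage
                latencies.append(time.perf_counter_ns() - start)
    finally:
        os.remove(path)
    latencies.sort()
    return latencies[len(latencies) // 2]  # median ns per durable write
```

The fsync call usually dominates: without it the writes land only in the OS page cache, which is why persistence benchmarks that skip it report misleadingly fast numbers.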
Using Module Documentation
Each module page includes:
Overview - What the module tests and why it matters
Benchmark Programs - Available test programs and their purposes
Parameters - Command-line options and their effects
Running Examples - Complete command-line examples
Interpreting Results - Understanding the benchmark output
Performance Insights - What the results tell you about X Platform performance
Next Steps
Choose a module from the list above to learn about its benchmarks
See Benchmark Suite Overview for download and installation instructions
Review the Canonical Benchmark Results