Add HTTPBenchmarkApp with consolidated "real‑world" benchmarks #101
Conversation
Motivation:

Added a self‑contained SwiftNIO example that demonstrates realistic throughput and latency scenarios: file streaming, high‑concurrency sums, partial I/O patterns, and lock contention. A command‑line tool that can run these benchmarks in‑process and print percentile statistics helps both users and contributors understand and tune performance.

Modifications:

- `HTTPBenchmarkApp` implementing four HTTP endpoints and four in‑process benchmarks.
- CLI flags:
  - `--run-all-benchmarks` to run all scenarios back‑to‑back.
  - `--samples <N>` to customize the iteration count (default 10).
  - `--use-io-uring` to switch to `NIOTSEventLoopGroup` when available.
- Benchmark helpers:
  - `measure(_:)` and `measureMultiple(iterations:block:)` for timing.
  - `calculateStatistics(from:)` to compute p0/p25/p50/p75/p90/p99/p100.
  - `formatBenchmarkTable(metric:stats:)` to render Unicode tables.

Results:

Users can now build and run a single tool to:

1. Execute workload benchmarks entirely in‑process, with configurable sample counts and percentile output.
2. Optionally serve HTTP endpoints for external latency measurements.

This provides an extensible SwiftNIO example app that is both educational and practically useful for performance tuning.
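For readers who want a feel for the helper layer described above, here is a minimal sketch of what the timing and percentile helpers could look like. The function names mirror the description, but the signatures, the index-based percentile selection, and all other details are assumptions for illustration, not the PR's actual code.

```swift
import Dispatch

// Sketch only: names mirror the PR description; these are not the PR's implementations.

/// Times a single block and returns its wall-clock duration in milliseconds.
func measure(_ block: () throws -> Void) rethrows -> Double {
    let start = DispatchTime.now()
    try block()
    let end = DispatchTime.now()
    return Double(end.uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000
}

/// Runs `block` repeatedly and collects one duration sample per iteration.
func measureMultiple(iterations: Int, block: () throws -> Void) rethrows -> [Double] {
    var samples: [Double] = []
    samples.reserveCapacity(iterations)
    for _ in 0..<iterations {
        samples.append(try measure(block))
    }
    return samples
}

/// Computes the percentiles mentioned in the description (p0/p25/p50/p75/p90/p99/p100)
/// by simple index selection on the sorted samples.
func calculateStatistics(from samples: [Double]) -> [(String, Double)] {
    let sorted = samples.sorted()
    func percentile(_ p: Double) -> Double {
        guard !sorted.isEmpty else { return 0 }
        let index = Int((p / 100.0) * Double(sorted.count - 1))
        return sorted[index]
    }
    return [0.0, 25, 50, 75, 90, 99, 100].map { ("p\(Int($0))", percentile($0)) }
}
```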
@Option(help: "Host to bind on") var host: String = "127.0.0.1" | ||
@Option(help: "Port to bind on") var port: Int = 8080 | ||
@Option(help: "Number of samples for each consolidated benchmark") var samples: Int = 10 | ||
@Flag(help: "Enable io_uring backend (requires NIOTransportServices)") var useIOUring: Bool = false |
This looks wrong: NIOTransportServices doesn't use io_uring, it uses Network.framework.
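As the comment notes, `NIOTSEventLoopGroup` is backed by Network.framework on Apple platforms and has nothing to do with io_uring. A conventional sketch of per-platform event loop group selection (not code from this PR, and subject to the package's platform settings) looks like this:

```swift
import NIOCore
import NIOPosix
#if canImport(Network)
import NIOTransportServices
#endif

// NIOTSEventLoopGroup only exists where Network.framework is available;
// elsewhere fall back to MultiThreadedEventLoopGroup (epoll/kqueue).
func makeEventLoopGroup(threads: Int) -> any EventLoopGroup {
    #if canImport(Network)
    return NIOTSEventLoopGroup(loopCount: threads)
    #else
    return MultiThreadedEventLoopGroup(numberOfThreads: threads)
    #endif
}
```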
// will execute consolidated benchmarks and output a report.
// Optionally, the --use-io-uring flag enables NIOTSEventLoopGroup (Linux io_uring).
//
//===----------------------------------------------------------------------===//
This license header isn't valid.
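For reference, SwiftNIO sources generally carry a header of roughly the following form; the exact wording and the year below are assumptions and should be taken from the project's own templates rather than from this sketch.

```swift
//===----------------------------------------------------------------------===//
//
// This source file is part of the SwiftNIO open source project
//
// Copyright (c) YYYY Apple Inc. and the SwiftNIO project authors
// Licensed under Apache License v2.0
//
// See LICENSE.txt for license information
// See CONTRIBUTORS.txt for the list of SwiftNIO project authors
//
// SPDX-License-Identifier: Apache-2.0
//
//===----------------------------------------------------------------------===//
```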
let server = try bootstrap.bind(host: host, port: port).wait()
print("HTTPBenchmarkApp running on \(host):\(port)")
try server.closeFuture.wait()
try group.syncShutdownGracefully()
This seems a bit odd: we either run benchmarks or an HTTP server. Why that choice?
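One illustrative way to make the two modes explicit, rather than picking one or the other at startup, would be separate `serve` and `bench` subcommands. This is only a sketch of a possible structure with hypothetical type names, not what the PR implements:

```swift
import ArgumentParser

// Illustrative only: splits the tool into explicit `serve` and `bench`
// subcommands instead of choosing between server and benchmarks via flags.
@main
struct HTTPBenchmarkApp: ParsableCommand {
    static let configuration = CommandConfiguration(
        abstract: "Serve benchmark endpoints or run in-process benchmarks.",
        subcommands: [Serve.self, Bench.self],
        defaultSubcommand: Serve.self
    )
}

struct Serve: ParsableCommand {
    @Option(help: "Host to bind on") var host: String = "127.0.0.1"
    @Option(help: "Port to bind on") var port: Int = 8080

    func run() throws {
        // Start the HTTP server here (bootstrap.bind(host:port:), as in the PR).
    }
}

struct Bench: ParsableCommand {
    @Option(help: "Number of samples for each benchmark") var samples: Int = 10

    func run() throws {
        // Run the in-process benchmarks and print the percentile tables here.
    }
}
```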
}
dg.wait()
}
print(formatBenchmarkTable(metric: "Lock Contention (ms)", stats: stats4))
These benchmarks don't seem to be particularly HTTP related.
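For context, the "Lock Contention (ms)" row presumably times a workload along these lines: several threads repeatedly taking the same lock to update shared state. The sketch below is an assumption about that shape (the helper name, thread count, and iteration count are all invented for illustration):

```swift
import Dispatch
import NIOConcurrencyHelpers

// Lock-contention micro-benchmark sketch: many threads bump a shared,
// lock-protected counter, and the whole run is timed in milliseconds.
func lockContentionSample(threads: Int = 8, incrementsPerThread: Int = 100_000) -> Double {
    let counter = NIOLockedValueBox(0)
    let start = DispatchTime.now()
    DispatchQueue.concurrentPerform(iterations: threads) { _ in
        for _ in 0..<incrementsPerThread {
            counter.withLockedValue { $0 += 1 }
        }
    }
    let end = DispatchTime.now()
    return Double(end.uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000  // elapsed ms
}
```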
}
}

extension BenchmarkRequestHandler: @unchecked Sendable {}
This type definitely isn't Sendable.
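One common way to avoid the `@unchecked Sendable` conformance entirely is to create a fresh handler per accepted channel, so each instance's mutable state stays confined to a single `EventLoop`. The sketch below uses a hypothetical `CountingHandler` to show the pattern; it is not the PR's handler:

```swift
import NIOCore
import NIOPosix
import NIOHTTP1

// A handler whose mutable state is only ever touched on its channel's EventLoop.
// Because a new instance is added per channel, it never crosses threads and
// does not need to be Sendable at all.
final class CountingHandler: ChannelInboundHandler {
    typealias InboundIn = HTTPServerRequestPart

    private var requestsSeen = 0  // EventLoop-confined, no lock needed

    func channelRead(context: ChannelHandlerContext, data: NIOAny) {
        if case .head = self.unwrapInboundIn(data) {
            self.requestsSeen += 1
        }
        context.fireChannelRead(data)
    }
}

let group = MultiThreadedEventLoopGroup(numberOfThreads: System.coreCount)
let bootstrap = ServerBootstrap(group: group)
    .childChannelInitializer { channel in
        channel.pipeline.configureHTTPServerPipeline().flatMap {
            // A new handler per channel: no sharing, no @unchecked Sendable.
            channel.pipeline.addHandler(CountingHandler())
        }
    }
// ... then bootstrap.bind(host:port:) as in the PR.
```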
Thank you for your reviews @Lukasa! Just so it doesn't seem like I've abandoned this project, I wanted to put this here: I'll continue working on this right after my exams.