Torturing websocket APIs with Rust

TL;DR: Project repository


I am currently working on architecting a revision of the backend infrastructure for a social media mobile application. The current API uses a WebSocket protocol: clients maintain a connection to an API Gateway, and the business logic is deployed as an AWS Lambda.

This has worked remarkably well: the Lambda handles scaling by itself, and with automated deployment we could focus on delivering features. However, with the influx of users we are starting to see the limitations of this model:

  • API Gateway gets costly
  • Although it is possible to use connection pooling in the context of Lambda, its ephemeral nature means there are no guarantees, and constantly opening new connections can strain your database.

Therefore we started to experiment with different architectural solutions that address those shortcomings while still ticking the boxes of load balancing and easy horizontal scalability, and keeping costs reasonable.

Building different POCs called for a way to benchmark their performance: enter ws-load-test.

How it works

ws-load-test is a high-throughput tool for testing WebSocket APIs, written in Rust. It opens a number of concurrent connections to a WebSocket endpoint and starts flooding it with PING requests, while collecting statistics on the response times.

The reported statistics are collected across all the client connection tasks using a Rust port of High Dynamic Range Histograms, a low-latency, high-range histogram implementation.

Concurrent tasks (namely the WS connections) rely on the async-std asynchronous runtime, which chooses how to run them, i.e. how many threads to start and how to distribute tasks among them. async-std is a competitor to the de facto standard Tokio async runtime; it has a simple yet interesting design philosophy of mirroring the std library with async variants.

Using the tool

ws-load-test is not yet released as a crate, as I deemed it too simplistic in its current form, but after cloning the repository you can build the binary:

cargo build --release

Arguments are passed on the command line; for example, to open 3 concurrent client connections to a public WebSocket echo endpoint:

./target/release/ws-load-test -c 3 -g ws://

And this is the output you should see:


Written on November 28, 2020