mouloud240/Benchmarks

Web Framework Benchmark

A comprehensive benchmarking suite to compare the performance of different web frameworks across various scenarios using k6 load testing.

Overview

This repository contains load tests and implementations for multiple web frameworks to determine which performs best under different conditions. Each framework implementation follows the same specification to ensure fair comparisons.

Repository Structure

benchmark/
β”œβ”€β”€ load-test/              # k6 load testing scripts
β”‚   β”œβ”€β”€ plain-text.js       # Simple GET endpoint benchmark
β”‚   └── parsing-validation.js # JSON POST endpoint benchmark
β”œβ”€β”€ go/                     # Go implementation
β”œβ”€β”€ fastapi/                # FastAPI (Python) implementation
β”œβ”€β”€ django/                 # Django (Python) implementation
β”œβ”€β”€ nest/                   # NestJS (Node.js/Fastify) implementation
└── rust/                   # Rust implementation 

Test Scenarios

1. Plain Text Response (load-test/plain-text.js)

Tests a simple GET endpoint that returns plain data without complex processing.

  • Endpoint: GET /api/v1/greetings
  • Load Profile:
    • Ramp-up to 1,000 users over 20s
    • Ramp-up to 5,000 users over 40s
    • Ramp-up to 10,000 users over 40s
    • Maintain 10,000 users for 30s
    • Ramp-down to 0 users over 20s
  • Success Criteria:
    • 95% of requests complete below 500ms
    • HTTP error rate < 1%
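The ramp profile above can be expressed as data and sanity-checked. The sketch below is illustrative Python (the authoritative stage definitions live in load-test/plain-text.js, which is a k6 script); the dict shape mirrors k6's `options.stages`.

```python
# k6-style stage list for the plain-text scenario (illustrative only;
# see load-test/plain-text.js for the real definition).
STAGES = [
    {"duration": 20, "target": 1_000},   # ramp-up to 1,000 VUs over 20s
    {"duration": 40, "target": 5_000},   # ramp-up to 5,000 VUs over 40s
    {"duration": 40, "target": 10_000},  # ramp-up to 10,000 VUs over 40s
    {"duration": 30, "target": 10_000},  # hold at peak for 30s
    {"duration": 20, "target": 0},       # ramp-down over 20s
]

total_seconds = sum(stage["duration"] for stage in STAGES)
peak_vus = max(stage["target"] for stage in STAGES)
```

Each full run therefore takes 150 seconds and peaks at 10,000 concurrent virtual users.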

2. Parsing & Validation (load-test/parsing-validation.js)

Tests JSON parsing, validation, and response generation with a POST endpoint.

  • Endpoint: POST /api/v1/greetings
  • Payload:
    {
      "id": 123456789,
      "name": "user-1-5",
      "message": "hello from vu 1 iter 5",
      "greetDate": "2025-11-03T10:00:00.000Z"
    }
  • Load Profile: Same as plain text test
  • Success Criteria: Same as plain text test
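The example payload encodes the virtual user (VU) and iteration numbers in its fields. A sketch of how such a body could be generated, in illustrative Python (the real generator is the k6 script; the `id` scheme here is a hypothetical way to keep ids unique per VU/iteration):

```python
import json
from datetime import datetime, timezone

def build_payload(vu: int, iteration: int) -> dict:
    """Build a POST body matching the schema shown above.

    The name/message patterns mirror the example payload; the id scheme
    is a hypothetical stand-in for whatever the k6 script actually uses.
    """
    return {
        "id": vu * 1_000_000 + iteration,  # hypothetical unique-id scheme
        "name": f"user-{vu}-{iteration}",
        "message": f"hello from vu {vu} iter {iteration}",
        "greetDate": datetime.now(timezone.utc)
            .isoformat(timespec="milliseconds")
            .replace("+00:00", "Z"),
    }

body = json.dumps(build_payload(1, 5))
```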

Upcoming Test Scenarios

We're actively working on expanding our benchmark suite with additional real-world scenarios. The following tests will be added soon:

πŸ” JWT Authentication

Testing authentication middleware performance with token generation, validation, and refresh operations.

πŸ—„οΈ Database Queries

Benchmarking CRUD operations, complex joins, and ORM performance with real database interactions (PostgreSQL/MySQL).

πŸ“ File Uploads

Testing multipart form data handling, file validation, and storage operations with various file sizes.

βš™οΈ CPU-Bound Operations

Testing computational performance with tasks like:

  • Image processing and resizing
  • Data encryption/decryption
  • Hash computation (bcrypt, argon2)
  • JSON serialization/deserialization of large datasets
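To make the last two bullets concrete, here is a minimal Python micro-benchmark combining hash computation and JSON (de)serialization of a larger dataset. SHA-256 from the standard library stands in for bcrypt/argon2 (which need third-party packages); the planned k6 scenarios may measure these differently.

```python
import hashlib
import json
import time

# Synthetic dataset standing in for a "large" payload.
records = [{"id": i, "name": f"user-{i}", "score": i * 0.5} for i in range(10_000)]

t0 = time.perf_counter()
blob = json.dumps(records)                           # serialize
parsed = json.loads(blob)                            # deserialize
digest = hashlib.sha256(blob.encode()).hexdigest()   # hash the payload
elapsed = time.perf_counter() - t0
```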

Stay tuned for these additions! Contributions are welcome.

Prerequisites

  • Docker - For running framework implementations
  • k6 - For running load tests

Installing k6

# macOS
brew install k6

# Linux
sudo gpg -k
sudo gpg --no-default-keyring --keyring /usr/share/keyrings/k6-archive-keyring.gpg --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys C5AD17C747E3415A3642D57D77C6C491D6AC1D69
echo "deb [signed-by=/usr/share/keyrings/k6-archive-keyring.gpg] https://dl.k6.io/deb stable main" | sudo tee /etc/apt/sources.list.d/k6.list
sudo apt-get update
sudo apt-get install k6

# Windows
choco install k6

Running Tests

  1. Start the server you want to test:
# Example: Testing the Go implementation
cd go
docker build -t benchmark-go .
docker run -p 3000:8080 benchmark-go
  2. Run the desired load test:
# Plain text test
k6 run load-test/plain-text.js

# Parsing & validation test
k6 run load-test/parsing-validation.js

# Override the base URL if needed
BASE_URL=http://localhost:3000 k6 run load-test/plain-text.js

Contributing

We welcome contributions of new framework implementations! To add a new framework:

1. Create Framework Directory

Create a new directory with your framework name:

mkdir my-framework
cd my-framework

2. Implement Required Endpoints

Your implementation must provide these two endpoints:

GET /api/v1/greetings

Returns a simple greeting response.

Response Example:

{
  "message": "Hello, World!"
}

POST /api/v1/greetings

Accepts and validates a JSON payload.

Request Body:

{
  "id": number,
  "name": string,
  "message": string,
  "greetDate": string (ISO 8601 date-time)
}

Response (2xx status code):

{
  "success": true,
  "data": {
    "id": number,
    "name": string,
    "message": string,
    "greetDate": string
  }
}
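A minimal sketch of this contract in Python, to clarify what "accepts and validates" means. This is illustrative only; each framework implements it with its own idioms (pydantic models in FastAPI, DTO validation in NestJS, etc.).

```python
from datetime import datetime

# Expected field types for POST /api/v1/greetings.
REQUIRED = {"id": (int, float), "name": str, "message": str, "greetDate": str}

def validate_greeting(payload: dict) -> dict:
    """Validate a greeting payload and build the success envelope.

    Sketch of the contract described above, not any specific
    implementation in this repository.
    """
    for field, expected in REQUIRED.items():
        if not isinstance(payload.get(field), expected):
            raise ValueError(f"invalid or missing field: {field}")
    # greetDate must be an ISO 8601 date-time; fromisoformat rejects bad input
    # ("Z" is normalized to "+00:00" for older Python versions).
    datetime.fromisoformat(payload["greetDate"].replace("Z", "+00:00"))
    return {"success": True, "data": payload}
```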

3. Create Dockerfile

Create a Dockerfile that:

  • Builds your application
  • Exposes the appropriate port (map to 3000 when running)
  • Runs the server

Example Dockerfile structure:

FROM your-base-image

WORKDIR /app

# Copy dependencies files
COPY package.json ./

# Install dependencies
RUN install-command

# Copy source code
COPY . .

# Build (if necessary)
RUN build-command

# Expose port
EXPOSE 8080

# Run the server
CMD ["start-command"]

4. Test Your Implementation

Before submitting:

  1. Build and run your Docker container:

    docker build -t benchmark-myframework .
    docker run -p 3000:8080 benchmark-myframework
  2. Test the endpoints manually:

    # Test GET endpoint
    curl http://localhost:3000/api/v1/greetings
    
    # Test POST endpoint
    curl -X POST http://localhost:3000/api/v1/greetings \
      -H "Content-Type: application/json" \
      -d '{"id": 1, "name": "test", "message": "hello", "greetDate": "2025-11-03T10:00:00.000Z"}'
  3. Run both k6 tests:

    k6 run load-test/plain-text.js
    k6 run load-test/parsing-validation.js

5. Submit Your Implementation

  1. Fork the repository
  2. Create a new branch: git checkout -b add-myframework
  3. Add your implementation
  4. Commit your changes: git commit -am 'Add MyFramework implementation'
  5. Push to the branch: git push origin add-myframework
  6. Submit a Pull Request with:
    • Framework name and version
    • Brief description of your implementation
    • Test results (k6 output summary)

Test Environment

All benchmarks were conducted on the following system:

OS: Fedora Linux 42 (KDE Plasma Desktop Edition) x86_64
Host: LENOVO LNVNB161216
Kernel: 6.15.7-200.fc42.x86_64
CPU: 12th Gen Intel i5-12450HX (12) @ 4.400GHz
GPU: NVIDIA GeForce RTX 4050 Max-Q / Mobile
GPU: Intel Alder Lake-S [UHD Graphics]
Memory: 7814MiB / 15705MiB

Hardware Specifications:

  • Processor: Intel Core i5-12450HX (12 cores, up to 4.4 GHz)
  • RAM: 16 GB
  • Graphics: NVIDIA RTX 4050 Max-Q + Intel UHD Graphics
  • Operating System: Fedora Linux 42
  • Kernel: 6.15.7-200.fc42.x86_64

Benchmark Results

πŸ“Š For detailed metrics and complete results, see RESULTS.md

Summary Table

| Framework | Plain JSON: Avg Duration | Plain JSON: p95 / Req/s | Parsing & Validation: Avg Duration | Parsing & Validation: p95 / Req/s |
|---|---|---|---|---|
| Rust (Axum) | 9.84 ms | 48 ms / 16,459 req/s | 18.89 ms | 85.11 ms / 14,648 req/s |
| Golang | 24.54 ms | 100.01 ms / 13,753 req/s | 41.92 ms | 186.04 ms / 11,431 req/s |
| NestJS (Fastify) | 113.84 ms | 485.8 ms / 13,254 req/s | 407.14 ms | 525.46 ms / 7,650 req/s |
| FastAPI | 1.51 s | 3.25 s / 3,076 req/s | 2.06 s | 4.74 s / 2,376 req/s |
| Laravel | 9.51 s | ⚠️ 60s / 529 req/s | 10.9 s | ⚠️ 60s / 467 req/s |
| Django | 33.37 s | ⚠️ 60s / 166 req/s | 33.15 s | ⚠️ 60s / 163 req/s |

Legend:

  • ⚠️ = Failed to meet thresholds (high error rate or timeout issues)
  • p95 = 95th percentile response time
  • Req/s = Requests per second (throughput)
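For readers unfamiliar with the p95 metric, it can be computed from a list of response times with the nearest-rank method. This is an illustrative Python sketch; k6 may use a different interpolation internally.

```python
import math

def p95(samples_ms):
    """95th-percentile latency via the nearest-rank method.

    Returns the smallest observed value such that at least 95% of
    samples are at or below it.
    """
    ordered = sorted(samples_ms)
    rank = math.ceil(0.95 * len(ordered))  # 1-indexed nearest rank
    return ordered[rank - 1]
```

For example, over latencies of 1..100 ms, p95 is 95 ms: 95% of requests completed at or below that threshold.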

Key Findings

  1. πŸ† Winner: Rust (Axum) - Fastest average and p95 response times in both scenarios, though it comes with a steeper learning curve and more complex setup compared to other frameworks
  2. πŸ₯ˆ Runner-up: Golang - Excellent performance with minimal overhead
  3. πŸ₯‰ Third Place: NestJS (Fastify) - Best performing Node.js framework, competitive throughput
  4. FastAPI - Moderate performance, suitable for Python ecosystem
  5. Laravel & Django - Struggled under high load with significant timeout issues

Guidelines for Fair Comparison

  • All implementations should use production-ready configurations
  • No caching mechanisms unless they're framework defaults
  • Logging should be minimal (errors only)
  • Use the most recent stable version of each framework
  • Run tests on the same hardware and conditions
  • Close all unnecessary applications during testing

Contact

For questions, suggestions, or contributions, feel free to reach out:

Email: mouloudhasrane@gmail.com or bouorumanamoundher@gmail.com

License

This project is licensed under the MIT License - see the LICENSE.md file for details.
