A high-performance, configurable URL health checker built in Go using goroutines, channels, worker pools, and context-based cancellation.
This project simulates a real-world concurrent monitoring system, capable of validating large sets of URLs efficiently with structured logging and graceful shutdown.
- ⚡ Concurrent URL checking using the worker pool pattern
- 📂 Reads URLs from an external file (`-filename`)
- ⚙️ Config-driven system using JSON (`-config`)
- 🧵 Efficient concurrency with goroutines & channels
- ⏱ Latency measurement for each request
- 🔁 Automatic redirect handling
- 🔒 Detects SSL/TLS and network failures
- ❌ Handles invalid URLs and unreachable hosts
- 📊 Categorized responses:
  - 2xx → Success
  - 3xx → Redirect
  - 4xx → Client Error
  - 5xx → Server Error
- 🪵 Structured logging using `slog`
- 📁 Logs written to:
  - Console
  - Output file (`output-<filename>.txt`)
- 🛑 Graceful shutdown using OS signals (`SIGINT`, `SIGTERM`)
- 🔄 Context-based cancellation for safe worker termination
- Goroutines & concurrency primitives
- Worker Pool pattern
- Buffered vs unbuffered channels
- Context cancellation (`context.WithCancel`)
- Graceful shutdown handling
- Channel lifecycle management
- Structured logging (`slog`)
- File I/O (streaming input)
- Real-world concurrent system design
```
.
├── config.json
├── go.mod
├── internal
│   ├── model
│   │   ├── config.go
│   │   └── result.go
│   ├── urlchecker
│   │   ├── urlchecker.go
│   │   └── urlchecker_test.go
│   └── worker
│       └── worker.go
├── main.go
└── readme.md
```
- Config is loaded from `config.json`
- URLs are streamed from the input file
- URLs are pushed into the `url` channel
- Worker pool processes URLs concurrently
- Each worker:
  - sends an HTTP request
  - measures latency
  - classifies the response
- Results are sent to the `result` channel
- Main goroutine consumes results and logs output
```
+----------------------+
|    URL File Input    |
+----------+-----------+
           ↓
+----------------------+
|     URL Channel      |
+----------+-----------+
           ↓
+-------------------------+
|     Worker Pool (N)     |
|  ┌───────────────────┐  |
|  |    URL Checker    |  |
|  └───────────────────┘  |
+-----------+-------------+
            ↓
+----------------------+
|    Result Channel    |
+----------+-----------+
           ↓
+----------------------+
|    Logger (slog)     |
| Console + File Output|
+----------------------+
```
```bash
git clone https://github.com/rranand/URL-Checker.git
cd URL-Checker
go mod tidy
```

Example: `urls.txt`

```
https://google.com
https://github.com
https://amazon.com
https://invalid-url.test
```
Example: `config.json`

```json
{
  "timeout_duration": 5,
  "max_idle_conn": 90,
  "max_idle_conn_per_host": 10,
  "idle_conn_timeout_duration": 20,
  "no_of_workers": 10
}
```

```bash
go run main.go -filename=urls.txt -config=config.json
```

Logs are written to `output-urls.txt` and also printed to the console.
```
INFO health check worker_id=3 url=https://google.com status=HEALTHY latency=477ms
INFO health check worker_id=5 url=https://amazon.com status=HEALTHY latency=1.7s
INFO health check worker_id=2 url=https://invalid-url.test status=DOWN latency=120ms
```
- Fixed number of workers prevents overload
- Efficient parallel processing
- `urlChan` → distributes work
- `resChan` → collects results
- Stops all workers safely
- Triggered by:
  - error during ingestion
  - OS signals
- Listens for `SIGINT` (Ctrl+C) and `SIGTERM`
- Cancels context → stops workers → completes processing
- Ensures no data loss
- Avoids premature program exit
- Prevents race conditions between:
  - worker completion
  - result consumption
- Ensures connection reuse by draining the HTTP response body
- Handles network instability gracefully
Includes real-world scenarios:
- ✅ Valid URLs
- 🔁 Redirect chains
- ❌ Client errors (4xx)
- 💥 Server errors (5xx)
- ⏱ Slow responses / timeouts
- 🔒 SSL/TLS failures
- 🚫 Invalid domains
- Retry with exponential backoff
- Rate limiting
- Metrics (Prometheus / Grafana)
- CLI enhancements
- Distributed worker nodes
- JSON logging mode
MIT License