Official multi-platform upload SDK for DataNodes - a file hosting service.
Looking for pre-compiled binaries? Download ready-to-run executables from upload-binaries - no dependencies required!
Features:
- Multi-threaded chunked uploads for maximum speed (800+ MB/s)
- Automatic retry with exponential backoff
- Batch file and folder uploads
- Progress bar with speed and ETA
- Cross-platform support
| SDK | Speed | Notes |
|---|---|---|
| Go | ~885 MB/s | Single binary, no dependencies |
| Node.js | ~840 MB/s | Uses worker threads |
| Python | ~930 MB/s | ThreadPoolExecutor |
| Bash | ~900 MB/s | Uses curl + parallel |
| PHP | ~800 MB/s | curl_multi |
| PowerShell | ~750 MB/s | Runspace pools |
Speeds measured uploading a 20 GB file on a multi-gigabit connection (note: a 1 Gbps link caps out at roughly 125 MB/s, so these rates require faster links)
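The automatic retry with exponential backoff listed in the features can be sketched as a small wrapper; `with_retries` and its delay parameters are illustrative, not the SDKs' actual API:

```python
import time

def with_retries(fn, retries=3, base_delay=1.0):
    """Call fn(), retrying on failure with exponential backoff.

    The delay doubles after each failed attempt: 1s, 2s, 4s, ...
    (the retry count and delays are illustrative defaults).
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries:
                raise  # out of retries: propagate the last error
            time.sleep(base_delay * (2 ** attempt))

# Example: a flaky operation that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient network error")
    return "ok"

print(with_retries(flaky, retries=3, base_delay=0.001))  # → ok
```

Each SDK applies this pattern per chunk, so one bad chunk never forces a whole-file restart.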
Bash:
```bash
# Requirements: curl, jq, bc
./datanodes-upload.sh myfile.zip YOUR_API_KEY
```

Python:
```bash
# Requirements: Python 3.6+, requests
pip install requests
python3 datanodes-upload.py myfile.zip YOUR_API_KEY
```

Node.js:
```bash
# Requirements: Node.js 14+, axios
npm install axios
node datanodes-upload.js myfile.zip YOUR_API_KEY
```

Go:
```bash
# Requirements: Go 1.18+ (no dependencies)
go run datanodes-upload.go myfile.zip YOUR_API_KEY

# Or build a binary
go build -o datanodes-upload datanodes-upload.go
./datanodes-upload myfile.zip YOUR_API_KEY
```

PHP:
```bash
# Requirements: PHP 7.4+ with the curl extension
php datanodes-upload.php myfile.zip YOUR_API_KEY
```

PowerShell:
```powershell
# Requirements: PowerShell 5.1+ (Windows) or 7+ (cross-platform)
.\datanodes-upload.ps1 -Files myfile.zip -ApiKey YOUR_API_KEY
```

All SDKs support these options:
| Option | Description | Default |
|---|---|---|
| `-t, --threads` | Parallel upload threads | 10 |
| `-r, --retries` | Retries per chunk | 3 |
| `-f, --folder` | Destination folder ID | 0 (root) |
| `-d, --directory` | Upload entire directory | - |
| `-R, --recursive` | Include subdirectories | off |
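These shared options map naturally onto a standard argument parser. A hedged sketch of how a CLI front end might declare them (defaults taken from the table above; this is not the SDKs' actual parsing code):

```python
import argparse

def build_parser():
    # Flags and defaults mirror the options table; names are illustrative.
    p = argparse.ArgumentParser(description="DataNodes upload (sketch)")
    p.add_argument("-t", "--threads", type=int, default=10,
                   help="parallel upload threads")
    p.add_argument("-r", "--retries", type=int, default=3,
                   help="retries per chunk")
    p.add_argument("-f", "--folder", type=int, default=0,
                   help="destination folder ID (0 = root)")
    p.add_argument("-d", "--directory",
                   help="upload entire directory")
    p.add_argument("-R", "--recursive", action="store_true",
                   help="include subdirectories")
    p.add_argument("files", nargs="*", help="files to upload")
    return p

args = build_parser().parse_args(["-t", "20", "-r", "5", "-f", "123", "myfile.zip"])
print(args.threads, args.retries, args.folder, args.files)  # → 20 5 123 ['myfile.zip']
```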
```bash
# 20 threads, 5 retries, upload to folder ID 123
./datanodes-upload.sh -t 20 -r 5 -f 123 myfile.zip YOUR_API_KEY

# Upload multiple files
./datanodes-upload.sh file1.zip file2.zip file3.zip YOUR_API_KEY

# Upload a directory
./datanodes-upload.sh -d /path/to/folder YOUR_API_KEY

# Upload a directory, including subdirectories
./datanodes-upload.sh -d /path/to/folder -R YOUR_API_KEY
```

To get your API key:
- Log in to DataNodes
- Go to Account > API Keys
- Copy your API key
To upload to a specific folder:
- Navigate to the folder on DataNodes
- The folder ID is in the URL: `https://datanodes.to/files?fld_id=123`
- Use `-f 123` when uploading
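Extracting the folder ID from such a URL can be automated; a minimal sketch using only the standard library (the function name is illustrative):

```python
from urllib.parse import urlparse, parse_qs

def folder_id_from_url(url):
    """Pull the fld_id query parameter out of a DataNodes folder URL."""
    qs = parse_qs(urlparse(url).query)
    return int(qs["fld_id"][0]) if "fld_id" in qs else 0  # 0 = root

print(folder_id_from_url("https://datanodes.to/files?fld_id=123"))  # → 123
```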
Example output:

```
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ ✓ UPLOAD COMPLETE                                              ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛

Summary
─────────────────────────────────────────────────
File:         myfile.zip
Size:         19.53 GB
Upload Time:  22.6s
Avg Speed:    885.36 MB/s

Links
─────────────────────────────────────────────────
Download:  https://datanodes.to/abc123xyz/myfile.zip
Delete:    https://datanodes.to/abc123xyz/myfile.zip?killcode=xyz789
```
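As a sanity check, the reported average speed is consistent with the file size and elapsed time shown above (the small difference comes from rounding in the displayed values; 1 GB = 1024 MB here):

```python
size_gb = 19.53     # reported file size
elapsed_s = 22.6    # reported upload time
avg_mb_s = size_gb * 1024 / elapsed_s
print(f"{avg_mb_s:.2f} MB/s")
```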
Each SDK implements the same upload pipeline:
- Get Upload Server - Requests dedicated upload server from API
- Generate Session ID - Creates unique 16-digit session identifier
- Chunk File - Splits file into 1-100 MB chunks
- Parallel Upload - Uploads chunks simultaneously (10 threads default)
- Retry Failed - Automatically retries failed chunks with backoff
- Finalize - Imports chunks into final file on server
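The session and chunking steps can be sketched as pure helpers. The 16-digit session ID and the 1-100 MB chunk bounds come from the list above; the function names and the 10 MB default are illustrative:

```python
import random

def make_session_id():
    """Generate a 16-digit numeric session identifier."""
    return "".join(random.choice("0123456789") for _ in range(16))

def chunk_ranges(file_size, chunk_size=10 * 1024 * 1024):
    """Split a file of file_size bytes into (offset, length) chunks.

    The chunk size must fall within the 1-100 MB bounds the service accepts.
    """
    assert 1 * 1024 * 1024 <= chunk_size <= 100 * 1024 * 1024
    return [(off, min(chunk_size, file_size - off))
            for off in range(0, file_size, chunk_size)]

sid = make_session_id()
ranges = chunk_ranges(25 * 1024 * 1024)  # a 25 MB file in 10 MB chunks
print(len(sid), len(ranges), ranges[-1])  # → 16 3 (20971520, 5242880)
```

The (offset, length) pairs are what the parallel workers consume: each thread seeks to its offset and PUTs its slice independently.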
Get upload server:
```
GET https://datanodes.to/api/upload/server?key=API_KEY
```

Upload chunk:
```
PUT https://server.datanodes.to/upload/put_chunk_mt.cgi
Headers:
  X-Upload-SID: <session_id>
  X-Seek-To: <byte_offset>
Body: <chunk_data>
```

Finalize:
```
POST https://server.datanodes.to/upload/import_file
Body: fn=<filename>&fld_id=<folder>&st=OK&op=upload_result&sess_id=<session_id>
```
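The endpoint URLs, headers, and form fields above translate directly into client code. A hedged stdlib-only sketch (the function names are illustrative, and `put_chunk` is shown but not executed since it needs a live server):

```python
import urllib.parse
import urllib.request

def chunk_headers(sid, offset):
    # Header names come from the protocol description above.
    return {"X-Upload-SID": sid, "X-Seek-To": str(offset)}

def put_chunk(server, sid, offset, data):
    """PUT one chunk at the given byte offset (sketch of the chunk endpoint)."""
    req = urllib.request.Request(
        f"https://{server}/upload/put_chunk_mt.cgi",
        data=data, method="PUT", headers=chunk_headers(sid, offset))
    with urllib.request.urlopen(req) as resp:
        return resp.status

def finalize_body(sid, filename, folder_id=0):
    """Encode the finalize form exactly as the protocol describes."""
    return urllib.parse.urlencode({
        "fn": filename, "fld_id": folder_id, "st": "OK",
        "op": "upload_result", "sess_id": sid,
    })

print(chunk_headers("1234567890123456", 0))
print(finalize_body("1234567890123456", "myfile.zip"))
```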
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see LICENSE file.
- Issues: GitHub Issues
- Website: datanodes.to