A tokenizer tool built with React, TypeScript, and Vite. Analyze text across multiple AI providers with real-time token counting, cost calculation, and advanced visualization features.
Supports OpenAI, Anthropic, and Google AI tokenizers with cost estimation.
Check out: https://tokenizer.twinql.ai/
- Text Input: Paste text directly or upload multiple files simultaneously
- Real-time Metrics: Character count, word count, and token count
- Frequency Analysis: Top 10 most frequent words and tokens with interactive charts
- OpenAI: Real-time tokenization with models like GPT-4o, o1, o3, o4-mini
- Anthropic: API-based counting for Claude Sonnet 4, Claude Opus 4, Claude 3.5 models
- Google: Gemini 2.5 Flash, Gemini 2.0 Flash, and Gemini 2.5 Pro support
- Real-time Pricing: Input/output cost estimation for all supported models
- Multi-Currency: USD and INR support with live exchange rates
- Per-File Breakdown: Individual token and cost analysis for attached files
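The pricing features above boil down to multiplying token counts by per-model rates. A minimal sketch in TypeScript follows; the price table, type names, and function name are illustrative (real per-million-token prices change often and would live in an updatable config, not in code):

```typescript
// Hypothetical per-model pricing, expressed in USD per 1M tokens.
interface ModelPricing {
  inputPerMTok: number;  // USD per 1M input tokens
  outputPerMTok: number; // USD per 1M output tokens
}

// Example values only -- check each provider's pricing page for real rates.
const EXAMPLE_PRICING: Record<string, ModelPricing> = {
  "gpt-4o": { inputPerMTok: 2.5, outputPerMTok: 10 },
};

function estimateCostUSD(
  model: string,
  inputTokens: number,
  outputTokens: number,
  pricing: Record<string, ModelPricing> = EXAMPLE_PRICING
): number {
  const p = pricing[model];
  if (!p) throw new Error(`No pricing for model: ${model}`);
  return (
    (inputTokens / 1_000_000) * p.inputPerMTok +
    (outputTokens / 1_000_000) * p.outputPerMTok
  );
}
```

Multi-currency display would then be a single multiplication by a live USD-to-INR exchange rate on top of this USD figure.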
- Text Files: `.txt`, `.md`, `.csv`, `.html`, `.css`
- Code Files: `.js`, `.jsx`, `.ts`, `.tsx`, `.json`, `.py`, `.java`, `.c`, `.cpp`, `.go`, `.rs`
- Documents: `.docx`, `.pdf` with full text extraction
- Spreadsheets: `.xlsx`, `.xls` with data parsing
- Multi-file Upload: Attach and analyze multiple files with individual token tracking
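Dispatching an upload to the right extractor is essentially a lookup on the file extension. A sketch under the assumption that each category maps to a parser elsewhere in the app (the category names and function name are illustrative):

```typescript
// Map a filename to a broad handler category by extension.
type FileCategory = "text" | "code" | "document" | "spreadsheet" | "unsupported";

const CATEGORY_BY_EXT: Record<string, FileCategory> = {
  txt: "text", md: "text", csv: "text", html: "text", css: "text",
  js: "code", jsx: "code", ts: "code", tsx: "code", json: "code",
  py: "code", java: "code", c: "code", cpp: "code", go: "code", rs: "code",
  docx: "document", pdf: "document",   // need full text extraction
  xlsx: "spreadsheet", xls: "spreadsheet", // need data parsing
};

function categorize(filename: string): FileCategory {
  const ext = filename.toLowerCase().split(".").pop() ?? "";
  return CATEGORY_BY_EXT[ext] ?? "unsupported";
}
```

In the real app, "document" and "spreadsheet" categories would hand off to dedicated extraction libraries before token counting.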
- Token Visualization: Color-coded token display with virtualization for large texts
- Interactive Charts: Word and token frequency analysis with responsive charts
- Dark/Light Mode: Toggle between themes
- Token/ID Toggle: View actual tokens or their numeric IDs (OpenAI models)
- Keyboard Shortcuts: `Ctrl/Cmd + Enter` for quick token calculation
- Local Storage: API keys and preferences saved locally
- Responsive Design: Works across desktop and mobile devices
- Real-time Updates: Instant recalculation for OpenAI models
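The `Ctrl/Cmd + Enter` shortcut from the feature list reduces to a small predicate. A sketch written against a minimal structural type so it works with both the DOM `KeyboardEvent` and React's synthetic event (the function name is illustrative):

```typescript
// Minimal shape shared by DOM and React keyboard events.
interface KeyCombo {
  key: string;
  ctrlKey: boolean; // Ctrl on Windows/Linux
  metaKey: boolean; // Cmd on macOS
}

function isCalculateShortcut(e: KeyCombo): boolean {
  return e.key === "Enter" && (e.ctrlKey || e.metaKey);
}

// In a React component this might be wired as:
// <textarea onKeyDown={(e) => { if (isCalculateShortcut(e)) calculateTokens(); }} />
```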
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd tokenizer
   ```

2. Install dependencies:

   ```bash
   npm install
   # or
   yarn install
   ```

3. Set up API keys:
   - Anthropic: obtain an API key from the Anthropic Console
   - Google: obtain an API key from Google AI Studio
   - API keys are stored locally in your browser for security

4. Run the development server:

   ```bash
   npm run dev
   # or
   yarn dev
   ```

5. Open your browser and navigate to the local development URL (usually http://localhost:5173).
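For Anthropic models, token counting goes through the provider's API rather than a local tokenizer. A sketch of building that request follows; the endpoint path, headers, and body shape reflect Anthropic's public Messages API documentation at the time of writing, so treat them as assumptions and verify against the current docs:

```typescript
// Assemble (but don't send) a request to Anthropic's token-counting
// endpoint. Returning plain data keeps the builder pure and testable.
interface CountTokensRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildCountTokensRequest(
  apiKey: string,
  model: string,
  text: string
): CountTokensRequest {
  return {
    url: "https://api.anthropic.com/v1/messages/count_tokens",
    method: "POST",
    headers: {
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: text }],
    }),
  };
}

// Usage sketch:
// const req = buildCountTokensRequest(key, "claude-sonnet-4-20250514", text);
// const res = await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
// The JSON response includes an input_tokens count.
```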
No data leaves your browser.
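Keeping API keys and preferences in the browser means persisting them through `window.localStorage`. A sketch with the storage injected behind a tiny interface so the same code runs against `localStorage` in the browser and a mock in tests (the key name and `Preferences` fields are illustrative):

```typescript
// Minimal key-value interface satisfied by window.localStorage.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Illustrative preference shape; the real app stores more fields.
interface Preferences {
  theme: "light" | "dark";
  currency: "USD" | "INR";
}

const PREFS_KEY = "tokenizer.preferences"; // hypothetical storage key

function savePreferences(store: KVStore, prefs: Preferences): void {
  store.setItem(PREFS_KEY, JSON.stringify(prefs));
}

function loadPreferences(store: KVStore): Preferences | null {
  const raw = store.getItem(PREFS_KEY);
  return raw ? (JSON.parse(raw) as Preferences) : null;
}

// In the browser: savePreferences(window.localStorage, { theme: "dark", currency: "USD" });
```

Injecting the store also makes it easy to swap in `sessionStorage` or an in-memory fallback when storage is unavailable.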
Contributions are heartily welcome! Please feel free to submit issues or pull requests.
- Thanks to @niieani for his amazing library: https://github.com/niieani/gpt-tokenizer
- Better and more organized UI
- Add Claude tokenizer support
- Input/output pricing for each model
- Multi-file uploads
- Image token counting (from upload and screenshot)
- Better text analysis and recommendations to reduce token usage
- Automatic token minification