diff --git a/.gitignore b/.gitignore index 3df5d7033..99921d3aa 100644 --- a/.gitignore +++ b/.gitignore @@ -44,3 +44,6 @@ plugins/* # Binary files and backups bin/pixlet/ config/backups/ + +# Starlark apps runtime storage (installed .star files and cached renders) +/starlark-apps/ diff --git a/docs/STARLARK_APPS_GUIDE.md b/docs/STARLARK_APPS_GUIDE.md new file mode 100644 index 000000000..d4a85a717 --- /dev/null +++ b/docs/STARLARK_APPS_GUIDE.md @@ -0,0 +1,500 @@ +# Starlark Apps Guide + +## Overview + +The Starlark Apps plugin for LEDMatrix enables you to run **Tidbyt/Tronbyte community apps** on your LED matrix display without modification. This integration allows you to access hundreds of pre-built widgets and apps from the vibrant Tidbyt community ecosystem. + +## Important: Third-Party Content + +**⚠️ Apps are NOT managed by the LEDMatrix project** + +- Starlark apps are developed and maintained by the **Tidbyt/Tronbyte community** +- LEDMatrix provides the runtime environment but does **not** create, maintain, or support these apps +- All apps originate from the [Tronbyte Apps Repository](https://github.com/tronbyt/apps) +- App quality, functionality, and security are the responsibility of individual app authors +- LEDMatrix is not affiliated with Tidbyt Inc. or the Tronbyte project + +## What is Starlark? + +[Starlark](https://github.com/bazelbuild/starlark) is a Python-like language originally developed by Google for the Bazel build system. 
Tidbyt adopted Starlark for building LED display apps because it's: + +- **Sandboxed**: Apps run in a safe, restricted environment +- **Simple**: Python-like syntax that's easy to learn +- **Deterministic**: Apps produce consistent output +- **Fast**: Compiled and optimized for performance + +## How It Works + +### Architecture + +```text +┌─────────────────────────────────────────────────────────┐ +│ LEDMatrix System │ +│ ┌────────────────────────────────────────────────────┐ │ +│ │ Starlark Apps Plugin (manager.py) │ │ +│ │ • Manages app lifecycle (install/uninstall) │ │ +│ │ • Handles app configuration │ │ +│ │ • Schedules app rendering │ │ +│ └─────────────────┬──────────────────────────────────┘ │ +│ │ │ +│ ┌─────────────────▼──────────────────────────────────┐ │ +│ │ Pixlet Renderer (pixlet_renderer.py) │ │ +│ │ • Executes .star files using Pixlet CLI │ │ +│ │ • Extracts configuration schemas │ │ +│ │ • Outputs WebP animations │ │ +│ └─────────────────┬──────────────────────────────────┘ │ +│ │ │ +│ ┌─────────────────▼──────────────────────────────────┐ │ +│ │ Frame Extractor (frame_extractor.py) │ │ +│ │ • Decodes WebP animations into frames │ │ +│ │ • Scales/centers output for display size │ │ +│ │ • Manages frame timing │ │ +│ └─────────────────┬──────────────────────────────────┘ │ +│ │ │ +│ ┌─────────────────▼──────────────────────────────────┐ │ +│ │ LED Matrix Display │ │ +│ │ • Renders final output to physical display │ │ +│ └────────────────────────────────────────────────────┘ │ +└─────────────────────────────────────────────────────────┘ + + ▲ + │ + Downloads apps from + │ +┌───────────────────┴─────────────────────────────────────┐ +│ Tronbyte Apps Repository (GitHub) │ +│ • 974+ community-built apps │ +│ • Weather, sports, stocks, games, clocks, etc. │ +│ • https://github.com/tronbyt/apps │ +└──────────────────────────────────────────────────────────┘ +``` + +### Rendering Pipeline + +1. 
**User installs app** from the Tronbyte repository via web UI +2. **Plugin downloads** the `.star` file (and any assets like images/fonts) +3. **Schema extraction** parses configuration options from the `.star` source +4. **User configures** the app through the web UI (timezone, location, API keys, etc.) +5. **Pixlet renders** the app with user config → produces WebP animation +6. **Frame extraction** decodes WebP → individual PIL Image frames +7. **Display scaling** adapts 64x32 Tidbyt output to your matrix size +8. **Rotation** cycles through your installed apps based on schedule + +## Getting Started + +### 1. Install Pixlet + +Pixlet is the rendering engine that executes Starlark apps. The plugin will attempt to use: + +1. **Bundled binary** (recommended): Downloaded to `bin/pixlet/pixlet-{platform}-{arch}` +2. **System installation**: If `pixlet` is available in your PATH + +#### Auto-Install via Web UI + +Navigate to: **Plugins → Starlark Apps → Status → Install Pixlet** + +This runs the bundled installation script which downloads the appropriate binary for your platform. + +#### Manual Installation + +```bash +cd /path/to/LEDMatrix +bash scripts/download_pixlet.sh +``` + +Verify installation: +```bash +./bin/pixlet/pixlet-linux-amd64 version +# Pixlet 0.50.2 (or later) +``` + +### 2. Enable the Starlark Apps Plugin + +1. Open the web UI +2. Navigate to **Plugins** +3. Find **Starlark Apps** in the installed plugins list +4. Enable the plugin +5. Configure settings: + - **Magnify**: Auto-calculated based on your display size (or set manually) + - **Render Interval**: How often apps re-render (default: 300s) + - **Display Duration**: How long each app shows (default: 15s) + - **Cache Output**: Enable to reduce re-rendering (recommended) + +### 3. Browse and Install Apps + +1. Navigate to **Plugins → Starlark Apps → App Store** +2. Browse available apps (974+ options) +3. Filter by category: Weather, Sports, Finance, Games, Clocks, etc. +4. 
Click **Install** on desired apps +5. Configure each app: + - Set location/timezone + - Enter API keys if required + - Customize display preferences + +### 4. Configure Apps + +Each app may have different configuration options: + +#### Common Configuration Types + +- **Location** (lat/lng/timezone): For weather, clocks, transit +- **API Keys**: For services like weather, stocks, sports scores +- **Display Preferences**: Colors, units, layouts +- **Dropdown Options**: Team selections, language, themes +- **Toggles**: Enable/disable features + +Configuration is stored in `starlark-apps/{app-id}/config.json` and persists across app updates. + +## App Sources and Categories + +All apps are sourced from the [Tronbyte Apps Repository](https://github.com/tronbyt/apps). Popular categories include: + +### 🌤️ Weather +- Analog Clock (with weather) +- Current Weather +- Weather Forecast +- Air Quality Index + +### 🏈 Sports +- NFL Scores +- NBA Scores +- MLB Scores +- NHL Scores +- Soccer/Football Scores +- Formula 1 Results + +### 💰 Finance +- Stock Tickers +- Cryptocurrency Prices +- Market Indices + +### 🎮 Games & Fun +- Conway's Game of Life +- Pong +- Nyan Cat +- Retro Animations + +### 🕐 Clocks +- Analog Clock +- Fuzzy Clock +- Binary Clock +- Word Clock + +### 📰 Information +- News Headlines +- RSS Feeds +- GitHub Activity +- Reddit Feed + +### 🚌 Transit & Travel +- Transit Arrivals +- Flight Tracker +- Train Schedules + +## Display Size Compatibility + +Tronbyte/Tidbyt apps are designed for **64×32 displays**. 
LEDMatrix automatically adapts content for different display sizes: + +### Magnification + +The plugin calculates optimal magnification based on your display: + +```text +magnify = floor(min(display_width / 64, display_height / 32)) +``` + +Examples: +- **64×32**: magnify = 1 (native, pixel-perfect) +- **128×64**: magnify = 2 (2x scaling, crisp) +- **192×64**: magnify = 2 (2x + horizontal centering) +- **256×64**: magnify = 2 (2x + centering) + +### Scaling Modes + +**Config → Starlark Apps → Scale Method:** +- `nearest` (default): Sharp pixels, retro look +- `bilinear`: Smooth scaling, slight blur +- `bicubic`: Higher quality smooth scaling +- `lanczos`: Best quality, most processing + +**Center vs Scale:** +- `scale_output=true`: Stretch to fill display (may distort aspect ratio) +- `center_small_output=true`: Center output without stretching (preserves aspect ratio) + +## Configuration Schema Extraction + +LEDMatrix automatically extracts configuration schemas from Starlark apps by parsing the `get_schema()` function in the `.star` source code. + +### Supported Field Types + +| Starlark Type | Web UI Rendering | +|--------------|------------------| +| `schema.Location` | Lat/Lng/Timezone picker | +| `schema.Text` | Text input field | +| `schema.Toggle` | Checkbox/switch | +| `schema.Dropdown` | Select dropdown | +| `schema.Color` | Color picker | +| `schema.DateTime` | Date/time picker | +| `schema.OAuth2` | Warning message (not supported) | +| `schema.PhotoSelect` | Warning message (not supported) | +| `schema.LocationBased` | Text fallback with note | +| `schema.Typeahead` | Text fallback with note | + +### Schema Coverage + +- **90-95%** of apps: Full schema support +- **5%**: Partial extraction (complex/dynamic schemas) +- **<1%**: No schema (apps without configuration) + +Apps without extracted schemas can still run with default settings. 
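When a schema *is* extracted, its defaults seed the app's configuration before rendering — any field with a `default` that the user hasn't set gets filled in, which is also how existing apps pick up new fields after a schema update. A minimal sketch of that merge (the `fields` layout mirrors the extracted `schema.json`; the field names here are hypothetical examples):

```python
def merge_schema_defaults(config, schema):
    """Return a copy of config with missing fields filled from schema defaults."""
    merged = dict(config)
    for field in schema.get("fields", []):
        field_id = field.get("id")
        # Only fill fields that declare a default and aren't already configured
        if field_id and "default" in field and field_id not in merged:
            merged[field_id] = field["default"]
    return merged

schema = {"fields": [
    {"id": "units", "type": "dropdown", "default": "imperial"},
    {"id": "show_seconds", "type": "toggle", "default": False},
]}
user_config = {"units": "metric"}  # the user already chose units

print(merge_schema_defaults(user_config, schema))
# -> {'units': 'metric', 'show_seconds': False}
```

User choices always win: only fields absent from `config.json` fall back to schema defaults, so reconfiguring an app never silently reverts a value.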
+ +## File Structure + +```text +LEDMatrix/ +├── plugin-repos/starlark-apps/ # Plugin source code +│ ├── manager.py # Main plugin logic +│ ├── pixlet_renderer.py # Pixlet CLI wrapper +│ ├── frame_extractor.py # WebP decoder +│ ├── tronbyte_repository.py # GitHub API client +│ └── requirements.txt # Python dependencies +│ +├── starlark-apps/ # Installed apps (user data) +│ ├── manifest.json # App registry +│ │ +│ └── analogclock/ # Example app +│ ├── analogclock.star # Starlark source +│ ├── config.json # User configuration +│ ├── schema.json # Extracted schema +│ ├── cached_render.webp # Rendered output cache +│ └── images/ # App assets (if any) +│ ├── hour_hand.png +│ └── minute_hand.png +│ +├── bin/pixlet/ # Pixlet binaries +│ ├── pixlet-linux-amd64 +│ ├── pixlet-linux-arm64 +│ └── pixlet-darwin-arm64 +│ +└── scripts/ + └── download_pixlet.sh # Pixlet installer +``` + +## API Keys and External Services + +Many apps require API keys for external services: + +### Common API Services + +- **Weather**: OpenWeatherMap, Weather.gov, Dark Sky +- **Sports**: ESPN, The Sports DB, SportsData.io +- **Finance**: Alpha Vantage, CoinGecko, Yahoo Finance +- **Transit**: TransitLand, NextBus, local transit APIs +- **News**: NewsAPI, Reddit, RSS feeds + +### Security Note + +- API keys are stored in `config.json` files on disk +- The LEDMatrix web interface does NOT encrypt API keys +- Ensure your Raspberry Pi is on a trusted network +- Use read-only or limited-scope API keys when possible +- **Never commit `starlark-apps/*/config.json` to version control** + +## Troubleshooting + +### Pixlet Not Found + +**Symptom**: "Pixlet binary not found" error + +**Solutions**: +1. Run auto-installer: **Plugins → Starlark Apps → Install Pixlet** +2. Manual install: `bash scripts/download_pixlet.sh` +3. Check permissions: `chmod +x bin/pixlet/pixlet-*` +4. 
Verify architecture: `uname -m` matches binary name + +### App Fails to Render + +**Symptom**: "Rendering failed" error in logs + +**Solutions**: +1. Check logs: `journalctl -u ledmatrix | grep -i pixlet` +2. Verify config: Ensure all required fields are filled +3. Test manually: `./bin/pixlet/pixlet-linux-amd64 render starlark-apps/{app-id}/{app-id}.star` +4. Missing assets: Some apps need images/fonts that may fail to download +5. API issues: Check API keys and rate limits + +### Schema Not Extracted + +**Symptom**: App installs but shows no configuration options + +**Solutions**: +1. App may not have a `get_schema()` function (normal for some apps) +2. Schema extraction failed: Check logs for parse errors +3. Manual config: Edit `starlark-apps/{app-id}/config.json` directly +4. Report issue: File bug with app details at LEDMatrix GitHub + +### Apps Show Distorted/Wrong Size + +**Symptom**: Content appears stretched, squished, or cropped + +**Solutions**: +1. Check magnify setting: **Plugins → Starlark Apps → Config** +2. Try `center_small_output=true` to preserve aspect ratio +3. Adjust `magnify` manually (1-8) for your display size +4. Some apps assume 64×32 - may not scale perfectly to all sizes + +### App Shows Outdated Data + +**Symptom**: Weather, sports scores, etc. don't update + +**Solutions**: +1. Check render interval: **App Config → Render Interval** (300s default) +2. Force re-render: **Plugins → Starlark Apps → {App} → Render Now** +3. Clear cache: Restart LEDMatrix service +4. API rate limits: Some services throttle requests +5. 
Check app logs for API errors + +## Performance Considerations + +### Render Intervals + +- Apps re-render on a schedule (default: 300s = 5 minutes) +- Lower intervals = more CPU/API usage +- Recommended minimums: + - Static content (clocks): 30-60s + - Weather: 300s (5min) + - Sports scores: 60-120s + - Stock tickers: 60s + +### Caching + +Enable caching to reduce CPU load: +- `cache_rendered_output=true` (recommended) +- `cache_ttl=300` (5 minutes) + +Cached WebP files are stored in `starlark-apps/{app-id}/cached_render.webp` + +### Display Rotation + +Balance number of enabled apps with display duration: +- 5 apps × 15s = 75s full cycle +- 20 apps × 15s = 300s (5 min) cycle + +Long cycles may cause apps to render before being displayed. + +## Limitations + +### Unsupported Features + +- **OAuth2 Authentication**: Apps requiring OAuth login won't work +- **PhotoSelect**: Image upload from mobile device not supported +- **Push Notifications**: Apps can't receive real-time events +- **Background Jobs**: No persistent background tasks + +### API Rate Limits + +Many apps use free API tiers with rate limits: +- Rendering too frequently may exceed limits +- Use appropriate `render_interval` settings +- Consider paid API tiers for heavy usage + +### Display Size Constraints + +Apps designed for 64×32 may not utilize larger displays fully: +- Content may appear small on 128×64+ displays +- Magnification helps but doesn't add detail +- Some apps hard-code 64×32 dimensions + +## Advanced Usage + +### Manual App Installation + +Upload custom `.star` files: +1. Navigate to **Starlark Apps → Upload** +2. Select `.star` file from disk +3. Configure app ID and metadata +4. Set render/display timing + +### Custom App Development + +While LEDMatrix runs Tronbyte apps, you can also create your own: + +1. **Learn Starlark**: [Tidbyt Developer Docs](https://tidbyt.dev/) +2. **Write `.star` file**: Use Pixlet APIs for rendering +3. **Test locally**: `pixlet render myapp.star` +4. 
**Upload**: Use LEDMatrix web UI to install +5. **Share**: Contribute to [Tronbyte Apps](https://github.com/tronbyt/apps) repo + +### Configuration Reference + +**Plugin Config** (`config/config.json` → `plugins.starlark-apps`): + +```json +{ + "enabled": true, + "magnify": 0, // 0 = auto, 1-8 = manual + "render_timeout": 30, // Max seconds for Pixlet render + "cache_rendered_output": true, // Cache WebP files + "cache_ttl": 300, // Cache duration (seconds) + "scale_output": true, // Scale to display size + "scale_method": "nearest", // nearest|bilinear|bicubic|lanczos + "center_small_output": false, // Center instead of scale + "default_frame_delay": 50, // Frame timing (ms) + "max_frames": null, // Limit frames (null = unlimited) + "auto_refresh_apps": true // Auto re-render on interval +} +``` + +**App Config** (`starlark-apps/{app-id}/config.json`): + +```json +{ + "location": "{\"lat\":\"40.7128\",\"lng\":\"-74.0060\",\"timezone\":\"America/New_York\"}", + "units": "imperial", + "api_key": "your-api-key-here", + "render_interval": 300, // App-specific override + "display_duration": 15 // App-specific override +} +``` + +## Resources + +### Official Documentation + +- **Tidbyt Developer Docs**: https://tidbyt.dev/ +- **Starlark Language**: https://github.com/bazelbuild/starlark +- **Pixlet Repository**: https://github.com/tidbyt/pixlet +- **Tronbyte Apps**: https://github.com/tronbyt/apps + +### LEDMatrix Documentation + +- [Plugin Development Guide](PLUGIN_DEVELOPMENT_GUIDE.md) +- [REST API Reference](REST_API_REFERENCE.md) +- [Troubleshooting Guide](TROUBLESHOOTING.md) + +### Community + +- **Tidbyt Community**: https://discuss.tidbyt.com/ +- **Tronbyte Apps Issues**: https://github.com/tronbyt/apps/issues +- **LEDMatrix Issues**: https://github.com/ChuckBuilds/LEDMatrix/issues + +## License and Legal + +- **LEDMatrix**: MIT License (see project root) +- **Starlark Apps Plugin**: MIT License (part of LEDMatrix) +- **Pixlet**: Apache 2.0 License (Tidbyt Inc.) 
+- **Tronbyte Apps**: Various licenses (see individual app headers) +- **Starlark Language**: Apache 2.0 License (Google/Bazel) + +**Disclaimer**: LEDMatrix is an independent project and is not affiliated with, endorsed by, or sponsored by Tidbyt Inc. The Starlark Apps plugin enables interoperability with Tidbyt's open-source ecosystem but does not imply any official relationship. + +## Support + +For issues with: +- **LEDMatrix integration**: File issues at [LEDMatrix GitHub](https://github.com/ChuckBuilds/LEDMatrix/issues) +- **Specific apps**: File issues at [Tronbyte Apps](https://github.com/tronbyt/apps/issues) +- **Pixlet rendering**: File issues at [Pixlet Repository](https://github.com/tidbyt/pixlet/issues) + +--- + +**Ready to get started?** Install the Starlark Apps plugin and explore 974+ community apps! 🎨 diff --git a/plugin-repos/starlark-apps/__init__.py b/plugin-repos/starlark-apps/__init__.py new file mode 100644 index 000000000..1d5dabca4 --- /dev/null +++ b/plugin-repos/starlark-apps/__init__.py @@ -0,0 +1,7 @@ +""" +Starlark Apps Plugin Package + +Seamlessly import and manage Starlark (.star) widgets from the Tronbyte/Tidbyt community. 
+""" + +__version__ = "1.0.0" diff --git a/plugin-repos/starlark-apps/config_schema.json b/plugin-repos/starlark-apps/config_schema.json new file mode 100644 index 000000000..e493204f2 --- /dev/null +++ b/plugin-repos/starlark-apps/config_schema.json @@ -0,0 +1,100 @@ +{ + "$schema": "http://json-schema.org/draft-07/schema#", + "type": "object", + "title": "Starlark Apps Plugin Configuration", + "description": "Configuration for managing Starlark (.star) apps", + "properties": { + "enabled": { + "type": "boolean", + "description": "Enable or disable the Starlark apps system", + "default": true + }, + "pixlet_path": { + "type": "string", + "description": "Path to Pixlet binary (auto-detected if empty)", + "default": "" + }, + "render_timeout": { + "type": "number", + "description": "Maximum time in seconds for rendering a .star app", + "default": 30, + "minimum": 5, + "maximum": 120 + }, + "cache_rendered_output": { + "type": "boolean", + "description": "Cache rendered WebP output to reduce CPU usage", + "default": true + }, + "cache_ttl": { + "type": "number", + "description": "Cache time-to-live in seconds", + "default": 300, + "minimum": 60, + "maximum": 3600 + }, + "default_frame_delay": { + "type": "number", + "description": "Default delay between frames in milliseconds (if not specified by app)", + "default": 50, + "minimum": 16, + "maximum": 1000 + }, + "scale_output": { + "type": "boolean", + "description": "Scale app output to match display dimensions", + "default": true + }, + "scale_method": { + "type": "string", + "enum": ["nearest", "bilinear", "bicubic", "lanczos"], + "description": "Scaling algorithm (nearest=pixel-perfect, lanczos=smoothest)", + "default": "nearest" + }, + "magnify": { + "type": "integer", + "description": "Pixlet magnification factor (0=auto, 1=64x32, 2=128x64, 3=192x96, etc.)", + "default": 0, + "minimum": 0, + "maximum": 8 + }, + "center_small_output": { + "type": "boolean", + "description": "Center small apps on large displays 
instead of stretching", + "default": false + }, + "background_render": { + "type": "boolean", + "description": "Render apps in background to avoid display delays", + "default": true + }, + "auto_refresh_apps": { + "type": "boolean", + "description": "Automatically refresh apps at their specified intervals", + "default": true + }, + "transition": { + "type": "object", + "description": "Transition settings for app display", + "properties": { + "type": { + "type": "string", + "enum": ["redraw", "fade", "slide", "wipe"], + "default": "fade" + }, + "speed": { + "type": "integer", + "description": "Transition speed (1-10)", + "default": 3, + "minimum": 1, + "maximum": 10 + }, + "enabled": { + "type": "boolean", + "default": true + } + } + } + }, + "additionalProperties": false +} diff --git a/plugin-repos/starlark-apps/frame_extractor.py b/plugin-repos/starlark-apps/frame_extractor.py new file mode 100644 index 000000000..bb52d9e18 --- /dev/null +++ b/plugin-repos/starlark-apps/frame_extractor.py @@ -0,0 +1,285 @@ +""" +Frame Extractor Module for Starlark Apps + +Extracts individual frames from WebP animations produced by Pixlet. +Handles both static images and animated WebP files. +""" + +import logging +from typing import List, Tuple, Optional +from PIL import Image + +logger = logging.getLogger(__name__) + + +class FrameExtractor: + """ + Extracts frames from WebP animations. + + Handles: + - Static WebP images (single frame) + - Animated WebP files (multiple frames with delays) + - Frame timing and duration extraction + """ + + def __init__(self, default_frame_delay: int = 50): + """ + Initialize frame extractor. + + Args: + default_frame_delay: Default delay in milliseconds if not specified + """ + self.default_frame_delay = default_frame_delay + + def load_webp(self, webp_path: str) -> Tuple[bool, Optional[List[Tuple[Image.Image, int]]], Optional[str]]: + """ + Load WebP file and extract all frames with their delays. 
+ + Args: + webp_path: Path to WebP file + + Returns: + Tuple of: + - success: bool + - frames: List of (PIL.Image, delay_ms) tuples, or None on failure + - error: Error message, or None on success + """ + try: + with Image.open(webp_path) as img: + # Check if animated + is_animated = getattr(img, "is_animated", False) + + if not is_animated: + # Static image - single frame + # Convert to RGB (LED matrix needs RGB) to match animated branch format + logger.debug(f"Loaded static WebP: {webp_path}") + rgb_img = img.convert("RGB") + return True, [(rgb_img.copy(), self.default_frame_delay)], None + + # Animated WebP - extract all frames + frames = [] + frame_count = getattr(img, "n_frames", 1) + + logger.debug(f"Extracting {frame_count} frames from animated WebP: {webp_path}") + + for frame_index in range(frame_count): + try: + img.seek(frame_index) + + # Get frame duration (in milliseconds) + # WebP stores duration in milliseconds + duration = img.info.get("duration", self.default_frame_delay) + + # Ensure minimum frame delay (prevent too-fast animations) + if duration < 16: # Less than ~60fps + duration = 16 + + # Convert frame to RGB (LED matrix needs RGB) + frame = img.convert("RGB") + frames.append((frame.copy(), duration)) + + except EOFError: + logger.warning(f"Reached end of frames at index {frame_index}") + break + except Exception as e: + logger.warning(f"Error extracting frame {frame_index}: {e}") + continue + + if not frames: + error = "No frames extracted from WebP" + logger.error(error) + return False, None, error + + logger.debug(f"Successfully extracted {len(frames)} frames") + return True, frames, None + + except FileNotFoundError: + error = f"WebP file not found: {webp_path}" + logger.error(error) + return False, None, error + except Exception as e: + error = f"Error loading WebP: {e}" + logger.error(error) + return False, None, error + + def scale_frames( + self, + frames: List[Tuple[Image.Image, int]], + target_width: int, + target_height: int, + 
method: Image.Resampling = Image.Resampling.NEAREST + ) -> List[Tuple[Image.Image, int]]: + """ + Scale all frames to target dimensions. + + Args: + frames: List of (image, delay) tuples + target_width: Target width in pixels + target_height: Target height in pixels + method: Resampling method (default: NEAREST for pixel-perfect scaling) + + Returns: + List of scaled (image, delay) tuples + """ + scaled_frames = [] + + for frame, delay in frames: + try: + # Only scale if dimensions don't match + if frame.width != target_width or frame.height != target_height: + scaled_frame = frame.resize( + (target_width, target_height), + resample=method + ) + scaled_frames.append((scaled_frame, delay)) + else: + scaled_frames.append((frame, delay)) + except Exception as e: + logger.warning(f"Error scaling frame: {e}") + # Keep original frame on error + scaled_frames.append((frame, delay)) + + logger.debug(f"Scaled {len(scaled_frames)} frames to {target_width}x{target_height}") + return scaled_frames + + def center_frames( + self, + frames: List[Tuple[Image.Image, int]], + target_width: int, + target_height: int, + background_color: tuple = (0, 0, 0) + ) -> List[Tuple[Image.Image, int]]: + """ + Center frames on a larger canvas instead of scaling. + Useful for displaying small widgets on large displays without distortion. 
+ + Args: + frames: List of (image, delay) tuples + target_width: Target canvas width + target_height: Target canvas height + background_color: RGB tuple for background (default: black) + + Returns: + List of centered (image, delay) tuples + """ + centered_frames = [] + + for frame, delay in frames: + try: + # If frame is already the right size, no centering needed + if frame.width == target_width and frame.height == target_height: + centered_frames.append((frame, delay)) + continue + + # Create black canvas at target size + canvas = Image.new('RGB', (target_width, target_height), background_color) + + # Calculate position to center the frame + x_offset = (target_width - frame.width) // 2 + y_offset = (target_height - frame.height) // 2 + + # Paste frame onto canvas + canvas.paste(frame, (x_offset, y_offset)) + centered_frames.append((canvas, delay)) + + except Exception as e: + logger.warning(f"Error centering frame: {e}") + # Keep original frame on error + centered_frames.append((frame, delay)) + + logger.debug(f"Centered {len(centered_frames)} frames on {target_width}x{target_height} canvas") + return centered_frames + + def get_total_duration(self, frames: List[Tuple[Image.Image, int]]) -> int: + """ + Calculate total animation duration in milliseconds. + + Args: + frames: List of (image, delay) tuples + + Returns: + Total duration in milliseconds + """ + return sum(delay for _, delay in frames) + + def optimize_frames( + self, + frames: List[Tuple[Image.Image, int]], + max_frames: Optional[int] = None, + target_duration: Optional[int] = None + ) -> List[Tuple[Image.Image, int]]: + """ + Optimize frame list by reducing frame count or adjusting timing. 
+ + Args: + frames: List of (image, delay) tuples + max_frames: Maximum number of frames to keep + target_duration: Target total duration in milliseconds + + Returns: + Optimized list of (image, delay) tuples + """ + if not frames: + return frames + + optimized = frames.copy() + + # Limit frame count if specified + if max_frames is not None and max_frames > 0 and len(optimized) > max_frames: + # Sample frames evenly + step = len(optimized) / max_frames + indices = [int(i * step) for i in range(max_frames)] + optimized = [optimized[i] for i in indices] + logger.debug(f"Reduced frames from {len(frames)} to {len(optimized)}") + + # Adjust timing to match target duration + if target_duration: + current_duration = self.get_total_duration(optimized) + if current_duration > 0: + scale_factor = target_duration / current_duration + optimized = [ + (frame, max(16, int(delay * scale_factor))) + for frame, delay in optimized + ] + logger.debug(f"Adjusted timing: {current_duration}ms -> {target_duration}ms") + + return optimized + + def frames_to_gif_data(self, frames: List[Tuple[Image.Image, int]]) -> Optional[bytes]: + """ + Convert frames to GIF byte data for caching or transmission. 
+ + Args: + frames: List of (image, delay) tuples + + Returns: + GIF bytes, or None on error + """ + if not frames: + return None + + try: + from io import BytesIO + + output = BytesIO() + + # Prepare frames for PIL + images = [frame for frame, _ in frames] + durations = [delay for _, delay in frames] + + # Save as GIF + images[0].save( + output, + format="GIF", + save_all=True, + append_images=images[1:], + duration=durations, + loop=0, # Infinite loop + optimize=False # Skip optimization for speed + ) + + return output.getvalue() + + except Exception as e: + logger.error(f"Error converting frames to GIF: {e}") + return None diff --git a/plugin-repos/starlark-apps/manager.py b/plugin-repos/starlark-apps/manager.py new file mode 100644 index 000000000..3da8c24d5 --- /dev/null +++ b/plugin-repos/starlark-apps/manager.py @@ -0,0 +1,1057 @@ +""" +Starlark Apps Plugin for LEDMatrix + +Manages and displays Starlark (.star) apps from Tronbyte/Tidbyt community. +Provides seamless widget import without modification. + +API Version: 1.0.0 +""" + +import json +import os +import re +import time +import fcntl +from pathlib import Path +from typing import Dict, Any, Optional, List, Tuple +from PIL import Image + +from src.plugin_system.base_plugin import BasePlugin, VegasDisplayMode +from src.logging_config import get_logger +from pixlet_renderer import PixletRenderer +from frame_extractor import FrameExtractor + +logger = get_logger(__name__) + + +class StarlarkApp: + """Represents a single installed Starlark app.""" + + def __init__(self, app_id: str, app_dir: Path, manifest: Dict[str, Any]): + """ + Initialize a Starlark app instance. 
+ + Args: + app_id: Unique identifier for this app + app_dir: Directory containing the app files + manifest: App metadata from manifest + """ + self.app_id = app_id + self.app_dir = app_dir + self.manifest = manifest + self.star_file = app_dir / manifest.get("star_file", f"{app_id}.star") + self.config_file = app_dir / "config.json" + self.schema_file = app_dir / "schema.json" + self.cache_file = app_dir / "cached_render.webp" + + # Load app configuration and schema + self.config = self._load_config() + self.schema = self._load_schema() + + # Merge schema defaults into config for any missing fields + self._merge_schema_defaults() + + # Runtime state + self.frames: Optional[List[Tuple[Image.Image, int]]] = None + self.current_frame_index = 0 + self.last_frame_time = 0 + self.last_render_time = 0 + + def _load_config(self) -> Dict[str, Any]: + """Load app configuration from config.json.""" + if self.config_file.exists(): + try: + with open(self.config_file, 'r') as f: + return json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"Could not load config for {self.app_id}: {e}") + return {} + + def _load_schema(self) -> Optional[Dict[str, Any]]: + """Load app schema from schema.json.""" + if self.schema_file.exists(): + try: + with open(self.schema_file, 'r') as f: + return json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"Could not load schema for {self.app_id}: {e}") + return None + + def _merge_schema_defaults(self) -> None: + """ + Merge schema default values into config for any missing fields. + This ensures existing apps get defaults when schemas are updated with new fields. 
+ """ + if not self.schema: + return + + # Get fields from schema (handles both 'fields' and 'schema' keys) + fields = self.schema.get('fields') or self.schema.get('schema') or [] + defaults_added = False + + for field in fields: + if isinstance(field, dict) and 'id' in field and 'default' in field: + field_id = field['id'] + # Only add if not already present in config + if field_id not in self.config: + self.config[field_id] = field['default'] + defaults_added = True + logger.debug(f"Added default value for {self.app_id}.{field_id}: {field['default']}") + + # Save config if we added any defaults + if defaults_added: + self.save_config() + + def _validate_config(self) -> Optional[str]: + """ + Validate config values to prevent injection and ensure data integrity. + + Returns: + Error message if validation fails, None if valid + """ + for key, value in self.config.items(): + # Validate key format + if not re.match(r'^[a-zA-Z_][a-zA-Z0-9_]{0,63}$', key): + return f"Invalid config key format: {key}" + + # Validate location fields (JSON format) + if isinstance(value, str) and value.strip().startswith('{'): + try: + loc = json.loads(value) + except json.JSONDecodeError as e: + return f"Invalid JSON for key {key}: {e}" + + # Validate lat/lng if present + try: + if 'lat' in loc: + lat = float(loc['lat']) + if not -90 <= lat <= 90: + return f"Latitude {lat} out of range [-90, 90] for key {key}" + if 'lng' in loc: + lng = float(loc['lng']) + if not -180 <= lng <= 180: + return f"Longitude {lng} out of range [-180, 180] for key {key}" + except ValueError as e: + return f"Invalid numeric value for {key}: {e}" + + return None + + def save_config(self) -> bool: + """Save current configuration to file with validation.""" + try: + # Validate config before saving + error = self._validate_config() + if error: + logger.error(f"Config validation failed for {self.app_id}: {error}") + return False + + with open(self.config_file, 'w') as f: + json.dump(self.config, f, indent=2) + return 
True + except Exception as e: + logger.exception(f"Could not save config for {self.app_id}: {e}") + return False + + def is_enabled(self) -> bool: + """Check if app is enabled.""" + return self.manifest.get("enabled", True) + + def get_render_interval(self) -> int: + """Get render interval in seconds.""" + default = 300 + try: + value = self.manifest.get("render_interval", default) + interval = int(value) + except (ValueError, TypeError): + interval = default + + # Clamp to safe range: min 5, max 3600 + return max(5, min(interval, 3600)) + + def get_display_duration(self) -> int: + """Get display duration in seconds.""" + default = 15 + try: + value = self.manifest.get("display_duration", default) + duration = int(value) + except (ValueError, TypeError): + duration = default + + # Clamp to safe range: min 1, max 600 + return max(1, min(duration, 600)) + + def should_render(self, current_time: float) -> bool: + """Check if app should be re-rendered based on interval.""" + interval = self.get_render_interval() + return (current_time - self.last_render_time) >= interval + + +class StarlarkAppsPlugin(BasePlugin): + """ + Starlark Apps Manager plugin. + + Manages Starlark (.star) apps and renders them using Pixlet. + Each installed app becomes a dynamic display mode. 
+ """ + + def __init__(self, plugin_id: str, config: Dict[str, Any], + display_manager, cache_manager, plugin_manager): + """Initialize the Starlark Apps plugin.""" + super().__init__(plugin_id, config, display_manager, cache_manager, plugin_manager) + + # Initialize components + self.pixlet = PixletRenderer( + pixlet_path=config.get("pixlet_path"), + timeout=config.get("render_timeout", 30) + ) + self.extractor = FrameExtractor( + default_frame_delay=config.get("default_frame_delay", 50) + ) + + # App storage + self.apps_dir = self._get_apps_directory() + self.manifest_file = self.apps_dir / "manifest.json" + self.apps: Dict[str, StarlarkApp] = {} + + # Display state + self.current_app: Optional[StarlarkApp] = None + self.last_update_check = 0 + + # Check Pixlet availability + if not self.pixlet.is_available(): + self.logger.error("Pixlet not available - Starlark apps will not work") + self.logger.error("Install Pixlet or place bundled binary in bin/pixlet/") + else: + version = self.pixlet.get_version() + self.logger.info(f"Pixlet available: {version}") + + # Calculate optimal magnification based on display size + self.calculated_magnify = self._calculate_optimal_magnify() + if self.calculated_magnify > 1: + self.logger.info(f"Display size: {self.display_manager.matrix.width}x{self.display_manager.matrix.height}, " + f"recommended magnify: {self.calculated_magnify}") + + # Load installed apps + self._load_installed_apps() + + self.logger.info(f"Starlark Apps plugin initialized with {len(self.apps)} apps") + + @property + def modes(self) -> List[str]: + """ + Return list of display modes (one per installed Starlark app). + + This allows each installed app to appear as a separate display mode + in the schedule/rotation system. 
+ + Returns: + List of app IDs that can be used as display modes + """ + # Return list of enabled app IDs as display modes + return [app.app_id for app in self.apps.values() if app.is_enabled()] + + def validate_config(self) -> bool: + """ + Validate plugin configuration. + + Ensures required configuration values are valid for Starlark apps. + + Returns: + True if configuration is valid, False otherwise + """ + # Call parent validation first + if not super().validate_config(): + return False + + # Validate magnify range (0-8) + if "magnify" in self.config: + magnify = self.config["magnify"] + if not isinstance(magnify, int) or magnify < 0 or magnify > 8: + self.logger.error("magnify must be an integer between 0 and 8") + return False + + # Validate render_timeout + if "render_timeout" in self.config: + timeout = self.config["render_timeout"] + if not isinstance(timeout, (int, float)) or timeout < 5 or timeout > 120: + self.logger.error("render_timeout must be a number between 5 and 120") + return False + + # Validate cache_ttl + if "cache_ttl" in self.config: + ttl = self.config["cache_ttl"] + if not isinstance(ttl, (int, float)) or ttl < 60 or ttl > 3600: + self.logger.error("cache_ttl must be a number between 60 and 3600") + return False + + # Validate scale_method + if "scale_method" in self.config: + method = self.config["scale_method"] + valid_methods = ["nearest", "bilinear", "bicubic", "lanczos"] + if method not in valid_methods: + self.logger.error(f"scale_method must be one of: {', '.join(valid_methods)}") + return False + + # Validate default_frame_delay + if "default_frame_delay" in self.config: + delay = self.config["default_frame_delay"] + if not isinstance(delay, (int, float)) or delay < 16 or delay > 1000: + self.logger.error("default_frame_delay must be a number between 16 and 1000") + return False + + return True + + def _calculate_optimal_magnify(self) -> int: + """ + Calculate optimal magnification factor based on display dimensions. 
+ + Tronbyte apps are designed for 64x32 displays. + This calculates what magnification would best fit the current display. + + Returns: + Recommended magnify value (1-8) + """ + try: + display_width = self.display_manager.matrix.width + display_height = self.display_manager.matrix.height + + # Tronbyte native resolution + NATIVE_WIDTH = 64 + NATIVE_HEIGHT = 32 + + # Calculate scale factors for width and height + width_scale = display_width / NATIVE_WIDTH + height_scale = display_height / NATIVE_HEIGHT + + # Use the smaller scale to ensure content fits + # (prevents overflow on one dimension) + scale_factor = min(width_scale, height_scale) + + # Round down to get integer magnify value + magnify = int(scale_factor) + + # Clamp to reasonable range (1-8) + magnify = max(1, min(8, magnify)) + + self.logger.debug(f"Display: {display_width}x{display_height}, " + f"Native: {NATIVE_WIDTH}x{NATIVE_HEIGHT}, " + f"Calculated magnify: {magnify}") + + return magnify + + except Exception as e: + self.logger.warning(f"Could not calculate magnify: {e}") + return 1 + + def get_magnify_recommendation(self) -> Dict[str, Any]: + """ + Get detailed magnification recommendation for current display. 
+ + Returns: + Dictionary with recommendation details + """ + try: + display_width = self.display_manager.matrix.width + display_height = self.display_manager.matrix.height + + NATIVE_WIDTH = 64 + NATIVE_HEIGHT = 32 + + width_scale = display_width / NATIVE_WIDTH + height_scale = display_height / NATIVE_HEIGHT + + # Calculate for different magnify values + recommendations = [] + for magnify in range(1, 9): + render_width = NATIVE_WIDTH * magnify + render_height = NATIVE_HEIGHT * magnify + + # Check if this magnify fits perfectly + perfect_fit = (render_width == display_width and render_height == display_height) + + # Check if scaling is needed + needs_scaling = (render_width != display_width or render_height != display_height) + + # Calculate quality score (1-100) + if perfect_fit: + quality_score = 100 + elif not needs_scaling: + quality_score = 95 + else: + # Score based on how close to display size + width_ratio = min(render_width, display_width) / max(render_width, display_width) + height_ratio = min(render_height, display_height) / max(render_height, display_height) + quality_score = int((width_ratio + height_ratio) / 2 * 100) + + recommendations.append({ + 'magnify': magnify, + 'render_size': f"{render_width}x{render_height}", + 'perfect_fit': perfect_fit, + 'needs_scaling': needs_scaling, + 'quality_score': quality_score, + 'recommended': magnify == self.calculated_magnify + }) + + return { + 'display_size': f"{display_width}x{display_height}", + 'native_size': f"{NATIVE_WIDTH}x{NATIVE_HEIGHT}", + 'calculated_magnify': self.calculated_magnify, + 'width_scale': round(width_scale, 2), + 'height_scale': round(height_scale, 2), + 'recommendations': recommendations + } + + except Exception as e: + self.logger.exception(f"Error getting magnify recommendation: {e}") + return { + 'display_size': 'unknown', + 'calculated_magnify': 1, + 'recommendations': [] + } + + def _get_effective_magnify(self) -> int: + """ + Get the effective magnify value to use for rendering. 
+ + Priority: + 1. User-configured magnify (if valid and in range 1-8) + 2. Auto-calculated magnify + + Returns: + Magnify value to use + """ + config_magnify = self.config.get("magnify") + + # Validate and clamp config_magnify + if config_magnify is not None: + try: + # Convert to int if possible + config_magnify = int(config_magnify) + # Clamp to safe range (1-8) + if 1 <= config_magnify <= 8: + return config_magnify + except (ValueError, TypeError): + # Non-numeric value, fall through to calculated + pass + + # Fall back to auto-calculated value + return self.calculated_magnify + + def _get_apps_directory(self) -> Path: + """Get the directory for storing Starlark apps.""" + try: + # Try to find project root + current_dir = Path(__file__).resolve().parent + project_root = current_dir.parent.parent + apps_dir = project_root / "starlark-apps" + except Exception: + # Fallback to current working directory + apps_dir = Path.cwd() / "starlark-apps" + + # Create directory if it doesn't exist + apps_dir.mkdir(parents=True, exist_ok=True) + return apps_dir + + def _sanitize_app_id(self, app_id: str) -> str: + """ + Sanitize app_id into a safe slug for use in file paths. + + Args: + app_id: Original app identifier + + Returns: + Sanitized slug containing only [a-z0-9_.-] characters + """ + if not app_id: + raise ValueError("app_id cannot be empty") + + # Replace invalid characters with underscore + # Allow only: lowercase letters, digits, underscore, period, hyphen + safe_slug = re.sub(r'[^a-z0-9_.-]', '_', app_id.lower()) + + # Remove leading/trailing dots, underscores, or hyphens + safe_slug = safe_slug.strip('._-') + + # Ensure it's not empty after sanitization + if not safe_slug: + raise ValueError(f"app_id '{app_id}' becomes empty after sanitization") + + return safe_slug + + def _verify_path_safety(self, path: Path, base_dir: Path) -> None: + """ + Verify that a path is within the base directory to prevent path traversal. 
+
+        Args:
+            path: Path to verify
+            base_dir: Base directory that path must be within
+
+        Raises:
+            ValueError: If path escapes the base directory
+        """
+        resolved_path = path.resolve()
+        resolved_base = base_dir.resolve()
+
+        # relative_to() raises ValueError when resolved_path lies outside
+        # resolved_base, and is available on all supported Python versions
+        # (unlike Path.is_relative_to, which requires Python >= 3.9).
+        try:
+            resolved_path.relative_to(resolved_base)
+        except ValueError as e:
+            raise ValueError(
+                f"Path traversal detected: {resolved_path} is not within {resolved_base}"
+            ) from e
+
+    def _load_installed_apps(self) -> None:
+        """Load all installed apps from manifest."""
+        if not self.manifest_file.exists():
+            # Create initial manifest
+            self._save_manifest({"apps": {}})
+            return
+
+        try:
+            with open(self.manifest_file, 'r') as f:
+                manifest = json.load(f)
+
+            apps_data = manifest.get("apps", {})
+            for app_id, app_manifest in apps_data.items():
+                try:
+                    # Sanitize app_id to prevent path traversal
+                    safe_app_id = self._sanitize_app_id(app_id)
+                    app_dir = (self.apps_dir / safe_app_id).resolve()
+
+                    # Verify path safety
+                    self._verify_path_safety(app_dir, self.apps_dir)
+                except ValueError as e:
+                    self.logger.warning(f"Invalid app_id '{app_id}': {e}")
+                    continue
+
+                if not app_dir.exists():
+                    self.logger.warning(f"App directory missing: {app_id}")
+                    continue
+
+                try:
+                    # Use safe_app_id for internal storage to match directory structure
+                    app = StarlarkApp(safe_app_id, app_dir, app_manifest)
+                    self.apps[safe_app_id] = app
+                    self.logger.debug(f"Loaded app: {app_id} (sanitized: {safe_app_id})")
+                except Exception as e:
+                    self.logger.exception(f"Error loading app {app_id}: {e}")
+
+            
self.logger.info(f"Loaded {len(self.apps)} Starlark apps")
+
+        except Exception as e:
+            self.logger.exception(f"Error loading apps manifest: {e}")
+
+    def _save_manifest(self, manifest: Dict[str, Any]) -> bool:
+        """
+        Save apps manifest to file with file locking to prevent race conditions.
+        Acquires an exclusive lock on the manifest file before writing to
+        prevent concurrent modifications.
+        """
+        temp_file = None
+        lock_fd = None
+        try:
+            # Create parent directory if needed
+            self.manifest_file.parent.mkdir(parents=True, exist_ok=True)
+
+            # Open manifest file for locking: O_CREAT | O_RDWR creates it if
+            # missing but does not truncate existing contents
+            lock_fd = os.open(str(self.manifest_file), os.O_CREAT | os.O_RDWR, 0o644)
+
+            # Acquire exclusive lock on manifest file BEFORE creating temp file.
+            # This serializes all writers and prevents concurrent races.
+            fcntl.flock(lock_fd, fcntl.LOCK_EX)
+
+            try:
+                # Now that we hold the lock, create and write temp file
+                temp_file = self.manifest_file.with_suffix('.tmp')
+
+                with open(temp_file, 'w') as f:
+                    json.dump(manifest, f, indent=2)
+                    f.flush()
+                    os.fsync(f.fileno())  # Ensure data is written to disk
+
+                # Atomic rename (overwrites destination) while still holding lock
+                temp_file.replace(self.manifest_file)
+                return True
+            finally:
+                # Release lock; clear lock_fd so the error path below does not
+                # close the descriptor a second time
+                fcntl.flock(lock_fd, fcntl.LOCK_UN)
+                os.close(lock_fd)
+                lock_fd = None
+
+        except (OSError, IOError, TypeError, ValueError):
+            # TypeError/ValueError cover non-serializable manifest contents;
+            # logger.exception() records the traceback automatically
+            self.logger.exception("Error saving manifest while writing manifest file")
+            # Clean up temp file if it exists
+            if temp_file is not None and temp_file.exists():
+                try:
+                    temp_file.unlink()
+                except Exception as cleanup_exc:
+                    self.logger.warning(f"Failed to clean up temp file {temp_file}: {cleanup_exc}")
+            # Clean up lock fd if still open
+            if lock_fd is not None:
+                try:
+                    os.close(lock_fd)
+                except Exception as cleanup_exc:
+                    self.logger.warning(f"Failed to close lock file descriptor: {cleanup_exc}")
+            return False
+
+    def _update_manifest_safe(self, updater_fn) -> bool:
+        """
+        Safely update manifest with file locking to prevent race conditions.
+        Holds the exclusive lock for the entire read-modify-write cycle.
+
+        Args:
+            updater_fn: Function that takes the manifest dict and modifies it in place
+
+        Returns:
+            True if successful, False otherwise
+        """
+        lock_fd = None
+        temp_file = None
+        try:
+            # Create parent directory if needed
+            self.manifest_file.parent.mkdir(parents=True, exist_ok=True)
+
+            # Open manifest file for locking (create if it doesn't exist, don't truncate)
+            lock_fd = os.open(str(self.manifest_file), os.O_CREAT | os.O_RDWR, 0o644)
+
+            # Acquire exclusive lock for entire read-modify-write cycle
+            fcntl.flock(lock_fd, fcntl.LOCK_EX)
+
+            try:
+                # Read current manifest while holding exclusive lock
+                if self.manifest_file.exists() and self.manifest_file.stat().st_size > 0:
+                    with open(self.manifest_file, 'r') as f:
+                        manifest = json.load(f)
+                else:
+                    # Empty or non-existent file, start with default structure
+                    manifest = {"apps": {}}
+
+                # Apply updates while still holding lock
+                updater_fn(manifest)
+
+                # Write to temp file, then atomically replace (still holding lock)
+                temp_file = self.manifest_file.with_suffix('.tmp')
+                with open(temp_file, 'w') as f:
+                    json.dump(manifest, f, indent=2)
+                    f.flush()
+                    os.fsync(f.fileno())
+
+                # Atomic rename while still holding lock
+                temp_file.replace(self.manifest_file)
+                return True
+
+            finally:
+                # Release lock; clear lock_fd so the error path below does not
+                # close the descriptor a second time
+                fcntl.flock(lock_fd, fcntl.LOCK_UN)
+                os.close(lock_fd)
+                lock_fd = None
+
+        except (OSError, IOError, json.JSONDecodeError, ValueError):
+            self.logger.exception("Error updating manifest during read-modify-write cycle")
+            # Clean up temp file if it exists
+            if temp_file is not None and temp_file.exists():
+                try:
+                    temp_file.unlink()
+                except Exception as cleanup_exc:
+                    self.logger.warning(f"Failed to clean up temp file {temp_file}: {cleanup_exc}")
+            # Clean up lock fd if still open
+            if lock_fd 
is not None: + try: + os.close(lock_fd) + except Exception as cleanup_exc: + self.logger.warning(f"Failed to close lock file descriptor: {cleanup_exc}") + return False + + def update(self) -> None: + """Update method - check if apps need re-rendering.""" + current_time = time.time() + + # Check apps that need re-rendering based on their intervals + if self.config.get("auto_refresh_apps", True): + for app in self.apps.values(): + if app.is_enabled() and app.should_render(current_time): + self._render_app(app, force=False) + + def display(self, force_clear: bool = False) -> None: + """ + Display current Starlark app. + + This method is called during the display rotation. + Displays frames from the currently active app. + """ + try: + if force_clear: + self.display_manager.clear() + + # If no current app, try to select one + if not self.current_app: + self._select_next_app() + + if not self.current_app: + # No apps available + self.logger.debug("No Starlark apps to display") + return + + # Render app if needed + if not self.current_app.frames: + success = self._render_app(self.current_app, force=True) + if not success: + self.logger.error(f"Failed to render app: {self.current_app.app_id}") + return + + # Display current frame + self._display_frame() + + except Exception as e: + self.logger.error(f"Error displaying Starlark app: {e}") + + def _select_next_app(self) -> None: + """Select the next enabled app for display.""" + enabled_apps = [app for app in self.apps.values() if app.is_enabled()] + + if not enabled_apps: + self.current_app = None + return + + # Simple rotation - could be enhanced with priorities + if self.current_app and self.current_app in enabled_apps: + current_idx = enabled_apps.index(self.current_app) + next_idx = (current_idx + 1) % len(enabled_apps) + self.current_app = enabled_apps[next_idx] + else: + self.current_app = enabled_apps[0] + + self.logger.debug(f"Selected app for display: {self.current_app.app_id}") + + def _render_app(self, app: 
StarlarkApp, force: bool = False) -> bool: + """ + Render a Starlark app using Pixlet. + + Args: + app: App to render + force: Force render even if cached + + Returns: + True if successful + """ + try: + current_time = time.time() + + # Check cache + use_cache = self.config.get("cache_rendered_output", True) + cache_ttl = self.config.get("cache_ttl", 300) + + if (not force and use_cache and app.cache_file.exists() and + (current_time - app.last_render_time) < cache_ttl): + # Use cached render + self.logger.debug(f"Using cached render for: {app.app_id}") + return self._load_frames_from_cache(app) + + # Render with Pixlet + self.logger.info(f"Rendering app: {app.app_id}") + + # Get effective magnification factor (config or auto-calculated) + magnify = self._get_effective_magnify() + self.logger.debug(f"Using magnify={magnify} for {app.app_id}") + + # Filter out LEDMatrix-internal timing keys before passing to pixlet + INTERNAL_KEYS = {'render_interval', 'display_duration'} + pixlet_config = {k: v for k, v in app.config.items() if k not in INTERNAL_KEYS} + + success, error = self.pixlet.render( + star_file=str(app.star_file), + output_path=str(app.cache_file), + config=pixlet_config, + magnify=magnify + ) + + if not success: + self.logger.error(f"Pixlet render failed: {error}") + return False + + # Extract frames + success = self._load_frames_from_cache(app) + if success: + app.last_render_time = current_time + + return success + + except Exception as e: + self.logger.error(f"Error rendering app {app.app_id}: {e}") + return False + + def _load_frames_from_cache(self, app: StarlarkApp) -> bool: + """Load frames from cached WebP file.""" + try: + success, frames, error = self.extractor.load_webp(str(app.cache_file)) + + if not success: + self.logger.error(f"Frame extraction failed: {error}") + return False + + # Scale frames if needed + if self.config.get("scale_output", True): + width = self.display_manager.matrix.width + height = self.display_manager.matrix.height + + 
# Get scaling method from config + scale_method_str = self.config.get("scale_method", "nearest") + scale_method_map = { + "nearest": Image.Resampling.NEAREST, + "bilinear": Image.Resampling.BILINEAR, + "bicubic": Image.Resampling.BICUBIC, + "lanczos": Image.Resampling.LANCZOS + } + scale_method = scale_method_map.get(scale_method_str, Image.Resampling.NEAREST) + + # Check if we should center instead of scale + if self.config.get("center_small_output", False): + frames = self.extractor.center_frames(frames, width, height) + else: + frames = self.extractor.scale_frames(frames, width, height, scale_method) + + # Optimize frames to limit memory usage (max_frames=None means no limit) + max_frames = self.config.get("max_frames") + if max_frames is not None: + frames = self.extractor.optimize_frames(frames, max_frames=max_frames) + + app.frames = frames + app.current_frame_index = 0 + app.last_frame_time = time.time() + + self.logger.debug(f"Loaded {len(frames)} frames for {app.app_id}") + return True + + except Exception as e: + self.logger.error(f"Error loading frames for {app.app_id}: {e}") + return False + + def _display_frame(self) -> None: + """Display the current frame of the current app.""" + if not self.current_app or not self.current_app.frames: + return + + try: + current_time = time.time() + frame, delay_ms = self.current_app.frames[self.current_app.current_frame_index] + + # Set frame on display manager + self.display_manager.image = frame + self.display_manager.update_display() + + # Check if it's time to advance to next frame + delay_seconds = delay_ms / 1000.0 + if (current_time - self.current_app.last_frame_time) >= delay_seconds: + self.current_app.current_frame_index = ( + (self.current_app.current_frame_index + 1) % len(self.current_app.frames) + ) + self.current_app.last_frame_time = current_time + + except Exception as e: + self.logger.error(f"Error displaying frame: {e}") + + def install_app(self, app_id: str, star_file_path: str, metadata: 
Optional[Dict[str, Any]] = None, assets_dir: Optional[str] = None) -> bool: + """ + Install a new Starlark app. + + Args: + app_id: Unique identifier for the app + star_file_path: Path to .star file to install + metadata: Optional metadata (name, description, etc.) + assets_dir: Optional directory containing assets (images/, sources/, etc.) + + Returns: + True if successful + """ + try: + import shutil + + # Sanitize app_id to prevent path traversal + safe_app_id = self._sanitize_app_id(app_id) + + # Create app directory with resolved path + app_dir = (self.apps_dir / safe_app_id).resolve() + + # Verify path safety BEFORE creating directories + self._verify_path_safety(app_dir, self.apps_dir) + app_dir.mkdir(parents=True, exist_ok=True) + + # Copy .star file with sanitized app_id + star_dest = app_dir / f"{safe_app_id}.star" + # Verify star_dest path safety + self._verify_path_safety(star_dest, self.apps_dir) + shutil.copy2(star_file_path, star_dest) + + # Copy asset directories if provided (images/, sources/, etc.) 
+ if assets_dir and Path(assets_dir).exists(): + assets_path = Path(assets_dir) + for item in assets_path.iterdir(): + if item.is_dir(): + # Copy entire directory (e.g., images/, sources/) + dest_dir = app_dir / item.name + # Verify dest_dir path safety + self._verify_path_safety(dest_dir, self.apps_dir) + if dest_dir.exists(): + shutil.rmtree(dest_dir) + shutil.copytree(item, dest_dir) + self.logger.debug(f"Copied assets directory: {item.name}") + self.logger.info(f"Installed assets for {app_id}") + + # Create app manifest entry + app_manifest = { + "name": metadata.get("name", app_id) if metadata else app_id, + "original_id": app_id, # Store original for reference + "star_file": f"{safe_app_id}.star", + "enabled": True, + "render_interval": metadata.get("render_interval", 300) if metadata else 300, + "display_duration": metadata.get("display_duration", 15) if metadata else 15 + } + + # Try to extract schema + _, schema, _ = self.pixlet.extract_schema(str(star_dest)) + if schema: + schema_path = app_dir / "schema.json" + # Verify schema path safety + self._verify_path_safety(schema_path, self.apps_dir) + with open(schema_path, 'w') as f: + json.dump(schema, f, indent=2) + + # Create default config — pre-populate with schema defaults + default_config = {} + if schema: + fields = schema.get('fields') or schema.get('schema') or [] + for field in fields: + if isinstance(field, dict) and 'id' in field and 'default' in field: + default_config[field['id']] = field['default'] + config_path = app_dir / "config.json" + # Verify config path safety + self._verify_path_safety(config_path, self.apps_dir) + with open(config_path, 'w') as f: + json.dump(default_config, f, indent=2) + + # Update manifest (use safe_app_id as key to match directory) + def update_fn(manifest): + manifest["apps"][safe_app_id] = app_manifest + + if not self._update_manifest_safe(update_fn): + self.logger.error(f"Failed to update manifest for {app_id}") + return False + + # Create app instance (use 
safe_app_id for internal key, original for display) + app = StarlarkApp(safe_app_id, app_dir, app_manifest) + self.apps[safe_app_id] = app + + self.logger.info(f"Installed Starlark app: {app_id} (sanitized: {safe_app_id})") + return True + + except Exception as e: + self.logger.error(f"Error installing app {app_id}: {e}") + return False + + def uninstall_app(self, app_id: str) -> bool: + """ + Uninstall a Starlark app. + + Args: + app_id: App to uninstall + + Returns: + True if successful + """ + try: + import shutil + + if app_id not in self.apps: + self.logger.warning(f"App not found: {app_id}") + return False + + # Remove from current app if selected + if self.current_app and self.current_app.app_id == app_id: + self.current_app = None + + # Get app reference before removing from dict + app = self.apps.get(app_id) + + # Update manifest FIRST (before modifying filesystem) + def update_fn(manifest): + if app_id in manifest["apps"]: + del manifest["apps"][app_id] + + if not self._update_manifest_safe(update_fn): + self.logger.error(f"Failed to update manifest when uninstalling {app_id}") + return False + + # Remove from apps dict + self.apps.pop(app_id) + + # Remove directory (after manifest update succeeds) + if app and app.app_dir.exists(): + shutil.rmtree(app.app_dir) + + self.logger.info(f"Uninstalled Starlark app: {app_id}") + return True + + except Exception as e: + self.logger.error(f"Error uninstalling app {app_id}: {e}") + return False + + def get_display_duration(self) -> float: + """Get display duration for current app.""" + if self.current_app: + return float(self.current_app.get_display_duration()) + return self.config.get('display_duration', 15.0) + + # ─── Vegas Mode Integration ────────────────────────────────────── + + def get_vegas_content(self) -> Optional[List[Image.Image]]: + """Return rendered frames from enabled starlark apps for vegas scroll.""" + images = [] + for app in self.apps.values(): + if not app.is_enabled(): + continue + # Use 
cached frames if available + if app.frames: + images.extend([frame for frame, delay in app.frames]) + else: + # Try to render and extract frames + if self._render_app(app): + if app.frames: + images.extend([frame for frame, delay in app.frames]) + return images if images else None + + def get_vegas_content_type(self) -> str: + """Indicate the type of content for Vegas scroll.""" + return "multi" + + def get_vegas_display_mode(self) -> VegasDisplayMode: + """Get the display mode for Vegas scroll integration.""" + return VegasDisplayMode.FIXED_SEGMENT + + def get_info(self) -> Dict[str, Any]: + """Return plugin info for web UI.""" + info = super().get_info() + info.update({ + 'pixlet_available': self.pixlet.is_available(), + 'pixlet_version': self.pixlet.get_version(), + 'installed_apps': len(self.apps), + 'enabled_apps': len([a for a in self.apps.values() if a.is_enabled()]), + 'current_app': self.current_app.app_id if self.current_app else None, + 'apps': { + app_id: { + 'name': app.manifest.get('name', app_id), + 'enabled': app.is_enabled(), + 'has_frames': app.frames is not None + } + for app_id, app in self.apps.items() + } + }) + return info diff --git a/plugin-repos/starlark-apps/manifest.json b/plugin-repos/starlark-apps/manifest.json new file mode 100644 index 000000000..ef44707b8 --- /dev/null +++ b/plugin-repos/starlark-apps/manifest.json @@ -0,0 +1,26 @@ +{ + "id": "starlark-apps", + "name": "Starlark Apps", + "version": "1.0.0", + "author": "LEDMatrix", + "description": "Manages and displays Starlark (.star) apps from Tronbyte/Tidbyt community. 
Import widgets seamlessly without modification.", + "entry_point": "manager.py", + "class_name": "StarlarkAppsPlugin", + "category": "system", + "tags": [ + "starlark", + "widgets", + "tronbyte", + "tidbyt", + "apps", + "community" + ], + "display_modes": [], + "update_interval": 60, + "default_duration": 15, + "dependencies": [ + "Pillow>=10.0.0", + "PyYAML>=6.0", + "requests>=2.31.0" + ] +} diff --git a/plugin-repos/starlark-apps/pixlet_renderer.py b/plugin-repos/starlark-apps/pixlet_renderer.py new file mode 100644 index 000000000..40f8f59d0 --- /dev/null +++ b/plugin-repos/starlark-apps/pixlet_renderer.py @@ -0,0 +1,659 @@ +""" +Pixlet Renderer Module for Starlark Apps + +Handles execution of Pixlet CLI to render .star files into WebP animations. +Supports bundled binaries and system-installed Pixlet. +""" + +import json +import logging +import os +import platform +import re +import shutil +import subprocess +from pathlib import Path +from typing import Dict, Any, Optional, Tuple, List + +logger = logging.getLogger(__name__) + + +class PixletRenderer: + """ + Wrapper for Pixlet CLI rendering. + + Handles: + - Auto-detection of bundled or system Pixlet binary + - Rendering .star files with configuration + - Schema extraction from .star files + - Timeout and error handling + """ + + def __init__(self, pixlet_path: Optional[str] = None, timeout: int = 30): + """ + Initialize the Pixlet renderer. + + Args: + pixlet_path: Optional explicit path to Pixlet binary + timeout: Maximum seconds to wait for rendering + """ + self.timeout = timeout + self.pixlet_binary = self._find_pixlet_binary(pixlet_path) + + if self.pixlet_binary: + logger.info(f"[Starlark Pixlet] Pixlet renderer initialized with binary: {self.pixlet_binary}") + else: + logger.warning("[Starlark Pixlet] Pixlet binary not found - rendering will fail") + + def _find_pixlet_binary(self, explicit_path: Optional[str] = None) -> Optional[str]: + """ + Find Pixlet binary using the following priority: + 1. 
Explicit path provided + 2. Bundled binary for current architecture + 3. System PATH + + Args: + explicit_path: User-specified path to Pixlet + + Returns: + Path to Pixlet binary, or None if not found + """ + # 1. Check explicit path + if explicit_path and os.path.isfile(explicit_path): + if os.access(explicit_path, os.X_OK): + logger.debug(f"Using explicit Pixlet path: {explicit_path}") + return explicit_path + else: + logger.warning(f"Explicit Pixlet path not executable: {explicit_path}") + + # 2. Check bundled binary + try: + bundled_path = self._get_bundled_binary_path() + if bundled_path and os.path.isfile(bundled_path): + # Ensure executable + if not os.access(bundled_path, os.X_OK): + try: + os.chmod(bundled_path, 0o755) + logger.debug(f"Made bundled binary executable: {bundled_path}") + except OSError: + logger.exception(f"Could not make bundled binary executable: {bundled_path}") + + if os.access(bundled_path, os.X_OK): + logger.debug(f"Using bundled Pixlet binary: {bundled_path}") + return bundled_path + except OSError: + logger.exception("Could not locate bundled binary") + + # 3. Check system PATH + system_pixlet = shutil.which("pixlet") + if system_pixlet: + logger.debug(f"Using system Pixlet: {system_pixlet}") + return system_pixlet + + logger.error("Pixlet binary not found in any location") + return None + + def _get_bundled_binary_path(self) -> Optional[str]: + """ + Get path to bundled Pixlet binary for current architecture. 
+ + Returns: + Path to bundled binary, or None if not found + """ + try: + # Determine project root (parent of plugin-repos) + current_dir = Path(__file__).resolve().parent + project_root = current_dir.parent.parent + bin_dir = project_root / "bin" / "pixlet" + + # Detect architecture + system = platform.system().lower() + machine = platform.machine().lower() + + # Map architecture to binary name + if system == "linux": + if "aarch64" in machine or "arm64" in machine: + binary_name = "pixlet-linux-arm64" + elif "x86_64" in machine or "amd64" in machine: + binary_name = "pixlet-linux-amd64" + else: + logger.warning(f"Unsupported Linux architecture: {machine}") + return None + elif system == "darwin": + if "arm64" in machine: + binary_name = "pixlet-darwin-arm64" + else: + binary_name = "pixlet-darwin-amd64" + elif system == "windows": + binary_name = "pixlet-windows-amd64.exe" + else: + logger.warning(f"Unsupported system: {system}") + return None + + binary_path = bin_dir / binary_name + if binary_path.exists(): + return str(binary_path) + + logger.debug(f"Bundled binary not found at: {binary_path}") + return None + + except OSError: + logger.exception("Error finding bundled binary") + return None + + def _get_safe_working_directory(self, star_file: str) -> Optional[str]: + """ + Get a safe working directory for subprocess execution. + + Args: + star_file: Path to .star file + + Returns: + Resolved parent directory, or None if empty or invalid + """ + try: + resolved_parent = os.path.dirname(os.path.abspath(star_file)) + # Return None if empty string to avoid FileNotFoundError + if not resolved_parent: + logger.debug(f"Empty parent directory for star_file: {star_file}") + return None + return resolved_parent + except (OSError, ValueError): + logger.debug(f"Could not resolve working directory for: {star_file}") + return None + + def is_available(self) -> bool: + """ + Check if Pixlet is available and functional. 
+ + Returns: + True if Pixlet can be executed + """ + if not self.pixlet_binary: + return False + + try: + result = subprocess.run( + [self.pixlet_binary, "version"], + capture_output=True, + text=True, + timeout=5 + ) + return result.returncode == 0 + except subprocess.TimeoutExpired: + logger.debug("Pixlet version check timed out") + return False + except (subprocess.SubprocessError, OSError): + logger.exception("Pixlet not available") + return False + + def get_version(self) -> Optional[str]: + """ + Get Pixlet version string. + + Returns: + Version string, or None if unavailable + """ + if not self.pixlet_binary: + return None + + try: + result = subprocess.run( + [self.pixlet_binary, "version"], + capture_output=True, + text=True, + timeout=5 + ) + if result.returncode == 0: + return result.stdout.strip() + except subprocess.TimeoutExpired: + logger.debug("Pixlet version check timed out") + except (subprocess.SubprocessError, OSError): + logger.exception("Could not get Pixlet version") + + return None + + def render( + self, + star_file: str, + output_path: str, + config: Optional[Dict[str, Any]] = None, + magnify: int = 1 + ) -> Tuple[bool, Optional[str]]: + """ + Render a .star file to WebP output. + + Args: + star_file: Path to .star file + output_path: Where to save WebP output + config: Configuration dictionary to pass to app + magnify: Magnification factor (default 1) + + Returns: + Tuple of (success: bool, error_message: Optional[str]) + """ + if not self.pixlet_binary: + return False, "Pixlet binary not found" + + if not os.path.isfile(star_file): + return False, f"Star file not found: {star_file}" + + try: + # Build command - config params must be POSITIONAL between star_file and flags + # Format: pixlet render [key=value]... 
[flags] + cmd = [ + self.pixlet_binary, + "render", + star_file + ] + + # Add configuration parameters as positional arguments (BEFORE flags) + if config: + for key, value in config.items(): + # Validate key format (alphanumeric + underscore only) + if not re.match(r'^[a-zA-Z_][a-zA-Z0-9_]*$', key): + logger.warning(f"Skipping invalid config key: {key}") + continue + + # Convert value to string for CLI + if isinstance(value, bool): + value_str = "true" if value else "false" + elif isinstance(value, str) and (value.startswith('{') or value.startswith('[')): + # JSON string - keep as-is, will be properly quoted by subprocess + value_str = value + else: + value_str = str(value) + + # Validate value doesn't contain dangerous shell metacharacters + # Block: backticks, $(), pipes, redirects, semicolons, ampersands, null bytes + # Allow: most printable chars including spaces, quotes, brackets, braces + if re.search(r'[`$|<>&;\x00]|\$\(', value_str): + logger.warning(f"Skipping config value with unsafe shell characters for key {key}: {value_str}") + continue + + # Add as positional argument (not -c flag) + cmd.append(f"{key}={value_str}") + + # Add flags AFTER positional config arguments + cmd.extend([ + "-o", output_path, + "-m", str(magnify) + ]) + + # Build sanitized command for logging (redact sensitive values) + sanitized_cmd = [self.pixlet_binary, "render", star_file] + if config: + config_keys = list(config.keys()) + sanitized_cmd.append(f"[{len(config_keys)} config entries: {', '.join(config_keys)}]") + sanitized_cmd.extend(["-o", output_path, "-m", str(magnify)]) + logger.debug(f"Executing Pixlet: {' '.join(sanitized_cmd)}") + + # Execute rendering + safe_cwd = self._get_safe_working_directory(star_file) + result = subprocess.run( + cmd, + capture_output=True, + text=True, + timeout=self.timeout, + cwd=safe_cwd # Run in .star file directory (or None if relative path) + ) + + if result.returncode == 0: + if os.path.isfile(output_path): + logger.debug(f"Successfully 
rendered: {star_file} -> {output_path}") + return True, None + else: + error = "Rendering succeeded but output file not found" + logger.error(error) + return False, error + else: + error = f"Pixlet failed (exit {result.returncode}): {result.stderr}" + logger.error(error) + return False, error + + except subprocess.TimeoutExpired: + error = f"Rendering timeout after {self.timeout}s" + logger.error(error) + return False, error + except (subprocess.SubprocessError, OSError): + logger.exception("Rendering exception") + return False, "Rendering failed - see logs for details" + + def extract_schema(self, star_file: str) -> Tuple[bool, Optional[Dict[str, Any]], Optional[str]]: + """ + Extract configuration schema from a .star file by parsing source code. + + Supports: + - Static field definitions (location, text, toggle, dropdown, color, datetime) + - Variable-referenced dropdown options + - Graceful degradation for unsupported field types + + Args: + star_file: Path to .star file + + Returns: + Tuple of (success: bool, schema: Optional[Dict], error: Optional[str]) + """ + if not os.path.isfile(star_file): + return False, None, f"Star file not found: {star_file}" + + try: + # Read .star file + with open(star_file, 'r', encoding='utf-8') as f: + content = f.read() + + # Parse schema from source + schema = self._parse_schema_from_source(content, star_file) + + if schema: + field_count = len(schema.get('schema', [])) + logger.debug(f"Extracted schema with {field_count} field(s) from: {star_file}") + return True, schema, None + else: + # No schema found - not an error, app just doesn't have configuration + logger.debug(f"No schema found in: {star_file}") + return True, None, None + + except UnicodeDecodeError as e: + error = f"File encoding error: {e}" + logger.warning(error) + return False, None, error + except Exception as e: + logger.exception(f"Schema extraction failed for {star_file}") + return False, None, f"Schema extraction error: {str(e)}" + + def 
_parse_schema_from_source(self, content: str, file_path: str) -> Optional[Dict[str, Any]]: + """ + Parse get_schema() function from Starlark source code. + + Args: + content: .star file content + file_path: Path to file (for logging) + + Returns: + Schema dict with format {"version": "1", "schema": [...]}, or None + """ + # Extract variable definitions (for dropdown options) + var_table = self._extract_variable_definitions(content) + + # Extract get_schema() function body + schema_body = self._extract_get_schema_body(content) + if not schema_body: + logger.debug(f"No get_schema() function found in {file_path}") + return None + + # Extract version + version_match = re.search(r'version\s*=\s*"([^"]+)"', schema_body) + version = version_match.group(1) if version_match else "1" + + # Extract fields array from schema.Schema(...) - handle nested brackets + fields_start_match = re.search(r'fields\s*=\s*\[', schema_body) + if not fields_start_match: + # Empty schema or no fields + return {"version": version, "schema": []} + + # Find matching closing bracket + bracket_count = 1 + i = fields_start_match.end() + while i < len(schema_body) and bracket_count > 0: + if schema_body[i] == '[': + bracket_count += 1 + elif schema_body[i] == ']': + bracket_count -= 1 + i += 1 + + if bracket_count != 0: + # Unmatched brackets + logger.warning(f"Unmatched brackets in schema fields for {file_path}") + return {"version": version, "schema": []} + + fields_text = schema_body[fields_start_match.end():i-1] + + # Parse individual fields + schema_fields = [] + # Match schema.FieldType(...) 
patterns + field_pattern = r'schema\.(\w+)\s*\((.*?)\)' + + # Find all field definitions (handle nested parentheses) + pos = 0 + while pos < len(fields_text): + match = re.search(field_pattern, fields_text[pos:], re.DOTALL) + if not match: + break + + field_type = match.group(1) + field_start = pos + match.start() + field_end = pos + match.end() + + # Handle nested parentheses properly + paren_count = 1 + i = pos + match.start() + len(f'schema.{field_type}(') + while i < len(fields_text) and paren_count > 0: + if fields_text[i] == '(': + paren_count += 1 + elif fields_text[i] == ')': + paren_count -= 1 + i += 1 + + field_params_text = fields_text[pos + match.start() + len(f'schema.{field_type}('):i-1] + + # Parse field + field_dict = self._parse_schema_field(field_type, field_params_text, var_table) + if field_dict: + schema_fields.append(field_dict) + + pos = i + + return { + "version": version, + "schema": schema_fields + } + + def _extract_variable_definitions(self, content: str) -> Dict[str, List[Dict]]: + """ + Extract top-level variable assignments (for dropdown options). + + Args: + content: .star file content + + Returns: + Dict mapping variable names to their option lists + """ + var_table = {} + + # Find variable definitions like: variableName = [schema.Option(...), ...] + var_pattern = r'^(\w+)\s*=\s*\[(.*?schema\.Option.*?)\]' + matches = re.finditer(var_pattern, content, re.MULTILINE | re.DOTALL) + + for match in matches: + var_name = match.group(1) + options_text = match.group(2) + + # Parse schema.Option entries + options = self._parse_schema_options(options_text, {}) + if options: + var_table[var_name] = options + + return var_table + + def _extract_get_schema_body(self, content: str) -> Optional[str]: + """ + Extract get_schema() function body using indentation-aware parsing. 
+ + Args: + content: .star file content + + Returns: + Function body text, or None if not found + """ + # Find def get_schema(): line + pattern = r'^(\s*)def\s+get_schema\s*\(\s*\)\s*:' + match = re.search(pattern, content, re.MULTILINE) + + if not match: + return None + + # Get the indentation level of the function definition + func_indent = len(match.group(1)) + func_start = match.end() + + # Split content into lines starting after the function definition + lines_after = content[func_start:].split('\n') + body_lines = [] + + for line in lines_after: + # Skip empty lines + if not line.strip(): + body_lines.append(line) + continue + + # Calculate indentation of current line + stripped = line.lstrip() + line_indent = len(line) - len(stripped) + + # If line has same or less indentation than function def, check if it's a top-level def + if line_indent <= func_indent: + # This is a line at the same or outer level - check if it's a function + if re.match(r'def\s+\w+', stripped): + # Found next top-level function, stop here + break + # Otherwise it might be a comment or other top-level code, stop anyway + break + + # Line is indented more than function def, so it's part of the body + body_lines.append(line) + + if body_lines: + return '\n'.join(body_lines) + return None + + def _parse_schema_field(self, field_type: str, params_text: str, var_table: Dict) -> Optional[Dict[str, Any]]: + """ + Parse individual schema field definition. + + Args: + field_type: Field type (Location, Text, Toggle, etc.) 
+ params_text: Field parameters text + var_table: Variable lookup table + + Returns: + Field dict, or None if parse fails + """ + # Map Pixlet field types to JSON typeOf + type_mapping = { + 'Location': 'location', + 'Text': 'text', + 'Toggle': 'toggle', + 'Dropdown': 'dropdown', + 'Color': 'color', + 'DateTime': 'datetime', + 'OAuth2': 'oauth2', + 'PhotoSelect': 'photo_select', + 'LocationBased': 'location_based', + 'Typeahead': 'typeahead', + 'Generated': 'generated', + } + + type_of = type_mapping.get(field_type, field_type.lower()) + + # Skip Generated fields (invisible meta-fields) + if type_of == 'generated': + return None + + field_dict = {"typeOf": type_of} + + # Extract common parameters + # id + id_match = re.search(r'id\s*=\s*"([^"]+)"', params_text) + if id_match: + field_dict['id'] = id_match.group(1) + else: + # id is required, skip field if missing + return None + + # name + name_match = re.search(r'name\s*=\s*"([^"]+)"', params_text) + if name_match: + field_dict['name'] = name_match.group(1) + + # desc + desc_match = re.search(r'desc\s*=\s*"([^"]+)"', params_text) + if desc_match: + field_dict['desc'] = desc_match.group(1) + + # icon + icon_match = re.search(r'icon\s*=\s*"([^"]+)"', params_text) + if icon_match: + field_dict['icon'] = icon_match.group(1) + + # default (can be string, bool, or variable reference) + # First try to match quoted strings (which may contain commas) + default_match = re.search(r'default\s*=\s*"([^"]*)"', params_text) + if not default_match: + # Try single quotes + default_match = re.search(r"default\s*=\s*'([^']*)'", params_text) + if not default_match: + # Fall back to unquoted value (stop at comma or closing paren) + default_match = re.search(r'default\s*=\s*([^,\)]+)', params_text) + + if default_match: + default_value = default_match.group(1).strip() + # Handle boolean + if default_value in ('True', 'False'): + field_dict['default'] = default_value.lower() + # Handle string literal from first two patterns (already 
extracted without quotes) + elif re.search(r'default\s*=\s*["\']', params_text): + # This was a quoted string, use the captured content directly + field_dict['default'] = default_value + # Handle variable reference (can't resolve, use as-is) + else: + # Try to extract just the value if it's like options[0].value + if '.' in default_value or '[' in default_value: + # Complex expression, skip default + pass + else: + field_dict['default'] = default_value + + # For dropdown, extract options + if type_of == 'dropdown': + options_match = re.search(r'options\s*=\s*([^,\)]+)', params_text) + if options_match: + options_ref = options_match.group(1).strip() + # Check if it's a variable reference + if options_ref in var_table: + field_dict['options'] = var_table[options_ref] + # Or inline options + elif options_ref.startswith('['): + # Find the full options array (handle nested brackets) + # This is tricky, for now try to extract inline options + inline_match = re.search(r'options\s*=\s*(\[.*?\])', params_text, re.DOTALL) + if inline_match: + options_text = inline_match.group(1) + field_dict['options'] = self._parse_schema_options(options_text, var_table) + + return field_dict + + def _parse_schema_options(self, options_text: str, var_table: Dict) -> List[Dict[str, str]]: + """ + Parse schema.Option list. + + Args: + options_text: Text containing schema.Option(...) 
entries
+            var_table: Variable lookup table (not currently used)
+
+        Returns:
+            List of {"display": "...", "value": "..."} dicts
+        """
+        options = []
+
+        # Match schema.Option(display = "...", value = "...")
+        option_pattern = r'schema\.Option\s*\(\s*display\s*=\s*"([^"]+)"\s*,\s*value\s*=\s*"([^"]+)"\s*\)'
+        matches = re.finditer(option_pattern, options_text)
+
+        for match in matches:
+            options.append({
+                "display": match.group(1),
+                "value": match.group(2)
+            })
+
+        return options
diff --git a/plugin-repos/starlark-apps/requirements.txt b/plugin-repos/starlark-apps/requirements.txt
new file mode 100644
index 000000000..7c1dfc12a
--- /dev/null
+++ b/plugin-repos/starlark-apps/requirements.txt
@@ -0,0 +1,3 @@
+Pillow>=10.4.0
+PyYAML>=6.0.2
+requests>=2.32.0
diff --git a/plugin-repos/starlark-apps/tronbyte_repository.py b/plugin-repos/starlark-apps/tronbyte_repository.py
new file mode 100644
index 000000000..ba647de07
--- /dev/null
+++ b/plugin-repos/starlark-apps/tronbyte_repository.py
@@ -0,0 +1,601 @@
+"""
+Tronbyte Repository Module
+
+Handles interaction with the Tronbyte apps repository on GitHub.
+Fetches app listings, metadata, and downloads .star files.
+"""
+
+import json
+import logging
+import time
+import requests
+import yaml
+import threading
+from typing import Dict, Any, Optional, List, Tuple
+from pathlib import Path
+from concurrent.futures import ThreadPoolExecutor, as_completed
+
+logger = logging.getLogger(__name__)
+
+# Module-level cache for bulk app listing (survives across requests)
+_apps_cache = {'data': None, 'timestamp': 0, 'categories': [], 'authors': []}
+_CACHE_TTL = 7200  # 2 hours
+_cache_lock = threading.Lock()
+
+
+class TronbyteRepository:
+    """
+    Interface to the Tronbyte apps repository.
+ + Provides methods to: + - List available apps + - Fetch app metadata + - Download .star files + - Parse manifest.yaml files + """ + + REPO_OWNER = "tronbyt" + REPO_NAME = "apps" + DEFAULT_BRANCH = "main" + APPS_PATH = "apps" + + def __init__(self, github_token: Optional[str] = None): + """ + Initialize repository interface. + + Args: + github_token: Optional GitHub personal access token for higher rate limits + """ + self.github_token = github_token + self.base_url = "https://api.github.com" + self.raw_url = "https://raw.githubusercontent.com" + + self.session = requests.Session() + if github_token: + self.session.headers.update({ + 'Authorization': f'token {github_token}' + }) + self.session.headers.update({ + 'Accept': 'application/vnd.github.v3+json', + 'User-Agent': 'LEDMatrix-Starlark-Plugin' + }) + + def _make_request(self, url: str, timeout: int = 10) -> Optional[Dict[str, Any]]: + """ + Make a request to GitHub API with error handling. + + Args: + url: API URL to request + timeout: Request timeout in seconds + + Returns: + JSON response or None on error + """ + try: + response = self.session.get(url, timeout=timeout) + + if response.status_code == 403: + # Rate limit exceeded + logger.warning("[Tronbyte Repo] GitHub API rate limit exceeded") + return None + elif response.status_code == 404: + logger.warning(f"[Tronbyte Repo] Resource not found: {url}") + return None + elif response.status_code != 200: + logger.error(f"[Tronbyte Repo] GitHub API error: {response.status_code}") + return None + + return response.json() + + except requests.Timeout: + logger.error(f"[Tronbyte Repo] Request timeout: {url}") + return None + except requests.RequestException as e: + logger.error(f"[Tronbyte Repo] Request error: {e}", exc_info=True) + return None + except (json.JSONDecodeError, ValueError) as e: + logger.error(f"[Tronbyte Repo] JSON parse error for {url}: {e}", exc_info=True) + return None + + def _fetch_raw_file(self, file_path: str, branch: Optional[str] = None, 
binary: bool = False): + """ + Fetch raw file content from repository. + + Args: + file_path: Path to file in repository + branch: Branch name (default: DEFAULT_BRANCH) + binary: If True, return bytes; if False, return text + + Returns: + File content as string/bytes, or None on error + """ + branch = branch or self.DEFAULT_BRANCH + url = f"{self.raw_url}/{self.REPO_OWNER}/{self.REPO_NAME}/{branch}/{file_path}" + + try: + response = self.session.get(url, timeout=10) + if response.status_code == 200: + return response.content if binary else response.text + else: + logger.warning(f"[Tronbyte Repo] Failed to fetch raw file: {file_path} ({response.status_code})") + return None + except requests.Timeout: + logger.error(f"[Tronbyte Repo] Timeout fetching raw file: {file_path}") + return None + except requests.RequestException as e: + logger.error(f"[Tronbyte Repo] Network error fetching raw file {file_path}: {e}", exc_info=True) + return None + + def list_apps(self) -> Tuple[bool, Optional[List[Dict[str, Any]]], Optional[str]]: + """ + List all available apps in the repository. + + Returns: + Tuple of (success, apps_list, error_message) + """ + url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}" + + data = self._make_request(url) + if data is None: + return False, None, "Failed to fetch repository contents" + + if not isinstance(data, list): + return False, None, "Invalid response format" + + # Filter directories (apps) + apps = [] + for item in data: + if item.get('type') == 'dir': + app_id = item.get('name') + if app_id and not app_id.startswith('.'): + apps.append({ + 'id': app_id, + 'path': item.get('path'), + 'url': item.get('url') + }) + + logger.info(f"Found {len(apps)} apps in repository") + return True, apps, None + + def get_app_metadata(self, app_id: str) -> Tuple[bool, Optional[Dict[str, Any]], Optional[str]]: + """ + Fetch metadata for a specific app. + + Reads the manifest.yaml file for the app and parses it. 
+ + Args: + app_id: App identifier + + Returns: + Tuple of (success, metadata_dict, error_message) + """ + manifest_path = f"{self.APPS_PATH}/{app_id}/manifest.yaml" + + content = self._fetch_raw_file(manifest_path) + if not content: + return False, None, f"Failed to fetch manifest for {app_id}" + + try: + metadata = yaml.safe_load(content) + + # Validate that metadata is a dict before mutating + if not isinstance(metadata, dict): + if metadata is None: + logger.warning(f"Manifest for {app_id} is empty or None, initializing empty dict") + metadata = {} + else: + logger.error(f"Manifest for {app_id} is not a dict (got {type(metadata).__name__}), skipping") + return False, None, f"Invalid manifest format: expected dict, got {type(metadata).__name__}" + + # Enhance with app_id + metadata['id'] = app_id + + # Parse schema if present + if 'schema' in metadata: + # Schema is already parsed from YAML + pass + + return True, metadata, None + + except (yaml.YAMLError, TypeError) as e: + logger.error(f"Failed to parse manifest for {app_id}: {e}") + return False, None, f"Invalid manifest format: {e}" + + def list_apps_with_metadata(self, max_apps: Optional[int] = None) -> List[Dict[str, Any]]: + """ + List all apps with their metadata. + + This is slower as it fetches manifest.yaml for each app. 
+ + Args: + max_apps: Optional limit on number of apps to fetch + + Returns: + List of app metadata dictionaries + """ + success, apps, error = self.list_apps() + + if not success: + logger.error(f"Failed to list apps: {error}") + return [] + + if max_apps is not None: + apps = apps[:max_apps] + + apps_with_metadata = [] + for app_info in apps: + app_id = app_info['id'] + success, metadata, error = self.get_app_metadata(app_id) + + if success and metadata: + # Merge basic info with metadata + metadata.update({ + 'repository_path': app_info['path'] + }) + apps_with_metadata.append(metadata) + else: + # Add basic info even if metadata fetch failed + apps_with_metadata.append({ + 'id': app_id, + 'name': app_id.replace('_', ' ').title(), + 'summary': 'No description available', + 'repository_path': app_info['path'], + 'metadata_error': error + }) + + return apps_with_metadata + + def list_all_apps_cached(self) -> Dict[str, Any]: + """ + Fetch ALL apps with metadata, using a module-level cache. + + On first call (or after cache TTL expires), fetches the directory listing + via the GitHub API (1 call) then fetches all manifests in parallel via + raw.githubusercontent.com (not rate-limited). Results are cached for 2 hours. 
+ + Returns: + Dict with keys: apps, categories, authors, count, cached + """ + global _apps_cache + + now = time.time() + + # Check cache with lock (read-only check) + with _cache_lock: + if _apps_cache['data'] is not None and (now - _apps_cache['timestamp']) < _CACHE_TTL: + return { + 'apps': _apps_cache['data'], + 'categories': _apps_cache['categories'], + 'authors': _apps_cache['authors'], + 'count': len(_apps_cache['data']), + 'cached': True + } + + # Fetch directory listing (1 GitHub API call) + success, app_dirs, error = self.list_apps() + if not success or not app_dirs: + logger.error(f"Failed to list apps for bulk fetch: {error}") + return {'apps': [], 'categories': [], 'authors': [], 'count': 0, 'cached': False} + + logger.info(f"Bulk-fetching manifests for {len(app_dirs)} apps...") + + def fetch_one(app_info): + """Fetch a single app's manifest (runs in thread pool).""" + app_id = app_info['id'] + manifest_path = f"{self.APPS_PATH}/{app_id}/manifest.yaml" + content = self._fetch_raw_file(manifest_path) + if content: + try: + metadata = yaml.safe_load(content) + if not isinstance(metadata, dict): + metadata = {} + metadata['id'] = app_id + metadata['repository_path'] = app_info.get('path', '') + return metadata + except (yaml.YAMLError, TypeError) as e: + logger.warning(f"Failed to parse manifest for {app_id}: {e}") + # Fallback: minimal entry + return { + 'id': app_id, + 'name': app_id.replace('_', ' ').replace('-', ' ').title(), + 'summary': 'No description available', + 'repository_path': app_info.get('path', ''), + } + + # Parallel manifest fetches via raw.githubusercontent.com (high rate limit) + apps_with_metadata = [] + with ThreadPoolExecutor(max_workers=5) as executor: + futures = {executor.submit(fetch_one, info): info for info in app_dirs} + for future in as_completed(futures): + try: + result = future.result(timeout=30) + if result: + apps_with_metadata.append(result) + except Exception as e: + app_info = futures[future] + 
logger.warning(f"Failed to fetch manifest for {app_info['id']}: {e}") + apps_with_metadata.append({ + 'id': app_info['id'], + 'name': app_info['id'].replace('_', ' ').replace('-', ' ').title(), + 'summary': 'No description available', + 'repository_path': app_info.get('path', ''), + }) + + # Sort by name for consistent ordering + apps_with_metadata.sort(key=lambda a: (a.get('name') or a.get('id', '')).lower()) + + # Extract unique categories and authors + categories = sorted({a.get('category', '') for a in apps_with_metadata if a.get('category')}) + authors = sorted({a.get('author', '') for a in apps_with_metadata if a.get('author')}) + + # Update cache with lock + with _cache_lock: + _apps_cache['data'] = apps_with_metadata + _apps_cache['timestamp'] = now + _apps_cache['categories'] = categories + _apps_cache['authors'] = authors + + logger.info(f"Cached {len(apps_with_metadata)} apps ({len(categories)} categories, {len(authors)} authors)") + + return { + 'apps': apps_with_metadata, + 'categories': categories, + 'authors': authors, + 'count': len(apps_with_metadata), + 'cached': False + } + + def download_star_file(self, app_id: str, output_path: Path, filename: Optional[str] = None) -> Tuple[bool, Optional[str]]: + """ + Download the .star file for an app. + + Args: + app_id: App identifier (directory name) + output_path: Where to save the .star file + filename: Optional specific filename from manifest (e.g., "analog_clock.star") + If not provided, assumes {app_id}.star + + Returns: + Tuple of (success, error_message) + """ + # Validate inputs for path traversal + if '..' in app_id or '/' in app_id or '\\' in app_id: + return False, f"Invalid app_id: contains path traversal characters" + + star_filename = filename or f"{app_id}.star" + if '..' 
in star_filename or '/' in star_filename or '\\' in star_filename:
+            return False, "Invalid filename: contains path traversal characters"
+
+        # Validate output_path to prevent path traversal
+        import tempfile
+        try:
+            resolved_output = output_path.resolve()
+            temp_dir = Path(tempfile.gettempdir()).resolve()
+
+            # Check that output_path is within the system temp directory.
+            # Path.is_relative_to() requires Python >= 3.9; fall back to a
+            # parents check on older interpreters (portable across platforms,
+            # unlike comparing string prefixes with a hard-coded '/').
+            try:
+                is_safe = resolved_output.is_relative_to(temp_dir)
+            except AttributeError:
+                # Fallback for Python < 3.9
+                is_safe = resolved_output == temp_dir or temp_dir in resolved_output.parents
+
+            if not is_safe:
+                logger.warning(f"Path traversal attempt in download_star_file: app_id={app_id}, output_path={output_path}")
+                return False, f"Invalid output_path for {app_id}: must be within temp directory"
+        except Exception as e:
+            logger.error(f"Error validating output_path for {app_id}: {e}")
+            return False, f"Invalid output_path for {app_id}"
+
+        # Build the repository path for the selected filename
+        star_path = f"{self.APPS_PATH}/{app_id}/{star_filename}"
+
+        content = self._fetch_raw_file(star_path)
+        if not content:
+            return False, f"Failed to download .star file for {app_id} (tried {star_filename})"
+
+        try:
+            output_path.parent.mkdir(parents=True, exist_ok=True)
+            with open(output_path, 'w', encoding='utf-8') as f:
+                f.write(content)
+
+            logger.info(f"Downloaded {star_filename} to {output_path}")
+            return True, None
+
+        except OSError as e:
+            logger.exception(f"Failed to save .star file: {e}")
+            return False, f"Failed to save file: {e}"
+
+    def get_app_files(self, app_id: str) -> Tuple[bool, Optional[List[str]], Optional[str]]:
+        """
+        List all files in an app directory.
+ + Args: + app_id: App identifier + + Returns: + Tuple of (success, file_list, error_message) + """ + url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}/{app_id}" + + data = self._make_request(url) + if not data: + return False, None, "Failed to fetch app files" + + if not isinstance(data, list): + return False, None, "Invalid response format" + + files = [item['name'] for item in data if item.get('type') == 'file'] + return True, files, None + + def download_app_assets(self, app_id: str, output_dir: Path) -> Tuple[bool, Optional[str]]: + """ + Download all asset files (images, sources, etc.) for an app. + + Args: + app_id: App identifier + output_dir: Directory to save assets to + + Returns: + Tuple of (success, error_message) + """ + # Validate app_id for path traversal + if '..' in app_id or '/' in app_id or '\\' in app_id: + return False, f"Invalid app_id: contains path traversal characters" + + try: + # Get directory listing for the app + url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}/{app_id}" + data = self._make_request(url) + if not data: + return False, f"Failed to fetch app directory listing" + + if not isinstance(data, list): + return False, f"Invalid directory listing format" + + # Find directories that contain assets (images, sources, etc.) + asset_dirs = [] + for item in data: + if item.get('type') == 'dir': + dir_name = item.get('name') + # Common asset directory names in Tronbyte apps + if dir_name in ('images', 'sources', 'fonts', 'assets'): + asset_dirs.append((dir_name, item.get('url'))) + + if not asset_dirs: + # No asset directories, this is fine + return True, None + + # Download each asset directory + for dir_name, dir_url in asset_dirs: + # Validate directory name for path traversal + if '..' 
in dir_name or '/' in dir_name or '\\' in dir_name: + logger.warning(f"Skipping potentially unsafe directory: {dir_name}") + continue + + # Get files in this directory + dir_data = self._make_request(dir_url) + if not dir_data or not isinstance(dir_data, list): + logger.warning(f"Could not list files in {app_id}/{dir_name}") + continue + + # Create local directory + local_dir = output_dir / dir_name + local_dir.mkdir(parents=True, exist_ok=True) + + # Download each file + for file_item in dir_data: + if file_item.get('type') == 'file': + file_name = file_item.get('name') + + # Ensure file_name is a non-empty string before validation + if not file_name or not isinstance(file_name, str): + logger.warning(f"Skipping file with invalid name in {dir_name}: {file_item}") + continue + + # Validate filename for path traversal + if '..' in file_name or '/' in file_name or '\\' in file_name: + logger.warning(f"Skipping potentially unsafe file: {file_name}") + continue + + file_path = f"{self.APPS_PATH}/{app_id}/{dir_name}/{file_name}" + content = self._fetch_raw_file(file_path, binary=True) + if content: + # Write binary content to file + output_path = local_dir / file_name + try: + with open(output_path, 'wb') as f: + f.write(content) + logger.debug(f"[Tronbyte Repo] Downloaded asset: {dir_name}/{file_name}") + except OSError as e: + logger.warning(f"[Tronbyte Repo] Failed to save {dir_name}/{file_name}: {e}", exc_info=True) + else: + logger.warning(f"Failed to download {dir_name}/{file_name}") + + logger.info(f"[Tronbyte Repo] Downloaded assets for {app_id} ({len(asset_dirs)} directories)") + return True, None + + except (OSError, ValueError) as e: + logger.exception(f"[Tronbyte Repo] Error downloading assets for {app_id}: {e}") + return False, f"Error downloading assets: {e}" + + def search_apps(self, query: str, apps_with_metadata: List[Dict[str, Any]]) -> List[Dict[str, Any]]: + """ + Search apps by name, summary, or description. 
+ + Args: + query: Search query string + apps_with_metadata: List of apps with metadata + + Returns: + Filtered list of apps matching query + """ + if not query: + return apps_with_metadata + + query_lower = query.lower() + results = [] + + for app in apps_with_metadata: + # Search in name, summary, description, author + searchable = ' '.join([ + app.get('name', ''), + app.get('summary', ''), + app.get('desc', ''), + app.get('author', ''), + app.get('id', '') + ]).lower() + + if query_lower in searchable: + results.append(app) + + return results + + def filter_by_category(self, category: str, apps_with_metadata: List[Dict[str, Any]]) -> List[Dict[str, Any]]: + """ + Filter apps by category. + + Args: + category: Category name (or 'all' for no filtering) + apps_with_metadata: List of apps with metadata + + Returns: + Filtered list of apps + """ + if not category or category.lower() == 'all': + return apps_with_metadata + + category_lower = category.lower() + results = [] + + for app in apps_with_metadata: + app_category = app.get('category', '').lower() + if app_category == category_lower: + results.append(app) + + return results + + def get_rate_limit_info(self) -> Dict[str, Any]: + """ + Get current GitHub API rate limit information. 
+
+        Returns:
+            Dictionary with rate limit info
+        """
+        url = f"{self.base_url}/rate_limit"
+        data = self._make_request(url)
+
+        if data:
+            core = data.get('resources', {}).get('core', {})
+            return {
+                'limit': core.get('limit', 0),
+                'remaining': core.get('remaining', 0),
+                'reset': core.get('reset', 0),
+                'used': core.get('used', 0)
+            }
+
+        return {
+            'limit': 0,
+            'remaining': 0,
+            'reset': 0,
+            'used': 0
+        }
diff --git a/scripts/download_pixlet.sh b/scripts/download_pixlet.sh
new file mode 100755
index 000000000..b3d4070aa
--- /dev/null
+++ b/scripts/download_pixlet.sh
@@ -0,0 +1,139 @@
+#!/bin/bash
+#
+# Download Pixlet binaries for bundled distribution
+#
+# This script downloads Pixlet binaries from the Tronbyte fork
+# for multiple architectures to support various platforms.
+
+set -e
+
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"
+BIN_DIR="$PROJECT_ROOT/bin/pixlet"
+
+# Pixlet version to download (use 'latest' to auto-detect)
+PIXLET_VERSION="${PIXLET_VERSION:-latest}"
+
+# GitHub repository (Tronbyte fork)
+REPO="tronbyt/pixlet"
+
+echo "========================================"
+echo "Pixlet Binary Download Script"
+echo "========================================"
+
+# Auto-detect latest version if needed
+if [ "$PIXLET_VERSION" = "latest" ]; then
+    echo "Detecting latest version..."
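A reviewer's aside on the detection step: the grep/sed pipeline used here to pull `tag_name` out of GitHub's `releases/latest` response can be mirrored in a few lines of Python, which makes the fallback behaviour easy to unit-test. This is an illustrative sketch only (`parse_latest_tag` is not part of the PR; the pinned fallback mirrors the script's `v0.50.2`):

```python
import json

def parse_latest_tag(release_json: str) -> str:
    # Same contract as the shell pipeline: return "tag_name" from the
    # /releases/latest payload, or the pinned fallback when it is absent
    # or the payload is not valid JSON.
    try:
        tag = json.loads(release_json).get("tag_name")
    except json.JSONDecodeError:
        tag = None
    return tag or "v0.50.2"

print(parse_latest_tag('{"tag_name": "v0.51.0"}'))  # v0.51.0
print(parse_latest_tag('{}'))                       # v0.50.2
```

Parsing the JSON instead of grepping also survives formatting changes in the API response, at the cost of needing a JSON-aware tool (e.g. `jq`) in shell.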
+    PIXLET_VERSION=$(curl -fsS "https://api.github.com/repos/${REPO}/releases/latest" | grep '"tag_name"' | sed -E 's/.*"([^"]+)".*/\1/')
+    if [ -z "$PIXLET_VERSION" ]; then
+        echo "Failed to detect latest version, using fallback"
+        PIXLET_VERSION="v0.50.2"
+    fi
+fi
+
+echo "Version: $PIXLET_VERSION"
+echo "Target directory: $BIN_DIR"
+echo ""
+
+# Create bin directory if it doesn't exist
+mkdir -p "$BIN_DIR"
+
+# New naming convention: pixlet_v0.50.2_linux-arm64.tar.gz
+# Only download ARM64 Linux binary for Raspberry Pi
+declare -A ARCHITECTURES=(
+    ["linux-arm64"]="pixlet_${PIXLET_VERSION}_linux-arm64.tar.gz"
+)
+
+download_binary() {
+    local arch="$1"
+    local archive_name="$2"
+    local binary_name="pixlet-${arch}"
+
+    local output_path="$BIN_DIR/$binary_name"
+
+    # Skip if already exists
+    if [ -f "$output_path" ]; then
+        echo "✓ $binary_name already exists, skipping..."
+        return 0
+    fi
+
+    echo "→ Downloading $arch..."
+
+    # Construct download URL
+    local url="https://github.com/${REPO}/releases/download/${PIXLET_VERSION}/${archive_name}"
+
+    # Download to temp directory (use project-local temp to avoid /tmp permission issues)
+    local temp_dir
+    temp_dir=$(mktemp -d -p "$PROJECT_ROOT" -t pixlet_download.XXXXXXXXXX)
+    local temp_file="$temp_dir/$archive_name"
+
+    # -f: fail on HTTP errors (e.g. 404) instead of saving the error page as the archive
+    if ! curl -fL -o "$temp_file" "$url" 2>/dev/null; then
+        echo "✗ Failed to download $arch"
+        rm -rf "$temp_dir"
+        return 1
+    fi
+
+    # Extract binary
+    echo "  Extracting..."
+    if ! 
tar -xzf "$temp_file" -C "$temp_dir"; then
+        echo "✗ Failed to extract archive: $temp_file"
+        rm -rf "$temp_dir"
+        return 1
+    fi
+
+    # Find the pixlet binary in extracted files
+    local extracted_binary
+    extracted_binary=$(find "$temp_dir" -name "pixlet" | head -n 1)
+
+    if [ -z "$extracted_binary" ]; then
+        echo "✗ Binary not found in archive"
+        rm -rf "$temp_dir"
+        return 1
+    fi
+
+    # Move to final location
+    mv "$extracted_binary" "$output_path"
+
+    # Make executable
+    chmod +x "$output_path"
+
+    # Clean up
+    rm -rf "$temp_dir"
+
+    # Verify
+    local size
+    size=$(stat -f%z "$output_path" 2>/dev/null || stat -c%s "$output_path" 2>/dev/null || echo "unknown")
+    if [ "$size" = "unknown" ]; then
+        echo "✓ Downloaded $binary_name"
+    else
+        echo "✓ Downloaded $binary_name ($(numfmt --to=iec-i --suffix=B "$size" 2>/dev/null || echo "${size} bytes"))"
+    fi
+
+    return 0
+}
+
+# Download binaries for each architecture
+success_count=0
+total_count=${#ARCHITECTURES[@]}
+
+for arch in "${!ARCHITECTURES[@]}"; do
+    if download_binary "$arch" "${ARCHITECTURES[$arch]}"; then
+        # Note: ((success_count++)) returns non-zero when the count is 0,
+        # which would abort the script under `set -e`; use plain arithmetic
+        # expansion instead.
+        success_count=$((success_count + 1))
+    fi
+done
+
+echo ""
+echo "========================================"
+echo "Download complete: $success_count/$total_count succeeded"
+echo "========================================"
+
+# List downloaded binaries
+echo ""
+echo "Installed binaries:"
+if compgen -G "$BIN_DIR/*" > /dev/null 2>&1; then
+    ls -lh "$BIN_DIR"/*
+else
+    echo "No binaries found"
+fi
+
+exit 0
diff --git a/web_interface/blueprints/api_v3.py b/web_interface/blueprints/api_v3.py
index adfa31435..36b10d7a4 100644
--- a/web_interface/blueprints/api_v3.py
+++ b/web_interface/blueprints/api_v3.py
@@ -10,6 +10,7 @@
 import logging
 from datetime import datetime
 from pathlib import Path
+from typing import Optional, Tuple, Dict, Any, Type
 
 logger = logging.getLogger(__name__)
 
@@ -1852,6 +1853,55 @@ def get_installed_plugins():
                'vegas_content_type': vegas_content_type
            })
 
+        # Append virtual entries for installed Starlark apps
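The block that follows surfaces each installed Starlark app to the plugin list under a `starlark:`-prefixed id, and the toggle endpoint later recovers the app id with `plugin_id[len('starlark:'):]`. A minimal sketch of that round-trip (the helper names are illustrative, not part of the PR):

```python
from typing import Optional

STARLARK_PREFIX = 'starlark:'

def to_plugin_id(app_id: str) -> str:
    # Namespace Starlark apps so their ids cannot collide with
    # regular plugin ids in the shared plugins list.
    return f"{STARLARK_PREFIX}{app_id}"

def from_plugin_id(plugin_id: str) -> Optional[str]:
    # Inverse of to_plugin_id, matching the toggle endpoint's
    # plugin_id[len('starlark:'):] slice; None for ordinary plugins.
    if plugin_id.startswith(STARLARK_PREFIX):
        return plugin_id[len(STARLARK_PREFIX):]
    return None

print(to_plugin_id('clock'))             # starlark:clock
print(from_plugin_id('starlark:clock'))  # clock
print(from_plugin_id('weather'))         # None
```

Because the prefix check happens before the regular plugin lookup, routes that receive a `starlark:` id can branch early and never hit `plugin_manager.plugin_manifests`.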
+ starlark_plugin = _get_starlark_plugin() + if starlark_plugin and hasattr(starlark_plugin, 'apps'): + for app_id, app in starlark_plugin.apps.items(): + plugins.append({ + 'id': f'starlark:{app_id}', + 'name': app.manifest.get('name', app_id), + 'version': 'starlark', + 'author': app.manifest.get('author', 'Tronbyte Community'), + 'category': 'Starlark App', + 'description': app.manifest.get('summary', 'Starlark app'), + 'tags': ['starlark'], + 'enabled': app.is_enabled(), + 'verified': False, + 'loaded': True, + 'last_updated': None, + 'last_commit': None, + 'last_commit_message': None, + 'branch': None, + 'web_ui_actions': [], + 'vegas_mode': 'fixed', + 'vegas_content_type': 'multi', + 'is_starlark_app': True, + }) + else: + # Standalone: read from manifest on disk + manifest = _read_starlark_manifest() + for app_id, app_data in manifest.get('apps', {}).items(): + plugins.append({ + 'id': f'starlark:{app_id}', + 'name': app_data.get('name', app_id), + 'version': 'starlark', + 'author': 'Tronbyte Community', + 'category': 'Starlark App', + 'description': 'Starlark app', + 'tags': ['starlark'], + 'enabled': app_data.get('enabled', True), + 'verified': False, + 'loaded': False, + 'last_updated': None, + 'last_commit': None, + 'last_commit_message': None, + 'branch': None, + 'web_ui_actions': [], + 'vegas_mode': 'fixed', + 'vegas_content_type': 'multi', + 'is_starlark_app': True, + }) + return jsonify({'status': 'success', 'data': {'plugins': plugins}}) except Exception as e: import traceback @@ -2127,6 +2177,28 @@ def toggle_plugin(): current_enabled = config.get(plugin_id, {}).get('enabled', False) enabled = not current_enabled + # Handle starlark app toggle (starlark: prefix) + if plugin_id.startswith('starlark:'): + starlark_app_id = plugin_id[len('starlark:'):] + starlark_plugin = _get_starlark_plugin() + if starlark_plugin and starlark_app_id in starlark_plugin.apps: + app = starlark_plugin.apps[starlark_app_id] + app.manifest['enabled'] = enabled + # Use 
safe manifest update to prevent race conditions + def update_fn(manifest): + manifest['apps'][starlark_app_id]['enabled'] = enabled + starlark_plugin._update_manifest_safe(update_fn) + else: + # Standalone: update manifest directly + manifest = _read_starlark_manifest() + app_data = manifest.get('apps', {}).get(starlark_app_id) + if not app_data: + return jsonify({'status': 'error', 'message': f'Starlark app not found: {starlark_app_id}'}), 404 + app_data['enabled'] = enabled + if not _write_starlark_manifest(manifest): + return jsonify({'status': 'error', 'message': 'Failed to save manifest'}), 500 + return jsonify({'status': 'success', 'message': f"Starlark app {'enabled' if enabled else 'disabled'}", 'enabled': enabled}) + # Check if plugin exists in manifests (discovered but may not be loaded) if plugin_id not in api_v3.plugin_manager.plugin_manifests: return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404 @@ -6903,4 +6975,907 @@ def clear_old_errors(): message="Failed to clear old errors", details=str(e), status_code=500 - ) \ No newline at end of file + ) + + +# ─── Starlark Apps API ────────────────────────────────────────────────────── + +def _get_tronbyte_repository_class() -> Type[Any]: + """Import TronbyteRepository from plugin-repos directory.""" + import importlib.util + import importlib + + module_path = PROJECT_ROOT / 'plugin-repos' / 'starlark-apps' / 'tronbyte_repository.py' + if not module_path.exists(): + raise ImportError(f"TronbyteRepository module not found at {module_path}") + + # If already imported, reload to pick up code changes + if "tronbyte_repository" in sys.modules: + importlib.reload(sys.modules["tronbyte_repository"]) + return sys.modules["tronbyte_repository"].TronbyteRepository + + spec = importlib.util.spec_from_file_location("tronbyte_repository", str(module_path)) + if spec is None: + raise ImportError(f"Failed to create module spec for tronbyte_repository at {module_path}") + + module = 
importlib.util.module_from_spec(spec) + if module is None: + raise ImportError("Failed to create module from spec for tronbyte_repository") + + sys.modules["tronbyte_repository"] = module + spec.loader.exec_module(module) + return module.TronbyteRepository + + +def _get_pixlet_renderer_class() -> Type[Any]: + """Import PixletRenderer from plugin-repos directory.""" + import importlib.util + import importlib + + module_path = PROJECT_ROOT / 'plugin-repos' / 'starlark-apps' / 'pixlet_renderer.py' + if not module_path.exists(): + raise ImportError(f"PixletRenderer module not found at {module_path}") + + # If already imported, reload to pick up code changes + if "pixlet_renderer" in sys.modules: + importlib.reload(sys.modules["pixlet_renderer"]) + return sys.modules["pixlet_renderer"].PixletRenderer + + spec = importlib.util.spec_from_file_location("pixlet_renderer", str(module_path)) + if spec is None: + raise ImportError(f"Failed to create module spec for pixlet_renderer at {module_path}") + + module = importlib.util.module_from_spec(spec) + if module is None: + raise ImportError("Failed to create module from spec for pixlet_renderer") + + sys.modules["pixlet_renderer"] = module + spec.loader.exec_module(module) + return module.PixletRenderer + + +def _validate_and_sanitize_app_id(app_id: Optional[str], fallback_source: Optional[str] = None) -> Tuple[Optional[str], Optional[str]]: + """Validate and sanitize app_id to a safe slug.""" + if not app_id and fallback_source: + app_id = fallback_source + if not app_id: + return None, "app_id is required" + if '..' 
in app_id or '/' in app_id or '\\' in app_id: + return None, "app_id contains invalid characters" + + sanitized = re.sub(r'[^a-z0-9_]', '_', app_id.lower()).strip('_') + if not sanitized: + sanitized = f"app_{hashlib.sha256(app_id.encode()).hexdigest()[:12]}" + if sanitized[0].isdigit(): + sanitized = f"app_{sanitized}" + return sanitized, None + + +def _validate_timing_value(value: Any, field_name: str, min_val: int = 1, max_val: int = 86400) -> Tuple[Optional[int], Optional[str]]: + """Validate and coerce timing values.""" + if value is None: + return None, None + try: + int_value = int(value) + except (ValueError, TypeError): + return None, f"{field_name} must be an integer" + if int_value < min_val: + return None, f"{field_name} must be at least {min_val}" + if int_value > max_val: + return None, f"{field_name} must be at most {max_val}" + return int_value, None + + +def _get_starlark_plugin() -> Optional[Any]: + """Get the starlark-apps plugin instance, or None.""" + if not api_v3.plugin_manager: + return None + return api_v3.plugin_manager.get_plugin('starlark-apps') + + +def _validate_starlark_app_path(app_id: str) -> Tuple[bool, Optional[str]]: + """ + Validate app_id for path traversal attacks before filesystem access. + + Args: + app_id: App identifier from user input + + Returns: + Tuple of (is_valid, error_message) + """ + # Check for path traversal characters + if '..' 
in app_id or '/' in app_id or '\\' in app_id: + return False, f"Invalid app_id: contains path traversal characters" + + # Construct and resolve the path + try: + app_path = (_STARLARK_APPS_DIR / app_id).resolve() + base_path = _STARLARK_APPS_DIR.resolve() + + # Verify the resolved path is within the base directory + try: + app_path.relative_to(base_path) + return True, None + except ValueError: + return False, f"Invalid app_id: path traversal attempt" + except Exception as e: + logger.warning(f"Path validation error for app_id '{app_id}': {e}") + return False, f"Invalid app_id" + + +# Starlark standalone helpers for web service (plugin not loaded) +_STARLARK_APPS_DIR = PROJECT_ROOT / 'starlark-apps' +_STARLARK_MANIFEST_FILE = _STARLARK_APPS_DIR / 'manifest.json' + + +def _read_starlark_manifest() -> Dict[str, Any]: + """Read the starlark-apps manifest.json directly from disk.""" + try: + if _STARLARK_MANIFEST_FILE.exists(): + with open(_STARLARK_MANIFEST_FILE, 'r') as f: + return json.load(f) + except (json.JSONDecodeError, OSError) as e: + logger.error(f"Error reading starlark manifest: {e}") + return {'apps': {}} + + +def _write_starlark_manifest(manifest: Dict[str, Any]) -> bool: + """Write the starlark-apps manifest.json to disk with atomic write.""" + temp_file = None + try: + _STARLARK_APPS_DIR.mkdir(parents=True, exist_ok=True) + + # Atomic write pattern: write to temp file, then rename + temp_file = _STARLARK_MANIFEST_FILE.with_suffix('.tmp') + with open(temp_file, 'w') as f: + json.dump(manifest, f, indent=2) + f.flush() + os.fsync(f.fileno()) # Ensure data is written to disk + + # Atomic rename (overwrites destination) + temp_file.replace(_STARLARK_MANIFEST_FILE) + return True + except OSError as e: + logger.error(f"Error writing starlark manifest: {e}") + # Clean up temp file if it exists + if temp_file and temp_file.exists(): + try: + temp_file.unlink() + except Exception: + pass + return False + + +def _install_star_file(app_id: str, star_file_path: 
str, metadata: Dict[str, Any], assets_dir: Optional[str] = None) -> bool: + """Install a .star file and update the manifest (standalone, no plugin needed).""" + import shutil + import json + app_dir = _STARLARK_APPS_DIR / app_id + app_dir.mkdir(parents=True, exist_ok=True) + dest = app_dir / f"{app_id}.star" + shutil.copy2(star_file_path, str(dest)) + + # Copy asset directories if provided (images/, sources/, etc.) + if assets_dir and Path(assets_dir).exists(): + assets_path = Path(assets_dir) + for item in assets_path.iterdir(): + if item.is_dir(): + # Copy entire directory (e.g., images/, sources/) + dest_dir = app_dir / item.name + if dest_dir.exists(): + shutil.rmtree(dest_dir) + shutil.copytree(item, dest_dir) + logger.debug(f"Copied assets directory: {item.name}") + logger.info(f"Installed assets for {app_id}") + + # Try to extract schema using PixletRenderer + schema = None + try: + PixletRenderer = _get_pixlet_renderer_class() + pixlet = PixletRenderer() + if pixlet.is_available(): + _, schema, _ = pixlet.extract_schema(str(dest)) + if schema: + schema_path = app_dir / "schema.json" + with open(schema_path, 'w') as f: + json.dump(schema, f, indent=2) + logger.info(f"Extracted schema for {app_id}") + except Exception as e: + logger.warning(f"Failed to extract schema for {app_id}: {e}") + + # Create default config — pre-populate with schema defaults + default_config = {} + if schema: + fields = schema.get('fields') or schema.get('schema') or [] + for field in fields: + if isinstance(field, dict) and 'id' in field and 'default' in field: + default_config[field['id']] = field['default'] + + # Create config.json file + config_path = app_dir / "config.json" + with open(config_path, 'w') as f: + json.dump(default_config, f, indent=2) + + manifest = _read_starlark_manifest() + manifest.setdefault('apps', {})[app_id] = { + 'name': metadata.get('name', app_id), + 'enabled': True, + 'render_interval': metadata.get('render_interval', 300), + 'display_duration': 
metadata.get('display_duration', 15), + 'config': metadata.get('config', {}), + 'star_file': str(dest), + } + return _write_starlark_manifest(manifest) + + +@api_v3.route('/starlark/status', methods=['GET']) +def get_starlark_status(): + """Get Starlark plugin status and Pixlet availability.""" + try: + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + info = starlark_plugin.get_info() + magnify_info = starlark_plugin.get_magnify_recommendation() + return jsonify({ + 'status': 'success', + 'pixlet_available': info.get('pixlet_available', False), + 'pixlet_version': info.get('pixlet_version'), + 'installed_apps': info.get('installed_apps', 0), + 'enabled_apps': info.get('enabled_apps', 0), + 'current_app': info.get('current_app'), + 'plugin_enabled': starlark_plugin.enabled, + 'display_info': magnify_info + }) + + # Plugin not loaded - check Pixlet availability directly + import shutil + import platform + + system = platform.system().lower() + machine = platform.machine().lower() + bin_dir = PROJECT_ROOT / 'bin' / 'pixlet' + + pixlet_binary = None + if system == "linux": + if "aarch64" in machine or "arm64" in machine: + pixlet_binary = bin_dir / "pixlet-linux-arm64" + elif "x86_64" in machine or "amd64" in machine: + pixlet_binary = bin_dir / "pixlet-linux-amd64" + elif system == "darwin": + pixlet_binary = bin_dir / ("pixlet-darwin-arm64" if "arm64" in machine else "pixlet-darwin-amd64") + + pixlet_available = (pixlet_binary and pixlet_binary.exists()) or shutil.which('pixlet') is not None + + # Read app counts from manifest + manifest = _read_starlark_manifest() + apps = manifest.get('apps', {}) + installed_count = len(apps) + enabled_count = sum(1 for a in apps.values() if a.get('enabled', True)) + + return jsonify({ + 'status': 'success', + 'pixlet_available': pixlet_available, + 'pixlet_version': None, + 'installed_apps': installed_count, + 'enabled_apps': enabled_count, + 'plugin_enabled': True, + 'plugin_loaded': False, + 'display_info': {} + 
}) + + except Exception as e: + logger.error(f"Error getting starlark status: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps', methods=['GET']) +def get_starlark_apps(): + """List all installed Starlark apps.""" + try: + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + apps_list = [] + for app_id, app_instance in starlark_plugin.apps.items(): + apps_list.append({ + 'id': app_id, + 'name': app_instance.manifest.get('name', app_id), + 'enabled': app_instance.is_enabled(), + 'has_frames': app_instance.frames is not None, + 'render_interval': app_instance.get_render_interval(), + 'display_duration': app_instance.get_display_duration(), + 'config': app_instance.config, + 'has_schema': app_instance.schema is not None, + 'last_render_time': app_instance.last_render_time + }) + return jsonify({'status': 'success', 'apps': apps_list, 'count': len(apps_list)}) + + # Standalone: read manifest from disk + manifest = _read_starlark_manifest() + apps_list = [] + for app_id, app_data in manifest.get('apps', {}).items(): + apps_list.append({ + 'id': app_id, + 'name': app_data.get('name', app_id), + 'enabled': app_data.get('enabled', True), + 'has_frames': False, + 'render_interval': app_data.get('render_interval', 300), + 'display_duration': app_data.get('display_duration', 15), + 'config': app_data.get('config', {}), + 'has_schema': False, + 'last_render_time': None + }) + return jsonify({'status': 'success', 'apps': apps_list, 'count': len(apps_list)}) + + except Exception as e: + logger.error(f"Error getting starlark apps: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps/', methods=['GET']) +def get_starlark_app(app_id): + """Get details for a specific Starlark app.""" + try: + # Validate app_id before any filesystem access + is_valid, error_msg = _validate_starlark_app_path(app_id) + if not is_valid: + return jsonify({'status': 'error', 'message': 
error_msg}), 400 + + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + app = starlark_plugin.apps.get(app_id) + if not app: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + return jsonify({ + 'status': 'success', + 'app': { + 'id': app_id, + 'name': app.manifest.get('name', app_id), + 'enabled': app.is_enabled(), + 'config': app.config, + 'schema': app.schema, + 'render_interval': app.get_render_interval(), + 'display_duration': app.get_display_duration(), + 'has_frames': app.frames is not None, + 'frame_count': len(app.frames) if app.frames else 0, + 'last_render_time': app.last_render_time, + } + }) + + # Standalone: read from manifest + manifest = _read_starlark_manifest() + app_data = manifest.get('apps', {}).get(app_id) + if not app_data: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + + # Load schema from schema.json if it exists (path already validated above) + schema = None + schema_file = _STARLARK_APPS_DIR / app_id / 'schema.json' + if schema_file.exists(): + try: + with open(schema_file, 'r') as f: + schema = json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"Failed to load schema for {app_id}: {e}") + + return jsonify({ + 'status': 'success', + 'app': { + 'id': app_id, + 'name': app_data.get('name', app_id), + 'enabled': app_data.get('enabled', True), + 'config': app_data.get('config', {}), + 'schema': schema, + 'render_interval': app_data.get('render_interval', 300), + 'display_duration': app_data.get('display_duration', 15), + 'has_frames': False, + 'frame_count': 0, + 'last_render_time': None, + } + }) + + except Exception as e: + logger.error(f"Error getting starlark app {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/upload', methods=['POST']) +def upload_starlark_app(): + """Upload and install a new Starlark app.""" + try: + if 'file' not in request.files: + return 
jsonify({'status': 'error', 'message': 'No file uploaded'}), 400 + + file = request.files['file'] + if not file.filename or not file.filename.endswith('.star'): + return jsonify({'status': 'error', 'message': 'File must have .star extension'}), 400 + + # Check file size (limit to 5MB for .star files) + file.seek(0, 2) # Seek to end + file_size = file.tell() + file.seek(0) # Reset to beginning + MAX_STAR_SIZE = 5 * 1024 * 1024 # 5MB + if file_size > MAX_STAR_SIZE: + return jsonify({'status': 'error', 'message': f'File too large (max 5MB, got {file_size/1024/1024:.1f}MB)'}), 400 + + app_name = request.form.get('name') + app_id_input = request.form.get('app_id') + filename_base = file.filename.replace('.star', '') if file.filename else None + app_id, app_id_error = _validate_and_sanitize_app_id(app_id_input, fallback_source=filename_base) + if app_id_error: + return jsonify({'status': 'error', 'message': f'Invalid app_id: {app_id_error}'}), 400 + + render_interval_input = request.form.get('render_interval') + render_interval = 300 + if render_interval_input is not None: + render_interval, err = _validate_timing_value(render_interval_input, 'render_interval') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + render_interval = render_interval or 300 + + display_duration_input = request.form.get('display_duration') + display_duration = 15 + if display_duration_input is not None: + display_duration, err = _validate_timing_value(display_duration_input, 'display_duration') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + display_duration = display_duration or 15 + + import tempfile + with tempfile.NamedTemporaryFile(delete=False, suffix='.star') as tmp: + file.save(tmp.name) + temp_path = tmp.name + + try: + metadata = {'name': app_name or app_id, 'render_interval': render_interval, 'display_duration': display_duration} + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + success = starlark_plugin.install_app(app_id, 
temp_path, metadata) + else: + success = _install_star_file(app_id, temp_path, metadata) + if success: + return jsonify({'status': 'success', 'message': f'App installed: {app_id}', 'app_id': app_id}) + else: + return jsonify({'status': 'error', 'message': 'Failed to install app'}), 500 + finally: + try: + os.unlink(temp_path) + except OSError: + pass + + except (ValueError, OSError, IOError) as e: + logger.exception("[Starlark] Error uploading starlark app") + return jsonify({'status': 'error', 'message': 'Failed to upload app'}), 500 + + +@api_v3.route('/starlark/apps/', methods=['DELETE']) +def uninstall_starlark_app(app_id): + """Uninstall a Starlark app.""" + try: + # Validate app_id before any filesystem access + is_valid, error_msg = _validate_starlark_app_path(app_id) + if not is_valid: + return jsonify({'status': 'error', 'message': error_msg}), 400 + + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + success = starlark_plugin.uninstall_app(app_id) + else: + # Standalone: remove app dir and manifest entry (path already validated) + import shutil + app_dir = _STARLARK_APPS_DIR / app_id + + if app_dir.exists(): + shutil.rmtree(app_dir) + manifest = _read_starlark_manifest() + manifest.get('apps', {}).pop(app_id, None) + success = _write_starlark_manifest(manifest) + + if success: + return jsonify({'status': 'success', 'message': f'App uninstalled: {app_id}'}) + else: + return jsonify({'status': 'error', 'message': 'Failed to uninstall app'}), 500 + + except Exception as e: + logger.error(f"Error uninstalling starlark app {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps//config', methods=['GET']) +def get_starlark_app_config(app_id): + """Get configuration for a Starlark app.""" + try: + # Validate app_id before any filesystem access + is_valid, error_msg = _validate_starlark_app_path(app_id) + if not is_valid: + return jsonify({'status': 'error', 'message': error_msg}), 400 + + 
starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + app = starlark_plugin.apps.get(app_id) + if not app: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + return jsonify({'status': 'success', 'config': app.config, 'schema': app.schema}) + + # Standalone: read from config.json file (path already validated) + app_dir = _STARLARK_APPS_DIR / app_id + config_file = app_dir / "config.json" + + if not app_dir.exists(): + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + + config = {} + if config_file.exists(): + try: + with open(config_file, 'r') as f: + config = json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"Failed to load config for {app_id}: {e}") + + # Load schema from schema.json + schema = None + schema_file = app_dir / "schema.json" + if schema_file.exists(): + try: + with open(schema_file, 'r') as f: + schema = json.load(f) + except Exception as e: + logger.warning(f"Failed to load schema for {app_id}: {e}") + + return jsonify({'status': 'success', 'config': config, 'schema': schema}) + + except Exception as e: + logger.error(f"Error getting config for {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps//config', methods=['PUT']) +def update_starlark_app_config(app_id): + """Update configuration for a Starlark app.""" + try: + # Validate app_id before any filesystem access + is_valid, error_msg = _validate_starlark_app_path(app_id) + if not is_valid: + return jsonify({'status': 'error', 'message': error_msg}), 400 + + data = request.get_json() + if not data: + return jsonify({'status': 'error', 'message': 'No configuration provided'}), 400 + + if 'render_interval' in data: + val, err = _validate_timing_value(data['render_interval'], 'render_interval') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + data['render_interval'] = val + + if 'display_duration' in data: + val, err 
= _validate_timing_value(data['display_duration'], 'display_duration') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + data['display_duration'] = val + + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + app = starlark_plugin.apps.get(app_id) + if not app: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + + # Extract timing keys from data before updating config (they belong in manifest, not config) + render_interval = data.pop('render_interval', None) + display_duration = data.pop('display_duration', None) + + # Update config with non-timing fields only + app.config.update(data) + + # Update manifest with timing fields + timing_changed = False + if render_interval is not None: + app.manifest['render_interval'] = render_interval + timing_changed = True + if display_duration is not None: + app.manifest['display_duration'] = display_duration + timing_changed = True + if app.save_config(): + # Persist manifest if timing changed (same pattern as toggle endpoint) + if timing_changed: + try: + # Use safe manifest update to prevent race conditions + timing_updates = {} + if render_interval is not None: + timing_updates['render_interval'] = render_interval + if display_duration is not None: + timing_updates['display_duration'] = display_duration + + def update_fn(manifest): + manifest['apps'][app_id].update(timing_updates) + starlark_plugin._update_manifest_safe(update_fn) + except Exception as e: + logger.warning(f"Failed to persist timing to manifest for {app_id}: {e}") + starlark_plugin._render_app(app, force=True) + return jsonify({'status': 'success', 'message': 'Configuration updated', 'config': app.config}) + else: + return jsonify({'status': 'error', 'message': 'Failed to save configuration'}), 500 + + # Standalone: update both config.json and manifest + manifest = _read_starlark_manifest() + app_data = manifest.get('apps', {}).get(app_id) + if not app_data: + return jsonify({'status': 'error', 
'message': f'App not found: {app_id}'}), 404 + + # Extract timing keys (they go in manifest, not config.json) + render_interval = data.pop('render_interval', None) + display_duration = data.pop('display_duration', None) + + # Update manifest with timing values + if render_interval is not None: + app_data['render_interval'] = render_interval + if display_duration is not None: + app_data['display_duration'] = display_duration + + # Load current config from config.json + app_dir = _STARLARK_APPS_DIR / app_id + config_file = app_dir / "config.json" + current_config = {} + if config_file.exists(): + try: + with open(config_file, 'r') as f: + current_config = json.load(f) + except Exception as e: + logger.warning(f"Failed to load config for {app_id}: {e}") + + # Update config with new values (excluding timing keys) + current_config.update(data) + + # Write updated config to config.json + try: + with open(config_file, 'w') as f: + json.dump(current_config, f, indent=2) + except Exception as e: + logger.error(f"Failed to save config.json for {app_id}: {e}") + return jsonify({'status': 'error', 'message': f'Failed to save configuration: {e}'}), 500 + + # Also update manifest for backward compatibility + app_data.setdefault('config', {}).update(data) + + if _write_starlark_manifest(manifest): + return jsonify({'status': 'success', 'message': 'Configuration updated', 'config': current_config}) + else: + return jsonify({'status': 'error', 'message': 'Failed to save manifest'}), 500 + + except Exception as e: + logger.error(f"Error updating config for {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps//toggle', methods=['POST']) +def toggle_starlark_app(app_id): + """Enable or disable a Starlark app.""" + try: + data = request.get_json() or {} + + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + app = starlark_plugin.apps.get(app_id) + if not app: + return jsonify({'status': 'error', 'message': f'App 
not found: {app_id}'}), 404 + enabled = data.get('enabled') + if enabled is None: + enabled = not app.is_enabled() + app.manifest['enabled'] = enabled + # Use safe manifest update to prevent race conditions + def update_fn(manifest): + manifest['apps'][app_id]['enabled'] = enabled + starlark_plugin._update_manifest_safe(update_fn) + return jsonify({'status': 'success', 'message': f"App {'enabled' if enabled else 'disabled'}", 'enabled': enabled}) + + # Standalone: update manifest directly + manifest = _read_starlark_manifest() + app_data = manifest.get('apps', {}).get(app_id) + if not app_data: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + + enabled = data.get('enabled') + if enabled is None: + enabled = not app_data.get('enabled', True) + app_data['enabled'] = enabled + if _write_starlark_manifest(manifest): + return jsonify({'status': 'success', 'message': f"App {'enabled' if enabled else 'disabled'}", 'enabled': enabled}) + else: + return jsonify({'status': 'error', 'message': 'Failed to save'}), 500 + + except Exception as e: + logger.error(f"Error toggling app {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/apps/<app_id>/render', methods=['POST']) +def render_starlark_app(app_id): + """Force render a Starlark app.""" + try: + starlark_plugin = _get_starlark_plugin() + if not starlark_plugin: + return jsonify({'status': 'error', 'message': 'Rendering requires the main LEDMatrix service (plugin not loaded in web service)'}), 503 + + app = starlark_plugin.apps.get(app_id) + if not app: + return jsonify({'status': 'error', 'message': f'App not found: {app_id}'}), 404 + + success = starlark_plugin._render_app(app, force=True) + if success: + return jsonify({'status': 'success', 'message': 'App rendered', 'frame_count': len(app.frames) if app.frames else 0}) + else: + return jsonify({'status': 'error', 'message': 'Failed to render app'}), 500 + + except Exception as e: + 
logger.error(f"Error rendering app {app_id}: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/repository/browse', methods=['GET']) +def browse_tronbyte_repository(): + """Browse all apps in the Tronbyte repository (bulk cached fetch). + + Returns ALL apps with metadata, categories, and authors. + Filtering/sorting/pagination is handled client-side. + Results are cached server-side for 2 hours. + """ + try: + TronbyteRepository = _get_tronbyte_repository_class() + + config = api_v3.config_manager.load_config() if api_v3.config_manager else {} + github_token = config.get('github_token') + repo = TronbyteRepository(github_token=github_token) + + result = repo.list_all_apps_cached() + + rate_limit = repo.get_rate_limit_info() + + return jsonify({ + 'status': 'success', + 'apps': result['apps'], + 'categories': result['categories'], + 'authors': result['authors'], + 'count': result['count'], + 'cached': result['cached'], + 'rate_limit': rate_limit, + }) + + except Exception as e: + logger.error(f"Error browsing repository: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/repository/install', methods=['POST']) +def install_from_tronbyte_repository(): + """Install an app from the Tronbyte repository.""" + try: + data = request.get_json() + if not data or 'app_id' not in data: + return jsonify({'status': 'error', 'message': 'app_id is required'}), 400 + + app_id, app_id_error = _validate_and_sanitize_app_id(data['app_id']) + if app_id_error: + return jsonify({'status': 'error', 'message': f'Invalid app_id: {app_id_error}'}), 400 + + TronbyteRepository = _get_tronbyte_repository_class() + import tempfile + + config = api_v3.config_manager.load_config() if api_v3.config_manager else {} + github_token = config.get('github_token') + repo = TronbyteRepository(github_token=github_token) + + success, metadata, error = repo.get_app_metadata(data['app_id']) + if not success: + return 
jsonify({'status': 'error', 'message': f'Failed to fetch app metadata: {error}'}), 404 + + with tempfile.NamedTemporaryFile(delete=False, suffix='.star') as tmp: + temp_path = tmp.name + + try: + # Pass filename from metadata (e.g., "analog_clock.star" for analogclock app) + # Note: manifest uses 'fileName' (camelCase), not 'filename' + filename = metadata.get('fileName') if metadata else None + success, error = repo.download_star_file(data['app_id'], Path(temp_path), filename=filename) + if not success: + return jsonify({'status': 'error', 'message': f'Failed to download app: {error}'}), 500 + + # Download assets (images, sources, etc.) to a temp directory + temp_assets_dir = tempfile.mkdtemp() + try: + success_assets, error_assets = repo.download_app_assets(data['app_id'], Path(temp_assets_dir)) + # Asset download is non-critical - log warning but continue if it fails + if not success_assets: + logger.warning(f"Failed to download assets for {data['app_id']}: {error_assets}") + + render_interval = data.get('render_interval', 300) + ri, err = _validate_timing_value(render_interval, 'render_interval') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + render_interval = ri or 300 + + display_duration = data.get('display_duration', 15) + dd, err = _validate_timing_value(display_duration, 'display_duration') + if err: + return jsonify({'status': 'error', 'message': err}), 400 + display_duration = dd or 15 + + install_metadata = { + 'name': metadata.get('name', app_id) if metadata else app_id, + 'render_interval': render_interval, + 'display_duration': display_duration + } + + starlark_plugin = _get_starlark_plugin() + if starlark_plugin: + success = starlark_plugin.install_app(app_id, temp_path, install_metadata, assets_dir=temp_assets_dir) + else: + success = _install_star_file(app_id, temp_path, install_metadata, assets_dir=temp_assets_dir) + finally: + # Clean up temp assets directory + import shutil + try: + 
shutil.rmtree(temp_assets_dir) + except OSError: + pass + + if success: + return jsonify({'status': 'success', 'message': f'App installed: {metadata.get("name", app_id) if metadata else app_id}', 'app_id': app_id}) + else: + return jsonify({'status': 'error', 'message': 'Failed to install app'}), 500 + finally: + try: + os.unlink(temp_path) + except OSError: + pass + + except Exception as e: + logger.error(f"Error installing from repository: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/repository/categories', methods=['GET']) +def get_tronbyte_categories(): + """Get list of available app categories (uses bulk cache).""" + try: + TronbyteRepository = _get_tronbyte_repository_class() + config = api_v3.config_manager.load_config() if api_v3.config_manager else {} + repo = TronbyteRepository(github_token=config.get('github_token')) + + result = repo.list_all_apps_cached() + + return jsonify({'status': 'success', 'categories': result['categories']}) + + except Exception as e: + logger.error(f"Error fetching categories: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 + + +@api_v3.route('/starlark/install-pixlet', methods=['POST']) +def install_pixlet(): + """Download and install Pixlet binary.""" + try: + script_path = PROJECT_ROOT / 'scripts' / 'download_pixlet.sh' + if not script_path.exists(): + return jsonify({'status': 'error', 'message': 'Installation script not found'}), 404 + + os.chmod(script_path, 0o755) + + result = subprocess.run( + [str(script_path)], + cwd=str(PROJECT_ROOT), + capture_output=True, + text=True, + timeout=300 + ) + + if result.returncode == 0: + logger.info("Pixlet downloaded successfully") + return jsonify({'status': 'success', 'message': 'Pixlet installed successfully!', 'output': result.stdout}) + else: + return jsonify({'status': 'error', 'message': f'Failed to download Pixlet: {result.stderr}'}), 500 + + except subprocess.TimeoutExpired: + return jsonify({'status': 
'error', 'message': 'Download timed out'}), 500 + except Exception as e: + logger.error(f"Error installing Pixlet: {e}") + return jsonify({'status': 'error', 'message': str(e)}), 500 \ No newline at end of file diff --git a/web_interface/blueprints/pages_v3.py b/web_interface/blueprints/pages_v3.py index 34fce5d2e..c9783ddbf 100644 --- a/web_interface/blueprints/pages_v3.py +++ b/web_interface/blueprints/pages_v3.py @@ -1,7 +1,10 @@ from flask import Blueprint, render_template, request, redirect, url_for, flash, jsonify import json +import logging from pathlib import Path +logger = logging.getLogger(__name__) + # Will be initialized when blueprint is registered config_manager = None plugin_manager = None @@ -322,7 +325,11 @@ def _load_plugin_config_partial(plugin_id): try: if not pages_v3.plugin_manager: return '<div>Plugin manager not available</div>', 500 - + + # Handle starlark app config (starlark:<app_id>) + if plugin_id.startswith('starlark:'): + return _load_starlark_config_partial(plugin_id[len('starlark:'):]) + # Try to get plugin info first plugin_info = pages_v3.plugin_manager.get_plugin_info(plugin_id) @@ -429,3 +436,77 @@ def _load_plugin_config_partial(plugin_id): import traceback traceback.print_exc() return f'<div>Error loading plugin config: {str(e)}</div>', 500 + + + def _load_starlark_config_partial(app_id): + """Load configuration partial for a Starlark app.""" + try: + starlark_plugin = pages_v3.plugin_manager.get_plugin('starlark-apps') if pages_v3.plugin_manager else None + + if starlark_plugin and hasattr(starlark_plugin, 'apps'): + app = starlark_plugin.apps.get(app_id) + if not app: + return f'<div>Starlark app not found: {app_id}</div>', 404 + return render_template( + 'v3/partials/starlark_config.html', + app_id=app_id, + app_name=app.manifest.get('name', app_id), + app_enabled=app.is_enabled(), + render_interval=app.get_render_interval(), + display_duration=app.get_display_duration(), + config=app.config, + schema=app.schema, + has_frames=app.frames is not None, + frame_count=len(app.frames) if app.frames else 0, + last_render_time=app.last_render_time, + ) + + # Standalone: read from manifest file + manifest_file = Path(__file__).resolve().parent.parent.parent / 'starlark-apps' / 'manifest.json' + if not manifest_file.exists(): + return f'<div>Starlark app not found: {app_id}</div>', 404 + + with open(manifest_file, 'r') as f: + manifest = json.load(f) + + app_data = manifest.get('apps', {}).get(app_id) + if not app_data: + return f'<div>Starlark app not found: {app_id}</div>', 404 + + # Load schema from schema.json if it exists + schema = None + schema_file = Path(__file__).resolve().parent.parent.parent / 'starlark-apps' / app_id / 'schema.json' + if schema_file.exists(): + try: + with open(schema_file, 'r') as f: + schema = json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"[Pages V3] Could not load schema for {app_id}: {e}", exc_info=True) + + # Load config from config.json if it exists + config = {} + config_file = Path(__file__).resolve().parent.parent.parent / 'starlark-apps' / app_id / 'config.json' + if config_file.exists(): + try: + with open(config_file, 'r') as f: + config = json.load(f) + except (OSError, json.JSONDecodeError) as e: + logger.warning(f"[Pages V3] Could not load config for {app_id}: {e}", exc_info=True) + + return render_template( + 'v3/partials/starlark_config.html', + app_id=app_id, + app_name=app_data.get('name', app_id), + app_enabled=app_data.get('enabled', True), + render_interval=app_data.get('render_interval', 300), + display_duration=app_data.get('display_duration', 15), + config=config, + schema=schema, + has_frames=False, + frame_count=0, + last_render_time=None, + ) + + except Exception as e: + logger.exception(f"[Pages V3] Error loading starlark config for {app_id}") + return f'
Error loading starlark config: {str(e)}
', 500 diff --git a/web_interface/static/v3/plugins_manager.js b/web_interface/static/v3/plugins_manager.js index cd2900931..3a8216efb 100644 --- a/web_interface/static/v3/plugins_manager.js +++ b/web_interface/static/v3/plugins_manager.js @@ -1,6 +1,43 @@ +// ─── LocalStorage Safety Wrappers ──────────────────────────────────────────── +// Handles environments where localStorage is unavailable or restricted (private browsing, etc.) +const safeLocalStorage = { + getItem(key) { + try { + if (typeof localStorage !== 'undefined') { + return localStorage.getItem(key); + } + } catch (e) { + console.warn(`safeLocalStorage.getItem failed for key "${key}":`, e.message); + } + return null; + }, + setItem(key, value) { + try { + if (typeof localStorage !== 'undefined') { + localStorage.setItem(key, value); + return true; + } + } catch (e) { + console.warn(`safeLocalStorage.setItem failed for key "${key}":`, e.message); + } + return false; + }, + removeItem(key) { + try { + if (typeof localStorage !== 'undefined') { + localStorage.removeItem(key); + return true; + } + } catch (e) { + console.warn(`localStorage.removeItem failed for key "${key}":`, e.message); + } + return false; + } +}; + // Define critical functions immediately so they're available before any HTML is rendered -// Debug logging controlled by localStorage.setItem('pluginDebug', 'true') -const _PLUGIN_DEBUG_EARLY = typeof localStorage !== 'undefined' && localStorage.getItem('pluginDebug') === 'true'; +// Debug logging controlled by safeLocalStorage.setItem('pluginDebug', 'true') +const _PLUGIN_DEBUG_EARLY = safeLocalStorage.getItem('pluginDebug') === 'true'; if (_PLUGIN_DEBUG_EARLY) console.log('[PLUGINS SCRIPT] Defining configurePlugin and togglePlugin at top level...'); // Expose on-demand functions early as stubs (will be replaced when IIFE runs) @@ -859,46 +896,39 @@ window.currentPluginConfig = null; let pluginStoreCache = null; // Cache for plugin store to speed up subsequent loads let cacheTimestamp = 
null; const CACHE_DURATION = 5 * 60 * 1000; // 5 minutes in milliseconds - let onDemandStatusInterval = null; - let currentOnDemandPluginId = null; - let hasLoadedOnDemandStatus = false; + let storeFilteredList = []; - // Store filter/sort state + // ── Plugin Store Filter State ─────────────────────────────────────────── const storeFilterState = { - sort: localStorage.getItem('storeSort') || 'a-z', - filterVerified: false, - filterNew: false, - filterInstalled: null, // null = all, true = installed only, false = not installed only - filterAuthors: [], - filterCategories: [], - + sort: safeLocalStorage.getItem('storeSort') || 'a-z', + filterCategory: '', + filterInstalled: null, // null=all, true=installed, false=not-installed + searchQuery: '', + page: 1, + perPage: parseInt(safeLocalStorage.getItem('storePerPage')) || 12, persist() { - localStorage.setItem('storeSort', this.sort); + safeLocalStorage.setItem('storeSort', this.sort); + safeLocalStorage.setItem('storePerPage', this.perPage); }, - reset() { this.sort = 'a-z'; - this.filterVerified = false; - this.filterNew = false; + this.filterCategory = ''; this.filterInstalled = null; - this.filterAuthors = []; - this.filterCategories = []; - this.persist(); + this.searchQuery = ''; + this.page = 1; }, - activeCount() { - let count = 0; - if (this.filterVerified) count++; - if (this.filterNew) count++; - if (this.filterInstalled !== null) count++; - count += this.filterAuthors.length; - count += this.filterCategories.length; - return count; + let n = 0; + if (this.searchQuery) n++; + if (this.filterInstalled !== null) n++; + if (this.filterCategory) n++; + if (this.sort !== 'a-z') n++; + return n; } }; - - // Installed plugins sort state - let installedSort = localStorage.getItem('installedSort') || 'a-z'; + let onDemandStatusInterval = null; + let currentOnDemandPluginId = null; + let hasLoadedOnDemandStatus = false; // Shared on-demand status store (mirrors Alpine store when available) window.__onDemandStore = 
window.__onDemandStore || { @@ -1013,13 +1043,15 @@ window.initPluginsPage = function() { // If we fetched data before the DOM existed, render it now if (window.__pendingInstalledPlugins) { console.log('[RENDER] Applying pending installed plugins data'); - sortAndRenderInstalledPlugins(window.__pendingInstalledPlugins); + renderInstalledPlugins(window.__pendingInstalledPlugins); window.__pendingInstalledPlugins = null; } if (window.__pendingStorePlugins) { console.log('[RENDER] Applying pending plugin store data'); - renderPluginStore(window.__pendingStorePlugins); + pluginStoreCache = window.__pendingStorePlugins; + cacheTimestamp = Date.now(); window.__pendingStorePlugins = null; + applyStoreFiltersAndSort(); } initializePlugins(); @@ -1028,7 +1060,6 @@ window.initPluginsPage = function() { const refreshBtn = document.getElementById('refresh-plugins-btn'); const updateAllBtn = document.getElementById('update-all-plugins-btn'); const restartBtn = document.getElementById('restart-display-btn'); - const searchBtn = document.getElementById('search-plugins-btn'); const closeBtn = document.getElementById('close-plugin-config'); const closeOnDemandModalBtn = document.getElementById('close-on-demand-modal'); const cancelOnDemandBtn = document.getElementById('cancel-on-demand'); @@ -1056,10 +1087,13 @@ window.initPluginsPage = function() { document.getElementById('restart-display-btn').addEventListener('click', restartDisplay); console.log('[initPluginsPage] Attached restartDisplay listener'); } - if (searchBtn) { - searchBtn.replaceWith(searchBtn.cloneNode(true)); - document.getElementById('search-plugins-btn').addEventListener('click', searchPluginStore); - } + // Restore persisted store sort/perPage + const storeSortEl = document.getElementById('store-sort'); + if (storeSortEl) storeSortEl.value = storeFilterState.sort; + const storePpEl = document.getElementById('store-per-page'); + if (storePpEl) storePpEl.value = storeFilterState.perPage; + 
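The persisted sort/perPage restore above relies on the `safeLocalStorage` wrapper defined at the top of this file. A minimal standalone sketch of the same guarded-storage pattern (the `safeStore` name and call sites here are illustrative, not the real module):

```javascript
// Sketch of a guarded localStorage wrapper: every access is wrapped in
// try/catch so environments that block storage (private browsing, sandboxed
// iframes, Node) degrade to harmless no-ops instead of throwing.
const safeStore = {
  getItem(key) {
    try {
      if (typeof localStorage !== 'undefined') return localStorage.getItem(key);
    } catch (e) {
      // Storage blocked: fall through to the null fallback below.
    }
    return null;
  },
  setItem(key, value) {
    try {
      if (typeof localStorage !== 'undefined') {
        localStorage.setItem(key, value);
        return true;
      }
    } catch (e) {
      // Storage blocked or quota exceeded: report failure instead of throwing.
    }
    return false;
  }
};

// Callers rely on the return contract (string|null, boolean) rather than
// wrapping every call site in its own try/catch:
const sort = safeStore.getItem('storeSort') || 'a-z';
const persisted = safeStore.setItem('storeSort', sort);
```

The design choice is to centralize the try/catch once so UI code can treat storage as best-effort: a `null`/`false` result simply falls back to defaults.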
setupStoreFilterListeners(); + if (closeBtn) { closeBtn.replaceWith(closeBtn.cloneNode(true)); document.getElementById('close-plugin-config').addEventListener('click', closePluginConfigModal); @@ -1142,15 +1176,13 @@ function initializePluginPageWhenReady() { document.body.addEventListener('htmx:afterSwap', function(event) { const target = event.detail.target; // Check if plugins content was swapped in - if (target.id === 'plugins-content' || - target.querySelector('[data-plugins-loaded]')) { + if (target.id === 'plugins-content' || + target.querySelector('#installed-plugins-grid') || + document.getElementById('installed-plugins-grid')) { console.log('HTMX swap detected for plugins, initializing...'); - // Reset initialization flags to allow re-initialization after HTMX swap + // Reset initialization flag to allow re-initialization after HTMX swap window.pluginManager.initialized = false; window.pluginManager.initializing = false; - pluginsInitialized = false; - pluginLoadCache.data = null; - pluginLoadCache.promise = null; initTimer = setTimeout(attemptInit, 100); } }, { once: false }); // Allow multiple swaps @@ -1211,10 +1243,7 @@ function initializePlugins() { categorySelect._listenerSetup = true; categorySelect.addEventListener('change', searchPluginStore); } - - // Setup store sort/filter controls - setupStoreFilterListeners(); - + // Setup GitHub installation handlers console.log('[initializePlugins] About to call setupGitHubInstallHandlers...'); if (typeof setupGitHubInstallHandlers === 'function') { @@ -1251,8 +1280,8 @@ const pluginLoadCache = { } }; -// Debug flag - set via localStorage.setItem('pluginDebug', 'true') -const PLUGIN_DEBUG = typeof localStorage !== 'undefined' && localStorage.getItem('pluginDebug') === 'true'; +// Debug flag - set via safeLocalStorage.setItem('pluginDebug', 'true') +const PLUGIN_DEBUG = typeof localStorage !== 'undefined' && safeLocalStorage.getItem('pluginDebug') === 'true'; function pluginLog(...args) { if (PLUGIN_DEBUG) 
console.log(...args); } @@ -1269,7 +1298,7 @@ function loadInstalledPlugins(forceRefresh = false) { })); pluginLog('[CACHE] Dispatched pluginsUpdated event from cache'); // Still render to ensure UI is updated - sortAndRenderInstalledPlugins(pluginLoadCache.data); + renderInstalledPlugins(pluginLoadCache.data); return Promise.resolve(pluginLoadCache.data); } @@ -1328,7 +1357,7 @@ function loadInstalledPlugins(forceRefresh = false) { }); } - sortAndRenderInstalledPlugins(installedPlugins); + renderInstalledPlugins(installedPlugins); // Update count const countEl = document.getElementById('installed-count'); @@ -1369,24 +1398,6 @@ function refreshInstalledPlugins() { window.pluginManager.loadInstalledPlugins = loadInstalledPlugins; // Note: searchPluginStore will be exposed after its definition (see below) -function sortAndRenderInstalledPlugins(plugins) { - const sorted = [...plugins].sort((a, b) => { - const nameA = (a.name || a.id || '').toLowerCase(); - const nameB = (b.name || b.id || '').toLowerCase(); - switch (installedSort) { - case 'z-a': - return nameB.localeCompare(nameA); - case 'enabled': - if (a.enabled !== b.enabled) return a.enabled ? -1 : 1; - return nameA.localeCompare(nameB); - case 'a-z': - default: - return nameA.localeCompare(nameB); - } - }); - renderInstalledPlugins(sorted); -} - function renderInstalledPlugins(plugins) { const container = document.getElementById('installed-plugins-grid'); if (!container) { @@ -1459,6 +1470,7 @@ function renderInstalledPlugins(plugins) {

${escapeHtml(plugin.name || plugin.id)}

+ ${plugin.is_starlark_app ? 'Starlark' : ''} ${plugin.verified ? 'Verified' : ''}
@@ -1670,18 +1682,37 @@ function handlePluginAction(event) { }); break; case 'uninstall': - waitForFunction('uninstallPlugin', 10, 50) - .then(uninstallFunc => { - uninstallFunc(pluginId); - }) - .catch(error => { - console.error('[EVENT DELEGATION]', error.message); - if (typeof showNotification === 'function') { - showNotification('Uninstall function not loaded. Please refresh the page.', 'error'); - } else { - alert('Uninstall function not loaded. Please refresh the page.'); - } - }); + if (pluginId.startsWith('starlark:')) { + // Starlark app uninstall uses dedicated endpoint + const starlarkAppId = pluginId.slice('starlark:'.length); + if (!confirm(`Uninstall Starlark app "${starlarkAppId}"?`)) break; + fetch(`/api/v3/starlark/apps/${encodeURIComponent(starlarkAppId)}`, {method: 'DELETE'}) + .then(r => r.json()) + .then(data => { + if (data.status === 'success') { + if (typeof showNotification === 'function') showNotification('Starlark app uninstalled', 'success'); + else alert('Starlark app uninstalled'); + if (typeof loadInstalledPlugins === 'function') loadInstalledPlugins(); + else if (typeof window.loadInstalledPlugins === 'function') window.loadInstalledPlugins(); + } else { + alert('Uninstall failed: ' + (data.message || 'Unknown error')); + } + }) + .catch(err => alert('Uninstall failed: ' + err.message)); + } else { + waitForFunction('uninstallPlugin', 10, 50) + .then(uninstallFunc => { + uninstallFunc(pluginId); + }) + .catch(error => { + console.error('[EVENT DELEGATION]', error.message); + if (typeof showNotification === 'function') { + showNotification('Uninstall function not loaded. Please refresh the page.', 'error'); + } else { + alert('Uninstall function not loaded. 
Please refresh the page.'); + } + }); + } break; } } @@ -1742,8 +1773,6 @@ function startOnDemandStatusPolling() { window.loadOnDemandStatus = loadOnDemandStatus; -let updateAllRunning = false; - async function runUpdateAllPlugins() { console.log('[runUpdateAllPlugins] Button clicked, checking for updates...'); const button = document.getElementById('update-all-plugins-btn'); @@ -1753,7 +1782,7 @@ async function runUpdateAllPlugins() { return; } - if (updateAllRunning) { + if (button.dataset.running === 'true') { return; } @@ -1764,7 +1793,7 @@ async function runUpdateAllPlugins() { } const originalContent = button.innerHTML; - updateAllRunning = true; + button.dataset.running = 'true'; button.disabled = true; button.classList.add('opacity-60', 'cursor-wait'); @@ -1774,11 +1803,7 @@ async function runUpdateAllPlugins() { for (let i = 0; i < plugins.length; i++) { const plugin = plugins[i]; const pluginId = plugin.id; - // Re-fetch button in case DOM was replaced by HTMX swap - const btn = document.getElementById('update-all-plugins-btn'); - if (btn) { - btn.innerHTML = `Updating ${i + 1}/${plugins.length}...`; - } + button.innerHTML = `Updating ${i + 1}/${plugins.length}...`; try { const response = await fetch('/api/v3/plugins/update', { @@ -1818,13 +1843,10 @@ async function runUpdateAllPlugins() { console.error('Bulk plugin update failed:', error); showNotification('Failed to update all plugins: ' + error.message, 'error'); } finally { - updateAllRunning = false; - const btn = document.getElementById('update-all-plugins-btn'); - if (btn) { - btn.innerHTML = originalContent; - btn.disabled = false; - btn.classList.remove('opacity-60', 'cursor-wait'); - } + button.innerHTML = originalContent; + button.disabled = false; + button.classList.remove('opacity-60', 'cursor-wait'); + button.dataset.running = 'false'; } } @@ -5106,7 +5128,7 @@ function handleUninstallSuccess(pluginId) { if (typeof installedPlugins !== 'undefined') { installedPlugins = updatedPlugins; } - 
sortAndRenderInstalledPlugins(updatedPlugins); + renderInstalledPlugins(updatedPlugins); showNotification(`Plugin uninstalled successfully`, 'success'); // Also refresh from server to ensure consistency @@ -5145,405 +5167,88 @@ function restartDisplay() { }); } -// --- Store Filter/Sort Functions --- - -function setupStoreFilterListeners() { - // Sort dropdown - const sortSelect = document.getElementById('store-sort'); - if (sortSelect && !sortSelect._listenerSetup) { - sortSelect._listenerSetup = true; - sortSelect.value = storeFilterState.sort; - sortSelect.addEventListener('change', () => { - storeFilterState.sort = sortSelect.value; - storeFilterState.persist(); - applyStoreFiltersAndSort(); - }); - } - - // Verified filter toggle - const verifiedBtn = document.getElementById('filter-verified'); - if (verifiedBtn && !verifiedBtn._listenerSetup) { - verifiedBtn._listenerSetup = true; - verifiedBtn.addEventListener('click', () => { - storeFilterState.filterVerified = !storeFilterState.filterVerified; - verifiedBtn.dataset.active = storeFilterState.filterVerified; - applyStoreFiltersAndSort(); - }); - } - - // New filter toggle - const newBtn = document.getElementById('filter-new'); - if (newBtn && !newBtn._listenerSetup) { - newBtn._listenerSetup = true; - newBtn.addEventListener('click', () => { - storeFilterState.filterNew = !storeFilterState.filterNew; - newBtn.dataset.active = storeFilterState.filterNew; - applyStoreFiltersAndSort(); - }); - } - - // Installed filter (cycles: All -> Installed -> Not Installed -> All) - const installedBtn = document.getElementById('filter-installed'); - if (installedBtn && !installedBtn._listenerSetup) { - installedBtn._listenerSetup = true; - installedBtn.addEventListener('click', () => { - const states = [null, true, false]; - const labels = ['All', 'Installed', 'Not Installed']; - const icons = ['fa-download', 'fa-check', 'fa-times']; - const current = states.indexOf(storeFilterState.filterInstalled); - const next = 
(current + 1) % states.length; - storeFilterState.filterInstalled = states[next]; - installedBtn.querySelector('span').textContent = labels[next]; - installedBtn.querySelector('i').className = `fas ${icons[next]} mr-1`; - installedBtn.dataset.active = String(states[next] !== null); - applyStoreFiltersAndSort(); - }); - } - - // Author dropdown - const authorSelect = document.getElementById('filter-author'); - if (authorSelect && !authorSelect._listenerSetup) { - authorSelect._listenerSetup = true; - authorSelect.addEventListener('change', () => { - storeFilterState.filterAuthors = authorSelect.value - ? [authorSelect.value] : []; - applyStoreFiltersAndSort(); - }); - } - - // Tag pills (event delegation on container) - // Category pills (event delegation on container) - const catsPills = document.getElementById('filter-categories-pills'); - if (catsPills && !catsPills._listenerSetup) { - catsPills._listenerSetup = true; - catsPills.addEventListener('click', (e) => { - const pill = e.target.closest('.category-filter-pill'); - if (!pill) return; - const cat = pill.dataset.category; - const idx = storeFilterState.filterCategories.indexOf(cat); - if (idx >= 0) { - storeFilterState.filterCategories.splice(idx, 1); - pill.dataset.active = 'false'; - } else { - storeFilterState.filterCategories.push(cat); - pill.dataset.active = 'true'; - } - applyStoreFiltersAndSort(); - }); - } - - // Clear filters button - const clearBtn = document.getElementById('clear-filters-btn'); - if (clearBtn && !clearBtn._listenerSetup) { - clearBtn._listenerSetup = true; - clearBtn.addEventListener('click', () => { - storeFilterState.reset(); - // Reset all UI elements - const sort = document.getElementById('store-sort'); - if (sort) sort.value = 'a-z'; - const vBtn = document.getElementById('filter-verified'); - if (vBtn) vBtn.dataset.active = 'false'; - const nBtn = document.getElementById('filter-new'); - if (nBtn) nBtn.dataset.active = 'false'; - const iBtn = 
document.getElementById('filter-installed'); - if (iBtn) { - iBtn.dataset.active = 'false'; - const span = iBtn.querySelector('span'); - if (span) span.textContent = 'All'; - const icon = iBtn.querySelector('i'); - if (icon) icon.className = 'fas fa-download mr-1'; - } - const auth = document.getElementById('filter-author'); - if (auth) auth.value = ''; - document.querySelectorAll('.category-filter-pill').forEach(p => { - p.dataset.active = 'false'; - }); - applyStoreFiltersAndSort(); - }); - } - - // Installed plugins sort dropdown - const installedSortSelect = document.getElementById('installed-sort'); - if (installedSortSelect && !installedSortSelect._listenerSetup) { - installedSortSelect._listenerSetup = true; - installedSortSelect.value = installedSort; - installedSortSelect.addEventListener('change', () => { - installedSort = installedSortSelect.value; - localStorage.setItem('installedSort', installedSort); - const plugins = window.installedPlugins || []; - if (plugins.length > 0) { - sortAndRenderInstalledPlugins(plugins); - } - }); - } -} - -function applyStoreFiltersAndSort(basePlugins) { - const source = basePlugins || pluginStoreCache; - if (!source) return; - - let plugins = [...source]; - const installedIds = new Set( - (window.installedPlugins || []).map(p => p.id) - ); - - // Apply filters - if (storeFilterState.filterVerified) { - plugins = plugins.filter(p => p.verified); - } - if (storeFilterState.filterNew) { - plugins = plugins.filter(p => isNewPlugin(p.last_updated)); - } - if (storeFilterState.filterInstalled === true) { - plugins = plugins.filter(p => installedIds.has(p.id)); - } else if (storeFilterState.filterInstalled === false) { - plugins = plugins.filter(p => !installedIds.has(p.id)); - } - if (storeFilterState.filterAuthors.length > 0) { - const authorSet = new Set(storeFilterState.filterAuthors); - plugins = plugins.filter(p => authorSet.has(p.author)); - } - if (storeFilterState.filterCategories.length > 0) { - const catSet = new 
Set(storeFilterState.filterCategories); - plugins = plugins.filter(p => catSet.has(p.category)); - } - - // Apply sort - switch (storeFilterState.sort) { - case 'a-z': - plugins.sort((a, b) => - (a.name || a.id).localeCompare(b.name || b.id)); - break; - case 'z-a': - plugins.sort((a, b) => - (b.name || b.id).localeCompare(a.name || a.id)); - break; - case 'verified': - plugins.sort((a, b) => - (b.verified ? 1 : 0) - (a.verified ? 1 : 0) || - (a.name || a.id).localeCompare(b.name || b.id)); - break; - case 'newest': - plugins.sort((a, b) => { - // Prefer static per-plugin last_updated over GitHub pushed_at (which is repo-wide) - const dateA = a.last_updated || a.last_updated_iso || ''; - const dateB = b.last_updated || b.last_updated_iso || ''; - return dateB.localeCompare(dateA); - }); - break; - case 'category': - plugins.sort((a, b) => - (a.category || '').localeCompare(b.category || '') || - (a.name || a.id).localeCompare(b.name || b.id)); - break; - } - - renderPluginStore(plugins); - - // Update result count - const countEl = document.getElementById('store-count'); - if (countEl) { - const total = source.length; - const shown = plugins.length; - countEl.innerHTML = shown < total - ? 
`${shown} of ${total} shown` - : `${total} available`; - } - - updateFilterCountBadge(); -} - -function populateFilterControls() { - if (!pluginStoreCache) return; - - // Collect unique authors - const authors = [...new Set( - pluginStoreCache.map(p => p.author).filter(Boolean) - )].sort(); - - const authorSelect = document.getElementById('filter-author'); - if (authorSelect) { - const currentVal = authorSelect.value; - authorSelect.innerHTML = '' + - authors.map(a => ``).join(''); - authorSelect.value = currentVal; - } - - // Collect unique categories sorted alphabetically - const categories = [...new Set( - pluginStoreCache.map(p => p.category).filter(Boolean) - )].sort(); - - const catsContainer = document.getElementById('filter-categories-container'); - const catsPills = document.getElementById('filter-categories-pills'); - if (catsContainer && catsPills && categories.length > 0) { - catsContainer.classList.remove('hidden'); - catsPills.innerHTML = categories.map(cat => - `` - ).join(''); - } -} - -function updateFilterCountBadge() { - const count = storeFilterState.activeCount(); - const clearBtn = document.getElementById('clear-filters-btn'); - const badge = document.getElementById('filter-count-badge'); - if (clearBtn) { - clearBtn.classList.toggle('hidden', count === 0); - } - if (badge) { - badge.textContent = count; - } -} - function searchPluginStore(fetchCommitInfo = true) { pluginLog('[STORE] Searching plugin store...', { fetchCommitInfo }); - - // Safely get search values (elements may not exist yet) - const searchInput = document.getElementById('plugin-search'); - const categorySelect = document.getElementById('plugin-category'); - const query = searchInput ? searchInput.value : ''; - const category = categorySelect ? 
categorySelect.value : ''; - // For filtered searches (user typing), we can use cache to avoid excessive API calls - // For initial load or refresh, always fetch fresh metadata - const isFilteredSearch = query || category; const now = Date.now(); const isCacheValid = pluginStoreCache && cacheTimestamp && (now - cacheTimestamp < CACHE_DURATION); - - // Only use cache for filtered searches that don't explicitly request fresh metadata - if (isFilteredSearch && isCacheValid && !fetchCommitInfo) { - console.log('Using cached plugin store data for filtered search'); - // Ensure plugin store grid exists before rendering + + // If cache is valid and we don't need fresh commit info, just re-filter + if (isCacheValid && !fetchCommitInfo) { + console.log('Using cached plugin store data'); const storeGrid = document.getElementById('plugin-store-grid'); - if (!storeGrid) { - console.error('plugin-store-grid element not found, cannot render cached plugins'); - // Don't return, let it fetch fresh data - } else { + if (storeGrid) { applyStoreFiltersAndSort(); return; } } - // Show loading state - safely check element exists + // Show loading state try { const countEl = document.getElementById('store-count'); - if (countEl) { - countEl.innerHTML = 'Loading...'; - } - } catch (e) { - console.warn('Could not update store count:', e); - } + if (countEl) countEl.innerHTML = 'Loading...'; + } catch (e) { /* ignore */ } showStoreLoading(true); let url = '/api/v3/plugins/store/list'; - const params = new URLSearchParams(); - if (query) params.append('query', query); - if (category) params.append('category', category); - // Always fetch fresh commit metadata unless explicitly disabled (for performance on repeated filtered searches) if (!fetchCommitInfo) { - params.append('fetch_commit_info', 'false'); - } - // Note: fetch_commit_info defaults to true on the server side to keep metadata fresh - - if (params.toString()) { - url += '?' 
+ params.toString(); + url += '?fetch_commit_info=false'; } console.log('Store URL:', url); fetch(url) - .then(response => { - console.log('Store response:', response.status); - return response.json(); - }) + .then(response => response.json()) .then(data => { - console.log('Store data:', data); showStoreLoading(false); - + if (data.status === 'success') { const plugins = data.data.plugins || []; console.log('Store plugins count:', plugins.length); - - // Cache the results if no filters - if (!query && !category) { - pluginStoreCache = plugins; - cacheTimestamp = Date.now(); - console.log('Cached plugin store data'); - populateFilterControls(); - } - // Ensure plugin store grid exists before rendering + pluginStoreCache = plugins; + cacheTimestamp = Date.now(); + const storeGrid = document.getElementById('plugin-store-grid'); if (!storeGrid) { - // Defer rendering until plugin tab loads pluginLog('[STORE] plugin-store-grid not ready, deferring render'); window.__pendingStorePlugins = plugins; return; } - // Route through filter/sort pipeline — pass fresh plugins - // so server-filtered results (query/category) aren't ignored - applyStoreFiltersAndSort(plugins); - - // Ensure GitHub token collapse handler is attached after store is rendered - // The button might not exist until the store content is loaded - console.log('[STORE] Checking for attachGithubTokenCollapseHandler...', { - exists: typeof window.attachGithubTokenCollapseHandler, - checkGitHubAuthStatus: typeof window.checkGitHubAuthStatus - }); + // Update total count + try { + const countEl = document.getElementById('store-count'); + if (countEl) countEl.innerHTML = `${plugins.length} available`; + } catch (e) { /* ignore */ } + + applyStoreFiltersAndSort(); + + // Re-attach GitHub token collapse handler after store render if (window.attachGithubTokenCollapseHandler) { - // Use requestAnimationFrame for faster execution (runs on next frame, ~16ms) requestAnimationFrame(() => { - console.log('[STORE] 
Re-attaching GitHub token collapse handler after store render'); - try { - window.attachGithubTokenCollapseHandler(); - } catch (error) { - console.error('[STORE] Error attaching collapse handler:', error); - } - // Also check auth status to update UI (already checked earlier, but refresh to be sure) + try { window.attachGithubTokenCollapseHandler(); } catch (e) { /* ignore */ } if (window.checkGitHubAuthStatus) { - console.log('[STORE] Refreshing GitHub auth status after store render...'); - try { - window.checkGitHubAuthStatus(); - } catch (error) { - console.error('[STORE] Error calling checkGitHubAuthStatus:', error); - } - } else { - console.warn('[STORE] checkGitHubAuthStatus not available'); + try { window.checkGitHubAuthStatus(); } catch (e) { /* ignore */ } } }); - } else { - console.warn('[STORE] attachGithubTokenCollapseHandler not available'); } } else { showError('Failed to search plugin store: ' + data.message); try { const countEl = document.getElementById('store-count'); - if (countEl) { - countEl.innerHTML = 'Error loading'; - } - } catch (e) { - console.warn('Could not update store count:', e); - } + if (countEl) countEl.innerHTML = 'Error loading'; + } catch (e) { /* ignore */ } } }) .catch(error => { console.error('Error searching plugin store:', error); showStoreLoading(false); - let errorMsg = 'Error searching plugin store: ' + error.message; - if (error.message && error.message.includes('Failed to Fetch')) { - errorMsg += ' - Please try refreshing your browser.'; - } - showError(errorMsg); + showError('Error searching plugin store: ' + error.message); try { const countEl = document.getElementById('store-count'); - if (countEl) { - countEl.innerHTML = 'Error loading'; - } - } catch (e) { - console.warn('Could not update store count:', e); - } + if (countEl) countEl.innerHTML = 'Error loading'; + } catch (e) { /* ignore */ } }); } @@ -5554,6 +5259,257 @@ function showStoreLoading(show) { } } +// ── Plugin Store: Client-Side 
Filter/Sort/Pagination ────────────────────────
+
+function isStorePluginInstalled(pluginId) {
+    return (window.installedPlugins || installedPlugins || []).some(p => p.id === pluginId);
+}
+
+function applyStoreFiltersAndSort(skipPageReset) {
+    if (!pluginStoreCache) return;
+    const st = storeFilterState;
+
+    let list = pluginStoreCache.slice();
+
+    // Text search
+    if (st.searchQuery) {
+        const q = st.searchQuery.toLowerCase();
+        list = list.filter(plugin => {
+            const hay = [
+                plugin.name, plugin.description, plugin.author,
+                plugin.id, plugin.category,
+                ...(plugin.tags || [])
+            ].filter(Boolean).join(' ').toLowerCase();
+            return hay.includes(q);
+        });
+    }
+
+    // Category filter
+    if (st.filterCategory) {
+        const cat = st.filterCategory.toLowerCase();
+        list = list.filter(plugin => (plugin.category || '').toLowerCase() === cat);
+    }
+
+    // Installed filter
+    if (st.filterInstalled === true) {
+        list = list.filter(plugin => isStorePluginInstalled(plugin.id));
+    } else if (st.filterInstalled === false) {
+        list = list.filter(plugin => !isStorePluginInstalled(plugin.id));
+    }
+
+    // Sort
+    list.sort((a, b) => {
+        const nameA = (a.name || a.id || '').toLowerCase();
+        const nameB = (b.name || b.id || '').toLowerCase();
+        switch (st.sort) {
+            case 'z-a': return nameB.localeCompare(nameA);
+            case 'category': {
+                const catCmp = (a.category || '').localeCompare(b.category || '');
+                return catCmp !== 0 ? catCmp : nameA.localeCompare(nameB);
+            }
+            case 'author': {
+                const authCmp = (a.author || '').localeCompare(b.author || '');
+                return authCmp !== 0 ? authCmp : nameA.localeCompare(nameB);
+            }
+            case 'newest': {
+                const dateA = a.last_updated ? new Date(a.last_updated).getTime() : 0;
+                const dateB = b.last_updated ?
new Date(b.last_updated).getTime() : 0; + return dateB - dateA; // newest first + } + default: return nameA.localeCompare(nameB); + } + }); + + storeFilteredList = list; + if (!skipPageReset) st.page = 1; + + renderStorePage(); + updateStoreFilterUI(); +} + +function renderStorePage() { + const st = storeFilterState; + const total = storeFilteredList.length; + const totalPages = Math.max(1, Math.ceil(total / st.perPage)); + if (st.page > totalPages) st.page = totalPages; + + const start = (st.page - 1) * st.perPage; + const end = Math.min(start + st.perPage, total); + const pagePlugins = storeFilteredList.slice(start, end); + + // Results info + const info = total > 0 + ? `Showing ${start + 1}\u2013${end} of ${total} plugins` + : 'No plugins match your filters'; + const infoEl = document.getElementById('store-results-info'); + const infoElBot = document.getElementById('store-results-info-bottom'); + if (infoEl) infoEl.textContent = info; + if (infoElBot) infoElBot.textContent = info; + + // Pagination + renderStorePagination('store-pagination-top', totalPages, st.page); + renderStorePagination('store-pagination-bottom', totalPages, st.page); + + // Grid + renderPluginStore(pagePlugins); +} + +function renderStorePagination(containerId, totalPages, currentPage) { + const container = document.getElementById(containerId); + if (!container) return; + + if (totalPages <= 1) { container.innerHTML = ''; return; } + + const btnClass = 'px-3 py-1 text-sm rounded-md border transition-colors'; + const activeClass = 'bg-blue-600 text-white border-blue-600'; + const normalClass = 'bg-white text-gray-700 border-gray-300 hover:bg-gray-100 cursor-pointer'; + const disabledClass = 'bg-gray-100 text-gray-400 border-gray-200 cursor-not-allowed'; + + let html = ''; + html += ``; + + const pages = []; + pages.push(1); + if (currentPage > 3) pages.push('...'); + for (let i = Math.max(2, currentPage - 1); i <= Math.min(totalPages - 1, currentPage + 1); i++) { + pages.push(i); + } + if 
(currentPage < totalPages - 2) pages.push('...'); + if (totalPages > 1) pages.push(totalPages); + + pages.forEach(p => { + if (p === '...') { + html += ``; + } else { + html += ``; + } + }); + + html += ``; + + container.innerHTML = html; + + container.querySelectorAll('[data-store-page]').forEach(btn => { + btn.addEventListener('click', function() { + const p = parseInt(this.getAttribute('data-store-page')); + if (p >= 1 && p <= totalPages && p !== currentPage) { + storeFilterState.page = p; + renderStorePage(); + const grid = document.getElementById('plugin-store-grid'); + if (grid) grid.scrollIntoView({ behavior: 'smooth', block: 'start' }); + } + }); + }); +} + +function updateStoreFilterUI() { + const st = storeFilterState; + const count = st.activeCount(); + + const badge = document.getElementById('store-active-filters'); + const clearBtn = document.getElementById('store-clear-filters'); + if (badge) { + badge.classList.toggle('hidden', count === 0); + badge.textContent = count + ' filter' + (count !== 1 ? 
's' : '') + ' active'; + } + if (clearBtn) clearBtn.classList.toggle('hidden', count === 0); + + const instBtn = document.getElementById('store-filter-installed'); + if (instBtn) { + if (st.filterInstalled === true) { + instBtn.innerHTML = 'Installed'; + instBtn.classList.add('border-green-400', 'bg-green-50'); + instBtn.classList.remove('border-gray-300', 'bg-white', 'border-red-400', 'bg-red-50'); + } else if (st.filterInstalled === false) { + instBtn.innerHTML = 'Not Installed'; + instBtn.classList.add('border-red-400', 'bg-red-50'); + instBtn.classList.remove('border-gray-300', 'bg-white', 'border-green-400', 'bg-green-50'); + } else { + instBtn.innerHTML = 'All'; + instBtn.classList.add('border-gray-300', 'bg-white'); + instBtn.classList.remove('border-green-400', 'bg-green-50', 'border-red-400', 'bg-red-50'); + } + } +} + +function setupStoreFilterListeners() { + // Search with debounce + const searchEl = document.getElementById('plugin-search'); + if (searchEl && !searchEl._storeFilterInit) { + searchEl._storeFilterInit = true; + let debounce = null; + searchEl.addEventListener('input', function() { + clearTimeout(debounce); + debounce = setTimeout(() => { + storeFilterState.searchQuery = this.value.trim(); + applyStoreFiltersAndSort(); + }, 300); + }); + } + + // Category dropdown + const catEl = document.getElementById('plugin-category'); + if (catEl && !catEl._storeFilterInit) { + catEl._storeFilterInit = true; + catEl.addEventListener('change', function() { + storeFilterState.filterCategory = this.value; + applyStoreFiltersAndSort(); + }); + } + + // Sort dropdown + const sortEl = document.getElementById('store-sort'); + if (sortEl && !sortEl._storeFilterInit) { + sortEl._storeFilterInit = true; + sortEl.addEventListener('change', function() { + storeFilterState.sort = this.value; + storeFilterState.persist(); + applyStoreFiltersAndSort(); + }); + } + + // Installed toggle (cycle: all → installed → not-installed → all) + const instBtn = 
document.getElementById('store-filter-installed'); + if (instBtn && !instBtn._storeFilterInit) { + instBtn._storeFilterInit = true; + instBtn.addEventListener('click', function() { + const st = storeFilterState; + if (st.filterInstalled === null) st.filterInstalled = true; + else if (st.filterInstalled === true) st.filterInstalled = false; + else st.filterInstalled = null; + applyStoreFiltersAndSort(); + }); + } + + // Clear filters + const clearBtn = document.getElementById('store-clear-filters'); + if (clearBtn && !clearBtn._storeFilterInit) { + clearBtn._storeFilterInit = true; + clearBtn.addEventListener('click', function() { + storeFilterState.reset(); + const searchEl = document.getElementById('plugin-search'); + if (searchEl) searchEl.value = ''; + const catEl = document.getElementById('plugin-category'); + if (catEl) catEl.value = ''; + const sortEl = document.getElementById('store-sort'); + if (sortEl) sortEl.value = 'a-z'; + storeFilterState.persist(); + applyStoreFiltersAndSort(); + }); + } + + // Per-page selector + const ppEl = document.getElementById('store-per-page'); + if (ppEl && !ppEl._storeFilterInit) { + ppEl._storeFilterInit = true; + ppEl.addEventListener('change', function() { + storeFilterState.perPage = parseInt(this.value) || 12; + storeFilterState.persist(); + applyStoreFiltersAndSort(); + }); + } +} + // Expose searchPluginStore on window.pluginManager for Alpine.js integration window.searchPluginStore = searchPluginStore; window.pluginManager.searchPluginStore = searchPluginStore; @@ -5584,26 +5540,17 @@ function renderPluginStore(plugins) { return JSON.stringify(text || ''); }; - // Build installed lookup for badges - const installedMap = new Map(); - (window.installedPlugins || []).forEach(p => { - installedMap.set(p.id, p.version || ''); - }); - container.innerHTML = plugins.map(plugin => { - const isInstalled = installedMap.has(plugin.id); - const installedVersion = installedMap.get(plugin.id); - const hasUpdate = isInstalled && 
plugin.version && installedVersion && isNewerVersion(plugin.version, installedVersion); + const installed = isStorePluginInstalled(plugin.id); return `
-
+

${escapeHtml(plugin.name || plugin.id)}

${plugin.verified ? 'Verified' : ''} - ${isNewPlugin(plugin.last_updated) ? 'New' : ''} - ${isInstalled ? 'Installed' : ''} - ${hasUpdate ? 'Update' : ''} + ${installed ? 'Installed' : ''} + ${isNewPlugin(plugin.last_updated) ? 'New' : ''} ${plugin._source === 'custom_repository' ? `Custom` : ''}
@@ -5623,26 +5570,25 @@ function renderPluginStore(plugins) { ` : ''} -
+
-
-
-
- `; +
`; }).join(''); } @@ -5664,12 +5610,9 @@ window.installPlugin = function(pluginId, branch = null) { .then(data => { showNotification(data.message, data.status); if (data.status === 'success') { - // Refresh both installed plugins and store + // Refresh installed plugins list, then re-render store to update badges loadInstalledPlugins(); - // Delay store refresh slightly to ensure DOM is ready - setTimeout(() => { - searchPluginStore(); - }, 100); + setTimeout(() => applyStoreFiltersAndSort(true), 500); } }) .catch(error => { @@ -6354,20 +6297,6 @@ function formatCommit(commit, branch) { return 'Latest'; } -// Check if storeVersion is strictly newer than installedVersion (semver-aware) -function isNewerVersion(storeVersion, installedVersion) { - const parse = (v) => (v || '').replace(/^v/, '').split('.').map(n => parseInt(n, 10) || 0); - const a = parse(storeVersion); - const b = parse(installedVersion); - const len = Math.max(a.length, b.length); - for (let i = 0; i < len; i++) { - const diff = (a[i] || 0) - (b[i] || 0); - if (diff > 0) return true; - if (diff < 0) return false; - } - return false; -} - // Check if plugin is new (updated within last 7 days) function isNewPlugin(lastUpdated) { if (!lastUpdated) return false; @@ -7629,3 +7558,598 @@ setTimeout(function() { }, 500); }, 200); +// ─── Starlark Apps Integration ────────────────────────────────────────────── + +(function() { + 'use strict'; + + let starlarkSectionVisible = false; + let starlarkFullCache = null; // All apps from server + let starlarkFilteredList = []; // After filters applied + let starlarkDataLoaded = false; + + // ── Filter State ──────────────────────────────────────────────────────── + const starlarkFilterState = { + sort: safeLocalStorage.getItem('starlarkSort') || 'a-z', + filterInstalled: null, // null=all, true=installed, false=not-installed + filterAuthor: '', + filterCategory: '', + searchQuery: '', + page: 1, + perPage: parseInt(safeLocalStorage.getItem('starlarkPerPage')) || 
24, + persist() { + safeLocalStorage.setItem('starlarkSort', this.sort); + safeLocalStorage.setItem('starlarkPerPage', this.perPage); + }, + reset() { + this.sort = 'a-z'; + this.filterInstalled = null; + this.filterAuthor = ''; + this.filterCategory = ''; + this.searchQuery = ''; + this.page = 1; + }, + activeCount() { + let n = 0; + if (this.searchQuery) n++; + if (this.filterInstalled !== null) n++; + if (this.filterAuthor) n++; + if (this.filterCategory) n++; + if (this.sort !== 'a-z') n++; + return n; + } + }; + + // ── Helpers ───────────────────────────────────────────────────────────── + function escapeHtml(str) { + if (!str) return ''; + const div = document.createElement('div'); + div.textContent = str; + return div.innerHTML; + } + + function isStarlarkInstalled(appId) { + // Check window.installedPlugins (populated by loadInstalledPlugins) + if (window.installedPlugins && Array.isArray(window.installedPlugins)) { + return window.installedPlugins.some(p => p.id === 'starlark:' + appId); + } + return false; + } + + // ── Section Toggle + Init ─────────────────────────────────────────────── + function initStarlarkSection() { + const toggleBtn = document.getElementById('toggle-starlark-section'); + if (toggleBtn && !toggleBtn._starlarkInit) { + toggleBtn._starlarkInit = true; + toggleBtn.addEventListener('click', function() { + starlarkSectionVisible = !starlarkSectionVisible; + const content = document.getElementById('starlark-section-content'); + const icon = document.getElementById('starlark-section-icon'); + if (content) content.classList.toggle('hidden', !starlarkSectionVisible); + if (icon) { + icon.classList.toggle('fa-chevron-down', !starlarkSectionVisible); + icon.classList.toggle('fa-chevron-up', starlarkSectionVisible); + } + this.querySelector('span').textContent = starlarkSectionVisible ? 
'Hide' : 'Show'; + if (starlarkSectionVisible) { + loadStarlarkStatus(); + if (!starlarkDataLoaded) fetchStarlarkApps(); + } + }); + } + + // Restore persisted sort/perPage + const sortEl = document.getElementById('starlark-sort'); + if (sortEl) sortEl.value = starlarkFilterState.sort; + const ppEl = document.getElementById('starlark-per-page'); + if (ppEl) ppEl.value = starlarkFilterState.perPage; + + setupStarlarkFilterListeners(); + + const uploadBtn = document.getElementById('starlark-upload-btn'); + if (uploadBtn && !uploadBtn._starlarkInit) { + uploadBtn._starlarkInit = true; + uploadBtn.addEventListener('click', function() { + const input = document.createElement('input'); + input.type = 'file'; + input.accept = '.star'; + input.onchange = function(e) { + if (e.target.files.length > 0) uploadStarlarkFile(e.target.files[0]); + }; + input.click(); + }); + } + } + + // ── Status ────────────────────────────────────────────────────────────── + function loadStarlarkStatus() { + fetch('/api/v3/starlark/status') + .then(r => r.json()) + .then(data => { + const banner = document.getElementById('starlark-pixlet-status'); + if (!banner) return; + if (data.pixlet_available) { + banner.innerHTML = `
+ Pixlet available${data.pixlet_version ? ' (' + escapeHtml(data.pixlet_version) + ')' : ''} — ${data.installed_apps || 0} app(s) installed +
`; + } else { + banner.innerHTML = `
+ Pixlet not installed. + +
`; + } + }) + .catch(err => console.error('Starlark status error:', err)); + } + + // ── Bulk Fetch All Apps ───────────────────────────────────────────────── + function fetchStarlarkApps() { + const grid = document.getElementById('starlark-apps-grid'); + if (grid) { + grid.innerHTML = `
+
+ ${Array(10).fill('
').join('')} +
+
`; + } + + fetch('/api/v3/starlark/repository/browse') + .then(r => r.json()) + .then(data => { + if (data.status !== 'success') { + if (grid) grid.innerHTML = `
${escapeHtml(data.message || 'Failed to load')}
`; + return; + } + + starlarkFullCache = data.apps || []; + starlarkDataLoaded = true; + + // Populate category dropdown + const catSelect = document.getElementById('starlark-category'); + if (catSelect) { + catSelect.innerHTML = ''; + (data.categories || []).forEach(cat => { + const opt = document.createElement('option'); + opt.value = cat; + opt.textContent = cat; + catSelect.appendChild(opt); + }); + } + + // Populate author dropdown + const authSelect = document.getElementById('starlark-filter-author'); + if (authSelect) { + authSelect.innerHTML = ''; + (data.authors || []).forEach(author => { + const opt = document.createElement('option'); + opt.value = author; + opt.textContent = author; + authSelect.appendChild(opt); + }); + } + + const countEl = document.getElementById('starlark-apps-count'); + if (countEl) countEl.textContent = `${data.count} apps`; + + if (data.rate_limit) { + console.log(`[Starlark] GitHub rate limit: ${data.rate_limit.remaining}/${data.rate_limit.limit} remaining` + (data.cached ? ' (cached)' : '')); + } + + applyStarlarkFiltersAndSort(); + }) + .catch(err => { + console.error('Starlark browse error:', err); + if (grid) grid.innerHTML = '
Error loading apps
'; + }); + } + + // ── Apply Filters + Sort ──────────────────────────────────────────────── + function applyStarlarkFiltersAndSort(skipPageReset) { + if (!starlarkFullCache) return; + const st = starlarkFilterState; + + let list = starlarkFullCache.slice(); + + // Text search + if (st.searchQuery) { + const q = st.searchQuery.toLowerCase(); + list = list.filter(app => { + const hay = [app.name, app.summary, app.desc, app.author, app.id, app.category] + .filter(Boolean).join(' ').toLowerCase(); + return hay.includes(q); + }); + } + + // Category filter + if (st.filterCategory) { + const cat = st.filterCategory.toLowerCase(); + list = list.filter(app => (app.category || '').toLowerCase() === cat); + } + + // Author filter + if (st.filterAuthor) { + list = list.filter(app => app.author === st.filterAuthor); + } + + // Installed filter + if (st.filterInstalled === true) { + list = list.filter(app => isStarlarkInstalled(app.id)); + } else if (st.filterInstalled === false) { + list = list.filter(app => !isStarlarkInstalled(app.id)); + } + + // Sort + list.sort((a, b) => { + const nameA = (a.name || a.id || '').toLowerCase(); + const nameB = (b.name || b.id || '').toLowerCase(); + switch (st.sort) { + case 'z-a': return nameB.localeCompare(nameA); + case 'category': { + const catCmp = (a.category || '').localeCompare(b.category || ''); + return catCmp !== 0 ? catCmp : nameA.localeCompare(nameB); + } + case 'author': { + const authCmp = (a.author || '').localeCompare(b.author || ''); + return authCmp !== 0 ? 
authCmp : nameA.localeCompare(nameB); + } + default: return nameA.localeCompare(nameB); // a-z + } + }); + + starlarkFilteredList = list; + if (!skipPageReset) st.page = 1; + + renderStarlarkPage(); + updateStarlarkFilterUI(); + } + + // ── Render Current Page ───────────────────────────────────────────────── + function renderStarlarkPage() { + const st = starlarkFilterState; + const total = starlarkFilteredList.length; + const totalPages = Math.max(1, Math.ceil(total / st.perPage)); + if (st.page > totalPages) st.page = totalPages; + + const start = (st.page - 1) * st.perPage; + const end = Math.min(start + st.perPage, total); + const pageApps = starlarkFilteredList.slice(start, end); + + // Results info + const info = total > 0 + ? `Showing ${start + 1}\u2013${end} of ${total} apps` + : 'No apps match your filters'; + const infoEl = document.getElementById('starlark-results-info'); + const infoElBot = document.getElementById('starlark-results-info-bottom'); + if (infoEl) infoEl.textContent = info; + if (infoElBot) infoElBot.textContent = info; + + // Pagination + renderStarlarkPagination('starlark-pagination-top', totalPages, st.page); + renderStarlarkPagination('starlark-pagination-bottom', totalPages, st.page); + + // Grid + const grid = document.getElementById('starlark-apps-grid'); + renderStarlarkApps(pageApps, grid); + } + + // ── Pagination Controls ───────────────────────────────────────────────── + function renderStarlarkPagination(containerId, totalPages, currentPage) { + const container = document.getElementById(containerId); + if (!container) return; + + if (totalPages <= 1) { container.innerHTML = ''; return; } + + const btnClass = 'px-3 py-1 text-sm rounded-md border transition-colors'; + const activeClass = 'bg-blue-600 text-white border-blue-600'; + const normalClass = 'bg-white text-gray-700 border-gray-300 hover:bg-gray-100 cursor-pointer'; + const disabledClass = 'bg-gray-100 text-gray-400 border-gray-200 cursor-not-allowed'; + + let html = ''; 
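The page-number windowing used by the pagination controls in this diff (always show page 1 and the last page, keep a one-page window around the current page, and fill gaps with `'...'`) can be factored into a pure helper. A minimal sketch, not the app's exact code, assuming the same one-page window radius:

```javascript
// Compute the list of page labels for a pagination bar: page 1, the last
// page, a one-page window around the current page, and '...' for any gaps.
function pageWindow(currentPage, totalPages) {
  if (totalPages <= 1) return [1];
  const pages = [1];
  if (currentPage > 3) pages.push('...');
  for (let i = Math.max(2, currentPage - 1); i <= Math.min(totalPages - 1, currentPage + 1); i++) {
    pages.push(i);
  }
  if (currentPage < totalPages - 2) pages.push('...');
  pages.push(totalPages);
  return pages;
}
```

Keeping the label computation separate from the button-HTML generation makes the ellipsis logic unit-testable without a DOM.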
+ + // Prev + html += ``; + + // Page numbers with ellipsis + const pages = []; + pages.push(1); + if (currentPage > 3) pages.push('...'); + for (let i = Math.max(2, currentPage - 1); i <= Math.min(totalPages - 1, currentPage + 1); i++) { + pages.push(i); + } + if (currentPage < totalPages - 2) pages.push('...'); + if (totalPages > 1) pages.push(totalPages); + + pages.forEach(p => { + if (p === '...') { + html += ``; + } else { + html += ``; + } + }); + + // Next + html += ``; + + container.innerHTML = html; + + // Event delegation for page buttons + container.querySelectorAll('[data-starlark-page]').forEach(btn => { + btn.addEventListener('click', function() { + const p = parseInt(this.getAttribute('data-starlark-page')); + if (p >= 1 && p <= totalPages && p !== currentPage) { + starlarkFilterState.page = p; + renderStarlarkPage(); + // Scroll to top of grid + const grid = document.getElementById('starlark-apps-grid'); + if (grid) grid.scrollIntoView({ behavior: 'smooth', block: 'start' }); + } + }); + }); + } + + // ── Card Rendering ────────────────────────────────────────────────────── + function renderStarlarkApps(apps, grid) { + if (!grid) return; + if (!apps || apps.length === 0) { + grid.innerHTML = '

No Starlark apps found

'; + return; + } + + grid.innerHTML = apps.map(app => { + const installed = isStarlarkInstalled(app.id); + return ` +
+
+
+
+

${escapeHtml(app.name || app.id)}

+ Starlark + ${installed ? 'Installed' : ''} +
+
+ ${app.author ? `

${escapeHtml(app.author)}

` : ''} + ${app.category ? `

${escapeHtml(app.category)}

` : ''} +
+

${escapeHtml(app.summary || app.desc || 'No description')}

+
+
+
+ + +
+
`; + }).join(''); + + // Add delegated event listener only once (prevent duplicate handlers) + if (!grid.dataset.starlarkHandlerAttached) { + grid.addEventListener('click', function handleStarlarkGridClick(e) { + const button = e.target.closest('button[data-action]'); + if (!button) return; + + const card = button.closest('.plugin-card'); + if (!card) return; + + const appId = card.dataset.appId; + if (!appId) return; + + const action = button.dataset.action; + if (action === 'install') { + window.installStarlarkApp(appId); + } else if (action === 'view') { + window.open('https://github.com/tronbyt/apps/tree/main/apps/' + encodeURIComponent(appId), '_blank'); + } + }); + grid.dataset.starlarkHandlerAttached = 'true'; + } + } + + // ── Filter UI Updates ─────────────────────────────────────────────────── + function updateStarlarkFilterUI() { + const st = starlarkFilterState; + const count = st.activeCount(); + + const badge = document.getElementById('starlark-active-filters'); + const clearBtn = document.getElementById('starlark-clear-filters'); + if (badge) { + badge.classList.toggle('hidden', count === 0); + badge.textContent = count + ' filter' + (count !== 1 ? 
's' : '') + ' active'; + } + if (clearBtn) clearBtn.classList.toggle('hidden', count === 0); + + // Update installed toggle button text + const instBtn = document.getElementById('starlark-filter-installed'); + if (instBtn) { + if (st.filterInstalled === true) { + instBtn.innerHTML = 'Installed'; + instBtn.classList.add('border-green-400', 'bg-green-50'); + instBtn.classList.remove('border-gray-300', 'bg-white', 'border-red-400', 'bg-red-50'); + } else if (st.filterInstalled === false) { + instBtn.innerHTML = 'Not Installed'; + instBtn.classList.add('border-red-400', 'bg-red-50'); + instBtn.classList.remove('border-gray-300', 'bg-white', 'border-green-400', 'bg-green-50'); + } else { + instBtn.innerHTML = 'All'; + instBtn.classList.add('border-gray-300', 'bg-white'); + instBtn.classList.remove('border-green-400', 'bg-green-50', 'border-red-400', 'bg-red-50'); + } + } + } + + // ── Event Listeners ───────────────────────────────────────────────────── + function setupStarlarkFilterListeners() { + // Search with debounce + const searchEl = document.getElementById('starlark-search'); + if (searchEl && !searchEl._starlarkInit) { + searchEl._starlarkInit = true; + let debounce = null; + searchEl.addEventListener('input', function() { + clearTimeout(debounce); + debounce = setTimeout(() => { + starlarkFilterState.searchQuery = this.value.trim(); + applyStarlarkFiltersAndSort(); + }, 300); + }); + } + + // Category dropdown + const catEl = document.getElementById('starlark-category'); + if (catEl && !catEl._starlarkInit) { + catEl._starlarkInit = true; + catEl.addEventListener('change', function() { + starlarkFilterState.filterCategory = this.value; + applyStarlarkFiltersAndSort(); + }); + } + + // Sort dropdown + const sortEl = document.getElementById('starlark-sort'); + if (sortEl && !sortEl._starlarkInit) { + sortEl._starlarkInit = true; + sortEl.addEventListener('change', function() { + starlarkFilterState.sort = this.value; + starlarkFilterState.persist(); + 
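The filter listeners persist only `sort` and `perPage` across reloads via `safeLocalStorage`, which this diff does not define; it is presumably a wrapper that keeps `localStorage` failures (private browsing, disabled storage) from throwing. A minimal sketch of such a wrapper and the read-back round-trip, using an in-memory fallback; `safeStore` is an illustrative stand-in, not the app's actual implementation:

```javascript
// Illustrative stand-in for a safe localStorage wrapper: falls back to an
// in-memory Map when window.localStorage is unavailable or throws.
const memory = new Map();
const safeStore = {
  getItem(key) {
    try { return window.localStorage.getItem(key); }
    catch (e) { return memory.has(key) ? memory.get(key) : null; }
  },
  setItem(key, value) {
    try { window.localStorage.setItem(key, String(value)); }
    catch (e) { memory.set(key, String(value)); }
  }
};

// Round-trip: storage only holds strings, so numeric settings must be
// re-parsed with a default, as the perPage listeners in this file do.
safeStore.setItem('starlarkPerPage', 24);
const perPage = parseInt(safeStore.getItem('starlarkPerPage'), 10) || 24;
```

The `|| 24` default also covers a corrupted or missing stored value, which is why the listeners combine `parseInt` with a fallback rather than trusting the stored string.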
applyStarlarkFiltersAndSort(); + }); + } + + // Author dropdown + const authEl = document.getElementById('starlark-filter-author'); + if (authEl && !authEl._starlarkInit) { + authEl._starlarkInit = true; + authEl.addEventListener('change', function() { + starlarkFilterState.filterAuthor = this.value; + applyStarlarkFiltersAndSort(); + }); + } + + // Installed toggle (cycle: all → installed → not-installed → all) + const instBtn = document.getElementById('starlark-filter-installed'); + if (instBtn && !instBtn._starlarkInit) { + instBtn._starlarkInit = true; + instBtn.addEventListener('click', function() { + const st = starlarkFilterState; + if (st.filterInstalled === null) st.filterInstalled = true; + else if (st.filterInstalled === true) st.filterInstalled = false; + else st.filterInstalled = null; + applyStarlarkFiltersAndSort(); + }); + } + + // Clear filters + const clearBtn = document.getElementById('starlark-clear-filters'); + if (clearBtn && !clearBtn._starlarkInit) { + clearBtn._starlarkInit = true; + clearBtn.addEventListener('click', function() { + starlarkFilterState.reset(); + // Reset UI elements + const searchEl = document.getElementById('starlark-search'); + if (searchEl) searchEl.value = ''; + const catEl = document.getElementById('starlark-category'); + if (catEl) catEl.value = ''; + const sortEl = document.getElementById('starlark-sort'); + if (sortEl) sortEl.value = 'a-z'; + const authEl = document.getElementById('starlark-filter-author'); + if (authEl) authEl.value = ''; + starlarkFilterState.persist(); + applyStarlarkFiltersAndSort(); + }); + } + + // Per-page selector + const ppEl = document.getElementById('starlark-per-page'); + if (ppEl && !ppEl._starlarkInit) { + ppEl._starlarkInit = true; + ppEl.addEventListener('change', function() { + starlarkFilterState.perPage = parseInt(this.value) || 24; + starlarkFilterState.persist(); + applyStarlarkFiltersAndSort(); + }); + } + } + + // ── Install / Upload / Pixlet 
─────────────────────────────────────────── + window.installStarlarkApp = function(appId) { + if (!confirm(`Install Starlark app "${appId}" from Tronbyte repository?`)) return; + + fetch('/api/v3/starlark/repository/install', { + method: 'POST', + headers: {'Content-Type': 'application/json'}, + body: JSON.stringify({app_id: appId}) + }) + .then(r => r.json()) + .then(data => { + if (data.status === 'success') { + alert(`Installed: ${data.message || appId}`); + // Refresh installed plugins list + if (typeof loadInstalledPlugins === 'function') loadInstalledPlugins(); + else if (typeof window.loadInstalledPlugins === 'function') window.loadInstalledPlugins(); + // Re-render current page to update installed badges + setTimeout(() => applyStarlarkFiltersAndSort(true), 500); + } else { + alert(`Install failed: ${data.message || 'Unknown error'}`); + } + }) + .catch(err => { + console.error('Install error:', err); + alert('Install failed: ' + err.message); + }); + }; + + window.installPixlet = function() { + if (!confirm('Download and install Pixlet binary? 
This may take a few minutes.')) return; + + fetch('/api/v3/starlark/install-pixlet', {method: 'POST'}) + .then(r => r.json()) + .then(data => { + if (data.status === 'success') { + alert(data.message || 'Pixlet installed!'); + loadStarlarkStatus(); + } else { + alert('Pixlet install failed: ' + (data.message || 'Unknown error')); + } + }) + .catch(err => alert('Pixlet install failed: ' + err.message)); + }; + + function uploadStarlarkFile(file) { + const formData = new FormData(); + formData.append('file', file); + + const appId = file.name.replace('.star', ''); + formData.append('app_id', appId); + formData.append('name', appId.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase())); + + fetch('/api/v3/starlark/upload', {method: 'POST', body: formData}) + .then(r => r.json()) + .then(data => { + if (data.status === 'success') { + alert(`Uploaded: ${data.app_id}`); + if (typeof loadInstalledPlugins === 'function') loadInstalledPlugins(); + else if (typeof window.loadInstalledPlugins === 'function') window.loadInstalledPlugins(); + setTimeout(() => applyStarlarkFiltersAndSort(true), 500); + } else { + alert('Upload failed: ' + (data.message || 'Unknown error')); + } + }) + .catch(err => alert('Upload failed: ' + err.message)); + } + + // ── Bootstrap ─────────────────────────────────────────────────────────── + const origInit = window.initializePlugins; + window.initializePlugins = function() { + if (origInit) origInit(); + initStarlarkSection(); + }; + + document.addEventListener('DOMContentLoaded', initStarlarkSection); + document.addEventListener('htmx:afterSwap', function(e) { + if (e.detail && e.detail.target && e.detail.target.id === 'plugins-content') { + initStarlarkSection(); + } + }); +})(); + diff --git a/web_interface/templates/v3/partials/plugins.html b/web_interface/templates/v3/partials/plugins.html index b861aabec..3b01bd74d 100644 --- a/web_interface/templates/v3/partials/plugins.html +++ b/web_interface/templates/v3/partials/plugins.html @@ 
-28,16 +28,6 @@

Plugin Management

Installed Plugins

0 installed
-
- - -
@@ -157,83 +147,58 @@

GitHub API Configuration

-
-
- - - -
+ +
+ +
- -
- -
- -
- - -
- - -
+ +
+ + + +
+ + + - -
- Filter: - - - -
+
- -
+ + + +
- - + + + - - - -
- - -
@@ -246,11 +211,106 @@

GitHub API Configuration

+
+ + +
+ +
+ +
+
+
+

Starlark Apps

+ +
+ +
+ +
+
diff --git a/web_interface/templates/v3/partials/starlark_config.html b/web_interface/templates/v3/partials/starlark_config.html
new file mode 100644
index 000000000..af50a7a9f
--- /dev/null
+++ b/web_interface/templates/v3/partials/starlark_config.html
@@ -0,0 +1,456 @@
+
+ +
+
+

+ {{ app_name }} +

+

Starlark App — ID: {{ app_id }}

+
+
+ Starlark + {% if app_enabled %} + Enabled + {% else %} + Disabled + {% endif %} +
+
+ + +
+

Status

+
+
+ Frames: + {{ frame_count if has_frames else 'Not rendered' }} +
+
+ Render Interval: + {{ render_interval }}s +
+
+ Display Duration: + {{ display_duration }}s +
+
+ Last Render: + {{ last_render_time }} +
+
+
+ + +
+ + +
+ + +
+

Timing Settings

+
+
+
+ + +

How often the app re-renders (fetches new data)

+
+
+ + +

How long the app displays before rotating

+
+
+ + {# ── Schema-driven App Settings ── #} + {% set fields = [] %} + {% if schema %} + {% if schema.fields is defined %} + {% set fields = schema.fields %} + {% elif schema.schema is defined and schema.schema is iterable and schema.schema is not string %} + {% set fields = schema.schema %} + {% endif %} + {% endif %} + + {% if fields %} +
+

App Settings

+ + {% for field in fields %} + {% if field.typeOf is defined and field.id is defined %} + {% set field_id = field.id %} + {% set field_type = field.typeOf %} + {% set field_name = field.name or field_id %} + {% set field_desc = field.desc or '' %} + {% set field_default = field.default if field.default is defined else '' %} + {% set current_val = config.get(field_id, field_default) if config else field_default %} + + {# ── text ── #} + {% if field_type == 'text' %} +
+ + + {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── dropdown ── #} + {% elif field_type == 'dropdown' %} +
+ + + {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── toggle ── #} + {% elif field_type == 'toggle' %} +
+ + {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── color ── #} + {% elif field_type == 'color' %} +
+ +
+ + +
+ {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── datetime ── #} + {% elif field_type == 'datetime' %} +
+ + + {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── location (mini-form) ── #} + {% elif field_type == 'location' %} +
+ +
+
+ +
+
+ +
+
+ +
+
+ {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── oauth2 (unsupported) ── #} + {% elif field_type == 'oauth2' %} +
+ +
+ + This app requires OAuth2 authentication, which is not supported in standalone mode. +
+ {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── photo_select (unsupported) ── #} + {% elif field_type == 'photo_select' %} +
+ +
+ + Photo upload is not supported in this interface. +
+
+ + {# ── generated (hidden meta-field, skip) ── #} + {% elif field_type == 'generated' %} + {# Invisible — generated fields are handled server-side by Pixlet #} + + {# ── typeahead / location_based (text fallback with note) ── #} + {% elif field_type in ('typeahead', 'location_based') %} +
+ + +

+ + This field normally uses autocomplete which requires a Pixlet server. Enter the value manually. +

+ {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ + {# ── unknown type (text fallback) ── #} + {% else %} +
+ + + {% if field_desc %} +

{{ field_desc }}

+ {% endif %} +
+ {% endif %} + + {% endif %}{# end field.typeOf and field.id check #} + {% endfor %} + + {# Also show any config keys NOT in the schema (user-added or legacy) #} + {% if config %} + {% set schema_ids = [] %} + {% for f in fields %} + {% if f.id is defined %} + {% if schema_ids.append(f.id) %}{% endif %} + {% endif %} + {% endfor %} + {% for key, value in config.items() %} + {% if key not in ('render_interval', 'display_duration') and key not in schema_ids %} +
+ + +
+ {% endif %} + {% endfor %} + {% endif %} + + {# ── No schema: fall back to raw config key/value pairs ── #} + {% elif config %} +
+

App Settings

+ {% for key, value in config.items() %} + {% if key not in ('render_interval', 'display_duration') %} +
+ + +
+ {% endif %} + {% endfor %} + {% endif %} + + +
+
+
+ +
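The schema-driven settings form above branches on `field.typeOf` for each entry in `schema.fields` (or `schema.schema`). The sketch below illustrates the data shape the template appears to consume and how a backend might derive initial config defaults from it. Only the key names (`id`, `typeOf`, `name`, `default`) and the skipped types (`oauth2`, `photo_select`, `generated`) are taken from the template; the helper name and example values are hypothetical.

```python
def default_config_from_schema(schema):
    """Build an initial config dict from a Pixlet-style schema dict.

    Skips field types the standalone settings UI cannot collect,
    mirroring the template's unsupported branches. (Illustrative
    helper; not part of the diff above.)
    """
    unsupported = {"oauth2", "photo_select", "generated"}
    config = {}
    for field in schema.get("fields", []):
        # Fields without an id, or of an unsupported type, produce no input
        if "id" not in field or field.get("typeOf") in unsupported:
            continue
        config[field["id"]] = field.get("default", "")
    return config


# Hypothetical schema in the shape the template reads
example_schema = {
    "fields": [
        {"id": "city", "typeOf": "text", "name": "City", "default": "Berlin"},
        {"id": "units", "typeOf": "dropdown", "name": "Units", "default": "metric"},
        {"id": "auth", "typeOf": "oauth2", "name": "Account"},  # skipped by the UI
    ]
}

print(default_config_from_schema(example_schema))
# {'city': 'Berlin', 'units': 'metric'}
```

Keys not covered by the schema still reach the form via the template's fallback loop over `config.items()`, so user-added or legacy settings are never silently dropped.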