
NeonCore


Welcome to NeonCore - A Text-Based Cyberpunk RPG

Dive into the neon-soaked streets of Night City in this immersive text-based role-playing game. As a cyberpunk operative, navigate through a world of corporate intrigue, street-level conflicts, and high-tech warfare.

Features:

  • Choose your role: Become a street-smart Netrunner, a combat-ready Solo, or other unique characters
  • Deep character customization with detailed lifepaths and backgrounds
  • Dynamic skill-based gameplay system
  • Text-based exploration of Night City's various districts
  • Engaging NPC interactions and encounters (Combat & Social)
  • Command-line interface with cyberpunk aesthetic
  • Detailed Life Path system including cultural regions, personality, and background

Built with Python, NeonCore delivers a classic RPG experience inspired by the Cyberpunk 2020 tabletop game system. Use your skills, cyber-enhancements, and street smarts to survive and thrive in the dark future of 2045.

Installation & Usage

Windows (PowerShell)

  1. Setup Environment:

    python -m venv venv
    .\venv\Scripts\python.exe -m pip install --upgrade pip
    .\venv\Scripts\python.exe -m pip install -e .
  2. Run NeonCore (Client-Server Mode, default):

    .\play.ps1

    Starts the Game Server in the background and launches the Client.

    Note: play.ps1 automatically uses the virtual environment's Python, bypassing the need for manual activation scripts and avoiding common PowerShell permission errors.

Linux/macOS

  1. Setup:
    python3 -m venv venv
    source venv/bin/activate
    pip install -e .
  2. Run:
    python run_game.py

AI & Architecture

Architecture

NeonCore utilizes a Client-Server architecture by default.

  • Server: Handles game state, world simulation, and AI requests (server.py).
  • Client: Handles UI and user input (client.py).
  • Database (Concept): A decentralized Nostr relay system is planned for persistent world state and player communications, but it is currently in the conceptual phase.

AI Backend

The game requires an LLM to power the "Digital Soul" and NPC interactions.

  1. Ollama (Recommended - Local)

    • Install Ollama and pull the verified model:
      ollama pull qwen3:32b
    • Note: Code includes specific JSON sanitization fixes for Qwen's creative output.
    • Configurable in NeonCore/config.py.
  2. Gemini (Alternative - Cloud)

    • Get API Key from Google AI Studio.
    • Set env var: GEMINI_API_KEY="your-key"
    • Change default_backend to gemini in config.py.
  3. Remote Inference (Optional)

    If you are running Ollama on a separate, more powerful machine (e.g., a desktop with an RTX 4090), follow these steps:

    Host Machine (where Ollama runs):

    • Bind Address: Ensure Ollama listens on all interfaces.
      • Linux: OLLAMA_HOST="0.0.0.0" ollama serve
      • Windows: Set env var OLLAMA_HOST to 0.0.0.0 before starting.
    • Firewall: Allow inbound traffic on port 11434 (TCP).
      • Fedora/RHEL: sudo firewall-cmd --add-port=11434/tcp --permanent && sudo firewall-cmd --reload
      • Windows: Add an Inbound Rule in "Windows Defender Firewall".

    Client Machine (where NeonCore runs):

    • Edit .env to point to the host:
      OLLAMA_HOST="http://192.168.0.x:11434"
      OLLAMA_MODEL="qwen3:32b"  # Or your preferred model

In-Game Chat

  1. Approach NPC: Type talk [NPC Name] (e.g., talk Lenard).
  2. Chat: Just type natural text.
    You -> Lenard > Where is the money?
    Lenard: (Sweating) look, I don't have it all...
    
  3. Commands: Type bye to exit, or take [item] to grab things mid-conversation.
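The dispatch described in steps 1–3 might be sketched as follows (a hypothetical parser for illustration, not the game's actual code):

```python
def parse_chat_input(text: str) -> tuple[str, str]:
    """Classify one line of player input during a conversation.

    Returns a (command, argument) pair: "bye" ends the chat, "take"
    grabs an item, and anything else is natural text sent to the NPC.
    """
    words = text.strip().split(maxsplit=1)
    if not words:
        return ("say", "")
    verb = words[0].lower()
    arg = words[1] if len(words) > 1 else ""
    if verb == "bye":
        return ("bye", "")
    if verb == "take":
        return ("take", arg)
    # Anything else goes to the LLM as dialogue.
    return ("say", text.strip())
```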

Developer & Debug Commands

Use these commands to test game systems on the fly without setting up complex scenarios.

  • dev_fan [npc_name]: Forces the specified NPC to become a "Fan" of the player (Relationship Status: Fan).
    • Usage: dev_fan lenard
    • Effect: look shows [ FAN ] tag. talk uses "Charismatic Impact" context.
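A debug command like dev_fan could be implemented along these lines; the data structures and names here are illustrative, not NeonCore's actual ones:

```python
from dataclasses import dataclass

@dataclass
class NPC:
    name: str
    relationship: str = "Neutral"

def dev_fan(npcs: dict[str, NPC], npc_name: str) -> str:
    """Debug helper: force an NPC's relationship status to 'Fan'."""
    npc = npcs.get(npc_name.lower())
    if npc is None:
        return f"No NPC named '{npc_name}' here."
    npc.relationship = "Fan"
    return f"{npc.name} is now a [ FAN ] of the player."
```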
