
Conversation


@GIND123 commented Dec 8, 2025

This PR introduces a minimal REST bridge module that provides HTTP access to the OpenCog CogServer Scheme shell for external tools and services. The bridge enables remote submission of Atomese/Scheme code into the AtomSpace and returns the raw evaluation output, allowing integration with Python scripts, LLM orchestrators, and other experimental clients without modifying existing CogServer internals.

The implementation consists of a lightweight FastAPI server maintaining a persistent telnet connection to CogServer, a basic /atomese POST endpoint, and placeholder Scheme helpers for future reasoning workflows. End-to-end functionality has been manually verified by executing atom creation commands through REST and receiving live evaluation results from the AtomSpace. No semantic parsing, agent logic, or post-processing is included — the bridge is intentionally infrastructure-only.
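
For concreteness, a minimal sketch of this shape might look like the following. This is illustrative only: the port (17001), the scm shell-entry command, and the guile> prompt string are assumptions about the CogServer telnet shell, and unlike the actual module it opens a fresh connection per request rather than keeping a persistent one.

```python
# Illustrative sketch only -- port, shell-entry command, and prompt string
# are assumptions; the actual module keeps one persistent connection.
import socket

from fastapi import FastAPI
from pydantic import BaseModel

COGSERVER_HOST = "localhost"
COGSERVER_PORT = 17001  # assumed CogServer telnet port

app = FastAPI()


class AtomeseRequest(BaseModel):
    code: str  # raw Scheme/Atomese source to evaluate


def read_until_prompt(sock: socket.socket, prompt: bytes = b"guile> ") -> bytes:
    """Collect output until the (assumed) shell prompt reappears."""
    buf = b""
    while not buf.endswith(prompt):
        chunk = sock.recv(4096)
        if not chunk:  # connection closed before prompt; return what we have
            break
        buf += chunk
    return buf


@app.post("/atomese")
def eval_atomese(req: AtomeseRequest) -> dict:
    with socket.create_connection((COGSERVER_HOST, COGSERVER_PORT)) as sock:
        sock.sendall(b"scm\n")       # enter the Scheme shell (assumed command)
        read_until_prompt(sock)      # discard the banner
        sock.sendall(req.code.encode() + b"\n")
        raw = read_until_prompt(sock)
        return {"result": raw.decode(errors="replace")}
```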

This module establishes the basic external interface needed to support ongoing research into LLM–symbolic co-reasoning and long-term neuro-symbolic memory systems. It is placed under stream/rest-bridge as an experimental component aligned with the directory’s broader streaming infrastructure goals, while leaving all higher-level cognitive control design for future work.


@linas (Member) commented Dec 9, 2025

Well, that's nice, except that the cogserver already provides this; see https://github.com/opencog/cogserver

It provides http, websockets, and telnet as the network communications protocols, and supports python, scheme, json, sexpr, and mcp over all of them -- three protocols times five access methods, for a total of 15 different ways to get at the data. Click through to the examples pages there: once you have a cogserver running, the examples pages will load (over http://localhost:18080) and you can then run e.g. the visualizer, which is a javascript demo.
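
For example, one of those 15 combinations (the Scheme shell over websockets) can be exercised from Python in a few lines. The 18080 port comes from the paragraph above; the /scm endpoint path is an assumption inferred from the shell names, so check the examples pages for the exact URLs.

```python
# Hedged sketch: the Scheme shell over a cogserver websocket. The /scm
# endpoint path is an assumption; see the examples pages for exact URLs.
import asyncio
import websockets  # pip install websockets

async def main() -> None:
    async with websockets.connect("ws://localhost:18080/scm") as ws:
        await ws.send('(Concept "hello")')  # any Scheme/Atomese expression
        print(await ws.recv())              # raw evaluation output

asyncio.run(main())
```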

If the documentation there is insufficient, if things don't work, if you're disappointed and want something else ... open issues, pull requests, stuff like that, over there.


@linas (Member) commented Dec 9, 2025

Also -- what are you interested in? What brought you here? What do you hope to accomplish?


@linas (Member) commented Dec 10, 2025

A few more notes that might be of interest, based on what you wrote:

  • If you want LLM integration, try the MCP interfaces on the cogserver. They work fine. There are maybe eight large "resource" files (like prompts, but for fixed documentation) that explain to the LLM exactly what Atomese is and how to use it. I've been able to just talk to Claude (Sonnet 4.5) and have it write reasonably complex Atomese data-processing pipelines. Mostly you just have to keep reminding it to read the documentation, and then read it again, whenever it does something dumb.

  • If you are interested in network distributed storage, take a look at the https://github.com/opencog/atomspace-cog project. It provides networking for the AtomSpace, so you can move Atoms around on the net. By using the ProxyNodes, you can build network fabrics -- there are read load-balancing, write-buffering, and write-mirroring proxies, so you can ship Atoms around however you want, from node to node.

  • I'm thinking of converting the cogserver to a so-called ObjectNode, which means you'll be able to stop and start it on the fly, maybe even bounce it around, with its dataset, from one place to another. This is in the "cool but stupid computer trick" category; fun to think about, but no one has actually expressed a desire for this yet.

  • If you want to avoid the hassle of downloading and compiling everything by hand, try the docker containers over at https://github.com/opencog/docker -- git clone that repo, go to the opencog dir, then say ./docker-build.sh -a and it will download this week's latest containers. The one you want is probably atomspace-py: it has both the atomspace and the python tools in it. Just start it, then install claude or chatgpt, and then ask the LLM to fiddle with Atomese for you, have it walk through the demos or something. I dunno, "do stuff with it". Since it's in a container, you can let it run wild without fear of losing anything important. (A quick smoke test, sketched below, will confirm the python bindings are alive.)
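
A minimal smoke test, assuming the current AtomSpace Python bindings (the type_constructors module and the set_default_atomspace helper; older releases used initialize_opencog instead):

```python
# Smoke test for the atomspace-py container (a sketch; assumes the
# type_constructors / set_default_atomspace API of recent AtomSpace
# Python bindings -- older releases used initialize_opencog instead).
from opencog.atomspace import AtomSpace
from opencog.type_constructors import ConceptNode, set_default_atomspace

space = AtomSpace()
set_default_atomspace(space)  # type constructors put atoms here

hello = ConceptNode("hello")  # create one atom in the AtomSpace
print(hello)                  # prints the atom in Atomese notation
```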

@linas closed this Dec 10, 2025