
LocalCommandLineCodeExecutor to support PowerShell #5518

@zytoh0

Description


What happened?

Ran the simple script below with Ollama to get local code execution with PowerShell, but the language is reported as not supported.

import asyncio
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_agentchat.agents import AssistantAgent, CodeExecutorAgent
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.code_executors.local import LocalCommandLineCodeExecutor

async def main() -> None:
    # Set up the model client to use the local Ollama model "llama3.2:3b-instruct-fp16"
    model_client = OpenAIChatCompletionClient(
        model="qwen2.5:32b-instruct-q6_K",
        base_url="http://localhost:11434/v1",  # Adjust if your Ollama server runs on a different address/port.
        api_key="placeholder",  # Replace with your API key if required.
        model_info={
            "vision": False,
            "function_calling": True,
            "json_output": True,
            "family": "unknown",
        },
    )

    # Create a local code executor that will execute code on the local machine.
    # Note: There is no need to call start() or stop() on LocalCommandLineCodeExecutor.
    code_executor = LocalCommandLineCodeExecutor(work_dir="coding")

    # Create a CodeExecutorAgent that executes local code blocks.
    code_agent = CodeExecutorAgent("CodeExecutor", code_executor=code_executor)

    # Create an AssistantAgent specialized for Windows 11 settings changes.
    # The system message instructs the agent to generate safe code (Python or shell)
    # that will, for example, enable dark mode in Windows 11.
    assistant_agent = AssistantAgent(
        "WindowsAdmin",
        model_client=model_client,
        system_message=(
            "You are a Windows 11 system administrator. Your task is to generate code that modifies Windows 11 settings using local code execution. For example, enable dark mode by modifying the appropriate registry settings. When generating code, ensure that it is safe and non-destructive. Use the provided code execution tool to run your code and report the output."
        ),
    )

    # Assemble the team using RoundRobinGroupChat to act as a round-robin orchestrator.
    # Here, max_turns=5 allows for up to 5 rounds of interactions between agents.
    team = RoundRobinGroupChat([assistant_agent, code_agent], max_turns=5)

    # Define the specialized task: change a Windows 11 setting (e.g., enable dark mode).
    task = "Enable dark mode in Windows 11 using local code execution. Generate and run the code that applies the necessary registry changes."

    # Run the team using streaming output via the Console helper.
    result = await Console(team.run_stream(task=task))
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

Got this output:

---------- user ----------
Enable dark mode in Windows 11 using local code execution. Generate and run the code that applies the necessary registry changes.
---------- WindowsAdmin ----------
Error processing publish message for CodeExecutor/99c693ff-3e83-4d09-b624-41cae6becf7f
Traceback (most recent call last):
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 505, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 48, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_core\_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 53, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_agentchat\agents\_base_chat_agent.py", line 102, in on_messages_stream
    response = await self.on_messages(messages, cancellation_token)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_agentchat\agents\_code_executor_agent.py", line 94, in on_messages
    result = await self._code_executor.execute_code_blocks(code_blocks, cancellation_token=cancellation_token)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_ext\code_executors\local\__init__.py", line 261, in execute_code_blocks
    return await self._execute_code_dont_check_setup(code_blocks, cancellation_token)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_ext\code_executors\local\__init__.py", line 313, in _execute_code_dont_check_setup
    program = sys.executable if lang.startswith("python") else lang_to_cmd(lang)
                                                               ^^^^^^^^^^^^^^^^^
  File "C:\Users\delli\OneDrive\Documents\autogen-llama3.2\python\autogen\Lib\site-packages\autogen_ext\code_executors\_common.py", line 162, in lang_to_cmd
    raise ValueError(f"Unsupported language: {lang}")
ValueError: Unsupported language: powershell

The documentation at https://microsoft.github.io/autogen/stable/reference/python/autogen_ext.code_executors.local.html#autogen_ext.code_executors.local.LocalCommandLineCodeExecutor shows:

SUPPORTED_LANGUAGES: ClassVar[List[str]] = ['bash', 'shell', 'sh', 'pwsh', 'powershell', 'ps1', 'python']

But digging into autogen\Lib\site-packages\autogen_ext\code_executors\_common.py shows:

def lang_to_cmd(lang: str) -> str:
    if lang in PYTHON_VARIANTS:
        return "python"
    if lang.startswith("python") or lang in ["bash", "sh"]:
        return lang
    if lang in ["shell"]:
        return "sh"
    else:
        raise ValueError(f"Unsupported language: {lang}")
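A minimal sketch of how lang_to_cmd could be extended to honor the documented SUPPORTED_LANGUAGES (the POWERSHELL_VARIANTS name, the PYTHON_VARIANTS definition, and the choice of mapping everything to pwsh are assumptions for illustration, not the actual upstream fix):

```python
# Assumed definition; the real PYTHON_VARIANTS lives in _common.py.
PYTHON_VARIANTS = ["python", "Python", "py"]
# Hypothetical constant matching the PowerShell entries in SUPPORTED_LANGUAGES.
POWERSHELL_VARIANTS = ["pwsh", "powershell", "ps1"]

def lang_to_cmd(lang: str) -> str:
    if lang in PYTHON_VARIANTS:
        return "python"
    if lang.startswith("python") or lang in ["bash", "sh"]:
        return lang
    if lang in ["shell"]:
        return "sh"
    if lang in POWERSHELL_VARIANTS:
        # PowerShell 7+ ships as `pwsh`; Windows PowerShell 5.1 is `powershell`.
        # Mapping all variants to one executable is a simplification.
        return "pwsh"
    raise ValueError(f"Unsupported language: {lang}")
```

With this change, lang_to_cmd("powershell") would resolve to an executable name instead of raising, so _execute_code_dont_check_setup could run the generated PowerShell block.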

What did you expect to happen?

Dark mode enabled on Windows.

How can we reproduce it (as minimally and precisely as possible)?

# Install uv if not already installed

# Create a new environment named 'autogen' with Python 3.11
uv venv autogen --python=3.11

# Activate the environment (Windows PowerShell)
.\autogen\Scripts\Activate

# Install the AutoGen packages
pip install -U magentic-one-cli autogenstudio autogen-core

# Install Ollama: https://ollama.com/download

# Pull the model and test run
ollama run qwen2.5:32b-instruct-q6_K "Hello"

# Run the script above

AutoGen version

0.4.6

Which package was this bug in

Extensions

Model used

qwen2.5:32b-instruct-q6_K

Python version

3.11

Operating system

Windows 11 Pro Version 24H2

Any additional info you think would be helpful for fixing this bug

No response
