- Desktop: Click the desktop icon to start using it immediately.
- Web or Docker: Open http://localhost:3456/ in your browser after startup.
- API call: Developer-friendly and fully compatible with the OpenAI format; it streams output in real time and does not affect the upstream API's response speed. No changes to your calling code are needed:
```python
from openai import OpenAI

client = OpenAI(
    api_key="super-secret-key",
    base_url="http://localhost:3456/v1"
)
response = client.chat.completions.create(
    model="super-model",
    messages=[
        {"role": "user", "content": "What is Super Agent Party?"}
    ]
)
print(response.choices[0].message.content)
```
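  For the real-time output mentioned above, here is a minimal streaming sketch reusing the same client and placeholder model name; `stream=True` is standard OpenAI SDK usage, and this assumes the service relays streamed chunks as described:

```python
# Stream the reply chunk-by-chunk instead of waiting for the full response.
stream = client.chat.completions.create(
    model="super-model",
    messages=[{"role": "user", "content": "What is Super Agent Party?"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```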
- MCP call: After startup, you can invoke the local MCP service by adding the following to your MCP client's configuration file:
```json
{
  "mcpServers": {
    "super-agent-party": {
      "url": "http://127.0.0.1:3456/mcp"
    }
  }
}
```