
agentic_app.py

The agentic_app.py module in orchestrator-core runs the Agentic Workflow Orchestrator (WFO) FastAPI backend and exposes the CLI.
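As a minimal sketch of how the class documented below might be wired into an application (assuming the same main.py entrypoint pattern that orchestrator-core documents for OrchestratorCore, with AppSettings and the core CLI imported from their usual locations):

from orchestrator.agentic_app import AgenticOrchestratorCore
from orchestrator.cli.main import app as core_cli
from orchestrator.settings import AppSettings

# Create the FastAPI application; the agent endpoint is mounted under /agent
# during __init__ (see AgenticOrchestratorCore below).
app = AgenticOrchestratorCore(base_settings=AppSettings())

if __name__ == "__main__":
    # Running this file directly exposes the orchestrator CLI.
    core_cli()

The resulting app object can then be served with any ASGI server, for example uvicorn main:app.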

FastAPI Backend

The code for the WFO's FastAPI backend is well documented; look through the functions used in this module here:

orchestrator.agentic_app

The main application module.

This module contains the main AgenticOrchestratorCore class for the FastAPI backend and provides the ability to run the CLI.

AgenticOrchestratorCore

Bases: orchestrator.app.OrchestratorCore

Source code in orchestrator/agentic_app.py
class AgenticOrchestratorCore(OrchestratorCore):
    def __init__(
        self,
        *args: Any,
        llm_model: OpenAIModel | str = "gpt-4o-mini",
        llm_settings: LLMSettings = llm_settings,
        agent_tools: list[FunctionToolset] | None = None,
        **kwargs: Any,
    ) -> None:
        """Initialize the `AgenticOrchestratorCore` class.

        This class takes the same arguments as the `OrchestratorCore` class.

        Args:
            *args: All the normal arguments passed to the `OrchestratorCore` class.
            llm_model: An OpenAI model class or string, not limited to OpenAI models (gpt-4o-mini etc)
            llm_settings: A class of settings for the LLM
            agent_tools: A list of tools that can be used by the agent
            **kwargs: Additional arguments passed to the `OrchestratorCore` class.

        Returns:
            None
        """
        self.llm_model = llm_model
        self.agent_tools = agent_tools
        self.llm_settings = llm_settings

        super().__init__(*args, **kwargs)

        logger.info("Mounting the agent")
        self.register_llm_integration()

    def register_llm_integration(self) -> None:
        """Mount the Agent endpoint.

        This helper mounts the agent endpoint on the application.

        Returns:
            None

        """
        from orchestrator.search.agent import build_agent_app

        agent_app = build_agent_app(self.llm_model, self.agent_tools)
        self.mount("/agent", agent_app)
__init__
__init__(
    *args: typing.Any,
    llm_model: pydantic_ai.models.openai.OpenAIModel | str = "gpt-4o-mini",
    llm_settings: orchestrator.llm_settings.LLMSettings = llm_settings,
    agent_tools: list[pydantic_ai.toolsets.FunctionToolset] | None = None,
    **kwargs: typing.Any
) -> None

Initialize the AgenticOrchestratorCore class.

This class takes the same arguments as the OrchestratorCore class.

Parameters:

  • *args (typing.Any, default: () ) –

    All the normal arguments passed to the OrchestratorCore class.

  • llm_model (pydantic_ai.models.openai.OpenAIModel | str, default: 'gpt-4o-mini' ) –

    An OpenAI model class or string, not limited to OpenAI models (gpt-4o-mini etc)

  • llm_settings (orchestrator.llm_settings.LLMSettings, default: the module-level llm_settings instance ) –

    A class of settings for the LLM

  • agent_tools (list[pydantic_ai.toolsets.FunctionToolset] | None, default: None ) –

    A list of tools that can be used by the agent

  • **kwargs (typing.Any, default: {} ) –

    Additional arguments passed to the OrchestratorCore class.

Returns:

  • None

Source code in orchestrator/agentic_app.py
def __init__(
    self,
    *args: Any,
    llm_model: OpenAIModel | str = "gpt-4o-mini",
    llm_settings: LLMSettings = llm_settings,
    agent_tools: list[FunctionToolset] | None = None,
    **kwargs: Any,
) -> None:
    """Initialize the `AgenticOrchestratorCore` class.

    This class takes the same arguments as the `OrchestratorCore` class.

    Args:
        *args: All the normal arguments passed to the `OrchestratorCore` class.
        llm_model: An OpenAI model class or string, not limited to OpenAI models (gpt-4o-mini etc)
        llm_settings: A class of settings for the LLM
        agent_tools: A list of tools that can be used by the agent
        **kwargs: Additional arguments passed to the `OrchestratorCore` class.

    Returns:
        None
    """
    self.llm_model = llm_model
    self.agent_tools = agent_tools
    self.llm_settings = llm_settings

    super().__init__(*args, **kwargs)

    logger.info("Mounting the agent")
    self.register_llm_integration()
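As a hedged example of the constructor arguments above, the sketch below passes a non-default model and a custom toolset; the count_vowels tool is purely illustrative, and the decorator usage assumes pydantic-ai's FunctionToolset.tool API:

from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.toolsets import FunctionToolset

from orchestrator.agentic_app import AgenticOrchestratorCore
from orchestrator.settings import AppSettings

# Hypothetical toolset exposing one extra function to the agent.
custom_tools = FunctionToolset()

@custom_tools.tool
def count_vowels(text: str) -> int:
    """Count the vowels in a piece of text (illustrative tool only)."""
    return sum(ch in "aeiou" for ch in text.lower())

app = AgenticOrchestratorCore(
    base_settings=AppSettings(),
    llm_model=OpenAIModel("gpt-4o"),  # or simply a model name string
    agent_tools=[custom_tools],
)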
register_llm_integration
register_llm_integration() -> None

Mount the Agent endpoint.

This helper mounts the agent endpoint on the application.

Returns:

  • None

Source code in orchestrator/agentic_app.py
def register_llm_integration(self) -> None:
    """Mount the Agent endpoint.

    This helper mounts the agent endpoint on the application.

    Returns:
        None

    """
    from orchestrator.search.agent import build_agent_app

    agent_app = build_agent_app(self.llm_model, self.agent_tools)
    self.mount("/agent", agent_app)
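Because register_llm_integration is an ordinary method called at the end of __init__, a subclass can override it, for example to mount the agent sub-application under a different path. A minimal sketch (the /assistant path is illustrative):

from orchestrator.agentic_app import AgenticOrchestratorCore


class CustomAgenticOrchestratorCore(AgenticOrchestratorCore):
    def register_llm_integration(self) -> None:
        # Same wiring as the parent class, but mounted under an
        # illustrative /assistant path instead of the default /agent.
        from orchestrator.search.agent import build_agent_app

        agent_app = build_agent_app(self.llm_model, self.agent_tools)
        self.mount("/assistant", agent_app)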