CrestApps Architecture Explained

Welcome to the Orchard Core AI ecosystem! Orchard Core is a highly modular, open-source Content Management System (CMS) and application framework built on ASP.NET Core.

The codebase you shared (crestapps.orchardcore) is a massive, highly advanced extension of Orchard Core that turns it into a full-fledged AI Agent framework. It allows you to integrate Large Language Models (LLMs), build Retrieval-Augmented Generation (RAG) pipelines, and create autonomous agents that can actually control the CMS.

Let's break down the architecture into understandable pieces.

1. High-Level Architecture Diagram

Here is a visual representation of how the modules interact. (The diagram renders if your Markdown viewer supports Mermaid.js.)

graph TD
    User((User)) <--> UI

    subgraph "1. Presentation & Interaction Layer"
        UI[AI Chat Widget / Copilot UI]
        Interactions[AI Chat Interactions]
        UI <--> Interactions
    end

    subgraph "2. Core Orchestration Layer"
        Core[AI Core Services & Profiles]
        Prompting[AI Prompting Engine]
        Interactions <--> Core
        Core <--> Prompting
    end

    subgraph "3. RAG & Memory System (Knowledge)"
        Memory[AI Memory]
        DataSources[AI Data Sources]
        Docs[AI Documents Ingestion]
        
        Core <--> Memory
        Core <--> DataSources
        Docs --> DataSources
        
        DataSources <--> AzureSearch[(Azure AI Search)]
        DataSources <--> ElasticSearch[(Elasticsearch)]
    end

    subgraph "4. Agent Tooling (Action)"
        Agents[AI Agent Core]
        A2A[A2A: Agent-to-Agent]
        MCP[MCP: Model Context Protocol]
        OCTools[CMS Tools: Manage Content/Users]

        Core <--> Agents
        Agents <--> A2A
        Agents <--> MCP
        Agents <--> OCTools
    end

    subgraph "5. LLM Providers (The Brains)"
        OpenAI[OpenAI]
        Azure[Azure OpenAI]
        Ollama[Ollama Local LLM]
        AzureInf[Azure AI Inference]

        Core --> OpenAI
        Core --> Azure
        Core --> Ollama
        Core --> AzureInf
    end

2. Module Breakdown & Relationships

To make sense of the dozens of folders, it helps to group them by their responsibility.

Layer 1: Presentation & Interaction (The Frontend)

  • CrestApps.OrchardCore.AI.Chat: Provides the actual chat widgets, user interfaces, and chat session history for users to talk to the AI.
  • CrestApps.OrchardCore.AI.Chat.Copilot: An admin-facing "Copilot" UI to help site administrators manage the CMS using natural language.
  • CrestApps.OrchardCore.AI.Chat.Interactions: Manages complex chat states, tools available to specific chats, and handles the routing of user messages.

Layer 2: The Core Engine (The Heart)

  • CrestApps.OrchardCore.AI: The foundational module. It defines what an AIProfile is (e.g., "Customer Support Bot" vs "Coding Assistant"), manages deployments, handles routing to different models, and tracks metrics.
  • CrestApps.OrchardCore.AI.Prompting: Manages how prompts are built. It uses Orchard Core's "Liquid" templating engine to inject dynamic data (like the user's name or current page context) into the prompt before sending it to the LLM.
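
To make these two pieces concrete, here is a heavily simplified sketch of the kind of profile object the core module works with and how a Liquid-templated system message fits into it. The class and property names below are illustrative assumptions, not the real `AIProfile` source:

```csharp
// Illustrative only: a simplified stand-in for the kind of profile the
// AI core module stores. Property names are assumptions, not the real class.
public class AIProfileSketch
{
    public string Name { get; set; }           // e.g. "Customer Support Bot"
    public string ProviderName { get; set; }   // e.g. "OpenAI", "Ollama"
    public string DeploymentName { get; set; } // e.g. "gpt-4o"

    // A Liquid template the prompting engine renders per request, e.g.:
    // "You are a helpful assistant. The current user is {{ User.Identity.Name }}."
    public string SystemMessage { get; set; }
}
```

The key idea is that the profile bundles *who the bot is* (system message), *which brain it uses* (provider/deployment), and *what it may do* (tools), so the rest of the pipeline only ever deals with a profile, never a hard-coded model.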

Layer 3: LLM Providers (The Brains)

These modules are "plugins" that implement the core interfaces to talk to specific AI companies. You can swap them out without changing the rest of your app.

  • CrestApps.OrchardCore.OpenAI: Talks to OpenAI's hosted models (GPT-4o, etc.).
  • CrestApps.OrchardCore.OpenAI.Azure: Talks to enterprise Azure-hosted OpenAI models.
  • CrestApps.OrchardCore.Ollama: Connects to locally hosted, open-source models (like Llama 3) via Ollama, allowing for free and private AI inference.
  • CrestApps.OrchardCore.AzureAIInference: Connects to Azure's broader model catalog (Mistral, Cohere, etc.).
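
The "swappable plugin" idea can be sketched as follows. The real contract is IAICompletionService in CrestApps.OrchardCore.AI.Core and its shape will differ; the interface and class below are simplified stand-ins for illustration only:

```csharp
using System.Threading;
using System.Threading.Tasks;

// Simplified stand-in for the provider contract (the real interface is
// IAICompletionService in the Core project and is richer than this).
public interface ICompletionClientSketch
{
    Task<string> CompleteAsync(string prompt, CancellationToken ct = default);
}

// Each provider module contributes one implementation, and dependency
// injection decides which one the rest of the app talks to, roughly:
//
//   services.AddScoped<ICompletionClientSketch, OllamaClientSketch>();
//
public sealed class OllamaClientSketch : ICompletionClientSketch
{
    public Task<string> CompleteAsync(string prompt, CancellationToken ct = default)
        => Task.FromResult($"(local model reply to: {prompt})");
}
```

Because callers depend only on the interface, switching from OpenAI to a local Ollama model is a registration change, not a code change.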

Layer 4: Data & Memory (RAG Pipeline)

Retrieval-Augmented Generation (RAG) allows the AI to read your private data before answering.

  • CrestApps.OrchardCore.AI.Documents: Reads and parses files (PDFs, Word Docs, Text) uploaded by users or admins.
  • CrestApps.OrchardCore.AI.DataSources: Chunks those documents and converts them into "embeddings" (numeric vectors).
  • CrestApps.OrchardCore.AI.DataSources.AzureAI / Elasticsearch: Saves those embeddings into specialized search databases so the AI can quickly "search" your documents when a user asks a question.
  • CrestApps.OrchardCore.AI.Memory: Gives the AI long-term memory about specific users so it can remember past conversations.
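
Conceptually, the ingestion side of this pipeline boils down to "split, embed, index." The sketch below shows that flow with hypothetical helper names (nothing here is the actual DataSources API):

```csharp
using System;
using System.Collections.Generic;

// Conceptual sketch of the ingestion path; a real chunker would split on
// sentence or token boundaries rather than raw character counts.
public static IEnumerable<string> Chunk(string text, int maxChars = 1000)
{
    for (var i = 0; i < text.Length; i += maxChars)
        yield return text.Substring(i, Math.Min(maxChars, text.Length - i));
}

// Each chunk is then embedded and written to the search index, roughly:
//
//   float[] vector = await embeddingClient.EmbedAsync(chunk);   // hypothetical
//   await searchIndex.UpsertAsync(chunk, vector);               // hypothetical
```

At query time the process runs in reverse: the user's question is embedded the same way, the index returns the nearest chunks, and those chunks are pasted into the prompt.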

Layer 5: Agents & Action (Making the AI DO things)

Instead of just chatting, the AI can execute code.

  • CrestApps.OrchardCore.AI.Agent: The most powerful module. It registers "Tools" (C# functions) with the LLM. If you look in this folder, you'll see tools like CreateOrUpdateContentTool, SendEmailTool, EnableFeatureTool. This means the AI can literally publish blog posts, send emails, or turn on CMS features if you ask it to!
  • CrestApps.OrchardCore.AI.A2A: Agent-to-Agent communication. Allows one AI (e.g., a Planning Agent) to delegate tasks to another AI (e.g., a Coding Agent).
  • CrestApps.OrchardCore.AI.Mcp: Model Context Protocol integration. This is an open standard that allows the AI to securely access local files, FTP servers, databases, or external APIs.
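
A "tool," in this sense, is essentially an annotated C# method whose description the LLM can read when deciding what to call. The class below is a hypothetical example written in that style, not the actual PublishContentTool source:

```csharp
using System.ComponentModel;
using System.Threading.Tasks;

// Hypothetical tool in the style described above. The [Description] text
// is what the LLM sees when deciding whether (and how) to call the tool.
public sealed class PublishContentToolSketch
{
    [Description("Publishes the content item with the given ID.")]
    public Task<string> PublishAsync(
        [Description("The content item ID")] string contentItemId)
    {
        // A real tool would resolve Orchard Core's IContentManager here
        // and publish the item; this sketch just reports success.
        return Task.FromResult($"Published {contentItemId}.");
    }
}
```

The descriptions matter as much as the code: they are the only documentation the model has, so vague descriptions lead directly to wrong tool calls.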

3. How a Request Flows (Example)

To tie it all together, here is what happens when a user types "Summarize my last uploaded PDF and email it to my boss" into the chat widget:

  1. AI.Chat receives the message from the user's browser.
  2. AI.Chat.Interactions looks up the current AIProfile to see what this specific bot is allowed to do.
  3. AI.Prompting wraps the user's message in system instructions ("You are a helpful assistant...").
  4. AI.DataSources searches Elasticsearch or Azure AI Search to find the text of the "last uploaded PDF" and injects it into the prompt.
  5. AI (Core) sends the giant prompt, along with a list of available tools (from AI.Agent), to CrestApps.OrchardCore.OpenAI.
  6. OpenAI replies: "I need to use the SendEmailTool with these arguments."
  7. AI.Agent intercepts this, runs the C# code to send the email, and tells OpenAI it succeeded.
  8. OpenAI generates a final text response: "I have summarized the PDF and sent the email."
  9. AI.Chat displays this message to the user.
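
Steps 5 through 8 form a loop, which stripped to its essentials looks roughly like this. Every type and member here is illustrative, not the codebase's actual API:

```csharp
// Sketch of the loop behind steps 5-8: send the prompt plus tool list,
// run any tool the model requests, feed the result back, and repeat
// until the model answers in plain text.
while (true)
{
    var reply = await client.CompleteAsync(messages, tools);     // steps 5-6

    if (reply.ToolCall is null)
        return reply.Text;                                       // step 8: final answer

    var result = await toolRunner.InvokeAsync(reply.ToolCall);   // step 7: run the C# code
    messages.Add(ToolResult(reply.ToolCall, result));            // tell the model it succeeded
}
```

Note that the model never executes anything itself: it only *asks* for a tool call, and the Agent module decides whether to run it, which is where permissions and the AIProfile's allowed-tool list come in.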

4. Tips for Beginners Starting in this Codebase

If you are just starting to read the code, don't get overwhelmed by the sheer number of projects. Follow this path:

  1. Start in the Core/CrestApps.OrchardCore.AI.Core project: Look at IAICompletionService.cs and Models/AICompletionContext.cs. This is the basic contract for asking the AI a question.
  2. Look at the Providers: Open Modules/CrestApps.OrchardCore.OpenAI. See how OpenAICompletionClient.cs implements the Core interfaces. It's a great, simple example of how the abstract architecture connects to reality.
  3. Look at the Tools: Open Modules/CrestApps.OrchardCore.AI.Agent/Contents/PublishContentTool.cs. This will show you exactly how Orchard Core registers a C# function so that ChatGPT knows it exists and can call it.