Welcome to the Orchard Core AI ecosystem! Orchard Core is a highly modular, open-source Content Management System (CMS) and application framework built on ASP.NET Core.
The codebase you shared (crestapps.orchardcore) is a massive, highly advanced extension of Orchard Core that turns it into a full-fledged AI Agent framework. It allows you to integrate Large Language Models (LLMs), build Retrieval-Augmented Generation (RAG) pipelines, and create autonomous agents that can actually control the CMS.
Let's break down the architecture into understandable pieces.
1. High-Level Architecture Diagram
Here is a visual representation of how the modules interact. (You can view this diagram if your markdown viewer supports Mermaid.js)
```mermaid
graph TD
    User((User)) <--> UI

    subgraph "1. Presentation & Interaction Layer"
        UI[AI Chat Widget / Copilot UI]
        Interactions[AI Chat Interactions]
        UI <--> Interactions
    end

    subgraph "2. Core Orchestration Layer"
        Core[AI Core Services & Profiles]
        Prompting[AI Prompting Engine]
        Interactions <--> Core
        Core <--> Prompting
    end

    subgraph "3. RAG & Memory System (Knowledge)"
        Memory[AI Memory]
        DataSources[AI Data Sources]
        Docs[AI Documents Ingestion]
        Core <--> Memory
        Core <--> DataSources
        Docs --> DataSources
        DataSources <--> AzureSearch[(Azure AI Search)]
        DataSources <--> ElasticSearch[(Elasticsearch)]
    end

    subgraph "4. Agent Tooling (Action)"
        Agents[AI Agent Core]
        A2A[A2A: Agent-to-Agent]
        MCP[MCP: Model Context Protocol]
        OCTools[CMS Tools: Manage Content/Users]
        Core <--> Agents
        Agents <--> A2A
        Agents <--> MCP
        Agents <--> OCTools
    end

    subgraph "5. LLM Providers (The Brains)"
        OpenAI[OpenAI]
        Azure[Azure OpenAI]
        Ollama[Ollama Local LLM]
        AzureInf[Azure AI Inference]
        Core --> OpenAI
        Core --> Azure
        Core --> Ollama
        Core --> AzureInf
    end
```
2. Module Breakdown & Relationships
To make sense of the dozens of folders, it helps to group them by their responsibility.
Layer 1: Presentation & Interaction (The Frontend)
- **CrestApps.OrchardCore.AI.Chat**: Provides the actual chat widgets, user interfaces, and chat session history for users to talk to the AI.
- **CrestApps.OrchardCore.AI.Chat.Copilot**: An admin-facing "Copilot" UI to help site administrators manage the CMS using natural language.
- **CrestApps.OrchardCore.AI.Chat.Interactions**: Manages complex chat states and the tools available to specific chats, and handles the routing of user messages.
Layer 2: The Core Engine (The Heart)
- **CrestApps.OrchardCore.AI**: The foundational module. It defines what an `AIProfile` is (e.g., "Customer Support Bot" vs. "Coding Assistant"), manages deployments, handles routing to different models, and tracks metrics.
- **CrestApps.OrchardCore.AI.Prompting**: Manages how prompts are built. It uses Orchard Core's Liquid templating engine to inject dynamic data (such as the user's name or the current page context) into the prompt before it is sent to the LLM.
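To make the Liquid idea concrete, a system-prompt template might look something like the sketch below. This is an illustrative fragment, not a template shipped by the module; the variable names are assumptions based on common Orchard Core Liquid conventions.

```liquid
You are a helpful assistant for {{ Site.SiteName }}.
The current user is {{ User.Identity.Name }}.
{% if Request.Path contains "/blog" %}
The user is currently browsing the blog section.
{% endif %}
Answer questions using the site's content where possible.
```

At render time the engine substitutes live values, so a single profile can produce a personalized system prompt for every request.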
Layer 3: LLM Providers (The Brains)
These modules are "plugins" that implement the core interfaces to talk to specific AI companies. You can swap them out without changing the rest of your app.
- **CrestApps.OrchardCore.OpenAI**: Talks to OpenAI's models (GPT-4o, etc.).
- **CrestApps.OrchardCore.OpenAI.Azure**: Talks to enterprise Azure-hosted OpenAI models.
- **CrestApps.OrchardCore.Ollama**: Connects to locally hosted, open-source models (like Llama 3) via Ollama, allowing for free and private AI inference.
- **CrestApps.OrchardCore.AzureAIInference**: Connects to Azure's broader model catalog (Mistral, Cohere, etc.).
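The swap-ability comes from coding against a shared abstraction. As a rough sketch of the pattern (the interface below is simplified and hypothetical, not the actual CrestApps contract, which lives in the Core project):

```csharp
// Hypothetical, simplified completion abstraction illustrating the pattern.
public interface IAICompletionClient
{
    string ProviderName { get; }

    Task<string> CompleteAsync(string prompt, CancellationToken token = default);
}

// Each provider module supplies its own implementation...
public sealed class FakeOllamaClient : IAICompletionClient
{
    public string ProviderName => "Ollama";

    public Task<string> CompleteAsync(string prompt, CancellationToken token = default)
        => Task.FromResult($"[{ProviderName}] response to: {prompt}");
}

// ...and registers it with the DI container in its module Startup, e.g.:
// services.AddScoped<IAICompletionClient, FakeOllamaClient>();
```

Because consumers resolve the interface and never reference a concrete provider, a site can move from OpenAI to a local Ollama model without touching the rest of the application.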
Layer 4: Data & Memory (RAG Pipeline)
Retrieval-Augmented Generation (RAG) allows the AI to read your private data before answering.
- **CrestApps.OrchardCore.AI.Documents**: Reads and parses files (PDFs, Word documents, plain text) uploaded by users or admins.
- **CrestApps.OrchardCore.AI.DataSources**: Chunks those documents and converts the chunks into "embeddings" (numeric vectors).
- **CrestApps.OrchardCore.AI.DataSources.AzureAI / .Elasticsearch**: Saves those embeddings into specialized search databases so the AI can quickly "search" your documents when a user asks a question.
- **CrestApps.OrchardCore.AI.Memory**: Gives the AI long-term memory about specific users so it can remember past conversations.
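Conceptually, the ingestion side of this pipeline boils down to chunk, embed, store. Here is a toy sketch of that flow; the types and chunk sizes are illustrative, and the real modules delegate embedding to the configured LLM provider and storage to Azure AI Search or Elasticsearch:

```csharp
// Toy RAG ingestion: split a document into overlapping chunks,
// embed each chunk, and keep the vectors for later similarity search.
public sealed record Chunk(string Text, float[] Embedding);

public static class ToyIngestion
{
    // Overlap between chunks preserves context that would otherwise
    // be cut in half at chunk boundaries.
    public static IEnumerable<string> Split(string text, int size = 500, int overlap = 50)
    {
        for (var start = 0; start < text.Length; start += size - overlap)
        {
            yield return text.Substring(start, Math.Min(size, text.Length - start));
        }
    }

    public static List<Chunk> Ingest(string document, Func<string, float[]> embed)
        => Split(document).Select(t => new Chunk(t, embed(t))).ToList();
}
```

At query time the user's question is embedded the same way, the nearest chunks are retrieved by vector similarity, and their text is injected into the prompt.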
Layer 5: Agents & Action (Making the AI DO things)
Instead of just chatting, the AI can execute code.
- **CrestApps.OrchardCore.AI.Agent**: The most powerful module. It registers "tools" (C# functions) with the LLM. If you look in this folder, you'll see tools like `CreateOrUpdateContentTool`, `SendEmailTool`, and `EnableFeatureTool`. This means the AI can literally publish blog posts, send emails, or turn on CMS features if you ask it to!
- **CrestApps.OrchardCore.AI.A2A**: Agent-to-Agent communication. Allows one AI (e.g., a planning agent) to delegate tasks to another AI (e.g., a coding agent).
- **CrestApps.OrchardCore.AI.Mcp**: Model Context Protocol (MCP) integration. MCP is an open standard that allows the AI to securely access local files, FTP servers, databases, or external APIs.
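In essence, a "tool" is a C# method plus machine-readable metadata (a name, a description, and a parameter schema) that is advertised to the LLM. A stripped-down, hypothetical version of such a tool might look like this; the real tools in `CrestApps.OrchardCore.AI.Agent` follow the framework's own base classes and signatures:

```csharp
// Hypothetical sketch: the metadata the LLM sees, plus the code it triggers.
public sealed class PublishContentToolSketch
{
    // Advertised to the model as part of the available-tools list.
    public string Name => "publish_content";
    public string Description => "Publishes the content item with the given ID.";

    // Invoked by the agent layer when the model requests this tool.
    public Task<string> InvokeAsync(string contentItemId)
    {
        // A real implementation would resolve IContentManager from DI
        // and publish the item; here we only report what would happen.
        return Task.FromResult($"Published content item '{contentItemId}'.");
    }
}
```

Note that the LLM never executes code itself: it only returns "call `publish_content` with these arguments", and the agent layer validates the request, runs the method, and feeds the result back into the conversation.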
3. How a Request Flows (Example)
To tie it all together, here is what happens when a user types "Summarize my last uploaded PDF and email it to my boss" into the chat widget:
1. `AI.Chat` receives the message from the user's browser.
2. `AI.Chat.Interactions` looks up the current `AIProfile` to see what this specific bot is allowed to do.
3. `AI.Prompting` wraps the user's message in system instructions ("You are a helpful assistant...").
4. `AI.DataSources` searches Elasticsearch/Azure AI Search to find the text of the "last uploaded PDF" and injects it into the prompt.
5. `AI` (Core) sends the assembled prompt, along with a list of available tools (from `AI.Agent`), to `CrestApps.OrchardCore.OpenAI`.
6. OpenAI replies: "I need to use the `SendEmailTool` with these arguments."
7. `AI.Agent` intercepts this, runs the C# code to send the email, and tells OpenAI it succeeded.
8. OpenAI generates a final text response: "I have summarized the PDF and sent the email."
9. `AI.Chat` displays this message to the user.
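The back-and-forth in steps 5 through 8 is the standard function-calling loop. A schematic version of it, with provider SDK details abstracted away behind illustrative delegates (the method names here are assumptions, not the framework's API):

```csharp
// Schematic function-calling loop: keep asking the model until it stops
// requesting tools and produces a final answer.
public static async Task<string> RunAgentLoopAsync(
    Func<List<string>, Task<(string? ToolName, string? Arguments, string? FinalText)>> askModel,
    Func<string, string, Task<string>> runTool)
{
    var transcript = new List<string>();

    while (true)
    {
        var (toolName, arguments, finalText) = await askModel(transcript);

        if (toolName is null)
        {
            return finalText!; // The model produced its final answer.
        }

        // The model requested a tool: execute it and record the result
        // so the next model call can see what happened.
        var result = await runTool(toolName, arguments ?? "{}");
        transcript.Add($"{toolName} => {result}");
    }
}
```

Guarding this loop (maximum iterations, permission checks before each tool call) is what keeps an autonomous agent from running away with your CMS.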
4. Tips for Beginners Starting in this Codebase
If you are just starting to read the code, don't get overwhelmed by the sheer number of projects. Follow this path:
1. **Start in the `Core/CrestApps.OrchardCore.AI.Core` project.** Look at `IAICompletionService.cs` and `Models/AICompletionContext.cs`. This is the basic contract for asking the AI a question.
2. **Look at the providers.** Open `Modules/CrestApps.OrchardCore.OpenAI` and see how `OpenAICompletionClient.cs` implements the Core interfaces. It's a great, simple example of how the abstract architecture connects to reality.
3. **Look at the tools.** Open `Modules/CrestApps.OrchardCore.AI.Agent/Contents/PublishContentTool.cs`. This will show you exactly how Orchard Core registers a C# function so that the LLM knows it exists and can call it.