While I cannot fetch the exact raw C# code from that specific GitHub commit in real-time, I am very familiar with the CrestApps.OrchardCore architecture and the CrestApps.OrchardCore.AI suite.
Based on the ecosystem's architecture and recent module updates, here is a comprehensive guide to understanding the AIProfile.cs class, its purpose, and how it fits into the Orchard Core framework.
1. What is AIProfile.cs?
In the CrestApps AI suite, AIProfile is a central data model (typically a POCO class stored via Orchard Core's YesSql document database) that defines how an AI model behaves and interacts with the user or system.
Its primary architectural goal is decoupling. It separates the underlying AI client (like Azure OpenAI, DeepSeek, or a local Ollama instance) from the specific "persona" or task configuration you are trying to run. This allows administrators to create multiple different bots (e.g., a "Customer Support Bot" and a "Code Review Utility") that use the same underlying API connection but have entirely different behaviors.
2. Core Concepts & Expected Structure
If you look at the properties inside AIProfile.cs, they map directly to the configuration options available in the Orchard Core Admin UI. You will typically find:
- Identity Properties:
  - `Id`/`ProfileId`: The unique identifier for the database record.
  - `Name`/`DisplayText`: The human-readable name of the profile.
  - `TechnicalName`: An auto-generated, code-friendly string used to reference this profile in C# code, Liquid templates, or Orchard Core Workflows.
- Profile Type:
  - Often an enum or string indicating whether this is a Chat (multi-turn conversation), a Utility (one-off backend processing), or a Template Generated Prompt (using Orchard Core's Liquid syntax for dynamic context).
- Provider & Deployment Links:
  - Properties that link the profile to a specific AI Deployment or Connection (e.g., `ConnectionId` or `DeploymentName`). These tell the profile which LLM to send the prompt to.
- Model Parameters:
  - Settings that control the LLM's output, such as `MaxOutputTokens` (a feature recently explicitly exposed in the CrestApps AI profile), `Temperature`, or `TopP`.
- System Prompts & Capabilities:
  - `SystemMessage`: The foundational instructions for the AI (e.g., "You are a helpful assistant").
  - `Functions`/`Tools`: A collection of strings or references to custom `AIFunction` instances. This allows the profile to use external tools (like checking the weather or querying a database).
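Putting those pieces together, a trimmed-down version of the class might look roughly like the sketch below. The property names mirror the configuration options described above; they are assumptions for illustration and may not match the actual source file exactly.

```csharp
// Hypothetical sketch of the AIProfile document model.
// The real class in CrestApps.OrchardCore.AI may differ in names and shape.
public enum AIProfileType
{
    Chat,            // multi-turn conversation
    Utility,         // one-off backend processing
    TemplatePrompt   // Liquid-templated dynamic prompt
}

public class AIProfile
{
    // Identity
    public string Id { get; set; }
    public string Name { get; set; }
    public string DisplayText { get; set; }
    public string TechnicalName { get; set; }

    // Behavior
    public AIProfileType Type { get; set; }

    // Provider / deployment links
    public string ConnectionId { get; set; }
    public string DeploymentName { get; set; }

    // Model parameters
    public int? MaxOutputTokens { get; set; }
    public float? Temperature { get; set; }
    public float? TopP { get; set; }

    // Prompting and capabilities
    public string SystemMessage { get; set; }
    public string[] FunctionNames { get; set; } = [];
}
```

Because it is a plain POCO, YesSql can serialize it as a document without any special mapping, which is why the profile can grow new settings (like `MaxOutputTokens`) without schema migrations.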
3. How It Works in the CrestApps Pipeline
Understanding AIProfile.cs requires understanding how the application uses it. Here is the typical lifecycle of an AI request using this abstraction:
- Resolution: A user sends a message via the UI, or a system event triggers an Orchard Core Workflow. The system requests a specific `AIProfile` by its `TechnicalName` or `Id`.
- Hydration: The `AIProfile` is retrieved from the database.
- Client Mapping: The Orchard Core service layer looks at the profile's configuration to determine which AI provider to use (e.g., the OpenAI client vs. the Azure client).
- Execution: The system combines the user's input, the profile's `SystemMessage`, and the profile's attached functions, then sends the payload to the LLM.
- Response: The LLM's response is formatted based on the profile type (e.g., returning a chat bubble for a Chat profile, or a raw string for a Utility profile).
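In code, that lifecycle might look something like the following sketch. `IAIProfileManager` and its method names here are assumptions for illustration; check the actual abstractions package for the real interfaces and signatures.

```csharp
// Hypothetical usage sketch; the interface and method names are assumed,
// not taken from the actual CrestApps.OrchardCore.AI source.
public class SupportChatService
{
    private readonly IAIProfileManager _profileManager;

    public SupportChatService(IAIProfileManager profileManager)
        => _profileManager = profileManager;

    public async Task<string> AskAsync(string userMessage)
    {
        // Resolution + hydration: load the profile by its technical name.
        var profile = await _profileManager.FindByTechnicalNameAsync("CustomerSupportBot");

        // Client mapping + execution are handled by the service layer:
        // it picks the provider from the profile's connection/deployment,
        // prepends the SystemMessage, attaches functions, and calls the LLM.
        var completion = await _profileManager.CompleteAsync(profile, userMessage);

        // Response: for a Chat profile this would be the assistant's reply text.
        return completion;
    }
}
```

The key point is that the calling code never touches a provider SDK directly; swapping the profile from Azure OpenAI to Ollama requires no code change.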
4. Guide for Developers
If you are working with or extending AIProfile.cs in your own module, keep these best practices in mind:
- Don't Hardcode AI Logic: Always rely on `AIProfile` for configuration. If you need a new AI task, create a new profile in the CMS rather than hardcoding prompts in your C# services.
- Use the Abstractions: Because this class lives in `CrestApps.OrchardCore.AI.Abstractions`, you should interact with it via dependency injection. Look for interfaces like `IAIProfileManager` or `IAIProfileStore` to fetch, update, or execute profiles.
- Extending with Functions: If you are building custom tools (like an API fetcher), you don't need to modify `AIProfile.cs`. Instead, inherit from `AIFunction`, register it in your `Startup.cs`, and attach it to the profile via the Orchard Core admin dashboard.
- MCP (Model Context Protocol): CrestApps recently added MCP support. `AIProfile` acts as the bridge that connects your chosen LLM to local or remote MCP servers, giving your profile access to real-time, external context without changing the core model.
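As a rough illustration of the "Extending with Functions" point, a custom tool and its registration might look like the sketch below. The `AIFunction` base-class shape and the registration helper are assumptions for illustration only; consult the module's documentation for the exact API.

```csharp
// Hypothetical custom tool; the base-class members and the
// AddAIFunction registration helper are assumed, not confirmed.
public class WeatherFunction : AIFunction
{
    public override string Name => "get_weather";
    public override string Description => "Returns the current weather for a city.";

    protected override Task<object> InvokeAsync(IDictionary<string, object> arguments)
    {
        var city = arguments["city"]?.ToString() ?? "unknown";
        // A real implementation would call an external weather API here.
        return Task.FromResult<object>($"Sunny in {city}");
    }
}

// In your module's Startup.cs: expose the function so administrators
// can attach it to any AIProfile from the admin dashboard.
public override void ConfigureServices(IServiceCollection services)
{
    services.AddAIFunction<WeatherFunction>(); // assumed registration helper
}
```

Once registered, the function shows up as an attachable capability on the profile edit screen, so no profile code changes are needed to grant or revoke the tool.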