diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 0f43dff7..13a74cfe 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -52,7 +52,7 @@ repos:
- id: oca-gen-addon-readme
args:
- --addons-dir=.
- - --branch=18.0
+ - --branch=17.0
- --org-name=OCA
- --repo-name=server-env
- --if-source-changed
diff --git a/CLAUDE.md b/CLAUDE.md
index 53742b12..61400c95 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -1,358 +1,251 @@
-# Odoo LLM Integration Modules - Project Context
+# CLAUDE.md
-## Project Overview
+This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
-This is a comprehensive suite of Odoo modules for integrating Large Language Models (LLMs) with Odoo ERP. The modules provide AI-powered features, knowledge management, and various LLM provider integrations.
+## Overview
-## Current Status
+This repository provides a comprehensive framework for integrating Large Language Models (LLMs) into Odoo. It enables seamless interaction with various AI providers (OpenAI, Anthropic, Ollama, Mistral, Replicate, FAL.ai, LiteLLM) for chat completions, embeddings, RAG (Retrieval-Augmented Generation), and content generation.
-- **Current Version**: Odoo 16.0
-- **Target Version**: Odoo 18.0
-- **Migration Status**: In Progress
-- **Main Branch**: 16.0
-- **Migration Branch**: 18.0-migration
-
-## Module Architecture
-
-### Core Modules
-
-1. **llm** - Base module providing core LLM functionality, models, and providers
-2. **llm_thread** - Thread management for LLM conversations
-3. **llm_tool** - Tool management and consent configuration for LLM operations
-4. **llm_assistant** - Assistant functionality with prompts, categories, and tags
-
-### Provider Modules
-
-- **llm_anthropic** - Anthropic Claude integration
-- **llm_openai** - OpenAI GPT integration
-- **llm_mistral** - Mistral AI integration
-- **llm_ollama** - Ollama local LLM integration
-- **llm_litellm** - LiteLLM proxy integration
-- **llm_replicate** - Replicate API integration
-- **llm_fal_ai** - Fal.ai integration
-
-### Knowledge Management
-
-- **llm_knowledge** - Core knowledge base with chunking and RAG
-- **llm_knowledge_automation** - Automated knowledge collection
-- **llm_knowledge_llama** - Llama-specific knowledge features
-- **llm_knowledge_mistral** - Mistral-specific knowledge features
-- **llm_tool_knowledge** - Tool-knowledge integration
-
-### Vector Storage
-
-- **llm_pgvector** - PostgreSQL vector storage
-- **llm_chroma** - Chroma vector database integration
-- **llm_qdrant** - Qdrant vector database integration
-
-### Generation & Processing
-
-- **llm_generate** - Content generation features
-- **llm_generate_job** - Job queue for generation tasks
-- **llm_training** - Training dataset management
-- **llm_comfyui** - ComfyUI integration
-- **llm_comfy_icu** - ComfyICU integration
-
-### Additional Features
-
-- **llm_document_page** - Document page integration
-- **llm_mcp** - Model Context Protocol server
-- **llm_store** - LLM marketplace/store functionality
-- **web_json_editor** - JSON editor widget
-
-## Migration to Odoo 18.0 - Key Changes
-
-### Critical Breaking Changes
-
-1. **tree → list**: All `<tree>` tags must be renamed to `<list>`
-2. **attrs → direct attributes**: Convert domain syntax to Python expressions
-3. **states → invisible**: Button states attribute replaced with invisible
-4. **name_get() → \_compute_display_name()**: Display name computation changed
-5. **message_format() removed**: Use Store system with `_to_store()` method instead
-6. **Registry import**: Use `from odoo.modules.registry import Registry` not `from odoo import registry`
-
-### Module-Specific Migration Requirements
-
-#### High Priority (Core + Heavy UI)
-
-- **llm**: Update manifest, migrate views (4 view files)
-- **llm_thread**: Migrate tree views in thread views
-- **llm_tool**: Migrate consent config and tool views
-- **llm_assistant**: Multiple view files with tree tags
-- **llm_knowledge**: Complex module with multiple views and wizards
-
-#### Medium Priority (Feature Modules)
-
-- **llm_mcp**: Has attrs attributes that need conversion
-- **llm_training**: Dataset and job views need migration
-- **llm_generate_job**: Queue and job views
-- **llm_pgvector**: Embedding views
-- **llm_store**: Store views
-- **llm_document_page**: Wizard attrs attributes
-- **llm_litellm**: Provider views with attrs
-
-#### Low Priority (Manifest Only)
-
-Provider modules with minimal UI:
-
-- llm_anthropic, llm_openai, llm_mistral, llm_ollama
-- llm_replicate, llm_fal_ai, llm_comfy_icu, llm_comfyui
-- llm_generate, llm_chroma, llm_qdrant
-- llm_knowledge_llama, llm_knowledge_mistral, llm_tool_knowledge
-
-## Testing Strategy
-
-1. Run individual module tests after each migration
-2. Test inter-module dependencies
-3. Validate all view rendering
-4. Check all workflows and actions
-5. Verify API compatibility
-
-## Code Quality Standards
-
-- Python 3.11+ compatibility
-- Ruff for linting and formatting
-- Pre-commit hooks configured
-- Type hints where applicable
+**Key Context**: This is the `17.0-backport` branch, backported from the main `18.0` branch. The `backport.md` file tracks migration status.
## Development Commands
-### Testing
+### Code Quality & Linting
```bash
-# Run all tests
-./run_tests.sh
-
-# Test specific module
-odoo-bin --test-enable --stop-after-init --test-tags=llm -d test_db -u llm
-```
+# Run pre-commit hooks (includes ruff, prettier, eslint)
+pre-commit run --all-files
-### Code Quality
+# Format Python code with ruff
+ruff format .
-```bash
-# Format and lint
-ruff format . && ruff check . --fix --unsafe-fixes
+# Lint and auto-fix Python code
+ruff check --fix .
-# Pre-commit
-pre-commit run --all-files
+# Run ESLint on JavaScript files
+eslint --color --fix **/*.js
```
-## Migration Progress Tracking
-
-### ✅ Completed (18.0 Compatible)
+### Testing
-#### Core Modules - COMPLETED ✅
+**Note**: Limited test coverage exists (4 test files). Tests are located in module-specific `tests/` directories.
-1. **llm** - Base module providing core LLM functionality, models, and providers
+```bash
+# Run tests for a specific module (from Odoo root)
+odoo-bin -c odoo.conf -d <db_name> -i <module_name> --test-enable --stop-after-init
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
- - ✅ Core LLM provider and model management
+# Example: Test llm_assistant module
+odoo-bin -c odoo.conf -d test_db -i llm_assistant --test-enable --stop-after-init
+```
-2. **llm_thread** - Thread management for LLM conversations
+### Module Installation & Development
- - ✅ Migrated to Odoo 18.0 mail system architecture
- - ✅ Implemented proper `_init_messaging()` and `_thread_to_store()` methods
- - ✅ Fixed message handling (tool messages, empty message filtering, squashing)
- - ✅ Fixed HTML escaping issues in streaming messages
- - ✅ Updated thread header components with proper fetchData() patterns
- - ✅ Integrated with standard mail.store service patterns
+```bash
+# Install Python dependencies
+pip install -r requirements.txt
-3. **llm_tool** - Tool management and consent configuration for LLM operations
+# Restart Odoo after code changes (from Odoo directory)
+# Method depends on your Odoo setup - use supervisorctl, systemctl, or direct odoo-bin
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and tool configuration views
- - ✅ Tool consent and management functionality
+# Update module after changes (from Odoo shell or via UI)
+# Via command line:
+odoo-bin -c odoo.conf -d <db_name> -u <module_name>
+```
-4. **llm_assistant** - Assistant functionality with prompts and tools
- - ✅ Migrated assistant dropdown UI with full functionality
- - ✅ Implemented assistant selection and clearing
- - ✅ Fixed UI reactivity issues with proper context binding
- - ✅ Extended `_thread_to_store()` to handle assistant_id states
- - ✅ Clean separation from llm_thread module following DRY principles
+## Architecture
-#### Text/Chat Provider Modules - COMPLETED ✅
+### Core Module Structure
-1. **llm_openai** - OpenAI GPT integration
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
-2. **llm_anthropic** - Anthropic Claude integration
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
-3. **llm_mistral** - Mistral AI integration
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
-4. **llm_ollama** - Ollama local LLM integration
+The architecture centers on **five core modules**:
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
+1. **`llm`** (Foundation) - Base infrastructure, provider abstraction, models, enhanced messaging system
+2. **`llm_assistant`** (Intelligence) - AI assistants with integrated prompt templates
+3. **`llm_generate`** (Generation) - Unified content generation API (text, images, etc.)
+4. **`llm_tool`** (Actions) - Tool framework for LLM-Odoo interactions and function calling
+5. **`llm_store`** (Storage) - Vector store abstraction for embeddings and similarity search
-5. **llm_litellm** - LiteLLM proxy integration
- - ✅ Migrated to Odoo 18.0
- - ✅ Updated manifests and dependencies
+**Supporting Modules**:
+- **`llm_thread`** - Chat thread management with PostgreSQL advisory locking
+- **Provider modules** - `llm_openai`, `llm_anthropic`, `llm_ollama`, `llm_mistral`, `llm_replicate`, `llm_fal_ai`, `llm_litellm`
+- **Knowledge/RAG** - `llm_knowledge` (consolidated from `llm_resource`), `llm_knowledge_automation`, `llm_tool_knowledge`
+- **Vector stores** - `llm_chroma`, `llm_pgvector`, `llm_qdrant`
+- **Specialized** - `llm_mcp_server`, `llm_training`, `llm_generate_job`, `llm_document_page`, `llm_letta`
-### 🚧 In Progress
+### Key Design Patterns
-#### UI/UX Improvements
+#### Provider Dispatch Pattern
-- 🔄 Make LLM components responsive/mobile friendly
-- 🔄 Fix auto scrolling for new messages in thread
-- 🔄 Investigate `_to_store` pattern in mail module for future use
+Providers use dynamic method dispatch to route calls to service-specific implementations:
-### ⏳ Remaining Migration Tasks
+```python
+# In llm.provider model (llm/models/llm_provider.py)
+def _dispatch(self, method, *args, **kwargs):
+ """Dispatch to service-specific implementation"""
+ service_method = f"{self.service}_{method}" # e.g., "openai_chat"
+ return getattr(self, service_method)(*args, **kwargs)
+
+# Provider modules implement: {service}_{method}
+# Example in llm_openai: openai_chat(), openai_embedding(), openai_get_client()
+```
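+
+A hedged usage sketch of the dispatch call (`provider_id` and `messages` are placeholders; the signature is as shown above):
+
+```python
+# With a provider whose service is "openai", the generic call...
+provider = env["llm.provider"].browse(provider_id)
+result = provider._dispatch("chat", messages, stream=False)
+# ...resolves internally to provider.openai_chat(messages, stream=False)
+```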
-#### High Priority (Image Generation Providers)
+#### Enhanced Mail Message System
-- **llm_replicate** - Replicate API integration (image generation)
-- **llm_fal_ai** - Fal.ai integration (image generation)
-- **llm_comfyui** - ComfyUI integration (image workflows)
-- **llm_comfy_icu** - ComfyICU integration
+The base `llm` module extends `mail.message` with AI-specific capabilities:
-#### Medium Priority (Knowledge & Advanced Features)
+```python
+# llm/models/mail_message.py
+llm_role = fields.Selection([
+    ('user', 'User'),
+    ('assistant', 'Assistant'),
+    ('tool', 'Tool'),
+    ('system', 'System')
+], compute='_compute_llm_role', store=True, index=True)  # 10x faster queries
+
+body_json = fields.Json() # Structured data for tool messages
+```
-- **llm_knowledge** - Knowledge base with chunking and RAG
-- **llm_knowledge_automation** - Automated knowledge collection
-- **llm_mcp** - Model Context Protocol server
-- **llm_generate** - Content generation features
-- **llm_generate_job** - Job queue for generation tasks
-- **llm_training** - Training dataset management
+**Performance**: The indexed `llm_role` field eliminates expensive subtype lookups, providing ~10x performance improvement for message queries.
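+
+As an illustration, message history retrieval can filter on the stored field directly instead of joining on subtypes (the domain below is an example, not the module's exact query; `env` and `thread_id` are placeholders):
+
+```python
+# Example: fetch a thread's assistant/tool messages via the indexed llm_role column
+history = env["mail.message"].search(
+    [
+        ("model", "=", "llm.thread"),
+        ("res_id", "=", thread_id),
+        ("llm_role", "in", ["assistant", "tool"]),
+    ],
+    order="id",
+)
+```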
-#### Low Priority (Vector Storage & Extensions)
+#### Thread as Data Bridge
-- **llm_pgvector**, **llm_chroma**, **llm_qdrant** - Vector database integrations
-- **llm_document_page** - Document page integration
-- **llm_store** - LLM marketplace functionality
-- **web_json_editor** - JSON editor widget
+`llm.thread` serves as the central link between Odoo business data and AI conversations:
+- Inherits from `mail.thread` for message storage
+- Links to any Odoo record via standard `res_model`/`res_id` pattern
+- Supports multiple concurrent conversations per business record
+- PostgreSQL advisory locking prevents concurrent generation conflicts
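+
+A minimal sketch of the locking step, assuming a transaction-scoped advisory lock keyed on the thread id (the actual `llm_thread` implementation may differ):
+
+```python
+from odoo import models
+
+
+class LLMThread(models.Model):
+    _inherit = "llm.thread"
+
+    def _try_acquire_generation_lock(self):
+        """Return True if this transaction now holds the generation lock.
+
+        pg_try_advisory_xact_lock() is non-blocking and releases automatically
+        at commit/rollback, so a concurrent worker simply skips generation.
+        """
+        self.ensure_one()
+        self.env.cr.execute("SELECT pg_try_advisory_xact_lock(%s)", (self.id,))
+        return self.env.cr.fetchone()[0]
+```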
-## Future Architecture Improvements
+### Module Consolidation History
-### \_to_store Pattern Implementation
+Recent architecture improvements consolidated related functionality:
+- `llm_resource` → `llm_knowledge` (RAG + resource management)
+- `llm_prompt` → `llm_assistant` (prompt templates integrated into assistants)
+- `llm_mail_message_subtypes` → `llm` (message subtypes in base module)
-**Priority**: Medium
-**Investigation needed**: Study how Odoo's mail module implements `_to_store()` methods for different models.
+Migration scripts ensure backward compatibility and zero data loss.
-**Potential Implementation**:
+## File Organization
-- **llm.provider** - Standardize provider data serialization for frontend
-- **llm.model** - Consistent model data structure in mail.store
-- **llm.tool** - Tool data formatting for UI components
-- **llm.assistant** - Enhanced assistant data structure (already partially implemented)
+Each Odoo module follows standard structure:
-**Benefits**:
+```
+module_name/
+├── __manifest__.py # Module metadata, dependencies, assets
+├── __init__.py # Module initialization
+├── models/ # Python business logic
+│ ├── __init__.py
+│ └── model_name.py
+├── views/ # XML UI definitions
+│ └── model_views.xml
+├── security/
+│ ├── ir.model.access.csv # Access control lists
+│ └── security.xml # Security groups and rules
+├── data/ # Data files
+├── wizards/ # Transient models for wizards
+├── tests/ # Unit tests
+├── static/
+│ ├── src/ # JavaScript/CSS source
+│ │ ├── components/ # OWL components
+│ │ ├── services/ # JavaScript services
+│ │ └── patches/ # Patches to extend core
+│ └── description/ # Module images
+└── README.md # Module documentation
+```
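+
+A minimal `__manifest__.py` matching this layout might look like the sketch below (names, dependencies, and file lists are placeholders):
+
+```python
+{
+    "name": "LLM Example Module",
+    "version": "17.0.1.0.0",
+    "category": "Technical",
+    "depends": ["llm"],
+    "data": [
+        "security/ir.model.access.csv",
+        "views/model_views.xml",
+    ],
+    "assets": {
+        "web.assets_backend": [
+            "module_name/static/src/**/*",
+        ],
+    },
+    "license": "LGPL-3",
+    "installable": True,
+}
+```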
-- Consistent data format across all LLM models
-- Better integration with Odoo 18.0 mail.store patterns
-- Simplified frontend data access and reactivity
-- Reduced custom serialization logic
+## Working with This Codebase
-**Research Tasks**:
+### Adding a New AI Provider
-1. Analyze `mail.thread._to_store()` and related methods
-2. Study how different mail models extend the pattern
-3. Design unified approach for LLM model serialization
-4. Create base mixin for LLM models to inherit
+1. Create a new module `llm_<provider_name>` depending on `llm`
+2. Extend `llm.provider` model and implement service methods:
+   ```python
+   def _get_available_services(self):
+       return super()._get_available_services() + [('provider_name', 'Display Name')]
-## Known Issues
+   def provider_name_chat(self, messages, model=None, stream=False, **kwargs):
+       # Implementation
-- Some modules may have additional hidden dependencies
-- Vector storage modules might need special attention for data migration
-- Job queue modules need careful testing for async operations
+   def provider_name_embedding(self, texts, model=None):
+       # Implementation
-## Odoo 18.0 Mail System Architecture (IMPORTANT)
+   def provider_name_get_client(self):
+       # Return provider client instance
+   ```
+3. Add external Python dependencies to `__manifest__.py` and update root `requirements.txt`
+4. Register in `_get_available_services()` hook
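+
+For step 3, external libraries are declared in the manifest so Odoo can warn when they are missing (the package name below is a placeholder):
+
+```python
+# __manifest__.py of llm_<provider_name> (fragment)
+"external_dependencies": {
+    "python": ["provider_sdk"],  # placeholder; mirror it in the root requirements.txt
+},
+```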
-### Mail Store System
+### Adding New Tools
-- **USE** `mail.store` service for all message/thread operations
-- **REUSE** existing mail components, don't create separate messaging models
-- **PATCH** components conditionally using `@web/core/utils/patch`
-- The new system uses Record-based reactive architecture
+1. Create tool implementation in `llm_tool` or separate module
+2. Extend `llm.tool` model with `{implementation}_execute()` method
+3. Define input schema (JSON schema or method signature)
+4. Set security flags: `requires_user_consent`, `destructive_hint`, `read_only_hint`
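+
+A hypothetical read-only tool following these steps could look like the sketch below (the registration hook name and field details are assumptions; check `llm_tool` for the exact API):
+
+```python
+from odoo import models
+
+
+class LLMTool(models.Model):
+    _inherit = "llm.tool"
+
+    def _get_available_implementations(self):
+        # Assumed registration hook, mirroring the provider service pattern
+        return super()._get_available_implementations() + [
+            ("record_counter", "Record Counter"),
+        ]
+
+    def record_counter_execute(self, model_name, domain=None):
+        """Illustrative {implementation}_execute(): count records of a model."""
+        return {"count": self.env[model_name].search_count(domain or [])}
+```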
-### Thread and Message Management
+### Extending Message Handling
-```javascript
-// Correct Thread.get() format in Odoo 18.0
-mailStore.Thread.get({ model: "llm.thread", id: threadId });
+When adding custom message processing:
+- Override `message_post()` to handle custom `llm_role` values
+- Use `body_json` field for structured data (e.g., tool results)
+- Leverage message subtypes: `llm.mt_user`, `llm.mt_assistant`, `llm.mt_tool`, `llm.mt_system`
+- Consider streaming updates via `message_post_from_stream()` pattern
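+
+For instance, posting a structured tool result might look like this sketch (subtype XML ids are from the list above; whether `message_post()` accepts `llm_role`/`body_json` directly depends on the base module's override and should be verified against `llm/models/mail_message.py`):
+
+```python
+# Hedged sketch: record a tool result on an llm.thread record
+thread.message_post(
+    body="Tool finished",
+    subtype_xmlid="llm.mt_tool",
+    llm_role="tool",                 # assumes the override forwards this to mail.message
+    body_json={"tool": "record_counter", "result": {"count": 42}},
+)
+```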
-// Message insertion pattern
-mailStore.insert({ "mail.message": [messageData] }, { html: true });
+### Frontend Development (OWL Components)
-// IMPORTANT: Also add to thread.messages collection for UI updates
-if (!thread.messages.some((m) => m.id === message.id)) {
-  thread.messages.push(message);
+JavaScript assets are loaded via `__manifest__.py`:
+```python
+'assets': {
+    'web.assets_backend': [
+        'module_name/static/src/components/component_name/component.js',
+        'module_name/static/src/components/component_name/component.xml',
+        'module_name/static/src/components/component_name/component.scss',
+    ],
}
```
-### Message Serialization
+Key frontend patterns:
+- **Services** integrate with Odoo's service registry (see `llm_store_service.js`)
+- **Patches** extend core mail components (composer, thread, message)
+- **Client Actions** define standalone views (see `llm_chat_client_action.js`)
-```python
-# Use Store system for message formatting
-from odoo.addons.mail.tools.discuss import Store
-
-def to_store_format(self, message):
-    store = Store()
-    message._to_store(store)
-    result = store.get_result()
-    return result['mail.message'][0]
-```
+## Branch Strategy
-### LLM-Specific Implementation
+- **Main branch**: `18.0` (latest Odoo version)
+- **Current branch**: `17.0-backport` (backported version for Odoo 17.0)
+- PRs should typically target the main branch unless explicitly for backports
+- Check `backport.md` for module backport status
-#### Service Setup
+## Important Notes
-```javascript
-export const llmStoreService = {
-  dependencies: ["orm", "bus_service", "mail.store", "notification"],
-  start(env, { orm, bus_service, "mail.store": mailStore, notification }) {
-    // mailStore is the standard Odoo mail.store service
-  },
-};
-```
+### Version Numbers
-#### Safe Component Patching
-
-```javascript
-patch(Composer.prototype, {
-  setup() {
-    super.setup();
-    try {
-      this.llmStore = useService("llm.store");
-    } catch (error) {
-      this.llmStore = null; // Graceful fallback
-    }
-  },
-});
-```
+Module versions follow the pattern `{odoo_version}.{major}.{minor}.{patch}`:
+- Example: `17.0.1.4.0` = Odoo 17.0, module version 1.4.0
-#### Message Processing Rules
+### Module Dependencies
-- **User messages**: Plain text, no processing through `_process_llm_body()`
-- **Assistant messages**: Process through `_process_llm_body()` for markdown→HTML
-- **Tool messages**: Use `body_json` field, no HTML processing
+Install high-level modules (e.g., `llm_assistant`) to automatically pull in required core modules via Odoo's dependency system. See README.md "Quick Start Guide" section.
-#### Streaming Architecture
+### Security Considerations
-1. User message → `message_post()` → standard bus events
-2. AI response → EventSource streaming → custom handling in llm.store
-3. Messages inserted via `mailStore.insert()`
-4. Manually add to `thread.messages` collection for reactivity
+- API keys stored in `llm.provider.api_key` field
+- User groups: `llm.group_llm_user` (basic), `llm.group_llm_manager` (admin)
+- Tool consent system requires user approval for sensitive operations
+- Record rules enforce company-based and user-specific access control
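+
+When gating sensitive operations in custom code, the standard group check applies (group XML ids as listed above):
+
+```python
+from odoo.exceptions import AccessError
+
+# Example guard before executing a destructive tool
+if not self.env.user.has_group("llm.group_llm_manager"):
+    raise AccessError("Only LLM managers may run this tool.")
+```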
-### Message History Flow for LLM
+### Performance Optimizations
-1. User message posted with `llm_role="user"` → saved to DB
-2. `generate_messages()` called → `get_llm_messages()` retrieves all messages
-3. Full history including new user message passed to LLM
+Recent improvements include:
+- **10x faster message queries** via indexed `llm_role` field
+- **PostgreSQL advisory locking** prevents race conditions
+- **Module consolidation** reduces complexity
+- **Streaming generation** for real-time UI updates
-### Common Pitfalls to Avoid
+### Known Limitations (17.0 Backport)
-- Don't use `message_format()` - it's removed in 18.0
-- Don't use `existingMessage.update()` for streaming - use `mailStore.insert()`
-- Don't forget to add messages to `thread.messages` collection
-- Don't process user messages as markdown/HTML
-- Don't use wrong Thread.get() format (array instead of object)
+Per backport.md, some modules are marked as untested or uninstallable:
+- `llm_letta` - uninstallable
+- `llm_replicate` - uninstallable
+- `llm_thread`, `llm_assistant`, `llm_generate` - marked as untested
-## References
+Patches in `llm_thread` may be commented out due to conflicts.
-- [MIGRATION_16_TO_18.md](./MIGRATION_16_TO_18.md) - Detailed migration guide
-- [LLM_THREAD_18_MIGRATION_GUIDE.md](./LLM_THREAD_18_MIGRATION_GUIDE.md) - LLM thread specific migration
-- Odoo 18.0 official documentation
-- Module interdependency graph (to be created)
+**Important llm_thread fix**: If you experience white-screen issues, ensure the `action` service is properly initialized in `llm_chat_client_action.js:22`. See `LLM_THREAD_V17_FIX.md` for detailed troubleshooting.
diff --git a/backport.md b/backport.md
new file mode 100644
index 00000000..3db3a053
--- /dev/null
+++ b/backport.md
@@ -0,0 +1,6 @@
+# Pending from 18.0
+- llm_thread (conflicting patches commented out)
+- llm_assistant (untested)
+- llm_generate (untested)
+- llm_letta (uninstallable)
+- llm_replicate (uninstallable)
\ No newline at end of file
diff --git a/llm/__manifest__.py b/llm/__manifest__.py
index 7965a2a8..b5255a21 100644
--- a/llm/__manifest__.py
+++ b/llm/__manifest__.py
@@ -12,7 +12,7 @@
"author": "Apexive Solutions LLC",
"website": "https://github.com/apexive/odoo-llm",
"category": "Technical",
- "version": "18.0.1.4.0",
+ "version": "17.0.1.4.0",
"depends": ["mail", "web"],
"data": [
"security/llm_security.xml",
diff --git a/llm/models/mail_message.py b/llm/models/mail_message.py
index c49de044..e0c9f435 100644
--- a/llm/models/mail_message.py
+++ b/llm/models/mail_message.py
@@ -101,7 +101,7 @@ def _check_llm_role(self, role):
return {message: message.llm_role == role for message in self}
def to_store_format(self):
- """Convert message to store format compatible with Odoo 18.0. Used by frontend js components"""
+ """Convert message to store format compatible with Odoo 17.0. Used by frontend js components"""
self.ensure_one()
from odoo.addons.mail.tools.discuss import Store
diff --git a/llm/views/llm_model_views.xml b/llm/views/llm_model_views.xml
index b5c814e5..4cf723b4 100644
--- a/llm/views/llm_model_views.xml
+++ b/llm/views/llm_model_views.xml
@@ -61,13 +61,13 @@
llm.model.view.tree
llm.model
-<list>
+<tree>
-</list>
+</tree>
diff --git a/llm/views/llm_provider_views.xml b/llm/views/llm_provider_views.xml
index f19e35b6..1a5be6ef 100644
--- a/llm/views/llm_provider_views.xml
+++ b/llm/views/llm_provider_views.xml
@@ -56,12 +56,12 @@
name="model_ids"
context="{'active_test': False}"
>
-<list>
+<tree>
-</list>
+</tree>
diff --git a/llm/views/llm_publisher_views.xml b/llm/views/llm_publisher_views.xml
index d9216f25..240c5590 100644
--- a/llm/views/llm_publisher_views.xml
+++ b/llm/views/llm_publisher_views.xml
@@ -32,13 +32,13 @@
-<list>
+<tree>
-</list>
+</tree>
@@ -53,13 +53,13 @@
llm.publisher.view.tree
llm.publisher
-<list>
+<tree>
-</list>
+</tree>
diff --git a/llm/wizards/fetch_models_views.xml b/llm/wizards/fetch_models_views.xml
index 210a19d1..02b40bf7 100644
--- a/llm/wizards/fetch_models_views.xml
+++ b/llm/wizards/fetch_models_views.xml
@@ -24,7 +24,7 @@
name="line_ids"
options="{'reload_on_button': false}"
>
-<list>
+<tree>
@@ -46,7 +46,7 @@
decoration-warning="status=='modified'"
/>
-</list>
+</tree>
diff --git a/llm_assistant/__manifest__.py b/llm_assistant/__manifest__.py
index f4e06578..e2343c65 100644
--- a/llm_assistant/__manifest__.py
+++ b/llm_assistant/__manifest__.py
@@ -26,7 +26,7 @@
Use cases include creating specialized assistants for customer support, data analysis, training assistance, and more.
""",
"category": "Productivity, Discuss",
- "version": "18.0.1.5.0",
+ "version": "17.0.1.5.0",
"depends": [
"base",
"mail",
diff --git a/llm_assistant/views/llm_assistant_views.xml b/llm_assistant/views/llm_assistant_views.xml
index ee5c1d8a..7012838c 100644
--- a/llm_assistant/views/llm_assistant_views.xml
+++ b/llm_assistant/views/llm_assistant_views.xml
@@ -5,7 +5,7 @@
llm.assistant.tree
llm.assistant
-<list>
+<tree>
@@ -19,7 +19,7 @@
-</list>
+</tree>
@@ -225,7 +225,7 @@
-<list>
+<tree>
@@ -235,7 +235,7 @@
type="object"
class="btn btn-primary btn-sm"
/>
-</list>
+</tree>
diff --git a/llm_assistant/views/llm_prompt_category_views.xml b/llm_assistant/views/llm_prompt_category_views.xml
index 38743e81..62f03e43 100644
--- a/llm_assistant/views/llm_prompt_category_views.xml
+++ b/llm_assistant/views/llm_prompt_category_views.xml
@@ -5,12 +5,12 @@
llm.prompt.category.tree
llm.prompt.category
-<list>
+<tree>
-</list>
+</tree>
@@ -64,12 +64,12 @@
-<list>
+<tree>
-</list>
+</tree>
diff --git a/llm_assistant/views/llm_prompt_tag_views.xml b/llm_assistant/views/llm_prompt_tag_views.xml
index 9b039a8a..9947a574 100644
--- a/llm_assistant/views/llm_prompt_tag_views.xml
+++ b/llm_assistant/views/llm_prompt_tag_views.xml
@@ -21,10 +21,10 @@
llm.prompt.tag.tree
llm.prompt.tag
-<list>
+<tree>
-</list>
+</tree>
diff --git a/llm_assistant/views/llm_prompt_views.xml b/llm_assistant/views/llm_prompt_views.xml
index 4aa39b1d..9d95d8c9 100644
--- a/llm_assistant/views/llm_prompt_views.xml
+++ b/llm_assistant/views/llm_prompt_views.xml
@@ -219,7 +219,7 @@
llm.prompt.tree
llm.prompt
-<list>
+<tree>
@@ -228,7 +228,7 @@
-</list>
+</tree>
diff --git a/llm_generate/__manifest__.py b/llm_generate/__manifest__.py
index 749c1b5b..adc4d3fd 100644
--- a/llm_generate/__manifest__.py
+++ b/llm_generate/__manifest__.py
@@ -1,6 +1,6 @@
{
"name": "LLM Content Generation",
- "version": "18.0.2.0.0",
+ "version": "17.0.2.0.0",
"category": "Productivity/Discuss",
"summary": "Content generation capabilities for LLM models",
"description": """
diff --git a/llm_letta/__manifest__.py b/llm_letta/__manifest__.py
index c52203a7..ebba00d3 100644
--- a/llm_letta/__manifest__.py
+++ b/llm_letta/__manifest__.py
@@ -16,7 +16,7 @@
"author": "Apexive Solutions LLC",
"website": "https://github.com/apexive/odoo-llm",
"category": "Technical",
- "version": "18.0.1.0.0",
+ "version": "17.0.1.0.0",
"depends": ["llm", "llm_thread", "llm_assistant", "llm_mcp_server"],
"external_dependencies": {
# Note: Using forked version until https://github.com/letta-ai/letta-python/issues/25 is fixed
diff --git a/llm_letta/static/description/index.html b/llm_letta/static/description/index.html
index ba3b8cad..4a29172f 100644
--- a/llm_letta/static/description/index.html
+++ b/llm_letta/static/description/index.html
@@ -104,7 +104,7 @@
Letta LLM Integration
Enterprise-ready integration bringing stateful AI agents with persistent memory to your Odoo workflows.