
Lingmengcan AI Platform

English | δΈ­ζ–‡

Vue NestJS TypeScript MySQL License

Lingmengcan is an end-to-end AI application development platform powered by large language models. It provides comprehensive solutions for knowledge base management, intelligent conversations, workflow orchestration, and AI image generation. Built on a modern microservice architecture, it supports fully local deployment to keep enterprise data secure.

✨ Key Features

  • πŸ€– Multi-Model Support: Compatible with OpenAI, Ollama, and other LLMs
  • πŸ’¬ Intelligent Chat: Multi-turn conversations, context memory, role-playing
  • πŸ“š Knowledge Base RAG: Document upload, vectorization, intelligent retrieval enhancement
  • 🎨 AI Image Generation: Integrated Stable Diffusion for text-to-image generation
  • πŸ”„ Workflow: Visual process orchestration with complex business logic support
  • πŸ‘₯ Permission Management: Complete RBAC permission system
  • πŸ”’ Private Deployment: Fully local operation without external dependencies

πŸ—οΈ Architecture

Frontend Layer (web/)

Vue 3 + TypeScript + Vite
β”œβ”€β”€ UI Framework: TDesign + Tailwind CSS
β”œβ”€β”€ State Management: Pinia
β”œβ”€β”€ Routing: Vue Router
└── HTTP Client: Axios
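As a rough illustration of how this stack fits together, a minimal entry file might look like the sketch below (file names and the exact TDesign style import are assumptions, not taken from the repository):

// web/src/main.ts — minimal wiring sketch (assumed layout)
import { createApp } from 'vue';
import { createPinia } from 'pinia';
import TDesign from 'tdesign-vue-next';           // UI framework
import 'tdesign-vue-next/es/style/index.css';
import App from './App.vue';
import router from './router';

const app = createApp(App);
app.use(createPinia());  // state management
app.use(router);         // routing
app.use(TDesign);
app.mount('#app');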

Service Layer (service/)

NestJS + TypeScript
β”œβ”€β”€ Database: TypeORM + MySQL
β”œβ”€β”€ Authentication: JWT + Passport
β”œβ”€β”€ AI Integration: LangChain + OpenAI API
β”œβ”€β”€ File Storage: Local + Cloud Storage
└── Vector Database: ChromaDB
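A minimal sketch of how these backend pieces might be wired in a NestJS root module (module layout and config keys are assumptions, not the project's actual code):

// service/src/app.module.ts — wiring sketch (assumed structure)
import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { TypeOrmModule } from '@nestjs/typeorm';

@Module({
  imports: [
    ConfigModule.forRoot({ isGlobal: true }),
    // MySQL via TypeORM; credentials come from the YAML config shown later
    TypeOrmModule.forRootAsync({
      inject: [ConfigService],
      useFactory: (config: ConfigService) => ({
        type: 'mysql',
        host: config.get('database.host'),
        port: config.get<number>('database.port'),
        username: config.get('database.username'),
        password: config.get('database.password'),
        database: config.get('database.database'),
        autoLoadEntities: true,
      }),
    }),
    // Feature modules (chat, knowledge base, workflow, ...) go here
  ],
})
export class AppModule {}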

Core Modules

  • User Management: Registration, login, permission control
  • Chat System: Multi-turn conversations, history, streaming output
  • Knowledge Base: Document parsing, vectorization, similarity search
  • Model Management: Multi-model configuration, load balancing, monitoring
  • Workflow: Node orchestration, conditional branches, loop control
  • Drawing System: Stable Diffusion integration, parameter adjustment

πŸš€ Quick Start

Requirements

Component | Version | Description
--------- | ------- | -----------
Node.js   | 18+     | Frontend & backend runtime
Python    | 3.10+   | AI model environment
MySQL     | 8.0+    | Primary database
pnpm      | Latest  | Package manager

1️⃣ Clone Repository

git clone https://github.com/lingmengcan/lingmengcan.git
cd lingmengcan

2️⃣ Database Setup

# Create database
mysql -u root -p -e "CREATE DATABASE lingmengcan_ai CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"

# Import schema
mysql -u root -p lingmengcan_ai < doc/lingmengcan-ai.sql

3️⃣ Backend Service

cd service

# Install dependencies
pnpm install

# Configure environment
cp .env.example .env
# Edit config.development.yaml to set the database connection

# Start development server
pnpm run start:dev

4️⃣ Frontend Application

cd web

# Install dependencies
pnpm install

# Start development server
pnpm dev

5️⃣ AI Model Deployment (Optional)

Option 1: Ollama (Recommended)

Ollama is the simplest local LLM deployment solution, supporting various open-source models.

# macOS/Linux Installation
curl -fsSL https://ollama.ai/install.sh | sh

# Windows: Download from https://ollama.ai/download/windows

# Download and run models
ollama pull llama2          # Meta Llama 2
ollama pull qwen:7b         # Alibaba Qwen
ollama pull codellama       # Code-specialized model

# Start service (default port 11434)
ollama serve

# Test model
ollama run llama2
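Once Ollama is running, the backend can reach it through LangChain. A minimal TypeScript sketch, assuming the @langchain/ollama package and the default port above:

// Sketch: calling a local Ollama model from TypeScript via LangChain
import { ChatOllama } from '@langchain/ollama';

const model = new ChatOllama({
  baseUrl: 'http://localhost:11434',  // default Ollama endpoint
  model: 'qwen:7b',                   // any model pulled above
});

const reply = await model.invoke('Explain what a vector database is in one sentence.');
console.log(reply.content);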

Option 2: LM Studio (GUI Interface)

LM Studio provides a user-friendly graphical interface, suitable for non-technical users.

# 1. Download and install LM Studio
# Official website: https://lmstudio.ai/

# 2. Search and download models in LM Studio:
# - Qwen/Qwen2-7B-Instruct-GGUF
# - microsoft/Phi-3-mini-4k-instruct-gguf
# - TheBloke/Llama-2-7B-Chat-GGUF

# 3. Start local server
# Click "Local Server" tab in LM Studio
# Select model and click "Start Server"
# Default address: http://localhost:1234/v1
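Because LM Studio exposes an OpenAI-compatible endpoint, any OpenAI client can talk to it by overriding the base URL. A minimal sketch using the openai npm package (the model name is whatever you loaded in LM Studio):

// Sketch: querying LM Studio's OpenAI-compatible local server
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://localhost:1234/v1',  // LM Studio local server
  apiKey: 'lm-studio',                  // any non-empty string works locally
});

const completion = await client.chat.completions.create({
  model: 'qwen2-7b-instruct',           // must match the model loaded in LM Studio
  messages: [{ role: 'user', content: 'Hello from Lingmengcan!' }],
});
console.log(completion.choices[0].message.content);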

Option 3: Stable Diffusion WebUI (AI Image Generation)

# Clone repository
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
cd stable-diffusion-webui

# Start with API mode
./webui.sh --api --listen --port 7860
# On Windows: add --api --listen --port 7860 to COMMANDLINE_ARGS in webui-user.bat, then run webui-user.bat

# API address: http://localhost:7860
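With the WebUI running in API mode, text-to-image requests go to the /sdapi/v1/txt2img endpoint. A minimal sketch (only prompt, steps, and size are set; all other parameters use the WebUI defaults):

// Sketch: calling the Stable Diffusion WebUI txt2img API
const response = await fetch('http://localhost:7860/sdapi/v1/txt2img', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    prompt: 'a watercolor painting of a mountain lake at sunrise',
    steps: 25,
    width: 512,
    height: 512,
  }),
});

const { images } = await response.json();  // array of base64-encoded PNGs
console.log(`Generated ${images.length} image(s)`);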

Model Configuration

Add the model configuration to the backend config file service/config.development.yaml:

# Stable Diffusion Configuration
stablediffusion:
  apiUrl: "http://localhost:7860"
  enabled: true

6️⃣ Access Application

With both dev servers running, the frontend is typically available at http://localhost:5173 (Vite's default dev port) and the backend API at http://localhost:3000 (matching the default VITE_API_BASE_URL below); adjust the URLs if you changed either port.

πŸ“± Screenshots

⚠️ Important Notice: the screenshots below are from early versions. The current version includes significant UI and feature updates, so the screenshots urgently need to be refreshed.

πŸ’¬ Intelligent Chat System

New Features:

  • βœ… Multi-turn conversations with context understanding
  • βœ… Knowledge base RAG enhancement
  • βœ… Streaming output with typewriter effect
  • βœ… Real-time model switching and comparison
  • βœ… Conversation history management
  • βœ… Export conversation records

Screenshots: Chat Interface 1 and Chat Interface 2 (pending update)

🎨 AI Image Generation Studio

New Features:

  • βœ… Text-to-Image generation
  • βœ… Image style transfer and editing
  • βœ… Advanced parameter fine-tuning
  • βœ… Batch generation and management
  • βœ… ControlNet precise control
  • βœ… Image version history management

Screenshot: AI Drawing Interface (pending update)

πŸ€– Model Library

Screenshot: Model Library (pending update)

βš™οΈ System Management Center

Complete enterprise-level admin system

πŸ‘₯ User Permission Management

Screenshot: User Management (pending update)
β€’ User information management
β€’ Permission assignment
β€’ Login logs

πŸ” Role Permission System

Screenshot: Role Management (pending update)
β€’ RBAC permission model
β€’ Fine-grained control
β€’ Permission inheritance

πŸ“‹ Menu Route Management

Screenshot: Menu Management (pending update)
β€’ Dynamic menu configuration
β€’ Route permissions
β€’ Menu icons

πŸ“š Knowledge Base Management (New Feature)

  • πŸ“„ Multi-format document upload (PDF, Word, Markdown)
  • πŸ” Intelligent document parsing and chunking
  • 🧠 Vectorization storage and retrieval
  • πŸ’‘ Knowledge base Q&A with citations
  • πŸ“Š Usage statistics
  • πŸ”„ Version management and rollback

πŸ”„ Workflow Orchestrator (New Feature)

  • 🎨 Visual process designer
  • πŸ”€ Conditional branches and loop control
  • πŸ€– Multi-AI model collaboration
  • ⏰ Scheduled task execution
  • πŸ“Š Execution monitoring and logs
  • πŸ”§ Custom node development

πŸ“ Project Structure

lingmengcan/
β”œβ”€β”€ πŸ“ web/                    # Frontend (Vue 3 + TypeScript)
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ api/              # API layer
β”‚   β”‚   β”œβ”€β”€ components/       # Reusable components
β”‚   β”‚   β”œβ”€β”€ views/            # Page views
β”‚   β”‚   β”œβ”€β”€ store/            # State management (Pinia)
β”‚   β”‚   β”œβ”€β”€ router/           # Route configuration
β”‚   β”‚   └── utils/            # Utility functions
β”‚   └── package.json
β”œβ”€β”€ πŸ“ service/               # Backend (NestJS + TypeScript)
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ controllers/      # Controller layer
β”‚   β”‚   β”œβ”€β”€ services/         # Business logic layer
β”‚   β”‚   β”œβ”€β”€ entities/         # Data models
β”‚   β”‚   β”œβ”€β”€ modules/          # Feature modules
β”‚   β”‚   β”œβ”€β”€ dtos/             # Data transfer objects
β”‚   β”‚   └── utils/            # Utility classes
β”‚   └── package.json
β”œβ”€β”€ πŸ“ doc/                   # Documentation
β”‚   β”œβ”€β”€ lingmengcan-ai.sql    # Database schema
β”‚   └── *.md                  # Documentation files
β”œβ”€β”€ πŸ“ images/                # Screenshots
└── README.md

πŸ”§ Configuration

Backend Config (service/config.development.yaml)

# Database configuration
database:
  host: localhost
  port: 3306
  username: root
  password: your_password
  database: lingmengcan_ai

# AI model configuration
llm:
  openai:
    apiKey: your_openai_key
    baseURL: https://api.openai.com/v1
  
  ollama:
    baseURL: http://localhost:11434

# Stable Diffusion configuration
stablediffusion:
  apiUrl: http://localhost:7860
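One common way to load a YAML file like this into NestJS is @nestjs/config's load option combined with js-yaml. A minimal sketch (a standard pattern, not necessarily how this project does it):

// Sketch: loading config.development.yaml via @nestjs/config + js-yaml
import { readFileSync } from 'node:fs';
import { join } from 'node:path';
import * as yaml from 'js-yaml';

export default () =>
  yaml.load(readFileSync(join(process.cwd(), 'config.development.yaml'), 'utf8')) as Record<string, unknown>;

// In the root module:
//   ConfigModule.forRoot({ isGlobal: true, load: [configuration] })
// then read values with configService.get('llm.ollama.baseURL')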

Frontend Config (web/.env.development)

# API base URL
VITE_API_BASE_URL=http://localhost:3000

# Application title
VITE_APP_TITLE=Lingmengcan AI Platform
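On the frontend, this variable is typically consumed when creating a shared Axios instance. A minimal sketch (the file path is an assumption):

// Sketch: web/src/utils/request.ts — shared Axios instance (assumed path)
import axios from 'axios';

const request = axios.create({
  baseURL: import.meta.env.VITE_API_BASE_URL,  // from .env.development
  timeout: 15000,
});

export default request;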

🀝 Contributing

We welcome all forms of contributions!

  1. 🍴 Fork the repository
  2. 🌿 Create feature branch (git checkout -b feature/AmazingFeature)
  3. πŸ’Ύ Commit changes (git commit -m 'Add some AmazingFeature')
  4. πŸ“€ Push to branch (git push origin feature/AmazingFeature)
  5. πŸ”€ Open Pull Request

Development Guidelines

  • Code Style: ESLint + Prettier
  • Commit Convention: Conventional Commits
  • Test Coverage: Add tests for new features

πŸ“„ License

This project is licensed under the MIT License

🌟 Star History

Star History Chart

πŸ’¬ Community

πŸ™ Acknowledgments

Thanks to the following open source projects:


⭐ If this project helps you, please give us a Star!

Made with ❀️ by Lingmengcan
