
🚀 OpenHealth

AI Health Assistant | Powered by Your Data


📢 Now Available on Web!
In response to requests for easier access, we've launched a web version.
Try it now: open-health.me

🌍 Choose Your Language

English | Français | Deutsch | Español | 한국어 | 中文 | 日本語 | Українська | Русский | اردو


OpenHealth Demo

🌟 Overview

OpenHealth helps you take charge of your health data. By leveraging AI and your personal health information, OpenHealth provides a private assistant that helps you better understand and manage your health. You can run it completely locally for maximum privacy.

✨ Project Features

Core Features
  • 📊 Centralized Health Data Input: Easily consolidate all your health data in one place.
  • 🛠️ Smart Parsing: Automatically parses your health data and generates structured data files.
  • 🤝 Contextual Conversations: Use the structured data as context for personalized interactions with your chosen AI model.

📥 Supported Data Sources & Language Models

Data Sources You Can Add:
  • Blood Test Results
  • Health Checkup Data
  • Personal Physical Information
  • Family History
  • Symptoms

Supported Language Models:
  • LLaMA
  • DeepSeek-V3
  • GPT
  • Claude
  • Gemini

🤔 Why We Built OpenHealth

  • 💡 Your health is your responsibility.
  • ✅ True health management combines your data + intelligence, turning insights into actionable plans.
  • 🧠 AI acts as an unbiased tool to guide and support you in managing your long-term health effectively.

🗺️ Project Diagram

graph LR
    subgraph Health Data Sources
        A1[Clinical Records<br>Blood Tests/Diagnoses/<br>Prescriptions/Imaging]
        A2[Health Platforms<br>Apple Health/Google Fit]
        A3[Wearable Devices<br>Oura/Whoop/Garmin]
        A4[Personal Records<br>Diet/Symptoms/<br>Family History]
    end

    subgraph Data Processing
        B1[Data Parser & Standardization]
        B2[Unified Health Data Format]
    end

    subgraph AI Integration
        C1[LLM Processing<br>Commercial & Local Models]
        C2[Interaction Methods<br>RAG/Cache/Agents]
    end

    A1 & A2 & A3 & A4 --> B1
    B1 --> B2
    B2 --> C1
    C1 --> C2

    style A1 fill:#e6b3cc,stroke:#cc6699,stroke-width:2px,color:#000
    style A2 fill:#b3d9ff,stroke:#3399ff,stroke-width:2px,color:#000
    style A3 fill:#c2d6d6,stroke:#669999,stroke-width:2px,color:#000
    style A4 fill:#d9c3e6,stroke:#9966cc,stroke-width:2px,color:#000
    
    style B1 fill:#c6ecd9,stroke:#66b399,stroke-width:2px,color:#000
    style B2 fill:#c6ecd9,stroke:#66b399,stroke-width:2px,color:#000
    
    style C1 fill:#ffe6cc,stroke:#ff9933,stroke-width:2px,color:#000
    style C2 fill:#ffe6cc,stroke:#ff9933,stroke-width:2px,color:#000

    classDef default color:#000

Note: The data parsing functionality is currently implemented in a separate Python server and is planned to be migrated to TypeScript in the future.

Getting Started

⚙️ How to Run OpenHealth

Installation Instructions
  1. Clone the Repository:

    git clone https://github.com/OpenHealthForAll/open-health.git
    cd open-health
  2. Setup and Run:

    # Copy environment file
    cp .env.example .env
    
    # Start the application using Docker/Podman Compose
    docker compose --env-file .env up   # with Podman: podman compose --env-file .env up

    If you are upgrading an existing installation, generate an ENCRYPTION_KEY and rebuild the image:

    # Generate ENCRYPTION_KEY for .env file:
    # Run the command below and add the output to ENCRYPTION_KEY in .env
    echo $(head -c 32 /dev/urandom | base64)
    
    # Rebuild and start the application
    docker compose --env-file .env up --build   # with Podman: podman compose --env-file .env up --build

    Run the rebuild command again whenever you modify the .env file.

  3. Access OpenHealth: Open your browser and navigate to http://localhost:3000 to begin using OpenHealth.
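
To confirm the stack came up correctly, a quick check from the host might look like the sketch below (hypothetical verification steps; the exact compose command and port depend on your setup):

    # List the services started by Compose and their status
    docker compose ps

    # The web UI should answer on port 3000 once the app container is ready
    curl -I http://localhost:3000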

Note: The system consists of two main components: parsing and LLM. For parsing, you can use docling for full local execution, while the LLM component can run fully locally using Ollama.
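
For a fully local LLM setup with Ollama, the workflow might look like this (a minimal sketch assuming Ollama is installed on the host; the model name llama3 is only an example):

    # Start the Ollama server (serves its API on http://localhost:11434 by default)
    ollama serve

    # Download a model to use with OpenHealth
    ollama pull llama3

    # Quick sanity check that the model responds
    ollama run llama3 "Hello"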

Note: If you're using Ollama with Docker, make sure to set the Ollama API endpoint to: http://docker.for.mac.localhost:11434 on a Mac or http://host.docker.internal:11434 on Windows.
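
To verify that the Ollama endpoint is actually reachable, you can probe its HTTP API directly; for example (which hostname to use depends on your platform, as noted above):

    # From the host machine: list the models Ollama is serving
    curl http://localhost:11434/api/tags

    # From inside a container on macOS, use the special hostname instead of localhost
    curl http://docker.for.mac.localhost:11434/api/tags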


Star History

Star History Chart


🌐 Community and Support

💫 Share Your Story, Get Updates, and Give Feedback

  • AIDoctor Subreddit
  • Discord

🤝 Talk with the Team

  • Calendly
  • Email