Prerequisites to run My Independent AI
Before you start, make sure you have the following installed on your system:
1. Core Requirements
- Docker & Docker Compose: Essential for running the full stack (Ollama, Qdrant, and the Dashboard) in a containerized environment.
- Python 3.12+: Required if you plan to run importers or the dashboard outside of Docker for development.
- uv: A fast Python package and project manager, highly recommended for managing dependencies. Install it with:
curl -LsSf https://astral.sh/uv/install.sh | sh
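Before bringing up the stack, you can sanity-check the core requirements above from Python. This is a minimal sketch, not part of the project itself; the helper name and the exact tool list are illustrative:

```python
import shutil
import sys

def missing_prereqs(tools=("docker", "uv")):
    """Return the core prerequisites that are not available on this machine."""
    # Locate each CLI tool on PATH, the same way a shell would.
    missing = [tool for tool in tools if shutil.which(tool) is None]
    # Running importers or the dashboard outside Docker needs Python 3.12+.
    if sys.version_info < (3, 12):
        missing.append("python>=3.12")
    return missing

print(missing_prereqs())  # an empty list means the core tooling is in place
```

An empty result means you can skip ahead; otherwise install whatever is listed before continuing.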
2. Optional but Recommended
- Google Cloud CLI: Only needed if you plan to use the optional cloud features (such as GCS for remote storage/sync or Cloud Run for scheduled importers).
3. Hardware Recommendation
Running local LLMs requires a decent amount of RAM and CPU/GPU power:
- Minimum: 8GB RAM (16GB+ recommended for better performance with larger models).
- Storage: ~5GB free space for core images and models (llama3.2 and nomic-embed-text).
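The storage requirement above can be checked programmatically before pulling images and models. A minimal sketch, assuming the function name and default threshold are illustrative rather than part of the project:

```python
import shutil

GIB = 1024 ** 3  # bytes per GiB

def enough_space(path=".", required_gb=5):
    """Check whether `path` has at least `required_gb` GiB free,
    roughly the ~5GB needed for the core images and models."""
    return shutil.disk_usage(path).free >= required_gb * GIB

print(enough_space())  # True if the current disk has ~5GB free
```

Run it against the directory where Docker stores its data if that lives on a different volume.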