Ask AI
A Flask-based AI Q&A system backed by the Gemini API and PostgreSQL.

- Flask
- Python
- PostgreSQL
- Docker
- Alembic
- Gemini API
Core Features
- POST /ask endpoint returns AI-generated answers to user questions.
- Persists questions and answers in PostgreSQL via SQLAlchemy.
- Containerized stack with Docker & Docker Compose for easy local/dev deploy.
- Database schema lifecycle managed with Alembic migrations.
- Automated tests for /ask using pytest to validate happy paths and errors.
- Configurable provider (Gemini/OpenAI) and secrets via environment variables.
Architecture
- Flask app exposes REST endpoints (e.g., /ask) and orchestrates provider calls.
- Services layer wraps Gemini/OpenAI SDK usage (prompt → response).
- SQLAlchemy models manage persistence; Alembic handles schema migrations.
- Docker Compose runs web + Postgres; .env or compose env injects secrets.
- Tests run inside the container (docker-compose exec web pytest) against the app layer.
Technologies
- Flask for lightweight HTTP routing and request handling.
- SQLAlchemy ORM for models/queries; Alembic for versioned DB migrations.
- PostgreSQL for durable storage of Q&A records.
- Docker & Docker Compose to standardize local/dev environments.
- Gemini API (or OpenAI) as the LLM backend; provider key via env var.
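A sketch of what the persisted Q&A record might look like as a SQLAlchemy model (the `QARecord` name, table name, and columns are assumptions):

```python
from datetime import datetime, timezone

from sqlalchemy import Column, DateTime, Integer, Text, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class QARecord(Base):
    """One persisted question/answer pair (model shape is assumed)."""

    __tablename__ = "qa_records"

    id = Column(Integer, primary_key=True)
    question = Column(Text, nullable=False)
    answer = Column(Text, nullable=False)
    created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
```

With Alembic, a migration for a model like this would typically be produced with `alembic revision --autogenerate` and applied with `alembic upgrade head`.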
Key Highlights
- Production-ready scaffolding: API + DB + migrations + tests + Docker.
- Clear separation of concerns (routes, models, services, config).
- Swappable AI provider with minimal code changes.
- Simple deployment story: build, run, and test via Compose.