🌐 Open Politics HQ
Open Source Intelligence Platform
Talk: Open Source Political Intelligence @ CCCB Datengarten
🎥 Watch Presentation

The Idea
A journalist knows how to identify “security framing” in news coverage. A policy analyst knows what counts as “meaningful stakeholder engagement” in legislative proposals. A bureaucrat knows whether a grant application is properly filled out. That expertise lives in their heads, maybe in spreadsheets and notes.

This works fine for tens of documents. At hundreds or thousands, you’re either stuck or you need to hire engineers. Meanwhile, sophisticated analysis infrastructure, the kind that lets you systematically apply analytical frameworks at scale, has only been available to well-funded institutions.

HQ changes that: define your analytical questions in plain language, then apply them at scale. The key innovation is that schemas are shareable, transparent, and improvable. Other researchers can see exactly how you defined your framework, critique it, refine it, or apply it to their own data.

For example: imagine you are a journalist analyzing 200 news articles. You create a schema describing exactly what to look for.
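To make the journalist example concrete, here is a sketch of what such a “security framing” schema could contain. The structure and field names are illustrative only, not the platform’s actual schema format; in HQ you describe the fields in natural language through the web UI.

```python
# Hypothetical "security framing" schema for 200 news articles.
# Field names and structure are illustrative; HQ schemas are written
# in plain language, and this sketch only shows the idea.
security_framing_schema = {
    "name": "Security Framing in News Coverage",
    "description": "Does the article present the topic primarily as a security threat?",
    "fields": [
        {
            "name": "uses_security_framing",
            "type": "boolean",
            "instructions": "True if the article frames the issue as a threat to safety, order, or national security.",
        },
        {
            "name": "threat_actors",
            "type": "list[string]",
            "instructions": "Actors the article portrays as sources of the threat, if any.",
        },
        {
            "name": "evidence_quote",
            "type": "string",
            "instructions": "A short quote supporting the classification.",
        },
    ],
}
```

Because the schema is just explicit instructions, another researcher can read it, disagree with a definition, and rerun the same analysis with a refined version.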
How It Works
- Ingest content from files, URLs, search results, RSS feeds

- Define schemas that describe what information to extract
- Run analysis using AI to apply your schema at scale
- Explore results through tables, visualizations, maps, or export the data

Coming soon: image, audio, and email inbox ingestion.

The schemas are the key innovation. They let you formalize your analytical method in natural language, making qualitative approaches reproducible and transparent. Other researchers can see exactly how you defined “populist rhetoric” or “security framing” and apply the same lens to their data.
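Mechanically, “running analysis” means combining a schema’s instructions with each document into a prompt and parsing the model’s structured reply. The sketch below shows that idea only; the real platform handles batching, retries, and provider APIs. `apply_schema` and the stub model are hypothetical names, and `llm` is any callable from prompt text to reply text.

```python
import json

def apply_schema(document: str, schema: dict, llm) -> dict:
    """Build a prompt from the schema and parse the model's JSON reply."""
    field_lines = "\n".join(
        f'- {f["name"]} ({f["type"]}): {f["instructions"]}' for f in schema["fields"]
    )
    prompt = (
        f"{schema['description']}\n\n"
        f"Extract the following fields as JSON:\n{field_lines}\n\n"
        f"Document:\n{document}"
    )
    return json.loads(llm(prompt))

# Stub model so the sketch runs without API keys.
def stub_llm(prompt: str) -> str:
    return '{"uses_security_framing": true}'

schema = {
    "description": "Does the article frame the topic as a security threat?",
    "fields": [
        {
            "name": "uses_security_framing",
            "type": "boolean",
            "instructions": "True if the article frames the issue as a threat.",
        },
    ],
}

result = apply_schema(
    "Officials warned the protests endanger public order.", schema, stub_llm
)
print(result)  # {'uses_security_framing': True}
```

Running the same schema over a corpus then yields one structured row per document, which is what the tables, maps, and exports are built from.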

Chat & MCP
Chat Interface Demo

The Chat & MCP (Model Context Protocol) integration centralizes all platform tools (asset management, schema-based analysis, vector search, content ingestion) into a unified access point that lets you work with structured analytical methods through conversational AI. This combination supports a range of workflows for researching and synthesizing data. Editing MCP configs (upcoming, on a per-user level) furthermore lets you grant your LLM access to almost arbitrary outside data and use it in your analysis.

Infospaces & Vector Storage
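HQ’s per-user config editing is still upcoming, so its exact format isn’t shown here, but MCP clients commonly declare servers in a JSON config along these lines (shown with the reference `fetch` server; the server name and command are illustrative):

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```

Each entry tells the client how to launch an MCP server, whose tools then become available to the model during a chat session.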
Your assets, schemas and analysis results are scoped to “Information Spaces” that you can use to curate information. Each Information Space is a dedicated vector space. Use vector embeddings from local or cloud models to search through your data, cluster it (upcoming) and find duplicates.
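As a toy illustration of what a dedicated vector space gives you: embeddings are vectors of floats, similarity is the cosine of the angle between them, and near-duplicates are pairs above a threshold. The vectors below are made up for the example; real Infospaces use embeddings from local or cloud models.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {
    "a": [0.9, 0.1, 0.0],    # pretend embedding of one article
    "b": [0.89, 0.11, 0.0],  # near-duplicate of "a"
    "c": [0.0, 0.2, 0.98],   # unrelated document
}

pairs = [("a", "b"), ("a", "c"), ("b", "c")]
duplicates = [(x, y) for x, y in pairs if cosine(docs[x], docs[y]) > 0.99]
print(duplicates)  # [('a', 'b')]
```

The same similarity measure drives semantic search (rank documents by closeness to a query embedding) and, in upcoming clustering, grouping documents whose embeddings sit near each other.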
Links
- Webapp — hosted instance (public registration opening soon)
- Documentation — user guides and tutorials
- Forum — community discussions
Getting Started
Option 1: Use the Hosted Instance
The easiest way to start. We host the infrastructure; you bring your own LLM API keys (see supported providers).

- Register at open-politics.org/accounts/register
- Add your API keys on the home page
- Start uploading content and creating schemas
Your account also works on the forum for community support.
Option 2: Self-Host with Docker
For privacy, customization, or institutional requirements. Run everything on your own infrastructure.

Deployment Flexibility
- Fully Local: Run everything on your own hardware. Good for air-gapped environments or complete data control.
- Hybrid: Run the application locally but use managed services (AWS RDS, Upstash Redis, S3) to reduce operational burden.
- Kubernetes: We provide a Helm chart at https://github.com/open-politics/open-politics-hq/tree/main/.deployments/kubernetes/open-politics-hq-deployment
Architecture
The platform is built from several independent services that work together. You can run them all locally or mix local and managed services.

Core Components
| Component | What It Does | Technology |
|---|---|---|
| Backend | API, analysis jobs, MCP server | FastAPI + Python |
| Frontend | Web interface | Next.js + React |
| Worker | Background processing for large jobs | Celery |
| Database | Data storage with vector search | PostgreSQL + PGVector |
| Object Storage | File storage for uploads | MinIO (S3-compatible) |
| Cache/Queue | Session management, job queues | Redis |
| Geocoding | Location extraction and mapping | Pelias |
| LLM (optional) | Local AI inference | Ollama |
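To show what “PostgreSQL + PGVector” buys you at the database layer, here is a sketch of a nearest-neighbour query using PGVector’s `<->` distance operator. The table and column names (`assets`, `embedding`) are illustrative, not the platform’s actual database schema.

```python
def nearest_neighbours_sql(query_embedding, limit=5):
    """Build a PGVector nearest-neighbour query for a query embedding.

    Illustrative table/column names; `<->` is PGVector's distance operator.
    """
    vec = "[" + ",".join(str(x) for x in query_embedding) + "]"
    return (
        "SELECT id, title FROM assets "
        f"ORDER BY embedding <-> '{vec}'::vector LIMIT {limit};"
    )

print(nearest_neighbours_sql([0.1, 0.2, 0.3]))
# SELECT id, title FROM assets ORDER BY embedding <-> '[0.1,0.2,0.3]'::vector LIMIT 5;
```

Because the vectors live next to the rest of the data, semantic search is just another SQL query rather than a separate search service.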
LLM Support
Connect any of these AI providers:

- Anthropic (Claude Sonnet, etc.)
- OpenAI (GPT-5, etc.)
- Google (Gemini models)
- Ollama (run models locally — Llama, OAI OSS, Qwen, etc.)
Contributing
We’re building this in the open. The codebase, analytical methods, and documentation are all public and improvable. Ways to contribute:

- Report bugs or suggest features (GitHub Issues)
- Improve documentation or add examples
- Build and share analytical schemas
- Contribute code (see backend and frontend READMEs)
- Join community discussions on the forum
Contact & Community
- Email: engage@open-politics.org
- Forum: forum.open-politics.org
- Dev Meetings: Wednesdays 15:30 Berlin Time