I built a document indexer that runs completely locally on your machine. It watches folders for new documents (PDFs, Word, Markdown), automatically indexes them into LanceDB vectors, and makes them searchable through natural-language queries.

Key features:

- Uses Ollama for summarization, so nothing leaves your computer; no license or API keys needed
- Integrates with Claude Desktop via the Model Context Protocol
- Incremental indexing (only processes changed files)
- Runs well on standard laptops (optimized for M1/M2 MacBooks)

The main use case is being able to ask Claude things like "What documents mention machine learning?" or "Search my research papers about distributed systems" and get semantic search results from your local documents.

Tech stack: Python, LanceDB for vectors, sentence-transformers for embeddings, Ollama for local LLM processing, FastMCP for the protocol server.

Comments URL: https://news.ycombinator.com/item?id=44840145