Open Source AI

Nano Collective

Creating powerful, privacy-first AI tools, developed by the community for the community

1,363 stars
30 contributors
200 PRs
178 members

Our Mission

We believe AI is too powerful to be in the hands of big corporations alone. Everyone should have access to advanced AI tools that respect privacy, run locally, and are shaped by the community. Everything we build is open source, transparent, and designed to empower developers and users alike.

Privacy First
Your data should stay on your machine. We're building privacy-first architectures to ensure complete control.
Community Driven
Built by developers, for developers. Truly open source and transparent from day one.
New Capabilities
We're building the next generation of AI tools that run locally and offline. Powerful, flexible, and private.
Featured Project

Nanocoder

A beautiful privacy-first coding agent running in your terminal

█▄ █ ▄▀█ █▄ █ █▀█ █▀▀ █▀█ █▀▄ █▀▀ █▀█
█ ▀█ █▀█ █ ▀█ █▄█ █▄▄ █▄█ █▄▀ ██▄ █▀▄
✱ Welcome to Nanocoder 1.22.4
Tips for getting started:
1. Use natural language to describe what you want to build.
2. Ask for file analysis, editing, bash commands and more.
3. Be specific as you would with another engineer for best results.
4. Type /exit or press Ctrl+C to quit.
/help for help
Status
CWD: /nano-collective/nanocoder
Config: /agents.config.json
Provider: Ollama, Model: devstral-small-2:24b
Theme: Tokyo Night
↳ Using AGENTS.md. Project initialized
✓ Preferences loaded
✓ 4 custom commands loaded
✓ LSP: 1/1 connected
What would you like me to help with?
normal mode on (Shift+Tab to cycle)
Multi-Provider Support
Works with OpenAI-style APIs, local models (Ollama, LM Studio), and cloud providers (OpenRouter)
Advanced Tool System
Built-in file operations and command execution, extensible via Model Context Protocol (MCP)
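As a rough illustration of how the provider and MCP settings come together, here is a minimal sketch of what the agents.config.json referenced in the status panel above might contain. All field names (providers, baseUrl, mcpServers, and so on) are assumptions made for illustration; consult the Nanocoder documentation for the actual schema.

  {
    "providers": [
      { "name": "ollama", "baseUrl": "http://localhost:11434", "model": "devstral-small-2:24b" },
      { "name": "openrouter", "apiKey": "sk-or-...", "model": "anthropic/claude-3.5-sonnet" }
    ],
    "mcpServers": [
      { "name": "filesystem", "command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "."] }
    ]
  }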
Custom Commands
Create markdown-based custom prompts with template variables and namespace support
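A custom command is just a markdown file; a sketch of what one might look like is below. The frontmatter keys and the {{file}} template syntax are assumptions for illustration, not the documented format.

  ---
  name: review
  description: Ask the agent to review a file for bugs and style issues
  ---
  Review {{file}} and report:
  1. Potential bugs or unhandled edge cases
  2. Style issues and opportunities to simplify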
Enhanced UX
Smart autocomplete, configurable logging, and development mode toggles for the best experience
Other Tools

Nanotune

Fine-tune and optimize your AI models for better performance

Nanotune CLI demonstration
Model Fine-tuning
macOS
No YAML configs or complex flags. Just an interactive CLI that guides you through LoRA fine-tuning on your own data. Add training examples, validate data, and train with live progress display—all locally and privately.
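For context, a LoRA training example is usually a prompt/response pair. The chat-style JSONL record below is a common shape for such data, not necessarily the exact schema Nanotune uses internally.

  {"messages": [{"role": "user", "content": "Summarize this changelog entry: ..."}, {"role": "assistant", "content": "Adds offline model caching and fixes a config parsing bug."}]}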
Export & Benchmark
GGUF
Export trained models to GGUF format, with the required llama.cpp binaries handled automatically. Run benchmarks with detailed timing metrics (TTFT, tokens/sec) and hardware presets ranging from low to ultra performance tiers.
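Both metrics fall out of simple timestamp arithmetic: TTFT is the delay between sending the request and receiving the first token, and tokens/sec is the number of generated tokens divided by the elapsed generation time. The TypeScript sketch below is illustrative only, not Nanotune's internal code; note that some tools divide by decode time (last token minus first token) rather than total time.

  // Illustrative only: deriving TTFT and tokens/sec from three timestamps.
  function benchmarkMetrics(requestSentMs: number, firstTokenMs: number, lastTokenMs: number, tokensGenerated: number) {
    const ttftMs = firstTokenMs - requestSentMs;                // time to first token
    const totalSeconds = (lastTokenMs - requestSentMs) / 1000;  // total generation time
    const tokensPerSec = tokensGenerated / totalSeconds;        // throughput
    return { ttftMs, tokensPerSec };
  }

  // Example: first token after 180 ms, 512 tokens finished at 6.2 s → ~82.6 tokens/sec.
  console.log(benchmarkMetrics(0, 180, 6200, 512));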

Featured Packages

Lightweight utilities built by the community

get-md
TypeScript
A fast, lightweight HTML to Markdown converter optimized for LLM consumption. Clean, well-structured markdown with intelligent content extraction.
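A hypothetical usage sketch follows; the actual export name and options may differ from the published get-md package, so treat this as a shape rather than the real API.

  // Hypothetical API: check the get-md README for the real import and options.
  import { getMarkdown } from "get-md";

  const html = await fetch("https://example.com/article").then((r) => r.text());
  const markdown = getMarkdown(html); // clean, LLM-ready markdown from raw HTML
  console.log(markdown);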

Get Involved

We welcome contributions in code, documentation, design, and marketing. Join our community and help shape the future of privacy-first AI tools.