# Lumen
{ "createdAt": "2025-02-28T09:12:22Z", "defaultBranch": "main", "description": "The official CLI for the Lumen Protocol & Local Prompt Generation.", "fullName": "far3x/lumen", "homepage": "https://lumen.onl/", "language": "Python", "name": "lumen", "pushedAt": "2025-10-29T23:02:37Z", "stargazersCount": 71, "topics": [ "ai", "client", "code", "lumen", "reward" ], "updatedAt": "2025-11-15T11:07:25Z", "url": "https://github.com/far3x/lumen"}
The official CLI for the Lumen Protocol & Local Prompt Generation.
CA: `BkpaxHhE6snExazrPkVAjxDyZa8Nq3oDEzm5GQm2pump`
## Table of Contents

- [Why Lumen?](#why-lumen)
- [Features](#features)
- [Prerequisites](#prerequisites)
- [Installation & Troubleshooting](#installation--troubleshooting)
- [Commands](#commands)
  - [Network Commands](#network-commands)
  - [Local Prompt Generation](#local-prompt-generation)
- [Configuration](#configuration)
- [Documentation](#documentation)
- [Contributing](#contributing)
- [License](#license)
## Why Lumen?
Lumen is a dual-purpose CLI for developers. It began as a local tool that eliminates the tedious, manual work of building LLM context from a codebase, and it has evolved into a gateway for developers to ethically contribute to the AI data economy.
- **A Best-in-Class Local Prompt Helper:** A 100% private utility for your daily AI-assisted development.
- **A Gateway to the Data Economy:** A secure bridge to the Lumen Protocol, allowing developers to ethically contribute their anonymized code and earn rewards.
If you find the local tools useful, please consider starring the repository!
## Features
- **Network Interaction:** Securely contribute your anonymized code to the Lumen Protocol and track your submission history.
- **Local Prompt Generation:** Assemble an entire codebase into a single, LLM-ready prompt without sending any data.
- **100% Local Anonymization:** All code sanitization for protocol contributions happens on your machine. Your raw code is never uploaded.
- **Smart File Handling:** Respects `.gitignore`, ignores dotfiles, parses Jupyter Notebooks (`.ipynb`) locally, and uses a custom, optimized file-reading strategy.
- **GitHub Repository Support:** Analyze any public GitHub repository directly by providing its URL.
- **Token Usage Analysis:** Identify the most token-heavy files in a project to manage context window limitations.
- **Customizable Filtering:** Use the CLI or edit a simple `config.json` file to control which files, folders, and file types are processed.
## Prerequisites
- **Python 3.7 or higher:** Check with `python --version`.
- **Git:** Required only for analyzing GitHub repositories (the `-g` flag). Check with `git --version`.
## Installation & Troubleshooting
Install directly from PyPI:
```bash
pip install pylumen
```

To upgrade to the latest version:

```bash
pip install --upgrade pylumen
```

### Troubleshooting `command not found: lum`
This occurs when the pip scripts directory is not in your system's `PATH`.
- **Quick Fix:** Run the tool as a Python module: `python -m lum --version`.
- **Permanent Fix (Recommended):**
  - **macOS/Linux:** Find your Python scripts path (often `~/.local/bin`) and add it to your shell configuration (`~/.zshrc`, `~/.bashrc`): `export PATH="$HOME/.local/bin:$PATH"`. Restart your terminal. See the sketch after this list.
  - **Windows:** Reinstall Python and ensure the "Add Python to PATH" checkbox is selected.
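As an illustration, here is a minimal sketch of the permanent fix on macOS/Linux. It assumes a user-level `pip` install and a `zsh` shell; the exact scripts directory can differ by platform and Python version.

```bash
# Print the user base directory; the CLI scripts live in its "bin" subfolder
# (commonly ~/.local/bin on Linux, ~/Library/Python/3.x/bin on macOS)
python -m site --user-base

# Add that bin directory to your PATH and reload the shell configuration
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc

# Verify that the CLI is now resolvable
lum --version
```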
## Commands
### Network Commands

These commands interact with the Lumen Protocol backend.
**Authorize Device**
Initiates the secure login flow to link your CLI to a Lumen account.

```bash
lum login
```

**Contribute Code**
Analyzes, sanitizes, and submits the current project to the Lumen network.

```bash
lum contribute
```

**View History**
Displays the status of your last 10 contributions.

```bash
lum history
```

**De-authorize Device**
Logs out and securely removes the local authentication token.

```bash
lum logout
```
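As a quick illustration, a typical end-to-end network session might look like the following. Only the commands documented above are used; the project path is a placeholder.

```bash
# Link this machine to your Lumen account
lum login

# Sanitize and submit the current project from its root directory
cd /path/to/your/project
lum contribute

# Check the status of your recent submissions
lum history

# Remove the local authentication token when finished
lum logout
```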
### Local Prompt Generation

These commands do not send any data to the network.
**Analyze Current Directory**
Assembles the project into a prompt and copies it to your clipboard.

```bash
lum local
```

**Save Prompt to File**
Saves the prompt to a `.txt` file instead of copying it.

```bash
lum local -t my_project_prompt
```

**Analyze a GitHub Repository**
Clones a public repo to a temporary directory for analysis.

```bash
lum local -g https://github.com/user/repo-name
```

**Identify Token-Heavy Files**
Shows a leaderboard of the most token-consuming files.
```bash
# See the top 20 (default) files
lum local -l

# See the top 10 files
lum local -l 10
```
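For example, a simple local workflow might first check token usage and then generate the prompt. Whether these flags can be combined in a single invocation is not documented here, so they are shown as separate runs; the output file name is a placeholder.

```bash
# Check which files would dominate the context window
lum local -l 10

# Generate the prompt and save it to a .txt file
lum local -t my_project_prompt
```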
## Configuration

**Edit Configuration**
Opens `config.json` in your system's default text editor.

```bash
lum config --edit
```

**Reset Configuration**
Resets all settings in `config.json` to their default values.

```bash
lum config --reset
```

**Set a Specific Value**
Changes a single setting directly from the terminal.
```bash
# Enable a boolean setting
lum config --set use_ai_instructions true

# Overwrite a list (provide it as a comma-separated string)
lum config --set skipped_files ".DS_Store,yarn.lock"
```
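For reference, a minimal sketch of what `config.json` might contain, built only from the settings shown above. The actual file layout, key set, and default values may differ; run `lum config --edit` to inspect the real file.

```json
{
  "use_ai_instructions": true,
  "skipped_files": [".DS_Store", "yarn.lock"]
}
```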
## Documentation

For detailed documentation on the Lumen Protocol, including the valuation engine, security practices, and our long-term vision, please visit our official documentation site.
- Installation Guide
- CLI Authentication
- Protocol Valuation Engine
- Security by Design
- The Lumen Whitepaper
## Contributing

Contributions, issues, and feature requests are welcome! Please check the issues page and see `CONTRIBUTING.md` for details.
## License

This project is licensed under the MIT License. See the [LICENSE](./LICENSE) file for details.