Use this content when creating your Play Store listing in Google Play Console.
Neurix - Offline AI Chat
Run AI models on your phone. No internet needed. No cloud. Fully private.
Neurix lets you run AI language models directly on your phone — completely offline, completely private.
No internet required. No accounts. No subscriptions. No data leaves your device. Ever.
HOW IT WORKS
1. Open the app and browse the model store
2. Download a model over Wi-Fi (one-time download, 380 MB to 2.2 GB)
3. Chat with your AI assistant — works offline forever after download
AVAILABLE MODELS
Choose from 8 curated models from leading AI companies:
• Qwen 2.5 (Alibaba) — 0.5B, 1.5B, and 3B variants. Great multilingual support
• Llama 3.2 (Meta) — 1B and 3B. Strong chat and reasoning
• SmolLM2 (Hugging Face) — 1.7B. Balanced performance
• Gemma 2 (Google) — 2B. Optimized for on-device use
• Phi-3.5 Mini (Microsoft) — 3.8B. Excellent for code
All models ship with 4-bit (Q4) quantization for a strong balance of quality and download size.
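The Q4 tradeoff can be sanity-checked with back-of-envelope math: at roughly 4.5 bits per weight (4-bit values plus per-block scale metadata, a typical figure for Q4 variants), file size scales linearly with parameter count. A hedged sketch — the numbers are illustrative estimates, not exact GGUF sizes; small models in particular tend to run larger because their embedding tables are a bigger fraction of the weights and are often kept at higher precision:

```rust
/// Rough on-disk size of a Q4-quantized model, assuming ~4.5 bits
/// per weight (4-bit values + per-block quantization scales).
/// Illustrative only — real GGUF files include headers, tokenizer
/// metadata, and mixed-precision tensors.
fn approx_q4_size_mb(params_billions: f64) -> f64 {
    let bits_per_weight = 4.5;
    params_billions * 1e9 * bits_per_weight / 8.0 / (1024.0 * 1024.0)
}

fn main() {
    // A 0.5B model lands near the low end of the listed download range,
    // a 3.8B model near the ~2.2 GB end.
    println!("0.5B ~ {:.0} MB", approx_q4_size_mb(0.5));
    println!("3.8B ~ {:.0} MB", approx_q4_size_mb(3.8));
}
```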
WHY OFFLINE?
Most AI apps require an internet connection and send your data to remote servers. In areas with limited connectivity — while traveling, in a rural region, on a flight — you lose access to AI tools entirely.
Neurix solves this. Download a model once, use it anywhere. On a mountain trail, in a village with no cell tower, or simply when you want your conversations to stay private.
FEATURES
✓ On-device inference — AI runs on your phone's CPU
✓ 8 models from Meta, Google, Microsoft, Alibaba, Hugging Face
✓ Works fully offline after model download
✓ Private by design — zero data collection
✓ Chat history saved locally
✓ Custom system prompt
✓ Adjustable temperature, top-p, and token limits
✓ Pause and resume model downloads
✓ Markdown rendering with code blocks
✓ Copy and share conversations
✓ No account required
BUILT FOR PRIVACY
• No analytics or tracking
• No telemetry
• No ads
• No account creation
• All processing on your hardware
• Conversations stored only on your device
TECHNICAL DETAILS
Built with Tauri 2.0 and Rust for a minimal memory footprint. Inference uses Candle (Hugging Face's Rust ML framework) with GGUF quantized models. The entire app is under 15 MB before model downloads.
TIPS
• Larger models (3B) give better answers but are slower
• Smaller models (0.5B-1B) respond faster but with less depth
• Start with Llama 3.2 1B as your first model
• Lower the temperature in settings for more focused answers
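To see why lowering the temperature focuses answers, here is a self-contained sketch of how temperature and top-p shape next-token selection, using made-up logits (an illustration of the general sampling math, not the app's code):

```rust
/// Convert logits into probabilities after temperature scaling.
/// Lower temperature sharpens the distribution toward the top token;
/// higher temperature flattens it, making output more varied.
fn softmax(logits: &[f64], temperature: f64) -> Vec<f64> {
    let scaled: Vec<f64> = logits.iter().map(|l| l / temperature).collect();
    let max = scaled.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = scaled.iter().map(|l| (l - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

/// Top-p (nucleus) filtering: keep the smallest set of tokens whose
/// cumulative probability reaches `top_p`; the rest are never sampled.
fn top_p_filter(probs: &[f64], top_p: f64) -> Vec<usize> {
    let mut idx: Vec<usize> = (0..probs.len()).collect();
    idx.sort_by(|&a, &b| probs[b].partial_cmp(&probs[a]).unwrap());
    let mut kept = Vec::new();
    let mut cum = 0.0;
    for i in idx {
        kept.push(i);
        cum += probs[i];
        if cum >= top_p {
            break;
        }
    }
    kept
}

fn main() {
    let logits = [2.0, 1.0, 0.1, -1.0]; // toy scores for 4 tokens
    let focused = softmax(&logits, 0.3); // low temperature: decisive
    let creative = softmax(&logits, 1.5); // high temperature: diverse
    println!("t=0.3 top prob: {:.2}", focused[0]);
    println!("t=1.5 top prob: {:.2}", creative[0]);
    println!("tokens kept at top-p 0.9: {:?}", top_p_filter(&focused, 0.9));
}
```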
Free. Open source. No strings attached.
CATEGORY
Tools

CONTENT RATING
Everyone (no violent, sexual, or objectionable content)

KEYWORDS
offline ai, local llm, private ai, chat ai, no internet ai, on device ai, language model, ai assistant, offline chat, privacy ai
Email: (your support email)
Website: https://github.com/Razee4315/neurix
Privacy Policy: https://github.com/Razee4315/neurix/blob/main/PRIVACY_POLICY.md
You need to create a 1024x500 feature graphic (banner image). Suggested design:
- Dark background (#0A0A0B) matching the app theme
- Neurix logo centered
- Tagline: "Your AI. Your phone. No cloud."
- Subtle cyan accent (#00F0FF) elements
You can use Canva, Figma, or any design tool. Keep it simple and clean.
You already have 6 screenshots in the /Screenshots folder:
- Model_Store.jpeg — Model Store
- Download_page.jpeg — Downloading
- Chat.jpeg — Chat
- Chat_History.jpeg — Chat History
- Setting_Page_1.jpeg — Settings
- Setting_page_2.jpeg — Inference Settings
Play Store requires minimum 2 screenshots, maximum 8. All 6 are good to upload.