An RNG-LLM wrapper. Truly the intersection of cryptography and AI.
Raindom is a CLI tool that generates random numbers using a Large Language Model (LLM) backend, querying local models served by Ollama.
- Rust toolchain (install from rustup.rs)
- Ollama (see installation instructions below)
brew install ollama
ollama serve
We provide a setup script that installs Ollama and configures it as a systemd service:
# Run setup
# Warning: we need sudo to create ollama as a service, but you should probably not trust
# random parodies on the internet with sudo without at least looking at the script.
chmod +x setup-ollama.sh
sudo ./setup-ollama.sh
The script will:
- Install Ollama
- Configure it as a systemd service
- Pull required models
- Start the service automatically
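For reference, the systemd unit such a script typically installs looks roughly like the sketch below. The unit name, binary path, and service user here are assumptions for illustration, not a copy of what setup-ollama.sh actually writes; read the script itself to see the real configuration.

```ini
# Hypothetical /etc/systemd/system/ollama.service
[Unit]
Description=Ollama LLM server
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Restart=always

[Install]
WantedBy=multi-user.target
```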
# Clone the repository
git clone https://github.com/yourusername/raindom.git
cd raindom
# Build and install. This puts the raindom cli in your path.
# The author will not upload this to crates.io, as this is not a serious project.
cargo install --path .
Raindom has three modes of operation:
- Generate a random number between 0 and 10:
raindom
- Generate a random number between 0 and max:
raindom 100 # Generates number between 0-100
- Generate a random number between min and max:
raindom 50 100 # Generates number between 50-100
If you used our setup script, you can manage Ollama using systemctl:
# Check status
sudo systemctl status ollama
# Stop service
sudo systemctl stop ollama
# Start service
sudo systemctl start ollama
# Restart service
sudo systemctl restart ollama
# View logs
journalctl -u ollama -f
If you get a connection error:
- Check if Ollama is running:
curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "hi"}'
- On Linux, check service status:
sudo systemctl status ollama
If you get model-related errors:
- Ensure the Mistral model is installed:
ollama pull mistral
For other issues, check Ollama logs:
- Linux:
journalctl -u ollama -f
- macOS/Windows: Check the terminal where ollama serve is running
Raindom uses the Ollama API to query a local LLM (Mistral by default) to generate random numbers.
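Under the hood, a generation request presumably looks something like the sketch below. The exact prompt wording and the use of the non-streaming /api/generate endpoint are assumptions about raindom's internals, not taken from its source.

```shell
# Sketch of the kind of request raindom likely sends to Ollama; the
# prompt wording is an assumption, not raindom's actual prompt.
# With `ollama serve` running, the call would be:
#
#   curl -s http://localhost:11434/api/generate -d '{
#     "model": "mistral",
#     "prompt": "Reply with only a random integer between 50 and 100.",
#     "stream": false
#   }'
#
# Ollama answers with JSON; the model's text is in the "response" field.
# Extracting the number from a sample reply:
echo '{"model":"mistral","response":"73","done":true}' \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["response"])'
```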
This innovation was only made possible by the work of cryptographers and AI scientists.
This project mainly exists to lightly deride "the intersection of Cryptography and AI".
Licensed under either of the following, at your option:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.