raindom (thor314/raindom)

An RNG-LLM wrapper. Truly the intersection of cryptography and AI.

Raindom is a CLI tool that generates random numbers using an LLM (Large Language Model) backend: it asks a local model, served by Ollama, to pick a number.

Prerequisites

  • Rust toolchain (install from rustup.rs)
  • Ollama (see installation instructions below)

Installing Ollama

macOS

brew install ollama
ollama serve

Linux

We provide a setup script that installs Ollama and configures it as a systemd service:

# Run setup
# Warning: we need sudo to create ollama as a service, but you should probably not trust
# random parodies on the internet with sudo without at least looking at the script.
chmod +x setup-ollama.sh
sudo ./setup-ollama.sh

The script will:

  • Install Ollama
  • Configure it as a systemd service
  • Pull required models
  • Start the service automatically
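The actual script ships in the repository as setup-ollama.sh. For illustration only, here is a minimal systemd unit of the kind such a script typically installs (the binary path and service user are assumptions, not taken from the script):

```ini
# /etc/systemd/system/ollama.service (illustrative sketch)
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
# Path and user are assumptions; adjust to match your install.
ExecStart=/usr/local/bin/ollama serve
User=ollama
Restart=always

[Install]
WantedBy=default.target
```

After writing a unit like this, `sudo systemctl daemon-reload && sudo systemctl enable --now ollama` would register and start it.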

Installing Raindom

# Clone the repository
git clone https://github.com/thor314/raindom.git
cd raindom

# Build and install. This puts the raindom cli in your path.
# The author will not upload this to crates.io, as this is not a serious project.
cargo install --path .

Usage

Raindom has three modes of operation:

  1. Generate a random number between 0 and 10:
raindom
  2. Generate a random number between 0 and max:
raindom 100  # Generates a number between 0-100
  3. Generate a random number between min and max:
raindom 50 100  # Generates a number between 50-100
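The three modes above boil down to resolving a (min, max) range from the argument count. A minimal sketch of that dispatch in Rust (hypothetical; the real parsing in the raindom source may differ):

```rust
use std::env;

/// Resolve (min, max) from CLI arguments, mirroring the three modes above.
/// Illustrative sketch only; not the actual raindom implementation.
fn parse_range(args: &[String]) -> Result<(u64, u64), String> {
    match args {
        // `raindom` -> default range 0..=10
        [] => Ok((0, 10)),
        // `raindom 100` -> 0..=100
        [max] => {
            let max = max.parse::<u64>().map_err(|e| e.to_string())?;
            Ok((0, max))
        }
        // `raindom 50 100` -> 50..=100
        [min, max] => {
            let min = min.parse::<u64>().map_err(|e| e.to_string())?;
            let max = max.parse::<u64>().map_err(|e| e.to_string())?;
            if min > max {
                return Err("min must not exceed max".into());
            }
            Ok((min, max))
        }
        _ => Err("usage: raindom [max | min max]".into()),
    }
}

fn main() {
    let args: Vec<String> = env::args().skip(1).collect();
    match parse_range(&args) {
        Ok((min, max)) => println!("generating a number in {min}..={max}"),
        Err(e) => eprintln!("{e}"),
    }
}
```

Slice patterns make the zero-, one-, and two-argument cases read directly off the match arms.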

Managing Ollama Service (Linux)

If you used our setup script, you can manage Ollama using systemctl:

# Check status
sudo systemctl status ollama

# Stop service
sudo systemctl stop ollama

# Start service
sudo systemctl start ollama

# Restart service
sudo systemctl restart ollama

# View logs
journalctl -u ollama -f

Troubleshooting

  1. If you get a connection error:

    • Check if Ollama is running:
      curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "hi"}'
    • On Linux, check service status:
      sudo systemctl status ollama
  2. If you get model-related errors:

    • Ensure the Mistral model is installed:
      ollama pull mistral
  3. For other issues, check Ollama logs:

    • Linux: journalctl -u ollama -f
    • macOS/Windows: Check the terminal where ollama serve is running

How it Works

Raindom uses the Ollama API to query a local LLM (Mistral by default) to generate random numbers.
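A model queried over /api/generate replies in free text rather than with a bare number, so the reply has to be parsed. A sketch of one way to extract the first integer from such a reply (an assumption for illustration; the real extraction logic in raindom may differ):

```rust
/// Pull the first base-10 integer out of the model's free-text reply.
/// Illustrative sketch; not necessarily how raindom parses replies.
fn first_int(reply: &str) -> Option<u64> {
    // Locate the first ASCII digit, then consume the contiguous digit run.
    let start = reply.find(|c: char| c.is_ascii_digit())?;
    let digits: String = reply[start..]
        .chars()
        .take_while(|c| c.is_ascii_digit())
        .collect();
    digits.parse().ok()
}

fn main() {
    // LLMs tend to wrap answers in prose, so parsing must tolerate it.
    let reply = "Sure! Here is a random number: 42.";
    println!("{:?}", first_int(reply));
}
```

Anything the model says around the number (greetings, punctuation, explanations) is ignored; a reply with no digits at all yields None, which a caller could treat as a retry condition.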

This innovation was made possible only by the work of cryptographers and AI scientists.

This project mainly exists to lightly deride "the intersection of Cryptography and AI".

License

Licensed at your option under either:

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Acknowledgments
