A private-by-default expert panel in a box for researchers, independent thinkers, and R&D teams.
Ask a raw research question. The R.A.I.N. Lab assembles multiple expert perspectives, grounds strong claims in papers or explicit evidence, and returns the strongest explanations, disagreements, and next moves.
Most tools help you find papers. R.A.I.N. Lab helps you think with a room full of experts.
James is the assistant inside the R.A.I.N. Lab.
- Turns one research question into a four-agent meeting with distinct constraints
- Grounds strong claims in papers or explicit evidence instead of smooth talk
- Forces the strongest counter-argument onto the page before you commit to an idea
- Ends with concrete next moves, not just a stack of links
| You need to... | Typical research tool | R.A.I.N. Lab |
|---|---|---|
| Pressure-test a claim | Surfaces papers that seem relevant and stops there | Has four agents attack the claim from evidence, hardware, geometry, and formal logic |
| Understand a new field | Gives you a search result page | Maps agreements, open questions, and where the literature still fights itself |
| Decide what to read next | Hands you a pile of citations | Ends with the paper, experiment, or measurement most likely to validate or kill the current idea |
| Validate a stimulus against brain science | Sends you into a separate neuroscience workflow | Runs TRIBE v2 on video, audio, or text inside the same research meeting |
| Keep the work private | Assumes a hosted stack | Runs locally with LM Studio or Ollama, no cloud calls, no telemetry, no data sharing |
Each agent has a distinct voice, expertise, and set of constraints they bring to every question.
| Agent | Role | How They Think |
|---|---|---|
| James | Lead Scientist | Draws from your research papers directly. Cites metrics. Says when data is missing. |
| Jasmine | Hardware Architect | Reality-checks everything against real material constraints. If it can't be built, she knows why. |
| Luca | Field Tomographer | Sees geometric patterns others miss. Makes intuitive leaps, then looks for the math to ground them. |
| Elena | Quantum Information Theorist | Demands formal rigor. Runs logical verification. Catches errors everyone else misses. |
Each agent's full personality, reasoning principles, and conversation style are defined in their SOUL file:
- JAMES_SOUL.md — Lead Scientist
- JASMINE_SOUL.md — Hardware Architect
- LUCA_SOUL.md — Field Tomographer
- ELENA_SOUL.md — Quantum Information Theorist
The SOUL files are part of the product. They're what make the agents feel like colleagues rather than search results.
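The SOUL schema is defined by those files themselves; purely as an illustration (section names hypothetical, not copied from the repo), a file like JAMES_SOUL.md might be organized around the three ingredients above:

```markdown
# James — Lead Scientist

## Personality
Measured, evidence-first; admits uncertainty openly.

## Reasoning Principles
- Cite a paper or metric for every strong claim
- Say explicitly when data is missing

## Conversation Style
Short, direct turns; hands off to other agents by name.
```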
R.A.I.N. Lab includes a TRIBE v2 integration that predicts fMRI brain activation patterns from video, audio, or text. Feed a stimulus to the model and James returns predicted cortical response maps, letting you run basic neuroscience experiments from inside a research meeting.
| Capability | Detail |
|---|---|
| Input | video file, audio file, or raw text |
| Output | predicted fMRI activation patterns across 20,484 cortical vertices |
| Use case | validate whether a stimulus (image, sound, phrase) actually engages the brain regions your hypothesis claims |
| Runtime | sidecar service in tools/tribev2_sidecar/ wrapping Facebook Research's TRIBE v2 model |
Want to wire it into your own workflow? Read the TRIBE v2 sidecar README.
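The sidecar's actual request/response contract lives in its README; as a hedged sketch (the flat per-vertex list is an assumed response shape, not a documented one), here is how you might reduce a predicted activation map over the 20,484 cortical vertices to a few headline numbers:

```python
import statistics

N_VERTICES = 20_484  # one predicted fMRI value per cortical vertex

def summarize_activation(pred: list[float], top_k: int = 5) -> dict:
    """Reduce a per-vertex activation map to summary stats and hotspots."""
    if len(pred) != N_VERTICES:
        raise ValueError(f"expected {N_VERTICES} values, got {len(pred)}")
    ranked = sorted(range(N_VERTICES), key=lambda i: pred[i], reverse=True)
    return {
        "mean": statistics.fmean(pred),
        "max": pred[ranked[0]],
        "top_vertices": ranked[:top_k],  # candidate vertices to inspect
    }

# Synthetic map: all zeros except one strongly activated vertex
fake = [0.0] * N_VERTICES
fake[123] = 3.5
print(summarize_activation(fake)["top_vertices"][0])  # → 123
```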
TRIBE v2 license: CC-BY-NC 4.0 — non-commercial use only.
TOPIC: Could acoustic interference patterns guide molecular assembly the way DNA guides cell growth?
James: The frequency comb geometry you're describing — those spectral spacings match what I saw in the DRR paper. If the phononic field creates a standing wave at the right wavelength, you could theoretically anchor molecular complexes at node points. But that's a big "theoretically."
Jasmine: I love the ambition, but let's reality-check. Piezo actuators hit degradation limits around 100 kHz under continuous load. And thermal drift at sub-micron precision? Where's the energy budget?
Luca: The coherence depth function C_α(u) describes how deep pattern information persists in an oscillatory field. If the field is structured well enough, you don't need continuous high-frequency drive. The shape encodes the instructions.
Elena: Elegance doesn't satisfy Landauer's bound. The energy density required for your coherence depth at that spatial resolution is roughly 10⁴⁵ joules — a dozen orders of magnitude beyond what you're describing. verify_logic() on the combined constraints returns satisfiable, but satisfiable isn't plausible.
James: Next step is a targeted literature search on low-energy coherent phonon generation. Jasmine, pull real actuator specs and run the thermal numbers against Luca's geometry. I want a paper that either validates or kills this at the 10 µm scale.
Read the full exchange
James: Luca, the frequency comb geometry you're describing — those spectral spacings match what I saw in the DRR paper. If the phononic field creates a standing wave at the right wavelength, you could theoretically anchor molecular complexes at node points. But that's a big "theoretically."
Jasmine: Okay, pause. I love the ambition, but let's reality-check. Piezo actuators hit degradation limits around 100 kHz under continuous load. And thermal drift at sub-micron precision? We'd need active cooling loops that consume more power than the assembly process produces. Where's the energy budget?
Luca: Jasmine, you're right about the constraints — but watch the geometry. The coherence depth function C_α(u) describes how deep pattern information persists in an oscillatory field. If the field is structured well enough, you don't need continuous high-frequency drive. The self-organizing phase could do the work. The shape encodes the instructions.
Elena: Luca, I appreciate the elegance, but elegance doesn't satisfy Landauer's bound. The energy density required for your coherence depth at that spatial resolution is roughly 10⁴⁵ joules. That's a dozen orders of magnitude beyond what you're describing. James, the DRR paper — did it actually claim feasible energy scales, or was that extrapolation?
James: ...it was an extrapolation. The paper showed the scaling relationship, not the absolute numbers. Elena's right to flag that. The thermal management question is also open — the DRR model assumes heat dissipates linearly, but at those frequencies it doesn't.
Jasmine: So we're looking at a mechanism that might work in principle but needs experimental validation at scales we can't reach yet. That's not a technology. That's a hypothesis.
Luca: I disagree with the framing. A hypothesis with a clear geometry is further along than a vague intuition. We've narrowed the design space. The question now is whether the self-organizing phase can compensate for the power constraints Jasmine raised.
Elena: And the answer to that is: we don't know. verify_logic() on the combined constraints returns satisfiable — so it's not ruled out. But satisfiable isn't plausible.
James: Next step is probably a targeted literature search on low-energy coherent phonon generation. Jasmine, can you pull real actuator specs and run the thermal numbers against Luca's geometry? I want a paper that either validates or kills this at the 10 µm scale.
Jasmine: I can do that. Luca, send me the geometry parameters.
Luca: Will do.
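Elena's appeal to Landauer's bound refers to a real physical limit: erasing one bit of information at temperature T costs at least k_B·T·ln 2 of energy. A quick sanity check of that per-bit floor at room temperature (the 10⁴⁵ J figure in the dialogue is the panel's own estimate for the full scenario, not reproduced here):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_bound(temp_kelvin: float, bits: float = 1.0) -> float:
    """Minimum energy in joules to erase `bits` bits at a given temperature."""
    return bits * K_B * temp_kelvin * math.log(2)

e_bit = landauer_bound(300.0)
print(f"{e_bit:.3e} J per bit at 300 K")  # ~2.871e-21 J
```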
Live demo: rainlabteam.vercel.app — no setup required
Requires: Python 3.12+, uv (recommended) or pip, and optionally a local model via LM Studio or Ollama. Rust is optional and only needed for the ZeroClaw runtime layer.
On your machine:
```bash
python rain_lab.py
```

Press Enter for demo mode, or connect to LM Studio / Ollama for full local operation.
On Windows: double-click INSTALL_RAIN.cmd to create shortcuts.
On macOS/Linux: run ./install.sh.
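LM Studio and Ollama both expose OpenAI-compatible HTTP APIs on well-known default ports (1234 and 11434 respectively). A small standalone helper — not part of rain_lab.py — to check which backend is reachable before connecting:

```python
import urllib.request
import urllib.error

# Default OpenAI-compatible base URLs for the two supported local backends
DEFAULT_ENDPOINTS = {
    "lmstudio": "http://localhost:1234/v1",
    "ollama": "http://localhost:11434/v1",
}

def reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if the backend answers the /models listing endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    for name, url in DEFAULT_ENDPOINTS.items():
        print(f"{name}: {'up' if reachable(url) else 'down'} at {url}")
```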
From source (macOS / Linux):

```bash
git clone https://github.com/topherchris420/james_library.git
cd james_library
uv python install 3.12
uv venv .venv --python 3.12
uv pip sync --python .venv/bin/python requirements-dev-pinned.txt
uv run --python .venv/bin/python rain_lab.py --mode first-run
```

From source (Windows):
```powershell
git clone https://github.com/topherchris420/james_library.git
cd james_library
uv python install 3.12
uv venv .venv --python 3.12
uv pip sync --python .venv\Scripts\python.exe requirements-dev-pinned.txt
uv run --python .venv\Scripts\python.exe rain_lab.py --mode first-run
```

R.A.I.N. Lab is built for people who need answers that hold up under scrutiny, not just answers that sound good.
| Role | What you can do with R.A.I.N. Lab |
|---|---|
| Researchers and analysts | Compare competing hypotheses, preserve disagreement, and keep auditable reasoning trails |
| Founders and product leads | Stress-test strategic decisions through structured debate before committing roadmap or budget |
| Operators and technical teams | Turn messy discussions into verifiable outputs that can be reviewed, shared, and replayed |
| Resource | Links |
|---|---|
| Docs | Start Here -- Beginner Guide -- One-Click Install -- Troubleshooting |
| Papers | Research Archive |
| Language | 简体中文 -- 日本語 -- Русский -- Français -- Tiếng Việt |
Architecture, extension points, contribution
R.A.I.N. Lab is a Rust-first autonomous agent runtime with a Python orchestration layer.
- `src/` — Rust core: trait-driven provider/channel/tool/peripheral system
- `crates/` — ZeroClaw runtime components
- `python/` — Python orchestration, RAIN Lab meeting logic, agent souls
- `agents.py` — Agent factory and delegation
Extend by implementing traits and registering in factory modules:
- `src/providers/traits.rs` — Add a new model provider
- `src/channels/traits.rs` — Add a new messaging channel
- `src/tools/traits.rs` — Add a new tool
- `src/memory/traits.rs` — Add a memory backend
- `src/peripherals/traits.rs` — Add hardware board support
- `tools/tribev2_sidecar/server.py` — Extend TRIBE v2 sidecar serving and stimulus adapters
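The implement-a-trait-then-register pattern above lives in Rust, but the same shape can be sketched in Python terms for orientation (all names here are hypothetical — check agents.py for the real registry):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    role: str
    soul_file: str  # path to the agent's SOUL definition

# Hypothetical registry mirroring the factory-module pattern
AGENT_FACTORY: dict[str, Callable[[], Agent]] = {}

def register(key: str):
    """Decorator that adds an agent constructor to the factory."""
    def wrap(ctor: Callable[[], Agent]) -> Callable[[], Agent]:
        AGENT_FACTORY[key] = ctor
        return ctor
    return wrap

@register("statistician")
def make_statistician() -> Agent:
    # A fifth panelist added without touching the core meeting loop
    return Agent("Priya", "Statistician", "PRIYA_SOUL.md")

agent = AGENT_FACTORY["statistician"]()
print(agent.name)  # → Priya
```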
```bash
ruff check .
pytest -q
cargo fmt --all -- --check
cargo clippy --all-targets -- -D warnings
cargo test
```

The codebase follows KISS, YAGNI, DRY (rule of three), SRP/ISP, fail-fast, secure-by-default, and reversible changes. See ARCHITECTURE.md and CLAUDE.md for the full contract.
Special thanks to the ZeroClaw team for the Rust runtime engine that powers R.A.I.N. Lab under the hood. The performance, stability, and extensibility of the agent runtime wouldn't be possible without their foundational work. See the crates/ directory for ZeroClaw runtime components.
License: MIT -- Vers3Dynamics

