8352769
feature: add embedded SDK support
ndizazzo Apr 9, 2026
331b451
style: fix rustfmt drift in embedded SDK branch
i386 Apr 10, 2026
97bdb78
fix: compile without prebuilt dashboard assets
i386 Apr 11, 2026
e0e05f8
Merge origin/main into feature/native-embedded-sdks
i386 Apr 11, 2026
ae2873e
feat: wire Swift SDK to mesh-ffi
i386 Apr 11, 2026
553c811
Add live SDK smoke coverage and disable mac CI runners
i386 Apr 12, 2026
7acd2cc
Clarify macOS CI as Linux cross-build
i386 Apr 12, 2026
e7e8490
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 12, 2026
4711784
ci: fold protocol and embedded client checks into main workflow
i386 Apr 12, 2026
658e056
docs: add sdk package readmes
i386 Apr 12, 2026
6afcb8e
refactor: rename bindings directory to sdk
i386 Apr 12, 2026
99d9b04
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 12, 2026
41764a2
fix: address embedded sdk review feedback
i386 Apr 12, 2026
a04463a
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 13, 2026
83a837e
fix: address pullrequestreview-4095777539 review feedback
Copilot Apr 13, 2026
fa71336
Merge main into feature/native-embedded-sdks
i386 Apr 14, 2026
313e057
fix: restore pinned GPU config merge state
i386 Apr 14, 2026
4c1efde
fix: gate pinned config sync for older peers
i386 Apr 14, 2026
b718471
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 14, 2026
31e9ae7
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 16, 2026
2822e60
Merge remote-tracking branch 'origin/main' into HEAD
i386 Apr 16, 2026
75d9850
Merge branch 'main' into pr-233-local
michaelneale Apr 17, 2026
8be5e18
Unify node.proto: single source of truth in mesh-llm/proto
michaelneale Apr 17, 2026
278ee78
Reject empty owner keypair in create_client instead of silently gener…
michaelneale Apr 17, 2026
c469a2a
Expose generate_owner_keypair_hex() over FFI and fix SDK smoke tests
michaelneale Apr 17, 2026
f660964
ci: align swift_sdk_smoke llama.cpp source with other CI jobs
michaelneale Apr 17, 2026
700b458
Merge remote-tracking branch 'origin/main' into pr-233-local
michaelneale Apr 17, 2026
8 changes: 8 additions & 0 deletions .cargo/config.toml
@@ -0,0 +1,8 @@
[target.aarch64-linux-android]
rustflags = ["-C", "link-arg=-Wl,-z,max-page-size=16384"]

[target.armv7-linux-androideabi]
rustflags = ["-C", "link-arg=-Wl,-z,max-page-size=16384"]

[target.x86_64-linux-android]
rustflags = ["-C", "link-arg=-Wl,-z,max-page-size=16384"]
182 changes: 182 additions & 0 deletions .github/workflows/ci.yml
@@ -96,6 +96,8 @@ jobs:

# ── Rust ──
- uses: dtolnay/rust-toolchain@stable
with:
targets: aarch64-linux-android

- uses: mozilla-actions/sccache-action@v0.0.9

@@ -114,6 +116,17 @@
working-directory: mesh-llm
run: cargo fmt -- --check

- name: Check embedded client dependency purity
run: |
cargo tree -p mesh-client --prefix=none --no-dedupe > /tmp/mesh-client-deps.txt
FORBIDDEN="keyring|rpassword|hf-hub|dirs|clap|include_dir|rmcp|keyring-core"
if grep -wE "^($FORBIDDEN)" /tmp/mesh-client-deps.txt; then
echo "ERROR: Forbidden dependency found in mesh-client"
grep -wE "^($FORBIDDEN)" /tmp/mesh-client-deps.txt
exit 1
fi
echo "PASS: No forbidden dependencies found"

# Build in dev/debug profile, NOT release. Two reasons:
# 1. Build + Unit tests steps share one target/ subdir and one
# incremental cache — zero duplicated codegen.
@@ -128,6 +141,11 @@
- name: Unit tests
run: cargo test --lib --tests

- name: Protocol compatibility matrix
run: |
cargo test -p mesh-llm --test protocol_compat_v0_client
cargo test -p mesh-llm --test protocol_convert_matrix

- name: Clippy
working-directory: mesh-llm
run: cargo clippy -- -D warnings || true
@@ -189,6 +207,170 @@ jobs:
cache_key_prefix: ''
workflow_cache_file: .github/workflows/ci.yml

native_sdk_smoke:
needs: [changes, linux, inference_smoke_tests]
if: ${{ needs.inference_smoke_tests.result == 'success' && (github.event_name == 'workflow_dispatch' || needs.changes.outputs.rust == 'true' || needs.changes.outputs.ui == 'true') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5

- uses: dtolnay/rust-toolchain@stable

- uses: Swatinem/rust-cache@v2
with:
workspaces: . -> target
prefix-key: ${{ env.CACHE_NAMESPACE }}-rust-${{ hashFiles('.github/cache-version.txt') }}

- name: Download Linux inference binaries
uses: actions/download-artifact@v7
with:
name: ci-linux-inference-binaries
path: ci-artifacts/linux

- name: Stage binaries for native SDK smoke
run: |
mkdir -p target/debug llama.cpp/build/bin
cp ci-artifacts/linux/target/debug/mesh-llm target/debug/mesh-llm
cp ci-artifacts/linux/llama.cpp/build/bin/rpc-server llama.cpp/build/bin/rpc-server
cp ci-artifacts/linux/llama.cpp/build/bin/llama-server llama.cpp/build/bin/llama-server
chmod +x target/debug/mesh-llm llama.cpp/build/bin/rpc-server llama.cpp/build/bin/llama-server

- name: Cache integration model
id: cache-model-native-sdk
uses: actions/cache@v5
with:
path: ~/.models/${{ env.MODEL_FILE }}
key: ${{ env.CACHE_NAMESPACE }}-${{ runner.os }}-model-${{ env.MODEL_FILE }}-${{ hashFiles('.github/cache-version.txt', '.github/workflows/ci.yml') }}

- name: Download integration model
if: steps.cache-model-native-sdk.outputs.cache-hit != 'true'
run: |
mkdir -p ~/.models
curl -fSL "$MODEL_URL" -o ~/.models/$MODEL_FILE
ls -lh ~/.models/$MODEL_FILE

- name: Native SDK smoke test
run: |
scripts/ci-native-sdk-smoke.sh \
target/debug/mesh-llm \
llama.cpp/build/bin \
~/.models/$MODEL_FILE

kotlin_sdk_smoke:
needs: [changes, linux, inference_smoke_tests]
if: ${{ needs.inference_smoke_tests.result == 'success' && (github.event_name == 'workflow_dispatch' || needs.changes.outputs.rust == 'true' || needs.changes.outputs.ui == 'true') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5

- uses: actions/setup-java@v5
with:
distribution: temurin
java-version: '21'

- uses: dtolnay/rust-toolchain@stable

- uses: Swatinem/rust-cache@v2
with:
workspaces: . -> target
prefix-key: ${{ env.CACHE_NAMESPACE }}-rust-${{ hashFiles('.github/cache-version.txt') }}

- name: Download Linux inference binaries
uses: actions/download-artifact@v7
with:
name: ci-linux-inference-binaries
path: ci-artifacts/linux

- name: Stage binaries for Kotlin SDK smoke
run: |
mkdir -p target/debug llama.cpp/build/bin
cp ci-artifacts/linux/target/debug/mesh-llm target/debug/mesh-llm
cp ci-artifacts/linux/llama.cpp/build/bin/rpc-server llama.cpp/build/bin/rpc-server
cp ci-artifacts/linux/llama.cpp/build/bin/llama-server llama.cpp/build/bin/llama-server
chmod +x target/debug/mesh-llm llama.cpp/build/bin/rpc-server llama.cpp/build/bin/llama-server

- name: Cache integration model
id: cache-model-kotlin-sdk
uses: actions/cache@v5
with:
path: ~/.models/${{ env.MODEL_FILE }}
key: ${{ env.CACHE_NAMESPACE }}-${{ runner.os }}-model-${{ env.MODEL_FILE }}-${{ hashFiles('.github/cache-version.txt', '.github/workflows/ci.yml') }}

- name: Download integration model
if: steps.cache-model-kotlin-sdk.outputs.cache-hit != 'true'
run: |
mkdir -p ~/.models
curl -fSL "$MODEL_URL" -o ~/.models/$MODEL_FILE
ls -lh ~/.models/$MODEL_FILE

- name: Kotlin SDK smoke test
run: |
scripts/ci-kotlin-sdk-smoke.sh \
target/debug/mesh-llm \
llama.cpp/build/bin \
~/.models/$MODEL_FILE

swift_sdk_smoke:
needs: [changes, inference_smoke_tests]
if: ${{ needs.inference_smoke_tests.result == 'success' && (github.event_name == 'workflow_dispatch' || needs.changes.outputs.rust == 'true' || needs.changes.outputs.ui == 'true') }}
runs-on: macos-latest
steps:
- uses: actions/checkout@v5

- uses: dtolnay/rust-toolchain@stable

- uses: mozilla-actions/sccache-action@v0.0.9

- uses: Swatinem/rust-cache@v2
with:
workspaces: . -> target
prefix-key: ${{ env.CACHE_NAMESPACE }}-rust-${{ hashFiles('.github/cache-version.txt') }}

- name: Build mesh-llm binary (debug)
run: cargo build -p mesh-llm --bin mesh-llm

- name: Clone llama.cpp fork
run: git clone -b upstream-latest --depth 1 https://github.com/michaelneale/llama.cpp.git llama.cpp

- name: Build llama.cpp (CPU + RPC)
run: |
CACHE_FLAGS=()
if command -v sccache >/dev/null 2>&1; then
CACHE_FLAGS+=(
-DCMAKE_C_COMPILER_LAUNCHER=sccache
-DCMAKE_CXX_COMPILER_LAUNCHER=sccache
)
fi
cmake -B llama.cpp/build -S llama.cpp \
-DGGML_METAL=OFF \
-DGGML_NATIVE=OFF \
-DGGML_RPC=ON \
-DBUILD_SHARED_LIBS=OFF \
-DLLAMA_OPENSSL=OFF \
"${CACHE_FLAGS[@]}"
cmake --build llama.cpp/build --config Release -j$(sysctl -n hw.ncpu) --target rpc-server llama-server

- name: Cache integration model
id: cache-model-swift-sdk
uses: actions/cache@v5
with:
path: ~/.models/${{ env.MODEL_FILE }}
key: ${{ env.CACHE_NAMESPACE }}-${{ runner.os }}-model-${{ env.MODEL_FILE }}-${{ hashFiles('.github/cache-version.txt', '.github/workflows/ci.yml') }}

- name: Download integration model
if: steps.cache-model-swift-sdk.outputs.cache-hit != 'true'
run: |
mkdir -p ~/.models
curl -fSL "$MODEL_URL" -o ~/.models/$MODEL_FILE
ls -lh ~/.models/$MODEL_FILE

- name: Swift SDK smoke test
run: |
scripts/ci-swift-sdk-smoke.sh \
target/debug/mesh-llm \
llama.cpp/build/bin \
~/.models/$MODEL_FILE

macos:
needs: changes
if: ${{ github.event_name == 'workflow_dispatch' || needs.changes.outputs.rust == 'true' || needs.changes.outputs.ui == 'true' || needs.changes.outputs.benchmarks == 'true' }}
15 changes: 15 additions & 0 deletions CONTRIBUTING.md
@@ -212,3 +212,18 @@ If the binary is not present, `just bundle` prints a note and continues without

CUDA, HIP, and Intel binaries are **not** included in the Unix tarball bundle; they must be compiled on the target platform.
On Windows release packaging, any `membench-fingerprint*.exe` binaries present in `mesh-llm/target/release/` are included automatically in the generated `.zip`.

## Protocol Backward Compatibility

Any change to `mesh-llm/src/protocol/` or `mesh-client/src/protocol/` requires backward-compatibility tests before merging.

Embedded clients (iOS, macOS, Android) are supported indefinitely; any protocol change that breaks compatibility with an already-shipped embedded client must be treated as a breaking change.

Run the protocol compatibility tests after any protocol change:

```bash
cargo test -p mesh-llm --test protocol_compat_v0_client
cargo test -p mesh-llm --test protocol_convert_matrix
```

See [`mesh-llm/docs/EMBEDDED_CLIENT_ADR.md`](mesh-llm/docs/EMBEDDED_CLIENT_ADR.md) for the full compatibility policy and rationale.
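
The shape of such a compatibility test can be sketched as follows. This is a hypothetical, self-contained illustration, not the project's actual test code: the `NodeHelloV0`/`NodeHelloV1` types, field names, and `upgrade`/`downgrade` functions are invented for the example; the real protocol types live in `mesh-llm/src/protocol/` and `mesh-client/src/protocol/`.

```rust
// Sketch of a protocol round-trip compatibility check (hypothetical types).
// The invariant: converting an old-format message up to the new version and
// back down must preserve every v0 field, so older embedded clients keep
// working against newer nodes.

#[derive(Debug, Clone, PartialEq)]
struct NodeHelloV0 {
    node_id: String,
    capabilities: Vec<String>,
}

#[derive(Debug, Clone, PartialEq)]
struct NodeHelloV1 {
    node_id: String,
    capabilities: Vec<String>,
    // New in v1: must default sensibly when upgrading a v0 message.
    gpu_layers: Option<u32>,
}

fn upgrade(v0: NodeHelloV0) -> NodeHelloV1 {
    NodeHelloV1 {
        node_id: v0.node_id,
        capabilities: v0.capabilities,
        gpu_layers: None,
    }
}

fn downgrade(v1: NodeHelloV1) -> NodeHelloV0 {
    // Dropping v1-only fields is the only lossy step allowed here.
    NodeHelloV0 {
        node_id: v1.node_id,
        capabilities: v1.capabilities,
    }
}

fn main() {
    let v0 = NodeHelloV0 {
        node_id: "peer-a".into(),
        capabilities: vec!["rpc".into()],
    };
    let round_trip = downgrade(upgrade(v0.clone()));
    assert_eq!(v0, round_trip, "v0 fields must survive an upgrade/downgrade cycle");
    println!("protocol round-trip ok");
}
```

A real compatibility matrix extends this idea across every message type and every supported protocol version pair, which is what the `protocol_convert_matrix` test name suggests.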