
feat(link): add container import UI with drag-drop decryption (task 42)#155

Open
mpiton wants to merge 3 commits into main from feat/task-42-import-containers-ui

Conversation

@mpiton
Owner

@mpiton mpiton commented May 6, 2026

Summary

Enables Link Grabber to import container files (DLC, CCF, RSDF, Metalink, Meta4) via drag-and-drop, delegating decryption to the vortex-mod-containers plugin. Extracted URLs are batched into a single link resolution call. Closes task 42 from prd-v2-roadmap.

Why

Vortex users need to import container files from sites like JDownloader archives and direct download services. The container format ecosystem is handled by plugin infrastructure (Extism WASM), so the UI just needs to accept files and orchestrate the decryption flow without reimplementing format logic.

Changes

  • Add LinkGrabberView.handleContainerFiles() to accept dropped container files and dispatch to link_import_container IPC command
  • Add PasteZone.handleDrop() container detection: files with extensions .dlc, .ccf, .rsdf, .metalink, .meta4 route to onContainerFiles callback
  • Export isContainerFile() and CONTAINER_EXTENSIONS from PasteZone for component reuse
  • Create src/types/container.ts with ImportContainerResult interface (format, urls, packageId, packageName)
  • Add Tauri IPC handler link_import_container that calls CommandBus::handle_import_container
  • Implement CommandBus::handle_import_container: validates extension and file size, calls plugin to decrypt, extracts URLs, creates package, returns outcome
  • Implement PluginLoader::decrypt_container in ExtismPluginLoader: scans registry for enabled Container plugin exporting "decrypt", invokes it, maps errors
  • Add trait method to PluginLoader port with default NotFound return
  • Batch URL resolution: handleContainerFiles collects all URLs from all imported containers, calls resolveLinks() once instead of per-container
  • Add i18n keys for container import toast messages (English + French)
  • Refactor PluginRegistry::call_plugin and call_plugin_bytes: extract generic call_plugin_inner<I: ToBytes> helper to eliminate ~25 lines of duplicated lock/poison-check logic
  • Drop redundant file_name field from ImportContainerOutcome (always equals package_name)
  • Remove defensive MAX_CONTAINER_BYTES check from IPC layer (Tauri deserializes before handler; validation at handler layer is single source of truth)
  • Extract test fixture helpers: Fixture struct + fixture() / stub_fixture() functions to reduce ~40 lines of repeated setup
  • Trim WHAT-comments explaining obvious code behavior; keep single-line WHY comments for non-obvious logic
  • Change tracing::info! to tracing::debug! in plugin registry per-call logs to reduce noise
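The container-detection rule described in these changes can be sketched as follows. This is a reconstruction from the PR description, not the actual diff: the names mirror the exported `CONTAINER_EXTENSIONS` and `isContainerFile`, while `partitionDrop` is a hypothetical helper added here for illustration.

```typescript
// Reconstructed sketch of PasteZone's container detection (assumption:
// the exact implementation in the PR may differ).
export const CONTAINER_EXTENSIONS = [".dlc", ".ccf", ".rsdf", ".metalink", ".meta4"];

export function isContainerFile(fileName: string): boolean {
  // Case-insensitive suffix match against the supported container extensions.
  const lower = fileName.toLowerCase();
  return CONTAINER_EXTENSIONS.some((ext) => lower.endsWith(ext));
}

// Hypothetical helper: split a dropped file list so containers can be routed
// to the onContainerFiles callback while other drops follow the URL path.
export function partitionDrop<T extends { name: string }>(
  files: T[],
): { containers: T[]; others: T[] } {
  const containers = files.filter((f) => isContainerFile(f.name));
  const others = files.filter((f) => !isContainerFile(f.name));
  return { containers, others };
}
```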

Testing

  • cargo test --workspace: 1493 passed, 7 ignored
  • cargo clippy --workspace -- -D warnings: clean
  • npx vitest run: 702 passed across 85 test files
  • oxlint . && tsc -b: clean
  • Added 4 new tests in PasteZone.test.tsx for container file detection and callback forwarding
  • Verified container import flow: IPC call → plugin decrypt → URL batch resolve → toast confirmation

Related Issues

  • Closes task 42 (prd-v2-roadmap)

Checklist

  • Tests added and passing locally
  • No secrets, debug prints, or commented-out code
  • Self-reviewed the diff (agents reviewed for reuse, quality, efficiency)
  • Cargo clippy and tsc -b clean
  • Vitest full suite passing
  • Two-commit structure: feature + simplification

Summary by cubic

Add drag-and-drop import for .dlc/.ccf/.rsdf/.metalink/.meta4 in Link Grabber, decrypted via vortex-mod-containers. Extracted URLs are resolved in batched calls and each file becomes a container package. Satisfies Linear task 42.

  • New Features

    • Added link_import_container(fileName, fileBytes) IPC: validates extension and 1 MiB cap, decrypts via PluginLoader::decrypt_container, trims URLs, creates a Container package, and returns format, urls, packageId, packageName; missing-plugin maps to an install hint including .meta4.
    • ExtismPluginLoader implements decrypt_container by selecting the first enabled Container plugin (alphabetical) with decrypt, using PluginRegistry::call_plugin_bytes; probe failures now surface as PluginError.
    • UI: PasteZone detects containers with CONTAINER_EXTENSIONS/isContainerFile and forwards to onContainerFiles; LinkGrabberView pre-checks 1 MiB before reading, shows toasts (containerImported_*, containerImportFailed), aggregates URLs, then calls resolveLinks in 500-URL chunks.
  • Refactors

    • Extracted PluginRegistry::call_plugin_inner shared by call_plugin/call_plugin_bytes; per-call logs are now debug.
    • Added PluginLoader::decrypt_container port (default NotFound for test loaders) and documented alphabetical precedence; moved IPC result type to src/types/container.ts.
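The 1 MiB pre-check and byte conversion mentioned above can be sketched as follows. The cap mirrors the 1 MiB limit described in this PR; the helper names (`checkContainerSize`, `toByteArray`) are illustrative, not taken from the diff.

```typescript
// Assumption: the backend caps container payloads at 1 MiB, so the frontend
// rejects oversized files before reading them into memory.
const MAX_CONTAINER_BYTES = 1024 * 1024;

// Returns an error message for oversized files, or null when the file may be read.
export function checkContainerSize(file: { name: string; size: number }): string | null {
  if (file.size > MAX_CONTAINER_BYTES) {
    return `${file.name} exceeds the 1 MiB container limit`;
  }
  return null;
}

// Tauri IPC serializes byte payloads as plain number arrays, so the
// ArrayBuffer read from the dropped file is converted before invoke().
export function toByteArray(buffer: ArrayBuffer): number[] {
  return Array.from(new Uint8Array(buffer));
}
```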

Written for commit 77c6ff4. Summary will update on new commits.

Summary by CodeRabbit

  • New Features
    • Drag-and-drop container import: import container files (e.g., .dlc) to extract links and create packages; imported links are fed into the existing resolve pipeline.
  • UI / i18n
    • Link Grabber shows success/failure toasts for container imports (English & French).
  • Documentation
    • CHANGELOG updated with Task 42 and plugin/container import details.
  • Tests
    • Added unit/integration tests covering container import success and failure paths.

mpiton added 2 commits May 6, 2026 15:36
Drag-drop .dlc/.ccf/.rsdf/.metalink/.meta4 into Link Grabber paste
zone now decrypts via vortex-mod-containers plugin (task 41) and
feeds extracted URLs into the existing resolve / online-check /
duplicate-detect pipeline. Auto-creates a `source_type=Container`
package per imported file.

New IPC `link_import_container(file_name, file_bytes)` validates
extension + caps payload at 1 MiB, calls the new
`PluginLoader::decrypt_container(bytes)` port, parses the plugin
JSON, and persists the container package. The Extism adapter scans
for the first enabled `Container`-category plugin exporting
`decrypt` and ships raw bytes via the new
`PluginRegistry::call_plugin_bytes` helper (existing
`call_plugin` only takes `&str`, would corrupt non-UTF-8 blobs).

`PasteZone` now exports `CONTAINER_EXTENSIONS` + `isContainerFile`
predicate and surfaces dropped containers through a dedicated
`onContainerFiles(File[])` callback rather than synthesising fake
`container:<name>` URLs that LinkGrabberView used to drop on the
floor (the original `LinkGrabberView.tsx:67` TODO).

Container password protection is wired in spec but vacuously
satisfied: vortex-mod-containers v1.0 uses fixed historic AES keys
per ADR-001 and none of the four supported formats has a per-file
password layer. CCF v2 + DLC v3 service-fetch are deferred to
v1.1, so no `password_required` state can flow through
`decrypt_container` until the plugin gains the capability.

16 new tests:
- 8 backend (`import_container::tests`): golden path, validation
  rejections (blank/extension/empty/oversize), plugin NotFound,
  zero-link response, invalid JSON
- 1 port default test
- 1 adapter "no container plugin loaded" test
- 4 frontend `PasteZone.test.tsx` cases (drop callback,
  callback-missing fallback, text-only drop, isContainerFile)
- 2 frontend `LinkGrabberView.test.tsx` cases (drop → import_container
  → resolve chain; IPC failure → toast.error)

cargo test --workspace: 1493 pass / 7 ignored
cargo clippy + cargo fmt clean
vitest run: 702 pass
oxlint + tsc -b clean
- Generic `PluginRegistry::call_plugin_inner` shared by `call_plugin`
  and `call_plugin_bytes`; drops ~25 duplicated lines.
- `tracing::info!` → `debug!` on plugin call (per-call hot path).
- Drop redundant `MAX_CONTAINER_BYTES` check in IPC layer; Tauri has
  already deserialized the buffer before the function body runs, so
  the "defensive cap" was a dead branch.
- `handle_import_container` delegates package creation to the
  existing `handle_create_package` instead of inlining UUID mint +
  `Package::new` + `repo.save` + event publish.
- Drop redundant `file_name` field from `ImportContainerOutcome` /
  `ImportContainerResultDto` (always equal to `package_name`).
- Move inline IPC type to `src/types/container.ts` per project
  convention.
- Frontend batches all extracted URLs into a single `link_resolve`
  call after the import loop instead of one IPC per container.
- Test fixture sprawl collapsed into `Fixture` / `fixture(loader)` /
  `stub_fixture()` helpers (~40 lines saved).
- Trim WHAT comments (4 sites): `LinkGrabberView::handleContainerFiles`,
  `PasteZone::handleDrop`, `import_container::MAX_CONTAINER_BYTES`,
  `extism_loader::decrypt_container`. Keep WHY-only one-liners.
- Drop "(task 41)" reference in `import_container` module doc; the
  CHANGELOG carries that lineage.

cargo test --workspace: 1493 pass / 7 ignored
cargo clippy + cargo fmt clean
vitest run: 702 pass
oxlint + tsc -b clean

Net: -173 / +98 lines.
@github-actions github-actions Bot added documentation Improvements or additions to documentation rust frontend labels May 6, 2026
@coderabbitai

coderabbitai Bot commented May 6, 2026

Review Change Stack
No actionable comments were generated in the recent review. 🎉


📥 Commits

Reviewing files that changed from the base of the PR and between 6be272d and 77c6ff4.

📒 Files selected for processing (6)
  • CHANGELOG.md
  • src-tauri/src/adapters/driven/plugin/extism_loader.rs
  • src-tauri/src/adapters/driving/tauri_ipc.rs
  • src-tauri/src/application/commands/import_container.rs
  • src-tauri/src/domain/ports/driven/plugin_loader.rs
  • src/views/LinkGrabberView/LinkGrabberView.tsx
✅ Files skipped from review due to trivial changes (2)
  • src-tauri/src/application/commands/import_container.rs
  • src-tauri/src/adapters/driving/tauri_ipc.rs
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/views/LinkGrabberView/LinkGrabberView.tsx
  • CHANGELOG.md

📝 Walkthrough


Introduces container import functionality to the Link Grabber by adding a backend command handler that receives container files, delegates decryption to plugins, parses results, creates package records, and returns extracted URLs. Frontend components route dropped container files to the handler and display import status via toasts. Includes type definitions, i18n keys, and test coverage.

Changes

Container Import Feature

  • Type & Port Definitions (src/types/container.ts, src-tauri/src/domain/ports/driven/plugin_loader.rs): New TypeScript ImportContainerResult interface defines the result shape (format, urls, packageId, packageName). The plugin loader trait gains an optional decrypt_container method with default NotFound behavior.
  • Plugin Implementation (src-tauri/src/adapters/driven/plugin/extism_loader.rs): ExtismPluginLoader implements container decryption by probing enabled container plugins for a decrypt function, sorted by name for determinism. Returns decoded output, or NotFound if no plugin is available.
  • Command Handler & Validation (src-tauri/src/application/commands/import_container.rs): New ImportContainer handler validates file extension and size, invokes plugin decrypt via the plugin loader, parses the metalink-like JSON response, extracts non-empty URLs, creates a package record with type Container, and returns an outcome with format/URL/package metadata. Includes helpers for extension checking and extensive test coverage.
  • Command Infrastructure (src-tauri/src/application/commands/mod.rs, src-tauri/src/application/commands/tests_support.rs): New ImportContainerCommand struct with file metadata; re-exports ImportContainerOutcome. Test builder refactored to accept a custom plugin loader via the new build_package_bus_with_plugin_loader helper.
  • IPC & Driving Adapter (src-tauri/src/adapters/driving/tauri_ipc.rs, src-tauri/src/lib.rs): New IPC command link_import_container converts file bytes to a command, invokes the handler, and returns ImportContainerResultDto. Command and outcome types added to the public re-exports and the Tauri handler list.
  • Frontend UI Integration (src/views/LinkGrabberView/LinkGrabberView.tsx, src/views/LinkGrabberView/PasteZone.tsx): PasteZone identifies container files via CONTAINER_EXTENSIONS and isContainerFile and forwards them to the onContainerFiles prop instead of URL processing. LinkGrabberView implements handleContainerFiles to read files, invoke the backend IPC, aggregate URLs, and dispatch resolution.
  • Localization & Testing (src/i18n/locales/en.json, src/i18n/locales/fr.json, src/views/LinkGrabberView/__tests__/*): i18n keys added for container import success (singular/plural) and failure toasts in English and French. Frontend tests verify DLC drop handling success/error paths and container file detection; backend tests validate the command handler across validation errors, plugin errors, and success scenarios.

Sequence Diagram

sequenceDiagram
    actor User
    participant FrontendUI as Frontend UI<br/>(LinkGrabberView)
    participant PasteZone
    participant TauriIPC as Tauri IPC
    participant CommandBus
    participant PluginLoader
    participant PackageRepo as Package<br/>Repository
    participant EventBus

    User->>PasteZone: Drop .dlc file
    PasteZone->>PasteZone: isContainerFile() check
    PasteZone->>FrontendUI: onContainerFiles([file])
    FrontendUI->>FrontendUI: Read file bytes
    FrontendUI->>TauriIPC: link_import_container(name, bytes)
    TauriIPC->>CommandBus: handle_import_container(cmd)
    CommandBus->>CommandBus: Validate extension & size
    CommandBus->>PluginLoader: decrypt_container(bytes)
    PluginLoader->>PluginLoader: Probe container plugins<br/>in order
    PluginLoader-->>CommandBus: Decrypted JSON (metalink)
    CommandBus->>CommandBus: Parse response,<br/>extract URLs
    CommandBus->>PackageRepo: Create package (type=Container)
    PackageRepo-->>CommandBus: package_id
    CommandBus->>EventBus: PackageCreated event
    CommandBus-->>TauriIPC: ImportContainerOutcome<br/>(format, urls, id, name)
    TauriIPC-->>FrontendUI: ImportContainerResult
    FrontendUI->>FrontendUI: Show success toast<br/>(URL count)
    FrontendUI->>FrontendUI: Dispatch resolveLinks<br/>(aggregated URLs)

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~35 minutes

Possibly related PRs

  • mpiton/vortex#19: Modifies Link Grabber components (PasteZone, LinkGrabberView) and extends IPC command surface for link resolution and container handling.

Suggested labels

ui

Poem

🐰 I found a tiny .dlc in the hay,
The plugin hummed softly and whisked links away,
URLs hopped out, neat and spry,
Resolved and de-duped beneath the sky —
A rabbit’s small victory, hip-hip-hooray!

🚥 Pre-merge checks: ✅ 5 passed
  • Description Check: Passed (check skipped; CodeRabbit's high-level summary is enabled)
  • Title check: Passed (the title clearly describes the main feature: drag-and-drop container file import UI with decryption support, matching the file changes and PR objectives)
  • Docstring Coverage: Passed (90.00% coverage; the required threshold is 80.00%)
  • Linked Issues check: Passed (check skipped; no linked issues were found for this pull request)
  • Out of Scope Changes check: Passed (check skipped; no linked issues were found for this pull request)





@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6be272d7b0


Comment on lines +213 to +216
defaultValue: `Imported ${result.urls.length} links from ${result.packageName}`,
}),
);
aggregatedUrls.push(...result.urls);

P1: Preserve container package mapping for started downloads

link_import_container returns packageId, but this handler drops that association and only accumulates raw URLs; later starts use the normal download_start flow, so the container package created on import is never auto-attached to those downloads. In practice, importing a container creates an empty package entry and users lose the “import as one package” behavior the backend command is trying to provide.


Owner Author

@chatgpt-codex-connector

link_import_container returns packageId, but this handler drops that association and only accumulates raw URLs

Acknowledged but out of scope for this PR. Carrying the packageId end-to-end requires plumbing it through link_resolve and download_start (and the package read-side) so created downloads attach to the import package — that's a multi-handler, schema-touching change. Deferring to a follow-up task; this PR's package row is intentionally a placeholder until that pipe exists.


Understood. I reviewed the trigger and PR context, and this specific feedback is already explicitly deferred by the PR author as a multi-handler, schema-touching follow-up (propagating packageId through link_resolve and download_start, plus package read-side wiring).

Given the current code shape, that’s accurate: download_start currently accepts only url, destination, and optional metadata—not a package association—so implementing this here would be broader than a targeted PR follow-up fix and would change command/API contracts across frontend + Tauri + application layers.

So for this follow-up action, I made no code changes and therefore did not create a commit or PR.

Checks run

  • git status --short
  • git rev-parse --abbrev-ref HEAD
  • sed -n '1,280p' src/views/LinkGrabberView/LinkGrabberView.tsx
  • sed -n '70,150p' src-tauri/src/adapters/driving/tauri_ipc.rs
  • rg "download_start" -n src src-tauri | head -n 40


Comment on lines +226 to +227
if (aggregatedUrls.length > 0) {
resolveLinks({ urls: aggregatedUrls });

P1: Split aggregated container URLs into bounded resolve batches

This sends all extracted URLs in a single link_resolve call, but the backend resolver rejects payloads above 500 URLs (handle_resolve_links hard cap). A realistic multi-file or large container import can exceed that limit, causing the whole resolve step to fail after successful decrypt/import toasts, leaving users with no resolved rows.

Owner Author

@chatgpt-codex-connector

A realistic multi-file or large container import can exceed that limit, causing the whole resolve step to fail after successful decrypt/import toasts.

Fixed in 77c6ff4: handleContainerFiles now slices the aggregated URL list into 500-URL chunks (matching MAX_URLS in handle_resolve_links) and dispatches one resolveLinks mutation per chunk. Multi-container drops above the cap no longer fail the whole batch.
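The chunking fix described here can be sketched as follows. The 500 cap matches the MAX_URLS limit mentioned in this thread; `chunkUrls` is an illustrative name, not necessarily the helper used in commit 77c6ff4.

```typescript
// Slice an aggregated URL list into bounded batches so no single
// link_resolve call exceeds the backend's 500-URL cap (MAX_URLS).
const MAX_URLS = 500;

export function chunkUrls(urls: string[], size: number = MAX_URLS): string[][] {
  const chunks: string[][] = [];
  for (let i = 0; i < urls.length; i += size) {
    chunks.push(urls.slice(i, i + size));
  }
  return chunks;
}

// Usage sketch: one resolveLinks dispatch per chunk instead of one oversized call.
// for (const batch of chunkUrls(aggregatedUrls)) resolveLinks({ urls: batch });
```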


Summary

  • I reviewed the trigger and current branch state; the reported resolver-cap issue is already fixed in commit 77c6ff4, and handleContainerFiles now batches container URLs into bounded resolveLinks calls using slices rather than one oversized request. No additional code changes were needed. src/views/LinkGrabberView/LinkGrabberView.tsxL238-L240
  • I did not create a new commit or PR because the workspace had no new modifications to commit after verification.

Testing / Checks

  • git status --short && git log --oneline -n 5 && rg -n "MAX_URLS|chunk|resolveLinks\\(" src/views/LinkGrabberView/LinkGrabberView.tsx
  • nl -ba src/views/LinkGrabberView/LinkGrabberView.tsx | sed -n '220,255p'


@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 7

🧹 Nitpick comments (1)
src/types/container.ts (1)

1-5: ⚡ Quick win

Narrow format to the backend’s closed set.

ImportContainerResult is a new IPC-facing type, and the backend only emits a fixed set of container formats. Keeping format: string drops exhaustiveness checks and makes format typos invisible in the UI/tests.

💡 Suggested change
+export type ContainerFormat = "dlc" | "ccf" | "rsdf" | "metalink";
+
 export interface ImportContainerResult {
-  format: string;
+  format: ContainerFormat;
   urls: string[];
   packageId: string;
   packageName: string;
 }
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/types/container.ts` around lines 1 - 5, ImportContainerResult currently
types the field format as string which prevents exhaustiveness checks; change
format in the ImportContainerResult interface to a closed string union matching
the backend’s allowed container formats (replace format: string with format: 'X'
| 'Y' | ...), update any code that constructs or switches on
ImportContainerResult.format to use the new union values, and adjust
tests/typeguards to handle the explicit variants so typos become compile-time
errors; reference the ImportContainerResult interface and any switch/if branches
that inspect .format when making the change.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@CHANGELOG.md`:
- Line 35: Update the headline "11 new tests" to match the enumerated breakdown
by changing it to "16 new tests" (since the list sums to 8 backend + 1 port + 1
adapter + 4 frontend PasteZone + 2 frontend LinkGrabberView = 16), or
alternatively adjust the per-section counts to total 11 if that was intended;
edit the changelog sentence containing "11 new tests" and ensure the numbers in
the parenthetical breakdown (the "8 backend (...) , 1 port ..., 1 adapter ..., 4
frontend PasteZone ..., 2 frontend LinkGrabberView ...") are consistent with the
headline.

In `@src-tauri/src/adapters/driven/plugin/extism_loader.rs`:
- Around line 424-445: The loop currently treats errors from
self.registry.function_exists(info.name(), "decrypt") as mere warnings and
ultimately returns DomainError::NotFound, which hides plugin probe failures;
change the logic in the decrypt lookup loop so that if function_exists returns
Err(e) you return Err(DomainError::PluginError(...)) (include info.name() and
the error e in the message) instead of continuing, while keeping the existing
behavior of calling self.registry.call_plugin_bytes(info.name(), "decrypt",
bytes) when function_exists is Ok(true); ensure any call_plugin_bytes errors are
still mapped to DomainError::PluginError as before.

In `@src-tauri/src/adapters/driving/tauri_ipc.rs`:
- Around line 1014-1016: Update the fallback message in the
AppError::Domain(DomainError::NotFound(_)) match arm in tauri_ipc.rs to include
the missing ".meta4" extension so users see the full list of supported container
formats; locate the match arm that currently returns "Install
vortex-mod-containers to import .dlc/.ccf/.rsdf/.metalink files" and add
".meta4" into that string (e.g., include ".meta4" among the extensions) so the
guidance is complete.

In `@src-tauri/src/application/commands/import_container.rs`:
- Around line 71-76: The collected URLs currently map from response.links ->
l.url and only check u.trim().is_empty(), leaving surrounding whitespace in the
stored strings; update the pipeline that builds urls (the mapping from
response.links and the variable urls) to trim each l.url (e.g., produce a
trimmed String) before collecting and then filter out empty trimmed strings so
urls contains clean, whitespace-free URLs for later resolve_batch.

In `@src-tauri/src/domain/ports/driven/plugin_loader.rs`:
- Around line 139-150: The docstring for container decryption is incorrect about
precedence—ExtismPluginLoader::decrypt_container() sorts enabled Container
plugins by name before probing, so the winner is the alphabetically-first
enabled container plugin, not the "first loaded"; update the comment on the
decrypt_container() port (referencing Container category and the decrypt
function) to state that enabled container plugins are tried in alphabetical
order by plugin name (or alternatively remove the name-sorting in
ExtismPluginLoader::decrypt_container() if you prefer true load-order
semantics)—make the doc and code behavior consistent and mention the exact
precedence rule used.

In `@src/views/LinkGrabberView/LinkGrabberView.tsx`:
- Around line 205-228: The current container import flow calls
invoke("link_import_container") and throws away result.packageId, only
aggregating result.urls into aggregatedUrls and calling resolveLinks({ urls:
aggregatedUrls }); Update this flow to carry the originating packageId through
the resolution/download pipeline: when handling the
invoke("link_import_container") response, push an object containing
result.packageId and result.urls (or map urls -> { url, packageId }) into the
aggregate instead of plain strings, then change resolveLinks (and downstream
download_start calls) to accept and propagate an optional packageId parameter so
that the server-side download_start can attach created download IDs to the
provided packageId; alternatively, defer creating the package on import and
create it when download IDs exist, but ensure either resolveLinks or
download_start accepts and forwards result.packageId (from
invoke("link_import_container")) so imported links are associated with the
package rather than creating empty packages.
- Around line 203-208: The code in LinkGrabberView.tsx currently calls
file.arrayBuffer() and builds bytes before sending to
invoke("link_import_container"), which allocates the entire file in memory; add
a pre-check on file.size against the backend container size limit (define a
constant e.g. CONTAINER_MAX_BYTES matching the backend cap) and reject oversized
files immediately (show an error/notification and return) before calling
file.arrayBuffer(); keep the rest of the flow (Uint8Array -> Array.from ->
invoke<ImportContainerResult>) unchanged so only files within the limit are read
into memory.

---

Nitpick comments:
In `@src/types/container.ts`:
- Around line 1-5: ImportContainerResult currently types the field format as
string which prevents exhaustiveness checks; change format in the
ImportContainerResult interface to a closed string union matching the backend’s
allowed container formats (replace format: string with format: 'X' | 'Y' | ...),
update any code that constructs or switches on ImportContainerResult.format to
use the new union values, and adjust tests/typeguards to handle the explicit
variants so typos become compile-time errors; reference the
ImportContainerResult interface and any switch/if branches that inspect .format
when making the change.


📥 Commits

Reviewing files that changed from the base of the PR and between c00d74b and 6be272d.

📒 Files selected for processing (16)
  • CHANGELOG.md
  • src-tauri/src/adapters/driven/plugin/extism_loader.rs
  • src-tauri/src/adapters/driven/plugin/registry.rs
  • src-tauri/src/adapters/driving/tauri_ipc.rs
  • src-tauri/src/application/commands/import_container.rs
  • src-tauri/src/application/commands/mod.rs
  • src-tauri/src/application/commands/tests_support.rs
  • src-tauri/src/domain/ports/driven/plugin_loader.rs
  • src-tauri/src/lib.rs
  • src/i18n/locales/en.json
  • src/i18n/locales/fr.json
  • src/types/container.ts
  • src/views/LinkGrabberView/LinkGrabberView.tsx
  • src/views/LinkGrabberView/PasteZone.tsx
  • src/views/LinkGrabberView/__tests__/LinkGrabberView.test.tsx
  • src/views/LinkGrabberView/__tests__/PasteZone.test.tsx

Comment on lines +205 to +228
const result = await invoke<ImportContainerResult>("link_import_container", {
fileName: file.name,
fileBytes: bytes,
});
toast.success(
t("linkGrabber.toast.containerImported", {
count: result.urls.length,
fileName: result.packageName,
defaultValue: `Imported ${result.urls.length} links from ${result.packageName}`,
}),
);
aggregatedUrls.push(...result.urls);
} catch (err) {
toast.error(
t("linkGrabber.toast.containerImportFailed", {
fileName: file.name,
defaultValue: `Could not import ${file.name}: ${String(err)}`,
}),
);
}
}
if (aggregatedUrls.length > 0) {
resolveLinks({ urls: aggregatedUrls });
}

@coderabbitai coderabbitai Bot May 6, 2026


⚠️ Potential issue | 🟠 Major | 🏗️ Heavy lift

Imported container packages are never populated.

link_import_container returns a packageId, but this flow discards it and only forwards raw URLs into the existing resolve/start path. In this file, downloads are still started later with plain download_start, so every successful container import creates an empty package record with no way for the imported downloads to land in it.

Please carry the originating packageId through resolve/start and attach created download IDs to that package, or defer package creation until the point where download IDs exist.
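
One way to preserve that association is sketched below. This is a hedged illustration only — `ImportedLink`, `aggregateImports`, and the result shape are hypothetical names, not the repo's actual API — showing the idea of aggregating `{ url, packageId }` pairs instead of flattening to plain strings:

```typescript
// Hypothetical shape: pair each imported URL with its originating package.
interface ImportedLink {
  url: string;
  packageId: string;
}

// Aggregate per-container import results while keeping the package
// association, so downstream resolve/start calls can attach download IDs
// to the right package instead of leaving it empty.
function aggregateImports(
  results: { packageId: string; urls: string[] }[],
): ImportedLink[] {
  return results.flatMap((r) =>
    r.urls.map((url) => ({ url, packageId: r.packageId })),
  );
}
```

A `resolveLinks` accepting `ImportedLink[]` (rather than `string[]`) could then forward `packageId` to `download_start`, or the package row could be created lazily once download IDs exist.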


Owner Author


@coderabbitai

link_import_container returns a packageId, but this flow discards it and only forwards raw URLs into the existing resolve/start path.

Same root cause as the codex P1 finding above. Acknowledged but deferred — propagating packageId through link_resolve + download_start so created downloads land in the import package is a multi-handler pipeline change, tracked as a follow-up. The placeholder package row stays for now; this PR scope is the import + decrypt path only.


@mpiton, understood — the placeholder package being created ahead of its downloads is a known trade-off for this PR's scope. Deferring the packageId propagation through link_resolve + download_start makes sense given the pipeline scope of the change.

Would you like me to open a GitHub follow-up issue to track the work of propagating packageId so that imported container downloads land in the correct package rather than leaving an empty package record?



@codspeed-hq
Contributor

codspeed-hq Bot commented May 6, 2026

Merging this PR will degrade performance by 18.88%

⚠️ Different runtime environments detected

Some benchmarks with significant performance changes were compared across different runtime environments,
which may affect the accuracy of the results.

Open the report in CodSpeed to investigate

⚡ 2 improved benchmarks
❌ 6 regressed benchmarks
✅ 18 untouched benchmarks

⚠️ Please fix the performance issues or acknowledge them on CodSpeed.

Performance Changes

| Benchmark | BASE | HEAD | Efficiency |
|---|---|---|---|
| normalize_link_check_parallelism | 150 ns | 120.8 ns | +24.14% |
| simple_https | 2.4 µs | 2.7 µs | -11.51% |
| normalize_max_concurrent | 150 ns | 120.8 ns | +24.14% |
| with_userinfo | 2.6 µs | 3.1 µs | -16.5% |
| complex_with_port_and_path | 2.6 µs | 3.2 µs | -16.04% |
| ftp_scheme | 2.6 µs | 2.8 µs | -10.09% |
| from_status_code_404 | 154.7 ns | 183.9 ns | -15.86% |
| from_status_code_500 | 125.3 ns | 154.4 ns | -18.88% |

Comparing feat/task-42-import-containers-ui (77c6ff4) with main (c00d74b)

Open in CodSpeed

- Pre-check container file size before reading into memory (avoid wasting RAM on oversized drops)
- Batch aggregated container URLs into 500-URL chunks so large imports don't trip the resolve cap
- Surface container plugin probe failures as PluginError instead of NotFound (preserves load-failure context)
- Trim whitespace from decrypted URLs before collecting
- Append `.meta4` to install-hint message
- Document alphabetical precedence rule on `decrypt_container` port (matches adapter behavior)
- Fix CHANGELOG test count: 11 -> 16
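
The 500-URL batching from the second bullet can be sketched as follows. This is a minimal illustration under assumed names — `MAX_RESOLVE_URLS` and `chunkUrls` are hypothetical here; the real cap is whatever `link_resolve` enforces:

```typescript
const MAX_RESOLVE_URLS = 500; // assumed cap, mirroring the backend resolve limit

// Split a flat URL list into resolve-sized chunks so a single oversized
// import never exceeds the per-call cap.
function chunkUrls(urls: string[], size: number = MAX_RESOLVE_URLS): string[][] {
  const chunks: string[][] = [];
  for (let i = 0; i < urls.length; i += size) {
    chunks.push(urls.slice(i, i + size));
  }
  return chunks;
}
```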

@chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 77c6ff4314


Comment on lines +238 to +239
for (let i = 0; i < aggregatedUrls.length; i += MAX_RESOLVE_URLS) {
  resolveLinks({ urls: aggregatedUrls.slice(i, i + MAX_RESOLVE_URLS) });


P1: Merge chunked container resolves before setting link list

When a container import yields more than 500 URLs, this loop dispatches multiple link_resolve calls, but resolveLinks's success handler replaces state with setResolvedLinks(resolved) for each response. That means later chunk responses overwrite earlier ones, so users only see (and can start) one chunk instead of the full import. This regression appears only in large container imports that trigger batching.
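
A minimal sketch of a fix — under a hypothetical `ResolvedLink` shape and `mergeResolved` helper, not the view's actual state code — is to merge each chunk's response into the accumulated list instead of replacing it:

```typescript
interface ResolvedLink {
  url: string;
  status: "online" | "offline" | "unknown";
}

// Accumulate resolved chunks, deduplicating by URL. Replacing state per
// response would drop earlier chunks, which is exactly the regression
// described above; the success handler should call something like
// setResolvedLinks(prev => mergeResolved(prev, chunk)) instead.
function mergeResolved(
  previous: ResolvedLink[],
  chunk: ResolvedLink[],
): ResolvedLink[] {
  const seen = new Set(previous.map((l) => l.url));
  return [...previous, ...chunk.filter((l) => !seen.has(l.url))];
}
```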



Labels

documentation, frontend, rust

1 participant