
Add stop_at_first_success to Prevent Duplicate Results (Fixes #1513)#1514

Open
Aarush289 wants to merge 17 commits into OWASP:master from Aarush289:Add_module_and_flow

Conversation

@Aarush289
Contributor

Proposed change


This PR introduces a new stop_at_first_success option to prevent duplicate entries in scan results. Currently, the same successful detection can be logged multiple times for a given target, module, and port, which degrades the user experience and clutters the output.
Fixes #1513
Before: (screenshot)

After: (screenshot)

The feature can be used as follows: (screenshot)

Type of change

  • New core framework functionality
  • Bugfix (non-breaking change that fixes an issue)
  • Code refactoring without any functionality changes
  • New or existing module/payload change
  • Documentation/localization improvement
  • Test coverage improvement
  • Dependency upgrade
  • Other improvement (best practice, cleanup, optimization, etc)

Checklist

  • I've followed the contributing guidelines
  • I've digitally signed all my commits in this PR
  • I've run make pre-commit and confirm it didn't generate any warnings/changes
  • I've run make test and I confirm all tests passed locally
  • I've added/updated any relevant documentation in the docs/ folder
  • I've linked this PR with an open issue
  • I've tested and verified that my code works as intended and resolves the issue as described
  • I've attached screenshots demonstrating that my code works as intended (if applicable)
  • I've checked all other open PRs to avoid submitting duplicate work
  • I confirm that the code and comments in this PR are not direct unreviewed outputs of AI
  • I confirm that I am the Sole Responsible Author for every line of code, comment, and design decision

@coderabbitai
Contributor

coderabbitai bot commented Apr 14, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 112f40e8-46e2-43d4-9e15-4e5d4716f678

📥 Commits

Reviewing files that changed from the base of the PR and between cb30538 and 8e7df0b.

📒 Files selected for processing (2)
  • nettacker/core/lib/base.py
  • nettacker/database/db.py

Summary by CodeRabbit

  • Improvements
    • Faster scans through enhanced early termination when a successful condition is detected.
    • More accurate detection of prior temporary events by including port-aware matching.
    • Improved persistence and ordering of temporary-event logs when stopping at the first successful result, ensuring consistent audit records.

Walkthrough

Adds early-exit checks to avoid reprocessing when stop_at_first_success is set and a matching temp event exists; persists a temp log keyed by stop_at_first_success; and updates find_temp_events(...) to accept an optional port parameter used in matching. (50 words)
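The control flow in the walkthrough above can be sketched as a minimal, self-contained Python flow. The class, the in-memory store, and the helper signatures here are simplified stand-ins for illustration, not Nettacker's actual engine API:

```python
# Hypothetical, simplified sketch of the stop_at_first_success flow
# described in the walkthrough. ConditionSketch, its temp_logs list, and
# the helper names are stand-ins, not Nettacker's real BaseEngine.
class ConditionSketch:
    def __init__(self):
        self.temp_logs = []  # stand-in for the temp-events table

    def find_temp_events(self, target, event_name, port):
        # True when a success for this (target, event, port) key is recorded.
        return (target, event_name, port) in self.temp_logs

    def submit_temp_logs_to_db(self, target, event_name, port):
        self.temp_logs.append((target, event_name, port))

    def process_conditions(self, event, target, conditions_results):
        marker = event["response"].get("stop_at_first_success")
        port = event.get("ports", "")
        # Early exit: a success for this key is already recorded.
        if marker and self.find_temp_events(target, marker, port):
            return "skipped"
        if conditions_results and marker:
            # Persist a temp log keyed by the marker so later events for
            # the same target/port are suppressed.
            self.submit_temp_logs_to_db(target, marker, port)
        return "processed" if conditions_results else "no-match"
```

A first success is processed and recorded; a second event with the same key short-circuits to "skipped" without touching the rest of the pipeline.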

Changes

Cohort / File(s) Summary
Engine control flow & temp-log handling
nettacker/core/lib/base.py
- Added early-return in process_conditions(...) when event["response"]["stop_at_first_success"] is present and find_temp_events(...) finds a matching temp event (port derived from event.get("ports", "")).
- When conditions_results is truthy and stop_at_first_success is present, now calls submit_temp_logs_to_db(...) to persist a temp log keyed by stop_at_first_success (includes port, event, data=response).
- Added pre-execution early-return in run(...) when sub_step["response"]["stop_at_first_success"] is present and a matching temp event exists (port from sub_step.get("ports", "")).
Temp-event lookup signature and queries
nettacker/database/db.py
- find_temp_events(...) signature updated to accept port=None and uses port when matching.
- APSW/raw SQL branch: conditional query includes AND port = ? when port provided (binding json.dumps(port)); otherwise original query used.
- SQLAlchemy/ORM branch: builds base filtered query then conditionally adds TempEvents.port == json.dumps(port) when port is provided; return behavior unchanged.
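The port-aware matching described for find_temp_events(...) can be illustrated with an in-memory sketch. A list of dicts stands in for the TempEvents table; the json.dumps(port) encoding mirrors the binding mentioned in the summary above, while the function body itself is a hypothetical simplification of the real query:

```python
# Hypothetical, simplified sketch of the port-aware lookup summarized
# above. An in-memory list of row dicts stands in for the TempEvents
# table so the matching logic is self-contained and testable.
import json


def find_temp_events(rows, target, module_name, scan_id, event_name, port=None):
    """Return the first stored temp event matching the key fields.

    When `port` is given it is added to the match key (compared against
    the JSON-encoded stored value, mirroring the json.dumps(port) binding
    in the summary); otherwise the original key is used unchanged.
    """
    for row in rows:
        if (
            row["target"] == target
            and row["module_name"] == module_name
            and row["scan_unique_id"] == scan_id
            and row["event_name"] == event_name
            and (port is None or row["port"] == json.dumps(port))
        ):
            return row
    return None
```

With two rows differing only in port, passing port=443 matches only the second row, while omitting port falls back to the original, port-agnostic behavior.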

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 75.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (4 passed)
  • Title check ✅ — The PR title clearly summarizes the main change: adding a stop_at_first_success feature to prevent duplicate results, and references the issue it fixes.
  • Description check ✅ — The PR description is well-detailed, explaining the problem (duplicate entries), the solution (stop_at_first_success feature), and includes before/after screenshots and usage examples demonstrating the fix.
  • Linked Issues check ✅ — The PR implements the core requirement from #1513: preventing duplicate success logs by adding stop_at_first_success logic to check for existing temp events before logging, effectively stopping further logging after first detection.
  • Out of Scope Changes check ✅ — All changes are in scope: modifications to BaseEngine.process_conditions and BaseEngine.run for the deduplication logic, and updates to find_temp_events to support port-based filtering, all directly addressing the duplicate results issue.




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@nettacker/core/lib/base.py`:
- Around line 294-299: The branch that returns early on finding an existing
event leaks a partially mutated sub_step because run() deletes
sub_step["method"] and sub_step["response"] before the check; move the
"stop_at_first_success" check to run before deleting those keys or,
alternatively, ensure you restore sub_step["method"] and sub_step["response"]
(the originals saved in backup_response) before returning; locate the logic
around run(), the sub_step dict, the deletions of "method" and "response", and
the find_temp_events call to apply the fix.
- Around line 126-127: The dedupe key is missing the port so
find_temp_events(...) currently suppresses successes across different ports;
update all calls to find_temp_events(target, module_name, scan_id, event_name)
to pass the port (use event["response"]["port"]) and change the find_temp_events
function signature/implementation to include port in its lookup key; also update
any temp-event creation/storage logic used by find_temp_events so the stored key
includes port (apply the same change to the other two call sites referenced
around the other ranges).
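The ordering fix suggested in the first comment — check before mutating — can be sketched as follows. The class, event store, and key shapes are stand-ins for illustration, not the real BaseEngine.run:

```python
# Hypothetical sketch of the ordering fix: perform the
# stop_at_first_success lookup *before* deleting sub_step keys, so an
# early return cannot leak a partially mutated sub_step. The engine and
# event-store shapes here are stand-ins, not Nettacker's actual API.
class EngineSketch:
    def __init__(self, temp_events):
        self.temp_events = temp_events  # set of (target, event_name, port)

    def find_temp_events(self, target, event_name, port):
        return (target, event_name, port) in self.temp_events

    def run(self, sub_step, target):
        response = sub_step["response"]
        marker = response.get("stop_at_first_success")
        # Check BEFORE mutating sub_step: an early return leaves it intact.
        if marker and self.find_temp_events(target, marker, sub_step.get("ports", "")):
            return "skipped"
        # Only now is it safe to remove keys for the execution path.
        method = sub_step.pop("method")
        sub_step.pop("response")
        return f"executed {method}"
```

On the early-return path, sub_step still holds its "method" and "response" keys, which is exactly the invariant the review comment asks to preserve.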

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 326ef8c1-32d8-4965-9438-406456214e8b

📥 Commits

Reviewing files that changed from the base of the PR and between 06c9bdf and c49eaad.

📒 Files selected for processing (1)
  • nettacker/core/lib/base.py

@pUrGe12
Contributor

pUrGe12 commented Apr 15, 2026

Can you do a small benchmark to see how this will affect scan efficiency and latency? If we're making multiple database calls for deduplication, it might increase the number of I/O operations. @Aarush289

@Aarush289
Contributor Author

Okay sure, will do that.

@Aarush289
Contributor Author

@pUrGe12 I have done basic benchmarking; file I/O calls will increase, but there is a negligible effect on system time and CPU usage.
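For context, a minimal benchmark along these lines (not the author's actual measurement — table name, columns, and values are illustrative) could use an in-memory SQLite table to estimate the per-event cost of the extra dedupe lookup:

```python
# Minimal benchmark sketch: estimate the per-event cost of the extra
# dedupe lookup using an in-memory SQLite table as a stand-in for the
# temp-events store. All names and values here are illustrative.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE temp_events (target TEXT, module TEXT, scan_id TEXT, "
    "event_name TEXT, port TEXT)"
)
conn.execute(
    "INSERT INTO temp_events VALUES ('10.0.0.1', 'port_scan', 's1', 'open', '80')"
)
conn.commit()

lookups = 10_000
start = time.perf_counter()
for _ in range(lookups):
    conn.execute(
        "SELECT 1 FROM temp_events WHERE target=? AND module=? AND scan_id=? "
        "AND event_name=? AND port=?",
        ("10.0.0.1", "port_scan", "s1", "open", "80"),
    ).fetchone()
elapsed = time.perf_counter() - start
print(f"{lookups} lookups in {elapsed:.3f}s "
      f"({elapsed / lookups * 1e6:.1f} µs per lookup)")
```

An on-disk database with real indexes would behave differently, so this only bounds the query-execution overhead, not the full file I/O cost raised in the comment above.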


Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Duplicate Entries in Scan Results Degrading User Experience

2 participants