Add stop_at_first_success to Prevent Duplicate Results (Fixes #1513) #1514
Aarush289 wants to merge 17 commits into OWASP:master
Conversation
create pr
Document all modules ( fix OWASP#1269 ) (OWASP#1270)
Signed-off-by: Aarush <cs24b064@smail.iitm.ac.in>
Removed entry for FortiWeb authentication bypass vulnerability. Signed-off-by: Aarush <cs24b064@smail.iitm.ac.in>
Merge new changes
Merge new modules
Merge changes from master
No actionable comments were generated in the recent review.
Summary by CodeRabbit
Walkthrough: Adds early-exit checks to avoid reprocessing when a matching event already exists.
Actionable comments posted: 3
Inline comments:
In `@nettacker/core/lib/base.py`:
- Around line 294-299: The branch that returns early on finding an existing
event leaks a partially mutated sub_step because run() deletes
sub_step["method"] and sub_step["response"] before the check; move the
"stop_at_first_success" check to run before deleting those keys or,
alternatively, ensure you restore sub_step["method"] and sub_step["response"]
(the originals saved in backup_response) before returning; locate the logic
around run(), the sub_step dict, the deletions of "method" and "response", and
the find_temp_events call to apply the fix.
- Around line 126-127: The dedupe key is missing the port so
find_temp_events(...) currently suppresses successes across different ports;
update all calls to find_temp_events(target, module_name, scan_id, event_name)
to pass the port (use event["response"]["port"]) and change the find_temp_events
function signature/implementation to include port in its lookup key; also update
any temp-event creation/storage logic used by find_temp_events so the stored key
includes port (apply the same change to the other two call sites referenced
around the other ranges).
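The two findings above can be combined into one sketch. This is an illustrative reconstruction, not the actual nettacker code: the names (`run`, `sub_step`, the `find_temp_events`-style lookup, `backup_response`) follow the review discussion, but the signatures are simplified and the temp-event store is an in-memory dict.

```python
# Hypothetical sketch of the reviewer's suggestions: the dedupe lookup key
# includes the port, and the check runs BEFORE sub_step is mutated, so an
# early return cannot leak a partially mutated sub_step.

def should_skip(temp_events, target, module_name, scan_id, event_name, port):
    """Return True if an identical success was already recorded."""
    key = (target, module_name, scan_id, event_name, port)  # port included
    return key in temp_events


def run(sub_step, temp_events, target, module_name, scan_id, event_name):
    backup_method = sub_step["method"]
    backup_response = sub_step["response"]
    port = backup_response["port"]

    # Dedupe check before any deletion of sub_step keys.
    if should_skip(temp_events, target, module_name, scan_id, event_name, port):
        return None  # duplicate: stop at first success

    del sub_step["method"]
    del sub_step["response"]
    # ... perform the actual request/scan here (simplified away) ...
    result = {"target": target, "port": port}

    # Restore the deleted keys and record the success keyed by port.
    sub_step["method"] = backup_method
    sub_step["response"] = backup_response
    temp_events[(target, module_name, scan_id, event_name, port)] = result
    return result
```

With the port in the key, a success on port 80 no longer suppresses a later success on port 443 for the same target and module.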
File reviewed: nettacker/core/lib/base.py
Can you do a small benchmark to see how this will affect scan efficiency and latency? If we're making multiple database calls for deduplication, it might increase the number of I/O calls. @Aarush289
Okay sure, will do that.
@pUrGe12 I have done basic benchmarking: file I/O calls do increase, but the effect on system time and CPU usage is negligible.
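A micro-benchmark along these lines (an assumed approach for illustration; the thread does not show the actual script) isolates the per-event cost of the extra dedupe lookup:

```python
# Compare recording scan results with and without a dedupe lookup.
# The in-memory set stands in for whatever store the real code queries;
# a real file/DB-backed store would add I/O on top of this baseline.
import time


def record_without_dedupe(events, event):
    events.append(event)


def record_with_dedupe(events, seen, event):
    key = (event["target"], event["module"], event["port"])
    if key in seen:  # the extra lookup introduced by the feature
        return
    seen.add(key)
    events.append(event)


def bench(n=100_000):
    sample = [{"target": "t", "module": "m", "port": p % 100} for p in range(n)]

    events = []
    t0 = time.perf_counter()
    for e in sample:
        record_without_dedupe(events, e)
    base = time.perf_counter() - t0

    events, seen = [], set()
    t0 = time.perf_counter()
    for e in sample:
        record_with_dedupe(events, seen, e)
    deduped = time.perf_counter() - t0

    return base, deduped, len(events)
```

With an in-memory set the lookup is O(1) per event, which is consistent with the "negligible CPU/system time" observation; the interesting number to measure in the real code is how many additional file or DB reads each lookup triggers.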
Proposed change
This PR introduces a new feature stop_at_first_success to prevent duplicate entries in scan results. Currently, the same successful detection can be logged multiple times for a given target, module, and port, which degrades user experience and clutters the output.
Fixes #1513
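As an illustration of the idea only (everything here except the option name `stop_at_first_success` is hypothetical), the flag can be thought of as a per-(target, module, port) filter over detections:

```python
# Hypothetical sketch: keep only the first successful detection per
# (target, module, port) when stop_at_first_success is enabled.
def filter_results(results, stop_at_first_success=True):
    if not stop_at_first_success:
        return list(results)  # legacy behavior: duplicates pass through
    seen, out = set(), []
    for r in results:
        key = (r["target"], r["module"], r["port"])
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out
```

For example, three detections of the same service on 10.0.0.1:80 collapse to one entry, while a detection on port 443 is still reported separately.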
Before / After comparison and a usage example were attached as screenshots (images omitted).
Type of change

Checklist

- I ran `make pre-commit` and confirm it didn't generate any warnings/changes
- I ran `make test` and confirm all tests passed locally
- Documentation updated in the `docs/` folder (if needed)