Amid a flood of AI-generated fake vulnerability reports, the team behind the well-known open-source project Node.js has announced that it is suspending cash rewards for vulnerability reporters on the HackerOne platform.

The vulnerability bounty platform HackerOne stated that in recent years many users have used AI tools to scan for and submit vulnerabilities in bulk. This has upset the balance of the open-source community: vulnerabilities (or suspected vulnerabilities) are now reported far faster than developers can fix them. Worse, these submissions are riddled with low-quality reports, false positives, and even outright fabrications.


In response, the Internet Bug Bounty (IBB) program on HackerOne has stopped accepting new reports, cutting off the external source of funding for Node.js's rewards.

As a community-led volunteer project, Node.js has no independent budget to pay bounties. Security company Socket pointed out that Node.js had already been adjusting its processes:

  • Review burden: Every report takes developers significant time to verify, and low-quality AI-generated content wastes a great deal of volunteer maintainers' time.

  • Higher thresholds: To fend off AI-driven submissions, the project team had previously raised the bar for reports significantly, but it still struggled to withstand the flood from automated tools.

The process remains unchanged, only the bounty is suspended

Node.js emphasized that although the bounty is suspended, security assurance has not been weakened:

  • Submission process: Researchers can still submit vulnerabilities through HackerOne.

  • Processing priority: The team will maintain the original response speed and patch release process to ensure the project's security.

Node.js is not an isolated case. In January of this year, the well-known networking tool cURL likewise had to terminate its bounty program after being "bombarded" with AI-generated reports. These incidents reflect a systemic challenge facing traditional open-source incentive mechanisms in the era of generative AI: filtering out genuinely valuable, professional feedback has become a pressing problem for the open-source community.