How to handle the public disclosure of bugs and security vulnerabilities
What actions should security researchers take when they discover a security vulnerability in a software program or other piece of technology? That question has often led to disagreement, debate, and conflict among the parties involved. Should the researcher immediately reveal the existence of the bug without contacting the developer or vendor first? Should the bug be kept secret until the developer has had a chance to fix it? Should both the researcher and the developer report the bug to the public to warn them of the problem before a fix is available? A study released Wednesday by security provider Veracode examines the thorny issue of how security vulnerabilities are reported and how industry professionals weigh in on the issue.
SEE: 10 dangerous app vulnerabilities to watch out for (free PDF) (TechRepublic)
Commissioned by Veracode and conducted by 451 Research, the survey questioned a group of security and technology professionals that included developers, IT security staff, third-party penetration testers, and independent security researchers with various responsibilities.
To cut to the chase, a full 90% of the respondents said they see the disclosure of security vulnerabilities as a public good, opining that the identification of such vulnerabilities increases transparency and is positive for everyone's overall security posture. However, most researchers who discover a bug don't believe they should reveal the results on their own without first informing the developer. Only 9% of the respondents who identified a security vulnerability said that they took the full disclosure route, meaning they revealed the bug publicly instead of reporting it to the developer or vendor.
The perception is that vendors and software companies don't always act quickly or effectively enough when a researcher reports a bug in one of their products. But 75% of the companies surveyed said they do have an established process for receiving bug reports from researchers. Most of them said they maintain such a process as an aspect of due care, but at least a third admitted that they're motivated by the fear of full public disclosure of the bug.
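One lightweight way for a vendor to advertise such a reporting process is a security.txt file (standardized as RFC 9116) served at /.well-known/security.txt on its website, which tells researchers where to send reports and where to find the disclosure policy. The contact details and URLs below are hypothetical:

```
# Hypothetical security.txt, served at https://example.com/.well-known/security.txt
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security-policy
Acknowledgments: https://example.com/hall-of-fame
Preferred-Languages: en
```

The Contact and Expires fields are required by the standard; Policy points researchers to the organization's disclosure terms, and Acknowledgments offers the public recognition that some researchers say they value.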
Another perception is that researchers who discover bugs and the vendors responsible for fixing them don't always work smoothly together to disclose or share information about a bug. However, some 37% of the organizations polled said they've received an unsolicited disclosure report from a researcher in the past 12 months. And among that group, 90% of the vulnerabilities were disclosed in a coordinated manner between researchers and organizations.
But even after reporting the bug, researchers don't want to be left out of the picture and are generally motivated by a desire to improve security. Some 57% of researchers said they expect to be notified by the developer when the vulnerability is fixed, while 47% expect regular updates on the fix and 37% want the ability to validate a fix once it's ready. Only 18% of researchers said they expect to be paid for their efforts and just 16% said they want recognition for their discovery.
The results so far paint a pretty rosy picture of bug reporting and collaboration between researchers and developers or vendors. However, there are some kinks in the process. The policies for accepting unsolicited bug reports are still inconsistent across different companies, so researchers may be unsure how to handle a bug discovery. Also, the reporting process itself doesn't necessarily lead to a quick fix for the flaw, at least not in the view of the researchers.
Some 65% of the security researchers polled said they expect a fix in less than 60 days after reporting a bug to a developer or vendor. However, that deadline might be too aggressive and impractical, according to Veracode, which found that 70% of all flaws remain unfixed one month after they're discovered and almost 55% remain unfixed three months after discovery.
Offering the promise of payment, bug bounties are often seen as a helpful way to coax researchers and others to hunt for and report bugs. But the reality is that these bounties don't necessarily deliver on their promise. Some 47% of organizations said they've implemented bug bounty programs, but only 19% of the bug reports they receive come from bounty programs. Since most researchers seem motivated more by a desire for secure software than by money, Veracode believes vendors should spend less on bug bounties and more on secure software development that catches vulnerabilities before they crop up in a public product.
"The alignment that the study reveals is very positive," Veracode Chief Technology Officer and co-founder Chris Wysopal said in a press release. "The challenge, however, is that vulnerability disclosure policies are wildly inconsistent. If researchers are unsure how to proceed when they find a vulnerability, it leaves organizations exposed to security threats, giving criminals a chance to exploit these vulnerabilities. Today, we have both tools and processes to find and reduce bugs in software during the development process. But even with these tools, new vulnerabilities are found every day. A strong disclosure policy is a necessary part of an organization's security strategy and allows researchers to work with an organization to reduce its exposure. A good vulnerability disclosure policy will have established procedures to work with outside security researchers, set expectations on fix timelines and outcomes, and test for defects and fix software before it is shipped."
To gather the data for this report, 451 Research conducted its survey from December 2018 to January 2019 using a sample of 1,000 respondents across a range of industries and organizations in the US, Germany, France, Italy, and the UK. Respondents were required to have an average to high level of familiarity with vulnerability disclosure models.