BALTIMORE -- You've worked hard and long for months to get the software out the door. You think you've covered all the bases -- it performs well and it's completely secure. There is absolutely no way it can be hacked.
And then someone contacts the company and says he's found a vulnerability. What do you do now?
In a perfect world, according to David Coffey, principal security architect at McAfee Inc., the vulnerability gets reported, the vulnerability is validated, the vulnerability gets fixed, the code goes through QA and testing, an update is released, and customers are safe and happy.
In a not-so-perfect world, Coffey told attendees of the Software Security Summit, the vulnerability gets reported -- if possible -- the vulnerability gets ignored or is mishandled, the finder goes public with the exploit, and customers are unsafe and unhappy -- as are the company's shareholders, employees and managers.
And in a worst-case scenario, the vulnerability is discovered, the finder goes public with the exploit immediately and the company is not prepared to deal with the issue, Coffey said.
The important thing about dealing with vulnerabilities that are discovered after a product's release is knowing what steps to take should someone report a flaw, Coffey said.
"If a vulnerability comes in and it's classified correctly, then you need to know how to handle it," he said. "If you have a process, no one will freak out."
What that process is depends in part on who is reporting the flaw, Coffey said. Types of finders include internal people, such as company employees or hired security consultants. You can usually trust those people; it's their job and they have a contract with the company. They're also motivated to look for vulnerabilities because they're curious, they want to one-up other developers and they're on a personal mission to find all vulnerabilities and secure them.
One thing to keep in mind, however, is the disgruntled employee, whose motives may not be good, Coffey said.
It gets more hazardous if the finder is outside of the company. It could be a security researcher, a business partner or a knowledgeable end user. The concern is that you don't know how well you can trust the person or what his motives are. Is he looking for money or prestige, is he just curious, is he on some kind of mission or does he want to hurt the company?
"In most cases, people really do want to do the right thing. They want to help out," Coffey said. But you still have to consider those who do not.
Different types of disclosure
Another consideration is the type of disclosure the finder has in mind. Under full disclosure, the finder says he will disclose all known information about a product's vulnerability. The theory is that a company will fix a flaw faster if it is threatened with full disclosure, Coffey said.
This group actually follows a set of guidelines developed by a hacker. Those guidelines say the company has five days to answer the report, coordinates publication of the flaw with the finder, gives the finder full credit for discovering the flaw, and has 30 days before the flaw is disclosed.
"That speaks to me that this is from people who don't know the development process," Coffey said. "It is very difficult for most companies to develop a patch within 30 days."
Another type of disclosure is responsible disclosure. It is similar to full disclosure, except that the company is not required to publish the flaw information, the timeline can be flexible, and the finder will allow a fix to be in place before disclosure occurs.
How to handle reporting of security vulnerabilities
The Organization for Internet Safety has created a set of guidelines to help people develop a plan for dealing with vulnerability reporting. The guidelines provide a general idea for what the five phases of reporting are and what should happen at each step.
The first thing that must happen, Coffey said, is to have a process in place for handling reports. You need to post the process so employees understand what to do, and you need a reporting infrastructure and repository.
"You need a Web site that people can use to report vulnerabilities and an e-mail address, and those need to be manned and working correctly," Coffey said. "You can't have reports going off into a black hole."
You also need formalized vulnerability reports that ask finders specific questions about the vulnerability. What is the finder's name and contact information? What product has the flaw? What operating system was used? What tools did he use? What are his intentions? Can he provide his proof-of-concept code? What are the steps for reproducing the exploit?
"You don't want to be constantly bugging the finder. You want all the information up front," Coffey said.
In addition to the report, you need to have a tracking system in place and you need to have established roles and responsibilities for employees.
Communication is key when dealing with vulnerability reports, Coffey stressed. Once the finder submits a report, the vendor needs to send an acknowledgement and establish rough dates for disclosure. The vendor should then send a weekly status e-mail to the finder so that he doesn't feel forgotten, and should track all communication for legal purposes.
"They did you a favor by telling you about the vulnerability. You want to do them a favor by communicating," Coffey said.
In that communication, however, be careful. You want to be honest, but discuss only what the person needs to know, and don't volunteer information. For example, if there are more vulnerabilities than the finder reported, don't tell him about them.
Tracking the progress of a report
You need to track the progress of the vulnerability report from discovery to release, Coffey said. Typically one person owns the issue from conception to death and documents everything. Make sure you use an accepted tracking system that can produce reports on demand.
You also need to protect the information. Include only those people who need to know on the issue.
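A tracking system along these lines might assign a single owner, restrict access to a need-to-know list, and keep a timestamped log of every communication, with a status that mirrors the report-validate-fix-QA-release pipeline described earlier. A hypothetical sketch, not a description of any real tool:

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class Status(Enum):
    # Mirrors the perfect-world pipeline: reported, validated,
    # fixed, through QA, update released.
    REPORTED = 1
    VALIDATED = 2
    FIXED = 3
    IN_QA = 4
    RELEASED = 5

@dataclass
class Issue:
    issue_id: str
    owner: str                      # one person owns the issue end to end
    status: Status = Status.REPORTED
    need_to_know: set[str] = field(default_factory=set)
    log: list[tuple[datetime, str]] = field(default_factory=list)

    def record(self, note: str) -> None:
        # Document everything, with timestamps, for legal purposes.
        self.log.append((datetime.now(), note))

    def report_on_demand(self) -> str:
        # Produce a human-readable status report at any point.
        lines = [f"{self.issue_id} [{self.status.name}] owner={self.owner}"]
        lines += [f"  {ts:%Y-%m-%d} {note}" for ts, note in self.log]
        return "\n".join(lines)
```

Keeping the log append-only and the access list explicit is what lets the tracker double as the documentation trail Coffey recommends.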
How to deal with disclosure
The finder wants to go public. What do you do? You need to think through several questions, Coffey said.
You want to disclose the vulnerability at the same time or before the finder, Coffey said. And it's always best to disclose on your home turf -- have the infrastructure and have a patch ready. You should also coordinate the disclosure with the finder. If he wants credit, give it to him. Work with him on how it will be worded. Ask if he's going to publish his findings.
Alternatively, the vulnerabilities can be posted on third-party sites, such as Security Focus or Common Vulnerabilities and Exposures.
Security flaws in the wild
If a finder chooses not to follow any reporting guidelines and discloses the vulnerability immediately, your options are limited. You then need to decide whether to sue the person, come clean about the flaw and explain what you're doing about it, or ignore it and hope nothing bad happens.
Companies need to think about those issues now, Coffey stressed. Documentation is your friend; use it.
"When an emergency occurs, know who is in command, who will handle PR, who to contact in the legal department, who is the lead person in the development department, what is the agenda," Coffey said.
And as always, if you know what attackers do, you know how to protect against them, he said. "If you can think like an attacker, you can protect yourself. The way to secure software becomes obvious," Coffey said.