Software Integrity


Handle with care: You have my vulnerability assessment report!

Does your organization rely heavily on vendor products or applications for streamlining processes? Do you wonder what threats your data is being exposed to while it’s handled by these applications? Are you a vendor trying to assure clients that your applications are secure—without divulging too much information? Have you faced situations where your client demands they run their own security assessment?

These are common industry questions that I come across time and time again. To answer these questions, I’ve put together some helpful guidelines that will aid your organization in reaching a consensus on application assessments and findings.

Report structure.

The following elements align with the Synopsys and OWASP reporting structure and should always be present in a vulnerability assessment report.

Project objectives

This section should identify who conducted the assessment. Details about the assessor help organizations determine the quality of the assessment. They also establish legitimacy and trustworthiness around the vendor providing it. Gartner’s Magic Quadrant shines a light on some of the industry’s leading vendors.

Other important information to note in this section is the type of assessment. Was it a black box (DAST) assessment, a white box (SAST) assessment, an architecture review, or a threat model? Was it conducted manually, or was it fully automated? The answers depend in part on how you use the application, the confidentiality of the data it handles, and how frequently it's used.

In some scenarios, the report from an automated scan will suffice. However, if the application is in use throughout the organization, or it deals with sensitive information, also consider conducting manual testing.

Project schedule

Consider when the last assessment took place. Don't rely on a five-year-old assessment. If the assessment is over a year old, it's time to move forward with a new one. New attack methods appear constantly, and an average release cycle runs six months to a year.

The calculus changes, however, if the product version currently in use is significantly newer (two or more major versions) than the version that was tested. The product may also have gone through frequent interim releases, although this is rarely the case.

Apart from the assessment’s age, the duration also matters. If it’s two days, did it cover all functionalities based on the size of the application? Perhaps the assessment was carried out to check for the top five vulnerabilities and nothing more. (Ideally, this should be noted in the project objectives.) Compare the application size and findings to determine whether the assessment duration is suitable to conduct a thorough review.

Targets

The assessment scope should list all domains, functionalities, and modules that were part of the assessment. Did the scope include application server and database testing? What about the outbound streams or APIs connecting to the application? If your firm uses a third-party-hosted web application to enter employee information, it's important to understand where the data is stored and whether the database has vulnerabilities. The same is true for the server. Although this type of testing can easily go beyond the scope of a pure application assessment, it's still imperative to assess these details.

Some applications have backend reporting capabilities. In most cases, the host company receives these usage statistics. Explore how this connection is established and what information is shared. Valuable information includes details on who logged into the application and where they logged in from.

Limitations

Traditionally, the limitations section lists the modules, domains, and functionalities that weren't part of the assessment's scope. Verify whether any of these exclusions affect the parts of the application you rely on. Additionally, time constraints, limited application availability, and access issues may have kept some elements out of testing. Any such limitations belong in this section.

Findings and summary

A summary of critical, high, medium, and low vulnerabilities gives the organization an idea of where to focus the budget. On its own, though, a count of flaws offers little comfort about the ways the application remains at risk. With that in mind, begin this section with the ways the application is protected against malicious attacks, including the classes of flaws that weren't discovered (e.g., areas with no occurrences of SQL injection or cross-site scripting). Identify whether the critical vulnerabilities relate to a hard-coded password or a cross-site scripting attack. Discuss where the flaws were found, whether those features are something the organization uses regularly, and, if not, whether and how they can be disabled.
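As a rough sketch of how such a summary comes together, the script below tallies a findings list by severity so budget discussions can follow risk. The finding titles, field names, and modules are hypothetical, not drawn from any real report:

```python
# Hypothetical findings list; titles, modules, and field names are
# illustrative only, not from an actual assessment report.
findings = [
    {"title": "Hard-coded password", "severity": "critical", "module": "auth"},
    {"title": "Reflected XSS", "severity": "high", "module": "search"},
    {"title": "Verbose error messages", "severity": "low", "module": "api"},
    {"title": "Stored XSS", "severity": "high", "module": "comments"},
]

SEVERITY_ORDER = ["critical", "high", "medium", "low"]

def summarize(findings):
    """Count findings per severity level, in descending order of risk."""
    counts = {level: 0 for level in SEVERITY_ORDER}
    for finding in findings:
        counts[finding["severity"]] += 1
    return counts

print(summarize(findings))
# {'critical': 1, 'high': 2, 'medium': 0, 'low': 1}
```

A table like this is a starting point for prioritization, not a substitute for reading the individual findings.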

But wait. If you’re a vendor, you can’t share information pertaining to the exact location of the flaw and how it was discovered. That’s specific and sensitive information. How do you proceed?

In the case of a vendor, include the following elements into the findings and summary:

  1. vulnerability title and description
  2. risk level
  3. module information (without including details about parameters and fields)
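To make the three-element rule concrete, here is a minimal sketch of sanitizing a finding before it leaves the vendor. The record structure and field names are assumptions for illustration; the point is that exploit-level details such as parameters and payloads are dropped while title, description, risk level, and module survive:

```python
def sanitize_finding(finding):
    """Keep only the fields a vendor can safely share with clients."""
    allowed = ("title", "description", "risk_level", "module")
    return {k: v for k, v in finding.items() if k in allowed}

# Hypothetical internal record, including details that must NOT be shared.
internal = {
    "title": "SQL injection",
    "description": "User input reaches a dynamic query.",
    "risk_level": "critical",
    "module": "reporting",
    "parameter": "order_id",   # sensitive detail: stripped before sharing
    "payload": "' OR 1=1 --",  # sensitive detail: stripped before sharing
}

print(sanitize_finding(internal))
```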

Some vendors share a text version of the assessment report; others share a PDF. If working with a PDF, be sure to black out specific and sensitive information before distribution.

Appendix

Methodology overview. This section should list the testing tools and scripts, along with assessor checklists, automated checks, and the attacks tested. It reveals a great deal about the kind of assessment performed and helps determine whether it meets organizational standards.

Risk rating model overview. Identify the risk rating model (e.g., CVSS or NIST SP 800-30 r1) the assessing team used to determine the risk level of each flaw, and include the rating matrix so readers can see how risk levels are calculated. For example, a SQL injection flaw may be listed as medium severity while the organization believes it should be higher; the model provides clarity about the calculation.
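As a hedged illustration of why the matrix matters, the sketch below implements a simplified likelihood-times-impact rating in the spirit of the OWASP Risk Rating Methodology. The 1-3 scales and the thresholds are assumptions, not the official CVSS formula; two teams using different thresholds would label the same flaw differently, which is exactly the disagreement the published matrix resolves:

```python
# Simplified likelihood x impact rating (assumed 1-3 scales and
# thresholds; NOT the official CVSS or NIST SP 800-30 calculation).
def risk_level(likelihood, impact):
    """Map a 1-3 likelihood and 1-3 impact to a severity label."""
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A SQL injection judged easy to exploit (3) with severe impact (3):
print(risk_level(3, 3))  # high
# The same flaw judged hard to reach (1) with moderate impact (2):
print(risk_level(1, 2))  # low
```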

Remediation plan

The expectation is for the vendor to fix the vulnerabilities, and the more critical a vulnerability is, the more urgently it needs to be resolved. In addition, be sure to check the current software version against the version that was tested; the newer version may already include fixes for these vulnerabilities.

The vendor should provide a brief description of how they plan to move forward with remediation, including whether the fixes will be framework-level controls or spot fixes, and should show that the plan is in line with current standards. If a vendor doesn't have a new version in sight and fixes aren't in place, be sure to also request a mitigating control for the vulnerability.

Report sharing methodologies.

After sanitizing the report, ensure that distribution aligns with one of the following three approaches.

Password-protected PDF via secure mail

When sharing with this approach, send the password in a separate secure mailing. How the report is distributed internally is at the discretion of the organization. While this approach reduces distribution overhead for the vendor, it also gives them zero visibility into who receives the report and how it's distributed.
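One small technical detail in this approach is the password itself: it should be randomly generated per report, not reused. A minimal sketch using Python's standard `secrets` module (the length and character set are assumptions, not a policy recommendation):

```python
import secrets
import string

def report_password(length=16):
    """Generate a random password for protecting a report PDF.

    Send the password to the client over a separate secure channel,
    never in the same message as the report itself.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(report_password())
```

`secrets` uses a cryptographically strong random source, unlike the `random` module, which is unsuitable for this purpose.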

Online storage

The vendor can host the report on a file hosting/storage site and provide limited access (e.g., view only, no download) to members of the organization. With this approach, the vendor has full control over who can view the report. They can also keep logs of who accesses the report and from where. However, they have to maintain an externally facing document-sharing portal to ensure availability, along with the additional task of approving access requests.

Onsite visit and reading

This approach is the most drastic of the three. Its biggest advantage is that the actual report is never available to clients at their discretion. Its biggest disadvantage is that the vendor has to incur the expense of sending a representative to the client site so the report can be read in person.

Side notes.

When sharing assessments, keep the following pointers in mind.

Remediation and retesting

Even when a vendor commits to remediation deadlines, schedule a retest to ensure the fixes are complete; this requires follow-up on the part of the organization. If the vendor doesn't have a new version coming up for release and fixes aren't in place, also request a mitigating control for the vulnerability.

Secure development practice

Vulnerabilities may get resolved, but the organization needs to determine whether steps or procedures are in place to ensure new issues don't crop up in future releases. Check whether the vendor's development team receives sufficient training in secure development practices, and whether the vendor has internal security testing practices to identify and resolve vulnerabilities as early in the development process as possible.

Vulnerability trend line

Ask for a trend line of vulnerabilities found over the past several years (by version). The trend line provides insight into how mature the vendor is in application security and how effective their internal practices are in the delivery of secure products. Also leverage these findings in contract negotiations.
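A quick sanity check on such a trend line can even be automated. The sketch below uses entirely hypothetical per-version counts and simply asks whether findings are non-increasing release over release, which is one rough signal of maturing internal practices:

```python
# Hypothetical per-version vulnerability counts supplied by a vendor,
# in release order (versions and numbers are illustrative only).
history = {"v1.0": 14, "v2.0": 9, "v3.0": 5, "v4.0": 3}

def is_improving(history):
    """True if each release has no more findings than the one before."""
    counts = list(history.values())
    return all(later <= earlier for earlier, later in zip(counts, counts[1:]))

print(is_improving(history))  # True
```

A downward trend doesn't prove security by itself (scope and testing depth also change between versions), but an upward one is worth raising during contract negotiations.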

Summary.

While it is important to control the information that goes into the report, it is equally important to control the distribution of this information so as to prevent a leak of sensitive information.

Which type of vulnerability testing is the best fit for your organization?