Hacktivity can save your company. Accept help from hackers; you can't do it alone. Approach hackers with an assumption of benevolence, and develop relationships with them. Don't find out about a vulnerability for the first time on Twitter. How do you defend yourself against people who get up in the morning, put on their flip-flops (or military uniform), and do nothing but think about how to attack you? These were themes at the Atlantic Council's panel on coordinated vulnerability disclosure (CVD) on September 18 in Washington, D.C.
The session, titled "Hacker-Powered Security: Voices on Coordinated Vulnerability Disclosure," kicked off the launch of the Atlantic Council's amazing comic, "How Hacktivity Can Save Your Company." CVD is a process in which an organization receives information about potential vulnerabilities from hackers, coordinates the sharing of that information among those involved, and coordinates the resolution of the vulnerabilities. The panel consisted of a wide range of experts in CVD: Leonard Bailey of the Department of Justice, Jessica Wilkerson of the House Committee on Energy and Commerce, Chris Nims, Chief Information Security Officer of Oath, and Bobbie Stempfley of the CERT Division at Carnegie Mellon. Mr. Beau Woods, Cybersecurity Innovation Council at the Atlantic Council, moderated. The keynote was given by Evelyn Remaley of the National Telecommunications and Information Administration.
Highlights and takeaways of the panel discussion are below. You can watch the full session here.
Top 5 Takeaways
Significant progress has been made in the federal government on CVD. The federal government is leading the charge in vulnerability disclosure, showing the private sector and large corporations how to do it. Ms. Remaley said that vulnerability disclosure policy (VDP) is now only "slightly controversial," and "in DC if an issue becomes boring that's a sign of progress." The sea change has been the result of years of cross-collaboration between the federal government, private industry, and security researchers themselves. Successful initiatives include the adoption of the NIST Cybersecurity Framework and exemptions from copyright law to protect security researchers. Major data breaches at high-profile brands have highlighted cybersecurity as a major risk. Ms. Wilkerson noted that CVD began to be actively discussed on the Hill after WannaCry. If you want to give one-word context for why cybersecurity and CVD matter, just say "WannaCry."
CVD is more than an IT issue. CVD shows up everywhere, and is now a nationwide and worldwide issue. Awareness has moved beyond IT into everyday life, federal legislation, and board rooms. One reason is that cybersecurity breaches have physical impacts and affect so many industries. Specifically in the area of health care, when Ms. Wilkerson is asked about how to protect against cybersecurity risk, her first question is, "Do you have a CVD program?…because if you don't know how to accept free help, you can't do much else." And the board absolutely needs to be informed of material risks; if you're a public company, disclosing what is material is required.
Cybersecurity is a "wicked problem." Cybersecurity is an evolving problem because security is never done. You don't get to check a box and be done, according to Mr. Nims. Ms. Stempfley also stated that CVD can be a "wicked problem": discovering and resolving vulnerabilities is a complex, never-ending problem that doggedly resists a clean resolution. You're not sure what success looks like, not sure what the levers are to make it successful, and not sure how to motivate security researchers. And when you are done, you're not sure you're done. A lot of progress has been made in the last decade, but in two years, she believes, we will be talking about the next set of problems in the space because the scale, pace, and participants have all changed.
However, there are major wins in CVD. Winning is ultimately about reducing risk and creating a security culture. Talk at all-hands meetings about security to make sure that everyone understands its importance. "Every time we find a vulnerability, fix it and close it, that's one less risk. As you scale that, security gets better and risk goes down. That feels like success," according to Mr. Nims. He viewed every researcher report that Oath receives and ultimately resolves as a win. Mr. Bailey believed that Hack the Pentagon in 2016, and the collaboration between all the stakeholders and the Department of Defense, was a huge win. It brought the decade-long work by NIST, FTC, NTIA, and others to the forefront and made it clear to everyone that no organization is so powerful that it does not need coordinated vulnerability disclosure.
About those security researchers: Start from a position that establishes trust. Presume the researcher is trying to help, and over-communicate to avoid surprise. Researchers can help even if you have a dedicated red team, because external security researchers help cover areas that you cannot.
Mr. Bailey added that he sometimes hears from the U.S. Attorney's offices about situations in which a company feels like it is being extorted, when what is really happening is a misunderstanding. No one on the panel wanted to hear about a vulnerability for the first time on Twitter, so having a process in place for working and communicating with external security researchers is ideal.
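One common, lightweight way to make such a process discoverable (not discussed on the panel, but a widely adopted convention) is a security.txt file, standardized in RFC 9116, which tells researchers exactly where to send a report before they resort to Twitter. A minimal sketch, using hypothetical example.com values:

```text
# Served at https://example.com/.well-known/security.txt
# Contact and Expires are required fields; the rest are optional.
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
# Link to your disclosure policy so researchers know the rules of engagement.
Policy: https://example.com/security/disclosure-policy
Preferred-Languages: en
```

The file is plain text at a well-known URL, so a researcher who finds a bug can locate your intake channel in seconds, and you hear about the vulnerability directly rather than in public.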
Mr. Nims acknowledged the "maturity curve" of CVD. Early on, a company might start with a "Hall of Fame" for security researchers whose bug reports improved its security profile and posture, and then eventually move on to a formal bug bounty program. Many security researchers are motivated by action and resolution, as opposed to monetary benefits.
Some notable “aha” moments on CVD
Your network isn’t really your network anymore. The idea that “this is my stuff” isn’t true. Because everything is interconnected—you to your suppliers, vendors, and customers—it is virtually impossible to tackle everything. So ask for, and take, all of the help that you can get.
Pentesting isn’t enough. It’s a myth that pentesting is sufficient. An internal red team, even a terrific one, still needs security researchers.
Vulnerabilities aren’t always clearly identifiable, distinct, and immediately fixable. Bugs do not come with a “bug” sticker on them!
A final note
Today everyone in government and the private sector agrees: vulnerability disclosure makes perfect sense. If you can't make use of it, something is very wrong with your cybersecurity practice. Implementing vulnerability disclosure is a best practice in both the short and long term, and should complement any existing cybersecurity team.