Synopsys on source code security sensitivities
Software development professionals often have to fix problems in their code – source code in particular; it’s a simple fact of life.
Much is made of the balance between Agile’s rapid, iterative approach – fixing issues along the way – and methodologies that advocate getting it right the first time.
The findings of a 2016 Forrester Research study commissioned by Synopsys call to mind an old proverb: a stitch in time saves nine.
Or, in the case of software development, fixing defects early in the lifecycle could reduce remediation costs by a factor of anywhere from five to 15, or so it is claimed.
Taylor Armerding, senior security strategist at Synopsys, further suggests that the study set a baseline of five hours of work to fix a defect in the coding/development stage.
But, he reminds us, finding and fixing that same defect in the final testing phase would take five to seven times longer.
“Waiting until after the product was on the market to discover and fix the same defect would take even longer and cost 10–15 times more. That doesn’t include the potential cost of damages from a bad guy discovering the defect first and exploiting it to attack users,” he said.
Source code sensitivities
So what’s going on out there in the real world?
Synopsys’ Armerding points to a Reuters report saying that major tech companies – SAP, Symantec, Micro Focus and McAfee – have allowed Russian authorities to inspect the source code of their software.
That same software is used by at least a dozen US government departments including Defense, State, NASA, the FBI and other intelligence agencies.
“Several security experts and government officials said it was tantamount to handing tools to the Russians to spy on the US. A Dec. 7 letter from the Pentagon to Sen. Jeanne Shaheen (D-NH) said that allowing governments to review the code ‘may aid such countries in discovering vulnerabilities in those products’,” said Armerding.
But according to other security experts, it wasn’t that big a deal.
Armerding points out that commentators have argued that, when it comes to defects that can be exploited for cyberattacks or espionage, access to the source code is no more dangerous – and likely less so – than access to the binary code, which is compiled from the source and shipped as part of the resulting commercial product.
The mantra: you sell customers the binary, which means every customer can already inspect it for exploitable defects at their leisure.
“But the major risks [in scenarios like this] appear to be to [developers of the operating system such as Apple or Microsoft] themselves, since the source code is their proprietary IP, and access to it might make it easier to jailbreak the OS – something these companies try ferociously to prevent,” said Armerding.
The question of when to fix defects, and how seriously to regard source code problems, may be open to debate for some. But Armerding insists that during development, source code can – and should – be reviewed by a static analysis tool: a bug found in source code is easier to fix, so fix it.