https://cacm.acm.org/magazines/2019/8/238344-scaling-static-analyses-at-facebook/fulltext

To industry professionals we say: advanced static analyses, like those found in the research literature, can be deployed at scale and deliver value for general code. And to academics we say: from an industrial point of view the subject appears to have many unexplored avenues, and this provides research opportunities to inform future tools.

Deployments #

“diff time” deployment #

Software Development at Facebook #

Reporting #

The actioned reports and missed bugs are related to the classic concepts of true positives and false negatives from the academic static analysis literature. A true positive is a report of a potential bug that can happen in a run of the program in question (whether or not it will happen in practice); a false positive is one that cannot happen.
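To make the distinction concrete, here is a minimal, invented sketch (Python for brevity; `find_user`, `greeting`, and `safe_greeting` are hypothetical, not from the article — Infer's actual targets are Java, C/C++, and Objective-C) of a report that would be a true positive versus one that would be a false positive:

```python
def find_user(users, name):
    """Return the matching user dict, or None when absent."""
    for user in users:
        if user["name"] == name:
            return user
    return None  # the source of the potential None flow


def greeting(users, name):
    user = find_user(users, name)
    # True positive: a report here flags a real bug — when `name` is
    # absent, `user` is None and subscripting it raises TypeError in
    # an actual run of the program.
    return "Hello, " + user["name"]


def safe_greeting(users, name):
    if find_user(users, name) is None:
        return "Hello, stranger"
    # False positive: an imprecise analysis that fails to connect the
    # check above with this call may still warn, yet no run of the
    # program can hit None here.
    return "Hello, " + find_user(users, name)["name"]
```

The "whether or not it will happen in practice" caveat matters: `greeting` is a true positive even if every current caller happens to pass a known name.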

False positives #

The false positive rate is challenging to measure for a large, rapidly changing codebase: it would be extremely time-consuming for humans to judge every report as true or false while the code keeps changing.

Actioned reports #

Observable missed bugs #

Tools #

Tools used by Fb to conduct static analysis

Infer #

Infer has its roots in academic research on program analysis with separation logic, which led to a startup company (Monoidics Ltd.) that was acquired by Facebook in 2013. Infer was open sourced in 2015 (www.fbinfer.com) and is used at Amazon, Spotify, Mozilla, and other companies.
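As a usage sketch (assuming Infer is installed locally and a `Hello.java` file exists — neither is given in the notes), Infer is run by wrapping an ordinary build command:

```shell
# Capture and analyze a Java compilation; reports such as
# NULL_DEREFERENCE are written under ./infer-out and summarized.
infer run -- javac Hello.java
```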

Zoncolan #

Lessons learned #

First run #

The first deployment was batch rather than continuous:

Results:

Switch to Diff time #

Human factors #

The success of diff time deployment came as no surprise to Fb’s devs:

Additional resources #