PLDI 2020
Mon 15 - Fri 19 June 2020
Wed 17 Jun 2020 05:20 - 05:40 at PLDI Research Papers live stream - Machine Learning I Chair(s): Antonio Filieri

Verifying real-world programs often requires inferring loop invariants with nonlinear constraints. This is especially true for programs that perform many numerical operations, such as control systems for avionics or industrial plants. Recently, data-driven methods for loop invariant inference have shown promise, especially on linear loop invariants. However, applying data-driven inference to nonlinear loop invariants is challenging due to the large number and magnitude of high-order terms, the potential for overfitting on a small number of samples, and the large space of possible nonlinear inequality bounds.

In this paper, we introduce a new neural architecture for general SMT learning, the Gated Continuous Logic Network (G-CLN), and apply it to nonlinear loop invariant learning. G-CLNs extend the Continuous Logic Network (CLN) architecture with gating units and dropout, which allow the model to robustly learn general invariants over large numbers of terms. To address overfitting that arises from finite program sampling, we introduce fractional sampling—a sound relaxation of loop semantics to continuous functions that facilitates unbounded sampling on the real domain. We additionally design a new CLN activation function, the Piecewise Biased Quadratic Unit (PBQU), for naturally learning tight inequality bounds.
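To make the PBQU idea concrete, here is an illustrative sketch (not the paper's exact formulation; the constants and piecewise form are assumptions) of an activation that peaks at 1 when a candidate bound t(x) >= b is exactly tight, and penalizes violations of the bound far more steeply than loose-but-valid slack, which biases learning toward tight inequality bounds:

```python
def pbqu(x, k_violation=10.0, k_slack=0.1):
    """Hypothetical Piecewise Biased Quadratic Unit sketch.

    x is the residual t(sample) - b of a candidate bound t >= b:
    x == 0 means the bound is exactly tight, x < 0 means the bound
    is violated, x > 0 means the bound holds with slack.
    The quadratic falloff is steeper on the violation side, so the
    activation rewards bounds that are both valid and tight.
    """
    k = k_violation if x < 0 else k_slack
    return max(0.0, 1.0 - k * x * x)
```

In this sketch, a violated bound (x = -1) scores 0.0 while the same amount of slack (x = 1) still scores 0.9, so gradient-based training is pushed toward bounds that touch the sampled data without crossing it.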

We incorporate these methods into a nonlinear loop invariant inference system that can learn general nonlinear loop invariants. We evaluate our system on a benchmark of nonlinear loop invariants and show it solves 26 out of 27 problems, 3 more than prior work, with an average runtime of 53.3 seconds. We further demonstrate the generic learning ability of G-CLNs by solving all 124 problems in the linear Code2Inv benchmark. We also perform a quantitative stability evaluation and show G-CLNs have a convergence rate of 97.5% on quadratic problems, a 39.2% improvement over CLN models.

Wed 17 Jun
Times are displayed in time zone: (GMT-07:00) Pacific Time (US & Canada)

05:00 - 06:00: PLDI Research Papers - Machine Learning I at PLDI Research Papers live stream
Chair(s): Antonio Filieri (Imperial College London)


05:00 - 05:20
Talk
Miltiadis Allamanis (Microsoft Research), Earl T. Barr (University College London, UK), Soline Ducousso (ENSTA Paris, France), Zheng Gao (University College London, UK)

05:20 - 05:40
Talk
Jianan Yao (Columbia University, USA), Gabriel Ryan (Columbia University, USA), Justin Wong (Columbia University, USA), Suman Jana (Columbia University, USA), Ronghui Gu (Columbia University, USA)

05:40 - 06:00
Talk
Ke Wang (Visa Research), Zhendong Su (ETH Zurich, Switzerland)