PLDI 2020
Mon 15 - Fri 19 June 2020
Wed 17 Jun 2020, 05:20 - 05:40, PLDI Research Papers live stream, session: Machine Learning I (Chair: Antonio Filieri)

Verifying real-world programs often requires inferring loop invariants with nonlinear constraints. This is especially true in programs that perform many numerical operations, such as control systems for avionics or industrial plants. Recently, data-driven methods for loop invariant inference have shown promise, especially on linear loop invariants. However, applying data-driven inference to nonlinear loop invariants is challenging due to the large number and magnitude of high-order terms, the risk of overfitting to a small number of samples, and the large space of possible nonlinear inequality bounds.

In this paper, we introduce a new neural architecture for general SMT learning, the Gated Continuous Logic Network (G-CLN), and apply it to nonlinear loop invariant learning. G-CLNs extend the Continuous Logic Network (CLN) architecture with gating units and dropout, which allow the model to robustly learn general invariants over large numbers of terms. To address overfitting that arises from finite program sampling, we introduce fractional sampling—a sound relaxation of loop semantics to continuous functions that facilitates unbounded sampling on the real domain. We additionally design a new CLN activation function, the Piecewise Biased Quadratic Unit (PBQU), for naturally learning tight inequality bounds.
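For intuition, below is a minimal sketch of a PBQU-style activation in PyTorch. The piecewise coefficients are illustrative assumptions, not the paper's exact definition; the sketch only shows the asymmetric ("biased") quadratic shape, which penalizes violations of a candidate bound steeply while penalizing loose-but-satisfied bounds gently, so that gradient descent settles on tight bounds.

import torch
import torch.nn as nn

class PBQU(nn.Module):
    # Illustrative Piecewise Biased Quadratic Unit. Maps the signed slack
    # t of a candidate inequality bound (e.g. t = b - w(x) when learning
    # w(x) <= b) to a truth value in [0, 1].
    # NOTE: the branch coefficients below are assumptions for illustration,
    # not the coefficients used in the paper.
    def __init__(self, loose_penalty=0.1, violation_penalty=10.0):
        super().__init__()
        self.loose = loose_penalty          # gentle: bound holds but is loose
        self.violation = violation_penalty  # steep: bound is violated

    def forward(self, t):
        satisfied = 1.0 - self.loose * t.pow(2)     # t >= 0: mildly penalize slack
        violated = 1.0 - self.violation * t.pow(2)  # t < 0: heavily penalize violation
        return torch.clamp(torch.where(t >= 0, satisfied, violated), min=0.0)

For example, PBQU()(torch.tensor([-0.5, 0.0, 0.5])) gives roughly (0.00, 1.00, 0.98): the unit is maximized exactly at the bound, which is what drives the model toward tight inequalities rather than arbitrarily loose ones.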

We incorporate these methods into a nonlinear loop invariant inference system that can learn general nonlinear loop invariants. We evaluate our system on a benchmark of nonlinear loop invariants and show it solves 26 out of 27 problems, 3 more than prior work, with an average runtime of 53.3 seconds. We further demonstrate the generic learning ability of G-CLNs by solving all 124 problems in the linear Code2Inv benchmark. We also perform a quantitative stability evaluation and show G-CLNs have a convergence rate of 97.5% on quadratic problems, a 39.2% improvement over CLN models.

Wed 17 Jun

Displayed time zone: Pacific Time (US & Canada)

05:00 - 06:00: Machine Learning I (PLDI Research Papers)

05:00 (20m) Talk: Typilus: Neural Type Hints
Miltiadis Allamanis (Microsoft Research), Earl T. Barr (University College London, UK), Soline Ducousso (ENSTA Paris, France), Zheng Gao (University College London, UK)

05:20 (20m) Talk: Learning Nonlinear Loop Invariants with Gated Continuous Logic Networks
Jianan Yao, Gabriel Ryan, Justin Wong, Suman Jana, Ronghui Gu (Columbia University, USA)

05:40 (20m) Talk: Blended, Precise Semantic Program Embeddings
Ke Wang (Visa Research), Zhendong Su (ETH Zurich, Switzerland)