Commit: updates
nguyenthanhvuh committed Oct 30, 2024
1 parent be7dedf commit b6f8282
Showing 2 changed files with 28 additions and 13 deletions.
12 changes: 11 additions & 1 deletion ennv.bib
@@ -17193,4 +17193,14 @@ @inproceedings{xu2024training
publisher = {Springer},
year = {2024},
doi={10.1007/978-3-031-57256-2_2}
}


@inproceedings{nakagawa2014consolidating,
title={Consolidating a process for the design, representation, and evaluation of reference architectures},
author={Nakagawa, Elisa Y and Guessi, Milena and Maldonado, Jos{\'e} C and Feitosa, Daniel and Oquendo, Flavio},
booktitle={2014 IEEE/IFIP Conference on Software Architecture},
pages={143--152},
year={2014},
organization={IEEE}
}
29 changes: 17 additions & 12 deletions ennv.tex
@@ -470,7 +470,10 @@ \section{Satisfiability and Activation Pattern Search} As with traditional softw
\end{equation}
If \autoref{eq:prob} is unsatisfiable, $\phi$ is a valid property of $\mathcal{N}$. Otherwise, $\phi$ is not valid (and the counterexample of the original problem is any satisfying assignment that makes \autoref{eq:prob} true).
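As a toy illustration of this check, the sketch below brute-forces a small input grid looking for a satisfying assignment of \autoref{eq:prob}, i.e., an input that meets $\phi_{in}$ but whose output violates $\phi_{out}$. The one-neuron "network" and the grid are made up for illustration; real verifiers reason symbolically and exhaustively, whereas a grid search can only find counterexamples, never prove validity.

```python
def find_counterexample(net, phi_in, phi_out, grid):
    """Look for an input satisfying phi_in whose output violates phi_out.

    Any hit is a satisfying assignment of N(x) /\ phi_in(x) /\ ~phi_out(N(x)),
    i.e., a counterexample showing the property is not valid.
    """
    for x in grid:
        if phi_in(x) and not phi_out(net(x)):
            return x  # counterexample found: property is not valid
    return None  # nothing on this grid (NOT a proof of validity)

# Hypothetical toy "network" and property: inputs in [0, 1] should map to <= 1.
net = lambda x: 2.0 * x
phi_in = lambda x: 0.0 <= x <= 1.0
phi_out = lambda y: y <= 1.0

cex = find_counterexample(net, phi_in, phi_out, [i / 4 for i in range(5)])
# cex is 0.75: net(0.75) = 1.5 > 1, violating phi_out
```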


\section{Activation Pattern Search} For the widely-used ReLU activation function, the DNN verification problem becomes a search for \emph{activation patterns}, i.e., boolean assignments representing the activation status of neurons, that satisfy the formula in \autoref{eq:prob}.

Modern DNN verification techniques~\cite{bunel2020branch,wang2021beta,ferrari2022complete,duong2024harnessing,duong2023dpllt,ovalbab,katz2019marabou,bak2021nnenum} all adopt this idea and search for satisfying activation patterns.
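To make the notion of an activation pattern concrete, the following plain-Python sketch (with made-up weights for a tiny 2-2-1 network) computes the boolean pattern a given input induces: one boolean per neuron, true iff its pre-activation is positive. Fixing such a pattern turns each ReLU into a linear constraint, which is what the search over \autoref{eq:prob} branches on.

```python
def activation_pattern(weights, biases, x):
    """Return the ReLU activation pattern (one bool per neuron) that
    input x induces, plus the network's output vector."""
    pattern, h = [], list(x)
    for W, b in zip(weights, biases):
        # Pre-activation of each neuron in this layer: W @ h + b.
        pre = [sum(w * v for w, v in zip(row, h)) + bi
               for row, bi in zip(W, b)]
        pattern.extend(p > 0 for p in pre)   # True = neuron is active
        h = [max(p, 0.0) for p in pre]       # ReLU
    return pattern, h

# A tiny 2-2-1 network with hypothetical weights, for illustration only.
Ws = [[[1.0, -1.0], [0.5, 0.5]], [[1.0, 1.0]]]
bs = [[0.0, -0.25], [0.0]]

pat, out = activation_pattern(Ws, bs, [1.0, 0.0])
# pat == [True, True, True]: all three neurons fire on this input
```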


\section{Complexity}
@@ -607,8 +610,7 @@ \section{Branch-and-Bound (BnB)}
\DontPrintSemicolon

\Input{DNN $\mathcal{N}$, property $\phi_{in} \Rightarrow \phi_{out}$}
\Output{$\unsat$ if property is valid, otherwise ($\sat, \counterexample$)}
\BlankLine


@@ -627,26 +629,29 @@ \section{Branch-and-Bound (BnB)}
\tcp{create new activation patterns}
$\problems \leftarrow \problems \cup \{ \sigma_i \land v_i ~;~ \sigma_i \land \overline{v_i} \}$ \;
}
%\Else(\tcp*[h]{detect a conflict}){
% $\clauses \leftarrow \clauses \cup \AnalyzeConflict(\igraph_i)$ \\ \label{line:analyze_conflict}
% $\prooftree \leftarrow \prooftree \cup \{ \sigma_i \}$ \tcp{build proof tree} \label{line:record_proof}
%}
% }
% \If(\tcp*[h]{no more problems}){$\isEmpty(\problems)$}{
% }

}\label{line:dpllend}
\Return{\unsat}

\caption{The \dd{} algorithm with proof generation.}\label{fig:alg}
\end{algorithm}

Modern DNN verifiers often adopt the branch-and-bound (BnB) approach to search for activation patterns that satisfy the DNN verification problem. A BnB search consists of two main components: (branch) splitting the problem into smaller subproblems via \emph{neuron splitting}, which decides boolean values representing neuron activation status, and (bound) using abstraction and LP solving to approximate bounds on neuron values and determine the satisfiability of the partial activation pattern formed by the split.


\autoref{fig:alg} shows \dd{}, a reference BnB architecture~\cite{nakagawa2014consolidating} for modern DNN verifiers. \dd{} takes as input a ReLU-based DNN $\mathcal{N}$ and a formula $\phi_{in}\Rightarrow \phi_{out}$ representing the property of interest.
\dd{} iterates between (i) branching (\autoref{line:decide}), which \emph{decides} (assigns) an activation status value for a neuron, and (ii) bounding (\autoref{line:deduce}), which \emph{deduces} or checks the feasibility of the current activation pattern.
%To add proof generation capability, \dd{} is instrumented with a proof tree (\texttt{proof}) variable (\autoref{line:prooftree}) to record these branching decisions. The proof is represented as a binary tree structure, where each node represents a neuron and its left and right edges represent its activation decision (active or inactive). %The proof tree is then used to generate a proof in the \nnproofformat{} format (\autoref{sec:proof-format}).
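The decide/deduce loop can be sketched in plain Python as below. Here \texttt{deduce} and \texttt{find\_cex} are stand-ins (assumptions of this sketch, not part of \dd{}) for the abstraction/LP bounding step and for counterexample extraction from a fully decided, feasible pattern.

```python
def bnb_verify(neurons, deduce, find_cex):
    """Branch-and-bound skeleton: repeatedly pop a partial activation
    pattern sigma, bound it (deduce), and either prune it, report a
    counterexample, or split on an undecided neuron (decide)."""
    problems = [dict()]                     # start from the empty pattern
    while problems:
        sigma = problems.pop()
        if deduce(sigma) == "infeasible":   # bound: prune this branch
            continue
        unassigned = [v for v in neurons if v not in sigma]
        if not unassigned:                  # fully assigned and feasible
            cex = find_cex(sigma)
            if cex is not None:
                return ("sat", cex)         # property violated
            continue
        v = unassigned[0]                   # decide: split on neuron v
        problems.append({**sigma, v: True})
        problems.append({**sigma, v: False})
    return ("unsat", None)                  # every branch pruned: valid

# Toy oracles (assumptions of this sketch): patterns with n1 active are
# infeasible, and only {n1: off, n2: on} yields a concrete counterexample.
deduce = lambda s: "infeasible" if s.get("n1") is True else "feasible"
find_cex = lambda s: s if s == {"n1": False, "n2": True} else None

result = bnb_verify(["n1", "n2"], deduce, find_cex)
```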



