author    Yann Herklotz <git@yannherklotz.com>  2021-08-02 00:26:15 +0200
committer Yann Herklotz <git@yannherklotz.com>  2021-08-02 00:26:15 +0200
commit    a89ffd74a82b225be85627926407ed9a05aefc5e (patch)
tree      e4f37b5c5dec7331323992477e9fe31f01055337 /introduction.tex
parent    5ba8d98c98ca17b543f212486a5a4e2e7b91ca84 (diff)
Clear up the Introduction and Algorithm section
Diffstat (limited to 'introduction.tex')
 introduction.tex | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/introduction.tex b/introduction.tex
index 5a474fe..09190ff 100644
--- a/introduction.tex
+++ b/introduction.tex
@@ -31,12 +31,12 @@ An alternative is to use \emph{translation validation}~\cite{pnueli98_trans} to
But translation validation is an expensive task, especially for large designs, and it must be repeated every time the compiler is invoked.
For example, the translation validation for Catapult C~\cite{mentor20_catap_high_level_synth} may require several rounds of expert `adjustments'~\cite[p.~3]{slec_whitepaper} to the input C program before validation succeeds. And even when it succeeds, translation validation does not provide watertight guarantees unless the validator itself has been mechanically proven correct, which is seldom the case.
-However, translation validation has many benefits in a mechanically verified setting as well to simplify the proofs and depend less on the exact implementation of the optimisation. It has also already been used to prove certain passes in \compcert{} correct. By proving specific optimisations with a constraint on the kinds of transformations it can perform, it is possible to write a verified validator that is also believed to be complete and should not fail on valid transformations unless bugs are present.
+Nevertheless, translation validation has many benefits in a mechanically verified setting too: it simplifies the proofs and makes them depend less on the exact implementation of the optimisation. It has also already been used to prove certain passes in \compcert{} correct. The main issue with the translation validation methods normally applied in HLS tools is that they try to generalise over all the optimisations that are performed, comparing the generated hardware directly to the high-level input. In a verified setting, however, optimisations are proven correct incrementally and separately, which makes translation validation more viable. By proving specific optimisations with a constraint on the kinds of transformations they can perform, it is possible to write a verified validator that is also believed to be complete and should not fail on valid transformations unless bugs are present.
Our position is that none of the above workarounds are necessary if the HLS tool can simply be trusted to work correctly.
\paragraph{Our solution}
-We have designed a new HLS tool in the Coq theorem prover and proved that any output it produces always has the same behaviour as its input. Our tool, called \vericert{},\ifANONYMOUS\footnote{Tool name has been changed for blind review.}\fi{} is automatically extracted to an OCaml program from Coq, which ensures that the object of the proof is the same as the implementation of the tool. \vericert{} is built by extending the \compcert{} verified C compiler~\cite{leroy09_formal_verif_realis_compil} with a new hardware-specific intermediate language and a Verilog back end. It supports most C constructs, including integer operations, function calls, local arrays, structs, unions, and general control-flow statements, but currently excludes support for case statements, function pointers, recursive function calls, integers larger than 32 bits, floats, and global variables.
+We have designed a new HLS tool in the Coq theorem prover and proved that any output it produces always has the same behaviour as its input. Our tool, called \vericert{},\ifANONYMOUS\footnote{Tool name has been changed for blind review.}\fi{} is automatically extracted to an OCaml program from Coq, which ensures that the object of the proof is the same as the implementation of the tool. \vericert{} is built by extending the \compcert{} verified C compiler~\cite{leroy09_formal_verif_realis_compil} with a new hardware-specific intermediate language and a Verilog back end. It supports most C constructs, including integer operations, function calls, local arrays, structs, unions, and general control-flow statements, but currently excludes support for case statements, function pointers, recursive function calls, integers larger than 32 bits, floats, and global variables.
\paragraph{Contributions and Outline}
The contributions of this paper are as follows: