
cranelift: Run verifier after optimizations during testing #6855

Closed
afonso360 opened this issue Aug 17, 2023 · 6 comments · Fixed by #6858

afonso360 (Contributor) commented Aug 17, 2023

👋 Hey

Feature

We should run the verifier after all of the optimization passes during testing, or when running via clif-util.

Benefit

This would really improve the error messages for authors of egraph optimization rules.

Here's a recent example: #6851 (comment). Those error messages are produced by the backends, which really muddies things, since what is actually happening is that we are producing invalid CLIF in the mid-end.

Both of those examples would have much clearer error messages if they were produced by the verifier instead.

Implementation

We probably don't want to enable this by default, since it would slow down the compilation pipeline.

One idea for implementing this would be to add a new flag (e.g. run_verifier_after_opts) that is disabled by default, but which we could enable in clif-util and in our test runner.

I don't know if we already have a do_extra_checks flag, but if so we could reuse that one. Or we could even reuse enable_verifier itself.
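
If we went the enable_verifier route, turning it on from clif-util or the test runner should just be a matter of setting it when building the flags. A rough sketch (the actual wiring into those tools would probably look a bit different):

    use cranelift_codegen::settings::{self, Configurable, Flags};

    // Build flags with the existing `enable_verifier` setting turned on, so
    // verification runs wherever the pipeline already calls `verify_if`.
    let mut builder = settings::builder();
    builder.set("enable_verifier", "true").unwrap();
    let flags = Flags::new(builder);
    assert!(flags.enable_verifier());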

Alternatives

Running this in the fuzzer is also an option, but I think the costs might outweigh the benefits; I'm not sure.

We could also not do this at all, and instead document somewhere that it's a good idea to manually recompile the optimized code. I had previously started doing that after encountering weird issues, and this time it worked out.

bjorn3 (Contributor) commented Aug 17, 2023

Isn't this just a missing self.verify_if(fisa) call at the end of egraph_pass? All of the other optimization passes already have this.
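
(For context, verify_if only actually runs the verifier when the enable_verifier flag is set, so the extra call is essentially free otherwise. Paraphrasing from memory rather than quoting the exact source, it looks roughly like this:)

    // Paraphrased sketch of Context::verify_if: verification is skipped
    // entirely unless the `enable_verifier` flag is on.
    pub fn verify_if<'a, FOI: Into<FlagsOrIsa<'a>>>(&self, fisa: FOI) -> CodegenResult<()> {
        let fisa = fisa.into();
        if fisa.flags.enable_verifier() {
            self.verify(fisa)?;
        }
        Ok(())
    }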

afonso360 (Contributor, Author) commented

Oh yeah, if we're already running it for all of the other passes, we might as well run it for this one as well!

gurry (Contributor) commented Aug 18, 2023

Replacing the Ok(()) in the last line of this function:

pub fn egraph_pass(&mut self) -> CodegenResult<()> {
    let _tt = timing::egraph();
    trace!(
        "About to optimize with egraph phase:\n{}",
        self.func.display()
    );
    self.compute_loop_analysis();
    let mut alias_analysis = AliasAnalysis::new(&self.func, &self.domtree);
    let mut pass = EgraphPass::new(
        &mut self.func,
        &self.domtree,
        &self.loop_analysis,
        &mut alias_analysis,
    );
    pass.run();
    log::debug!("egraph stats: {:?}", pass.stats);
    trace!("After egraph optimization:\n{}", self.func.display());
    Ok(())
}

with self.verify_if(fisa) should do the trick, right?
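
One wrinkle: egraph_pass doesn't currently take a fisa argument, so presumably its signature would need to gain one, like the other optimization passes have. Something along these lines (just a sketch, not the exact change):

    // Sketch: thread a `fisa` parameter into `egraph_pass`, matching the other
    // optimization passes, and verify before returning.
    pub fn egraph_pass<'a, FOI: Into<FlagsOrIsa<'a>>>(&mut self, fisa: FOI) -> CodegenResult<()> {
        // ... body as above ...
        pass.run();
        log::debug!("egraph stats: {:?}", pass.stats);
        trace!("After egraph optimization:\n{}", self.func.display());
        self.verify_if(fisa)?;
        Ok(())
    }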

gurry (Contributor) commented Aug 18, 2023

Making the above change causes the following error to be displayed:

FAIL filetests\filetests\egraph\simd-splat-simplify.clif: optimize

Caused by:
    function %sshr_splat_into_splat_sshr(i64, i64) -> i64x2 fast {
    block0(v0: i64, v1: i64):
        v4 = sshr v0, v1
    ;   ^~~~~~~~~~~~~~~~
    ; error: inst3 (v4 = sshr.i64x2 v0, v1): arg 0 (v0) has type i64, expected i64x2

        v5 = splat.i64x2 v4
        v6 -> v5
    ;   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ; error: inst4 (v5 = splat.i64x2 v4): arg 0 (v4) has type i64x2, expected i64

        return v5
    }

    ; 2 verifier errors detected (see above). Compilation aborted.

1356 tests
Error: 1 failure
error: test failed, to rerun pass `--test filetests`

when cargo test is run against the following optimization rule, which is missing the lane type:

(rule (simplify (sshr ty (splat ty x) y))
      (splat ty (sshr ty x y)))

So it looks like it works as expected.
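
For reference, the intended form of the rule presumably threads the lane type into the scalar shift, something like this (just a sketch; it assumes the lane_type helper from the shared prelude is usable in mid-end rules):

    ;; Sketch of the corrected rule: the inner sshr operates on the scalar
    ;; lane type of `ty` rather than on the vector type itself.
    (rule (simplify (sshr ty (splat ty x) y))
          (splat ty (sshr (lane_type ty) x y)))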

afonso360 (Contributor, Author) commented

Yeah, that's pretty much it! Would you like to open a PR with that change?

gurry (Contributor) commented Aug 18, 2023

Yes, let me do that.
