Implement lazy decoding of DefPathTable during incremental compilation #74967
Conversation
(rust_highfive has picked a reviewer for you, use r? to override)
@bors try @rust-timer queue
Awaiting bors try build completion
⌛ Trying commit c169390670b5400b83c79abc8662ffd76eaa321a with merge efc44d114215b797e4e501a021a02c4ab500aea0...
@bors try-
@bors try
⌛ Trying commit 983d06c6ac41a906cfb178e46a21a2fe5e0a561a with merge db610e4857f4482661cb2f4a30150f12ec9a6cca...
☀️ Try build successful - checks-actions, checks-azure
Queued db610e4857f4482661cb2f4a30150f12ec9a6cca with parent c058a8b, future comparison URL.
Finished benchmarking try commit (db610e4857f4482661cb2f4a30150f12ec9a6cca): comparison url. Benchmarking this pull request likely means that it is perf-sensitive, so we're automatically marking it as not fit for rolling up. Please note that if the perf results are neutral, you should likely undo the rollup=never given below by specifying `rollup-` to bors. Importantly, though, if the results of this run are non-neutral, do not roll this PR up -- it will mask other regressions or improvements in the rollup. @bors rollup=never
Looks like a regression for larger crates, but a win for tiny crates.
@bors try @rust-timer queue
Awaiting bors try build completion
⌛ Trying commit c581132756dd09ad72ff2e694d63441d70f2e201 with merge 6616659d34a250401009ff4286292cf29487552b...
☀️ Try build successful - checks-actions, checks-azure
Queued 6616659d34a250401009ff4286292cf29487552b with parent 3a92b99, future comparison URL.
Finished benchmarking try commit (6616659d34a250401009ff4286292cf29487552b): comparison url. @bors rollup=never
Force-pushed from c581132 to 57f40e2
@bors try @rust-timer queue
Awaiting bors try build completion
⌛ Trying commit 57f40e2c3a3717367024a93b7409ee32ea3c5052 with merge 8a163407b95486330c3d2cd8b9e49d07a35187c5...
☀️ Try build successful - checks-actions
Queued bf8e95436e60effbeb46a32e17df8ab7fcb0c6ad with parent db79d2f, future comparison URL.
Finished benchmarking try commit (bf8e95436e60effbeb46a32e17df8ab7fcb0c6ad): comparison url. @bors rollup=never
Regressions of up to 0.1% in non-incremental mode. This is now ready for review again.
@@ -77,6 +77,10 @@ crate struct CrateMetadata {
    raw_proc_macros: Option<&'static [ProcMacro]>,
    /// Source maps for code from the crate.
    source_map_import_info: OnceCell<Vec<ImportedSourceFile>>,
    /// For every definition in this crate, maps its `DefPathHash` to its
    /// `DefIndex`. See `raw_def_id_to_def_id` for more details about how
Should this comment say `def_path_hash_to_def_id` instead?
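The field added in the diff above is a side table that is populated on first use rather than eagerly at crate-load time. A minimal sketch of that pattern, using `OnceCell::get_or_init`; the type names, the `u64`/`usize` stand-ins for `DefPathHash`/`DefIndex`, and the `Vec` representation of the table are all simplifying assumptions, not rustc's actual layout:

```rust
use std::cell::OnceCell;
use std::collections::HashMap;

// Hypothetical stand-ins for rustc's interned types.
type DefPathHash = u64;
type DefIndex = usize;

struct CrateMetadata {
    // Stands in for the crate's encoded def-path data.
    def_path_hashes: Vec<DefPathHash>,
    // Reverse map, built lazily only if a hash lookup is ever needed.
    hash_to_index: OnceCell<HashMap<DefPathHash, DefIndex>>,
}

impl CrateMetadata {
    fn def_index_for_hash(&self, hash: DefPathHash) -> Option<DefIndex> {
        self.hash_to_index
            // First call pays the cost of building the map; later calls reuse it.
            .get_or_init(|| {
                self.def_path_hashes
                    .iter()
                    .enumerate()
                    .map(|(i, &h)| (h, i))
                    .collect()
            })
            .get(&hash)
            .copied()
    }
}

fn main() {
    let meta = CrateMetadata {
        def_path_hashes: vec![0xaaa, 0xbbb, 0xccc],
        hash_to_index: OnceCell::new(),
    };
    assert_eq!(meta.def_index_for_hash(0xbbb), Some(1));
    assert_eq!(meta.def_index_for_hash(0xdead), None);
    println!("lookups ok");
}
```

Crates that never need a reverse lookup never pay for building the map, which is the point of making the field lazy.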
// We have just loaded a deserialized `DepNode` from the previous
// compilation session into the current one. If this was a foreign `DefId`,
// then we stored additional information in the incr comp cache when we
// initially created its fingerprint (see `DepNodeParams::to_fingerprint`)
super super nit: missing a period ending the sentence after the parenthesis here, making this whole paragraph look like one big run-on sentence.
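The comment under review describes looking up information that was stored for a foreign `DefId` in the previous session's cache and carrying it forward. A minimal sketch of that bookkeeping, with made-up types (`DefId` as a pair of integers, the cache as two plain hash maps) rather than rustc's real ones:

```rust
use std::collections::HashMap;

// Hypothetical stand-in: a DefPathHash modeled as a bare u64.
type DefPathHash = u64;

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct DefId {
    krate: u32,
    index: u32,
}

struct IncrCache {
    // Mapping read from the previous session's on-disk cache.
    prev_session: HashMap<DefPathHash, DefId>,
    // Mapping accumulated for the cache written out at the end of this session.
    next_session: HashMap<DefPathHash, DefId>,
}

impl IncrCache {
    fn new(prev: HashMap<DefPathHash, DefId>) -> Self {
        IncrCache { prev_session: prev, next_session: HashMap::new() }
    }

    /// Decoding a DepNode for a foreign DefId: consult the stored mapping and
    /// re-record it so the *next* session can also decode this hash lazily.
    fn decode_foreign(&mut self, hash: DefPathHash) -> Option<DefId> {
        let def_id = *self.prev_session.get(&hash)?;
        self.next_session.insert(hash, def_id);
        Some(def_id)
    }
}

fn main() {
    let mut prev = HashMap::new();
    prev.insert(0xf00, DefId { krate: 5, index: 7 });
    let mut cache = IncrCache::new(prev);
    assert_eq!(cache.decode_foreign(0xf00), Some(DefId { krate: 5, index: 7 }));
    assert!(cache.next_session.contains_key(&0xf00));
    assert_eq!(cache.decode_foreign(0xbad), None);
    println!("mapping carried forward");
}
```

The re-insert into `next_session` is the step that must happen on every decode path; the follow-up fix for rust-lang#79661 addressed cases where it was skipped.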
@bors r+
📌 Commit 7a9aa4f has been approved by |
☀️ Test successful - checks-actions |
rustc_metadata: Remove some dead code. Follow-up to rust-lang#74967.
Fixes rust-lang#79661.

In incremental compilation mode, we update a `DefPathHash -> DefId` mapping every time we create a `DepNode` for a foreign `DefId`. This mapping is written out to the on-disk incremental cache, and is read by the next compilation session to allow us to lazily decode `DefId`s.

When we decode a `DepNode` from the current incremental cache, we need to ensure that any previously-recorded `DefPathHash -> DefId` mapping gets recorded in the new mapping that we write out. However, PR rust-lang#74967 didn't do this in all cases, leading to us being unable to decode a `DefPathHash` in certain circumstances. This PR refactors some of the code around `DepNode` deserialization to prevent this kind of mistake from happening again.
…wesleywiser Properly re-use def path hash in incremental mode (Fixes rust-lang#79661; description as above)
PR #75813 implemented lazy decoding of the `DefPathTable` from crate metadata. However, it requires decoding the entire `DefPathTable` when incremental compilation is active, so that we can map a decoded `DefPathHash` to a `DefId` from an arbitrary crate.

This PR adds support for lazy decoding of dependency `DefPathTable`s when incremental compilation is active.

When we load the incremental cache and dep graph, we need the ability to map a `DefPathHash` to a `DefId` in the current compilation session (if the corresponding definition still exists).

This is accomplished by storing the old `DefId` (that is, the `DefId` from the previous compilation session) for each `DefPathHash` we need to remap. Since a `DefPathHash` includes the owning crate, the old crate is guaranteed to be the right one (if the definition still exists). We then use the old `DefIndex` as an initial guess, which we validate by comparing the expected and actual `DefPathHash`es. In most cases, foreign crates will be completely unchanged, which means that our guess will be correct. If our guess is wrong, we fall back to decoding the entire `DefPathTable` for the foreign crate. This still represents an improvement over the status quo, since we can skip decoding the entire `DefPathTable` for other crates (where all of our guesses were correct).