Hardhat compile yielding no results, not even a "Nothing to compile" message. #4085
@yvasilyev92 can you try upgrading to the latest version of Hardhat? If that doesn't fix it, then I will need more complete reproduction steps, including which OS and version of node.js you are using. Thanks!
@fvictorio Thanks for looking into this! I tried upgrading to

My repo that's having the issue has >100 contracts; I temporarily removed 30 contracts and the

However, I still need those 30 contracts & my project will keep growing. I tried using

Since I'm assuming it's a memory issue causing the

OS: macOS Ventura 13.4.1
Hmm, that's strange. There are big projects using Hardhat and they don't have this issue. I can take a look if your repo is public (or if it's private but you can give me access). With respect to increasing the memory, you can do this:
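(The code block from this comment was lost in extraction. A typical way to raise Node's heap limit for a Hardhat compile is shown below; the 8192 MB value is just an example, not taken from the original.)

```shell
# Give Node.js a larger old-space heap before running the compile task
# (value is in MB; adjust to your machine's available RAM)
NODE_OPTIONS="--max-old-space-size=8192" npx hardhat compile
```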
@fvictorio thanks! Is there a cache hardhat is using for this compilation? (not related to artifact cache)
The compilation cache is indeed under

@fvictorio ah, sorry, I should've asked: is there another local cache or memory space (not the artifacts or cache/solidity-files-cache.json) that Hardhat uses when running the compilation command? I've had 100+ contracts for a while and never had this issue, so I'm wondering if there's anything I can clean out so as to not have to increase memory. My concern is eventually hitting the memory limit again.

No, those should be the only ones involved 😕
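(For reference, the two caches discussed in this exchange can be cleared with Hardhat's own `clean` task; the `--global` variant also appears later in this thread.)

```shell
# Remove the project-level cache/ and artifacts/ directories
npx hardhat clean

# Additionally clear Hardhat's global cache (e.g. downloaded compiler binaries)
npx hardhat clean --global
```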
Hi @fvictorio, so I tried to increase memory size with

Rather than deleting 30 contracts, I tried deleting just 1 contract and running
@fvictorio Do you know what could cause an issue like this with

Btw, after reading the verbose logs in the case of a successful run, I noticed the step after

Could the issue lie within that step?
IIRC, the time indicated there is how much time elapsed since the last log, so it's the

Which, to be honest, isn't that surprising if you have a lot of contracts (and even less surprising if you are using viaIR), because solcjs is quite slow compared to the native compiler. What I'm not sure about is why you are getting a solcjs compiler instead of a native one, which is much faster. Which OS are you using?
I had a similar issue a couple of months ago and solved it by disabling the Yul optimizer. The same project also had a Heap out of memory issue, but it was when testing, not compiling. I was not using
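(Disabling the Yul optimizer, as described in the comment above, would look roughly like this in hardhat.config.ts; the compiler version and `runs` value are placeholders, not taken from the thread.)

```typescript
import { HardhatUserConfig } from "hardhat/config";

const config: HardhatUserConfig = {
  solidity: {
    version: "0.8.13", // placeholder: use your project's compiler version
    settings: {
      optimizer: {
        enabled: true,
        runs: 200,
        // Turn off only the Yul (IR-based) optimizer stage
        details: { yul: false },
      },
    },
  },
};

export default config;
```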
@clauBv23 thank you! Seems like a very similar issue. So disabling the Yul optimizer helped with the Heap out of memory issue?

@fvictorio I'm on macOS Ventura 13.4.1

Update: So with 100+ contracts, I cleared out my

I'm still searching for a solution, but I suppose the temporary workaround is this.. so in the case I ever need to recompile the
Hardhat tries to use the native solc first and, if that doesn't work, it switches to solcjs (which is much slower but always works). Since you already tried

On macOS, you should run something like this:
(give or take, go down that dir path until you find the compiler that Hardhat downloaded, which in your case seems to be version 0.8.13) What output do you get?
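(The command stripped from this comment was probably along these lines; the cache path and binary name here are assumptions, typical of Hardhat on macOS, and may vary by Hardhat version and CPU architecture.)

```shell
# Hardhat's downloaded native compilers typically live under this directory on macOS
ls ~/Library/Caches/hardhat-nodejs/compilers-v2/macosx-amd64/

# Run the downloaded 0.8.13 binary directly to check whether it works at all
~/Library/Caches/hardhat-nodejs/compilers-v2/macosx-amd64/solc-macosx-amd64-v0.8.13* --version
```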
That's terrible, let's see if we can find the root cause here 😞

Hey @yvasilyev92, are you still getting this issue?

I also encountered this issue, and maybe you should try to... wait for around 10 or 20 minutes? In my case (also compiling a lot of contracts in my company's repository), after around 10 minutes it works as expected.

@thanhnguyen2187 that's unexpected. Can you share the output of a run with
@alcuadrado for the output log, I also experienced a freeze at this line:
In the end, it doesn't have any problem:
Not sure what you mean by

@thanhnguyen2187 can you share your Hardhat config?
I'm going to tentatively close this for book-keeping reasons (we don't have enough information to look into it, and it's been some weeks since we asked), but I will happily re-open it if someone can provide more info.
Hi @fvictorio , apologies for not responding. I'm still having this issue, same as @thanhnguyen2187 .
Anything greater than 266 will not work. I really think the solution for this would be to find the local storage Hardhat uses and wipe it completely, because I didn't always have this issue. Seems like over time the allowed memory allocation for
@yvasilyev92 is your project public? These issues are hard to investigate without some way to reproduce them.

@fvictorio no, unfortunately it's not public. It's private, and the other members have no issue compiling. Could you help me determine where in local storage Hardhat stores any bin files / related files? I want to completely wipe all files Hardhat is storing.
Hey, sorry for not responding sooner. |
Thanks @fvictorio, I was using `yarn hardhat compile`, but after switching to `npx` I'm getting a new log:
@fvictorio some additional testing I've been doing.

And on the MacBook Pro where I keep getting a failed
@fvictorio I was somewhat able to fix the issue. Inside my
Version of Hardhat

2.12.1

What happened?

Earlier today `yarn hardhat compile` was working fine. Now when I run it I get no results at all, and no changes to the local `hardhat.config.ts` file or `package.json` were made. I tried opening an example Hardhat TypeScript project and `yarn hardhat compile` works fine; however, with my original repo, no such luck. I ran `yarn hardhat compile --verbose` and this is the result:

`yarn hardhat clean --global` also made no change.
also made no change.Minimal reproduction steps
n/a
Search terms
No response