
Amazon EC2 t1.micro can't run HHVM #1129

Closed
makersunion opened this issue Oct 4, 2013 · 38 comments

Comments

@makersunion

Server config:
EC2 t1.micro Ubuntu12.04 64bit

HHVM version:
HipHop VM v2.1.0-dev (rel)

The error is:
tcmalloc: large alloc 1209565184 bytes == (nil) @
tcmalloc: large alloc 1209565184 bytes == (nil) @
could not allocate 1209565183 bytes for translation cache

And I've tried this: b5163ad
But it still doesn't work.
Help, please.

@scannell
Contributor

scannell commented Oct 4, 2013

What happens when you try that? Are you sure you specified those settings correctly? (It should be complaining about a different number of bytes at least.)

@makersunion
Author

No, it always shows: could not allocate 1209565183 bytes for translation cache.
And I used this config:
Eval {
JitASize = 256 << 20 # 256MB
JitAStubsSize = 256 << 20 # 256MB
JitGlobalDataSize = JitASize >> 2 # 64MB
}
When I run hhvm -m daemon -u ubuntu -c hhvm.hdf it shows this alert.

@scannell
Contributor

scannell commented Oct 4, 2013

Try passing the actual numbers? I don't think configuration file arguments are evaluated...

@makersunion
Author

By actual numbers, do you mean JitASize = 256M?

@scannell
Contributor

scannell commented Oct 4, 2013

An integer number of bytes, e.g. 268435456.

@makersunion
Author

There is no change. Should Eval { } go inside Server { }, or should Server { } and Eval { } be separate top-level sections?

@scannell
Contributor

scannell commented Oct 4, 2013

Should just be under Eval { }.

@makersunion
Author

Doesn't work :(

@scannell
Contributor

scannell commented Oct 4, 2013

Try passing the arguments on the commandline, or just attaching a debugger in the runtime option parsing code and seeing if it's finding the values -- this should narrow down the problem.
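
A minimal sketch of passing those sizes on the command line, assuming this HHVM build's -v / --config-value flag accepts dotted option names (worth double-checking against hhvm --help):

hhvm -m daemon -u ubuntu -c hhvm.hdf \
  -v Eval.JitASize=268435456 \
  -v Eval.JitAStubsSize=268435456 \
  -v Eval.JitGlobalDataSize=67108864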

@makersunion
Author

Eval {
JitASize = 268435456
JitAStubsSize = 268435456
JitGlobalDataSize = 67108864
}
Eval.Debugger {
EnableDebugger = true
EnableDebuggerServer = true
Port = 80
DefaultSandboxPath = /var/www/
}

I just have this, and there's no change.

@makersunion
Author

hhvm --debug-config hhvm.hdf
No change. :(

@scannell
Contributor

scannell commented Oct 4, 2013

I verified that with the following in a .hdf file:

Eval {
JitASize = 1024
JitAStubsSize = 1024
JitGlobalDataSize = 100
}

the VM terminates because the sizes are too small, so these values are taking effect. Are you sure that .hdf file exists? I'd try stepping through the code.

@makersunion
Author

Yeah,
root@ubuntu:/etc# hhvm -m daemon -u ubuntu -c hhvm.hdf
tcmalloc: large alloc 1209565184 bytes == (nil) @
tcmalloc: large alloc 1209565184 bytes == (nil) @
could not allocate 1209565183 bytes for translation cache
I'm sure /etc/hhvm.hdf is OK.

@makersunion
Author

This is the same question: http://www.swageroo.com/wordpress/hiphop-vm-on-ec2-could-not-allocate-1210089471-bytes-for-translation-cache/ m1.small is OK, but t1.micro doesn't work. And an Amazon EC2 t1.micro has 678MB of memory.

@scannell
Contributor

scannell commented Oct 4, 2013

What happens when you step through the code?

@makersunion
Author

Yeap~!
When I use this config:
Eval {
JitASize = 1024
JitAStubsSize = 1024
JitGlobalDataSize = 100
}
alert:
Log file not specified under daemon mode.\n\n

@makersunion
Author

Looks like it worked. Wait... let me try it now.

@makersunion
Author

uh... :(
root@ubuntu:/etc# hhvm -m daemon -u ubuntu -c hhvm_debug.hdf
Log file not specified under daemon mode.\n\n
root@ubuntu:/etc# hhvm -m server
mapping self...
mapping self took 0'00" (54830 us) wall time
loading static content...
searching all files under source root...
analyzing 728 files under source root...
loaded 0 bytes of static content in total
loading static content took 0'00" (22046 us) wall time
tcmalloc: large alloc 1209565184 bytes == (nil) @
tcmalloc: large alloc 1209565184 bytes == (nil) @
could not allocate 1209565183 bytes for translation cache

It doesn't work...

@scannell
Contributor

scannell commented Oct 4, 2013

Of course the second time didn't work, you didn't specify an .hdf file...

@makersunion
Author

How do I specify an .hdf file?

@makersunion
Author

hhvm -m server -u ubuntu -c hhvm_debug.hdf ?

@makersunion
Author

Allocation sizes ASize, AStubsSize, and GlobalDataSize are too small.

@makersunion
Author

Should I try 512MB converted to bytes?

@scannell
Contributor

scannell commented Oct 4, 2013

If you got the 'too small' portion, you specified it correctly. Now change the values to sizes that will fit on your VM.

@makersunion
Author

I tried to configure it for the t1.micro with this:
Eval {
JitASize = 1149304830
JitAStubsSize = 1149304830
JitGlobalDataSize = 608206847
}
But it alerts:
Combined size of ASize, AStubSize, and GlobalDataSize must be < 2GiB to support 32-bit relative addresses
What does this mean?
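
As a rough check against that message, the three values above sum to far more than 2 GiB:

1149304830 + 1149304830 + 608206847 = 2906816507 bytes ≈ 2.71 GiB
2 GiB = 2147483648 bytes

so the combined JIT allocation is too large for 32-bit relative addressing to span.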

@makersunion
Author

Yep! It worked.
For the t1.micro I used this config:
Eval {
JitASize = 134217728
JitAStubsSize = 134217728
JitGlobalDataSize = 67108864
}

I want to find the most appropriate size ratio for the Amazon EC2 t1.micro.
Thanks very much, scannell.
I'm very grateful.
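
For reference, that working config requests 134217728 + 134217728 + 67108864 = 335544320 bytes in total, i.e. 128MB + 128MB + 64MB = 320MB of translation cache, which leaves room for the rest of HHVM within the t1.micro's memory.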

@scannell
Contributor

scannell commented Oct 4, 2013

Glad you got it working.

@scannell scannell closed this as completed Oct 4, 2013
@kstan79

kstan79 commented Oct 24, 2013

Can I ask how long it takes to compile HHVM on an Amazon EC2 micro instance?

@karpa13a

@kstan79 use ubuntu packages
@sgolemon maybe it's time to make a "nightly" builds repo?

@scannell
Contributor

In general we've been talking about other release/packaging strategies but haven't made any concrete decisions yet. (In the meantime, anyone is free to set up nightly builds for us from the available source if they would like.)

@karpa13a

@scannell can you commit to master, or post in this issue, the debian/ directory contents so we can build it ourselves?

@scannell
Contributor

@sgolemon is the right person to ask for that -- I'm not entirely sure how she packages these releases.

@jrborbars

I'm trying to run hhvm in a t1.micro instance (AWS). The message below appears when I try to run:

sudo /usr/bin/hhvm --mode daemon --user web --config /etc/hhvm.hdf

from http://www.hhvm.com/blog/113/getting-wordpress-running-on-hhvm.

alert:
Log file not specified under daemon mode.\n\n

This is related with:

Eval {
JitASize = 134217728
JitAStubsSize = 134217728
JitGlobalDataSize = 67108864
}

Where do I put this code? In server.hdf or config.hdf?

Thanks a lot.

@scannell
Contributor

scannell commented Jan 3, 2014

I'm not sure if you still need to change the JIT sizes after 5d60ea9. The alert is telling you just that -- I'm guessing /etc/hhvm.hdf doesn't specify a log. The name of the .hdf file doesn't matter, the only thing that matters is if you want HHVM to use it you need to specify it as the --config argument when starting HHVM.
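
A minimal sketch of a Log section for the .hdf file passed via --config, assuming the old hdf-style Log options; the file path is only an example and must be writable by the user HHVM runs as:

Log {
  Level = Warning
  UseLogFile = true
  File = /var/log/hhvm/error.log
}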

@jrborbars

OK. But /etc/hhvm.hdf doesn't exist anymore. When I install from the binaries (sudo apt-get install hhvm), three files are created under the /etc/hhvm directory: config.hdf, server.hdf and php.ini (the latter is an empty file). My instance seems to be running (after running sudo /usr/bin/hhvm --mode daemon --config /etc/hhvm/server.hdf), but when I point to my index.php under the app directory the only response is: not found.

What am I missing?
Thanks for your quick response too.

P.S.: the header of response is (from firebug):
Content-Length 9
Content-Type text/html; charset=utf-8
Date Fri, 03 Jan 2014 23:21:50 GMT
X-Powered-By HPHP

@scannell
Contributor

scannell commented Jan 3, 2014

I believe in that article it says to create /etc/hhvm.hdf using the sample config file in the article. You can either use the inbuilt ones that ship with the binaries or the one in the article -- pick one and use that as the --config argument.

@jrborbars

Yes, I picked one (server.hdf, provided as is). On the one hand the server appears to be running, but on the other hand none of the PHP code is being processed, and I don't know why.

@scannell
Contributor

scannell commented Jan 4, 2014

For WordPress I would use a config file with the contents of the one on the wiki; otherwise you probably need to make sure all the paths are correct.
