
Issue 70 - Make Rack::Cache multithread friendly #71

Merged (1 commit) on Aug 5, 2012

Conversation

@jtblin (Contributor) commented Aug 4, 2012

As described in #70, this will fix the issue with assets not being loaded correctly when using Puma, a multithreaded Ruby app server. It changes the call method in lib/rack/cache/context.rb from

def call(env)
  if env['rack.run_once']
    call! env
  else
    clone.call! env
  end
end

to

def call(env)
  if env['rack.run_once'] && !env['rack.multithread']
    call! env
  else
    clone.call! env
  end
end

Let me know what you think.
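For context, here is a minimal sketch of why the per-request clone matters. This is a hypothetical middleware, not Rack::Cache itself: any middleware that stashes per-request state in instance variables is unsafe when a threaded server such as Puma pushes concurrent requests through a single shared instance, and cloning gives each request its own shallow copy.

require 'rack'

class StatefulMiddleware
  def initialize(app)
    @app = app
  end

  def call(env)
    # Reuse this instance only when the server guarantees a single,
    # single-threaded invocation; otherwise dispatch on a shallow copy
    # so concurrent requests never share @request.
    if env['rack.run_once'] && !env['rack.multithread']
      call! env
    else
      clone.call! env
    end
  end

  def call!(env)
    @request = Rack::Request.new(env) # per-request state lives here
    @app.call(env)
  end
end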

@rtomayko (Owner) commented Aug 5, 2012

Seems sane enough. Why would rack.run_once be set in that case though?

@rtomayko (Owner) commented Aug 5, 2012

Sorry, I missed the description in #70. It still seems really wrong that Puma would set run_once to true, but I guess we'll go with it. The run_once path is mostly there for tests. This should be fine.

rtomayko added a commit that referenced this pull request on Aug 5, 2012: "Issue 70 - Make Rack::Cache multithread friendly"
rtomayko merged commit 5f0ce14 into rtomayko:master on Aug 5, 2012
@jtblin (Contributor, Author) commented Aug 6, 2012

Yes, you're correct that Puma should not set rack.run_once to true; they will actually change that in 1.0.7, but it doesn't hurt to test for rack.multithread anyway. Thanks for accepting the pull request!
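As an aside, a quick way to see which concurrency flags a given server advertises is a throwaway config.ru that echoes them back (a sketch for illustration, not part of this PR):

run lambda { |env|
  # Report the Rack env flags the server sets for this request.
  flags = "rack.run_once=#{env['rack.run_once'].inspect} " \
          "rack.multithread=#{env['rack.multithread'].inspect}\n"
  [200, { 'Content-Type' => 'text/plain' }, [flags]]
}

Booting this under Puma versus another server shows the values each one sets, which is how the incorrect run_once flag was noticed.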

@Gibheer commented Dec 21, 2012

Hi,

can you please do a release with that fix? I also hit this bug and searched two days for a solution.

Thank you

@amacneil commented
👍 please publish a gem version with this patch included
