GraphQL::FragmentCache
powers up graphql-ruby with the ability to cache response fragments: you can mark any field as cached and it will never be resolved again (at least while the cache is valid). For instance, the following code caches `title` for each post:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```
Add the gem to your Gemfile:

```ruby
gem 'graphql-fragment_cache'
```

Then add the plugin to your schema class (make sure to turn interpreter mode on with AST analysis!):
```ruby
class GraphqSchema < GraphQL::Schema
  use GraphQL::Execution::Interpreter
  use GraphQL::Analysis::AST
  use GraphQL::FragmentCache

  query QueryType
end
```
Include `GraphQL::FragmentCache::Object` in your base type class:
```ruby
class BaseType < GraphQL::Schema::Object
  include GraphQL::FragmentCache::Object
end
```
If you're using resolvers, include the module into the base resolver as well:
```ruby
class Resolvers::BaseResolver < GraphQL::Schema::Resolver
  include GraphQL::FragmentCache::ObjectHelpers
end
```
Now you can add the `cache_fragment:` option to your fields to turn caching on:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: true
end
```
Alternatively, you can use the `cache_fragment` method inside resolver methods:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment { Post.find(id) }
  end
end
```
If you use connections and plan to cache them, please turn on the new connections hierarchy in your schema:
```ruby
class GraphqSchema < GraphQL::Schema
  # ...
  use GraphQL::Pagination::Connections
end
```
Cache keys consist of the following parts: namespace, implicit key, and explicit key.
You can optionally define a namespace that will be prefixed to every cache key:

```ruby
GraphQL::FragmentCache.namespace = "my-prefix"
```
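The effect of the namespace can be sketched in plain Ruby (an illustrative helper, not the gem's internal code):

```ruby
# Illustrative sketch: a configured namespace is prepended to every cache key;
# without a namespace, the key is used as-is.
def namespaced_key(namespace, key)
  namespace ? "#{namespace}/#{key}" : key
end

namespaced_key("my-prefix", "schema-key/query-key")
```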
The implicit part of a cache key contains information about the schema and the current query. It includes:

- Hex digest of the schema definition (to make sure the cache is cleared when the schema changes).
- The current query fingerprint, consisting of the path to the field, arguments information and the selection set.

Let's take a look at an example:
```ruby
query = <<~GQL
  query {
    post(id: 1) {
      id
      title
      cachedAuthor {
        id
        name
      }
    }
  }
GQL
```
```ruby
schema_cache_key = GraphqSchema.schema_cache_key
path_cache_key = "post(id:1)/cachedAuthor"
selections_cache_key = "[#{%w[id name].join(".")}]"
query_cache_key = Digest::SHA1.hexdigest("#{path_cache_key}#{selections_cache_key}")

cache_key = "#{schema_cache_key}/#{query_cache_key}/#{object_cache_key}"
```
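The query part of this composition can be run as standalone Ruby (a simplified illustration of the scheme above, not the gem's internal API):

```ruby
require "digest"

# Simplified sketch: the implicit query cache key is a SHA1 digest of the
# field path plus the selection set, mirroring the example above.
def query_cache_key(path_cache_key, selections)
  selections_cache_key = "[#{selections.join(".")}]"
  Digest::SHA1.hexdigest("#{path_cache_key}#{selections_cache_key}")
end

query_cache_key("post(id:1)/cachedAuthor", %w[id name])
```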
You can override `schema_cache_key`, `query_cache_key`, `path_cache_key` or `object_cache_key` by passing parameters to the `cache_fragment` calls:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(query_cache_key: "post(#{id})") { Post.find(id) }
  end
end
```
Overriding `path_cache_key` might be helpful when you resolve the same object nested in multiple places (e.g., `Post` and `Comment` both have `author`), but want to make sure the cache is invalidated when the selection set is different. The same can be done with the `cache_fragment:` option:
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {query_cache_key: "post_title"}
end
```
Overriding `object_cache_key` is helpful when the cached value differs from the one used as a key, e.g., a database query that is post-processed before caching:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    query = Post.where("updated_at < ?", Time.now - 1.day)
    cache_fragment(object_cache_key: query.cache_key) { query.some_process }
  end
end
```
In most cases you want your cache key to depend on the resolved object (say, an ActiveRecord model). You can do that by passing an argument to the `#cache_fragment` method, similar to the Rails views `#cache` method:
```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```
You can pass arrays as well to build a compound cache key:
```ruby
def post(id:)
  post = Post.find(id)
  cache_fragment([post, current_account]) { post }
end
```
You can omit the block if its return value is the same as the cached object:
```ruby
# the following line
cache_fragment(post)

# is the same as
cache_fragment(post) { post }
```
Using literals: even when the same string is used for all queries, the cache still varies per argument and per selection set (because of the query key).
```ruby
def post(id:)
  cache_fragment("find_post") { Post.find(id) }
end
```
Combining with options:
```ruby
def post(id:)
  cache_fragment("find_post", expires_in: 5.minutes) { Post.find(id) }
end
```
Dynamic cache key:
```ruby
def post(id:)
  last_updated_at = Post.select(:updated_at).find_by(id: id)&.updated_at
  cache_fragment(last_updated_at, expires_in: 5.minutes) { Post.find(id) }
end
```
Note the usage of `.select(:updated_at)` in the cache key query to keep this verification query as fast and light as possible.
You can also add `touch: true` to the corresponding `belongs_to` association (e.g., the author's `belongs_to :post`), so that the post is touched, and its cached fragments invalidated, whenever the author is updated.
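If you use Rails, that setup could look like the following (hypothetical models, shown only to illustrate the `touch` option):

```ruby
class Post < ApplicationRecord
  has_many :authors
end

class Author < ApplicationRecord
  # Updating an author touches the post's updated_at,
  # which invalidates cache keys built from the post.
  belongs_to :post, touch: true
end
```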
When using the `cache_fragment:` option, it's only possible to use the resolved value as a cache key by setting:
```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :object} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  cache_fragment(Post.find(id))
end
```
Also, you can pass `:value` to the `cache_key:` argument to use the returned value to build a key:
```ruby
field :post, PostType, null: true, cache_fragment: {cache_key: :value} do
  argument :id, ID, required: true
end

# this is equal to
def post(id:)
  post = Post.find(id)
  cache_fragment(post) { post }
end
```
The cache key part for the passed argument is generated as follows:

- Use `object_cache_key: "some_cache_key"` if passed to `cache_fragment`.
- Use `#graphql_cache_key` if implemented.
- Use `#cache_key` (or `#cache_key_with_version` for modern Rails) if implemented.
- Use `self.to_s` for primitive types (strings, symbols, numbers, booleans).
- Raise `ArgumentError` if none of the above apply.
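The precedence above can be sketched in plain Ruby (a hypothetical helper, not the gem's implementation; the explicit `object_cache_key:` override happens before this chain is consulted):

```ruby
# Hypothetical sketch of the key lookup order described above.
def object_key(object)
  return object.graphql_cache_key if object.respond_to?(:graphql_cache_key)
  return object.cache_key_with_version if object.respond_to?(:cache_key_with_version)
  return object.cache_key if object.respond_to?(:cache_key)

  # Primitive types fall back to their string representation
  if [String, Symbol, Numeric, TrueClass, FalseClass].any? { |type| object.is_a?(type) }
    return object.to_s
  end

  raise ArgumentError, "Can't build a cache key for #{object.inspect}"
end
```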
By default, we do not take context into account when calculating cache keys. That's because caching is more efficient when it's context-free.
However, if you want some fields to be cached per context, you can do that either by passing context objects directly to the `#cache_fragment` method (see above) or by adding a `context_key` option to `cache_fragment:`.
For instance, imagine a query that returns the current user's social profiles:
```graphql
query {
  socialProfiles {
    provider
    id
  }
}
```
You can cache the result using the context (`context[:user]`) as a cache key:
```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false, cache_fragment: {context_key: :user}

  def social_profiles
    context[:user].social_profiles
  end
end
```
This is equal to using `#cache_fragment` the following way:
```ruby
class QueryType < BaseObject
  field :social_profiles, [SocialProfileType], null: false

  def social_profiles
    cache_fragment(context[:user]) { context[:user].social_profiles }
  end
end
```
To cache conditionally, use the `if:` (or `unless:`) option:
```ruby
def post(id:)
  cache_fragment(if: current_user.nil?) { Post.find(id) }
end

# or

field :post, PostType, cache_fragment: {if: -> { current_user.nil? }} do
  argument :id, ID, required: true
end
```
You can configure default options that will be passed to all `cache_fragment` calls and `cache_fragment:` configurations. For example:
```ruby
GraphQL::FragmentCache.configure do |config|
  config.default_options = {
    expires_in: 1.hour, # Expire cache keys after 1 hour
    schema_cache_key: nil # Do not clear the cache on each schema change
  }
end
```
You can force the cache to renew during query execution by adding `renew_cache: true` to the query context:

```ruby
MyAppSchema.execute("query { posts { title } }", context: {renew_cache: true})
```
This will treat any cached value as missing even if it's present, and store fresh new computed values in the cache. This can be useful for cache warmers.
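A cache warmer built on this could look like the sketch below (the query list, helper name, and `FakeSchema` test double are illustrative assumptions, not part of the gem):

```ruby
# Hypothetical cache warmer: re-executes a list of popular queries with
# renew_cache: true so fresh values are written even when entries exist.
WARM_QUERIES = [
  "query { posts { title } }"
].freeze

def warm_fragment_cache(schema)
  WARM_QUERIES.each do |query|
    schema.execute(query, context: {renew_cache: true})
  end
end
```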
It's up to you to decide which caching engine to use; all you need is to configure the cache store:
```ruby
GraphQL::FragmentCache.configure do |config|
  config.cache_store = MyCacheStore.new
end
```
Or, in Rails:
```ruby
# config/application.rb (or config/environments/<environment>.rb)
Rails.application.configure do |config|
  # arguments and options are the same as for `config.cache_store`
  config.graphql_fragment_cache.store = :redis_cache_store
end
```
The cache store must implement the `#read(key)`, `#exist?(key)` and `#write_multi(hash, **options)` or `#write(key, value, **options)` methods.
The gem provides only an in-memory store out of the box (`GraphQL::FragmentCache::MemoryStore`), which is used by default.
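For illustration, a minimal hash-backed store satisfying that interface might look like this (`SimpleHashStore` is a made-up name; it ignores options such as `expires_in` for brevity, which a real store should honor):

```ruby
# Minimal in-memory store exposing the read/exist?/write interface expected
# by the gem. Not suitable for production: no expiration, not thread-safe.
class SimpleHashStore
  def initialize
    @data = {}
  end

  def read(key)
    @data[key]
  end

  def exist?(key)
    @data.key?(key)
  end

  def write(key, value, **_options)
    @data[key] = value
  end
end
```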
You can pass store-specific options to `#cache_fragment` or `cache_fragment:`. For example, to set expiration (assuming the store's `#write` method supports the `expires_in` option):
```ruby
class PostType < BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache_fragment: {expires_in: 5.minutes}
end
```
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(expires_in: 5.minutes) { Post.find(id) }
  end
end
```
If you want to call `#cache_fragment` from places other than fields or resolvers, you'll need to pass `context` explicitly and turn on `raw_value` support. For instance, let's take a look at this extension:
```ruby
class Types::QueryType < Types::BaseObject
  class CurrentMomentExtension < GraphQL::Schema::FieldExtension
    # turning on cache_fragment support
    include GraphQL::FragmentCache::ObjectHelpers

    def resolve(object:, arguments:, context:)
      # context is passed explicitly
      cache_fragment(context: context) do
        result = yield(object, arguments)
        "#{result} (at #{Time.now})"
      end
    end
  end

  field :event, String, null: false, extensions: [CurrentMomentExtension]

  def event
    "something happened"
  end
end
```
With this approach you can use `#cache_fragment` in any place where you have access to the `context`. When the context is not available, the error `cannot find context, please pass it explicitly` will be thrown.
If you have a fragment that is accessed multiple times (e.g., a list of items that belong to the same owner, where the owner is cached), you can avoid multiple cache reads by using the `:keep_in_context` option:
```ruby
class QueryType < BaseObject
  field :post, PostType, null: true do
    argument :id, ID, required: true
  end

  def post(id:)
    cache_fragment(keep_in_context: true, expires_in: 5.minutes) { Post.find(id) }
  end
end
```
This can reduce the number of cache calls but increase memory usage, because the value returned from the cache is kept in the GraphQL context until the query is fully resolved.
`Schema#execute`, graphql-batch and graphql-ruby-fragment_cache do not play well together. The problem appears when `cache_fragment` is called inside the `.then` block:
```ruby
def cached_author_inside_batch
  AuthorLoader.load(object).then do |author|
    cache_fragment(author, context: context)
  end
end
```
The problem is that the context is not properly populated inside the block (the gem uses `:current_path` to build the cache key). There are two possible workarounds: use dataloaders, or manage `:current_path` manually:
```ruby
def cached_author_inside_batch
  outer_path = context.namespace(:interpreter)[:current_path]

  AuthorLoader.load(object).then do |author|
    context.namespace(:interpreter)[:current_path] = outer_path
    cache_fragment(author, context: context)
  end
end
```
- Caching does not work for Union types because of the `Lookahead` implementation: it requires the exact type to be passed to the `selection` method (you can find the discussion here). This method is used for cache key building, and I haven't found a workaround yet (PR in progress). If you get the `Failed to look ahead the field` error, please pass `query_cache_key` explicitly:
```ruby
field :cached_avatar_url, String, null: false

def cached_avatar_url
  cache_fragment(query_cache_key: "post_avatar_url(#{object.id})") { object.avatar_url }
end
```
Based on the original gist by @palkan and @ssnickolay.
Bug reports and pull requests are welcome on GitHub at https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache.
The gem is available as open source under the terms of the MIT License.