While I love that the A+ Promise gives us the ability to compose things in a functional context, I found that the current implementation requires quite a bit of monkey-patching and additional types/interfaces to take advantage of the natural concurrency that futures typically give us. Would the team be open to a PR around a basic future/task/IO type that can take advantage of nonblocking IO?
GraphQL queries are a natural fit for running concurrent fetches at each layer: multiple independent Active Record queries, web requests (e.g., Elasticsearch), etc. The interface could even be the same as Promise (though I'd recommend that `#then` be sugar for `#map` and `#flat_map`, plus a few additional methods to make things more predictable/composable).
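To make the `#then`-as-sugar idea concrete, here's a minimal synchronous sketch (the `Task` class and all names here are hypothetical, not a proposed final API): `#map` transforms the eventual value, `#flat_map` chains a block that itself returns a task, and `#then` dispatches between the two based on what the block returns.

```ruby
# Hypothetical sketch: a lazy Task whose #then is sugar for #map/#flat_map.
class Task
  def initialize(&computation)
    @computation = computation
  end

  # Run the computation and return its result.
  def value
    @computation.call
  end

  # Transform the eventual value with a plain function.
  def map(&blk)
    Task.new { blk.call(value) }
  end

  # Chain onto a block that returns another Task.
  def flat_map(&blk)
    Task.new { blk.call(value).value }
  end

  # Promise-style sugar: flat-map when the block returns a Task, map otherwise.
  def then(&blk)
    Task.new do
      result = blk.call(value)
      result.is_a?(Task) ? result.value : result
    end
  end
end
```

With that dispatch, `Task.new { 2 }.then { |x| x + 1 }` behaves like `#map`, while `Task.new { 2 }.then { |x| Task.new { x * 10 } }` behaves like `#flat_map`.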
Here's an example for a single resolver:
```ruby
resolve ->(_, inputs, ctx) {
  Task
    .all(current_user_required_task(ctx), relay_node_task(inputs[:categoryId], ctx))
    .then { |user, category| create_project_task(user, category, inputs[:description]) }
    .then { |project| { project: project } }
}
# *_task naming for explicitness
```
This mutation can fire off the Active Record queries for the current user and the category at the same time using `Thread`. As soon as one fails, it short-circuits by raising an exception rather than waiting for the other query to complete. When both queries succeed, it can immediately flat-map onto another task that creates the project, which maps into the final return value.
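As a rough sketch of how `Task.all` could be backed by plain threads (again, hypothetical names, not a proposed final API): `Thread#value` joins the thread and re-raises any exception the thread raised, so a failed query propagates out when the combined task is forced.

```ruby
# Hypothetical sketch: Task.all running each task on its own Thread.
class Task
  def initialize(&computation)
    @computation = computation
  end

  def value
    @computation.call
  end

  # Start every task on its own thread, then join in order.
  # Thread#value re-raises the thread's exception, so a failed query
  # raises here. (A production version would also cancel the
  # remaining threads on first failure.)
  def self.all(*tasks)
    new do
      threads = tasks.map { |task| Thread.new { task.value } }
      threads.map(&:value)
    end
  end

  # Promise-style sugar, as above.
  def then(&blk)
    Task.new do
      result = blk.call(value)
      result.is_a?(Task) ? result.value : result
    end
  end
end
```

Because `#then` takes a non-lambda block, the array produced by `Task.all` auto-splats into `|user, category|`-style parameters, matching the resolver example above.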
A library like GraphQL Batch could take further advantage of concurrency by using `Task.all` internally at each nested node of the query tree. Fulfilled tasks are memoized, so later queries can avoid re-fetching records that earlier queries have already fetched, combining the results afterwards.
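The memoization piece could look something like this (a sketch with hypothetical names, not GraphQL Batch's actual API): a loader hands out one task per key, so a record requested at a deeper layer of the query tree reuses the task an earlier layer already created.

```ruby
# Minimal Task for the sketch: memoizes its fulfilled value.
class Task
  def initialize(&computation)
    @computation = computation
  end

  def value
    @value ||= @computation.call
  end
end

# Hypothetical loader: caches one Task per key, so repeated loads of
# the same record share a single fetch.
class RecordLoader
  def initialize(&fetcher)
    @fetcher = fetcher
    @tasks = {}
  end

  def load(id)
    @tasks[id] ||= Task.new { @fetcher.call(id) }
  end
end
```

Calling `loader.load(1)` twice returns the same task, and forcing it any number of times runs the fetcher exactly once.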
There's lots of potential here, but a bit of work to get there. I'm curious about community interest, though, seeing as we're beginning to explore these benefits internally.