feat: custom AI executors #1903
Draft · YousefED wants to merge 8 commits into feature/call-streamtools-manually from feature/ai-executors
+1,331 −658
Conversation
This was referenced Jul 31, 2025
@blocknote/ariakit
@blocknote/code-block
@blocknote/core
@blocknote/mantine
@blocknote/react
@blocknote/server-util
@blocknote/shadcn
@blocknote/xl-ai
@blocknote/xl-docx-exporter
@blocknote/xl-email-exporter
@blocknote/xl-multi-column
@blocknote/xl-odt-exporter
@blocknote/xl-pdf-exporter
Instead of the current solution, where BlockNote makes direct calls to LLMs from the client using the Vercel AI SDK (or via a proxy), you can now define custom executors and route calls via your own backend.
On your backend, you can then invoke your LLM (potentially adding more context, tools, etc.). While a bit more work to set up, this architecture is more in line with the Vercel AI SDK primitives: `streamObject`, `generateObject`, `streamText`, or `generateText`.
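As a rough illustration of the idea, here is a minimal sketch of a client-side executor that routes the LLM call through your own backend. The endpoint name (`/api/ai`), the request shape, and the helper names are assumptions for illustration, not BlockNote's actual API:

```typescript
// Hypothetical sketch: a custom executor that posts the LLM request to your
// own backend instead of calling the provider directly from the client.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Pure helper: compose the request body sent to the backend.
// (The `metadata` field is an assumption showing where app-specific
// context could be attached.)
function buildExecutorRequest(messages: ChatMessage[], userId: string) {
  return {
    messages,
    metadata: { userId, source: "blocknote-ai" },
  };
}

// The executor itself: POST the messages to your backend and hand the
// streamed response body back for the editor to consume.
async function customExecutor(messages: ChatMessage[], userId: string) {
  const res = await fetch("/api/ai", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildExecutorRequest(messages, userId)),
  });
  if (!res.ok) throw new Error(`AI backend error: ${res.status}`);
  return res.body; // a ReadableStream of the model output
}
```

Because the backend owns the actual SDK call, it can add context or tools before invoking the model, which is the main point of this architecture.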
I think this will be a significant architectural improvement, but consider this a first draft / RFC (`streamObject` and `generateObject` are implemented for now) (source). (At this moment the live demo doesn't work yet; it will require a deploy of the server, which is based on `main`.)

TODOs

- `streamText` / `generateText` instead of object generation

To discuss
Messages configurability

The current implementation still composes the `messages` array on the client (configurable via a `PromptBuilder`). Of course, this can still be modified on the server. Alternatively, we could choose to pass all relevant information from the editor (document, cursor position, etc.) to the backend, and compose the messages array there.

Response handling

The current demo (see server code) passes in the schema from BlockNote and streams the response directly back to the client as a data stream. Real-world use cases might also require mixing in other (app-specific) tools, which would change the schema. We'll need some feedback on whether this needs better support in the API. Of course, it's always possible to construct your own schema (input) and data stream (output), but this might not be the ideal DX.
The current demo (see server code) passes in the schema from BlockNote and directly responses to the client as a datastream. Real-world use-cases might also require mixing with other (app-specific) tools which would change the schema. We'll need some feedback on whether this needs better support in the API. Of course, it's always possible to construct your own schema (input) and datastream (output), but this might not be the ideal DX