
feat(expand): Enables the use of PocketBase's 'Expand' property #41


Draft · wants to merge 10 commits into master

Conversation

cf-david-easton

PocketBase has a useful way to automatically expand relation fields with their data. This PR explores enabling the pocketbase-loader to do the same; this has to be implemented in both schema generation and fetching of the actual data.

At present I am only allowing one level of depth; however, it would not be terribly complex to allow deeper chaining, though there would still need to be a cut-off point.

Also, one decision I made whilst doing this was to preserve the way PocketBase returns expanded fields in their own object. However, I am wondering if, for a more streamlined experience, we should expand the values directly in their original field.
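For context, a minimal sketch of how this could look from the consumer's side. The `expand` option name matches the code in this PR, but its exact shape, the `pocketbaseLoader` usage, and the collection/field names are assumptions for illustration:

    // Hypothetical loader config: expand the "author" relation on a "posts" collection.
    import { defineCollection } from "astro:content";
    import { pocketbaseLoader } from "astro-loader-pocketbase";

    const posts = defineCollection({
      loader: pocketbaseLoader({
        url: "https://example.pocketbase.io",
        collectionName: "posts",
        // Assumed shape: a list of relation field names to expand.
        expand: ["author"]
      })
    });

    // PocketBase keeps expanded records in a separate `expand` object, so an entry
    // would roughly look like:
    // { id: "...", title: "...", author: "<relation id>", expand: { author: { id: "...", name: "..." } } }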

Owner

@pawcoding left a comment


I had a quick look over the code for some initial feedback, but I'll have a second look on the weekend when I have a bit more time to experiment with it. I'm a bit confused about all that array stuff, but it probably just needs a bit more thinking to understand it.

Regarding the "how deep should we nest", PocketBase itself allows for 6 levels of nesting, so we could also do that. Should be easy since it's just a recursive call. But before that I definitely want to finish a PR myself that reduces the number of calls to get a new token.

About the "overwrite the original value of expanded fields with their relations", I would leave it as is with the current implementation. I don't want to transform the original object too much, and I only introduced the file URL transformer since the file is almost useless without the actual URL (at least in my use cases so far, where I uploaded e.g. an icon for a project).

Can you also turn on GitHub Actions on your fork so that the tests can run? You don't need to set up anything special, just enabling them and maybe pushing another commit should trigger them.

collectionName: expandedFieldDefinition.collectionId,
superuserCredentials: options.superuserCredentials,
localSchema: options.localSchema,
jsonSchemas: options.jsonSchemas,
Owner


It's probably fine, but using the same jsonSchemas would leave us unable to define a schema with the same name for the "parent" and the expanded "child" with different values.

Author


I haven't dived too much into the implications of expand with the localSchema and jsonSchemas. How would we go about resolving this? If it's a case of name collisions, can we cleverly modify the collection names for the linked field, or check whether a pre-existing schema exists before making a new one? Sorry if this sounds like nonsense; I need to look much more into the jsonSchema stuff.

Comment on lines 161 to 164
expandedFields[expandedFieldName] = z.union([
  expandedchema,
  z.array(expandedchema)
]);
Owner


What is this used for? So that when you have a relation with multiple records it's typed as an array instead of a single value?
Since this information is encoded in the original database schema, we can use it to type the field as either a single value or an array.
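As a rough sketch of that idea (assuming the relation field definition exposes a maxSelect value, where 1 means a single relation; the helper name and the optional handling are illustrative, not part of this PR):

    import { z, type ZodSchema } from "zod";

    // Pick a single schema or an array schema based on the relation field definition.
    function schemaForRelation(
      expandedSchema: ZodSchema,
      maxSelect: number | undefined
    ): ZodSchema {
      const isSingle = maxSelect === 1;
      // Expanded relations may be missing when the field is empty, hence optional.
      return isSingle ? expandedSchema.optional() : z.array(expandedSchema).optional();
    }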

Author


Yes, that was my intention with the union: to show it could be a single value or an array of the schema type.
Sorry, could you elaborate on your second point? Is there a better way I can infer this?

`The provided field in the expand property "${expandedFieldName}" does not have an associated collection linked to it, we need this in order to know the shape of the related schema.`
);
} else {
const expandedchema = await generateSchema({
Owner


Note for myself: Now that the generateSchema function can be called multiple times recursively, it's definitely time to refactor the whole authentication. I already have a rough draft for this, where I can pass in a promise that resolves the token. This way we only need to do one authentication request.
I'll hopefully finish this up on the weekend.
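A minimal sketch of that shared-promise idea (the helper name, its location, and the exact endpoint are assumptions rather than the actual draft; the _superusers auth route matches recent PocketBase versions):

    // Authenticate once and reuse the pending promise across recursive generateSchema calls.
    let tokenPromise: Promise<string> | undefined;

    export function getSuperuserToken(
      url: string,
      credentials: { email: string; password: string }
    ): Promise<string> {
      tokenPromise ??= fetch(`${url}/api/collections/_superusers/auth-with-password`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ identity: credentials.email, password: credentials.password })
      })
        .then((response) => response.json())
        .then((data) => data.token as string);

      return tokenPromise;
    }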

@@ -129,10 +133,48 @@ export async function generateSchema(
}
}

// Combine the basic schema with the parsed fields
if (options.expand) {
Owner


Doesn't this expand run for every nested layer? So it will always print the error message that the expand property cannot be found on the already expanded collection? 🤔
In other words: where is the cut-off point for x-level nesting?

Author


I have addressed this with new additions today for supporting expand nesting 6 levels deep.
I essentially split the expand args at each level of nesting until we are left with an empty array; I check its length and pass undefined to this option, so that expand is undefined for the very next generateSchema call (see the sketch below).
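As a rough illustration of that splitting (assuming the expand option is a list of dot-separated paths, mirroring PocketBase's own expand query parameter; the helper name is hypothetical):

    // Split e.g. ["author.company", "tags"] into the fields to expand at this level
    // and the remaining paths to pass down to the next generateSchema call.
    function splitExpand(expand: Array<string>): {
      current: Array<string>;
      nested: Record<string, Array<string>>;
    } {
      const current: Array<string> = [];
      const nested: Record<string, Array<string>> = {};

      for (const path of expand) {
        const [head, ...rest] = path.split(".");
        current.push(head);
        if (rest.length > 0) {
          (nested[head] ??= []).push(rest.join("."));
        }
      }

      return { current, nested };
    }

    // Once nested[field] is empty or undefined, pass undefined as the expand option
    // so the next recursive call stops expanding.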

@cf-david-easton
Author

Thanks for the great feedback @pawcoding, I will give it a proper read over the weekend. Thanks for addressing my initial questions regarding the formatting of the expanded field; I was tempted to start working on the opposite, but I'm glad you addressed it.

A lot of the array stuff came from me realising that relation fields can hold multiple relations, so I had to factor in whether it was a singular object or multiple. There is maybe a better way of representing and handling this, though.

One thing I missed in my schema gen was the fact that a relational field can be allowed to be null. When I threw my fork into another project to try it out, it worked like a charm when looking at the collection's type for expand, but it would fail loading the schema because not all my PocketBase items had a relation on them, which isn't right. So there are still a few little bits to fix alongside your feedback, thanks again!

Comment on lines 204 to 223
/**
 * Builds a Zod object schema from expandedFields, where each property is either a ZodSchema or an array of ZodSchema.
 */
export function buildExpandSchema(
  expandedFields: ExpandedFields
): ZodObject<Record<string, ZodSchema | z.ZodArray<ZodSchema>>> {
  const shape: Record<string, ZodSchema | z.ZodArray<ZodSchema>> = {};

  for (const key in expandedFields) {
    const value = expandedFields[key];
    if (Array.isArray(value)) {
      // If it's an array, wrap the first element as a ZodArray (assuming all elements are the same schema)
      shape[key] = z.array(value[0]);
    } else {
      shape[key] = value;
    }
  }

  return z.object(shape);
}
Owner


What is the purpose of this function?

Author


This was when I was still trying to figure out how to get the conditional shapes of array or singular elements. The core misunderstanding on my part was not realising that I could read the maxSelect property of the field to understand whether it's a single or multiple select; I have removed this and simplified the expanded property.

Comment on lines 195 to 201
return schema.transform((entry) => {
  if (Array.isArray(entry)) {
    return entry.map((e) => transformFiles(options.url, fileFields, e));
  }

  return transformFiles(options.url, fileFields, entry);
});
Owner


Do we really need this? My assumption would be that the transformation for expanded collections is already done in the recursive call 🤔

Author


Yes, probably; not sure why I added this originally. Reverted.

(parseEntry as Mock).mockImplementation((entry: PocketBaseEntry) => {
  parsedEntries.push(entry);
  return entry; // or whatever parseEntry should return
});
Owner


Does this also work with spyOn?

vi.spyOn(parseEntry).mockImplementationOnce(/* whatever */)

Author


Almost, I had to do some fiddling and landed on extracting the arg without a mock implementation. I made the mock global for the file so the other checks can use the call as well.
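A minimal sketch of that approach (the module path and the assertion are assumptions for illustration; vi.mocked and mock.calls are standard Vitest APIs):

    import { expect, vi } from "vitest";
    import { parseEntry } from "../src/schema/parse-entry";

    // Mocked once at the top of the file, without an implementation.
    vi.mock("../src/schema/parse-entry");

    // ...after running the loader in a test, pull the entries back out of the mock:
    const parsedEntries = vi.mocked(parseEntry).mock.calls.map(([entry]) => entry);
    expect(parsedEntries.length).toBeGreaterThan(0);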
