feat: add br & zstd to supported compression algos #4549
base: master
Conversation
Zstd compression cannot be supported by hapi itself, as the API is still experimental. This will have to be implemented as a custom compressor / decompressor by the user (or a plugin).
I still don't see much point in adding built-in brotli support, which is already supported using a plugin. The built-in gzip compression makes sense, since it is widely supported and can provide major bandwidth savings. Newer compressors only add a marginal benefit and can have other trade-offs. As such they should only be enabled when configured by the user.
I like that it is possible to customise the encoder priority, but your implementation doesn't support user-defined encoders. A proper implementation would be more complex, and should probably go in a separate PR.
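For reference, a minimal sketch of the custom compressor/decompressor route described above, assuming a Node version that ships the experimental zstd bindings in node:zlib (v22.15+ / v23.8+); the plugin name is illustrative:

```js
'use strict';

const Zlib = require('node:zlib');

exports.plugin = {
    name: 'zstd-compression',    // hypothetical plugin name
    register: (server) => {

        // hapi's standard extension points for custom compressors/decompressors
        server.encoder('zstd', (options) => Zlib.createZstdCompress(options));
        server.decoder('zstd', (options) => Zlib.createZstdDecompress(options));
    }
};
```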
```diff
 const internals = {
-    common: ['gzip, deflate', 'deflate, gzip', 'gzip', 'deflate', 'gzip, deflate, br']
+    common: [
+        'gzip, deflate, br, zstd',
```
It makes sense to update the common encoding with 'gzip, deflate, br, zstd', as this is used by both Chrome and Firefox. This can go in a separate PR.
I really don't understand the objection. Zstd can be disabled by default, and that should solve its experimental-status issue. Or even better, both algos can be disabled by default, allowing usage of user-defined encoders. Marginal benefits and overall progress are what most users are usually looking for.
I made changes to make both algos disabled by default.
This looks sane. I think it makes sense to have this be framework-supported out of the box, especially given that it simply leverages NodeJS builtins. Even if they are experimental, it helps if the framework ships with it. To @kanongil's point, however, I would add a notice and a link to the MDN docs explicitly stating that Zstd compression is experimental. Thank you @bricss
I added a notice about the experimental status, and also the ability to pass compression options to both algorithms.
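A sketch of what passing per-encoding options might look like under this PR, mirroring how hapi's existing route-level compression option already forwards settings to the gzip/deflate encoders (the br option shape and the route itself are assumptions):

```js
const Hapi = require('@hapi/hapi');
const Zlib = require('node:zlib');

const server = Hapi.server({ port: 3000 });
const largePayload = { hello: 'world' };    // placeholder payload

server.route({
    method: 'GET',
    path: '/data',
    options: {
        compression: {
            // assumed: forwarded to zlib.createBrotliCompress()
            br: { params: { [Zlib.constants.BROTLI_PARAM_QUALITY]: 4 } }
        }
    },
    handler: () => largePayload
});
```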
@bricss Hey, check out these skip patterns: we're going to need to add some skip patterns for … Add the util in …
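A sketch of the kind of skip pattern being asked for, assuming a small feature-detect for zstd (the supportsZstd helper and the test title are illustrative):

```js
'use strict';

const Zlib = require('node:zlib');
const Lab = require('@hapi/lab');

const { it } = exports.lab = Lab.script();

// zstd only exists in node:zlib on Node 22.15+ / 23.8+
const supportsZstd = typeof Zlib.createZstdCompress === 'function';

it('compresses the response with zstd', { skip: !supportsZstd }, async () => {

    // test body elided
});
```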
All done: added a node version note and test skippers.
I found a better way to toast against …
Regarding Zstd support, I don't think it makes sense to make an exception to the policy around node API feature stability, especially because there is a standard way to integrate custom compressors.
For brotli, I still don't think we need to include it in core, but I'm not opposed to it, if done properly. This implementation still has several issues, most pressingly a default compression level of 11, which is way too high for just-in-time compression. The default should probably not be more computationally expensive than the default gzip option. A quick test on a 300KB json file shows gzip to take 20 ms, while brotli takes 335 ms! For this file, brotli needs to use level 5 to have comparable speed. I would probably choose 4 as default, improving both compression speed and size.
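For context, the comparison above can be reproduced with something like the following, where payload.json is a placeholder for the ~300KB file (timings will vary by machine):

```js
'use strict';

const Fs = require('node:fs');
const Zlib = require('node:zlib');

const payload = Fs.readFileSync('payload.json');

console.time('gzip (default level)');
Zlib.gzipSync(payload);
console.timeEnd('gzip (default level)');

// Node defaults BROTLI_PARAM_QUALITY to 11 (the maximum)
console.time('brotli (quality 11)');
Zlib.brotliCompressSync(payload);
console.timeEnd('brotli (quality 11)');

console.time('brotli (quality 4)');
Zlib.brotliCompressSync(payload, {
    params: { [Zlib.constants.BROTLI_PARAM_QUALITY]: 4 }
});
console.timeEnd('brotli (quality 4)');
```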
I wish you would make a separate PR for the priority option, which provides a meaningful feature by itself.
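Purely illustrative, what the priority option under discussion might look like; no such server option exists in hapi today (only settings like compression.minBytes do):

```js
const Hapi = require('@hapi/hapi');

const server = Hapi.server({
    compression: {
        minBytes: 1024,                              // existing option
        priority: ['zstd', 'br', 'gzip', 'deflate']  // hypothetical option from this discussion
    }
});
```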
The issue with … As for the experimental status of zstd, …
Hapi core doesn't have "experimental" features, and built-in zstd compression certainly doesn't warrant introducing such a concept.
I couldn't find any policy that mentions experimental features not being allowed. The only reason why …
Meanwhile, @fastify/compress & hono-compress already have zstd support.
@kanongil It kills the ecosystem to not support things like this. I'm all for stability and not breaking user space, but this shouldn't break user space, only enhance it. It just enables Hapi to use modern JS features. If people are using Hapi, want these features, and contribute, I can't think of a good reason to not allow it. @bricss Sorry mate, one last thing: you'll need to tell Lab to not cover those lines since they won't be covered on anything under Node 22 (https://hapi.dev/module/lab/api/?v=26.0.0#inline-enablingdisabling).
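The Lab mechanism linked above uses inline comments to exclude version-gated lines from coverage; a sketch (the feature-detect itself is illustrative):

```js
const Zlib = require('node:zlib');

/* $lab:coverage:off$ */
// Only reachable on Node versions that ship zstd in node:zlib
const hasZstd = typeof Zlib.createZstdCompress === 'function';
/* $lab:coverage:on$ */
```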
Pushed a cure for coverage. Hope it works.
Force-pushed from 6d07439 to f07fa73
Rebased on master.
That is quite the hot take. If it "kills the ecosystem" to have to use an alternative, slightly more complex, but more powerful API, then it is a very fragile ecosystem indeed. Many (most?) deployments won't care about this, as compression can already be applied in reverse proxies and CDNs.

Implementing features just because a user wants them is never a good path. Features need to be maintained indefinitely, causing more maintenance work. As such, a feature should provide a tangible benefit before it is considered. As I said, the "priority" option is an interesting candidate. Extending the built-in compressors less so, as it doesn't enable anything new. Using experimental APIs, exposing experimental features, and dropping the 100% code coverage policy, even less so!

I understand that you want to support new contributors, as do I, especially since @bricss seems to care a lot about doing it right. That is why I try to help guide their efforts. Unfortunately for new contributors, this project has (or at least used to have) a high coding standard, making it difficult to get anything but small fixes merged.
What kind of maintenance work might it require? Coz it seems the current algos aren't causing any work at all. Shall it wait until …
Maybe I misspoke. I meant in the sense of contributions and getting eyes and hands on this project. It's my opinion that the framework should offer support for these native builtins related to what it does (HTTP transmission). The competition offers it and we stay behind. That's generally bad for our market share.
I agree, but think there's an asterisk with this one in particular: what this PR proposes (compression support) is a stable NodeJS feature. The API for compression hasn't changed, so I'm not sure what there is to maintain; complexity here is quite low. We have also seen people drop by and ask for Brotli compression in Hapi in the past. You've even made a plugin to support it. It honestly seems like such a trivial feature-add that it's worth it.
Zstd seems like it's getting wide adoption, e.g. Python is adopting it in an upcoming release: https://docs.python.org/3.14/whatsnew/3.14.html

On the code-coverage critique: yeah, 100% fair. Let's uphold the high standards. Let's say we came to a consensus: how would you approach code coverage for this, since it is only Node 21+? We can't stay on Node 14 support forever. At some point, we have to move with the times. Node 18 is now legacy and unsupported by all major providers.
I'd also like to add that I'm trying to add these features to the framework not just bcos I personally need them, but bcos I believe it'll be beneficial for the framework as a whole and help attract more users. This should be a win-win for everyone, both the community and its users. I'm already using these features in production, so if it were only about my own needs, it's already covered.
The simple solution is to punt it off to be handled by a 3rd party plugin, i.e. what we already do. To make it more palatable, hapi could make and publish one such plugin. User installed plugins don't need to apply the same standards as Hapi core; for example, Inert uses a third party code dependency. All(?) other frameworks require plugins for even gzip compression.

Again, regarding brotli, I'm not opposed to including it, as such. Though this implementation still has both major and minor issues. The majors being that it introduces a new conflicting way to specify compression options, and has no way to enable only the compressor (with no decompression). A proper implementation would probably also revisit the current gzip/deflate integration, and allow these to be toggled on/off for consistency.
@bricss I very much appreciate your enthusiasm and efforts, and I apologise that our responses are not more aligned, making you get caught in the middle. Though our contribution guidelines did advise you to "ask first", to avoid working on something that might not get accepted.
IMO, we can withhold zstd … Alternatively, we could move compression entirely out of the core, into a standalone package with the name …
Yeah, makes sense. That's what I was proposing to do, and perhaps use Gil's package as the base since he's already laid the groundwork for it (up to you, Gil). Thoughts?
I thought about creating a general-purpose compression plugin, but moving compression wholesale into a plugin would probably go against the "full out-of-the-box functionality" hapi promise. Making people install a plugin for experimental zstd support does seem quite appropriate. I expect it will work best as a standalone feature. This also means that it can require node 22+, and ignore old versions. Someone will need to create/implement a new …

I'm looking into reworking the current implementation to better support new encoders, and to allow built-in encoders/decoders to be selectively disabled/enabled. If merged, this will make it simpler to extend with new encoders and a priority option.
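Illustrative only, what selectively enabling/disabling the built-in encoders (including a compress-only mode) might look like; none of these keys exist in hapi today:

```js
const Hapi = require('@hapi/hapi');

const server = Hapi.server({
    compression: {
        encodings: {
            gzip: true,                  // hypothetical: keep built-in gzip
            deflate: false,              // hypothetical: turn off deflate
            br: { decompress: false }    // hypothetical: compress-only brotli
        }
    }
});
```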
For "full out-of-the-box functionality" it will make more sense just to add missing algos into the core, bump min supported node version to |
@bricss Please don't shame people for using older versions of node. There are a multitude of reasons not to upgrade. We try our best to have empathy, even if we don't understand their reasoning.
I'm just tryna motivate people to update their codebases. Anyway, those who cannot do that for any reason simply should not update dependencies to newer semver majors.
No, you were pushing for a change in Hapi with an elitist attitude.
…and doubling down on the elitism. We try to be considerate of everyone in our community of users, as outlined in the CoC. Something we expect our contributors to embody.
If following the semver approach is considered elitism, then something went wrong. Coz I was thinking that's how semver versioning works. And I'm not pushing for anything, I'm just expressing my opinion.
This PR is an attempt to bring support for br & zstd compression algos into the core.