Enable gzip encoding for Capacitor HTTP #6387

Closed
Wants to merge 45 commits.

Conversation

Dante1349
Contributor

@Dante1349 Dante1349 commented Mar 12, 2023

This is a follow-up of capacitor-community/http#222, since the community plugin was no longer maintained shortly after we opened that pull request.

@Dante1349 Dante1349 marked this pull request as ready for review March 14, 2023 12:51
@ItsChaceD ItsChaceD self-requested a review April 19, 2023 15:56
@Dante1349
Contributor Author

@nimo23 Sorry for the late reaction, I didn't have the time to work on it. But I fixed the branch and the code should now work.

@Dante1349
Contributor Author

@ItsChaceD any chance for a review?

/**
 * Use this option if you need gzip compression of the data payload.
 * A compatible consumer interface must be ensured. The default is _false_.
 */
gzipCompression?: boolean;
Member

I don't think this should be an on/off setting that is applied for the entire lifecycle of the plugin.

Contributor Author

We will probably move this to a parameter in the HTTP function call; it makes sense not to have it globally.

Member

I'm personally not a huge fan of just copy-pasting a source file from another open source project.

Contributor Author

We will include the library in the project.

options.headers['Content-Encoding'] = 'gzip';
output.headers = options.headers;

const gzippedData: Uint8Array = Pako.gzip(options.data);
Member

I'm not a fan of adding this dependency to core. Is there a reason you can't just gzip and gunzip on the client side, from JavaScript, on your end?

Contributor Author

We could do that, but the reason we switched to Capacitor HTTP is that we need the background functionality of the HTTP services; that's why we started using capacitor-community/http, which brought us here. The Pako compression could be done client side in the app code, but we moved it to the web part of the plugin to complete the gzip functionality for all platforms. I didn't find any good code example that works without Pako; maybe this dependency can be kept?
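For context, a minimal sketch of what this client-side compression could look like. Node's built-in `zlib` stands in for Pako here (`pako.gzip(data)` produces a `Uint8Array` with the same gzip framing, so the idea carries over to the browser); the function name and return shape are illustrative, not the plugin's actual API:

```typescript
import { gzipSync } from 'node:zlib';

// Gzip a JSON payload before handing it to the HTTP layer.
function gzipJsonBody(data: unknown): { body: Uint8Array; headers: Record<string, string> } {
  const json = JSON.stringify(data);
  return {
    body: gzipSync(Buffer.from(json, 'utf8')),
    headers: {
      'Content-Type': 'application/json',
      // Non-standard for browser requests: the server must be
      // explicitly configured to gunzip request bodies.
      'Content-Encoding': 'gzip',
    },
  };
}
```

As the reviewers note, this is exactly the kind of logic that could live in app code rather than in core.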

@Steven0351
Member

Steven0351 commented Jan 25, 2024

There are also pretty big changes to HTTP via PR #6818 that more or less avoid using the related HTTP plugin code at all in certain cases.

@Dante1349
Contributor Author

@Steven0351 I read your concerns, and I have the rest of my team behind me to fix the issues, or at least most of them; I hope we can find a solution that fits everyone's needs. We isolated some tasks and we're on it; I think we can finish by the end of the week. I think you are trying to merge as much as possible into the Capacitor 6 release, so we understood that we probably have to be quick now. I will respond to the rest in the related comments.

@Dante1349
Contributor Author

> There are also pretty big changes to http via PR #6818 that more or less avoids using the related http plugin code at all in certain cases

We will try to merge the other feature branch and see if it's compatible here.

@Steven0351
Member

Before you put much more work into this, I'll bring it up to the core team. I personally don't see this making it into Cap 6, but that's not my call to make.

@Dante1349
Contributor Author

@Steven0351 Thanks for the warning. Then maybe write here once you know more. We would love to see this in Cap 6, since we have lived from fork to fork for the last two years ;-)

@MobileMarine

It would be really nice to have this kind of standard feature in core.
OkHttp also takes care of compression automatically, setting the header and body as required.
This makes compression much more comfortable for developers to use.
Otherwise everybody would have to work around the Android/iOS glitches we had to fight to make compression work on all OSes.

@Steven0351
Member

Steven0351 commented Jan 31, 2024

Going to level-set expectations here and relay that this PR is not going to be accepted according to discussions with the core team. The features and functions you are requesting and suggesting being made available can be achieved by building a plugin to handle those specific cases. Compressing a request body is not standard browser behavior, so I'm not sure this feature is warranted in core for the http plugin.

@markemer @giralte-ionic @theproducer

@dfun90

dfun90 commented Feb 12, 2024


Hey, just chiming in here. I've been silently watching this PR for a while now, as being able to compress the body of an HTTP request is a feature that would be really vital for us to have.
I can understand the core team's opinion and reasoning for not wanting to implement this, as it's not standard browser behavior, but it is nonetheless a very important feature for mobile apps.

> The features and functions you are requesting and suggesting being made available can be achieved by building a plugin to handle those specific cases.

That statement is valid, even though maintaining a plugin that does basically the same as the included HTTP plugin, just with compression support added, seems a bit overkill to me. But maybe there are some people out there who would be interested in putting in the effort.

Either way, from my standpoint: I originally tried to compress the body of my HTTP requests using the fflate JS library. On the client side it compressed correctly, and when sent from a web browser to the backend it works as expected. However, when the same request is sent from a physical device, something under the hood of the HTTP plugin seems to serialize the compressed data into a JSON object before sending it, so the backend cannot decompress it.
I've come up with a workaround: when the client sends the proper compression header, I check whether the received payload is a valid JSON string (that actually looks like a serialized Uint8Array), convert it back into usable data, and decompress it. Not really optimal, but aside from patching the native HTTP plugin code to support compression, or creating a whole new plugin, it seems to be the only way to do it right now (correct me if I'm wrong).

I would be fine with the plugin not supporting compression as a feature if there were a way to tell the native HTTP plugin to simply send the payload raw, as-is, instead of transforming it.
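The workaround described above can be sketched as a backend-side helper. This is a hypothetical reconstruction, not code from the PR: `JSON.stringify` on a `Uint8Array` produces an index-keyed object like `{"0":31,"1":139,...}`, so the server can detect that shape and rebuild the bytes before gunzipping:

```typescript
// Try to recover a Uint8Array that was serialized as an index-keyed
// JSON object (the shape JSON.stringify gives a typed array).
// Returns null when the body is not that shape.
function reviveUint8Array(body: string): Uint8Array | null {
  let parsed: unknown;
  try {
    parsed = JSON.parse(body);
  } catch {
    return null; // not JSON at all
  }
  if (parsed === null || typeof parsed !== 'object' || Array.isArray(parsed)) return null;
  const obj = parsed as Record<string, unknown>;
  const keys = Object.keys(obj);
  if (keys.length === 0) return null; // empty object: nothing to recover
  const bytes = new Uint8Array(keys.length);
  for (let i = 0; i < keys.length; i++) {
    const v = obj[String(i)]; // keys must be "0".."n-1"
    if (typeof v !== 'number' || !Number.isInteger(v) || v < 0 || v > 255) return null;
    bytes[i] = v;
  }
  return bytes;
}
```

The recovered bytes can then be decompressed with an ordinary gunzip. As the commenter says, this is a stopgap, not something to rely on long-term.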

@Steven0351
Member

Steven0351 commented Feb 12, 2024


Out of curiosity, what do you gain out of native http if you have some amount of control over the backend?

@dfun90

dfun90 commented Mar 16, 2024

> Out of curiosity, what do you gain out of native http if you have some amount of control over the backend?

Hey, sorry for getting back to this so late.
You are right. I have since adjusted the backend side to properly handle CORS and disabled the native HTTP plugin.
It works just as well, using fflate to compress payloads wherever needed. Originally I saw the plugin as a way to avoid dealing with CORS, plus possible performance benefits (although I've never really measured the difference with and without the plugin, and don't feel any real difference).
But as you said, when you have control over the backend you talk to, at least in my case there is no real benefit to using the plugin.

That said, I can't speak for other projects that may have requirements mandating the plugin. But creating a custom plugin in those cases, with the additional features they need, wouldn't be all that hard to do.
Your reasoning for not adding this feature, as it is non-standard, is pretty warranted, even if some people might not be happy about it.

@mobilemarines


Thanks @Steven0351.
We see your point.
There might be an even simpler solution that would make more people happy, including us.
The only thing missing is the ability to provide an ArrayBuffer to a POST request. That way we could set the gzip header on the JS side and provide the raw gzipped data.

Unfortunately, only JSON and string are supported as request data types according to https://capacitorjs.com/docs/apis/http#httpoptions.
For the response, 'arraybuffer', 'blob' and 'document' are also supported according to https://capacitorjs.com/docs/apis/http#httpresponsetype.

Would this approach have a better chance of passing your review?
The ArrayBuffer could be, e.g., Base64-encoded on the JS side to cross the JS bridge as a string, and decoded on the native side. We could add an HttpOptions.dataType such as "arraybuffer" to mark the data as holding Base64-encoded raw bytes.
This would help all the binary-data folks a lot.
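The JS-side half of that proposal can be sketched as follows. `Buffer` is the Node form of the encoding (a browser/WebView would build a binary string and use `btoa`/`atob` instead), and the `dataType: 'arraybuffer'` option shown in the comment is the commenter's suggestion, not an existing Capacitor API:

```typescript
// JS side: Base64-encode raw bytes so they can cross the
// string-only JS bridge; the native side decodes before sending.
function arrayBufferToBase64(buf: ArrayBuffer): string {
  return Buffer.from(buf).toString('base64');
}

// Native-side decode, shown here in TypeScript for the round trip.
function base64ToBytes(b64: string): Uint8Array {
  return new Uint8Array(Buffer.from(b64, 'base64'));
}

// Hypothetical request shape under the proposed option:
// { url, method: 'POST', dataType: 'arraybuffer',
//   data: arrayBufferToBase64(gzippedBytes.buffer),
//   headers: { 'Content-Encoding': 'gzip' } }
```

The Base64 step roughly adds a third to the payload size while it sits on the bridge, which is the "footgun" for large bodies mentioned below.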

@Steven0351
Member

Steven0351 commented Mar 19, 2024


I talked it over with the core team and we're more positive on supporting arraybuffer. It adds a potential footgun if trying to encode large amounts of data, but overall I agree with your assessment about adding that support.

@mobilemarines


Sounds great, @Steven0351.
This would also cover transferring binary data like pictures (not only from a form), which is quite common.
What are the next steps? Will you provide this in one of the next releases (the core team knows best what to do), or is another PR required?

@Steven0351
Member


The scope for the next two releases has already been planned, and they are large efforts. The core team is now only a handful of people who are spread pretty thin. I personally focus on Portals and chip in on Capacitor where it makes sense. If you want this sooner rather than later, I would suggest a new community-driven PR that we can review; I'd be more than happy to provide pointers or suggestions. I'm going to go ahead and close this PR, since the work as-is isn't a good fit for core, but thank you all for participating in this discussion.

@Steven0351 Steven0351 closed this Mar 20, 2024
9 participants