Allow optionally compressed VFS image data #460
Merged
Uses pako for fast decompression of VFS images before mounting.
In principle, static file web servers can serve content `gzip`-compressed, but in practice it turns out services like GitHub Pages only compress certain types of object, likely based on MIME type. From experimentation, it looks like VFS images with names ending in `.data` are not compressed over the wire, and so pre-compression with gzip can be useful when deploying webR to such services. Using pako should be faster than R's built-in `.tar.gz` decompressor, and is not a heavy prerequisite.

Note: Using Emscripten's `--lz4` in `file_packager` is not a great fit here, because we do not directly execute the `.js` output from that tool. Instead, we manually mount the image using `MEMFS.mount()`. Unfortunately, with that workflow the `.js.metadata` file does not actually contain all the required metadata to decompress the VFS contents.

Note: Once merged and included in a release, `{rwasm}` can be tweaked to include compressed R packages. Then, once R 4.5.0 is released, we can turn off the uncompressed output.