yazl

yet another zip library for node. For unzipping, see yauzl.

Design principles:

  • Don't block the JavaScript thread. Use and provide async APIs.
  • Keep memory usage under control. Don't attempt to buffer entire files in RAM at once.
  • Prefer to open input files one at a time rather than all at once. This is slightly suboptimal for time performance, but avoids OS-imposed limits on the number of simultaneously open file handles.

Usage

var fs = require("fs");
var yazl = require("yazl");

var zipfile = new yazl.ZipFile();
zipfile.addFile("file1.txt", "file1.txt");
// (add only files, not directories)
zipfile.addFile("path/to/file.txt", "path/in/zipfile.txt");
// pipe() can be called any time after the constructor
zipfile.outputStream.pipe(fs.createWriteStream("output.zip")).on("close", function() {
  console.log("done");
});
// alternate apis for adding files:
zipfile.addReadStream(process.stdin, "stdin.txt");
zipfile.addBuffer(Buffer.from("hello"), "hello.txt");
// call end() after all the files have been added
zipfile.end();

API

Class: ZipFile

new ZipFile()

No parameters. Nothing can go wrong.

addFile(realPath, metadataPath, [options])

Adds a file from the file system at realPath into the zipfile as metadataPath. Typically metadataPath would be calculated as path.relative(root, realPath). Unzip programs would extract the file from the zipfile as metadataPath. realPath is not stored in the zipfile.

A valid metadataPath must not be blank. If a metadataPath contains "\\" characters, they will be replaced by "/" characters. After this substitution, a valid metadataPath must not start with "/" or /[A-Za-z]:\//, and must not contain ".." path segments. File paths must not end with "/", but see addEmptyDirectory(). After UTF-8 encoding, metadataPath must be at most 0xffff bytes in length.
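
The rules above can be sketched as a small validator. This is an illustration of the documented constraints, not yazl's actual implementation:

```javascript
// Illustration of the documented metadataPath rules; not yazl's actual code.
function checkMetadataPath(metadataPath, isDirectory) {
  if (metadataPath === "") throw new Error("metadataPath must not be blank");
  metadataPath = metadataPath.replace(/\\/g, "/"); // "\" becomes "/"
  if (metadataPath.startsWith("/") || /^[A-Za-z]:\//.test(metadataPath)) {
    throw new Error("absolute path: " + metadataPath);
  }
  if (metadataPath.split("/").indexOf("..") !== -1) {
    throw new Error("invalid relative path: " + metadataPath);
  }
  if (Buffer.byteLength(metadataPath, "utf8") > 0xffff) {
    throw new Error("metadataPath is too long");
  }
  if (!isDirectory && metadataPath.endsWith("/")) {
    throw new Error("file paths must not end with \"/\": " + metadataPath);
  }
  return metadataPath;
}
```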

options may be omitted or null and has the following structure and default values:

{
  mtime: stats.mtime,
  mode: stats.mode,
  compress: true,
  forceZip64Format: false,
  fileComment: "", // or a UTF-8 Buffer
}

Use mtime and/or mode to override the values that would normally be obtained by the fs.Stats for the realPath. The mode is the unix permission bits and file type. The mtime and mode are stored in the zip file in the fields "last mod file time", "last mod file date", and "external file attributes". yazl does not store group and user ids in the zip file.

If compress is true, the file data will be deflated (compression method 8). If compress is false, the file data will be stored (compression method 0).

If forceZip64Format is true, yazl will use ZIP64 format in this entry's Data Descriptor and Central Directory Record regardless of whether it is required (this may be useful for testing). Otherwise, yazl will use ZIP64 format only where necessary.

If fileComment is a string, it will be encoded with UTF-8. If fileComment is a Buffer, it should be a UTF-8 encoded string. In UTF-8, fileComment must be at most 0xffff bytes in length. This becomes the "file comment" field in this entry's central directory file header.

Internally, fs.stat() is called immediately in the addFile function, and fs.createReadStream() is used later when the file data is actually required. Throughout adding and encoding n files with addFile(), the number of simultaneous open files is O(1), probably just 1 at a time.

addReadStream(readStream, metadataPath, [options])

Adds a file to the zip file whose content is read from readStream. See addFile() for info about the metadataPath parameter. options may be omitted or null and has the following structure and default values:

{
  mtime: new Date(),
  mode: 0o100664,
  compress: true,
  forceZip64Format: false,
  fileComment: "", // or a UTF-8 Buffer
  size: 12345, // example value
}

See addFile() for the meaning of mtime, mode, compress, forceZip64Format, and fileComment. If size is given, it will be checked against the actual number of bytes in the readStream, and an error will be emitted if there is a mismatch.

Note that yazl will .pipe() data from readStream, so be careful using .on('data'). In certain versions of node, .on('data') makes .pipe() behave incorrectly.

addBuffer(buffer, metadataPath, [options])

Adds a file to the zip file whose content is buffer. See below for info on the limitations on the size of buffer. See addFile() for info about the metadataPath parameter. options may be omitted or null and has the following structure and default values:

{
  mtime: new Date(),
  mode: 0o100664,
  compress: true,
  forceZip64Format: false,
  fileComment: "", // or a UTF-8 Buffer
}

See addFile() for the meaning of mtime, mode, compress, forceZip64Format, and fileComment.

This method has the unique property that General Purpose Bit 3 will not be used in the Local File Header. This doesn't matter for unzip implementations that conform to the Zip File Spec. However, 7-Zip 9.20 has a known bug where General Purpose Bit 3 is declared an unsupported compression method (note that it really has nothing to do with the compression method.). See issue #11. If you would like to create zip files that 7-Zip 9.20 can understand, you must use addBuffer() instead of addFile() or addReadStream() for all entries in the zip file (and addEmptyDirectory() is fine too).

Note that even when yazl provides the file sizes in the Local File Header, yazl never uses ZIP64 format for Local File Headers due to the size limit on buffer (see below).

Size limitation on buffer

In order to require the ZIP64 format for a local file header, the provided buffer parameter would need to exceed 0xfffffffe in length, or zlib compression would have to fail to compress the buffer and actually inflate the data to more than 0xfffffffe in length. yazl rules out both scenarios by enforcing a size limit on the buffer parameter.

According to the zlib documentation, the worst case compression results in "an expansion of at most 13.5%, plus eleven bytes". Furthermore, some configurations of Node.js impose a size limit of 0x3fffffff on every Buffer object. Running this size through the worst case compression of zlib still produces a size less than 0xfffffffe bytes.

Therefore, yazl enforces that the provided buffer parameter must be at most 0x3fffffff bytes long.

addEmptyDirectory(metadataPath, [options])

Adds an entry to the zip file that indicates a directory should be created, even if no other items in the zip file are contained in the directory. This method is only required if the zip file is intended to contain an empty directory.

See addFile() for info about the metadataPath parameter. If metadataPath does not end with a "/", a "/" will be appended.

options may be omitted or null and has the following structure and default values:

{
  mtime: new Date(),
  mode: 0o40775,
}

See addFile() for the meaning of mtime and mode.

end([options], [finalSizeCallback])

Indicates that no more files will be added via addFile(), addReadStream(), or addBuffer(), and causes the eventual close of outputStream.

options may be omitted or null and has the following structure and default values:

{
  forceZip64Format: false,
  comment: "", // or a CP437 Buffer
}

If forceZip64Format is true, yazl will include the ZIP64 End of Central Directory Locator and ZIP64 End of Central Directory Record regardless of whether or not they are required (this may be useful for testing). Otherwise, yazl will include these structures only if necessary.

If comment is a string, it will be encoded with CP437. If comment is a Buffer, it should be a CP437 encoded string. comment must be at most 0xffff bytes in length and must not include the byte sequence [0x50,0x4b,0x05,0x06]. This becomes the ".ZIP file comment" field in the end of central directory record. Note that in practice, most zipfile readers interpret this field as UTF-8 instead of CP437. If your string uses only codepoints in the range 0x20...0x7e (printable ASCII, no whitespace except for the single space ' '), then the UTF-8, CP437, and ASCII encodings are all identical. This restriction is recommended for maximum compatibility. To use UTF-8 encoding at your own risk, pass a Buffer into this function; it will not be validated.

If specified and non-null, finalSizeCallback is given the parameter finalSize sometime during or after the call to end(). finalSize is of type Number and is either -1 or the guaranteed eventual size in bytes of the output data that can be read from outputStream.

Note that finalSizeCallback is usually called well before outputStream has piped all its data; this callback does not mean that the stream is done.

If finalSize is -1, it means the final size is too hard to guess before processing the input file data. This will happen if and only if the compress option is true on any call to addFile(), addReadStream(), or addBuffer(), or if addReadStream() is called and the optional size option is not given. In other words, clients should know whether they're going to get a -1 or a real value by looking at how they are using this library.

The call to finalSizeCallback might be delayed if yazl is still waiting for fs.Stats for an addFile() entry. If addFile() was never called, finalSizeCallback will be called during the call to end(). It is not required to start piping data from outputStream before finalSizeCallback is called. finalSizeCallback will be called only once, and only if this is the first call to end().
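
The rule above can be expressed as a tiny predicate over how the library is being used. finalSizeIsKnowable and its entry descriptions are hypothetical, purely to illustrate the -1 rule:

```javascript
// Hypothetical helper (not part of yazl's API): given descriptions of the
// planned add*() calls, predicts whether finalSizeCallback can report a
// real size instead of -1.
function finalSizeIsKnowable(entries) {
  return entries.every(function(entry) {
    // Any compressed entry makes the output size unpredictable.
    if (entry.compress) return false;
    // addReadStream() without the size option leaves the length unknown.
    if (entry.type === "readStream" && entry.size == null) return false;
    return true; // stored addFile()/addBuffer(), or a stream of known size
  });
}
```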

outputStream

A readable stream that will produce the contents of the zip file. It is typical to pipe this stream to a writable stream created from fs.createWriteStream().

Internally, large amounts of file data are piped to outputStream using pipe(), which means throttling happens appropriately when this stream is piped to a slow destination.

Data becomes available in this stream soon after calling one of addFile(), addReadStream(), or addBuffer(). Clients can call pipe() on this stream at any time, such as immediately after getting a new ZipFile instance, or long after calling end().

This stream will remain open while you add entries until you end() the zip file.

As a reminder, be careful using both .on('data') and .pipe() with this stream. In certain versions of node, you cannot use both .on('data') and .pipe() successfully.

dateToDosDateTime(jsDate)

jsDate is a Date instance. Returns {date: date, time: time}, where date and time are unsigned 16-bit integers.
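
The returned values use the classic MS-DOS packed date/time layout. A sketch of the same packing, for illustration (not yazl's source):

```javascript
// MS-DOS packed date/time, as used in zip "last mod" fields (illustration).
function toDosDateTime(jsDate) {
  var date = ((jsDate.getFullYear() - 1980) << 9) | // years since 1980
             ((jsDate.getMonth() + 1) << 5) |       // month, 1-12
             jsDate.getDate();                      // day of month
  var time = (jsDate.getHours() << 11) |
             (jsDate.getMinutes() << 5) |
             (jsDate.getSeconds() >> 1);            // 2-second resolution
  return { date: date & 0xffff, time: time & 0xffff };
}
```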

Regarding ZIP64 Support

yazl automatically uses ZIP64 format to support files and archives over 2^32 - 2 bytes (~4GB) in size and to support archives with more than 2^16 - 2 (65534) files. (See the forceZip64Format option in the API above for more control over this behavior.) ZIP64 format is necessary to exceed the limits inherent in the original zip file format.

ZIP64 format is supported by most popular zipfile readers, but not by all of them. Notably, the Mac Archive Utility does not understand ZIP64 format (as of writing this), and will behave very strangely when presented with such an archive.

Output Structure

The Zip File Spec leaves a lot of flexibility up to the zip file creator. This section explains and justifies yazl's interpretation and decisions regarding this flexibility.

This section is probably not useful to yazl clients, but may be interesting to unzip implementors and zip file enthusiasts.

Disk Numbers

All values related to disk numbers are 0, because yazl has no multi-disk archive support. (The exception being the Total Number of Disks field in the ZIP64 End of Central Directory Locator, which is always 1.)

Version Made By

Always 0x033f == (3 << 8) | 63, which means UNIX (3) and made from the spec version 6.3 (63).

Note that the "UNIX" has implications in the External File Attributes.

Version Needed to Extract

Usually 20, meaning 2.0. This allows filenames and file comments to be UTF-8 encoded.

When ZIP64 format is used, some of the Version Needed to Extract values will be 45, meaning 4.5. When this happens, there may be a mix of 20 and 45 values throughout the zipfile.

General Purpose Bit Flag

Bit 11 is always set. Filenames (and file comments) are always encoded in UTF-8, even if the result is indistinguishable from ascii.

Bit 3 is usually set in the Local File Header. To support both a streaming input and a streaming output API, it is impossible to know the crc32 before processing the file data. When bit 3 is set, Data Descriptors are given after each file's data with this information, as per the spec. Remember that a complete metadata listing is still always available in the central directory record, so unzip implementations that rely on it, as they should, are unaffected by any of this. Even so, some popular unzip implementations do not follow the spec. The Mac Archive Utility requires Data Descriptors to include the optional signature, so yazl includes the optional data descriptor signature. When bit 3 is not used, the Mac Archive Utility requires there to be no data descriptor, so yazl skips it in that case. Additionally, 7-Zip 9.20 does not seem to support bit 3 at all (see issue #11).

All other bits are unset.

Internal File Attributes

Always 0. The "apparently an ASCII or text file" bit is always unset meaning "apparently binary". This kind of determination is outside the scope of yazl, and is probably not significant in any modern unzip implementation.

External File Attributes

Always stats.mode << 16. This is apparently the convention for "version made by" = 0x03xx (UNIX).
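
For example, the default mode 0o100664 used by addBuffer() (a regular file with rw-rw-r-- permissions) works out as follows. A quick sketch; the >>> 0 is only needed because JavaScript's << yields a signed 32-bit result:

```javascript
// 0o100664 is S_IFREG (regular file) plus permission bits rw-rw-r--.
var mode = 0o100664;

// Shift the unix mode into the high 16 bits of External File Attributes.
// << produces a signed 32-bit value, so force it back to unsigned.
var externalFileAttributes = (mode << 16) >>> 0;
console.log(externalFileAttributes.toString(16)); // "81b40000"
```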

Note that for directory entries (see addEmptyDirectory()), it is conventional to use the lower 8 bits for the MS-DOS directory attribute byte. However, the spec says this is only required if the Version Made By is DOS, so this library does not do that.

Directory Entries

When adding a metadataPath such as "parent/file.txt", yazl does not add a directory entry for "parent/", because file entries imply the need for their parent directories. Unzip clients seem to respect this style of pathing, and the zip file spec does not specify what is standard in this regard.

In order to create empty directories, use addEmptyDirectory().

Size of Local File and Central Directory Entry Metadata

The spec recommends that "The combined length of any directory record and [the file name, extra field, and comment fields] should not generally exceed 65,535 bytes". yazl makes no attempt to respect this recommendation. Instead, each of the fields is limited to 65,535 bytes due to the length of each being encoded as an unsigned 16 bit integer.

Change History

  • 2.5.1
    • Fix support for old versions of Node and add official support for Node versions 0.10, 4, 6, 8, 10. pull #49
  • 2.5.0
    • Add support for comment and fileComment. pull #44
    • Avoid new Buffer(). pull #43
  • 2.4.3
  • 2.4.2
    • Remove octal literals to make yazl compatible with strict mode. pull #28
  • 2.4.1
    • Fix Mac Archive Utility compatibility issue. issue #24
  • 2.4.0
  • 2.3.1
    • Remove .npmignore from npm package. pull #22
  • 2.3.0
    • metadataPath can have \ characters now; they will be replaced with /. issue #18
  • 2.2.2
  • 2.2.1
    • Fix Mac Archive Utility compatibility issue. issue #14
  • 2.2.0
    • Avoid using general purpose bit 3 for addBuffer() calls. issue #13
  • 2.1.3
    • Fix bug when only addBuffer() and end() are called. issue #12
  • 2.1.2
  • 2.1.1
    • Fixed stack overflow when using addBuffer() in certain ways. issue #9
  • 2.1.0
    • Added addEmptyDirectory().
    • options is now optional for addReadStream() and addBuffer().
  • 2.0.0
    • Initial release.
Comments
  • output zips cannot be opened by Mac Archive Utility

    I noticed this because Groove Basin multi-file downloads were not working for me.

    I create a zip like this:

    $ echo THIS IS A TEST. > test.txt
    $ node test/zip.js test.txt -o test.zip
    

    Then when I try to open the zip with Archive Utility I get this error:

    Unable to expand "test.zip" into "yazl".
    (Error 2 - No such file or directory.)
    

    Trying to repair the file gives some diagnostic information:

    $ zip -FF test.zip --out test-fixed.zip
    Fix archive (-FF) - salvage what can
     Found end record (EOCDR) - says expect single disk archive
    Scanning for entries...
     copying: test.txt 
        zip warning: no end of stream entry found: test.txt
        zip warning: rewinding and scanning for later entries
        zip warning: zip file empty
    
    bug 
    opened by dwrensha 14
  • Support for archive + file comments

    This adds support (and tests) for archive and file comments. Archive comments are set by providing comment in .end()'s options. File comments are set by providing fileComment when adding files.

    opened by mojodna 11
  • zipped files decompress much larger than the original files

    There is a 105MB file within the lambda-lib.zip archive at node-oracledb-for-lambda. When I use gulp-zip (which uses yazl) to add it to a new zip archive, and then unzip that archive the 105MB file has grown to 140MB.

    Also, when submitting a yazl-generated zip containing binary libraries to AWS Lambda, an error is thrown:

    ELF file's phentsize not the expected size

    (this was the cause of https://github.com/oracle/node-oracledb/issues/468)

    opened by nalbion 8
  • unzip, edit and re-zip

    Hi, not so much of a bug, but a question/request for assistance... I'm trying to create a gulp plugin which edits an entry within a zip file.

    I didn't have any luck with yauzl, (it complained of an invalid signature when provided with a zip file created by yazl) so I'm using the [unzip](https://github.com/EvanOxfeld/node-unzip) package.

    I'd really appreciate it if you could take a quick look at the code below and help me figure out how to update the zip "file" with the updated file, leaving all other files intact.

    var through = require('through2');
    var yazl = require('yazl');
    var unzip = require('unzip');
    
    module.exports = function() {
        function transformZipFile(file, encoding, callback) {
    
        file.pipe(unzip.Parse())
                .on('entry', function(entry) {
                    if (entry.path === 'lib/env-config.js') {
                        entry.pipe(through.obj(transformEnvConfig, function(flushCallback) {
                            // not sure what to do here - I have 2 callbacks which need to be called (not sure what flushCallback does
                            // I'm not sure if the entry is writable, if I need to update a zip header if I do...
                            // If the entry is not writable, I suppose I need to copy across all of the other files as well as this one to the new output zip archive
    
                            flushCallback();
                            callback(null, file);
                        }));
                    } else {
                        entry.autodrain();
                    }
                })
                .on('close', function() {
                    console.info('unzip closed');
    
                });
        }
    
       function transformEnvConfig(data, encoding, callback) {
            deployEnvConfig.updateEnvConfigData(data.toString()).then( function(newEnvConfigData) {
                console.info('env-config updated');
                // newEnvConfig is correctly updated, now I need to write it back to the zip "file"
                callback(null, newEnvConfigData);
            }, callback);
        }
    
        return through.obj(transformZipFile);
    };
    
    opened by nalbion 7
  • Issues when extracting with OSX Archive

    When extracting zip files generated with yazl 2.4.0 using the Archive app, a filename.zip.cpgz is generated instead of the contents. Zipfiles generated with 2.3.1 work as expected.

    I'm using yazl through gulp-zip so it might not be an issue with yazl itself.

    opened by paulgoldbaum 7
  • 2.5.0 TypeError: this is not a typed array

    We use gulp-zip which uses yazl and a few days ago our build broke because of changes in 2.5.0 Here's the error

    node_modules/gulp-zip/node_modules/yazl/index.js:111
    var eocdrSignatureBuffer = Buffer.from([0x50, 0x4b, 0x05, 0x06]);
                                      ^
    TypeError: this is not a typed array.
  at Function.from (na
  at Object.<anonymous> (node_modules/gulp-zip/node_modules/yazl/index.js:111:35)
    

    I forked gulp-zip and hardcoded yazl to use version 2.4.3 till this is fixed. Hopefully will be fixed soon.

    opened by chadn 6
  • Error: file data stream has unexpected number of bytes

    Hi!

    For some reason, I occasionally get the following error when trying to create a zip file. I'd be really grateful if you were able to provide any insight.

    Uncaught Exception:
    Error: file data stream has unexpected number of bytes
        at ByteCounter.<anonymous> (/Applications/Castbridge.app/Contents/Resources/app.asar/node_modules/yazl/index.js:144:99)
        at emitNone (events.js:91:20)
        at ByteCounter.emit (events.js:185:7)
        at endReadableNT (_stream_readable.js:974:12)
        at _combinedTickCallback (internal/process/next_tick.js:74:11)
        at process._tickCallback (internal/process/next_tick.js:98:9)
    

    Thanks!

    opened by ariporad 6
  • OS X Archive Utility fails to extract zip file

    With the latest version 2.2.0 the following code produces a zip file that OS X's Archive Utility cannot open:

    var yazl = require('yazl');
    var fs = require('fs');
    
    var zipfile = new yazl.ZipFile();
    
    var ostream = fs.createWriteStream('out.zip');
    zipfile.outputStream.pipe(ostream);
    
    zipfile.addBuffer(new Buffer('hello'), 'hello.txt');
    zipfile.end();
    

    screen shot 2015-03-26 at 20 51 24

    System log

    26/03/15 20:51:23.972 Archive Utility[29825]: bomCopierFatalError:Couldn't read pkzip signature.
    26/03/15 20:51:23.973 Archive Utility[29825]: bomCopierFatalError:Not a central directory signature
    
    opened by joaomoreno 6
  • Windows path issues

    I'm using path.normalize() basically everywhere, but yazl is throwing me the following error:

    Error: invalid characters in path: fonts\fontawesome-webfont.eot
    

    Is yazl cross-platform/compatible with Windows?

    opened by Dids 5
  • 7-zip 9.20 fails to extract zip produced by yazl

    My script creates the simplest zip file possible:

    var yazl = require('yazl');
    var fs = require('fs');
    
    var zipfile = new yazl.ZipFile();
    
    var ostream = fs.createWriteStream('out.zip');
    zipfile.outputStream.pipe(ostream);
    
    zipfile.addBuffer(new Buffer('hello'), 'hello.txt');
    zipfile.end();
    

    I cannot extract this file with 7-zip 9.20 (its latest stable release). This application has a feature that verifies an archive and with the produced zip I get:

    screen shot 2015-03-19 at 16 01 09

    I can successfully extract it using the native Windows Extract... context menu action. I can also extract it using 7-zip 9.38 (their latest beta release).

    opened by joaomoreno 5
  • Concurrent calls to addReadStream results in empty files

    This standalone code demonstrates the issue. generateFile returns a readable stream to a file. We use it to create files concurrently (concurrency determines how many in parallel).

    If concurrency is 1 - all files are written, but if we use 2 or higher some file will be empty:

    var Yazl = require('yazl'), Bluebird = require('bluebird'),
      Stream = require('stream'), Fs = require('fs');
    
    var zipFile = new Yazl.ZipFile(),
      zipStream = zipFile.outputStream;
    
    var concurrency = 5;
    
    console.log('concurrency = ' + concurrency);
    
    // return a readable stream to 1000 bytes file
    var generateFile = function(idx) {
      var stream = new Stream.Readable();
      for (var i = 0; i < 1000; i++) {
        stream.push('' + idx);
      }
      stream.push(null);
      return stream;
    };
    
    Bluebird.map([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 ], function(idx) {
    
      var fileStream = generateFile(idx);
    
      zipFile.addReadStream(fileStream, idx + ".txt", { compress: false });
    
      var totalWritten = 0;
      fileStream.on('data', function(chunk) { totalWritten += chunk.length; });
    
      var deferred = Bluebird.defer();
    
      fileStream.on('error', function(err) {
        console.log('fileStream error');
        deferred.reject(err);
      });
      fileStream.on('end', function() {
        console.log("fileStream end: " + totalWritten + " written");
        deferred.resolve();
      });
    
      return deferred.promise.delay(0);
    
    }, { concurrency: concurrency }).then(function() {
      zipFile.end(function(total) { console.log("zipFile end cb: " + total + " written"); });
    }).catch(function(err) {
      zipStream.emit('error', err);
    });
    
    var total = 0;
    
    zipStream.on('data', function(chunk) { total += chunk.length });
    zipStream.on('end', function() { console.log("zipStream end: " + total + " written"); });
    
    zipStream.pipe(Fs.createWriteStream('tmp.zip'));
    
    >node tmp.js
    concurrency = 1
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written
    fileStream end: 1000 written
    fileStream end: 1000 written
    fileStream end: 1000 written
    fileStream end: 1000 written
    zipFile end cb: 11042 written
    zipStream end: 11042 written
    
    >node tmp.js
    concurrency = 2
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    zipFile end cb: 6042 written 
    zipStream end: 6042 written
    
    >node tmp.js
    concurrency = 5
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    fileStream end: 1000 written 
    zipFile end cb: 3042 written 
    zipStream end: 3042 written                
    


    opened by tomyam1 5
  • Manipulating files in an existing zip archive made by yazl

    First of all, thank you for this library; it is one of the best libraries for archiving files.

    But I have a question: do you have a task to add functionality for adding, deleting, and renaming files in an existing zip archive? I have some ideas for how to do this, but I don't think that unzipping and re-zipping all the files is the most optimized way.

    I know that this library is very old, but the hope is still alive.

    opened by Noothing 0
  • Piping streams which come from third party servers

    I use the addReadStream API to create a ZIP file based on multiple streams. It works as expected when the streams are locally created, but there is an issue when the streams come from a third-party server, and yazl doesn't even detect it. The current implementation takes the first not-done entry, pipes it, and on the "end" event takes the next not-done entry and pipes it, and so on. When there are many entries to pipe, a significant time interval might pass before some later entry/stream starts being handled. Significant means in this case that the third-party server decides the stream has been idle too long, and as a result the stream is just closed by the server and the ZIP isn't created, because such a stream is already aborted. This is what happens when we try to create a ZIP with yazl based on streams from our S3 server.

    The solution for such a situation is a new API which doesn't take streams like addReadStream does, but takes a function which creates the stream just before piping of that specific stream/file starts. Something like the following (this is already a working and tested function):

    ZipFile.prototype.addStreamCreator = function(creator, metadataPath, options) {
      var self = this;
      metadataPath = validateMetadataPath(metadataPath, false);
      if (options == null) options = {};
      var entry = new Entry(metadataPath, false, options);
      self.entries.push(entry);
      entry.setFileDataPumpFunction(async function() {
        creator(metadataPath).then((stream) => {
          entry.state = Entry.FILE_DATA_IN_PROGRESS;
          console.log(`Starting to pump ${metadataPath}`);
          pumpFileDataReadStream(self, entry, stream);
          //pumpEntries(self);
        });
      });
    };

    Thanks

    opened by fkocherovsky 0
  • [Question] .xapk support for zipping/unzipping?

    G'day guys, I came across your repository and it seems great, exactly what I'm looking for, just have a quick question, I want to know if this supports unzipping for .xapk files? hope I'm not wasting time by asking 🙂

    opened by Morsmalleo 0
  • Documentation: error handling

    Hi,

    I noticed that you throw errors on the instance of ZipFile class. So should I catch errors like this?

    var zipfile = new yazl.ZipFile();
    //...
    zipfile.on('error', error=>console.error(error));
    

    This was the only way to prevent termination of the process when, for example, I tried to add to the zip a path that does not exist.

    If this is the correct way, could you please add it to documentation? Also if this is the correct way, typing on https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/types/yazl/index.d.ts does not allow this call.

    opened by minomikula 0
  • RE: API to predict final zipfile size

    This is a follow up of #1 .

    I see that we can get the final zipfile size in the callback to .end. However, I would like to know that size way before adding files, so I can pass it along the pipeline before creating the archive.

    Would it be possible to add another API to return the predicted size for a list of inputs whose size is provided?

    For example:

    yazl.size(
      fs.readdirSync("my/folder", { withFileTypes: true }).map(dirEntry => {
        return {
           ...dirEntry,
          size: fs.statSync(dirEntry.name).size,
        };
      }),
      {
        compress: false,
      },
    );
    

    Basically, it takes a list of file-like entries, with required name and size properties, and returns the final zipfile size.

    It's fine to return -1 when the size can't be determined, but when it does (such as the size option is passed along the stream), the size should be computed. It's nicer that yazl exposes this number so that the calculation adheres to its method of archiving.

    opened by sntran 1
Owner
Josh Wolfe

eznft Yet another library for generating NFT artwork, uploading NFT assets and metadata to IPFS, deploying NFT smart contracts, and minting NFT collec

null 3 Sep 21, 2022
A JavaScript library to read, write, and merge ZIP archives in web browsers.

Armarius About Armarius is a JavaScript library to read, write, and merge ZIP archives in web browsers. This library mainly focuses on a low memory fo

Aternos 5 Nov 9, 2022
Yet another megamenu for Bootstrap 3

Yamm This is Yet another megamenu for Bootstrap 3 from Twitter. Lightweight and pure CSS megamenu that uses the standard navbar markup and the fluid g

geedmo 1.2k Nov 10, 2022
Yet another Linux distribution for voice-enabled IoT and embrace Web standards

YodaOS is Yet another Linux Distribution for voice-enabled IoT and embrace Web standards, thus it uses JavaScript as the main application/scripting la

YODAOS Project 1.2k Dec 22, 2022
Yet Another JSX using tagged template

우아한 JSX Yet Another Simple JSX using tagged template 언어의 한계가 곧 세계의 한계다 - Ludwig Wittgenstein 우아한 JSX가 캠퍼들의 표현의 자유를 넓히고 세계를 넓히는데 도움이 되었으면 합니다 Example i

null 20 Sep 22, 2022
Alternatively called Yet Another Enhancement Point Tracker

Yet Another Talent Tracker Alternatively called Yet Another Enhancement Point Tracker, but that name doesn't sound as cool in an acronym, so let's cal

Hail 2 Oct 17, 2022
Yet another concurrent priority task queue, yay!

YQueue Yet another concurrent priority task queue, yay! Install npm install yqueue Features Concurrency control Prioritized tasks Error handling for b

null 6 Apr 4, 2022
Yet another eslint blame (might) with better adaptability

yet-another-eslint-blame Yet another eslint blame (might) with better adaptability. The input is eslint's output with json format (You can see it here

快手“探索组”前端 5 Mar 7, 2022
Yet another basic minter.

Mojito Yet another basic minter. Live demo: https://mojito-app.netlify.app/ Motivation The create-eth-app team recently added useDApp in their v1.8.0,

Julien Béranger 3 Apr 26, 2022
Yet another advanced djs slash command handler made by dano with ❤️

Advanced djs slash command handler Yet another advanced djs slash command handler made by dano with ❤️ Ultimate, Efficient, Slash command handler for

null 5 Nov 7, 2022