This repository has been archived by the owner on Dec 4, 2018. It is now read-only.

Streams2 #61

Open
calvinmetcalf wants to merge 13 commits into master

Conversation

calvinmetcalf

this would likely be a major version bump, as there are edge cases that can be broken by the streams1->streams2 change (which actually popped up in the destroy_missing.js test). I also added a clarification of the multiple_objects_error test.

I have a less radical version that just switches from through1 to through2 if you'd prefer that
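
For reference, a minimal sketch of what the through -> through2 switch looks like at the call site (illustrative only; not the actual JSONStream internals):

```js
var through = require('through')
var through2 = require('through2')

// classic through (streams1): write/end callbacks, emitting via this.queue()
var s1 = through(function write (obj) {
  this.queue(JSON.stringify(obj) + '\n')
})

// through2 (streams2/3): a Transform wrapper with an explicit callback
var s2 = through2.obj(function (obj, enc, cb) {
  cb(null, JSON.stringify(obj) + '\n')
})
```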

@calvinmetcalf
Author

ping @dominictarr

@dominictarr
Owner

generally we have just made a 2 version of the module for the streams2 version - can you just publish it as a separate module?

@calvinmetcalf
Author

that seems a lot more complex than just putting out a 2.0

@tcort

tcort commented Jun 24, 2015

Any updates on this topic? I'm looking for a streams2 JSON parser and I'm trying to decide which one to use. So far I've found this PR, jsonstream2 and jsonstream2-native.

@calvinmetcalf I started down a similar path as you, making a few minor changes to switch JSONStream over to through2, but I'm curious, why did you abandon through2 in favour of readable-stream and later io-stream?

@calvinmetcalf
Author

mainly in order to create a constructor function, which in theory should be marginally faster; I ended up publishing it as jsonstream2a
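
For context, the "constructor function" approach generally means subclassing readable-stream's Transform so the transform logic lives on the prototype instead of in per-instance closures. A rough sketch with hypothetical names (not jsonstream2a's actual code):

```js
var Transform = require('readable-stream').Transform
var inherits = require('util').inherits

// hypothetical Parser type: one shared prototype method per instance,
// rather than allocating new closures for every stream created
function Parser () {
  if (!(this instanceof Parser)) return new Parser()
  Transform.call(this, { objectMode: true })
}
inherits(Parser, Transform)

Parser.prototype._transform = function (chunk, enc, cb) {
  // real parsing would go here; this just passes the chunk through
  cb(null, chunk)
}
```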

@dominictarr
Owner

so I was looking at the streams2 (I mean 3) code the other day... and, well, it now works like streams1 did!
look at Readable#pipe - it uses the data event, and pause and resume, like classic streams...

It's just streams1, except the stream starts off paused (and resumes on pipe, if you didn't pause it explicitly)
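
A rough illustration of that behaviour using Node's built-in stream module (not code from this PR):

```js
var stream = require('stream')

var src = new stream.Readable({ read: function () {} })
src.push('hello\n')
src.push(null)

// nothing flows yet: the stream sits paused until something asks for data;
// pipe() attaches a 'data' listener and calls resume(), like classic streams
src.pipe(process.stdout)
```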

martinheidegger mentioned this pull request Sep 5, 2016
@yocontra

@dominictarr Would you be interested in taking a PR to update through1 -> through2 or do you still think a new repo is the best choice?

@dominictarr
Owner

@contra node streams are back to the streams1 api, streams2 was just a detour - look at the code: it uses on('data', ...) by default and only falls back to streams2 style if you set an on('readable', ...) listener. The one thing that classic streams don't do is start paused by default (before pipe). But is that even a big deal for something like JSONStream?

basically: what is the advantage of streams2 here?
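
To illustrate the "data by default, readable as a fallback" point above, here are the two consumption styles side by side ('data.json' is just a placeholder filename):

```js
var fs = require('fs')

// flowing / classic style: chunks are pushed at you
fs.createReadStream('data.json').on('data', function (chunk) {
  console.log('got', chunk.length, 'bytes')
})

// streams2 'readable' style: you pull chunks when notified
fs.createReadStream('data.json').on('readable', function () {
  var chunk
  while ((chunk = this.read()) !== null) {
    console.log('read', chunk.length, 'bytes')
  }
})
```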

@calvinmetcalf
Author

basically: what is the advantage of streams2 here?

compatibility: streams2+ objects have different methods; this came up today when wanting to use something with syncthrough

@dominictarr
Owner

Okay, so I think the solution here is to refactor the streams out of this module, or at least into a separate file - so that at its core you get an update/end api which can be trivially wrapped into any stream: classic streams, streams2, or pull-stream. I want to be able to use this without getting the whole ReadableStream crap (50k!); then you can do require('JSONStream/streams2') if you want that.
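
A rough sketch of that shape - a trivially buffering core with hypothetical names, wrapped in a classic through stream (a real parser would emit values incrementally rather than buffering everything):

```js
var through = require('through')

// hypothetical core: no stream dependency, just update/end
function createCore (onValue) {
  var buffered = ''
  return {
    update: function (chunk) { buffered += chunk },
    end: function () { onValue(JSON.parse(buffered)) }
  }
}

// wrapping the core in a classic through stream
function toStream () {
  var core = createCore(function (value) { stream.queue(value) })
  var stream = through(
    function write (chunk) { core.update(chunk) },
    function end () { core.end(); stream.queue(null) }
  )
  return stream
}
```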

I just use pull-stream now; it's just much easier all round than node streams. And compatibility isn't really a problem, because there are converters, and there is a large ecosystem of highly composable modules.
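
For example, a minimal sketch assuming the stream-to-pull-stream converter module is used to bridge a classic node readable into a pull-stream pipeline:

```js
var pull = require('pull-stream')
var toPull = require('stream-to-pull-stream')

// consume a classic node readable as a pull-stream source
pull(
  toPull.source(process.stdin),
  pull.map(function (buf) { return buf.toString().toUpperCase() }),
  pull.drain(function (line) { process.stdout.write(line) })
)
```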

@dominictarr
Owner

@calvinmetcalf incidentally, syncthrough now works with through-based streams1 modules like this one. mcollina/syncthrough#3

Also, syncthrough would be a very appropriate stream for JSONStream to export.
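
A minimal sketch, assuming syncthrough's synchronous transform-returns-chunk style (no callback):

```js
var syncthrough = require('syncthrough')

// the transform runs synchronously and returns the transformed chunk
var upper = syncthrough(function (chunk) {
  return chunk.toString().toUpperCase()
})

process.stdin.pipe(upper).pipe(process.stdout)
```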

@calvinmetcalf
Author

calvinmetcalf commented Jan 25, 2017 via email

4 participants