node_modules ignore

2025-05-08 23:43:47 +02:00
parent e19d52f172
commit 4574544c9f
65041 changed files with 10593536 additions and 0 deletions

server/node_modules/stream-chain/LICENSE generated vendored Normal file

@@ -0,0 +1,11 @@
Copyright 2019 Eugene Lazutkin
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

server/node_modules/stream-chain/README.md generated vendored Normal file

@@ -0,0 +1,334 @@
# stream-chain [![NPM version][npm-img]][npm-url]
[npm-img]: https://img.shields.io/npm/v/stream-chain.svg
[npm-url]: https://npmjs.org/package/stream-chain
`stream-chain` creates a chain of streams out of regular functions, asynchronous functions, generator functions, and existing streams, while properly handling [backpressure](https://nodejs.org/en/docs/guides/backpressuring-in-streams/). The resulting chain is represented as a [Duplex](https://nodejs.org/api/stream.html#stream_class_stream_duplex) stream, which can be combined with other streams in the usual way. It eliminates boilerplate, letting you concentrate on functionality without losing performance, and makes it especially easy to build object-mode data processing pipelines.
Originally `stream-chain` was used internally with [stream-fork](https://www.npmjs.com/package/stream-fork) and [stream-json](https://www.npmjs.com/package/stream-json) to create flexible data processing pipelines.
`stream-chain` is a lightweight, no-dependencies micro-package. It is distributed under New BSD license.
## Intro
```js
const Chain = require('stream-chain');

const fs = require('fs');
const zlib = require('zlib');
const {Transform} = require('stream');

// the chain will work on a stream of number objects
const chain = new Chain([
  // transforms a value
  x => x * x,
  // returns several values
  x => [x - 1, x, x + 1],
  // waits for an asynchronous operation
  async x => await getTotalFromDatabaseByKey(x),
  // returns multiple values with a generator
  function* (x) {
    for (let i = x; i > 0; --i) {
      yield i;
    }
    return 0;
  },
  // filters out even values
  x => x % 2 ? x : null,
  // uses an arbitrary transform stream
  new Transform({
    writableObjectMode: true,
    transform(x, _, callback) {
      // transform to text
      callback(null, x.toString());
    }
  }),
  // compress
  zlib.createGzip()
]);

// log errors
chain.on('error', error => console.log(error));

// use the chain, and save the result to a file
dataSource.pipe(chain).pipe(fs.createWriteStream('output.txt.gz'));
```
Making processing pipelines appears easy: just chain functions one after another, and we are done. Real-life pipelines filter objects out and/or produce more objects out of a few. On top of that, we have to deal with asynchronous operations while processing or producing data: networking, databases, files, user responses, and so on. An unequal number of values per stage and unequal throughput of stages introduce problems like [backpressure](https://nodejs.org/en/docs/guides/backpressuring-in-streams/), which requires the algorithms implemented by [streams](https://nodejs.org/api/stream.html).
While a lot of API improvements were made to make streams easy to use, in reality, a lot of boilerplate is required when creating a pipeline. `stream-chain` eliminates most of it.
## Installation
```bash
npm i --save stream-chain
# or: yarn add stream-chain
```
## Documentation
`Chain`, which is returned by `require('stream-chain')`, is based on [Duplex](https://nodejs.org/api/stream.html#stream_class_stream_duplex). It chains its constituent streams into a single pipeline, optionally forwarding `error` events.
Many details about this package can be discovered by looking at test files located in `tests/` and in the source code (`index.js`).
### Constructor: `new Chain(fns[, options])`
The constructor accepts the following arguments:
* `fns` is an array of functions, arrays, or stream instances.
  * If a value is a function, a [Transform](https://nodejs.org/api/stream.html#stream_class_stream_transform) stream is created, which calls this function with two parameters: `chunk` (an object) and an optional `encoding`. See [Node's documentation](https://nodejs.org/api/stream.html#stream_transform_transform_chunk_encoding_callback) for more details on those parameters. The function is called in the context of the created stream.
    * If it is a regular function, it can return:
      * Regular value:
        * *(deprecated since 2.1.0)* Array of values to pass several or zero values to the next stream as they are.
          ```js
          // produces no values:
          x => []
          // produces two values:
          x => [x, x + 1]
          // produces one array value:
          x => [[x, x + 1]]
          ```
        * Single value.
          * If it is `undefined` or `null`, no value is passed.
          * Otherwise, the value is passed to the next stream.
          ```js
          // produces no values:
          x => null
          x => undefined
          // produces one value:
          x => x
          ```
      * Special value:
        * If it is an instance of [Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) or a "thenable" (an object with a method called `then()`), it is waited for. Its result should be a regular value.
          ```js
          // delays by 0.5s:
          x => new Promise(resolve => setTimeout(() => resolve(x), 500))
          ```
        * If it is a generator instance or a "nextable" (an object with a method called `next()`), it is iterated according to the [generator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Generator) protocol. The results should be regular values.
          ```js
          // produces multiple values:
          class Nextable {
            constructor(x) {
              this.x = x;
              this.i = -1;
            }
            next() {
              // done once the offset passes +1; produces x - 1, x, x + 1
              return this.i > 1 ? {done: true} : {done: false, value: this.x + this.i++};
            }
          }
          x => new Nextable(x)
          ```
          `next()` can return a `Promise` according to the [asynchronous generator](https://zaiste.net/nodejs_10_asynchronous_iteration_async_generators/) protocol.
      * Any thrown exception is caught and passed to a callback function, effectively generating an `error` event.
        ```js
        // fails
        x => { throw new Error('Bad!'); }
        ```
    * If it is an asynchronous function, it can return a regular value. In essence, it is covered under "special values" as a function that returns a promise.
      ```js
      // delays by 0.5s:
      async x => {
        await new Promise(resolve => setTimeout(() => resolve(), 500));
        return x;
      }
      ```
    * If it is a generator function, each `yield` should produce a regular value. In essence, it is covered under "special values" as a function that returns a generator object.
      ```js
      // produces multiple values:
      function* (x) {
        for (let i = -1; i <= 1; ++i) {
          if (i) yield x + i;
        }
        return x;
      }
      ```
    * *(since 2.2.0)* If it is an asynchronous generator function, each `yield` should produce a regular value. In essence, it is covered under "special values" as a function that returns an asynchronous generator object.
      ```js
      // produces multiple values:
      async function* (x) {
        for (let i = -1; i <= 1; ++i) {
          if (i) {
            await new Promise(resolve => setTimeout(() => resolve(), 50));
            yield x + i;
          }
        }
        return x;
      }
      ```
  * *(since 2.1.0)* If a value is an array, it is assumed to be an array of regular functions. Their values are passed along the chain. All values (including `null`, `undefined`, and arrays) are allowed and passed without modification. The last value is subject to the processing defined above for regular functions.
    * Empty arrays are ignored.
    * If any function returns a value produced by `Chain.final(value)` (see below), it terminates the chain, using `value` as the final value of the chain.
    * This feature bypasses streams. It is implemented for performance reasons.
  * If a value is a valid stream, it is included as is in the pipeline:
    * [Transform](https://nodejs.org/api/stream.html#stream_class_stream_transform).
    * [Duplex](https://nodejs.org/api/stream.html#stream_class_stream_duplex).
    * The very first stream can be a [Readable](https://nodejs.org/api/stream.html#stream_class_stream_readable).
      * In this case a `Chain` instance ignores all possible writes to the front and ends when the first stream ends.
    * The very last stream can be a [Writable](https://nodejs.org/api/stream.html#stream_class_stream_writable).
      * In this case a `Chain` instance does not produce any output and finishes when the last stream finishes.
      * Because the `'data'` event is not used in this case, the instance resumes itself automatically. Read about it in Node's documentation:
        * [Two modes](https://nodejs.org/api/stream.html#stream_two_modes).
        * [readable.resume()](https://nodejs.org/api/stream.html#stream_readable_resume).
* `options` is an optional object detailed in [Node's documentation](https://nodejs.org/api/stream.html#stream_new_stream_duplex_options).
  * If `options` is not specified or falsy, it defaults to:
    ```js
    {writableObjectMode: true, readableObjectMode: true}
    ```
  * Always make sure that `writableObjectMode` matches the corresponding object mode of the first stream, and that `readableObjectMode` matches the corresponding object mode of the last stream.
    * In principle both modes could be deduced, but Node does not define a standard way to determine them, so currently it cannot be done reliably.
  * Additionally, the following custom properties are recognized:
    * `skipEvents` is an optional flag. If it is falsy (the default), `'error'` events from all streams are forwarded to the created instance. If it is truthy, no event forwarding is done. A user can always forward events externally or in a constructor of derived classes.
An instance can be used to attach handlers for stream events.
```js
const chain = new Chain([x => x * x, x => [x - 1, x, x + 1]]);
chain.on('error', error => console.error(error));
dataSource.pipe(chain);
```
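The array-of-functions fast path described above composes plain functions without creating intermediate streams. A minimal standalone sketch of the idea (the names `composeArray`, `FINAL`, and the inlined `final` are illustrative, not the package's internals):

```javascript
// Illustrative sketch: compose an array of functions into one pass,
// honoring an early-termination marker similar to Chain.final().
const FINAL = Symbol('final');
const final = value => ({[FINAL]: value});

const composeArray = fns => x => {
  let value = x;
  for (const fn of fns) {
    const result = fn(value);
    if (result && typeof result === 'object' && FINAL in result) {
      return result[FINAL]; // terminate the chain early
    }
    value = result;
  }
  return value;
};

const f = composeArray([x => x * x, x => x + 1]);
console.log(f(3)); // 10

const g = composeArray([x => x * x, x => final(x), x => x + 1]);
console.log(g(3)); // 9 (the last function is skipped)
```

The package's actual implementation additionally recognizes a `none` marker and routes the last value through the regular-value rules above.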
### Properties
The following public properties are available:

* `streams` is an array of streams created by the constructor. Its values are either [Transform](https://nodejs.org/api/stream.html#stream_class_stream_transform) streams that use corresponding functions from the constructor parameter, or user-provided streams. All streams are piped sequentially, starting from the beginning.
* `input` is the beginning of the pipeline. Effectively it is the first item of `streams`.
* `output` is the end of the pipeline. Effectively it is the last item of `streams`.

Generally, a `Chain` instance should be used to represent a chain:

```js
const chain = new Chain([
  x => x * x,
  x => [x - 1, x, x + 1],
  new Transform({
    writableObjectMode: true,
    transform(chunk, _, callback) {
      callback(null, chunk.toString());
    }
  })
]);
dataSource
  .pipe(chain)
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
```

But in some cases `input` and `output` provide better control over how a data processing pipeline should be organized:

```js
chain.output
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
dataSource.pipe(chain.input);
```

Pick one style, and never mix them on the same object.
### Static methods
The following static methods are available:

* `chain(fns[, options])` is a helper factory function; it has the same arguments as the constructor and returns a `Chain` instance.
  ```js
  const {chain} = require('stream-chain');

  // simple
  dataSource
    .pipe(chain([x => x * x, x => [x - 1, x, x + 1]]));

  // all inclusive
  chain([
    dataSource,
    x => x * x,
    x => [x - 1, x, x + 1],
    zlib.createGzip(),
    fs.createWriteStream('output.txt.gz')
  ]);
  ```
* *(since 2.1.0)* `final(value)` is a helper factory function, which can be used by chained functions (see the array of functions above). It returns a special value that terminates the chain, using the passed value as the result of the chain.
  ```js
  const {chain, final} = require('stream-chain');

  // simple
  dataSource
    .pipe(chain([[x => x * x, x => 2 * x + 1]]));
  // faster than [x => x * x, x => 2 * x + 1]

  // final
  dataSource
    .pipe(chain([[
      x => x * x,
      x => final(x),
      x => 2 * x + 1
    ]]));
  // the same as [[x => x * x, x => x]]
  // the same as [[x => x * x]]
  // the same as [x => x * x]

  // final as a terminator
  dataSource
    .pipe(chain([[
      x => x * x,
      x => final(),
      x => 2 * x + 1
    ]]));
  // produces no values, because the final value is undefined,
  // which is interpreted as "no value shall be passed" (see the doc above)

  // final() as a filter
  dataSource
    .pipe(chain([[
      x => x * x,
      x => x % 2 ? final() : x,
      x => 2 * x + 1
    ]]));
  // only even values are passed, odd values are ignored

  // if you want to be really performant...
  const none = final();
  dataSource
    .pipe(chain([[
      x => x * x,
      x => x % 2 ? none : x,
      x => 2 * x + 1
    ]]));
  ```
* *(since 2.1.0)* `many(array)` is a helper factory function that wraps arrays to be interpreted as multiple values returned from a function. At the moment it is redundant: a plain array indicates the same thing, but naked arrays are deprecated, and future versions will pass them through as is. Using `many()` better indicates the intention. Additionally, future versions will use it with arrays of functions (see above).
  ```js
  const {chain, many} = require('stream-chain');

  dataSource
    .pipe(chain([x => many([x, x + 1, x + 2])]));
  // currently the same as [x => [x, x + 1, x + 2]]
  ```
## Release History
- 2.2.5 *Relaxed the definition of a stream (thx [Rich Hodgkins](https://github.com/rhodgkins)).*
- 2.2.4 *Bugfix: wrong `const`-ness in the async generator branch (thx [Patrick Pang](https://github.com/patrickpang)).*
- 2.2.3 *Technical release. No need to upgrade.*
- 2.2.2 *Technical release. No need to upgrade.*
- 2.2.1 *Technical release: new symbols namespace, explicit license (thx [Keen Yee Liau](https://github.com/kyliau)), added Greenkeeper.*
- 2.2.0 *Added utilities: `take`, `takeWhile`, `skip`, `skipWhile`, `fold`, `scan`, `Reduce`, `comp`.*
- 2.1.0 *Added simple transducers, dropped Node 6.*
- 2.0.3 *Added TypeScript typings and the badge.*
- 2.0.2 *Workaround for Node 6: use `'finish'` event instead of `_final()`.*
- 2.0.1 *Improved documentation.*
- 2.0.0 *Upgraded to use Duplex instead of EventEmitter as the base.*
- 1.0.3 *Improved documentation.*
- 1.0.2 *Better README.*
- 1.0.1 *Fixed the README.*
- 1.0.0 *The initial release.*

server/node_modules/stream-chain/defs.js generated vendored Normal file

@@ -0,0 +1,22 @@
'use strict';
const none = Symbol.for('object-stream.none');
const finalSymbol = Symbol.for('object-stream.final');
const manySymbol = Symbol.for('object-stream.many');
const final = value => ({[finalSymbol]: value});
const many = values => ({[manySymbol]: values});
const isFinal = o => o && typeof o == 'object' && finalSymbol in o;
const isMany = o => o && typeof o == 'object' && manySymbol in o;
const getFinalValue = o => o[finalSymbol];
const getManyValues = o => o[manySymbol];
module.exports.none = none;
module.exports.final = final;
module.exports.isFinal = isFinal;
module.exports.getFinalValue = getFinalValue;
module.exports.many = many;
module.exports.isMany = isMany;
module.exports.getManyValues = getManyValues;
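Since the markers are plain objects keyed by globally registered symbols (`Symbol.for`), they can be created and inspected anywhere. A quick round-trip demonstration, recreating the helpers inline so the snippet stands alone:

```javascript
// Same definitions as in defs.js above, inlined for a standalone demo.
const finalSymbol = Symbol.for('object-stream.final');
const final = value => ({[finalSymbol]: value});
const isFinal = o => o && typeof o == 'object' && finalSymbol in o;
const getFinalValue = o => o[finalSymbol];

const marked = final(42);
console.log(isFinal(marked));       // true
console.log(getFinalValue(marked)); // 42
console.log(isFinal(42));           // false (short-circuits for primitives)
```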

server/node_modules/stream-chain/index.js generated vendored Normal file

@@ -0,0 +1,203 @@
'use strict';

const {Readable, Writable, Duplex, Transform} = require('stream');

const none = Symbol.for('object-stream.none');
const finalSymbol = Symbol.for('object-stream.final');
const manySymbol = Symbol.for('object-stream.many');

const final = value => ({[finalSymbol]: value});
const many = values => ({[manySymbol]: values});

const isFinal = o => o && typeof o == 'object' && finalSymbol in o;
const isMany = o => o && typeof o == 'object' && manySymbol in o;

const getFinalValue = o => o[finalSymbol];
const getManyValues = o => o[manySymbol];

const runAsyncGenerator = async (gen, stream) => {
  for (;;) {
    let data = gen.next();
    if (data && typeof data.then == 'function') {
      data = await data;
    }
    if (data.done) break;
    let value = data.value;
    if (value && typeof value.then == 'function') {
      value = await value;
    }
    Chain.sanitize(value, stream);
  }
};

const wrapFunction = fn =>
  new Transform({
    writableObjectMode: true,
    readableObjectMode: true,
    transform(chunk, encoding, callback) {
      try {
        const result = fn.call(this, chunk, encoding);
        if (result && typeof result.then == 'function') {
          // thenable
          result.then(
            result => (Chain.sanitize(result, this), callback(null)),
            error => callback(error)
          );
          return;
        }
        if (result && typeof result.next == 'function') {
          // generator
          runAsyncGenerator(result, this).then(
            () => callback(null),
            error => callback(error)
          );
          return;
        }
        Chain.sanitize(result, this);
        callback(null);
      } catch (error) {
        callback(error);
      }
    }
  });

const wrapArray = fns =>
  new Transform({
    writableObjectMode: true,
    readableObjectMode: true,
    transform(chunk, encoding, callback) {
      try {
        let value = chunk;
        for (let i = 0; i < fns.length; ++i) {
          const result = fns[i].call(this, value, encoding);
          if (result === Chain.none) {
            callback(null);
            return;
          }
          if (Chain.isFinal(result)) {
            value = Chain.getFinalValue(result);
            break;
          }
          value = result;
        }
        Chain.sanitize(value, this);
        callback(null);
      } catch (error) {
        callback(error);
      }
    }
  });

// is*NodeStream functions taken from https://github.com/nodejs/node/blob/master/lib/internal/streams/utils.js

const isReadableNodeStream = obj =>
  obj &&
  typeof obj.pipe === 'function' &&
  typeof obj.on === 'function' &&
  (!obj._writableState || (typeof obj._readableState === 'object' ? obj._readableState.readable : null) !== false) && // Duplex
  (!obj._writableState || obj._readableState); // Writable has .pipe.

const isWritableNodeStream = obj =>
  obj &&
  typeof obj.write === 'function' &&
  typeof obj.on === 'function' &&
  (!obj._readableState || (typeof obj._writableState === 'object' ? obj._writableState.writable : null) !== false); // Duplex

const isDuplexNodeStream = obj =>
  obj && typeof obj.pipe === 'function' && obj._readableState && typeof obj.on === 'function' && typeof obj.write === 'function';

class Chain extends Duplex {
  constructor(fns, options) {
    super(options || {writableObjectMode: true, readableObjectMode: true});
    if (!(fns instanceof Array) || !fns.length) {
      throw Error("Chain's argument should be a non-empty array.");
    }
    this.streams = fns
      .filter(fn => fn)
      .map((fn, index, fns) => {
        if (typeof fn === 'function' || fn instanceof Array) return Chain.convertToTransform(fn);
        if (isDuplexNodeStream(fn) || (!index && isReadableNodeStream(fn)) || (index === fns.length - 1 && isWritableNodeStream(fn))) {
          return fn;
        }
        throw Error('Arguments should be functions, arrays or streams.');
      })
      .filter(s => s);
    this.input = this.streams[0];
    this.output = this.streams.reduce((output, stream) => (output && output.pipe(stream)) || stream);
    if (!isWritableNodeStream(this.input)) {
      this._write = (_1, _2, callback) => callback(null);
      this._final = callback => callback(null); // unavailable in Node 6
      this.input.on('end', () => this.end());
    }
    if (isReadableNodeStream(this.output)) {
      this.output.on('data', chunk => !this.push(chunk) && this.output.pause());
      this.output.on('end', () => this.push(null));
    } else {
      this._read = () => {}; // nop
      this.resume();
      this.output.on('finish', () => this.push(null));
    }
    // connect events
    if (!options || !options.skipEvents) {
      this.streams.forEach(stream => stream.on('error', error => this.emit('error', error)));
    }
  }
  _write(chunk, encoding, callback) {
    let error = null;
    try {
      this.input.write(chunk, encoding, e => callback(e || error));
    } catch (e) {
      error = e;
    }
  }
  _final(callback) {
    let error = null;
    try {
      this.input.end(null, null, e => callback(e || error));
    } catch (e) {
      error = e;
    }
  }
  _read() {
    this.output.resume();
  }
  static make(fns, options) {
    return new Chain(fns, options);
  }
  static sanitize(result, stream) {
    if (Chain.isFinal(result)) {
      result = Chain.getFinalValue(result);
    } else if (Chain.isMany(result)) {
      result = Chain.getManyValues(result);
    }
    if (result !== undefined && result !== null && result !== Chain.none) {
      if (result instanceof Array) {
        result.forEach(value => value !== undefined && value !== null && stream.push(value));
      } else {
        stream.push(result);
      }
    }
  }
  static convertToTransform(fn) {
    if (typeof fn === 'function') return wrapFunction(fn);
    if (fn instanceof Array) return fn.length ? wrapArray(fn) : null;
    return null;
  }
}

Chain.none = none;
Chain.final = final;
Chain.isFinal = isFinal;
Chain.getFinalValue = getFinalValue;
Chain.many = many;
Chain.isMany = isMany;
Chain.getManyValues = getManyValues;

Chain.chain = Chain.make;
Chain.make.Constructor = Chain;

module.exports = Chain;

server/node_modules/stream-chain/package.json generated vendored Normal file

@@ -0,0 +1,31 @@
{
  "name": "stream-chain",
  "version": "2.2.5",
  "description": "Chain functions as transform streams.",
  "main": "index.js",
  "scripts": {
    "debug": "node --inspect-brk tests/tests.js",
    "test": "node tests/tests.js"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/uhop/stream-chain.git"
  },
  "keywords": [
    "stream",
    "chain"
  ],
  "author": "Eugene Lazutkin <eugene.lazutkin@gmail.com> (http://lazutkin.com/)",
  "license": "BSD-3-Clause",
  "bugs": {
    "url": "https://github.com/uhop/stream-chain/issues"
  },
  "homepage": "https://github.com/uhop/stream-chain#readme",
  "devDependencies": {
    "heya-unit": "^0.3.0"
  },
  "files": [
    "/*.js",
    "/utils"
  ]
}

server/node_modules/stream-chain/utils/FromIterable.js generated vendored Normal file

@@ -0,0 +1,94 @@
'use strict';

const {Readable} = require('stream');

class FromIterable extends Readable {
  constructor(options) {
    super(Object.assign({}, options, {objectMode: true}));
    this._iterable = null;
    this._next = null;
    if (options) {
      'iterable' in options && (this._iterable = options.iterable);
    }
    !this._iterable && (this._read = this._readStop);
  }
  _read() {
    if (Symbol.asyncIterator && typeof this._iterable[Symbol.asyncIterator] == 'function') {
      this._next = this._iterable[Symbol.asyncIterator]();
      this._iterable = null;
      this._read = this._readNext;
      this._readNext();
      return;
    }
    if (Symbol.iterator && typeof this._iterable[Symbol.iterator] == 'function') {
      this._next = this._iterable[Symbol.iterator]();
      this._iterable = null;
      this._read = this._readNext;
      this._readNext();
      return;
    }
    if (typeof this._iterable.next == 'function') {
      this._next = this._iterable;
      this._iterable = null;
      this._read = this._readNext;
      this._readNext();
      return;
    }
    const result = this._iterable();
    this._iterable = null;
    if (result && typeof result.then == 'function') {
      result.then(value => this.push(value), error => this.emit('error', error));
      this._read = this._readStop;
      return;
    }
    if (result && typeof result.next == 'function') {
      this._next = result;
      this._read = this._readNext;
      this._readNext();
      return;
    }
    this.push(result);
    this._read = this._readStop;
  }
  _readNext() {
    for (;;) {
      const result = this._next.next();
      if (result && typeof result.then == 'function') {
        result.then(
          value => {
            if (value.done || value.value === null) {
              this.push(null);
              this._next = null;
              this._read = this._readStop;
            } else {
              this.push(value.value);
            }
          },
          error => this.emit('error', error)
        );
        break;
      }
      if (result.done || result.value === null) {
        this.push(null);
        this._next = null;
        this._read = this._readStop;
        break;
      }
      if (!this.push(result.value)) break;
    }
  }
  _readStop() {
    this.push(null);
  }
  static make(iterable) {
    return new FromIterable(typeof iterable == 'object' && iterable.iterable ? iterable : {iterable});
  }
}

FromIterable.fromIterable = FromIterable.make;
FromIterable.make.Constructor = FromIterable;

module.exports = FromIterable;

server/node_modules/stream-chain/utils/Reduce.js generated vendored Normal file

@@ -0,0 +1,40 @@
'use strict';

const {Writable} = require('stream');

const defaultInitial = 0;
const defaultReducer = (acc, value) => value;

class Reduce extends Writable {
  constructor(options) {
    super(Object.assign({}, options, {objectMode: true}));
    this.accumulator = defaultInitial;
    this._reducer = defaultReducer;
    if (options) {
      'initial' in options && (this.accumulator = options.initial);
      'reducer' in options && (this._reducer = options.reducer);
    }
  }
  _write(chunk, encoding, callback) {
    const result = this._reducer.call(this, this.accumulator, chunk);
    if (result && typeof result.then == 'function') {
      result.then(
        value => {
          this.accumulator = value;
          callback(null);
        },
        error => callback(error)
      );
    } else {
      this.accumulator = result;
      callback(null);
    }
  }
  static make(reducer, initial) {
    return new Reduce(typeof reducer == 'object' ? reducer : {reducer, initial});
  }
}

Reduce.reduce = Reduce.make;
Reduce.make.Constructor = Reduce;

module.exports = Reduce;

server/node_modules/stream-chain/utils/asFun.js generated vendored Normal file

@@ -0,0 +1,85 @@
'use strict';

const {none, final, isFinal, getFinalValue, many, isMany, getManyValues} = require('../defs');

const next = async (value, fns, index, push) => {
  for (let i = index; i <= fns.length; ++i) {
    if (value && typeof value.then == 'function') {
      // thenable
      value = await value;
    }
    if (value === none) break;
    if (isFinal(value)) {
      const val = getFinalValue(value);
      val !== none && push(val);
      break;
    }
    if (isMany(value)) {
      const values = getManyValues(value);
      if (i == fns.length) {
        values.forEach(val => push(val));
      } else {
        for (let j = 0; j < values.length; ++j) {
          await next(values[j], fns, i, push);
        }
      }
      break;
    }
    if (value && typeof value.next == 'function') {
      // generator
      for (;;) {
        let data = value.next();
        if (data && typeof data.then == 'function') {
          data = await data;
        }
        if (data.done) break;
        if (i == fns.length) {
          push(data.value);
        } else {
          await next(data.value, fns, i, push);
        }
      }
      break;
    }
    if (i == fns.length) {
      push(value);
      break;
    }
    value = fns[i](value);
  }
};

const nop = () => {};

const asFun = (...fns) => {
  fns = fns.filter(fn => fn);
  if (!fns.length) return nop;
  if (Symbol.asyncIterator && fns[0][Symbol.asyncIterator]) {
    fns[0] = fns[0][Symbol.asyncIterator];
  } else if (Symbol.iterator && fns[0][Symbol.iterator]) {
    fns[0] = fns[0][Symbol.iterator];
  }
  return async value => {
    const results = [];
    await next(value, fns, 0, value => results.push(value));
    switch (results.length) {
      case 0:
        return none;
      case 1:
        return results[0];
    }
    return many(results);
  };
};

asFun.next = next;
asFun.none = none;
asFun.final = final;
asFun.isFinal = isFinal;
asFun.getFinalValue = getFinalValue;
asFun.many = many;
asFun.isMany = isMany;
asFun.getManyValues = getManyValues;

module.exports = asFun;

server/node_modules/stream-chain/utils/asGen.js generated vendored Normal file

@@ -0,0 +1,77 @@
'use strict';

const {none, final, isFinal, getFinalValue, many, isMany, getManyValues} = require('../defs');

const next = async function* (value, fns, index) {
  for (let i = index; i <= fns.length; ++i) {
    if (value && typeof value.then == 'function') {
      // thenable
      value = await value;
    }
    if (value === none) break;
    if (isFinal(value)) {
      const val = getFinalValue(value);
      if (val !== none) yield val;
      break;
    }
    if (isMany(value)) {
      const values = getManyValues(value);
      if (i == fns.length) {
        yield* values;
      } else {
        for (let j = 0; j < values.length; ++j) {
          yield* next(values[j], fns, i);
        }
      }
      break;
    }
    if (value && typeof value.next == 'function') {
      // generator
      for (;;) {
        let data = value.next();
        if (data && typeof data.then == 'function') {
          data = await data;
        }
        if (data.done) break;
        if (i == fns.length) {
          yield data.value;
        } else {
          yield* next(data.value, fns, i);
        }
      }
      break;
    }
    if (i == fns.length) {
      yield value;
      break;
    }
    value = fns[i](value);
  }
};

const nop = async function* () {};

const asGen = (...fns) => {
  fns = fns.filter(fn => fn);
  if (!fns.length) return nop;
  if (Symbol.asyncIterator && fns[0][Symbol.asyncIterator]) {
    fns[0] = fns[0][Symbol.asyncIterator];
  } else if (Symbol.iterator && fns[0][Symbol.iterator]) {
    fns[0] = fns[0][Symbol.iterator];
  }
  return async function* (value) {
    yield* next(value, fns, 0);
  };
};

asGen.next = next;
asGen.none = none;
asGen.final = final;
asGen.isFinal = isFinal;
asGen.getFinalValue = getFinalValue;
asGen.many = many;
asGen.isMany = isMany;
asGen.getManyValues = getManyValues;

module.exports = asGen;

server/node_modules/stream-chain/utils/comp.js generated vendored Normal file

@@ -0,0 +1,20 @@
'use strict';

const {Transform} = require('stream');
const {next} = require('./asFun');
const {sanitize} = require('../index');

const comp = (...fns) => {
  fns = fns.filter(fn => fn);
  return fns.length
    ? new Transform({
        writableObjectMode: true,
        readableObjectMode: true,
        transform(chunk, encoding, callback) {
          next(chunk, fns, 0, value => sanitize(value, this)).then(() => callback(null), error => callback(error));
        }
      })
    : null;
};

module.exports = comp;

server/node_modules/stream-chain/utils/fold.js generated vendored Normal file

@@ -0,0 +1,43 @@
'use strict';

const {Transform} = require('stream');

const defaultInitial = 0;
const defaultReducer = (acc, value) => value;

class Fold extends Transform {
  constructor(options) {
    super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
    this._accumulator = defaultInitial;
    this._reducer = defaultReducer;
    if (options) {
      'initial' in options && (this._accumulator = options.initial);
      'reducer' in options && (this._reducer = options.reducer);
    }
  }
  _transform(chunk, encoding, callback) {
    const result = this._reducer.call(this, this._accumulator, chunk);
    if (result && typeof result.then == 'function') {
      result.then(
        value => {
          this._accumulator = value;
          callback(null);
        },
        error => callback(error)
      );
    } else {
      this._accumulator = result;
      callback(null);
    }
  }
  _final(callback) {
    this.push(this._accumulator);
    callback(null);
  }
  static make(reducer, initial) {
    return new Fold(typeof reducer == 'object' ? reducer : {reducer, initial});
  }
}

Fold.make.Constructor = Fold;

module.exports = Fold.make;

server/node_modules/stream-chain/utils/gen.js generated vendored Normal file

@@ -0,0 +1,24 @@
'use strict';

const {Transform} = require('stream');
const {next} = require('./asGen');
const {sanitize} = require('../index');

const gen = (...fns) => {
  fns = fns.filter(fn => fn);
  return fns.length
    ? new Transform({
        writableObjectMode: true,
        readableObjectMode: true,
        transform(chunk, encoding, callback) {
          (async () => {
            for await (let value of next(chunk, fns, 0)) {
              sanitize(value, this);
            }
          })().then(() => callback(null), error => callback(error));
        }
      })
    : null;
};

module.exports = gen;

server/node_modules/stream-chain/utils/scan.js generated vendored Normal file

@@ -0,0 +1,41 @@
'use strict';

const {Transform} = require('stream');

const defaultInitial = 0;
const defaultReducer = (acc, value) => value;

class Scan extends Transform {
  constructor(options) {
    super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
    this._accumulator = defaultInitial;
    this._reducer = defaultReducer;
    if (options) {
      'initial' in options && (this._accumulator = options.initial);
      'reducer' in options && (this._reducer = options.reducer);
    }
  }
  _transform(chunk, encoding, callback) {
    const result = this._reducer.call(this, this._accumulator, chunk);
    if (result && typeof result.then == 'function') {
      result.then(
        value => {
          this._accumulator = value;
          this.push(this._accumulator);
          callback(null);
        },
        error => callback(error)
      );
    } else {
      this._accumulator = result;
      this.push(this._accumulator);
      callback(null);
    }
  }
  static make(reducer, initial) {
    return new Scan(typeof reducer == 'object' ? reducer : {reducer, initial});
  }
}

Scan.make.Constructor = Scan;

module.exports = Scan.make;

server/node_modules/stream-chain/utils/skip.js generated vendored Normal file

@@ -0,0 +1,32 @@
'use strict';
const {Transform} = require('stream');
class Skip extends Transform {
constructor(options) {
super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
this._n = 0;
if (options) {
'n' in options && (this._n = options.n);
}
if (this._n <= 0) {
this._transform = this._passThrough;
}
}
_transform(chunk, encoding, callback) {
if (--this._n <= 0) {
this._transform = this._passThrough;
}
callback(null);
}
_passThrough(chunk, encoding, callback) {
this.push(chunk);
callback(null);
}
static make(n) {
return new Skip(typeof n == 'object' ? n : {n});
}
}
Skip.make.Constructor = Skip;
module.exports = Skip.make;

server/node_modules/stream-chain/utils/skipWhile.js generated vendored Normal file

@@ -0,0 +1,46 @@
'use strict';
const {Transform} = require('stream');
const alwaysFalse = () => false;
class SkipWhile extends Transform {
constructor(options) {
super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
this._condition = alwaysFalse;
if (options) {
'condition' in options && (this._condition = options.condition);
}
}
_transform(chunk, encoding, callback) {
const result = this._condition.call(this, chunk);
if (result && typeof result.then == 'function') {
result.then(
flag => {
if (!flag) {
this._transform = this._passThrough;
this.push(chunk);
}
callback(null);
},
error => callback(error)
);
} else {
if (!result) {
this._transform = this._passThrough;
this.push(chunk);
}
callback(null);
}
}
_passThrough(chunk, encoding, callback) {
this.push(chunk);
callback(null);
}
static make(condition) {
return new SkipWhile(typeof condition == 'object' ? condition : {condition});
}
}
SkipWhile.make.Constructor = SkipWhile;
module.exports = SkipWhile.make;

server/node_modules/stream-chain/utils/take.js generated vendored Normal file

@@ -0,0 +1,39 @@
'use strict';
const {Transform} = require('stream');
class Take extends Transform {
constructor(options) {
super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
this._n = this._skip = 0;
if (options) {
'n' in options && (this._n = options.n);
'skip' in options && (this._skip = options.skip);
}
if (this._skip <= 0) {
this._transform = this._n > 0 ? this._countValues : this._doNothing;
}
}
_transform(chunk, encoding, callback) {
if (--this._skip <= 0) {
this._transform = this._n > 0 ? this._countValues : this._doNothing;
}
callback(null);
}
_countValues(chunk, encoding, callback) {
if (--this._n <= 0) {
this._transform = this._doNothing;
}
this.push(chunk);
callback(null);
}
_doNothing(chunk, encoding, callback) {
callback(null);
}
static make(n) {
return new Take(typeof n == 'object' ? n : {n});
}
}
Take.make.Constructor = Take;
module.exports = Take.make;

server/node_modules/stream-chain/utils/takeWhile.js generated vendored Normal file

@@ -0,0 +1,47 @@
'use strict';
const {Transform} = require('stream');
const alwaysTrue = () => true;
class TakeWhile extends Transform {
constructor(options) {
super(Object.assign({}, options, {writableObjectMode: true, readableObjectMode: true}));
this._condition = alwaysTrue;
if (options) {
'condition' in options && (this._condition = options.condition);
}
}
_transform(chunk, encoding, callback) {
const result = this._condition.call(this, chunk);
if (result && typeof result.then == 'function') {
result.then(
flag => {
if (flag) {
this.push(chunk);
} else {
this._transform = this._doNothing;
}
callback(null);
},
error => callback(error)
);
} else {
if (result) {
this.push(chunk);
} else {
this._transform = this._doNothing;
}
callback(null);
}
}
_doNothing(chunk, encoding, callback) {
callback(null);
}
static make(condition) {
return new TakeWhile(typeof condition == 'object' ? condition : {condition});
}
}
TakeWhile.make.Constructor = TakeWhile;
module.exports = TakeWhile.make;