Merge pull request #11 from seald/remove-async

Remove async dependency
tex0l committed 3 years ago (via GitHub)
commit 6708541c72
47 changed files (changed-line count in parentheses):

  1. API.md (1024)
  2. CHANGELOG.md (54)
  3. README.md (847)
  4. benchmarks/commonUtilities.js (3)
  5. benchmarks/ensureIndex.js (9)
  6. benchmarks/find.js (10)
  7. benchmarks/findOne.js (10)
  8. benchmarks/findWithIn.js (10)
  9. benchmarks/insert.js (8)
  10. benchmarks/loadDatabase.js (10)
  11. benchmarks/remove.js (14)
  12. benchmarks/update.js (14)
  13. browser-version/lib/customUtils.js (12)
  14. browser-version/lib/storage.browser.js (196)
  15. browser-version/lib/storage.react-native.js (296)
  16. index.d.ts (52)
  17. jsdoc.conf.js (5)
  18. karma.conf.template.js (2)
  19. lib/byline.js (106)
  20. lib/cursor.js (167)
  21. lib/customUtils.js (9)
  22. lib/datastore.js (1009)
  23. lib/executor.js (99)
  24. lib/indexes.js (79)
  25. lib/model.js (432)
  26. lib/persistence.js (314)
  27. lib/storage.js (305)
  28. lib/utils.js (61)
  29. lib/waterfall.js (48)
  30. package-lock.json (15041)
  31. package.json (14)
  32. test/browser/load.spec.js (58)
  33. test/browser/nedb-browser.spec.js (6)
  34. test/byline.test.js (2)
  35. test/cursor.async.test.js (519)
  36. test/cursor.test.js (47)
  37. test/db.async.test.js (2036)
  38. test/db.test.js (81)
  39. test/executor.async.test.js (83)
  40. test/executor.test.js (17)
  41. test/persistence.async.test.js (1001)
  42. test/persistence.test.js (231)
  43. test/utils.test.js (46)
  44. test_lac/loadAndCrash.test.js (138)
  45. test_lac/openFds.test.js (107)
  46. typings-tests.ts (4)
  47. webpack.config.js (8)

API.md (1024)

File diff suppressed because it is too large.

CHANGELOG.md (54)

```diff
@@ -6,6 +6,58 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres
 to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [3.0.0] - Unreleased
+### Added
+- Added a `Promise`-based interface.
+- The JSDoc is now much more exhaustive.
+- An auto-generated JSDoc file is generated: [API.md](./API.md).
+- Added `Datastore#dropDatabaseAsync` and its callback equivalent.
+- The Error given when `Datastore#corruptAlertThreshold` is reached now has three properties: `dataLength`, the number of lines in the database file (excluding empty lines); `corruptItems`, the number of corrupted lines; and `corruptionRate`, the rate of corruption, between 0 and 1.
+### Changed
+- The `corruptAlertThreshold` no longer takes empty lines into account, and the error message is slightly changed.
+- The signature of `Datastore#update`'s callback is slightly changed: the `upsert` flag is always defined as either `true` or `false`, never `null` nor `undefined`, and `affectedDocuments` is `null` when none is given rather than `undefined` (except when there is an error, of course).
+- In order to expose a `Promise`-based interface and to remove `async` from the dependencies, many internals have been either rewritten or removed:
+  - Datastore:
+    - `Datastore#getCandidates` replaced with `Datastore#_getCandidatesAsync`;
+    - `Datastore#resetIndexes` replaced with `Datastore#_resetIndexes`;
+    - `Datastore#addToIndexes` replaced with `Datastore#_addToIndexes`;
+    - `Datastore#removeFromIndexes` replaced with `Datastore#_removeFromIndexes`;
+    - `Datastore#updateIndexes` replaced with `Datastore#_updateIndexes`;
+    - `Datastore#_insert` replaced with `Datastore#_insertAsync`;
+    - `Datastore#_update` replaced with `Datastore#_updateAsync`;
+    - `Datastore#_remove` replaced with `Datastore#_removeAsync`;
+  - Persistence:
+    - `Persistence#loadDatabase` replaced with `Persistence#loadDatabaseAsync`;
+    - `Persistence#persistCachedDatabase` replaced with `Persistence#persistCachedDatabaseAsync`;
+    - `Persistence#persistNewState` replaced with `Persistence#persistNewStateAsync`;
+    - `Persistence#treatRawStream` replaced with `Persistence#treatRawStreamAsync`;
+    - `Persistence.ensureDirectoryExists` replaced with `Persistence#ensureDirectoryExistsAsync`;
+  - Cursor:
+    - `Cursor#_exec` replaced with `Cursor#_execAsync`;
+    - `Cursor#project` replaced with `Cursor#_project`;
+    - `Cursor#execFn` has been renamed to `Cursor#mapFn` and no longer supports a callback in its signature; it must be a synchronous function.
+  - Executor: it has been rewritten entirely without the `async` library.
+    - `Executor#buffer` & `Executor#queue` do not have the same signatures as before;
+    - `Executor#push` replaced with `Executor#pushAsync`, which is substantially different;
+  - Storage modules: callback-based functions have been replaced with promise-based functions.
+  - Model module: it has been slightly rewritten for clarity, but no changes in its interface were made.
+- Typings were updated accordingly.
+### Deprecated
+- Using a `string` in the constructor of NeDB is now deprecated.
+- Using `Datastore#persistence#compactDatafile` is now deprecated, please use `Datastore#compactDatafile` instead.
+- Using `Datastore#persistence#setAutocompactionInterval` is now deprecated, please use `Datastore#setAutocompactionInterval` instead.
+- Using `Datastore#persistence#stopAutocompaction` is now deprecated, please use `Datastore#stopAutocompaction` instead.
+### Removed
+- The option for passing `options.nodeWebkitAppName` to the Datastore and the Persistence constructors has been removed; subsequently, the static method `Persistence.getNWAppFilename` has been removed as well;
+- Compatibility with Node < 10.1.0 (we use `fs.promises`).
+
 ## [2.2.1] - 2022-01-18
 ### Changed
 - [#20](https://github.com/seald/nedb/pull/20) storage.js: check fsync capability from return code rather than using process.platform heuristics (Thanks [@bitmeal](https://github.com/bitmeal)).
@@ -13,6 +65,8 @@ to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 ## [2.2.0] - 2021-10-29
 ### Added
 - Include a `"react-native"` version (heavily inspired from [react-native-local-mongdb](https://github.com/antoniopresto/react-native-local-mongodb)).
+### Changed
+- The browser version uses `browser-version/lib/storage.browser.js` instead of `browser-version/lib/storage.js` in the `"browser"` field of the package.json.
 ## [2.1.0] - 2021-10-21
 Thanks to [@eliot-akira](https://github.com/eliot-akira) for the amazing work on file streaming.
```
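
The headline change in this entry is that every public method gains a promise-returning `*Async` counterpart next to the historical callback form. A short sketch of what 3.0.0 usage looks like, using only API names listed in the changelog above:

```js
const Datastore = require('@seald-io/nedb')
const db = new Datastore({ filename: 'datafile.db' })

const main = async () => {
  await db.loadDatabaseAsync()
  const doc = await db.insertAsync({ hello: 'world' })
  const found = await db.findAsync({ hello: 'world' })
  console.log(doc._id, found.length)

  // The callback style still works alongside the promise style
  db.findOne({ hello: 'world' }, (err, d) => { if (!err) console.log(d) })
}

main().catch(console.error)
```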

README.md (847)

File diff suppressed because it is too large.

benchmarks/commonUtilities.js (3)

```diff
@@ -5,6 +5,7 @@ const fs = require('fs')
 const path = require('path')
 const Datastore = require('../lib/datastore')
 const Persistence = require('../lib/persistence')
+const { callbackify } = require('util')
 
 let executeAsap
 try {
@@ -45,7 +46,7 @@ module.exports.getConfiguration = function (benchDb) {
  * Ensure the workspace stat and the db datafile is empty
  */
 module.exports.prepareDb = function (filename, cb) {
-  Persistence.ensureDirectoryExists(path.dirname(filename), function () {
+  callbackify((dirname) => Persistence.ensureDirectoryExistsAsync(dirname))(path.dirname(filename), function () {
     fs.access(filename, fs.constants.FS_OK, function (err) {
       if (!err) {
         fs.unlink(filename, cb)
```
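
The `callbackify` call above wraps the new promise-returning `ensureDirectoryExistsAsync` so the callback-style benchmark code keeps working unchanged. A minimal standalone sketch of that bridge (using Node's real `util.callbackify`, with a stand-in directory name):

```js
const { callbackify } = require('util')
const fs = require('fs')

// A promise-based function, analogous to Persistence.ensureDirectoryExistsAsync
const ensureDirectoryExistsAsync = dirname => fs.promises.mkdir(dirname, { recursive: true })

// callbackify produces a classic (args..., cb) wrapper whose callback
// follows the usual Node (err, result) convention
const ensureDirectoryExists = callbackify(ensureDirectoryExistsAsync)

ensureDirectoryExists('workspace', err => {
  if (err) throw err
  console.log('directory is ready')
})
```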

benchmarks/ensureIndex.js (9)

```diff
@@ -1,5 +1,5 @@
-const async = require('async')
 const program = require('commander')
+const { apply, waterfall } = require('../test/utils.test.js')
 const Datastore = require('../lib/datastore')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -19,8 +19,8 @@ console.log('----------------------------')
 console.log('Test with ' + n + ' documents')
 console.log('----------------------------')
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -28,7 +28,7 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   function (cb) {
     let i
@@ -41,6 +41,7 @@ async.waterfall([
     console.log('Average time for one ensureIndex: ' + (profiler.elapsedSinceLastStep() / n) + 'ms')
     profiler.step('Finished calling ensureIndex ' + n + ' times')
+    cb()
   }
 ], function (err) {
   profiler.step('Benchmark finished')
```
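
Throughout the benchmarks, `async.waterfall`/`async.apply` are replaced with small helpers imported from `test/utils.test.js`, whose diff is not shown on this page. A plausible minimal sketch of such helpers (hypothetical, not the PR's actual code):

```js
// apply: partially apply leading arguments, leaving the callback to be appended later
const apply = (fn, ...args) => cb => fn(...args, cb)

// waterfall: run callback-style tasks in series, stopping at the first error.
// Unlike async.waterfall it does not thread intermediate results through,
// which these benchmarks do not rely on.
const waterfall = (tasks, done) => {
  const next = i => err => {
    if (err || i === tasks.length) return done(err)
    tasks[i](next(i + 1))
  }
  next(0)(null)
}

// Usage mirroring the benchmarks:
waterfall([
  apply((ms, cb) => setTimeout(cb, ms), 100),
  cb => { console.log('step 2'); cb() }
], err => { if (err) throw err })
```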

benchmarks/find.js (10)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 const n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -18,8 +18,8 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
-  async.apply(commonUtilities.findDocs, d, n, profiler)
+  apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.findDocs, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/findOne.js (10)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 const n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -18,9 +18,9 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   function (cb) { setTimeout(function () { cb() }, 500) },
-  async.apply(commonUtilities.findOneDocs, d, n, profiler)
+  apply(commonUtilities.findOneDocs, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/findWithIn.js (10)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 const n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -18,8 +18,8 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
-  async.apply(commonUtilities.findDocsWithIn, d, n, profiler)
+  apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.findDocsWithIn, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/insert.js (8)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 let n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -24,7 +24,7 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler)
+  apply(commonUtilities.insertDocs, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/loadDatabase.js (10)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const program = require('commander')
 const Datastore = require('../lib/datastore')
 const commonUtilities = require('./commonUtilities')
@@ -20,14 +20,14 @@ console.log('Test with ' + n + ' documents')
 console.log(program.withIndex ? 'Use an index' : "Don't use an index")
 console.log('----------------------------')
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(cb)
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
-  async.apply(commonUtilities.loadDatabase, d, n, profiler)
+  apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.loadDatabase, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/remove.js (14)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 const n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -18,16 +18,16 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   // Test with remove only one document
   function (cb) { profiler.step('MULTI: FALSE'); return cb() },
-  async.apply(commonUtilities.removeDocs, { multi: false }, d, n, profiler),
+  apply(commonUtilities.removeDocs, { multi: false }, d, n, profiler),
   // Test with multiple documents
   function (cb) { d.remove({}, { multi: true }, function () { return cb() }) },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   function (cb) { profiler.step('MULTI: TRUE'); return cb() },
-  async.apply(commonUtilities.removeDocs, { multi: true }, d, n, profiler)
+  apply(commonUtilities.removeDocs, { multi: true }, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

benchmarks/update.js (14)

```diff
@@ -1,4 +1,4 @@
-const async = require('async')
+const { apply, waterfall } = require('../test/utils.test.js')
 const commonUtilities = require('./commonUtilities')
 const Profiler = require('./profiler')
@@ -8,8 +8,8 @@ const config = commonUtilities.getConfiguration(benchDb)
 const d = config.d
 const n = config.n
 
-async.waterfall([
-  async.apply(commonUtilities.prepareDb, benchDb),
+waterfall([
+  apply(commonUtilities.prepareDb, benchDb),
   function (cb) {
     d.loadDatabase(function (err) {
       if (err) { return cb(err) }
@@ -18,18 +18,18 @@ async.waterfall([
     })
   },
   function (cb) { profiler.beginProfiling(); return cb() },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   // Test with update only one document
   function (cb) { profiler.step('MULTI: FALSE'); return cb() },
-  async.apply(commonUtilities.updateDocs, { multi: false }, d, n, profiler),
+  apply(commonUtilities.updateDocs, { multi: false }, d, n, profiler),
   // Test with multiple documents
   // eslint-disable-next-line node/handle-callback-err
   function (cb) { d.remove({}, { multi: true }, function (err) { return cb() }) },
-  async.apply(commonUtilities.insertDocs, d, n, profiler),
+  apply(commonUtilities.insertDocs, d, n, profiler),
   function (cb) { profiler.step('MULTI: TRUE'); return cb() },
-  async.apply(commonUtilities.updateDocs, { multi: true }, d, n, profiler)
+  apply(commonUtilities.updateDocs, { multi: true }, d, n, profiler)
 ], function (err) {
   profiler.step('Benchmark finished')
```

browser-version/lib/customUtils.js (12)

```diff
@@ -1,11 +1,16 @@
 /**
- * Specific customUtils for the browser, where we don't have access to the Crypto and Buffer modules
+ * Utility functions that need to be reimplemented for each environment.
+ * This is the version for the browser & React-Native
+ * @module customUtilsBrowser
+ * @private
  */
 
 /**
  * Taken from the crypto-browserify module
  * https://github.com/dominictarr/crypto-browserify
  * NOTE: Math.random() does not guarantee "cryptographic quality" but we actually don't need it
+ * @param {number} size in bytes
+ * @return {Array<number>}
  */
 const randomBytes = size => {
   const bytes = new Array(size)
@@ -21,6 +26,8 @@ const randomBytes = size => {
 /**
  * Taken from the base64-js module
  * https://github.com/beatgammit/base64-js/
+ * @param {Array} uint8
+ * @return {string}
  */
 const byteArrayToBase64 = uint8 => {
   const lookup = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
@@ -60,6 +67,9 @@ const byteArrayToBase64 = uint8 => {
  * that's not an issue here
  * The probability of a collision is extremely small (need 3*10^12 documents to have one chance in a million of a collision)
  * See http://en.wikipedia.org/wiki/Birthday_problem
+ * @param {number} len
+ * @return {string}
+ * @alias module:customUtilsNode.uid
  */
 const uid = len => byteArrayToBase64(randomBytes(Math.ceil(Math.max(8, len * 2)))).replace(/[+/]/g, '').slice(0, len)
```
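
`uid` is what generates NeDB's `_id` values. A quick usage sketch, assuming the module exports `uid` the way the Node version does (the export lines fall outside this hunk):

```js
// Hypothetical usage; assumes customUtils exports uid like the Node version
const { uid } = require('./browser-version/lib/customUtils')

// A 16-character alphanumeric id: roughly twice as many random bytes are
// generated, base64-encoded, then '+' and '/' are stripped before truncating
const id = uid(16)
console.log(id.length === 16) // true (with overwhelming probability)
```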

browser-version/lib/storage.browser.js (196)

```diff
@@ -1,11 +1,13 @@
 /**
  * Way data is stored for this database
- * For a Node.js/Node Webkit database it's the file system
- * For a browser-side database it's localforage, which uses the best backend available (IndexedDB then WebSQL then localStorage)
- * For a react-native database, we use @react-native-async-storage/async-storage
  *
- * This version is the browser version
+ * This version is the browser version and uses [localforage]{@link https://github.com/localForage/localForage} which chooses the best option depending on user browser (IndexedDB then WebSQL then localStorage).
+ * @module storageBrowser
+ * @see module:storage
+ * @see module:storageReactNative
+ * @private
  */
 const localforage = require('localforage')
 // Configure localforage to display NeDB name for now. Would be a good idea to let user use his own app name
@@ -14,73 +16,163 @@ const store = localforage.createInstance({
   storeName: 'nedbdata'
 })
 
-const exists = (filename, cback) => {
-  // eslint-disable-next-line node/handle-callback-err
-  store.getItem(filename, (err, value) => {
-    if (value !== null) return cback(true) // Even if value is undefined, localforage returns null
-    else return cback(false)
-  })
-}
+/**
+ * Returns Promise<true> if file exists.
+ * @param {string} file
+ * @return {Promise<boolean>}
+ * @async
+ * @alias module:storageBrowser.existsAsync
+ */
+const existsAsync = async file => {
+  try {
+    const value = await store.getItem(file)
+    if (value !== null) return true // Even if value is undefined, localforage returns null
+    return false
+  } catch (error) {
+    return false
+  }
+}
 
-const rename = (filename, newFilename, callback) => {
-  // eslint-disable-next-line node/handle-callback-err
-  store.getItem(filename, (err, value) => {
-    if (value === null) store.removeItem(newFilename, () => callback())
-    else {
-      store.setItem(newFilename, value, () => {
-        store.removeItem(filename, () => callback())
-      })
-    }
-  })
-}
+/**
+ * Moves the item from one path to another.
+ * @param {string} oldPath
+ * @param {string} newPath
+ * @return {Promise<void>}
+ * @alias module:storageBrowser.renameAsync
+ * @async
+ */
+const renameAsync = async (oldPath, newPath) => {
+  try {
+    const value = await store.getItem(oldPath)
+    if (value === null) await store.removeItem(newPath)
+    else {
+      await store.setItem(newPath, value)
+      await store.removeItem(oldPath)
+    }
+  } catch (err) {
+    console.warn('An error happened while renaming, skip')
+  }
+}
 
-const writeFile = (filename, contents, options, callback) => {
-  // Options do not matter in browser setup
-  if (typeof options === 'function') { callback = options }
-  store.setItem(filename, contents, () => callback())
-}
+/**
+ * Saves the item at given path.
+ * @param {string} file
+ * @param {string} data
+ * @param {object} [options]
+ * @return {Promise<void>}
+ * @alias module:storageBrowser.writeFileAsync
+ * @async
+ */
+const writeFileAsync = async (file, data, options) => {
+  // Options do not matter in browser setup
+  try {
+    await store.setItem(file, data)
+  } catch (error) {
+    console.warn('An error happened while writing, skip')
+  }
+}
 
-const appendFile = (filename, toAppend, options, callback) => {
-  // Options do not matter in browser setup
-  if (typeof options === 'function') { callback = options }
-
-  // eslint-disable-next-line node/handle-callback-err
-  store.getItem(filename, (err, contents) => {
-    contents = contents || ''
-    contents += toAppend
-    store.setItem(filename, contents, () => callback())
-  })
-}
+/**
+ * Append to the item at given path.
+ * @function
+ * @param {string} filename
+ * @param {string} toAppend
+ * @param {object} [options]
+ * @return {Promise<void>}
+ * @alias module:storageBrowser.appendFileAsync
+ * @async
+ */
+const appendFileAsync = async (filename, toAppend, options) => {
+  // Options do not matter in browser setup
+  try {
+    const contents = (await store.getItem(filename)) || ''
+    await store.setItem(filename, contents + toAppend)
+  } catch (error) {
+    console.warn('An error happened appending to file writing, skip')
+  }
+}
 
-const readFile = (filename, options, callback) => {
-  // Options do not matter in browser setup
-  if (typeof options === 'function') { callback = options }
-  // eslint-disable-next-line node/handle-callback-err
-  store.getItem(filename, (err, contents) => callback(null, contents || ''))
-}
+/**
+ * Read data at given path.
+ * @function
+ * @param {string} filename
+ * @param {object} [options]
+ * @return {Promise<Buffer>}
+ * @alias module:storageBrowser.readFileAsync
+ * @async
+ */
+const readFileAsync = async (filename, options) => {
+  try {
+    return (await store.getItem(filename)) || ''
+  } catch (error) {
+    console.warn('An error happened while reading, skip')
+    return ''
+  }
+}
 
-const unlink = (filename, callback) => {
-  store.removeItem(filename, () => callback())
-}
+/**
+ * Async version of {@link module:storageBrowser.unlink}.
+ * @function
+ * @param {string} filename
+ * @return {Promise<void>}
+ * @async
+ * @alias module:storageBrowser.unlink
+ */
+const unlinkAsync = async filename => {
+  try {
+    await store.removeItem(filename)
+  } catch (error) {
+    console.warn('An error happened while unlinking, skip')
+  }
+}
 
-// Nothing to do, no directories will be used on the browser
-const mkdir = (dir, options, callback) => callback()
+/**
+ * Shim for {@link module:storage.mkdirAsync}, nothing to do, no directories will be used on the browser.
+ * @function
+ * @param {string} path
+ * @param {object} [options]
+ * @return {Promise<void|string>}
+ * @alias module:storageBrowser.mkdirAsync
+ * @async
+ */
+const mkdirAsync = (path, options) => Promise.resolve()
 
-// Nothing to do, no data corruption possible in the browser
-const ensureDatafileIntegrity = (filename, callback) => callback(null)
+/**
+ * Shim for {@link module:storage.ensureDatafileIntegrityAsync}, nothing to do, no data corruption possible in the browser.
+ * @param {string} filename
+ * @return {Promise<void>}
+ * @alias module:storageBrowser.ensureDatafileIntegrityAsync
+ */
+const ensureDatafileIntegrityAsync = (filename) => Promise.resolve()
 
-const crashSafeWriteFileLines = (filename, lines, callback) => {
-  lines.push('') // Add final new line
-  writeFile(filename, lines.join('\n'), callback)
-}
+/**
+ * Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost)
+ * @param {string} filename
+ * @param {string[]} lines
+ * @return {Promise<void>}
+ * @alias module:storageBrowser.crashSafeWriteFileLinesAsync
+ */
+const crashSafeWriteFileLinesAsync = async (filename, lines) => {
+  lines.push('') // Add final new line
+  await writeFileAsync(filename, lines.join('\n'))
+}
 
 // Interface
-module.exports.exists = exists
-module.exports.rename = rename
-module.exports.writeFile = writeFile
-module.exports.crashSafeWriteFileLines = crashSafeWriteFileLines
-module.exports.appendFile = appendFile
-module.exports.readFile = readFile
-module.exports.unlink = unlink
-module.exports.mkdir = mkdir
-module.exports.ensureDatafileIntegrity = ensureDatafileIntegrity
+module.exports.existsAsync = existsAsync
+module.exports.renameAsync = renameAsync
+module.exports.writeFileAsync = writeFileAsync
+module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
+module.exports.appendFileAsync = appendFileAsync
+module.exports.readFileAsync = readFileAsync
+module.exports.unlinkAsync = unlinkAsync
+module.exports.mkdirAsync = mkdirAsync
+module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
```

browser-version/lib/storage.react-native.js (296)

```diff
@@ -1,86 +1,282 @@
 /**
  * Way data is stored for this database
- * For a Node.js/Node Webkit database it's the file system
- * For a browser-side database it's localforage, which uses the best backend available (IndexedDB then WebSQL then localStorage)
- * For a react-native database, we use @react-native-async-storage/async-storage
  *
- * This version is the react-native version
+ * This version is the React-Native version and uses [@react-native-async-storage/async-storage]{@link https://github.com/react-native-async-storage/async-storage}.
+ * @module storageReactNative
+ * @see module:storageBrowser
+ * @see module:storage
+ * @private
  */
 const AsyncStorage = require('@react-native-async-storage/async-storage').default
+const { callbackify } = require('util')
 
-const exists = (filename, cback) => {
-  // eslint-disable-next-line node/handle-callback-err
-  AsyncStorage.getItem(filename, (err, value) => {
-    if (value !== null) {
-      return cback(true)
-    } else {
-      return cback(false)
-    }
-  })
-}
+/**
+ * Async version of {@link module:storageReactNative.exists}.
+ * @param {string} file
+ * @return {Promise<boolean>}
+ * @async
+ * @alias module:storageReactNative.existsAsync
+ * @see module:storageReactNative.exists
+ */
+const existsAsync = async file => {
+  try {
+    const value = await AsyncStorage.getItem(file)
+    if (value !== null) return true // Even if value is undefined, AsyncStorage returns null
+    return false
+  } catch (error) {
+    return false
+  }
+}
+
+/**
+ * @callback module:storageReactNative~existsCallback
+ * @param {boolean} exists
+ */
+
+/**
+ * Callback returns true if file exists
+ * @function
+ * @param {string} file
+ * @param {module:storageReactNative~existsCallback} cb
+ * @alias module:storageReactNative.exists
+ */
+const exists = callbackify(existsAsync)
 
-const rename = (filename, newFilename, callback) => {
-  // eslint-disable-next-line node/handle-callback-err
-  AsyncStorage.getItem(filename, (err, value) => {
-    if (value === null) {
-      this.storage.removeItem(newFilename, callback)
-    } else {
-      this.storage.setItem(newFilename, value, () => {
-        this.storage.removeItem(filename, callback)
-      })
-    }
-  })
-}
+/**
+ * Async version of {@link module:storageReactNative.rename}.
+ * @param {string} oldPath
+ * @param {string} newPath
+ * @return {Promise<void>}
+ * @alias module:storageReactNative.renameAsync
+ * @async
+ * @see module:storageReactNative.rename
+ */
+const renameAsync = async (oldPath, newPath) => {
+  try {
+    const value = await AsyncStorage.getItem(oldPath)
+    if (value === null) await AsyncStorage.removeItem(newPath)
+    else {
+      await AsyncStorage.setItem(newPath, value)
+      await AsyncStorage.removeItem(oldPath)
+    }
+  } catch (err) {
+    console.warn('An error happened while renaming, skip')
+  }
+}
+
+/**
+ * Moves the item from one path to another
+ * @function
+ * @param {string} oldPath
+ * @param {string} newPath
+ * @param {NoParamCallback} c
+ * @return {void}
+ * @alias module:storageReactNative.rename
+ */
+const rename = callbackify(renameAsync)
 
-const writeFile = (filename, contents, options, callback) => {
-  // Options do not matter in a react-native setup
-  if (typeof options === 'function') { callback = options }
-  AsyncStorage.setItem(filename, contents, callback)
-}
+/**
+ * Async version of {@link module:storageReactNative.writeFile}.
+ * @param {string} file
+ * @param {string} data
+ * @param {object} [options]
+ * @return {Promise<void>}
+ * @alias module:storageReactNative.writeFileAsync
+ * @async
+ * @see module:storageReactNative.writeFile
+ */
+const writeFileAsync = async (file, data, options) => {
+  // Options do not matter in react-native setup
+  try {
+    await AsyncStorage.setItem(file, data)
+  } catch (error) {
+    console.warn('An error happened while writing, skip')
+  }
+}
+
+/**
+ * Saves the item at given path
+ * @function
+ * @param {string} path
+ * @param {string} data
+ * @param {object} options
+ * @param {function} callback
+ * @alias module:storageReactNative.writeFile
+ */
+const writeFile = callbackify(writeFileAsync)
 
-const appendFile = (filename, toAppend, options, callback) => {
-  // Options do not matter in a react-native setup
-  if (typeof options === 'function') { callback = options }
-
-  // eslint-disable-next-line node/handle-callback-err
-  AsyncStorage.getItem(filename, (err, contents) => {
-    contents = contents || ''
-    contents += toAppend
-    AsyncStorage.setItem(filename, contents, callback)
-  })
-}
+/**
+ * Async version of {@link module:storageReactNative.appendFile}.
+ * @function
+ * @param {string} filename
+ * @param {string} toAppend
+ * @param {object} [options]
+ * @return {Promise<void>}
+ * @alias module:storageReactNative.appendFileAsync
+ * @async
+ * @see module:storageReactNative.appendFile
+ */
+const appendFileAsync = async (filename, toAppend, options) => {
+  // Options do not matter in react-native setup
+  try {
+    const contents = (await AsyncStorage.getItem(filename)) || ''
+    await AsyncStorage.setItem(filename, contents + toAppend)
+  } catch (error) {
+    console.warn('An error happened appending to file writing, skip')
+  }
+}
+
+/**
+ * Append to the item at given path
+ * @function
+ * @param {string} filename
+ * @param {string} toAppend
+ * @param {object} [options]
+ * @param {function} callback
+ * @alias module:storageReactNative.appendFile
+ */
+const appendFile = callbackify(appendFileAsync)
 
-const readFile = (filename, options, callback) => {
-  // Options do not matter in a react-native setup
-  if (typeof options === 'function') { callback = options }
-  // eslint-disable-next-line node/handle-callback-err
-  AsyncStorage.getItem(filename, (err, contents) => {
-    return callback(null, contents || '')
-  })
-}
+/**
+ * Async version of {@link module:storageReactNative.readFile}.
+ * @function
+ * @param {string} filename
+ * @param {object} [options]
+ * @return {Promise<string>}
+ * @alias module:storageReactNative.readFileAsync
+ * @async
+ * @see module:storageReactNative.readFile
+ */
+const readFileAsync = async (filename, options) => {
+  try {
+    return (await AsyncStorage.getItem(filename)) || ''
+  } catch (error) {
+    console.warn('An error happened while reading, skip')
+    return ''
+  }
+}
+
+/**
+ * Read data at given path
+ * @function
+ * @param {string} filename
+ * @param {object} options
+ * @param {function} callback
+ * @alias module:storageReactNative.readFile
+ */
+const readFile = callbackify(readFileAsync)
 
-const unlink = (filename, callback) => {
-  AsyncStorage.removeItem(filename, callback)
-}
+/**
+ * Async version of {@link module:storageReactNative.unlink}.
+ * @function
+ * @param {string} filename
+ * @return {Promise<void>}
+ * @async
+ * @alias module:storageReactNative.unlinkAsync
+ * @see module:storageReactNative.unlink
+ */
+const unlinkAsync = async filename => {
+  try {
+    await AsyncStorage.removeItem(filename)
+  } catch (error) {
+    console.warn('An error happened while unlinking, skip')
+  }
+}
+
+/**
+ * Remove the data at given path
+ * @function
+ * @param {string} path
+ * @param {function} callback
+ * @alias module:storageReactNative.unlink
+ */
+const unlink = callbackify(unlinkAsync)
 
-// Nothing to do, no directories will be used on react-native
-const mkdir = (dir, options, callback) => callback()
+/**
+ * Shim for {@link module:storage.mkdirAsync}, nothing to do, no directories will be used on the react-native.
+ * @function
+ * @param {string} dir
+ * @param {object} [options]
+ * @return {Promise<void|string>}
+ * @alias module:storageReactNative.mkdirAsync
+ * @async
+ */
+const mkdirAsync = (dir, options) => Promise.resolve()
+
+/**
+ * Shim for {@link module:storage.mkdir}, nothing to do, no directories will be used on the react-native.
+ * @function
+ * @param {string} path
+ * @param {object} options
+ * @param {function} callback
+ * @alias module:storageReactNative.mkdir
+ */
+const mkdir = callbackify(mkdirAsync)
 
-// Nothing to do, no data corruption possible on react-native
-const ensureDatafileIntegrity = (filename, callback) => callback(null)
+/**
+ * Shim for {@link module:storage.ensureDatafileIntegrityAsync}, nothing to do, no data corruption possible in the react-native.
+ * @param {string} filename
+ * @return {Promise<void>}
+ * @alias module:storageReactNative.ensureDatafileIntegrityAsync
+ */
+const ensureDatafileIntegrityAsync = (filename) => Promise.resolve()
+
+/**
+ * Shim for {@link module:storage.ensureDatafileIntegrity}, nothing to do, no data corruption possible in the react-native.
+ * @function
+ * @param {string} filename
+ * @param {NoParamCallback} callback signature: err
+ * @alias module:storageReactNative.ensureDatafileIntegrity
+ */
+const ensureDatafileIntegrity = callbackify(ensureDatafileIntegrityAsync)
 
-const crashSafeWriteFileLines = (filename, lines, callback) => {
-  lines.push('') // Add final new line
-  writeFile(filename, lines.join('\n'), callback)
-}
+/**
+ * Async version of {@link module:storageReactNative.crashSafeWriteFileLines}.
+ * @param {string} filename
+ * @param {string[]} lines
+ * @return {Promise<void>}
+ * @alias module:storageReactNative.crashSafeWriteFileLinesAsync
+ * @see module:storageReactNative.crashSafeWriteFileLines
+ */
+const crashSafeWriteFileLinesAsync = async (filename, lines) => {
+  lines.push('') // Add final new line
+  await writeFileAsync(filename, lines.join('\n'))
+}
+
+/**
+ * Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost)
+ * @function
+ * @param {string} filename
+ * @param {string[]} lines
+ * @param {NoParamCallback} [callback] Optional callback, signature: err
+ * @alias module:storageReactNative.crashSafeWriteFileLines
+ */
+const crashSafeWriteFileLines = callbackify(crashSafeWriteFileLinesAsync)
 
 // Interface
 module.exports.exists = exists
+module.exports.existsAsync = existsAsync
 module.exports.rename = rename
+module.exports.renameAsync = renameAsync
 module.exports.writeFile = writeFile
+module.exports.writeFileAsync = writeFileAsync
 module.exports.crashSafeWriteFileLines = crashSafeWriteFileLines
+module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
 module.exports.appendFile = appendFile
+module.exports.appendFileAsync = appendFileAsync
 module.exports.readFile = readFile
+module.exports.readFileAsync = readFileAsync
 module.exports.unlink = unlink
+module.exports.unlinkAsync = unlinkAsync
 module.exports.mkdir = mkdir
+module.exports.mkdirAsync = mkdirAsync
 module.exports.ensureDatafileIntegrity = ensureDatafileIntegrity
+module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
```

index.d.ts (52, vendored)

```diff
@@ -1,6 +1,7 @@
 // Type definitions for @seald-io/nedb 2.1.0
 // Project: https://github.com/seald/nedb forked from https://github.com/louischatriot/nedb
-// Definitions by: Mehdi Kouhen <https://github.com/arantes555>
+// Definitions by: Timothée Rebours <https://gihub.com/tex0l>
+//                 Mehdi Kouhen <https://github.com/arantes555>
 //                 Stefan Steinhart <https://github.com/reppners>
 //                 Anthony Nichols <https://github.com/anthonynichols>
 //                 Alejandro Fernandez Haro <https://github.com/afharo>
@@ -17,43 +18,65 @@ declare class Nedb<G = any> extends EventEmitter {
   persistence: Nedb.Persistence;
 
+  autoloadPromise: Promise<void>|null;
+
   loadDatabase(): void;
+  loadDatabaseAsync(): Promise<void>;
+
+  dropDatabase(callback?: (err: Error |null) => void): void;
+  dropDatabaseAsync(): Promise<void>;
+
+  compactDatafile(callback?: (err: Error |null) => void): void;
+  compactDatafileAsync(): Promise<void>;
+
+  setAutocompactionInterval(interval: number): void;
+  stopAutocompaction(): void;
+
   getAllData<T extends G>(): T[];
 
-  resetIndexes(newData?: any): void;
-
   ensureIndex(options: Nedb.EnsureIndexOptions, callback?: (err: Error | null) => void): void;
+  ensureIndexAsync(options: Nedb.EnsureIndexOptions): Promise<void>;
+
   removeIndex(fieldName: string, callback?: (err: Error | null) => void): void;
+  removeIndexAsync(fieldName: string): Promise<void>;
 
-  addToIndexes<T extends G>(doc: T | T[]): void;
-  removeFromIndexes<T extends G>(doc: T | T[]): void;
-  updateIndexes<T extends G>(oldDoc: T, newDoc: T): void;
-  updateIndexes<T extends G>(updates: Array<{ oldDoc: T; newDoc: T }>): void;
-
-  getCandidates<T extends G>(query: any, dontExpireStaleDocs: boolean, callback?: (err: Error | null, candidates: T[]) => void): void;
-
   insert<T extends G>(newDoc: T, callback?: (err: Error | null, document: T) => void): void;
   insert<T extends G>(newDocs: T[], callback?: (err: Error | null, documents: T[]) => void): void;
+  insertAsync<T extends G>(newDoc: T): Promise<T>;
+  insertAsync<T extends G>(newDocs: T[]): Promise<T[]>;
 
   count(query: any, callback: (err: Error | null, n: number) => void): void;
   count(query: any): Nedb.CursorCount;
+  countAsync(query: any): Nedb.Cursor<number>;
 
   find<T extends G>(query: any, projection: any, callback?: (err: Error | null, documents: T[]) => void): void;
   find<T extends G>(query: any, projection?: any): Nedb.Cursor<T>;
   find<T extends G>(query: any, callback: (err: Error | null, documents: T[]) => void): void;
+  findAsync<T extends G>(query: any, projection?: any): Nedb.Cursor<T[]>;
 
   findOne<T extends G>(query: any, projection: any, callback: (err: Error | null, document: T) => void): void;
   findOne<T extends G>(query: any, callback: (err: Error | null, document: T) => void): void;
+  findOneAsync<T extends G>(query: any, projection?: any): Nedb.Cursor<T>;
 
   update<T extends G>(query: any, updateQuery: any, options?: Nedb.UpdateOptions, callback?: (err: Error | null, numberOfUpdated: number, affectedDocuments: T | T[] | null, upsert: boolean | null) => void): void;
+  updateAsync<T extends G>(query: any, updateQuery: any, options?: Nedb.UpdateOptions): Promise<{numAffected: number, affectedDocuments: T|T[]|null, upsert: boolean}>;
 
   remove(query: any, options: Nedb.RemoveOptions, callback?: (err: Error | null, n: number) => void): void;
   remove(query: any, callback?: (err: Error | null, n: number) => void): void;
+  removeAsync(query: any, options: Nedb.RemoveOptions): Promise<number>;
 
   addListener(event: 'compaction.done', listener: () => void): this;
   on(event: 'compaction.done', listener: () => void): this;
   once(event: 'compaction.done', listener: () => void): this;
@@ -67,12 +90,13 @@ declare class Nedb<G = any> extends EventEmitter {
 }
 
 declare namespace Nedb {
-  interface Cursor<T> {
+  interface Cursor<T> extends Promise<T> {
     sort(query: any): Cursor<T>;
     skip(n: number): Cursor<T>;
     limit(n: number): Cursor<T>;
     projection(query: any): Cursor<T>;
     exec(callback: (err: Error | null, documents: T[]) => void): void;
+    execAsync(): Promise<T>;
   }
 
   interface CursorCount {
@@ -83,7 +107,6 @@ declare namespace Nedb {
     filename?: string;
     timestampData?: boolean;
     inMemoryOnly?: boolean;
-    nodeWebkitAppName?: string;
     autoload?: boolean;
     onload?(error: Error | null): any;
     beforeDeserialization?(line: string): string;
@@ -110,8 +133,13 @@ declare namespace Nedb {
   }
 
   interface Persistence {
+    /** @deprecated */
     compactDatafile(): void;
+    /** @deprecated */
+    compactDatafileAsync(): Promise<void>;
+    /** @deprecated */
     setAutocompactionInterval(interval: number): void;
+    /** @deprecated */
     stopAutocompaction(): void;
   }
 }
```
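
The updated typings surface the new promise-returning methods; in particular, `updateAsync` resolves to a single object where the callback form spreads the results across arguments. A sketch of the difference, using only names from the typings above:

```js
const Datastore = require('@seald-io/nedb')
const db = new Datastore() // no filename: in-memory datastore

const run = async () => {
  await db.insertAsync({ planet: 'Earth' })

  // Callback style: results spread across arguments
  db.update({ planet: 'Earth' }, { $set: { inhabited: true } }, { upsert: true },
    (err, numberOfUpdated, affectedDocuments, upsert) => {
      if (!err) console.log(numberOfUpdated, upsert) // upsert is now always true or false
    })

  // Promise style: results gathered in one object
  const { numAffected, affectedDocuments, upsert } =
    await db.updateAsync({ planet: 'Mars' }, { $set: { inhabited: false } }, { upsert: true })
  console.log(numAffected, upsert, affectedDocuments)
}

run().catch(console.error)
```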

jsdoc.conf.js (5)

```diff
@@ -0,0 +1,5 @@
+'use strict'
+
+module.exports = {
+  plugins: ['plugins/markdown']
+}
```

karma.conf.template.js (2)

```diff
@@ -14,7 +14,7 @@ module.exports = (config) => ({
   // list of files / patterns to load in the browser
   files: [
     'node_modules/localforage/dist/localforage.min.js',
-    'node_modules/async/lib/async.js',
+    'browser-version/out/testutils.min.js',
     'browser-version/out/nedb.min.js',
     'test/browser/nedb-browser.spec.js',
     'test/browser/load.spec.js'
```

lib/byline.js (106)

```diff
@@ -19,51 +19,30 @@
 // LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 // FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
 // IN THE SOFTWARE.
+/**
+ * @module byline
+ * @private
+ */
 const stream = require('stream')
-const util = require('util')
 const timers = require('timers')
 
-// convinience API
-module.exports = function (readStream, options) {
-  return module.exports.createStream(readStream, options)
-}
-
-// basic API
-module.exports.createStream = function (readStream, options) {
-  if (readStream) {
-    return createLineStream(readStream, options)
-  } else {
-    return new LineStream(options)
-  }
-}
-
-// deprecated API
-module.exports.createLineStream = function (readStream) {
-  console.log('WARNING: byline#createLineStream is deprecated and will be removed soon')
-  return createLineStream(readStream)
-}
-
-function createLineStream (readStream, options) {
-  if (!readStream) {
-    throw new Error('expected readStream')
-  }
-  if (!readStream.readable) {
-    throw new Error('readStream must be readable')
-  }
+const createLineStream = (readStream, options) => {
+  if (!readStream) throw new Error('expected readStream')
+  if (!readStream.readable) throw new Error('readStream must be readable')
   const ls = new LineStream(options)
   readStream.pipe(ls)
   return ls
 }
 
-//
-// using the new node v0.10 "streams2" API
-//
-
-module.exports.LineStream = LineStream
-
-function LineStream (options) {
-  stream.Transform.call(this, options)
+/**
+ * Fork from {@link https://github.com/jahewson/node-byline}.
+ * @see https://github.com/jahewson/node-byline
+ * @alias module:byline.LineStream
+ * @private
+ */
+class LineStream extends stream.Transform {
+  constructor (options) {
+    super(options)
     options = options || {}
 
     // use objectMode to stop the output from being buffered
@@ -74,19 +53,12 @@ function LineStream (options) {
     this._lastChunkEndedWithCR = false
 
     // take the source's encoding if we don't have one
-  const self = this
-  this.on('pipe', function (src) {
-    if (!self.encoding) {
-      // but we can't do this for old-style streams
-      if (src instanceof stream.Readable) {
-        self.encoding = src._readableState.encoding
-      }
-    }
-  })
-}
-
-util.inherits(LineStream, stream.Transform)
+    this.once('pipe', src => {
+      if (!this.encoding && src instanceof stream.Readable) this.encoding = src._readableState.encoding // but we can't do this for old-style streams
+    })
+  }
 
-LineStream.prototype._transform = function (chunk, encoding, done) {
+  _transform (chunk, encoding, done) {
     // decode binary chunks as UTF-8
     encoding = encoding || 'utf8'
@@ -94,9 +66,7 @@ LineStream.prototype._transform = function (chunk, encoding, done) {
     if (encoding === 'buffer') {
       chunk = chunk.toString() // utf8
       encoding = 'utf8'
-    } else {
-      chunk = chunk.toString(encoding)
-    }
+    } else chunk = chunk.toString(encoding)
   }
 
   this._chunkEncoding = encoding
@@ -104,9 +74,7 @@ LineStream.prototype._transform = function (chunk, encoding, done) {
     const lines = chunk.split(/\r\n|[\n\v\f\r\x85\u2028\u2029]/g)
 
     // don't split CRLF which spans chunks
-    if (this._lastChunkEndedWithCR && chunk[0] === '\n') {
-      lines.shift()
-    }
+    if (this._lastChunkEndedWithCR && chunk[0] === '\n') lines.shift()
 
     if (this._lineBuffer.length > 0) {
       this._lineBuffer[this._lineBuffer.length - 1] += lines[0]
@@ -116,9 +84,9 @@ LineStream.prototype._transform = function (chunk, encoding, done) {
     this._lastChunkEndedWithCR = chunk[chunk.length - 1] === '\r'
     this._lineBuffer = this._lineBuffer.concat(lines)
     this._pushBuffer(encoding, 1, done)
   }
 
-LineStream.prototype._pushBuffer = function (encoding, keep, done) {
+  _pushBuffer (encoding, keep, done) {
     // always buffer the last (possibly partial) line
     while (this._lineBuffer.length > keep) {
       const line = this._lineBuffer.shift()
@@ -126,28 +94,24 @@ LineStream.prototype._pushBuffer = function (encoding, keep, done) {
       if (this._keepEmptyLines || line.length > 0) {
         if (!this.push(this._reencode(line, encoding))) {
           // when the high-water mark is reached, defer pushes until the next tick
-        timers.setImmediate(() => {
-          this._pushBuffer(encoding, keep, done)
-        })
+          timers.setImmediate(() => { this._pushBuffer(encoding, keep, done) })
           return
         }
       }
     }
     done()
   }
 
-LineStream.prototype._flush = function (done) {
+  _flush (done) {
     this._pushBuffer(this._chunkEncoding, 0, done)
   }
 
   // see Readable::push
-LineStream.prototype._reencode = function (line, chunkEncoding) {
-  if (this.encoding && this.encoding !== chunkEncoding) {
-    return Buffer.from(line, chunkEncoding).toString(this.encoding)
-  } else if (this.encoding) {
-    // this should be the most common case, i.e. we're using an encoded source stream
-    return line
-  } else {
-    return Buffer.from(line, chunkEncoding)
-  }
-}
+  _reencode (line, chunkEncoding) {
+    if (this.encoding && this.encoding !== chunkEncoding) return Buffer.from(line, chunkEncoding).toString(this.encoding)
+    else if (this.encoding) return line // this should be the most common case, i.e. we're using an encoded source stream
+    else return Buffer.from(line, chunkEncoding)
+  }
 }
+
+module.exports = createLineStream
```
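
After this rewrite the module exports `createLineStream` directly, dropping the old `createStream`/`LineStream` surface. A usage sketch, reading a datafile line by line in Node (file name arbitrary):

```js
const fs = require('fs')
const createLineStream = require('./lib/byline')

// Wrap any readable stream; the Transform emits one line per 'data' event
const lineStream = createLineStream(fs.createReadStream('datafile.db', { encoding: 'utf8' }))

let count = 0
lineStream.on('data', line => { count += 1 })
lineStream.on('end', () => console.log(count + ' lines'))
```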

@ -1,23 +1,73 @@
const model = require('./model.js')
const { callbackify } = require('util')
/** /**
* Manage access to data, be it to find, update or remove it * Has a callback
* @callback Cursor~mapFn
* @param {document[]} res
* @return {*|Promise<*>}
*/ */
const model = require('./model.js')
/**
* Manage access to data, be it to find, update or remove it.
*
* It extends `Promise` so that its methods (which return `this`) are chainable & awaitable.
* @extends Promise
*/
class Cursor { class Cursor {
/** /**
* Create a new cursor for this collection * Create a new cursor for this collection.
* @param {Datastore} db - The datastore this cursor is bound to * @param {Datastore} db - The datastore this cursor is bound to
* @param {Query} query - The query this cursor will operate on * @param {query} query - The query this cursor will operate on
* @param {Function} execFn - Handler to be executed after cursor has found the results and before the callback passed to find/findOne/update/remove * @param {Cursor~mapFn} [mapFn] - Handler to be executed after cursor has found the results and before the callback passed to find/findOne/update/remove
*/
constructor (db, query, mapFn) {
/**
* @protected
* @type {Datastore}
*/ */
constructor (db, query, execFn) {
this.db = db this.db = db
/**
* @protected
* @type {query}
*/
this.query = query || {} this.query = query || {}
if (execFn) { this.execFn = execFn } /**
* The handler to be executed after cursor has found the results.
* @type {Cursor~mapFn}
* @protected
*/
if (mapFn) this.mapFn = mapFn
/**
* @see Cursor#limit
* @type {undefined|number}
* @private
*/
this._limit = undefined
/**
* @see Cursor#skip
* @type {undefined|number}
* @private
*/
this._skip = undefined
/**
* @see Cursor#sort
* @type {undefined|Object.<string, number>}
* @private
*/
this._sort = undefined
/**
* @see Cursor#projection
* @type {undefined|Object.<string, number>}
* @private
*/
this._projection = undefined
} }
/** /**
* Set a limit to the number of results * Set a limit to the number of results for the given Cursor.
* @param {Number} limit
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/ */
limit (limit) { limit (limit) {
this._limit = limit this._limit = limit
@ -25,7 +75,9 @@ class Cursor {
} }
/** /**
* Skip a the number of results * Skip a number of results for the given Cursor.
* @param {Number} skip
* @return {Cursor} the same instance of Cursor, (useful for chaining).
*/ */
skip (skip) { skip (skip) {
this._skip = skip this._skip = skip
@ -33,8 +85,9 @@ class Cursor {
} }
/** /**
* Sort results of the query * Sort results of the query for the given Cursor.
* @param {SortQuery} sortQuery - SortQuery is { field: order }, field can use the dot-notation, order is 1 for ascending and -1 for descending * @param {Object.<string, number>} sortQuery - sortQuery is { field: order }, field can use the dot-notation, order is 1 for ascending and -1 for descending
* @return {Cursor} the same instance of Cursor (useful for chaining).
*/ */
sort (sortQuery) { sort (sortQuery) {
this._sort = sortQuery this._sort = sortQuery
@ -42,9 +95,10 @@ class Cursor {
} }
/** /**
* Add the use of a projection * Add the use of a projection to the given Cursor.
* @param {Object} projection - MongoDB-style projection. {} means take all fields. Then it's { key1: 1, key2: 1 } to take only key1 and key2 * @param {Object.<string, number>} projection - MongoDB-style projection. {} means take all fields. Then it's { key1: 1, key2: 1 } to take only key1 and key2
* { key1: 0, key2: 0 } to omit only key1 and key2. Except _id, you can't mix takes and omits * { key1: 0, key2: 0 } to omit only key1 and key2. Except _id, you can't mix takes and omits.
* @return {Cursor} the same instance of Cursor (useful for chaining).
*/ */
projection (projection) { projection (projection) {
this._projection = projection this._projection = projection
@ -52,9 +106,14 @@ class Cursor {
} }
/** /**
* Apply the projection * Apply the projection.
*
* This is an internal function. You should use {@link Cursor#execAsync} or {@link Cursor#exec}.
* @param {document[]} candidates
* @return {document[]}
* @private
*/ */
project (candidates) { _project (candidates) {
const res = [] const res = []
let action let action
@ -99,27 +158,17 @@ class Cursor {
/** /**
* Get all matching elements * Get all matching elements
* Will return pointers to matched elements (shallow copies), returning full copies is the role of find or findOne * Will return pointers to matched elements (shallow copies), returning full copies is the role of find or findOne
* This is an internal function, use exec which uses the executor * This is an internal function, use execAsync which uses the executor
* * @return {document[]|Promise<*>}
* @param {Function} callback - Signature: err, results * @private
*/ */
_exec (_callback) { async _execAsync () {
let res = [] let res = []
let added = 0 let added = 0
let skipped = 0 let skipped = 0
let error = null
let keys
let key
const callback = (error, res) => {
if (this.execFn) return this.execFn(error, res, _callback)
else return _callback(error, res)
}
this.db.getCandidates(this.query, (err, candidates) => { const candidates = await this.db._getCandidatesAsync(this.query)
if (err) return callback(err)
try {
for (const candidate of candidates) { for (const candidate of candidates) {
if (model.match(candidate, this.query)) { if (model.match(candidate, this.query)) {
// If a sort is defined, wait for the results to be sorted before applying limit and skip // If a sort is defined, wait for the results to be sorted before applying limit and skip
@ -133,20 +182,11 @@ class Cursor {
} else res.push(candidate) } else res.push(candidate)
} }
} }
} catch (err) {
return callback(err)
}
// Apply all sorts // Apply all sorts
if (this._sort) { if (this._sort) {
keys = Object.keys(this._sort)
// Sorting // Sorting
const criteria = [] const criteria = Object.entries(this._sort).map(([key, direction]) => ({ key, direction }))
keys.forEach(item => {
key = item
criteria.push({ key: key, direction: this._sort[key] })
})
res.sort((a, b) => { res.sort((a, b) => {
for (const criterion of criteria) { for (const criterion of criteria) {
const compare = criterion.direction * model.compareThings(model.getDotValue(a, criterion.key), model.getDotValue(b, criterion.key), this.db.compareStrings) const compare = criterion.direction * model.compareThings(model.getDotValue(a, criterion.key), model.getDotValue(b, criterion.key), this.db.compareStrings)
@ -163,19 +203,46 @@ class Cursor {
} }
// Apply projection // Apply projection
-    try {
-      res = this.project(res)
-    } catch (e) {
-      error = e
-      res = undefined
-    }
-
-    return callback(error, res)
-  })
+    res = this._project(res)
+    if (this.mapFn) return this.mapFn(res)
+    return res
   }

+  /**
+   * @callback Cursor~execCallback
+   * @param {Error} err
+   * @param {document[]|*} res If a mapFn was given to the Cursor, then the type of this parameter is the one returned by the mapFn.
+   */
+
+  /**
+   * Callback version of {@link Cursor#exec}.
+   * @param {Cursor~execCallback} _callback
+   * @see Cursor#execAsync
+   */
-  exec () {
-    this.db.executor.push({ this: this, fn: this._exec, arguments: arguments })
+  exec (_callback) {
+    callbackify(() => this.execAsync())(_callback)
   }

+  /**
+   * Get all matching elements.
+   * Will return pointers to matched elements (shallow copies), returning full copies is the role of {@link Datastore#findAsync} or {@link Datastore#findOneAsync}.
+   * @return {Promise<document[]|*>}
+   * @async
+   */
+  execAsync () {
+    return this.db.executor.pushAsync(() => this._execAsync())
+  }
+
+  then (onFulfilled, onRejected) {
+    return this.execAsync().then(onFulfilled, onRejected)
+  }
+
+  catch (onRejected) {
+    return this.execAsync().catch(onRejected)
+  }
+
+  finally (onFinally) {
+    return this.execAsync().finally(onFinally)
+  }
 }
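Because `then`, `catch` and `finally` delegate to `execAsync`, a Cursor can be awaited directly once its modifiers are chained. A sketch of the resulting API, assuming a Datastore `db` whose `findAsync` returns a Cursor (inside an async function):

// Chain modifiers, then await: the thenable interface above triggers execAsync.
const docs = await db.findAsync({ system: 'solar' })
  .sort({ planet: 1 })               // 1 = ascending, -1 = descending
  .skip(1)
  .limit(2)
  .projection({ planet: 1, _id: 0 }) // keep only the planet field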

@ -1,3 +1,9 @@
/**
* Utility functions that need to be reimplemented for each environment.
* This is the version for Node.js
* @module customUtilsNode
* @private
*/
const crypto = require('crypto') const crypto = require('crypto')
/** /**
@ -7,6 +13,9 @@ const crypto = require('crypto')
* that's not an issue here * that's not an issue here
* The probability of a collision is extremely small (need 3*10^12 documents to have one chance in a million of a collision) * The probability of a collision is extremely small (need 3*10^12 documents to have one chance in a million of a collision)
* See http://en.wikipedia.org/wiki/Birthday_problem * See http://en.wikipedia.org/wiki/Birthday_problem
* @param {number} len
* @return {string}
* @alias module:customUtilsNode.uid
*/ */
const uid = len => crypto.randomBytes(Math.ceil(Math.max(8, len * 2))) const uid = len => crypto.randomBytes(Math.ceil(Math.max(8, len * 2)))
.toString('base64') .toString('base64')
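A usage sketch of this helper, assuming the module still exports `uid` and that the chain above goes on to strip non-alphanumerical characters and truncate to `len`, as in previous releases:

const { uid } = require('./lib/customUtils')
// Used internally to generate default _ids.
console.log(uid(16)) // e.g. 'wH2qGpPal3Xa0Rq5': 16 random alphanumerical characters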

File diff suppressed because it is too large

@ -1,56 +1,56 @@
const Waterfall = require('./waterfall')
/** /**
* Responsible for sequentially executing actions on the database * Executes operations sequentially.
* Has an option for a buffer that can be triggered afterwards.
* @private
*/ */
const async = require('async')
 class Executor {
+  /**
+   * Instantiates a new Executor.
+   */
   constructor () {
-    this.buffer = []
+    /**
+     * If this.ready is `false`, then every task pushed will be buffered until this.processBuffer is called.
+     * @type {boolean}
+     * @private
+     */
     this.ready = false
-
-    // This queue will execute all commands, one-by-one in order
-    this.queue = async.queue((task, cb) => {
-      // task.arguments is an array-like object on which adding a new field doesn't work, so we transform it into a real array
-      const newArguments = Array.from(task.arguments)
-
-      const lastArg = newArguments[newArguments.length - 1]
-
-      // Always tell the queue task is complete. Execute callback if any was given.
-      if (typeof lastArg === 'function') {
-        // Callback was supplied
-        newArguments[newArguments.length - 1] = function () {
-          if (typeof setImmediate === 'function') {
-            setImmediate(cb)
-          } else {
-            process.nextTick(cb)
-          }
-          lastArg.apply(null, arguments)
-        }
-      } else if (!lastArg && task.arguments.length !== 0) {
-        // false/undefined/null supplied as callback
-        newArguments[newArguments.length - 1] = () => { cb() }
-      } else {
-        // Nothing supplied as callback
-        newArguments.push(() => { cb() })
-      }
-
-      task.fn.apply(task.this, newArguments)
-    }, 1)
+    /**
+     * The main queue
+     * @type {Waterfall}
+     * @private
+     */
+    this.queue = new Waterfall()
+    /**
+     * The buffer queue
+     * @type {Waterfall}
+     * @private
+     */
+    this.buffer = null
+    /**
+     * Method to trigger the buffer processing.
+     *
+     * Do not use directly, use `this.processBuffer` instead.
+     * @function
+     * @private
+     */
+    this._triggerBuffer = null
+    this.resetBuffer()
   }
/** /**
* If executor is ready, queue task (and process it immediately if executor was idle) * If executor is ready, queue task (and process it immediately if executor was idle)
* If not, buffer task for later processing * If not, buffer task for later processing
* @param {Object} task * @param {AsyncFunction} task Function to execute
* task.this - Object to use as this * @param {boolean} [forceQueuing = false] Optional (defaults to false) force executor to queue task even if it is not ready
* task.fn - Function to execute * @return {Promise<*>}
* task.arguments - Array of arguments, IMPORTANT: only the last argument may be a function (the callback) * @async
* and the last argument cannot be false/undefined/null * @see Executor#push
* @param {Boolean} forceQueuing Optional (defaults to false) force executor to queue task even if it is not ready
*/ */
push (task, forceQueuing) { pushAsync (task, forceQueuing = false) {
if (this.ready || forceQueuing) this.queue.push(task) if (this.ready || forceQueuing) return this.queue.waterfall(task)()
else this.buffer.push(task) else return this.buffer.waterfall(task)()
} }
/** /**
@ -59,8 +59,19 @@ class Executor {
*/ */
processBuffer () { processBuffer () {
this.ready = true this.ready = true
this.buffer.forEach(task => { this.queue.push(task) }) this._triggerBuffer()
this.buffer = [] this.queue.waterfall(() => this.buffer.guardian)
}
/**
* Removes all tasks queued up in the buffer
*/
resetBuffer () {
this.buffer = new Waterfall()
this.buffer.chain(new Promise(resolve => {
this._triggerBuffer = resolve
}))
if (this.ready) this._triggerBuffer()
} }
} }
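The Waterfall-based rewrite keeps the old contract: tasks run strictly one at a time, and anything pushed before `processBuffer` is held back. A sketch of that behaviour, assuming the module exports the Executor class:

const Executor = require('./lib/executor')

const main = async () => {
  const executor = new Executor()
  const p1 = executor.pushAsync(async () => 'first')  // buffered: executor not ready yet
  const p2 = executor.pushAsync(async () => 'second') // queued behind p1
  executor.processBuffer() // marks the executor ready and drains the buffer in order
  console.log(await p1, await p2) // 'first second'
}
main()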

@ -3,14 +3,21 @@ const model = require('./model.js')
const { uniq, isDate } = require('./utils.js') const { uniq, isDate } = require('./utils.js')
/** /**
* Two indexed pointers are equal iif they point to the same place * Two indexed pointers are equal if they point to the same place
* @param {*} a
* @param {*} b
* @return {boolean}
* @private
*/ */
const checkValueEquality = (a, b) => a === b const checkValueEquality = (a, b) => a === b
/** /**
* Type-aware projection * Type-aware projection
* @param {*} elt
* @return {string|*}
* @private
*/ */
function projectForUnique (elt) { const projectForUnique = elt => {
if (elt === null) return '$null' if (elt === null) return '$null'
if (typeof elt === 'string') return '$string' + elt if (typeof elt === 'string') return '$string' + elt
if (typeof elt === 'boolean') return '$boolean' + elt if (typeof elt === 'boolean') return '$boolean' + elt
@ -20,29 +27,55 @@ function projectForUnique (elt) {
return elt // Arrays and objects, will check for pointer equality return elt // Arrays and objects, will check for pointer equality
} }
/**
* Indexes on field names, with atomic operations and which can optionally enforce a unique constraint or allow indexed
* fields to be undefined
* @private
*/
class Index { class Index {
/** /**
* Create a new index * Create a new index
* All methods on an index guarantee that either the whole operation was successful and the index changed * All methods on an index guarantee that either the whole operation was successful and the index changed
* or the operation was unsuccessful and an error is thrown while the index is unchanged * or the operation was unsuccessful and an error is thrown while the index is unchanged
* @param {String} options.fieldName On which field should the index apply (can use dot notation to index on sub fields) * @param {object} options
* @param {Boolean} options.unique Optional, enforce a unique constraint (default: false) * @param {string} options.fieldName On which field should the index apply (can use dot notation to index on sub fields)
* @param {Boolean} options.sparse Optional, allow a sparse index (we can have documents for which fieldName is undefined) (default: false) * @param {boolean} [options.unique = false] Enforces a unique constraint
* @param {boolean} [options.sparse = false] Allows a sparse index (we can have documents for which fieldName is `undefined`)
*/ */
constructor (options) { constructor (options) {
/**
* The field on which the index applies (may use dot notation to index on sub fields).
* @type {string}
*/
this.fieldName = options.fieldName this.fieldName = options.fieldName
/**
* Defines if the index enforces a unique constraint for this index.
* @type {boolean}
*/
this.unique = options.unique || false this.unique = options.unique || false
/**
* Defines if we can have documents for which fieldName is `undefined`
* @type {boolean}
*/
this.sparse = options.sparse || false this.sparse = options.sparse || false
/**
* Options object given to the underlying BinarySearchTree.
* @type {{unique: boolean, checkValueEquality: (function(*, *): boolean), compareKeys: ((function(*, *, compareStrings): (number|number))|*)}}
*/
this.treeOptions = { unique: this.unique, compareKeys: model.compareThings, checkValueEquality: checkValueEquality } this.treeOptions = { unique: this.unique, compareKeys: model.compareThings, checkValueEquality: checkValueEquality }
this.reset() // No data in the beginning /**
* Underlying BinarySearchTree for this index. Uses an AVLTree for optimization.
* @type {AVLTree}
*/
this.tree = new BinarySearchTree(this.treeOptions)
} }
/** /**
* Reset an index * Reset an index
* @param {Document or Array of documents} newData Optional, data to initialize the index with * @param {?document|?document[]} [newData] Data to initialize the index with. If an error is thrown during
* If an error is thrown during insertion, the index is not modified * insertion, the index is not modified.
*/ */
reset (newData) { reset (newData) {
this.tree = new BinarySearchTree(this.treeOptions) this.tree = new BinarySearchTree(this.treeOptions)
@ -54,6 +87,7 @@ class Index {
* Insert a new document in the index * Insert a new document in the index
* If an array is passed, we insert all its elements (if one insertion fails the index is not modified) * If an array is passed, we insert all its elements (if one insertion fails the index is not modified)
* O(log(n)) * O(log(n))
* @param {document|document[]} doc The document, or array of documents, to insert.
*/ */
insert (doc) { insert (doc) {
let keys let keys
@ -98,8 +132,8 @@ class Index {
/** /**
* Insert an array of documents in the index * Insert an array of documents in the index
* If a constraint is violated, the changes should be rolled back and an error thrown * If a constraint is violated, the changes should be rolled back and an error thrown
* * @param {document[]} docs Array of documents to insert.
* @API private * @private
*/ */
insertMultipleDocs (docs) { insertMultipleDocs (docs) {
let error let error
@ -125,10 +159,11 @@ class Index {
} }
/** /**
* Remove a document from the index * Removes a document from the index.
* If an array is passed, we remove all its elements * If an array is passed, we remove all its elements
* The remove operation is safe with regards to the 'unique' constraint * The remove operation is safe with regards to the 'unique' constraint
* O(log(n)) * O(log(n))
* @param {document[]|document} doc The document, or Array of documents, to remove.
*/ */
remove (doc) { remove (doc) {
if (Array.isArray(doc)) { if (Array.isArray(doc)) {
@ -153,6 +188,10 @@ class Index {
* Update a document in the index * Update a document in the index
* If a constraint is violated, changes are rolled back and an error thrown * If a constraint is violated, changes are rolled back and an error thrown
* Naive implementation, still in O(log(n)) * Naive implementation, still in O(log(n))
* @param {document|Array.<{oldDoc: document, newDoc: document}>} oldDoc Document to update, or an `Array` of
* `{oldDoc, newDoc}` pairs.
* @param {document} [newDoc] Document to replace the oldDoc with. If the first argument is an `Array` of
* `{oldDoc, newDoc}` pairs, this second argument is ignored.
*/ */
update (oldDoc, newDoc) { update (oldDoc, newDoc) {
if (Array.isArray(oldDoc)) { if (Array.isArray(oldDoc)) {
@ -174,7 +213,7 @@ class Index {
* Update multiple documents in the index * Update multiple documents in the index
* If a constraint is violated, the changes need to be rolled back * If a constraint is violated, the changes need to be rolled back
* and an error thrown * and an error thrown
* @param {Array<{ oldDoc: T, newDoc: T }>} pairs * @param {Array.<{oldDoc: document, newDoc: document}>} pairs
* *
* @private * @private
*/ */
@ -212,6 +251,8 @@ class Index {
/** /**
* Revert an update * Revert an update
* @param {document|Array.<{oldDoc: document, newDoc: document}>} oldDoc Document to revert to, or an `Array` of `{oldDoc, newDoc}` pairs.
* @param {document} [newDoc] Document to revert from. If the first argument is an Array of {oldDoc, newDoc}, this second argument is ignored.
*/ */
revertUpdate (oldDoc, newDoc) { revertUpdate (oldDoc, newDoc) {
const revert = [] const revert = []
@ -227,8 +268,8 @@ class Index {
/** /**
* Get all documents in index whose key match value (if it is a Thing) or one of the elements of value (if it is an array of Things) * Get all documents in index whose key match value (if it is a Thing) or one of the elements of value (if it is an array of Things)
* @param {Thing} value Value to match the key against * @param {Array.<*>|*} value Value to match the key against
* @return {Array of documents} * @return {document[]}
*/ */
getMatching (value) { getMatching (value) {
if (!Array.isArray(value)) return this.tree.search(value) if (!Array.isArray(value)) return this.tree.search(value)
@ -253,8 +294,12 @@ class Index {
/** /**
* Get all documents in index whose key is between bounds as they are defined by query * Documents are sorted by key
* Documents are sorted by key * Documents are sorted by key
* @param {Query} query * @param {object} query An object with at least one matcher among $gt, $gte, $lt, $lte.
* @return {Array of documents} * @param {*} [query.$gt] Greater than matcher.
* @param {*} [query.$gte] Greater than or equal matcher.
* @param {*} [query.$lt] Lower than matcher.
* @param {*} [query.$lte] Lower than or equal matcher.
* @return {document[]}
*/ */
getBetweenBounds (query) { getBetweenBounds (query) {
return this.tree.betweenBounds(query) return this.tree.betweenBounds(query)
@ -262,7 +307,7 @@ class Index {
/** /**
* Get all elements in the index * Get all elements in the index
* @return {Array of documents} * @return {document[]}
*/ */
getAll () { getAll () {
const res = [] const res = []
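The Index API itself is unchanged by this commit, only documented; a sketch of the contract described above, assuming the module exports the Index class:

const Index = require('./lib/indexes')
const index = new Index({ fieldName: 'username', unique: true })

index.insert({ _id: '1', username: 'alice' })
index.insert({ _id: '2', username: 'bob' })
index.getMatching('alice') // -> [{ _id: '1', username: 'alice' }]
try {
  index.insert({ _id: '3', username: 'alice' }) // violates the unique constraint
} catch (err) {
  // as documented above, the index is left unchanged when an insertion fails
}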

@ -3,21 +3,19 @@
* Serialization/deserialization * Serialization/deserialization
* Copying * Copying
* Querying, update * Querying, update
* @module model
* @private
*/ */
const { uniq, isDate, isRegExp } = require('./utils.js') const { uniq, isDate, isRegExp } = require('./utils.js')
const modifierFunctions = {}
const lastStepModifierFunctions = {}
const comparisonFunctions = {}
const logicalOperators = {}
const arrayComparisonFunctions = {}
/** /**
* Check a key, throw an error if the key is non valid * Check a key, throw an error if the key is non valid
* @param {String} k key * @param {string} k key
* @param {Model} v value, needed to treat the Date edge case * @param {document} v value, needed to treat the Date edge case
* Non-treatable edge cases here: if part of the object is of the form { $$date: number } or { $$deleted: true } * Its serialized-then-deserialized version will be transformed into a Date object
* Its serialized-then-deserialized version will be transformed into a Date object * But you really need to want it to trigger such behaviour, even when warned not to use '$' at the beginning of the field names...
* But you really need to want it to trigger such behaviour, even when warned not to use '$' at the beginning of the field names... * But you really need to want it to trigger such behaviour, even when warned not to use '$' at the beginning of the field names...
* @private
*/ */
const checkKey = (k, v) => { const checkKey = (k, v) => {
if (typeof k === 'number') k = k.toString() if (typeof k === 'number') k = k.toString()
@ -36,6 +34,8 @@ const checkKey = (k, v) => {
/** /**
* Check a DB object and throw an error if it's not valid * Check a DB object and throw an error if it's not valid
* Works by applying the above checkKey function to all fields recursively * Works by applying the above checkKey function to all fields recursively
* @param {document|document[]} obj
* @alias module:model.checkObject
*/ */
const checkObject = obj => { const checkObject = obj => {
if (Array.isArray(obj)) { if (Array.isArray(obj)) {
@ -61,6 +61,9 @@ const checkObject = obj => {
* so eval and the like are not safe * so eval and the like are not safe
* Accepted primitive types: Number, String, Boolean, Date, null * Accepted primitive types: Number, String, Boolean, Date, null
* Accepted secondary types: Objects, Arrays * Accepted secondary types: Objects, Arrays
* @param {document} obj
* @return {string}
* @alias module:model.serialize
*/ */
const serialize = obj => { const serialize = obj => {
return JSON.stringify(obj, function (k, v) { return JSON.stringify(obj, function (k, v) {
@ -80,6 +83,9 @@ const serialize = obj => {
/** /**
* From a one-line representation of an object generated by the serialize function * Return the object itself
* Return the object itself * Return the object itself
* @param {string} rawData
* @return {document}
* @alias module:model.deserialize
*/ */
const deserialize = rawData => JSON.parse(rawData, function (k, v) { const deserialize = rawData => JSON.parse(rawData, function (k, v) {
if (k === '$$date') return new Date(v) if (k === '$$date') return new Date(v)
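As the two functions above show, Dates survive the round trip through a `{ $$date: timestamp }` envelope. A sketch of that behaviour, assuming `serialize` and `deserialize` are exported as their `@alias` tags indicate:

const model = require('./lib/model')
const line = model.serialize({ _id: 'abc', launched: new Date(0) })
// line is roughly '{"_id":"abc","launched":{"$$date":0}}', a single line of JSON
const back = model.deserialize(line)
// back.launched is a Date again, equal to new Date(0)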
@ -98,6 +104,10 @@ const deserialize = rawData => JSON.parse(rawData, function (k, v) {
* Deep copy a DB object * Deep copy a DB object
* The optional strictKeys flag (defaulting to false) indicates whether to copy everything or only fields * The optional strictKeys flag (defaulting to false) indicates whether to copy everything or only fields
* where the keys are valid, i.e. don't begin with $ and don't contain a . * where the keys are valid, i.e. don't begin with $ and don't contain a .
* @param {?document} obj
* @param {boolean} [strictKeys=false]
* @return {?document}
* @alias module:model.deepCopy
*/ */
function deepCopy (obj, strictKeys) { function deepCopy (obj, strictKeys) {
if ( if (
@ -129,6 +139,9 @@ function deepCopy (obj, strictKeys) {
/** /**
* Tells if an object is a primitive type or a "real" object * Tells if an object is a primitive type or a "real" object
* Arrays are considered primitive * Arrays are considered primitive
* @param {*} obj
* @return {boolean}
* @alias module:model.isPrimitiveType
*/ */
const isPrimitiveType = obj => ( const isPrimitiveType = obj => (
typeof obj === 'boolean' || typeof obj === 'boolean' ||
@ -143,6 +156,10 @@ const isPrimitiveType = obj => (
* Utility functions for comparing things * Utility functions for comparing things
* Assumes type checking was already done (a and b already have the same type) * Assumes type checking was already done (a and b already have the same type)
* compareNSB works for numbers, strings and booleans * compareNSB works for numbers, strings and booleans
* @param {number|string|boolean} a
* @param {number|string|boolean} b
* @return {number} 0 if a == b, 1 if a > b, -1 if a < b
* @private
*/ */
const compareNSB = (a, b) => { const compareNSB = (a, b) => {
if (a < b) return -1 if (a < b) return -1
@ -150,6 +167,15 @@ const compareNSB = (a, b) => {
return 0 return 0
} }
/**
* Utility function for comparing arrays
* Assumes type checking was already done (a and b already have the same type)
* compareNSB works for numbers, strings and booleans
* @param {Array} a
* @param {Array} b
* @return {number} 0 if arrays have the same length and all elements equal one another. Else either 1 or -1.
* @private
*/
const compareArrays = (a, b) => { const compareArrays = (a, b) => {
const minLength = Math.min(a.length, b.length) const minLength = Math.min(a.length, b.length)
for (let i = 0; i < minLength; i += 1) { for (let i = 0; i < minLength; i += 1) {
@ -169,8 +195,11 @@ const compareArrays = (a, b) => {
* In the case of objects and arrays, we deep-compare * In the case of objects and arrays, we deep-compare
* If two objects don't have the same type, the (arbitrary) type hierarchy is: undefined, null, number, strings, boolean, dates, arrays, objects * Return -1 if a < b, 1 if a > b and 0 if a = b (note that equality here is NOT the same as defined in areThingsEqual!)
* Return -1 if a < b, 1 if a > b and 0 if a = b (note that equality here is NOT the same as defined in areThingsEqual!) * Return -1 if a < b, 1 if a > b and 0 if a = b (note that equality here is NOT the same as defined in areThingsEqual!)
* * @param {*} a
* @param {Function} _compareStrings String comparing function, returning -1, 0 or 1, overriding default string comparison (useful for languages with accented letters) * @param {*} b
* @param {compareStrings} [_compareStrings] String comparing function, returning -1, 0 or 1, overriding default string comparison (useful for languages with accented letters)
* @return {number}
* @alias module:model.compareThings
*/ */
const compareThings = (a, b, _compareStrings) => { const compareThings = (a, b, _compareStrings) => {
const compareStrings = _compareStrings || compareNSB const compareStrings = _compareStrings || compareNSB
@ -221,35 +250,136 @@ const compareThings = (a, b, _compareStrings) => {
// ============================================================== // ==============================================================
/** /**
* @callback modifierFunction
* The signature of modifier functions is as follows * The signature of modifier functions is as follows
* Their structure is always the same: recursively follow the dot notation while creating * Their structure is always the same: recursively follow the dot notation while creating
* the nested documents if needed, then apply the "last step modifier" * the nested documents if needed, then apply the "last step modifier"
* @param {Object} obj The model to modify * @param {Object} obj The model to modify
* @param {String} field Can contain dots, in that case that means we will set a subfield recursively * @param {String} field Can contain dots, in that case that means we will set a subfield recursively
* @param {Model} value * @param {document} value
*/ */
/** /**
* Set a field to a new value * Create the complete modifier function
* @param {modifierFunction} lastStepModifierFunction a lastStepModifierFunction
* @param {boolean} [unset = false] Bad looking specific fix, needs to be generalized once modifiers that behave like $unset are implemented
* @return {modifierFunction}
* @private
*/ */
lastStepModifierFunctions.$set = (obj, field, value) => { const createModifierFunction = (lastStepModifierFunction, unset = false) => (obj, field, value) => {
obj[field] = value const func = (obj, field, value) => {
const fieldParts = typeof field === 'string' ? field.split('.') : field
if (fieldParts.length === 1) lastStepModifierFunction(obj, field, value)
else {
if (obj[fieldParts[0]] === undefined) {
if (unset) return
obj[fieldParts[0]] = {}
}
func(obj[fieldParts[0]], fieldParts.slice(1), value)
}
}
return func(obj, field, value)
}
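// For instance (hypothetical values), the $set modifier built by
// createModifierFunction recursively creates intermediate objects when the
// field uses dot notation:
//   const obj = {}
//   modifierFunctions.$set(obj, 'address.city.name', 'Paris')
//   // obj is now { address: { city: { name: 'Paris' } } }
// With unset = true (used by $unset below), missing intermediate levels are
// left untouched instead of being created.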
const $addToSetPartial = (obj, field, value) => {
// Create the array if it doesn't exist
if (!Object.prototype.hasOwnProperty.call(obj, field)) { obj[field] = [] }
if (!Array.isArray(obj[field])) throw new Error('Can\'t $addToSet an element on non-array values')
if (value !== null && typeof value === 'object' && value.$each) {
if (Object.keys(value).length > 1) throw new Error('Can\'t use another field in conjunction with $each')
if (!Array.isArray(value.$each)) throw new Error('$each requires an array value')
value.$each.forEach(v => {
$addToSetPartial(obj, field, v)
})
} else {
let addToSet = true
obj[field].forEach(v => {
if (compareThings(v, value) === 0) addToSet = false
})
if (addToSet) obj[field].push(value)
}
} }
/** /**
* @enum {modifierFunction}
*/
const modifierFunctions = {
/**
* Set a field to a new value
*/
$set: createModifierFunction((obj, field, value) => {
obj[field] = value
}),
/**
* Unset a field * Unset a field
*/ */
lastStepModifierFunctions.$unset = (obj, field, value) => { $unset: createModifierFunction((obj, field, value) => {
delete obj[field] delete obj[field]
} }, true),
/**
* Updates the value of the field, only if specified field is smaller than the current value of the field
*/
$min: createModifierFunction((obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value < obj[field]) obj[field] = value
}),
/**
* Updates the value of the field, only if specified field is greater than the current value of the field
*/
$max: createModifierFunction((obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value > obj[field]) obj[field] = value
}),
/**
* Increment a numeric field's value
*/
$inc: createModifierFunction((obj, field, value) => {
if (typeof value !== 'number') throw new Error(`${value} must be a number`)
/** if (typeof obj[field] !== 'number') {
if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = value
else throw new Error('Don\'t use the $inc modifier on non-number fields')
} else obj[field] += value
}),
/**
* Removes all instances of a value from an existing array
*/
$pull: createModifierFunction((obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pull an element from non-array values')
const arr = obj[field]
for (let i = arr.length - 1; i >= 0; i -= 1) {
if (match(arr[i], value)) arr.splice(i, 1)
}
}),
/**
* Remove the first or last element of an array
*/
$pop: createModifierFunction((obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pop an element from non-array values')
if (typeof value !== 'number') throw new Error(`${value} isn't an integer, can't use it with $pop`)
if (value === 0) return
if (value > 0) obj[field] = obj[field].slice(0, obj[field].length - 1)
else obj[field] = obj[field].slice(1)
}),
/**
* Add an element to an array field only if it is not already in it
* No modification if the element is already in the array
* Note that it doesn't check whether the original array contains duplicates
*/
$addToSet: createModifierFunction($addToSetPartial),
/**
* Push an element to the end of an array field * Push an element to the end of an array field
* Optional modifier $each instead of value to push several values * Optional modifier $each instead of value to push several values
* Optional modifier $slice to slice the resulting array, see https://docs.mongodb.org/manual/reference/operator/update/slice/ * Optional modifier $slice to slice the resulting array, see https://docs.mongodb.org/manual/reference/operator/update/slice/
* Différeence with MongoDB: if $slice is specified and not $each, we act as if value is an empty array * Difference with MongoDB: if $slice is specified and not $each, we act as if value is an empty array
*/ */
lastStepModifierFunctions.$push = (obj, field, value) => { $push: createModifierFunction((obj, field, value) => {
// Create the array if it doesn't exist // Create the array if it doesn't exist
if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = [] if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = []
@ -292,108 +422,16 @@ lastStepModifierFunctions.$push = (obj, field, value) => {
} else { } else {
obj[field].push(value) obj[field].push(value)
} }
}
/**
* Add an element to an array field only if it is not already in it
* No modification if the element is already in the array
* Note that it doesn't check whether the original array contains duplicates
*/
lastStepModifierFunctions.$addToSet = (obj, field, value) => {
// Create the array if it doesn't exist
if (!Object.prototype.hasOwnProperty.call(obj, field)) { obj[field] = [] }
if (!Array.isArray(obj[field])) throw new Error('Can\'t $addToSet an element on non-array values')
if (value !== null && typeof value === 'object' && value.$each) {
if (Object.keys(value).length > 1) throw new Error('Can\'t use another field in conjunction with $each')
if (!Array.isArray(value.$each)) throw new Error('$each requires an array value')
value.$each.forEach(v => {
lastStepModifierFunctions.$addToSet(obj, field, v)
})
} else {
let addToSet = true
obj[field].forEach(v => {
if (compareThings(v, value) === 0) addToSet = false
}) })
if (addToSet) obj[field].push(value)
}
}
/**
* Remove the first or last element of an array
*/
lastStepModifierFunctions.$pop = (obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pop an element from non-array values')
if (typeof value !== 'number') throw new Error(`${value} isn't an integer, can't use it with $pop`)
if (value === 0) return
if (value > 0) obj[field] = obj[field].slice(0, obj[field].length - 1)
else obj[field] = obj[field].slice(1)
} }
/**
* Removes all instances of a value from an existing array
*/
lastStepModifierFunctions.$pull = (obj, field, value) => {
if (!Array.isArray(obj[field])) throw new Error('Can\'t $pull an element from non-array values')
const arr = obj[field]
for (let i = arr.length - 1; i >= 0; i -= 1) {
if (match(arr[i], value)) arr.splice(i, 1)
}
}
/**
* Increment a numeric field's value
*/
lastStepModifierFunctions.$inc = (obj, field, value) => {
if (typeof value !== 'number') throw new Error(`${value} must be a number`)
if (typeof obj[field] !== 'number') {
if (!Object.prototype.hasOwnProperty.call(obj, field)) obj[field] = value
else throw new Error('Don\'t use the $inc modifier on non-number fields')
} else obj[field] += value
}
/**
* Updates the value of the field, only if specified field is greater than the current value of the field
*/
lastStepModifierFunctions.$max = (obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value > obj[field]) obj[field] = value
}
/**
* Updates the value of the field, only if specified field is smaller than the current value of the field
*/
lastStepModifierFunctions.$min = (obj, field, value) => {
if (typeof obj[field] === 'undefined') obj[field] = value
else if (value < obj[field]) obj[field] = value
}
// Given its name, create the complete modifier function
const createModifierFunction = modifier => (obj, field, value) => {
const fieldParts = typeof field === 'string' ? field.split('.') : field
if (fieldParts.length === 1) lastStepModifierFunctions[modifier](obj, field, value)
else {
if (obj[fieldParts[0]] === undefined) {
if (modifier === '$unset') return // Bad looking specific fix, needs to be generalized modifiers that behave like $unset are implemented
obj[fieldParts[0]] = {}
}
modifierFunctions[modifier](obj[fieldParts[0]], fieldParts.slice(1), value)
}
}
// Actually create all modifier functions
Object.keys(lastStepModifierFunctions).forEach(modifier => {
modifierFunctions[modifier] = createModifierFunction(modifier)
})
/** /**
* Modify a DB object according to an update query * Modify a DB object according to an update query
* @param {document} obj
* @param {query} updateQuery
* @return {document}
* @alias module:model.modify
*/ */
const modify = (obj, updateQuery) => { const modify = (obj, updateQuery) => {
const keys = Object.keys(updateQuery) const keys = Object.keys(updateQuery)
@ -441,8 +479,10 @@ const modify = (obj, updateQuery) => {
/** /**
* Get a value from object with dot notation * Get a value from object with dot notation
* @param {Object} obj * @param {object} obj
* @param {String} field * @param {string} field
* @return {*}
* @alias module:model.getDotValue
*/ */
const getDotValue = (obj, field) => { const getDotValue = (obj, field) => {
const fieldParts = typeof field === 'string' ? field.split('.') : field const fieldParts = typeof field === 'string' ? field.split('.') : field
@ -468,6 +508,10 @@ const getDotValue = (obj, field) => {
* Things are defined as any native types (string, number, boolean, null, date) and objects * Things are defined as any native types (string, number, boolean, null, date) and objects
* In the case of object, we check deep equality * In the case of object, we check deep equality
* Returns true if they are, false otherwise * Returns true if they are, false otherwise
* @param {*} a
* @param {*} b
* @return {boolean}
* @alias module:model.areThingsEqual
*/ */
const areThingsEqual = (a, b) => { const areThingsEqual = (a, b) => {
// Strings, booleans, numbers, null // Strings, booleans, numbers, null
@ -513,6 +557,10 @@ const areThingsEqual = (a, b) => {
/** /**
* Check that two values are comparable * Check that two values are comparable
* @param {*} a
* @param {*} b
* @return {boolean}
* @private
*/ */
const areComparable = (a, b) => { const areComparable = (a, b) => {
if ( if (
@ -530,21 +578,29 @@ const areComparable = (a, b) => {
} }
/** /**
* @callback comparisonOperator
* Arithmetic and comparison operators * Arithmetic and comparison operators
* @param {Native value} a Value in the object * @param {*} a Value in the object
* @param {Native value} b Value in the query * @param {*} b Value in the query
* @return {boolean}
*/ */
comparisonFunctions.$lt = (a, b) => areComparable(a, b) && a < b
comparisonFunctions.$lte = (a, b) => areComparable(a, b) && a <= b
comparisonFunctions.$gt = (a, b) => areComparable(a, b) && a > b
comparisonFunctions.$gte = (a, b) => areComparable(a, b) && a >= b /**
* @enum {comparisonOperator}
comparisonFunctions.$ne = (a, b) => a === undefined || !areThingsEqual(a, b) */
const comparisonFunctions = {
comparisonFunctions.$in = (a, b) => { /** Lower than */
$lt: (a, b) => areComparable(a, b) && a < b,
/** Lower than or equals */
$lte: (a, b) => areComparable(a, b) && a <= b,
/** Greater than */
$gt: (a, b) => areComparable(a, b) && a > b,
/** Greater than or equals */
$gte: (a, b) => areComparable(a, b) && a >= b,
/** Does not equal */
$ne: (a, b) => a === undefined || !areThingsEqual(a, b),
/** Is in Array */
$in: (a, b) => {
if (!Array.isArray(b)) throw new Error('$in operator called with a non-array') if (!Array.isArray(b)) throw new Error('$in operator called with a non-array')
for (const el of b) { for (const el of b) {
@ -552,53 +608,57 @@ comparisonFunctions.$in = (a, b) => {
} }
return false return false
} },
/** Is not in Array */
comparisonFunctions.$nin = (a, b) => { $nin: (a, b) => {
if (!Array.isArray(b)) throw new Error('$nin operator called with a non-array') if (!Array.isArray(b)) throw new Error('$nin operator called with a non-array')
return !comparisonFunctions.$in(a, b) return !comparisonFunctions.$in(a, b)
} },
/** Matches Regexp */
comparisonFunctions.$regex = (a, b) => { $regex: (a, b) => {
if (!isRegExp(b)) throw new Error('$regex operator called with non regular expression') if (!isRegExp(b)) throw new Error('$regex operator called with non regular expression')
if (typeof a !== 'string') return false if (typeof a !== 'string') return false
else return b.test(a) else return b.test(a)
} },
/** Returns true if field exists */
comparisonFunctions.$exists = (value, exists) => { $exists: (a, b) => {
// This will be true for all values of stat except false, null, undefined and 0 // This will be true for all values of stat except false, null, undefined and 0
// That's strange behaviour (we should only use true/false) but that's the way Mongo does it... // That's strange behaviour (we should only use true/false) but that's the way Mongo does it...
if (exists || exists === '') exists = true if (b || b === '') b = true
else exists = false else b = false
if (value === undefined) return !exists if (a === undefined) return !b
else return exists else return b
} },
/** Specific to Arrays, returns true if a length equals b */
// Specific to arrays $size: (a, b) => {
comparisonFunctions.$size = (obj, value) => { if (!Array.isArray(a)) return false
if (!Array.isArray(obj)) return false if (b % 1 !== 0) throw new Error('$size operator called without an integer')
if (value % 1 !== 0) throw new Error('$size operator called without an integer')
return a.length === b
return obj.length === value },
} /** Specific to Arrays, returns true if some elements of a match the query b */
$elemMatch: (a, b) => {
comparisonFunctions.$elemMatch = (obj, value) => { if (!Array.isArray(a)) return false
if (!Array.isArray(obj)) return false return a.some(el => match(el, b))
return obj.some(el => match(el, value)) }
} }
arrayComparisonFunctions.$size = true const arrayComparisonFunctions = { $size: true, $elemMatch: true }
arrayComparisonFunctions.$elemMatch = true
/** /**
* @enum
*/
const logicalOperators = {
/**
* Match any of the subqueries * Match any of the subqueries
* @param {Model} obj * @param {document} obj
* @param {Array of Queries} query * @param {query[]} query
* @return {boolean}
*/ */
logicalOperators.$or = (obj, query) => { $or: (obj, query) => {
if (!Array.isArray(query)) throw new Error('$or operator used without an array') if (!Array.isArray(query)) throw new Error('$or operator used without an array')
for (let i = 0; i < query.length; i += 1) { for (let i = 0; i < query.length; i += 1) {
@ -606,14 +666,14 @@ logicalOperators.$or = (obj, query) => {
} }
return false return false
} },
/**
/**
* Match all of the subqueries * Match all of the subqueries
* @param {Model} obj * @param {document} obj
* @param {Array of Queries} query * @param {query[]} query
* @return {boolean}
*/ */
logicalOperators.$and = (obj, query) => { $and: (obj, query) => {
if (!Array.isArray(query)) throw new Error('$and operator used without an array') if (!Array.isArray(query)) throw new Error('$and operator used without an array')
for (let i = 0; i < query.length; i += 1) { for (let i = 0; i < query.length; i += 1) {
@ -621,33 +681,43 @@ logicalOperators.$and = (obj, query) => {
} }
return true return true
} },
/**
/**
* Inverted match of the query * Inverted match of the query
* @param {Model} obj * @param {document} obj
* @param {Query} query * @param {query} query
* @return {boolean}
*/ */
logicalOperators.$not = (obj, query) => !match(obj, query) $not: (obj, query) => !match(obj, query),
/** /**
* @callback whereCallback
* @param {document} obj
* @return {boolean}
*/
/**
* Use a function to match * Use a function to match
* @param {Model} obj * @param {document} obj
* @param {Query} query * @param {whereCallback} fn
* @return {boolean}
*/ */
logicalOperators.$where = (obj, fn) => { $where: (obj, fn) => {
if (typeof fn !== 'function') throw new Error('$where operator used without a function') if (typeof fn !== 'function') throw new Error('$where operator used without a function')
const result = fn.call(obj) const result = fn.call(obj)
if (typeof result !== 'boolean') throw new Error('$where function must return boolean') if (typeof result !== 'boolean') throw new Error('$where function must return boolean')
return result return result
}
} }
/** /**
* Tell if a given document matches a query * Tell if a given document matches a query
* @param {Object} obj Document to check * @param {document} obj Document to check
* @param {Object} query * @param {query} query
* @return {boolean}
* @alias module:model.match
*/ */
const match = (obj, query) => { const match = (obj, query) => {
// Primitive query against a primitive type // Primitive query against a primitive type
@ -672,6 +742,12 @@ const match = (obj, query) => {
/** /**
* Match an object against a specific { key: value } part of a query * Match an object against a specific { key: value } part of a query
* if the treatObjAsValue flag is set, don't try to match every part separately, but the array as a whole * if the treatObjAsValue flag is set, don't try to match every part separately, but the array as a whole
* @param {object} obj
* @param {string} queryKey
* @param {*} queryValue
* @param {boolean} [treatObjAsValue=false]
* @return {boolean}
* @private
*/ */
function matchQueryPart (obj, queryKey, queryValue, treatObjAsValue) { function matchQueryPart (obj, queryKey, queryValue, treatObjAsValue) {
const objValue = getDotValue(obj, queryKey) const objValue = getDotValue(obj, queryKey)
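Taken together, the enums above feed the module's two public entry points: `match` for querying and `modify` for updates. A sketch of both, with hypothetical values, assuming the module exports them as the `@alias` tags indicate:

const model = require('./lib/model')

model.match(
  { planet: 'Mars', size: 5 },
  { $or: [{ planet: 'Earth' }, { size: { $gte: 3 } }] }
) // -> true

const updated = model.modify(
  { _id: 'abc', planet: 'Mars', moons: ['Phobos'] },
  { $inc: { visits: 1 }, $push: { moons: 'Deimos' } }
)
// -> { _id: 'abc', planet: 'Mars', moons: ['Phobos', 'Deimos'], visits: 1 }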

@ -1,23 +1,48 @@
/**
* Handle every persistence-related task
* The interface Datastore expects to be implemented is
* * Persistence.loadDatabase(callback) and callback has signature err
* * Persistence.persistNewState(newDocs, callback) where newDocs is an array of documents and callback has signature err
*/
const path = require('path') const path = require('path')
const async = require('async') const { deprecate } = require('util')
const byline = require('./byline') const byline = require('./byline')
const customUtils = require('./customUtils.js') const customUtils = require('./customUtils.js')
const Index = require('./indexes.js') const Index = require('./indexes.js')
const model = require('./model.js') const model = require('./model.js')
const storage = require('./storage.js') const storage = require('./storage.js')
/**
* Under the hood, NeDB's persistence uses an append-only format, meaning that all
* updates and deletes actually result in lines added at the end of the datafile,
* for performance reasons. The database is automatically compacted (i.e. put back
* in the one-line-per-document format) every time you load each database within
* your application.
*
* Persistence handles the compaction exposed through {@link Datastore#compactDatafileAsync} and
* {@link Datastore#setAutocompactionInterval}.
*
* Since version 3.0.0, using {@link Datastore.persistence} methods manually is deprecated.
*
* Compaction takes a bit of time (not too much: 130ms for 50k
* records on a typical development machine) and no other operation can happen when
* it does, so most projects actually don't need to use it.
*
* Compaction will also immediately remove any documents whose data line has become
* corrupted, assuming that the total percentage of all corrupted documents in that
* database still falls below the specified `corruptAlertThreshold` option's value.
*
* Durability works similarly to major databases: compaction forces the OS to
* physically flush data to disk, while appends to the data file do not (the OS is
* responsible for flushing the data). That guarantees that a server crash can
* never cause complete data loss, while preserving performance. The worst that can
* happen is a crash between two syncs, causing a loss of all data between the two
* syncs. Usually syncs are 30 seconds apart so that's at most 30 seconds of
* data. [This post by Antirez on Redis persistence](http://oldblog.antirez.com/post/redis-persistence-demystified.html)
* explains this in more details, NeDB being very close to Redis AOF persistence
* with `appendfsync` option set to `no`.
*/
class Persistence { class Persistence {
/** /**
* Create a new Persistence object for database options.db * Create a new Persistence object for database options.db
* @param {Datastore} options.db * @param {Datastore} options.db
* @param {Number} [options.corruptAlertThreshold] Optional, threshold after which an alert is thrown if too much data is corrupt * @param {Number} [options.corruptAlertThreshold] Optional, threshold after which an alert is thrown if too much data is corrupt
* @param {string} [options.nodeWebkitAppName] Optional, specify the name of your NW app if you want options.filename to be relative to the directory where Node Webkit stores application data such as cookies and local storage (the best place to store data in my opinion) * @param {serializationHook} [options.afterSerialization] Hook you can use to transform data after it was serialized and before it is written to disk.
* @param {serializationHook} [options.beforeDeserialization] Inverse of `afterSerialization`.
*/ */
constructor (options) { constructor (options) {
this.db = options.db this.db = options.db
@ -52,30 +77,17 @@ class Persistence {
} }
} }
} }
// For NW apps, store data in the same directory where NW stores application data
if (this.filename && options.nodeWebkitAppName) {
console.log('==================================================================')
console.log('WARNING: The nodeWebkitAppName option is deprecated')
console.log('To get the path to the directory where Node Webkit stores the data')
console.log('for your app, use the internal nw.gui module like this')
console.log('require(\'nw.gui\').App.dataPath')
console.log('See https://github.com/rogerwang/node-webkit/issues/500')
console.log('==================================================================')
this.filename = Persistence.getNWAppFilename(options.nodeWebkitAppName, this.filename)
}
} }
/** /**
* Persist cached database * Internal version of {@link Datastore#compactDatafileAsync} which does not use the {@link Datastore#executor}; use the public method instead.
* This serves as a compaction function since the cache always contains only the number of documents in the collection * @return {Promise<void>}
* while the data file is append-only so it may grow larger * @private
* @param {Function} callback Optional callback, signature: err
*/ */
persistCachedDatabase (callback = () => {}) { async persistCachedDatabaseAsync () {
const lines = [] const lines = []
if (this.inMemoryOnly) return callback(null) if (this.inMemoryOnly) return
this.db.getAllData().forEach(doc => { this.db.getAllData().forEach(doc => {
lines.push(this.afterSerialization(model.serialize(doc))) lines.push(this.afterSerialization(model.serialize(doc)))
@ -92,76 +104,86 @@ class Persistence {
} }
}) })
storage.crashSafeWriteFileLines(this.filename, lines, err => { await storage.crashSafeWriteFileLinesAsync(this.filename, lines)
if (err) return callback(err)
this.db.emit('compaction.done') this.db.emit('compaction.done')
return callback(null)
})
} }
/** /**
* Queue a rewrite of the datafile * @see Datastore#compactDatafile
* @deprecated
* @param {NoParamCallback} [callback = () => {}]
* @see Persistence#compactDatafileAsync
*/ */
compactDatafile () { compactDatafile (callback) {
this.db.executor.push({ this: this, fn: this.persistCachedDatabase, arguments: [] }) deprecate(_callback => this.db.compactDatafile(_callback), '@seald-io/nedb: calling Datastore#persistence#compactDatafile is deprecated, please use Datastore#compactDatafile, it will be removed in the next major version.')(callback)
} }
/** /**
* Set automatic compaction every interval ms * @see Datastore#setAutocompactionInterval
* @param {Number} interval in milliseconds, with an enforced minimum of 5 seconds * @deprecated
*/ */
setAutocompactionInterval (interval) { setAutocompactionInterval (interval) {
const minInterval = 5000 deprecate(_interval => this.db.setAutocompactionInterval(_interval), '@seald-io/nedb: calling Datastore#persistence#setAutocompactionInterval is deprecated, please use Datastore#setAutocompactionInterval, it will be removed in the next major version.')(interval)
const realInterval = Math.max(interval || 0, minInterval)
this.stopAutocompaction()
this.autocompactionIntervalId = setInterval(() => {
this.compactDatafile()
}, realInterval)
} }
/** /**
* Stop autocompaction (do nothing if autocompaction was not running) * @see Datastore#stopAutocompaction
* @deprecated
*/ */
stopAutocompaction () { stopAutocompaction () {
if (this.autocompactionIntervalId) clearInterval(this.autocompactionIntervalId) deprecate(() => this.db.stopAutocompaction(), '@seald-io/nedb: calling Datastore#persistence#stopAutocompaction is deprecated, please use Datastore#stopAutocompaction, it will be removed in the next major version.')()
} }
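The three wrappers above only forward to the corresponding Datastore methods; new code should call those directly. A sketch of the preferred calls, assuming a loaded Datastore `db`:

db.setAutocompactionInterval(60000) // compact at most every 60 s (the old implementation enforced a 5 s minimum)
db.compactDatafile()                // queue one compaction right away
db.stopAutocompaction()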
/** /**
* Persist new state for the given newDocs (can be insertion, update or removal) * Persist new state for the given newDocs (can be insertion, update or removal)
* Use an append-only format * Use an append-only format
* @param {Array} newDocs Can be empty if no doc was updated/removed *
* @param {Function} callback Optional, signature: err * Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {document[]} newDocs Can be empty if no doc was updated/removed
* @return {Promise}
* @private
*/ */
persistNewState (newDocs, callback = () => {}) { async persistNewStateAsync (newDocs) {
let toPersist = '' let toPersist = ''
// In-memory only datastore // In-memory only datastore
if (this.inMemoryOnly) return callback(null) if (this.inMemoryOnly) return
newDocs.forEach(doc => { newDocs.forEach(doc => {
toPersist += this.afterSerialization(model.serialize(doc)) + '\n' toPersist += this.afterSerialization(model.serialize(doc)) + '\n'
}) })
if (toPersist.length === 0) return callback(null) if (toPersist.length === 0) return
storage.appendFile(this.filename, toPersist, 'utf8', err => callback(err)) await storage.appendFileAsync(this.filename, toPersist, 'utf8')
} }
/** /**
* From a database's raw data, return the corresponding * @typedef rawIndex
* machine understandable collection * @property {string} fieldName
* @property {boolean} [unique]
* @property {boolean} [sparse]
*/
/**
* From a database's raw data, return the corresponding machine understandable collection.
*
* Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {string} rawData database file
* @return {{data: document[], indexes: Object.<string, rawIndex>}}
* @private
*/ */
treatRawData (rawData) { treatRawData (rawData) {
const data = rawData.split('\n') const data = rawData.split('\n')
const dataById = {} const dataById = {}
const indexes = {} const indexes = {}
let dataLength = data.length
// Last line of every data file is usually blank so not really corrupt // Last line of every data file is usually blank so not really corrupt
let corruptItems = -1 let corruptItems = 0
for (const datum of data) { for (const datum of data) {
if (datum === '') { dataLength--; continue }
try { try {
const doc = model.deserialize(this.beforeDeserialization(datum)) const doc = model.deserialize(this.beforeDeserialization(datum))
if (doc._id) { if (doc._id) {
@ -175,10 +197,16 @@ class Persistence {
} }
// A bit lenient on corruption // A bit lenient on corruption
if ( if (dataLength > 0) {
data.length > 0 && const corruptionRate = corruptItems / dataLength
corruptItems / data.length > this.corruptAlertThreshold if (corruptionRate > this.corruptAlertThreshold) {
) throw new Error(`More than ${Math.floor(100 * this.corruptAlertThreshold)}% of the data file is corrupt, the wrong beforeDeserialization hook may be used. Cautiously refusing to start NeDB to prevent dataloss`) const error = new Error(`${Math.floor(100 * corruptionRate)}% of the data file is corrupt, more than given corruptAlertThreshold (${Math.floor(100 * this.corruptAlertThreshold)}%). Cautiously refusing to start NeDB to prevent dataloss.`)
error.corruptionRate = corruptionRate
error.corruptItems = corruptItems
error.dataLength = dataLength
throw error
}
}
const tdata = Object.values(dataById) const tdata = Object.values(dataById)
@ -186,20 +214,32 @@ class Persistence {
} }
/** /**
* From a database's raw stream, return the corresponding * From a database's raw data stream, return the corresponding machine understandable collection
* machine understandable collection * Is only used by a {@link Datastore} instance.
*
* Is only used in the Node.js version, since [React-Native]{@link module:storageReactNative} &
* [browser]{@link module:storageBrowser} storage modules don't provide an equivalent of
* {@link module:storage.readFileStream}.
*
* Do not use directly, it should only be used by a {@link Datastore} instance.
* @param {Readable} rawStream
* @return {Promise<{data: document[], indexes: Object.<string, rawIndex>}>}
* @async
* @private
*/ */
treatRawStream (rawStream, cb) { treatRawStreamAsync (rawStream) {
return new Promise((resolve, reject) => {
const dataById = {} const dataById = {}
const indexes = {} const indexes = {}
// Last line of every data file is usually blank so not really corrupt let corruptItems = 0
let corruptItems = -1
const lineStream = byline(rawStream, { keepEmptyLines: true }) const lineStream = byline(rawStream)
let length = 0 let dataLength = 0
lineStream.on('data', (line) => { lineStream.on('data', (line) => {
if (line === '') return
try { try {
const doc = model.deserialize(this.beforeDeserialization(line)) const doc = model.deserialize(this.beforeDeserialization(line))
if (doc._id) { if (doc._id) {
@ -211,24 +251,30 @@ class Persistence {
corruptItems += 1 corruptItems += 1
} }
length++ dataLength++
}) })
lineStream.on('end', () => { lineStream.on('end', () => {
// A bit lenient on corruption // A bit lenient on corruption
if (length > 0 && corruptItems / length > this.corruptAlertThreshold) { if (dataLength > 0) {
const err = new Error(`More than ${Math.floor(100 * this.corruptAlertThreshold)}% of the data file is corrupt, the wrong beforeDeserialization hook may be used. Cautiously refusing to start NeDB to prevent dataloss`) const corruptionRate = corruptItems / dataLength
cb(err, null) if (corruptionRate > this.corruptAlertThreshold) {
const error = new Error(`${Math.floor(100 * corruptionRate)}% of the data file is corrupt, more than given corruptAlertThreshold (${Math.floor(100 * this.corruptAlertThreshold)}%). Cautiously refusing to start NeDB to prevent dataloss.`)
error.corruptionRate = corruptionRate
error.corruptItems = corruptItems
error.dataLength = dataLength
reject(error)
return return
} }
}
const data = Object.values(dataById) const data = Object.values(dataById)
cb(null, { data, indexes: indexes }) resolve({ data, indexes: indexes })
}) })
lineStream.on('error', function (err) { lineStream.on('error', function (err) {
cb(err) reject(err)
})
}) })
} }
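
The method above follows the standard stream-to-Promise pattern: resolve once on 'end', reject on the first 'error'. Reduced to a sketch with a hypothetical collectLines helper:

const collectLines = lineStream => new Promise((resolve, reject) => {
  const lines = []
  lineStream.on('data', line => { if (line !== '') lines.push(line) }) // skip blank lines, as above
  lineStream.on('end', () => resolve(lines))
  lineStream.on('error', reject)
})
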
@@ -237,28 +283,33 @@ class Persistence {
* 1) Create all indexes * 1) Create all indexes
* 2) Insert all data * 2) Insert all data
* 3) Compact the database * 3) Compact the database
*
* This means pulling data out of the data file or creating it if it doesn't exist * This means pulling data out of the data file or creating it if it doesn't exist
* Also, all data is persisted right away, which has the effect of compacting the database file * Also, all data is persisted right away, which has the effect of compacting the database file
* This operation is very quick at startup for a big collection (60ms for ~10k docs) * This operation is very quick at startup for a big collection (60ms for ~10k docs)
* @param {Function} callback Optional callback, signature: err *
* Do not use directly as it does not use the [Executor]{@link Datastore.executor}, use {@link Datastore#loadDatabaseAsync} instead.
* @return {Promise<void>}
* @private
*/ */
loadDatabase (callback = () => {}) { async loadDatabaseAsync () {
this.db.resetIndexes() this.db._resetIndexes()
// In-memory only datastore // In-memory only datastore
if (this.inMemoryOnly) return callback(null) if (this.inMemoryOnly) return
await Persistence.ensureDirectoryExistsAsync(path.dirname(this.filename))
async.waterfall([ await storage.ensureDatafileIntegrityAsync(this.filename)
cb => {
// eslint-disable-next-line node/handle-callback-err
Persistence.ensureDirectoryExists(path.dirname(this.filename), err => {
// TODO: handle error
// eslint-disable-next-line node/handle-callback-err
storage.ensureDatafileIntegrity(this.filename, err => {
// TODO: handle error
const treatedDataCallback = (err, treatedData) => {
if (err) return cb(err)
let treatedData
if (storage.readFileStream) {
// Server side
const fileStream = storage.readFileStream(this.filename, { encoding: 'utf8' })
treatedData = await this.treatRawStreamAsync(fileStream)
} else {
// Browser
const rawData = await storage.readFileAsync(this.filename, 'utf8')
treatedData = this.treatRawData(rawData)
}
// Recreate all indexes in the datafile // Recreate all indexes in the datafile
Object.keys(treatedData.indexes).forEach(key => { Object.keys(treatedData.indexes).forEach(key => {
this.db.indexes[key] = new Index(treatedData.indexes[key]) this.db.indexes[key] = new Index(treatedData.indexes[key])
@@ -266,74 +317,51 @@ class Persistence {
// Fill cached database (i.e. all indexes) with data // Fill cached database (i.e. all indexes) with data
try { try {
this.db.resetIndexes(treatedData.data) this.db._resetIndexes(treatedData.data)
} catch (e) {
this.db.resetIndexes() // Rollback any index which didn't fail
return cb(e)
}
this.db.persistence.persistCachedDatabase(cb)
}
if (storage.readFileStream) {
// Server side
const fileStream = storage.readFileStream(this.filename, { encoding: 'utf8' })
this.treatRawStream(fileStream, treatedDataCallback)
return
}
// Browser
storage.readFile(this.filename, 'utf8', (err, rawData) => {
if (err) return cb(err)
try {
const treatedData = this.treatRawData(rawData)
treatedDataCallback(null, treatedData)
} catch (e) { } catch (e) {
return cb(e) this.db._resetIndexes() // Rollback any index which didn't fail
throw e
} }
})
})
})
}
], err => {
if (err) return callback(err)
await this.db.persistence.persistCachedDatabaseAsync()
this.db.executor.processBuffer() this.db.executor.processBuffer()
return callback(null)
})
} }
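
With the Promise-based surface, loading reads naturally in async code (sketch):

const db = new Datastore({ filename: 'workspace/test.db' })
await db.loadDatabaseAsync() // replaces db.loadDatabase(err => { ... })
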
/** /**
* Check if a directory stat and create it on the fly if it is not the case * See {@link Datastore#dropDatabaseAsync}. This function uses {@link Datastore#executor} internally. Decorating this
* cb is optional, signature: err * function with an {@link Executor#pushAsync} will result in a deadlock.
* @return {Promise<void>}
* @private
* @see Datastore#dropDatabaseAsync
*/ */
static ensureDirectoryExists (dir, callback = () => {}) { async dropDatabaseAsync () {
storage.mkdir(dir, { recursive: true }, err => { callback(err) }) this.db.stopAutocompaction() // stop autocompaction
this.db.executor.ready = false // prevent queuing new tasks
this.db.executor.resetBuffer() // remove pending buffered tasks
await this.db.executor.queue.guardian // wait for the ongoing tasks to end
// remove indexes (which means remove data from memory)
this.db.indexes = {}
// add back _id index, otherwise it will fail
this.db.indexes._id = new Index({ fieldName: '_id', unique: true })
// reset TTL on indexes
this.db.ttlIndexes = {}
// remove datastore file
if (!this.db.inMemoryOnly) {
await this.db.executor.pushAsync(async () => {
if (await storage.existsAsync(this.filename)) await storage.unlinkAsync(this.filename)
}, true)
}
} }
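
As the JSDoc warns, this method drives the executor itself, so callers simply await the Datastore-level method (sketch):

await db.dropDatabaseAsync() // clears indexes and TTLs, and unlinks the datafile if persistent
// wrapping this call in executor.pushAsync yourself would deadlock, per the warning above
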
/** /**
* Return the path the datafile if the given filename is relative to the directory where Node Webkit stores * Check if a directory exists and create it on the fly if it is not the case.
* data for this application. Probably the best place to store data * @param {string} dir
* @return {Promise<void>}
* @private
*/ */
static getNWAppFilename (appName, relativeFilename) { static async ensureDirectoryExistsAsync (dir) {
let home await storage.mkdirAsync(dir, { recursive: true })
if (process.platform === 'win32' || process.platform === 'win64') {
home = process.env.LOCALAPPDATA || process.env.APPDATA
if (!home) throw new Error('Couldn\'t find the base application data folder')
home = path.join(home, appName)
} else if (process.platform === 'darwin') {
home = process.env.HOME
if (!home) throw new Error('Couldn\'t find the base application data directory')
home = path.join(home, 'Library', 'Application Support', appName)
} else if (process.platform === 'linux') {
home = process.env.HOME
if (!home) throw new Error('Couldn\'t find the base application data directory')
home = path.join(home, '.config', appName)
} else throw new Error(`Can't use the Node Webkit relative path for platform ${process.platform}`)
return path.join(home, 'nedb-data', relativeFilename)
} }
} }

@@ -1,45 +1,136 @@
/** /**
* Way data is stored for this database * Way data is stored for this database.
* For a Node.js/Node Webkit database it's the file system * This version is the Node.js/Node Webkit version.
* For a browser-side database it's localforage which chooses the best option depending on user browser (IndexedDB then WebSQL then localStorage) * It's essentially fs, mkdirp and crash safe write and read functions.
* *
* This version is the Node.js/Node Webkit version * @see module:storageBrowser
* It's essentially fs, mkdirp and crash safe write and read functions * @see module:storageReactNative
* @module storage
* @private
*/ */
const fs = require('fs') const fs = require('fs')
const fsPromises = fs.promises
const path = require('path') const path = require('path')
const async = require('async')
const storage = {}
const { Readable } = require('stream') const { Readable } = require('stream')
// eslint-disable-next-line node/no-callback-literal /**
storage.exists = (path, cb) => fs.access(path, fs.constants.F_OK, (err) => { cb(!err) }) * Returns true if file exists.
storage.rename = fs.rename * @param {string} file
storage.writeFile = fs.writeFile * @return {Promise<boolean>}
storage.unlink = fs.unlink * @async
storage.appendFile = fs.appendFile * @alias module:storage.existsAsync
storage.readFile = fs.readFile * @see module:storage.exists
storage.readFileStream = fs.createReadStream */
storage.mkdir = fs.mkdir const existsAsync = file => fsPromises.access(file, fs.constants.F_OK).then(() => true, () => false)
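
The .then(() => true, () => false) idiom above maps the access check's rejection to a boolean; an equivalent spelled-out form (hypothetical name):

const existsAsyncVerbose = async file => {
  try {
    await fsPromises.access(file, fs.constants.F_OK)
    return true
  } catch {
    return false // any access error is treated as "does not exist"
  }
}
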
/** /**
* Explicit name ... * Node.js' [fsPromises.rename]{@link https://nodejs.org/api/fs.html#fspromisesrenameoldpath-newpath}
* @function
* @param {string} oldPath
* @param {string} newPath
* @return {Promise<void>}
* @alias module:storage.renameAsync
* @async
*/ */
storage.ensureFileDoesntExist = (file, callback) => { const renameAsync = fsPromises.rename
storage.exists(file, exists => {
if (!exists) return callback(null)
storage.unlink(file, err => callback(err)) /**
}) * Node.js' [fsPromises.writeFile]{@link https://nodejs.org/api/fs.html#fspromiseswritefilefile-data-options}.
* @function
* @param {string} path
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storage.writeFileAsync
* @async
*/
const writeFileAsync = fsPromises.writeFile
/**
* Node.js' [fs.createWriteStream]{@link https://nodejs.org/api/fs.html#fscreatewritestreampath-options}.
* @function
* @param {string} path
* @param {Object} [options]
* @return {fs.WriteStream}
* @alias module:storage.writeFileStream
*/
const writeFileStream = fs.createWriteStream
/**
* Node.js' [fsPromises.unlink]{@link https://nodejs.org/api/fs.html#fspromisesunlinkpath}.
* @function
* @param {string} path
* @return {Promise<void>}
* @async
* @alias module:storage.unlinkAsync
*/
const unlinkAsync = fsPromises.unlink
/**
* Node.js' [fsPromises.appendFile]{@link https://nodejs.org/api/fs.html#fspromisesappendfilepath-data-options}.
* @function
* @param {string} path
* @param {string} data
* @param {object} [options]
* @return {Promise<void>}
* @alias module:storage.appendFileAsync
* @async
*/
const appendFileAsync = fsPromises.appendFile
/**
* Node.js' [fsPromises.readFile]{@link https://nodejs.org/api/fs.html#fspromisesreadfilepath-options}.
* @function
* @param {string} path
* @param {object} [options]
* @return {Promise<Buffer>}
* @alias module:storage.readFileAsync
* @async
*/
const readFileAsync = fsPromises.readFile
/**
* Node.js' [fs.createReadStream]{@link https://nodejs.org/api/fs.html#fscreatereadstreampath-options}.
* @function
* @param {string} path
* @param {Object} [options]
* @return {fs.ReadStream}
* @alias module:storage.readFileStream
*/
const readFileStream = fs.createReadStream
/**
* Node.js' [fsPromises.mkdir]{@link https://nodejs.org/api/fs.html#fspromisesmkdirpath-options}.
* @function
* @param {string} path
* @param {object} options
* @return {Promise<void|string>}
* @alias module:storage.mkdirAsync
* @async
*/
const mkdirAsync = fsPromises.mkdir
/**
* Removes file if it exists.
* @param {string} file
* @return {Promise<void>}
* @alias module:storage.ensureFileDoesntExistAsync
* @async
*/
const ensureFileDoesntExistAsync = async file => {
if (await existsAsync(file)) await unlinkAsync(file)
} }
/** /**
* Flush data in OS buffer to storage if corresponding option is set * Flush data in OS buffer to storage if corresponding option is set.
* @param {String} options.filename * @param {object|string} options If options is a string, it is assumed that the flush of the file (not dir) called options was requested
* @param {Boolean} options.isDir Optional, defaults to false * @param {string} [options.filename]
* If options is a string, it is assumed that the flush of the file (not dir) called options was requested * @param {boolean} [options.isDir = false] Optional, defaults to false
* @return {Promise<void>}
* @alias module:storage.flushToStorageAsync
* @async
*/ */
storage.flushToStorage = (options, callback) => { const flushToStorageAsync = async (options) => {
let filename let filename
let flags let flags
if (typeof options === 'string') { if (typeof options === 'string') {
@@ -62,102 +153,136 @@ storage.flushToStorage = (options, callback) => {
* database is loaded and a crash happens. * database is loaded and a crash happens.
*/ */
fs.open(filename, flags, (err, fd) => { let filehandle, errorOnFsync, errorOnClose
if (err) { try {
return callback((err.code === 'EISDIR' && options.isDir) ? null : err) filehandle = await fsPromises.open(filename, flags)
try {
await filehandle.sync()
} catch (errFS) {
errorOnFsync = errFS
}
} catch (error) {
if (error.code !== 'EISDIR' || !options.isDir) throw error
} finally {
try {
if (filehandle) await filehandle.close() // filehandle is undefined if open() itself failed
} catch (errC) {
errorOnClose = errC
}
} }
fs.fsync(fd, errFS => { if ((errorOnFsync || errorOnClose) && !(((errorOnFsync && errorOnFsync.code === 'EPERM') || (errorOnClose && errorOnClose.code === 'EISDIR')) && options.isDir)) {
fs.close(fd, errC => {
if ((errFS || errC) && !((errFS.code === 'EPERM' || errFS.code === 'EISDIR') && options.isDir)) {
const e = new Error('Failed to flush to storage') const e = new Error('Failed to flush to storage')
e.errorOnFsync = errFS e.errorOnFsync = errorOnFsync
e.errorOnClose = errC e.errorOnClose = errorOnClose
return callback(e) throw e
} else {
return callback(null)
} }
})
})
})
} }
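
In isolation, the durability steps performed above look like this (POSIX semantics, illustrative path):

const fh = await fsPromises.open('workspace/test.db', 'r+')
await fh.sync() // force file contents and metadata out of the OS cache onto disk
await fh.close()
// a rename is only durable once the containing directory is fsynced as well,
// hence the { filename: dir, isDir: true } form accepted by flushToStorageAsync
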
/** /**
* Fully write or rewrite the datafile * Fully write or rewrite the datafile.
* @param {String} filename * @param {string} filename
* @param {String[]} lines * @param {string[]} lines
* @param {Function} callback * @return {Promise<void>}
* @alias module:storage.writeFileLinesAsync
* @async
*/ */
storage.writeFileLines = (filename, lines, callback = () => {}) => { const writeFileLinesAsync = (filename, lines) => new Promise((resolve, reject) => {
try { try {
const stream = fs.createWriteStream(filename) const stream = writeFileStream(filename)
const readable = Readable.from(lines) const readable = Readable.from(lines)
readable.on('data', (line) => { readable.on('data', (line) => {
try { try {
stream.write(line) stream.write(line + '\n')
stream.write('\n')
} catch (err) { } catch (err) {
callback(err) reject(err)
} }
}) })
readable.on('end', () => { readable.on('end', () => {
stream.close(callback) stream.close(err => {
if (err) reject(err)
else resolve()
})
})
readable.on('error', err => {
reject(err)
})
stream.on('error', err => {
reject(err)
}) })
readable.on('error', callback)
stream.on('error', callback)
} catch (err) { } catch (err) {
callback(err) reject(err)
} }
} })
/** /**
* Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost) * Fully write or rewrite the datafile, immune to crashes during the write operation (data will not be lost).
* @param {String} filename * @param {string} filename
* @param {String[]} lines * @param {string[]} lines
* @param {Function} callback Optional callback, signature: err * @return {Promise<void>}
* @alias module:storage.crashSafeWriteFileLinesAsync
*/ */
storage.crashSafeWriteFileLines = (filename, lines, callback = () => {}) => { const crashSafeWriteFileLinesAsync = async (filename, lines) => {
const tempFilename = filename + '~' const tempFilename = filename + '~'
async.waterfall([ await flushToStorageAsync({ filename: path.dirname(filename), isDir: true })
async.apply(storage.flushToStorage, { filename: path.dirname(filename), isDir: true }),
cb => { const exists = await existsAsync(filename)
storage.exists(filename, exists => { if (exists) await flushToStorageAsync({ filename })
if (exists) storage.flushToStorage(filename, err => cb(err))
else return cb() await writeFileLinesAsync(tempFilename, lines)
})
}, await flushToStorageAsync(tempFilename)
cb => {
storage.writeFileLines(tempFilename, lines, cb) await renameAsync(tempFilename, filename)
},
async.apply(storage.flushToStorage, tempFilename), await flushToStorageAsync({ filename: path.dirname(filename), isDir: true })
cb => {
storage.rename(tempFilename, filename, err => cb(err))
},
async.apply(storage.flushToStorage, { filename: path.dirname(filename), isDir: true })
], err => callback(err))
} }
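
This is the classic write-temp-then-rename recipe; stripped of NeDB specifics it amounts to (sketch, hypothetical helper name):

const atomicWriteAsync = async (filename, data) => {
  const temp = filename + '~'
  await fsPromises.writeFile(temp, data) // 1. write the new content next to the target
  await fsPromises.rename(temp, filename) // 2. atomically swap it into place
  // 3. flushing the temp file before the rename and the directory after it, as done
  // above, ensures neither the content nor the directory entry is lost on a crash
}
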
/** /**
* Ensure the datafile contains all the data, even if there was a crash during a full file write * Ensure the datafile contains all the data, even if there was a crash during a full file write.
* @param {String} filename * @param {string} filename
* @param {Function} callback signature: err * @return {Promise<void>}
* @alias module:storage.ensureDatafileIntegrityAsync
*/ */
storage.ensureDatafileIntegrity = (filename, callback) => { const ensureDatafileIntegrityAsync = async filename => {
const tempFilename = filename + '~' const tempFilename = filename + '~'
storage.exists(filename, filenameExists => { const filenameExists = await existsAsync(filename)
// Write was successful // Write was successful
if (filenameExists) return callback(null) if (filenameExists) return
storage.exists(tempFilename, oldFilenameExists => { const oldFilenameExists = await existsAsync(tempFilename)
// New database // New database
if (!oldFilenameExists) return storage.writeFile(filename, '', 'utf8', err => { callback(err) }) if (!oldFilenameExists) await writeFileAsync(filename, '', 'utf8')
// Write failed, use old version // Write failed, use old version
storage.rename(tempFilename, filename, err => callback(err)) else await renameAsync(tempFilename, filename)
})
})
} }
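
Spelled out, the recovery decision above covers three states:

// datafile exists?  temp file exists?  action taken
// yes                (any)              nothing: the previous write completed
// no                 no                 new database: create an empty datafile
// no                 yes                crash happened mid-rename: promote the temp file
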
// Interface // Interface
module.exports = storage module.exports.existsAsync = existsAsync
module.exports.renameAsync = renameAsync
module.exports.writeFileAsync = writeFileAsync
module.exports.writeFileLinesAsync = writeFileLinesAsync
module.exports.crashSafeWriteFileLinesAsync = crashSafeWriteFileLinesAsync
module.exports.appendFileAsync = appendFileAsync
module.exports.readFileAsync = readFileAsync
module.exports.unlinkAsync = unlinkAsync
module.exports.mkdirAsync = mkdirAsync
module.exports.readFileStream = readFileStream
module.exports.flushToStorageAsync = flushToStorageAsync
module.exports.ensureDatafileIntegrityAsync = ensureDatafileIntegrityAsync
module.exports.ensureFileDoesntExistAsync = ensureFileDoesntExistAsync

@@ -1,15 +1,62 @@
const uniq = (array, iterator) => { /**
if (iterator) return [...(new Map(array.map(x => [iterator(x), x]))).values()] * Utility functions for all environments.
else return [...new Set(array)] * This replaces the underscore dependency.
} *
* @module utils
* @private
*/
const objectToString = o => Object.prototype.toString.call(o) /**
* @callback IterateeFunction
* @param {*} arg
* @return {*}
*/
/**
* Produces a duplicate-free version of the array, using === to test object equality. In particular only the first
* occurrence of each value is kept. If you want to compute unique items based on a transformation, pass an iteratee
* function.
*
* Heavily inspired by {@link https://underscorejs.org/#uniq}.
* @param {Array} array
* @param {IterateeFunction} [iteratee] transformation applied to every element before checking for duplicates. This will not
* transform the items in the result.
* @return {Array}
* @alias module:utils.uniq
*/
const uniq = (array, iteratee) => {
if (!iteratee) return [...new Set(array)]
// A Map built from [key, item] pairs keeps the last item per key, so only insert
// unseen keys in order to keep the first occurrence, as documented above
const seen = new Map()
for (const x of array) { const key = iteratee(x); if (!seen.has(key)) seen.set(key, x) }
return [...seen.values()]
}
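
For example (illustrative values):

uniq([1, 2, 2, 3]) // [1, 2, 3]
uniq(['a', 'bb', 'a'], s => s.length) // ['a', 'bb'] (items sharing a key count as duplicates)
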
/**
* Returns true if arg is an Object. Note that JavaScript arrays and functions are objects, while (normal) strings
* and numbers are not.
*
* Heavily inspired by {@link https://underscorejs.org/#isObject}.
* @param {*} arg
* @return {boolean}
*/
const isObject = arg => typeof arg === 'object' && arg !== null const isObject = arg => typeof arg === 'object' && arg !== null
const isDate = d => isObject(d) && objectToString(d) === '[object Date]' /**
* Returns true if d is a Date.
*
* Heavily inspired by {@link https://underscorejs.org/#isDate}.
* @param {*} d
* @return {boolean}
* @alias module:utils.isDate
*/
const isDate = d => isObject(d) && Object.prototype.toString.call(d) === '[object Date]'
const isRegExp = re => isObject(re) && objectToString(re) === '[object RegExp]' /**
* Returns true if re is a RegExp.
*
* Heavily inspired by {@link https://underscorejs.org/#isRegExp}.
* @param {*} re
* @return {boolean}
* @alias module:utils.isRegExp
*/
const isRegExp = re => isObject(re) && Object.prototype.toString.call(re) === '[object RegExp]'
module.exports.uniq = uniq module.exports.uniq = uniq
module.exports.isDate = isDate module.exports.isDate = isDate

@@ -0,0 +1,48 @@
/**
* Responsible for sequentially executing actions on the database
* @private
*/
class Waterfall {
/**
* Instantiate a new Waterfall.
*/
constructor () {
/**
* This is the internal Promise object which resolves when all the tasks of the `Waterfall` are done.
*
* It will change any time `this.waterfall` is called.
*
* @type {Promise}
*/
this.guardian = Promise.resolve()
}
/**
* Returns a function which runs `func` once all previously queued tasks have settled, and propagates its result or rejection to the caller.
* @param {AsyncFunction} func
* @return {AsyncFunction}
*/
waterfall (func) {
return (...args) => {
this.guardian = this.guardian.then(() => {
return func(...args)
.then(result => ({ error: false, result }), result => ({ error: true, result }))
})
return this.guardian.then(({ error, result }) => {
if (error) return Promise.reject(result)
else return Promise.resolve(result)
})
}
}
/**
* Shorthand for chaining a promise to the Waterfall
* @param {Promise} promise
* @return {Promise}
*/
chain (promise) {
return this.waterfall(() => promise)()
}
}
module.exports = Waterfall
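
A short usage sketch of the class above (task bodies are illustrative):

const Waterfall = require('./lib/waterfall')
const queue = new Waterfall()
const double = queue.waterfall(async x => x * 2) // each call queues behind prior tasks
const p1 = queue.chain(new Promise(resolve => setTimeout(() => resolve('first'), 50)))
const p2 = double(21) // starts only once p1 has settled
Promise.all([p1, p2]).then(console.log) // ['first', 42]
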

15041
package-lock.json generated

File diff suppressed because it is too large

@@ -1,6 +1,6 @@
{ {
"name": "@seald-io/nedb", "name": "@seald-io/nedb",
"version": "2.2.1", "version": "3.0.0-6",
"files": [ "files": [
"lib/**/*.js", "lib/**/*.js",
"browser-version/**/*.js", "browser-version/**/*.js",
@@ -42,8 +42,8 @@
}, },
"dependencies": { "dependencies": {
"@seald-io/binary-search-tree": "^1.0.2", "@seald-io/binary-search-tree": "^1.0.2",
"async": "0.2.10", "localforage": "^1.9.0",
"localforage": "^1.9.0" "util": "^0.12.4"
}, },
"devDependencies": { "devDependencies": {
"@react-native-async-storage/async-storage": "^1.15.9", "@react-native-async-storage/async-storage": "^1.15.9",
@@ -53,17 +53,18 @@
"commander": "^7.2.0", "commander": "^7.2.0",
"events": "^3.3.0", "events": "^3.3.0",
"jest": "^27.3.1", "jest": "^27.3.1",
"jquery": "^3.6.0", "jsdoc-to-markdown": "^7.1.0",
"karma": "^6.3.2", "karma": "^6.3.2",
"karma-chai": "^0.1.0", "karma-chai": "^0.1.0",
"karma-chrome-launcher": "^3.1.0", "karma-chrome-launcher": "^3.1.0",
"karma-junit-reporter": "^2.0.1", "karma-junit-reporter": "^2.0.1",
"karma-mocha": "^2.0.1", "karma-mocha": "^2.0.1",
"karma-source-map-support": "^1.4.0", "karma-source-map-support": "^1.4.0",
"mocha": "^8.4.0", "mocha": "^9.1.3",
"mocha-junit-reporter": "^2.0.0", "mocha-junit-reporter": "^2.0.0",
"path-browserify": "^1.0.1", "path-browserify": "^1.0.1",
"process": "^0.11.10", "process": "^0.11.10",
"react-native": "^0.66.0",
"semver": "^7.3.5", "semver": "^7.3.5",
"source-map-loader": "^2.0.2", "source-map-loader": "^2.0.2",
"standard": "^16.0.3", "standard": "^16.0.3",
@@ -84,7 +85,8 @@
"test:browser": "xvfb-maybe karma start karma.conf.local.js", "test:browser": "xvfb-maybe karma start karma.conf.local.js",
"test:react-native": "jest test/react-native", "test:react-native": "jest test/react-native",
"test:typings": "ts-node ./typings-tests.ts", "test:typings": "ts-node ./typings-tests.ts",
"prepublishOnly": "npm run build:browser" "prepublishOnly": "npm run build:browser",
"generateDocs:markdown": "jsdoc2md --no-cache -c jsdoc.conf.js --param-list-format list --files ./lib/*.js > API.md"
}, },
"main": "index.js", "main": "index.js",
"browser": { "browser": {

@@ -1,5 +1,5 @@
/* eslint-env mocha, browser */ /* eslint-env mocha, browser */
/* global async, Nedb, localforage */ /* global Nedb, localforage, testUtils */
const N = 5000 const N = 5000
const db = new Nedb({ filename: 'loadTest', autoload: true }) const db = new Nedb({ filename: 'loadTest', autoload: true })
@@ -9,7 +9,7 @@ const sample = JSON.stringify({ data: Math.random(), _id: Math.random() })
const someInserts = (sn, N, callback) => { const someInserts = (sn, N, callback) => {
const beg = Date.now() const beg = Date.now()
let i = 0 let i = 0
async.whilst(() => i < N, _cb => { testUtils.whilst(() => i < N, _cb => {
db.insert({ data: Math.random() }, err => { i += 1; return _cb(err) }) db.insert({ data: Math.random() }, err => { i += 1; return _cb(err) })
}, err => { }, err => {
console.log('Inserts, series ' + sn + ' ' + (Date.now() - beg)) console.log('Inserts, series ' + sn + ' ' + (Date.now() - beg))
@@ -41,7 +41,7 @@ const someLSDiff = (sn, N, callback) => {
function someLF (sn, N, callback) { function someLF (sn, N, callback) {
const beg = Date.now() const beg = Date.now()
let i = 0 let i = 0
async.whilst(() => i < N, _cb => { testUtils.whilst(() => i < N, _cb => {
localforage.getItem('loadTestLF', (err, value) => { localforage.getItem('loadTestLF', (err, value) => {
if (err) return _cb(err) if (err) return _cb(err)
localforage.setItem('loadTestLF', value + sample, err => { i += 1; return _cb(err) }) localforage.setItem('loadTestLF', value + sample, err => { i += 1; return _cb(err) })
@@ -56,7 +56,7 @@ function someLF (sn, N, callback) {
const someLFDiff = (sn, N, callback) => { const someLFDiff = (sn, N, callback) => {
const beg = Date.now() const beg = Date.now()
let i = 0 let i = 0
async.whilst(() => i < N, _cb => { testUtils.whilst(() => i < N, _cb => {
localforage.setItem('loadTestLF-' + i, sample, err => { i += 1; return _cb(err) }) localforage.setItem('loadTestLF-' + i, sample, err => { i += 1; return _cb(err) })
}, err => { }, err => {
console.log('localForage/IDB, series ' + sn + ' ' + (Date.now() - beg)) console.log('localForage/IDB, series ' + sn + ' ' + (Date.now() - beg))
@@ -73,53 +73,53 @@ describe.skip('Load tests', function () {
}) })
it.skip('Inserts', function (done) { it.skip('Inserts', function (done) {
async.waterfall([ testUtils.waterfall([
// Slow and gets slower with database size // Slow and gets slower with database size
async.apply(someInserts, '#1', N), // N=5000, 141s testUtils.apply(someInserts, '#1', N), // N=5000, 141s
async.apply(someInserts, '#2', N), // N=5000, 208s testUtils.apply(someInserts, '#2', N), // N=5000, 208s
async.apply(someInserts, '#3', N), // N=5000, 281s testUtils.apply(someInserts, '#3', N), // N=5000, 281s
async.apply(someInserts, '#4', N) // N=5000, 350s testUtils.apply(someInserts, '#4', N) // N=5000, 350s
], done) ], done)
}) })
it.skip('Localstorage', function (done) { it.skip('Localstorage', function (done) {
async.waterfall([ testUtils.waterfall([
// Slow and gets slower really fast with database size, then outright crashes // Slow and gets slower really fast with database size, then outright crashes
async.apply(someLS, '#1', N), // N=4000, 2.5s testUtils.apply(someLS, '#1', N), // N=4000, 2.5s
async.apply(someLS, '#2', N), // N=4000, 8.0s testUtils.apply(someLS, '#2', N), // N=4000, 8.0s
async.apply(someLS, '#3', N), // N=4000, 26.5s testUtils.apply(someLS, '#3', N), // N=4000, 26.5s
async.apply(someLS, '#4', N) // N=4000, 47.8s then crash, can't get string (with N=5000 crash happens on second pass) testUtils.apply(someLS, '#4', N) // N=4000, 47.8s then crash, can't get string (with N=5000 crash happens on second pass)
], done) ], done)
}) })
it.skip('Localstorage Diff', function (done) { it.skip('Localstorage Diff', function (done) {
async.waterfall([ testUtils.waterfall([
// Much faster and more consistent // Much faster and more consistent
async.apply(someLSDiff, '#1', N), // N=50000, 0.7s testUtils.apply(someLSDiff, '#1', N), // N=50000, 0.7s
async.apply(someLSDiff, '#2', N), // N=50000, 0.5s testUtils.apply(someLSDiff, '#2', N), // N=50000, 0.5s
async.apply(someLSDiff, '#3', N), // N=50000, 0.5s testUtils.apply(someLSDiff, '#3', N), // N=50000, 0.5s
async.apply(someLSDiff, '#4', N) // N=50000, 0.5s testUtils.apply(someLSDiff, '#4', N) // N=50000, 0.5s
], done) ], done)
}) })
it.skip('LocalForage', function (done) { it.skip('LocalForage', function (done) {
async.waterfall([ testUtils.waterfall([
// Slow and gets slower with database size // Slow and gets slower with database size
cb => { localforage.setItem('loadTestLF', '', err => cb(err)) }, cb => { localforage.setItem('loadTestLF', '', err => cb(err)) },
async.apply(someLF, '#1', N), // N=5000, 69s testUtils.apply(someLF, '#1', N), // N=5000, 69s
async.apply(someLF, '#2', N), // N=5000, 108s testUtils.apply(someLF, '#2', N), // N=5000, 108s
async.apply(someLF, '#3', N), // N=5000, 137s testUtils.apply(someLF, '#3', N), // N=5000, 137s
async.apply(someLF, '#4', N) // N=5000, 169s testUtils.apply(someLF, '#4', N) // N=5000, 169s
], done) ], done)
}) })
it.skip('LocalForage diff', function (done) { it.skip('LocalForage diff', function (done) {
async.waterfall([ testUtils.waterfall([
// Quite fast and speed doesn't change with database size (tested with N=10000 and N=50000, still no slow-down) // Quite fast and speed doesn't change with database size (tested with N=10000 and N=50000, still no slow-down)
async.apply(someLFDiff, '#1', N), // N=5000, 18s testUtils.apply(someLFDiff, '#1', N), // N=5000, 18s
async.apply(someLFDiff, '#2', N), // N=5000, 18s testUtils.apply(someLFDiff, '#2', N), // N=5000, 18s
async.apply(someLFDiff, '#3', N), // N=5000, 18s testUtils.apply(someLFDiff, '#3', N), // N=5000, 18s
async.apply(someLFDiff, '#4', N) // N=5000, 18s testUtils.apply(someLFDiff, '#4', N) // N=5000, 18s
], done) ], done)
}) })
}) })

@@ -1,5 +1,5 @@
/* eslint-env mocha */ /* eslint-env mocha */
/* global chai, Nedb */ /* global chai, Nedb, testUtils */
/** /**
* Testing the browser version of NeDB * Testing the browser version of NeDB
@@ -265,7 +265,7 @@ describe('Indexing', function () {
db.insert({ a: 6 }, function () { db.insert({ a: 6 }, function () {
db.insert({ a: 7 }, function () { db.insert({ a: 7 }, function () {
// eslint-disable-next-line node/handle-callback-err // eslint-disable-next-line node/handle-callback-err
db.getCandidates({ a: 6 }, function (err, candidates) { testUtils.callbackify(query => db._getCandidatesAsync(query))({ a: 6 }, function (err, candidates) {
assert.strictEqual(candidates.length, 3) assert.strictEqual(candidates.length, 3)
assert.isDefined(candidates.find(function (doc) { return doc.a === 4 })) assert.isDefined(candidates.find(function (doc) { return doc.a === 4 }))
assert.isDefined(candidates.find(function (doc) { return doc.a === 6 })) assert.isDefined(candidates.find(function (doc) { return doc.a === 6 }))
@@ -274,7 +274,7 @@ describe('Indexing', function () {
db.ensureIndex({ fieldName: 'a' }) db.ensureIndex({ fieldName: 'a' })
// eslint-disable-next-line node/handle-callback-err // eslint-disable-next-line node/handle-callback-err
db.getCandidates({ a: 6 }, function (err, candidates) { testUtils.callbackify(query => db._getCandidatesAsync(query))({ a: 6 }, function (err, candidates) {
assert.strictEqual(candidates.length, 1) assert.strictEqual(candidates.length, 1)
assert.isDefined(candidates.find(function (doc) { return doc.a === 6 })) assert.isDefined(candidates.find(function (doc) { return doc.a === 6 }))
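
The substitution used throughout these tests, shown in isolation (sketch; the Node tests use util.callbackify, the browser ones a testUtils equivalent):

const { callbackify } = require('util')
// old: db.getCandidates(query, cb); new: the private async method adapted back to a callback
const getCandidates = callbackify(query => db._getCandidatesAsync(query))
getCandidates({ a: 6 }, (err, candidates) => { /* same signature as before */ })
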

@@ -46,7 +46,7 @@ describe('byline', function () {
it('should work with streams2 API', function (done) { it('should work with streams2 API', function (done) {
let stream = fs.createReadStream(localPath('empty.txt')) let stream = fs.createReadStream(localPath('empty.txt'))
stream = byline.createStream(stream) stream = byline(stream)
stream.on('readable', function () { stream.on('readable', function () {
while (stream.read() !== null) { while (stream.read() !== null) {

@@ -0,0 +1,519 @@
/* eslint-env mocha */
const testDb = 'workspace/test.db'
const { promises: fs } = require('fs')
const assert = require('assert').strict
const path = require('path')
const Datastore = require('../lib/datastore')
const Persistence = require('../lib/persistence')
const Cursor = require('../lib/cursor')
const { exists } = require('./utils.test.js')
describe('Cursor Async', function () {
let d
beforeEach(async () => {
d = new Datastore({ filename: testDb })
assert.equal(d.filename, testDb)
assert.equal(d.inMemoryOnly, false)
await Persistence.ensureDirectoryExistsAsync(path.dirname(testDb))
if (await exists(testDb)) await fs.unlink(testDb)
await d.loadDatabaseAsync()
assert.equal(d.getAllData().length, 0)
})
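
Note that the assertions below await a Cursor instance directly, which implies the reworked Cursor is thenable, along these lines (sketch; the internal member names are assumptions):

class ThenableCursor { // illustrative stand-in for the real lib/cursor.js
  constructor (exec) { this._exec = exec } // exec: () => Promise<document[]>
  then (onFulfilled, onRejected) { return this._exec().then(onFulfilled, onRejected) }
}
// await new ThenableCursor(() => Promise.resolve([{ age: 5 }])) resolves to [{ age: 5 }]
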
describe('Without sorting', function () {
beforeEach(async () => {
await d.insertAsync({ age: 5 })
await d.insertAsync({ age: 57 })
await d.insertAsync({ age: 52 })
await d.insertAsync({ age: 23 })
await d.insertAsync({ age: 89 })
})
it('Without query, an empty query or a simple query and no skip or limit', async () => {
const cursor = new Cursor(d)
const docs = await cursor
assert.equal(docs.length, 5)
assert.equal(docs.filter(function (doc) { return doc.age === 5 })[0].age, 5)
assert.equal(docs.filter(function (doc) { return doc.age === 57 })[0].age, 57)
assert.equal(docs.filter(function (doc) { return doc.age === 52 })[0].age, 52)
assert.equal(docs.filter(function (doc) { return doc.age === 23 })[0].age, 23)
assert.equal(docs.filter(function (doc) { return doc.age === 89 })[0].age, 89)
const cursor2 = new Cursor(d, {})
const docs2 = await cursor2
assert.equal(docs2.length, 5)
assert.equal(docs2.filter(function (doc) { return doc.age === 5 })[0].age, 5)
assert.equal(docs2.filter(function (doc) { return doc.age === 57 })[0].age, 57)
assert.equal(docs2.filter(function (doc) { return doc.age === 52 })[0].age, 52)
assert.equal(docs2.filter(function (doc) { return doc.age === 23 })[0].age, 23)
assert.equal(docs2.filter(function (doc) { return doc.age === 89 })[0].age, 89)
const cursor3 = new Cursor(d, { age: { $gt: 23 } })
const docs3 = await cursor3
assert.equal(docs3.length, 3)
assert.equal(docs3.filter(function (doc) { return doc.age === 57 })[0].age, 57)
assert.equal(docs3.filter(function (doc) { return doc.age === 52 })[0].age, 52)
assert.equal(docs3.filter(function (doc) { return doc.age === 89 })[0].age, 89)
})
it('With an empty collection', async () => {
await d.removeAsync({}, { multi: true })
const cursor = new Cursor(d)
const docs = await cursor
assert.equal(docs.length, 0)
})
it('With a limit', async () => {
const cursor = new Cursor(d)
cursor.limit(3)
const docs = await cursor
assert.equal(docs.length, 3)
// No way to predict which results are returned of course ...
})
it('With a skip', async () => {
const cursor = new Cursor(d)
const docs = await cursor.skip(2)
assert.equal(docs.length, 3)
// No way to predict which results are returned of course ...
})
it('With a limit and a skip and method chaining', async () => {
const cursor = new Cursor(d)
cursor.limit(4).skip(3) // Only way to know that the right number of results was skipped is if limit + skip > number of results
const docs = await cursor
assert.equal(docs.length, 2)
// No way to predict which results are returned of course ...
})
}) // ===== End of 'Without sorting' =====
describe('Sorting of the results', function () {
beforeEach(async () => {
// We don't know the order in which docs will be inserted but we ensure correctness by testing both sort orders
await d.insertAsync({ age: 5 })
await d.insertAsync({ age: 57 })
await d.insertAsync({ age: 52 })
await d.insertAsync({ age: 23 })
await d.insertAsync({ age: 89 })
})
it('Using one sort', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 })
const docs = await cursor
// Results are in ascending order
for (let i = 0; i < docs.length - 1; i += 1) {
assert(docs[i].age < docs[i + 1].age)
}
cursor.sort({ age: -1 })
const docs2 = await cursor
// Results are in descending order
for (let i = 0; i < docs2.length - 1; i += 1) {
assert(docs2[i].age > docs2[i + 1].age)
}
})
it('Sorting strings with custom string comparison function', async () => {
const db = new Datastore({
inMemoryOnly: true,
autoload: true,
compareStrings: function (a, b) { return a.length - b.length }
})
await db.insertAsync({ name: 'alpha' })
await db.insertAsync({ name: 'charlie' })
await db.insertAsync({ name: 'zulu' })
const docs = await db.findAsync({}).sort({ name: 1 })
assert.equal(docs.map(x => x.name)[0], 'zulu')
assert.equal(docs.map(x => x.name)[1], 'alpha')
assert.equal(docs.map(x => x.name)[2], 'charlie')
delete db.compareStrings
const docs2 = await db.findAsync({}).sort({ name: 1 })
assert.equal(docs2.map(x => x.name)[0], 'alpha')
assert.equal(docs2.map(x => x.name)[1], 'charlie')
assert.equal(docs2.map(x => x.name)[2], 'zulu')
})
it('With an empty collection', async () => {
await d.removeAsync({}, { multi: true })
const cursor = new Cursor(d)
cursor.sort({ age: 1 })
const docs = await cursor
assert.equal(docs.length, 0)
})
it('Ability to chain sorting and exec', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 })
// Results are in ascending order
for (let i = 0; i < docs.length - 1; i += 1) {
assert.ok(docs[i].age < docs[i + 1].age)
}
const cursor2 = new Cursor(d)
const docs2 = await cursor2.sort({ age: -1 })
// Results are in descending order
for (let i = 0; i < docs2.length - 1; i += 1) {
assert(docs2[i].age > docs2[i + 1].age)
}
})
it('Using limit and sort', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 }).limit(3)
assert.equal(docs.length, 3)
assert.equal(docs[0].age, 5)
assert.equal(docs[1].age, 23)
assert.equal(docs[2].age, 52)
const cursor2 = new Cursor(d)
const docs2 = await cursor2.sort({ age: -1 }).limit(2)
assert.equal(docs2.length, 2)
assert.equal(docs2[0].age, 89)
assert.equal(docs2[1].age, 57)
})
it('Using a limit higher than total number of docs shouldn\'t cause an error', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 }).limit(7)
assert.equal(docs.length, 5)
assert.equal(docs[0].age, 5)
assert.equal(docs[1].age, 23)
assert.equal(docs[2].age, 52)
assert.equal(docs[3].age, 57)
assert.equal(docs[4].age, 89)
})
it('Using limit and skip with sort', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 }).limit(1).skip(2)
assert.equal(docs.length, 1)
assert.equal(docs[0].age, 52)
const cursor2 = new Cursor(d)
const docs2 = await cursor2.sort({ age: 1 }).limit(3).skip(1)
assert.equal(docs2.length, 3)
assert.equal(docs2[0].age, 23)
assert.equal(docs2[1].age, 52)
assert.equal(docs2[2].age, 57)
const cursor3 = new Cursor(d)
const docs3 = await cursor3.sort({ age: -1 }).limit(2).skip(2)
assert.equal(docs3.length, 2)
assert.equal(docs3[0].age, 52)
assert.equal(docs3[1].age, 23)
})
it('Using too big a limit and a skip with sort', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 }).limit(8).skip(2)
assert.equal(docs.length, 3)
assert.equal(docs[0].age, 52)
assert.equal(docs[1].age, 57)
assert.equal(docs[2].age, 89)
})
it('Using too big a skip with sort should return no result', async () => {
const cursor = new Cursor(d)
const docs = await cursor.sort({ age: 1 }).skip(5)
assert.equal(docs.length, 0)
const cursor2 = new Cursor(d)
const docs2 = await cursor2.sort({ age: 1 }).skip(7)
assert.equal(docs2.length, 0)
const cursor3 = new Cursor(d)
const docs3 = await cursor3.sort({ age: 1 }).limit(3).skip(7)
assert.equal(docs3.length, 0)
const cursor4 = new Cursor(d)
const docs4 = await cursor4.sort({ age: 1 }).limit(6).skip(7)
assert.equal(docs4.length, 0)
})
it('Sorting strings', async () => {
await d.removeAsync({}, { multi: true })
await d.insertAsync({ name: 'jako' })
await d.insertAsync({ name: 'jakeb' })
await d.insertAsync({ name: 'sue' })
const cursor = new Cursor(d, {})
const docs = await cursor.sort({ name: 1 })
assert.equal(docs.length, 3)
assert.equal(docs[0].name, 'jakeb')
assert.equal(docs[1].name, 'jako')
assert.equal(docs[2].name, 'sue')
const cursor2 = new Cursor(d, {})
const docs2 = await cursor2.sort({ name: -1 })
assert.equal(docs2.length, 3)
assert.equal(docs2[0].name, 'sue')
assert.equal(docs2[1].name, 'jako')
assert.equal(docs2[2].name, 'jakeb')
})
it('Sorting nested fields with dates', async () => {
await d.removeAsync({}, { multi: true })
const doc1 = await d.insertAsync({ event: { recorded: new Date(400) } })
const doc2 = await d.insertAsync({ event: { recorded: new Date(60000) } })
const doc3 = await d.insertAsync({ event: { recorded: new Date(32) } })
const cursor = new Cursor(d, {})
const docs = await cursor.sort({ 'event.recorded': 1 })
assert.equal(docs.length, 3)
assert.equal(docs[0]._id, doc3._id)
assert.equal(docs[1]._id, doc1._id)
assert.equal(docs[2]._id, doc2._id)
const cursor2 = new Cursor(d, {})
const docs2 = await cursor2.sort({ 'event.recorded': -1 })
assert.equal(docs2.length, 3)
assert.equal(docs2[0]._id, doc2._id)
assert.equal(docs2[1]._id, doc1._id)
assert.equal(docs2[2]._id, doc3._id)
})
it('Sorting when some fields are undefined', async () => {
await d.removeAsync({}, { multi: true })
await d.insertAsync({ name: 'jako', other: 2 })
await d.insertAsync({ name: 'jakeb', other: 3 })
await d.insertAsync({ name: 'sue' })
await d.insertAsync({ name: 'henry', other: 4 })
const cursor = new Cursor(d, {})
// eslint-disable-next-line node/handle-callback-err
const docs = await cursor.sort({ other: 1 })
assert.equal(docs.length, 4)
assert.equal(docs[0].name, 'sue')
assert.equal(docs[0].other, undefined)
assert.equal(docs[1].name, 'jako')
assert.equal(docs[1].other, 2)
assert.equal(docs[2].name, 'jakeb')
assert.equal(docs[2].other, 3)
assert.equal(docs[3].name, 'henry')
assert.equal(docs[3].other, 4)
const cursor2 = new Cursor(d, { name: { $in: ['suzy', 'jakeb', 'jako'] } })
const docs2 = await cursor2.sort({ other: -1 })
assert.equal(docs2.length, 2)
assert.equal(docs2[0].name, 'jakeb')
assert.equal(docs2[0].other, 3)
assert.equal(docs2[1].name, 'jako')
assert.equal(docs2[1].other, 2)
})
it('Sorting when all fields are undefined', async () => {
await d.removeAsync({}, { multi: true })
await d.insertAsync({ name: 'jako' })
await d.insertAsync({ name: 'jakeb' })
await d.insertAsync({ name: 'sue' })
const cursor = new Cursor(d, {})
const docs = await cursor.sort({ other: 1 })
assert.equal(docs.length, 3)
const cursor2 = new Cursor(d, { name: { $in: ['sue', 'jakeb', 'jakob'] } })
const docs2 = await cursor2.sort({ other: -1 })
assert.equal(docs2.length, 2)
})
it('Multiple consecutive sorts', async () => {
await d.removeAsync({}, { multi: true })
await d.insertAsync({ name: 'jako', age: 43, nid: 1 })
await d.insertAsync({ name: 'jakeb', age: 43, nid: 2 })
await d.insertAsync({ name: 'sue', age: 12, nid: 3 })
await d.insertAsync({ name: 'zoe', age: 23, nid: 4 })
await d.insertAsync({ name: 'jako', age: 35, nid: 5 })
const cursor = new Cursor(d, {})
// eslint-disable-next-line node/handle-callback-err
const docs = await cursor.sort({ name: 1, age: -1 })
assert.equal(docs.length, 5)
assert.equal(docs[0].nid, 2)
assert.equal(docs[1].nid, 1)
assert.equal(docs[2].nid, 5)
assert.equal(docs[3].nid, 3)
assert.equal(docs[4].nid, 4)
const cursor2 = new Cursor(d, {})
const docs2 = await cursor2.sort({ name: 1, age: 1 })
assert.equal(docs2.length, 5)
assert.equal(docs2[0].nid, 2)
assert.equal(docs2[1].nid, 5)
assert.equal(docs2[2].nid, 1)
assert.equal(docs2[3].nid, 3)
assert.equal(docs2[4].nid, 4)
const cursor3 = new Cursor(d, {})
const docs3 = await cursor3.sort({ age: 1, name: 1 })
assert.equal(docs3.length, 5)
assert.equal(docs3[0].nid, 3)
assert.equal(docs3[1].nid, 4)
assert.equal(docs3[2].nid, 5)
assert.equal(docs3[3].nid, 2)
assert.equal(docs3[4].nid, 1)
const cursor4 = new Cursor(d, {})
const docs4 = await cursor4.sort({ age: 1, name: -1 })
assert.equal(docs4.length, 5)
assert.equal(docs4[0].nid, 3)
assert.equal(docs4[1].nid, 4)
assert.equal(docs4[2].nid, 5)
assert.equal(docs4[3].nid, 1)
assert.equal(docs4[4].nid, 2)
})
it('Similar data, multiple consecutive sorts', async () => {
let id
const companies = ['acme', 'milkman', 'zoinks']
const entities = []
await d.removeAsync({}, { multi: true })
id = 1
for (let i = 0; i < companies.length; i++) {
for (let j = 5; j <= 100; j += 5) {
entities.push({
company: companies[i],
cost: j,
nid: id
})
id++
}
}
await Promise.all(entities.map(entity => d.insertAsync(entity)))
const cursor = new Cursor(d, {})
const docs = await cursor.sort({ company: 1, cost: 1 })
assert.equal(docs.length, 60)
for (let i = 0; i < docs.length; i++) {
assert.equal(docs[i].nid, i + 1)
}
})
}) // ===== End of 'Sorting' =====
describe('Projections', function () {
let doc1
let doc2
let doc3
let doc4
let doc0
beforeEach(async () => {
// We don't know the order in which docs will be inserted but we ensure correctness by testing both sort orders
doc0 = await d.insertAsync({ age: 5, name: 'Jo', planet: 'B', toys: { bebe: true, ballon: 'much' } })
doc1 = await d.insertAsync({ age: 57, name: 'Louis', planet: 'R', toys: { ballon: 'yeah', bebe: false } })
doc2 = await d.insertAsync({ age: 52, name: 'Grafitti', planet: 'C', toys: { bebe: 'kind of' } })
doc3 = await d.insertAsync({ age: 23, name: 'LM', planet: 'S' })
doc4 = await d.insertAsync({ age: 89, planet: 'Earth' })
})
it('Takes all results if no projection or empty object given', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
const docs = await cursor
assert.equal(docs.length, 5)
assert.deepStrictEqual(docs[0], doc0)
assert.deepStrictEqual(docs[1], doc3)
assert.deepStrictEqual(docs[2], doc2)
assert.deepStrictEqual(docs[3], doc1)
assert.deepStrictEqual(docs[4], doc4)
cursor.projection({})
const docs2 = await cursor
assert.equal(docs2.length, 5)
assert.deepStrictEqual(docs2[0], doc0)
assert.deepStrictEqual(docs2[1], doc3)
assert.deepStrictEqual(docs2[2], doc2)
assert.deepStrictEqual(docs2[3], doc1)
assert.deepStrictEqual(docs2[4], doc4)
})
it('Can take only the expected fields', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
cursor.projection({ age: 1, name: 1 })
const docs = await cursor
assert.equal(docs.length, 5)
// Takes the _id by default
assert.deepStrictEqual(docs[0], { age: 5, name: 'Jo', _id: doc0._id })
assert.deepStrictEqual(docs[1], { age: 23, name: 'LM', _id: doc3._id })
assert.deepStrictEqual(docs[2], { age: 52, name: 'Grafitti', _id: doc2._id })
assert.deepStrictEqual(docs[3], { age: 57, name: 'Louis', _id: doc1._id })
assert.deepStrictEqual(docs[4], { age: 89, _id: doc4._id }) // No problems if one field to take doesn't exist
cursor.projection({ age: 1, name: 1, _id: 0 })
const docs2 = await cursor
assert.equal(docs2.length, 5)
assert.deepStrictEqual(docs2[0], { age: 5, name: 'Jo' })
assert.deepStrictEqual(docs2[1], { age: 23, name: 'LM' })
assert.deepStrictEqual(docs2[2], { age: 52, name: 'Grafitti' })
assert.deepStrictEqual(docs2[3], { age: 57, name: 'Louis' })
assert.deepStrictEqual(docs2[4], { age: 89 }) // No problems if one field to take doesn't exist
})
it('Can omit only the expected fields', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
cursor.projection({ age: 0, name: 0 })
const docs = await cursor
assert.equal(docs.length, 5)
// Takes the _id by default
assert.deepStrictEqual(docs[0], { planet: 'B', _id: doc0._id, toys: { bebe: true, ballon: 'much' } })
assert.deepStrictEqual(docs[1], { planet: 'S', _id: doc3._id })
assert.deepStrictEqual(docs[2], { planet: 'C', _id: doc2._id, toys: { bebe: 'kind of' } })
assert.deepStrictEqual(docs[3], { planet: 'R', _id: doc1._id, toys: { bebe: false, ballon: 'yeah' } })
assert.deepStrictEqual(docs[4], { planet: 'Earth', _id: doc4._id })
cursor.projection({ age: 0, name: 0, _id: 0 })
const docs2 = await cursor
assert.equal(docs2.length, 5)
assert.deepStrictEqual(docs2[0], { planet: 'B', toys: { bebe: true, ballon: 'much' } })
assert.deepStrictEqual(docs2[1], { planet: 'S' })
assert.deepStrictEqual(docs2[2], { planet: 'C', toys: { bebe: 'kind of' } })
assert.deepStrictEqual(docs2[3], { planet: 'R', toys: { bebe: false, ballon: 'yeah' } })
assert.deepStrictEqual(docs2[4], { planet: 'Earth' })
})
it('Cannot use both modes except for _id', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
cursor.projection({ age: 1, name: 0 })
await assert.rejects(() => cursor)
cursor.projection({ age: 1, _id: 0 })
const docs = await cursor
assert.deepStrictEqual(docs[0], { age: 5 })
assert.deepStrictEqual(docs[1], { age: 23 })
assert.deepStrictEqual(docs[2], { age: 52 })
assert.deepStrictEqual(docs[3], { age: 57 })
assert.deepStrictEqual(docs[4], { age: 89 })
cursor.projection({ age: 0, toys: 0, planet: 0, _id: 1 })
const docs2 = await cursor
assert.deepStrictEqual(docs2[0], { name: 'Jo', _id: doc0._id })
assert.deepStrictEqual(docs2[1], { name: 'LM', _id: doc3._id })
assert.deepStrictEqual(docs2[2], { name: 'Grafitti', _id: doc2._id })
assert.deepStrictEqual(docs2[3], { name: 'Louis', _id: doc1._id })
assert.deepStrictEqual(docs2[4], { _id: doc4._id })
})
it('Projections on embedded documents - omit type', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
cursor.projection({ name: 0, planet: 0, 'toys.bebe': 0, _id: 0 })
const docs = await cursor
assert.deepStrictEqual(docs[0], { age: 5, toys: { ballon: 'much' } })
assert.deepStrictEqual(docs[1], { age: 23 })
assert.deepStrictEqual(docs[2], { age: 52, toys: {} })
assert.deepStrictEqual(docs[3], { age: 57, toys: { ballon: 'yeah' } })
assert.deepStrictEqual(docs[4], { age: 89 })
})
it('Projections on embedded documents - pick type', async () => {
const cursor = new Cursor(d, {})
cursor.sort({ age: 1 }) // For easier finding
cursor.projection({ name: 1, 'toys.ballon': 1, _id: 0 })
const docs = await cursor
assert.deepStrictEqual(docs[0], { name: 'Jo', toys: { ballon: 'much' } })
assert.deepStrictEqual(docs[1], { name: 'LM' })
assert.deepStrictEqual(docs[2], { name: 'Grafitti' })
assert.deepStrictEqual(docs[3], { name: 'Louis', toys: { ballon: 'yeah' } })
assert.deepStrictEqual(docs[4], {})
})
}) // ==== End of 'Projections' ====
})

@@ -3,10 +3,11 @@ const chai = require('chai')
const testDb = 'workspace/test.db' const testDb = 'workspace/test.db'
const fs = require('fs') const fs = require('fs')
const path = require('path') const path = require('path')
const async = require('async') const { each, waterfall } = require('./utils.test.js')
const Datastore = require('../lib/datastore') const Datastore = require('../lib/datastore')
const Persistence = require('../lib/persistence') const Persistence = require('../lib/persistence')
const Cursor = require('../lib/cursor') const Cursor = require('../lib/cursor')
const { callbackify } = require('util')
const { assert } = chai const { assert } = chai
chai.should() chai.should()
@@ -19,9 +20,9 @@ describe('Cursor', function () {
d.filename.should.equal(testDb) d.filename.should.equal(testDb)
d.inMemoryOnly.should.equal(false) d.inMemoryOnly.should.equal(false)
async.waterfall([ waterfall([
function (cb) { function (cb) {
Persistence.ensureDirectoryExists(path.dirname(testDb), function () { callbackify((dirname) => Persistence.ensureDirectoryExistsAsync(dirname))(path.dirname(testDb), function () {
fs.access(testDb, fs.constants.F_OK, function (err) { fs.access(testDb, fs.constants.F_OK, function (err) {
if (!err) { if (!err) {
fs.unlink(testDb, cb) fs.unlink(testDb, cb)
@@ -29,12 +30,10 @@ describe('Cursor', function () {
}) })
}) })
}, },
function (cb) { async function (cb) {
d.loadDatabase(function (err) { await d.loadDatabaseAsync()
assert.isNull(err)
d.getAllData().length.should.equal(0) d.getAllData().length.should.equal(0)
return cb() cb()
})
} }
], done) ], done)
}) })
@@ -60,7 +59,7 @@ describe('Cursor', function () {
}) })
it('Without query, an empty query or a simple query and no skip or limit', function (done) { it('Without query, an empty query or a simple query and no skip or limit', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.exec(function (err, docs) { cursor.exec(function (err, docs) {
@@ -102,7 +101,7 @@ describe('Cursor', function () {
}) })
it('With an empty collection', function (done) { it('With an empty collection', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { return cb(err) }) d.remove({}, { multi: true }, function (err) { return cb(err) })
}, },
@@ -224,7 +223,7 @@ describe('Cursor', function () {
}) })
it('With an empty collection', function (done) { it('With an empty collection', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { return cb(err) }) d.remove({}, { multi: true }, function (err) { return cb(err) })
}, },
@@ -242,7 +241,7 @@ describe('Cursor', function () {
it('Ability to chain sorting and exec', function (done) { it('Ability to chain sorting and exec', function (done) {
let i let i
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).exec(function (err, docs) { cursor.sort({ age: 1 }).exec(function (err, docs) {
@@ -269,7 +268,7 @@ describe('Cursor', function () {
}) })
it('Using limit and sort', function (done) { it('Using limit and sort', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).limit(3).exec(function (err, docs) { cursor.sort({ age: 1 }).limit(3).exec(function (err, docs) {
@@ -295,7 +294,7 @@ describe('Cursor', function () {
}) })
it('Using a limit higher than total number of docs shouldnt cause an error', function (done) { it('Using a limit higher than total number of docs shouldnt cause an error', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).limit(7).exec(function (err, docs) { cursor.sort({ age: 1 }).limit(7).exec(function (err, docs) {
@@ -313,7 +312,7 @@ describe('Cursor', function () {
}) })
it('Using limit and skip with sort', function (done) { it('Using limit and skip with sort', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).limit(1).skip(2).exec(function (err, docs) { cursor.sort({ age: 1 }).limit(1).skip(2).exec(function (err, docs) {
@@ -348,7 +347,7 @@ describe('Cursor', function () {
}) })
it('Using too big a limit and a skip with sort', function (done) { it('Using too big a limit and a skip with sort', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).limit(8).skip(2).exec(function (err, docs) { cursor.sort({ age: 1 }).limit(8).skip(2).exec(function (err, docs) {
@@ -364,7 +363,7 @@ describe('Cursor', function () {
}) })
it('Using too big a skip with sort should return no result', function (done) { it('Using too big a skip with sort should return no result', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
const cursor = new Cursor(d) const cursor = new Cursor(d)
cursor.sort({ age: 1 }).skip(5).exec(function (err, docs) { cursor.sort({ age: 1 }).skip(5).exec(function (err, docs) {
@@ -401,7 +400,7 @@ describe('Cursor', function () {
}) })
it('Sorting strings', function (done) { it('Sorting strings', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@@ -445,7 +444,7 @@ describe('Cursor', function () {
let doc2 let doc2
let doc3 let doc3
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@@ -491,7 +490,7 @@ describe('Cursor', function () {
}) })
it('Sorting when some fields are undefined', function (done) { it('Sorting when some fields are undefined', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@@ -539,7 +538,7 @@ describe('Cursor', function () {
}) })
it('Sorting when all fields are undefined', function (done) { it('Sorting when all fields are undefined', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@@ -573,7 +572,7 @@ describe('Cursor', function () {
}) })
it('Multiple consecutive sorts', function (done) { it('Multiple consecutive sorts', function (done) {
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@@ -657,7 +656,7 @@ describe('Cursor', function () {
const companies = ['acme', 'milkman', 'zoinks'] const companies = ['acme', 'milkman', 'zoinks']
const entities = [] const entities = []
async.waterfall([ waterfall([
function (cb) { function (cb) {
d.remove({}, { multi: true }, function (err) { d.remove({}, { multi: true }, function (err) {
if (err) { return cb(err) } if (err) { return cb(err) }
@ -674,7 +673,7 @@ describe('Cursor', function () {
} }
} }
async.each(entities, function (entity, callback) { each(entities, function (entity, callback) {
d.insert(entity, function () { d.insert(entity, function () {
callback() callback()
}) })

File diff suppressed because it is too large

@@ -3,11 +3,12 @@ const chai = require('chai')
 const testDb = 'workspace/test.db'
 const fs = require('fs')
 const path = require('path')
-const async = require('async')
+const { apply, each, waterfall } = require('./utils.test.js')
 const model = require('../lib/model')
 const Datastore = require('../lib/datastore')
 const Persistence = require('../lib/persistence')
-const reloadTimeUpperBound = 60 // In ms, an upper bound for the reload time used to check createdAt and updatedAt
+const { callbackify } = require('util')
+const reloadTimeUpperBound = 200 // In ms, an upper bound for the reload time used to check createdAt and updatedAt
 const { assert } = chai
 chai.should()
@@ -20,9 +21,9 @@ describe('Database', function () {
     d.filename.should.equal(testDb)
     d.inMemoryOnly.should.equal(false)
-    async.waterfall([
+    waterfall([
       function (cb) {
-        Persistence.ensureDirectoryExists(path.dirname(testDb), function () {
+        callbackify((dirname) => Persistence.ensureDirectoryExistsAsync(dirname))(path.dirname(testDb), function () {
           fs.access(testDb, fs.constants.FS_OK, function (err) {
             if (!err) {
               fs.unlink(testDb, cb)
@@ -431,6 +432,7 @@ describe('Database', function () {
   it('If the callback throws an uncaught exception, do not catch it inside findOne, this is userspace concern', function (done) {
     let tryCount = 0
     const currentUncaughtExceptionHandlers = process.listeners('uncaughtException')
+    let i
     process.removeAllListeners('uncaughtException')
@@ -471,7 +473,7 @@ describe('Database', function () {
       d.insert({ tf: 4, an: 'other' }, function (err, _doc2) {
         d.insert({ tf: 9 }, function () {
           // eslint-disable-next-line node/handle-callback-err
-          d.getCandidates({ r: 6, tf: 4 }, function (err, data) {
+          callbackify(query => d._getCandidatesAsync(query))({ r: 6, tf: 4 }, function (err, data) {
             const doc1 = data.find(function (d) { return d._id === _doc1._id })
             const doc2 = data.find(function (d) { return d._id === _doc2._id })
@@ -500,7 +502,7 @@ describe('Database', function () {
         // eslint-disable-next-line node/handle-callback-err
         d.insert({ tf: 9 }, function (err, _doc2) {
           // eslint-disable-next-line node/handle-callback-err
-          d.getCandidates({ r: 6, tf: { $in: [6, 9, 5] } }, function (err, data) {
+          callbackify(query => d._getCandidatesAsync(query))({ r: 6, tf: { $in: [6, 9, 5] } }, function (err, data) {
             const doc1 = data.find(function (d) { return d._id === _doc1._id })
             const doc2 = data.find(function (d) { return d._id === _doc2._id })
@@ -529,7 +531,7 @@ describe('Database', function () {
           // eslint-disable-next-line node/handle-callback-err
           d.insert({ tf: 9 }, function (err, _doc4) {
             // eslint-disable-next-line node/handle-callback-err
-            d.getCandidates({ r: 6, notf: { $in: [6, 9, 5] } }, function (err, data) {
+            callbackify(query => d._getCandidatesAsync(query))({ r: 6, notf: { $in: [6, 9, 5] } }, function (err, data) {
               const doc1 = data.find(function (d) { return d._id === _doc1._id })
               const doc2 = data.find(function (d) { return d._id === _doc2._id })
               const doc3 = data.find(function (d) { return d._id === _doc3._id })
@@ -562,7 +564,7 @@ describe('Database', function () {
           // eslint-disable-next-line node/handle-callback-err
           d.insert({ tf: 9 }, function (err, _doc4) {
             // eslint-disable-next-line node/handle-callback-err
-            d.getCandidates({ r: 6, tf: { $lte: 9, $gte: 6 } }, function (err, data) {
+            callbackify(query => d._getCandidatesAsync(query))({ r: 6, tf: { $lte: 9, $gte: 6 } }, function (err, data) {
              const doc2 = data.find(function (d) { return d._id === _doc2._id })
              const doc4 = data.find(function (d) { return d._id === _doc4._id })
@@ -608,7 +610,7 @@ describe('Database', function () {
         })
       })
-      d.persistence.compactDatafile()
+      d.compactDatafile()
     })
   }, 101)
 })
@@ -683,7 +685,7 @@ describe('Database', function () {
 describe('Find', function () {
   it('Can find all documents if an empty query is used', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -708,7 +710,7 @@ describe('Database', function () {
   })
   it('Can find all documents matching a basic query', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -737,7 +739,7 @@ describe('Database', function () {
   })
   it('Can find one document matching a basic query and return null if none is found', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -1011,7 +1013,7 @@ describe('Database', function () {
 describe('Count', function () {
   it('Count all documents if an empty query is used', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -1032,7 +1034,7 @@ describe('Database', function () {
   })
   it('Count all documents matching a basic query', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -1101,7 +1103,7 @@ describe('Database', function () {
 describe('Update', function () {
   it('If the query doesn\'t match anything, database is not modified', function (done) {
-    async.waterfall([
+    waterfall([
      function (cb) {
        // eslint-disable-next-line node/handle-callback-err
        d.insert({ somedata: 'ok' }, function (err) {
@@ -1198,7 +1200,7 @@ describe('Database', function () {
   }
   // Actually launch the tests
-  async.waterfall([
+  waterfall([
     function (cb) {
       // eslint-disable-next-line node/handle-callback-err
       d.insert({ somedata: 'ok' }, function (err, doc1) {
@@ -1220,11 +1222,11 @@ describe('Database', function () {
         return cb()
       })
     },
-    async.apply(testPostUpdateState),
+    apply(testPostUpdateState),
     function (cb) {
       d.loadDatabase(function (err) { cb(err) })
     },
-    async.apply(testPostUpdateState)
+    apply(testPostUpdateState)
   ], done)
 })
@@ -1260,7 +1262,7 @@ describe('Database', function () {
   }
   // Actually launch the test
-  async.waterfall([
+  waterfall([
     function (cb) {
       // eslint-disable-next-line node/handle-callback-err
       d.insert({ somedata: 'ok' }, function (err, doc1) {
@@ -1282,20 +1284,20 @@ describe('Database', function () {
         return cb()
       })
     },
-    async.apply(testPostUpdateState),
+    apply(testPostUpdateState),
     function (cb) {
       d.loadDatabase(function (err) { return cb(err) })
     },
-    async.apply(testPostUpdateState) // The persisted state has been updated
+    apply(testPostUpdateState) // The persisted state has been updated
   ], done)
 })
 describe('Upserts', function () {
   it('Can perform upserts if needed', function (done) {
-    d.update({ impossible: 'db is empty anyway' }, { newDoc: true }, {}, function (err, nr, upsert) {
+    d.update({ impossible: 'db is empty anyway' }, { newDoc: true }, {}, function (err, nr, affectedDocuments) {
       assert.isNull(err)
       nr.should.equal(0)
-      assert.isUndefined(upsert)
+      assert.isNull(affectedDocuments)
      // eslint-disable-next-line node/handle-callback-err
      d.find({}, function (err, docs) {
@@ -1790,8 +1792,8 @@ describe('Database', function () {
   d.update({ a: 1 }, { $set: { b: 20 } }, {}, function (err, numAffected, affectedDocuments, upsert) {
     assert.isNull(err)
     numAffected.should.equal(1)
-    assert.isUndefined(affectedDocuments)
-    assert.isUndefined(upsert)
+    assert.isNull(affectedDocuments)
+    assert.isFalse(upsert)
     // returnUpdatedDocs set to true
     d.update({ a: 1 }, { $set: { b: 21 } }, { returnUpdatedDocs: true }, function (err, numAffected, affectedDocuments, upsert) {
@@ -1799,7 +1801,7 @@ describe('Database', function () {
       numAffected.should.equal(1)
       affectedDocuments.a.should.equal(1)
       affectedDocuments.b.should.equal(21)
-      assert.isUndefined(upsert)
+      assert.isFalse(upsert)
       done()
     })
@@ -1814,8 +1816,8 @@ describe('Database', function () {
   d.update({}, { $set: { b: 20 } }, { multi: true }, function (err, numAffected, affectedDocuments, upsert) {
     assert.isNull(err)
     numAffected.should.equal(2)
-    assert.isUndefined(affectedDocuments)
-    assert.isUndefined(upsert)
+    assert.isNull(affectedDocuments)
+    assert.isFalse(upsert)
     // returnUpdatedDocs set to true
     d.update({}, { $set: { b: 21 } }, {
@@ -1825,7 +1827,7 @@ describe('Database', function () {
       assert.isNull(err)
       numAffected.should.equal(2)
       affectedDocuments.length.should.equal(2)
-      assert.isUndefined(upsert)
+      assert.isFalse(upsert)
       done()
     })
@@ -1840,8 +1842,8 @@ describe('Database', function () {
   d.update({ a: 3 }, { $set: { b: 20 } }, {}, function (err, numAffected, affectedDocuments, upsert) {
     assert.isNull(err)
     numAffected.should.equal(0)
-    assert.isUndefined(affectedDocuments)
-    assert.isUndefined(upsert)
+    assert.isNull(affectedDocuments)
+    assert.isFalse(upsert)
     // Upsert flag set
     d.update({ a: 3 }, { $set: { b: 21 } }, { upsert: true }, function (err, numAffected, affectedDocuments, upsert) {
@@ -1881,7 +1883,7 @@ describe('Database', function () {
   }
   // Actually launch the test
-  async.waterfall([
+  waterfall([
     function (cb) {
       // eslint-disable-next-line node/handle-callback-err
       d.insert({ somedata: 'ok' }, function (err, doc1) {
@@ -1901,11 +1903,11 @@ describe('Database', function () {
         return cb()
       })
     },
-    async.apply(testPostUpdateState),
+    apply(testPostUpdateState),
     function (cb) {
       d.loadDatabase(function (err) { return cb(err) })
     },
-    async.apply(testPostUpdateState)
+    apply(testPostUpdateState)
   ], done)
 })
@@ -1920,7 +1922,7 @@ describe('Database', function () {
   // Remove two docs simultaneously
   const toRemove = ['Mars', 'Saturn']
-  async.each(toRemove, function (planet, cb) {
+  each(toRemove, function (planet, cb) {
     d.remove({ planet: planet }, function (err) { return cb(err) })
     // eslint-disable-next-line node/handle-callback-err
   }, function (err) {
@@ -2177,7 +2179,7 @@ describe('Database', function () {
   d.getAllData().length.should.equal(0)
-  d.ensureIndex({ fieldName: 'z' })
+  d.ensureIndex({ fieldName: 'z' }, function () {
     d.indexes.z.fieldName.should.equal('z')
     d.indexes.z.unique.should.equal(false)
     d.indexes.z.sparse.should.equal(false)
@@ -2200,6 +2202,7 @@ describe('Database', function () {
       })
     })
   })
+  })
 it('Can initialize multiple indexes on a database load', function (done) {
   const now = new Date()
@@ -2247,11 +2250,12 @@ describe('Database', function () {
   d.getAllData().length.should.equal(0)
-  d.ensureIndex({ fieldName: 'z', unique: true })
+  d.ensureIndex({ fieldName: 'z', unique: true }, function () {
     d.indexes.z.tree.getNumberOfKeys().should.equal(0)
     fs.writeFile(testDb, rawData, 'utf8', function () {
       d.loadDatabase(function (err) {
+        assert.isNotNull(err)
         err.errorType.should.equal('uniqueViolated')
         err.key.should.equal('1')
         d.getAllData().length.should.equal(0)
@@ -2261,6 +2265,7 @@ describe('Database', function () {
       })
     })
   })
+  })
 it('If a unique constraint is not respected, ensureIndex will return an error and not create an index', function (done) {
   d.insert({ a: 1, b: 4 }, function () {
@@ -3020,7 +3025,7 @@ describe('Database', function () {
   d.ensureIndex({ fieldName: 'bad' })
   d.insert({ bad: ['a', 'b'] }, function () {
     // eslint-disable-next-line node/handle-callback-err
-    d.getCandidates({ bad: { $in: ['a', 'b'] } }, function (err, res) {
+    callbackify(query => d._getCandidatesAsync(query))({ bad: { $in: ['a', 'b'] } }, function (err, res) {
       res.length.should.equal(1)
       done()
     })
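For reference, a minimal sketch of the `Datastore#update` callback contract these hunks migrate to (assuming a loaded datastore `db`; the argument semantics are taken from the assertions above):

// Sketch only: the callback now receives (err, numAffected, affectedDocuments, upsert).
db.update({ a: 1 }, { $set: { b: 20 } }, {}, function (err, numAffected, affectedDocuments, upsert) {
  // Without returnUpdatedDocs, affectedDocuments is null (no longer undefined)
  // and upsert is always a boolean (false here, since nothing was upserted).
})
db.update({ a: 1 }, { $set: { b: 21 } }, { returnUpdatedDocs: true }, function (err, numAffected, affectedDocuments, upsert) {
  // affectedDocuments is the updated document (an array when multi: true).
})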

@@ -0,0 +1,83 @@
/* eslint-env mocha */
const testDb = 'workspace/test.db'
const { promises: fs } = require('fs')
const assert = require('assert').strict
const path = require('path')
const Datastore = require('../lib/datastore')
const Persistence = require('../lib/persistence')
const { exists } = require('./utils.test.js')

// Test that operations are executed in the right order
// We prevent Mocha from catching the exception we throw on purpose by remembering all current handlers, remove them and register them back after test ends
const testRightOrder = async d => {
  const docs = await d.findAsync({})
  assert.equal(docs.length, 0)

  await d.insertAsync({ a: 1 })
  await d.updateAsync({ a: 1 }, { a: 2 }, {})
  const docs2 = await d.findAsync({})
  assert.equal(docs2[0].a, 2)

  d.updateAsync({ a: 2 }, { a: 3 }, {}) // not awaiting
  d.executor.pushAsync(async () => { throw new Error('Some error') }) // not awaiting
  const docs3 = await d.findAsync({})
  assert.equal(docs3[0].a, 3)
}

// Note: The following test does not have any assertion because it
// is meant to address the deprecation warning:
// (node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
// see
const testEventLoopStarvation = async d => {
  const times = 1001
  let i = 0
  while (i < times) {
    i++
    d.findAsync({ bogus: 'search' })
  }
  await d.findAsync({ bogus: 'search' })
}

// Test that operations are executed in the right order even with no callback
const testExecutorWorksWithoutCallback = async d => {
  d.insertAsync({ a: 1 })
  d.insertAsync({ a: 2 })
  const docs = await d.findAsync({})
  assert.equal(docs.length, 2)
}

describe('Executor async', function () {
  describe('With persistent database', async () => {
    let d

    beforeEach(async () => {
      d = new Datastore({ filename: testDb })
      assert.equal(d.filename, testDb)
      assert.equal(d.inMemoryOnly, false)
      await Persistence.ensureDirectoryExistsAsync(path.dirname(testDb))
      if (await exists(testDb)) await fs.unlink(testDb)
      await d.loadDatabaseAsync()
      assert.equal(d.getAllData().length, 0)
    })

    it('Operations are executed in the right order', () => testRightOrder(d))
    it('Does not starve event loop and raise warning when more than 1000 callbacks are in queue', () => testEventLoopStarvation(d))
    it('Works in the right order even with no supplied callback', () => testExecutorWorksWithoutCallback(d))
  }) // ==== End of 'With persistent database' ====

  describe('With non persistent database', function () {
    let d

    beforeEach(async () => {
      d = new Datastore({ inMemoryOnly: true })
      assert.equal(d.inMemoryOnly, true)
      await d.loadDatabaseAsync()
      assert.equal(d.getAllData().length, 0)
    })

    it('Operations are executed in the right order', () => testRightOrder(d))
    it('Works in the right order even with no supplied callback', () => testExecutorWorksWithoutCallback(d))
  }) // ==== End of 'With non persistent database' ====
})
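A minimal sketch of the ordering guarantee these tests exercise, using the promise-based API this PR introduces (assuming a loaded datastore `d`):

const demo = async d => {
  await d.insertAsync({ a: 1 })
  d.updateAsync({ a: 1 }, { a: 2 }, {}) // deliberately not awaited
  const docs = await d.findAsync({})
  // docs[0].a === 2: the executor queued the find behind the update,
  // so operations run in call order even without awaiting each one.
}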

@@ -3,9 +3,10 @@ const chai = require('chai')
 const testDb = 'workspace/test.db'
 const fs = require('fs')
 const path = require('path')
-const async = require('async')
+const { waterfall } = require('./utils.test.js')
 const Datastore = require('../lib/datastore')
 const Persistence = require('../lib/persistence')
+const { callbackify } = require('util')
 const { assert } = chai
 chai.should()
@@ -14,23 +15,33 @@ chai.should()
 // We prevent Mocha from catching the exception we throw on purpose by remembering all current handlers, remove them and register them back after test ends
 function testThrowInCallback (d, done) {
   const currentUncaughtExceptionHandlers = process.listeners('uncaughtException')
+  const currentUnhandledRejectionHandlers = process.listeners('unhandledRejection')
   process.removeAllListeners('uncaughtException')
+  process.removeAllListeners('unhandledRejection')
   // eslint-disable-next-line node/handle-callback-err
   process.on('uncaughtException', function (err) {
     // Do nothing with the error which is only there to test we stay on track
   })
+  process.on('unhandledRejection', function MINE (ex) {
+    // Do nothing with the error which is only there to test we stay on track
+  })
   // eslint-disable-next-line node/handle-callback-err
   d.find({}, function (err) {
     process.nextTick(function () {
       // eslint-disable-next-line node/handle-callback-err
       d.insert({ bar: 1 }, function (err) {
         process.removeAllListeners('uncaughtException')
+        process.removeAllListeners('unhandledRejection')
         for (let i = 0; i < currentUncaughtExceptionHandlers.length; i += 1) {
           process.on('uncaughtException', currentUncaughtExceptionHandlers[i])
         }
+        for (let i = 0; i < currentUnhandledRejectionHandlers.length; i += 1) {
+          process.on('unhandledRejection', currentUnhandledRejectionHandlers[i])
+        }
         done()
       })
@@ -141,9 +152,9 @@ describe('Executor', function () {
     d.filename.should.equal(testDb)
     d.inMemoryOnly.should.equal(false)
-    async.waterfall([
+    waterfall([
       function (cb) {
-        Persistence.ensureDirectoryExists(path.dirname(testDb), function () {
+        callbackify((dirname) => Persistence.ensureDirectoryExistsAsync(dirname))(path.dirname(testDb), function () {
           fs.access(testDb, fs.constants.F_OK, function (err) {
             if (!err) {
               fs.unlink(testDb, cb)

File diff suppressed because it is too large

@@ -3,12 +3,15 @@ const chai = require('chai')
 const testDb = 'workspace/test.db'
 const fs = require('fs')
 const path = require('path')
-const async = require('async')
+const { apply, waterfall } = require('./utils.test.js')
 const model = require('../lib/model')
 const Datastore = require('../lib/datastore')
 const Persistence = require('../lib/persistence')
 const storage = require('../lib/storage')
 const { execFile, fork } = require('child_process')
+const { callbackify } = require('util')
+const { existsCallback } = require('./utils.test')
+const { ensureFileDoesntExistAsync } = require('../lib/storage')
 const Readable = require('stream').Readable
 const { assert } = chai
@@ -22,9 +25,9 @@ describe('Persistence', function () {
     d.filename.should.equal(testDb)
     d.inMemoryOnly.should.equal(false)
-    async.waterfall([
+    waterfall([
       function (cb) {
-        Persistence.ensureDirectoryExists(path.dirname(testDb), function () {
+        callbackify((dirname) => Persistence.ensureDirectoryExistsAsync(dirname))(path.dirname(testDb), function () {
           fs.access(testDb, fs.constants.FS_OK, function (err) {
             if (!err) {
               fs.unlink(testDb, cb)
@@ -66,7 +69,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -79,6 +82,7 @@ describe('Persistence', function () {
   })
   it('Badly formatted lines have no impact on the treated data', function () {
+    d.persistence.corruptAlertThreshold = 1 // to prevent a corruption alert
     const now = new Date()
     const rawData = model.serialize({ _id: '1', a: 2, ages: [1, 5, 12] }) + '\n' +
       'garbage\n' +
@@ -92,6 +96,7 @@ describe('Persistence', function () {
   })
   it('Badly formatted lines have no impact on the treated data (with stream)', function (done) {
+    d.persistence.corruptAlertThreshold = 1 // to prevent a corruption alert
     const now = new Date()
     const rawData = model.serialize({ _id: '1', a: 2, ages: [1, 5, 12] }) + '\n' +
       'garbage\n' +
@@ -101,7 +106,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -135,7 +140,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -169,7 +174,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -205,7 +210,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -239,7 +244,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       treatedData.sort(function (a, b) { return a._id - b._id })
@@ -277,7 +282,7 @@ describe('Persistence', function () {
     stream.push(rawData)
     stream.push(null)
-    d.persistence.treatRawStream(stream, function (err, result) {
+    callbackify(rawStream => d.persistence.treatRawStreamAsync(rawStream))(stream, function (err, result) {
       assert.isNull(err)
       const treatedData = result.data
       const indexes = result.indexes
@@ -421,6 +426,10 @@ describe('Persistence', function () {
     d.loadDatabase(function (err) {
       assert.isDefined(err)
       assert.isNotNull(err)
+      assert.hasAllKeys(err, ['corruptionRate', 'corruptItems', 'dataLength'])
+      assert.strictEqual(err.corruptionRate, 0.25)
+      assert.strictEqual(err.corruptItems, 1)
+      assert.strictEqual(err.dataLength, 4)
       fs.writeFileSync(corruptTestFilename, fakeData, 'utf8')
       d = new Datastore({ filename: corruptTestFilename, corruptAlertThreshold: 1 })
@@ -433,6 +442,11 @@ describe('Persistence', function () {
       assert.isDefined(err)
       assert.isNotNull(err)
+      assert.hasAllKeys(err, ['corruptionRate', 'corruptItems', 'dataLength'])
+      assert.strictEqual(err.corruptionRate, 0.25)
+      assert.strictEqual(err.corruptItems, 1)
+      assert.strictEqual(err.dataLength, 4)
       done()
     })
   })
@@ -445,7 +459,7 @@ describe('Persistence', function () {
       done()
     })
-    d.persistence.compactDatafile()
+    d.compactDatafile()
   })
 describe('Serialization hooks', function () {
@@ -454,7 +468,7 @@ describe('Persistence', function () {
   it('Declaring only one hook will throw an exception to prevent data loss', function (done) {
     const hookTestFilename = 'workspace/hookTest.db'
-    storage.ensureFileDoesntExist(hookTestFilename, function () {
+    callbackify(storage.ensureFileDoesntExistAsync)(hookTestFilename, function () {
       fs.writeFileSync(hookTestFilename, 'Some content', 'utf8');
       (function () {
@@ -487,7 +501,7 @@ describe('Persistence', function () {
   it('Declaring two hooks that are not reverse of one another will cause an exception to prevent data loss', function (done) {
     const hookTestFilename = 'workspace/hookTest.db'
-    storage.ensureFileDoesntExist(hookTestFilename, function () {
+    callbackify(storage.ensureFileDoesntExistAsync)(hookTestFilename, function () {
       fs.writeFileSync(hookTestFilename, 'Some content', 'utf8');
       (function () {
@@ -509,7 +523,7 @@ describe('Persistence', function () {
   it('A serialization hook can be used to transform data before writing new state to disk', function (done) {
     const hookTestFilename = 'workspace/hookTest.db'
-    storage.ensureFileDoesntExist(hookTestFilename, function () {
+    callbackify(storage.ensureFileDoesntExistAsync)(hookTestFilename, function () {
       const d = new Datastore({
         filename: hookTestFilename,
         autoload: true,
@@ -586,7 +600,7 @@ describe('Persistence', function () {
   it('Use serialization hook when persisting cached database or compacting', function (done) {
     const hookTestFilename = 'workspace/hookTest.db'
-    storage.ensureFileDoesntExist(hookTestFilename, function () {
+    callbackify(storage.ensureFileDoesntExistAsync)(hookTestFilename, function () {
       const d = new Datastore({
         filename: hookTestFilename,
         autoload: true,
@@ -619,7 +633,7 @@ describe('Persistence', function () {
       idx = model.deserialize(idx)
       assert.deepStrictEqual(idx, { $$indexCreated: { fieldName: 'idefix' } })
-      d.persistence.persistCachedDatabase(function () {
+      callbackify(() => d.persistence.persistCachedDatabaseAsync())(function () {
         const _data = fs.readFileSync(hookTestFilename, 'utf8')
         const data = _data.split('\n')
         let doc0 = bd(data[0])
@@ -646,7 +660,7 @@ describe('Persistence', function () {
   it('Deserialization hook is correctly used when loading data', function (done) {
     const hookTestFilename = 'workspace/hookTest.db'
-    storage.ensureFileDoesntExist(hookTestFilename, function () {
+    callbackify(storage.ensureFileDoesntExistAsync)(hookTestFilename, function () {
       const d = new Datastore({
         filename: hookTestFilename,
         autoload: true,
@@ -714,7 +728,7 @@ describe('Persistence', function () {
     fs.existsSync('workspace/it.db').should.equal(false)
     fs.existsSync('workspace/it.db~').should.equal(false)
-    storage.ensureDatafileIntegrity(p.filename, function (err) {
+    callbackify(storage.ensureDatafileIntegrityAsync)(p.filename, function (err) {
       assert.isNull(err)
       fs.existsSync('workspace/it.db').should.equal(true)
@@ -737,7 +751,7 @@ describe('Persistence', function () {
     fs.existsSync('workspace/it.db').should.equal(true)
     fs.existsSync('workspace/it.db~').should.equal(false)
-    storage.ensureDatafileIntegrity(p.filename, function (err) {
+    callbackify(storage.ensureDatafileIntegrityAsync)(p.filename, function (err) {
       assert.isNull(err)
       fs.existsSync('workspace/it.db').should.equal(true)
@@ -760,7 +774,7 @@ describe('Persistence', function () {
     fs.existsSync('workspace/it.db').should.equal(false)
     fs.existsSync('workspace/it.db~').should.equal(true)
-    storage.ensureDatafileIntegrity(p.filename, function (err) {
+    callbackify(storage.ensureDatafileIntegrityAsync)(p.filename, function (err) {
       assert.isNull(err)
       fs.existsSync('workspace/it.db').should.equal(true)
@@ -785,7 +799,7 @@ describe('Persistence', function () {
     fs.existsSync('workspace/it.db').should.equal(true)
     fs.existsSync('workspace/it.db~').should.equal(true)
-    storage.ensureDatafileIntegrity(theDb.persistence.filename, function (err) {
+    callbackify(storage.ensureDatafileIntegrityAsync)(theDb.persistence.filename, function (err) {
       assert.isNull(err)
       fs.existsSync('workspace/it.db').should.equal(true)
@@ -820,7 +834,7 @@ describe('Persistence', function () {
     fs.writeFileSync(testDb + '~', 'something', 'utf8')
     fs.existsSync(testDb + '~').should.equal(true)
-    d.persistence.persistCachedDatabase(function (err) {
+    callbackify(() => d.persistence.persistCachedDatabaseAsync())(function (err) {
       const contents = fs.readFileSync(testDb, 'utf8')
       assert.isNull(err)
       fs.existsSync(testDb).should.equal(true)
@@ -848,7 +862,7 @@ describe('Persistence', function () {
     fs.writeFileSync(testDb + '~', 'bloup', 'utf8')
     fs.existsSync(testDb + '~').should.equal(true)
-    d.persistence.persistCachedDatabase(function (err) {
+    callbackify(() => d.persistence.persistCachedDatabaseAsync())(function (err) {
       const contents = fs.readFileSync(testDb, 'utf8')
       assert.isNull(err)
       fs.existsSync(testDb).should.equal(true)
@@ -873,7 +887,7 @@ describe('Persistence', function () {
     fs.existsSync(testDb).should.equal(false)
     fs.existsSync(testDb + '~').should.equal(true)
-    d.persistence.persistCachedDatabase(function (err) {
+    callbackify(() => d.persistence.persistCachedDatabaseAsync())(function (err) {
       const contents = fs.readFileSync(testDb, 'utf8')
       assert.isNull(err)
       fs.existsSync(testDb).should.equal(true)
@@ -911,9 +925,9 @@ describe('Persistence', function () {
     const dbFile = 'workspace/test2.db'
     let theDb, theDb2, doc1, doc2
-    async.waterfall([
-      async.apply(storage.ensureFileDoesntExist, dbFile),
-      async.apply(storage.ensureFileDoesntExist, dbFile + '~'),
+    waterfall([
+      apply(callbackify(storage.ensureFileDoesntExistAsync), dbFile),
+      apply(callbackify(storage.ensureFileDoesntExistAsync), dbFile + '~'),
       function (cb) {
         theDb = new Datastore({ filename: dbFile })
         theDb.loadDatabase(cb)
@@ -1003,6 +1017,8 @@ describe('Persistence', function () {
     const datafileLength = fs.readFileSync('workspace/lac.db', 'utf8').length
+    assert(datafileLength > 5000)
     // Loading it in a separate process that we will crash before finishing the loadDatabase
     fork('test_lac/loadAndCrash.test').on('exit', function (code) {
       code.should.equal(1) // See test_lac/loadAndCrash.test.js
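For context between these hunks: a sketch of how the enriched corruption error surfaces to callers. The three properties are exactly those asserted above; the handler shape is an assumption, not part of the diff.

d.loadDatabase(function (err) {
  if (err) {
    // err.dataLength     - number of non-empty lines in the datafile
    // err.corruptItems   - number of corrupted lines
    // err.corruptionRate - corruptItems / dataLength, between 0 and 1
    console.log(`corruption rate: ${err.corruptionRate}`)
  }
})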
@@ -1052,24 +1068,161 @@ describe('Persistence', function () {
     })
   }) // ==== End of 'Prevent dataloss when persisting data' ====
-  describe('ensureFileDoesntExist', function () {
-    it('Doesnt do anything if file already doesnt exist', function (done) {
-      storage.ensureFileDoesntExist('workspace/nonexisting', function (err) {
-        assert.isNull(err)
-        fs.existsSync('workspace/nonexisting').should.equal(false)
-        done()
-      })
-    })
-
-    it('Deletes file if it stat', function (done) {
-      fs.writeFileSync('workspace/existing', 'hello world', 'utf8')
-      fs.existsSync('workspace/existing').should.equal(true)
-
-      storage.ensureFileDoesntExist('workspace/existing', function (err) {
-        assert.isNull(err)
-        fs.existsSync('workspace/existing').should.equal(false)
-        done()
-      })
-    })
-  }) // ==== End of 'ensureFileDoesntExist' ====
+  describe('dropDatabase', function () {
+    it('deletes data in memory', done => {
+      const inMemoryDB = new Datastore({ inMemoryOnly: true })
+      inMemoryDB.insert({ hello: 'world' }, err => {
+        assert.equal(err, null)
+        inMemoryDB.dropDatabase(err => {
+          assert.equal(err, null)
+          assert.equal(inMemoryDB.getAllData().length, 0)
+          return done()
+        })
+      })
+    })
+
+    it('deletes data in memory & on disk', done => {
+      d.insert({ hello: 'world' }, err => {
+        if (err) return done(err)
+        d.dropDatabase(err => {
+          if (err) return done(err)
+          assert.equal(d.getAllData().length, 0)
+          existsCallback(testDb, bool => {
+            assert.equal(bool, false)
+            done()
+          })
+        })
+      })
+    })
+
+    it('check that executor is drained before drop', done => {
+      for (let i = 0; i < 100; i++) {
+        d.insert({ hello: 'world' }) // no await
+      }
+      d.dropDatabase(err => { // it should await the end of the inserts
+        if (err) return done(err)
+        assert.equal(d.getAllData().length, 0)
+        existsCallback(testDb, bool => {
+          assert.equal(bool, false)
+          done()
+        })
+      })
+    })
+
+    it('check that autocompaction is stopped', done => {
+      d.setAutocompactionInterval(5000)
+      d.insert({ hello: 'world' }, err => {
+        if (err) return done(err)
+        d.dropDatabase(err => {
+          if (err) return done(err)
+          assert.equal(d.autocompactionIntervalId, null)
+          assert.equal(d.getAllData().length, 0)
+          existsCallback(testDb, bool => {
+            assert.equal(bool, false)
+            done()
+          })
+        })
+      })
+    })
+
+    it('check that we can reload and insert afterwards', done => {
+      d.insert({ hello: 'world' }, err => {
+        if (err) return done(err)
+        d.dropDatabase(err => {
+          if (err) return done(err)
+          assert.equal(d.getAllData().length, 0)
+          existsCallback(testDb, bool => {
+            assert.equal(bool, false)
+            d.loadDatabase(err => {
+              if (err) return done(err)
+              d.insert({ hello: 'world' }, err => {
+                if (err) return done(err)
+                assert.equal(d.getAllData().length, 1)
+                d.compactDatafile(err => {
+                  if (err) return done(err)
+                  existsCallback(testDb, bool => {
+                    assert.equal(bool, true)
+                    done()
+                  })
+                })
+              })
+            })
+          })
+        })
+      })
+    })
+
+    it('check that we can dropDatabase if the file is already deleted', done => {
+      callbackify(ensureFileDoesntExistAsync)(testDb, err => {
+        if (err) return done(err)
+        existsCallback(testDb, bool => {
+          assert.equal(bool, false)
+          d.dropDatabase(err => {
+            if (err) return done(err)
+            existsCallback(testDb, bool => {
+              assert.equal(bool, false)
+              done()
+            })
+          })
+        })
+      })
+    })
+
+    it('Check that TTL indexes are reset', done => {
+      d.ensureIndex({ fieldName: 'expire', expireAfterSeconds: 10 })
+      const date = new Date()
+      d.insert({ hello: 'world', expire: new Date(date.getTime() - 1000 * 20) }, err => { // expired by 10 seconds
+        if (err) return done(err)
+        d.find({}, (err, docs) => {
+          if (err) return done(err)
+          assert.equal(docs.length, 0) // the TTL makes it so that the document is not returned
+          d.dropDatabase(err => {
+            if (err) return done(err)
+            assert.equal(d.getAllData().length, 0)
+            existsCallback(testDb, bool => {
+              assert.equal(bool, false)
+              d.loadDatabase(err => {
+                if (err) return done(err)
+                d.insert({ hello: 'world', expire: new Date(date.getTime() - 1000 * 20) }, err => {
+                  if (err) return done(err)
+                  d.find({}, (err, docs) => {
+                    if (err) return done(err)
+                    assert.equal(docs.length, 1) // the TTL index was dropped, so the expired document is returned
+                    d.compactDatafile(err => {
+                      if (err) return done(err)
+                      existsCallback(testDb, bool => {
+                        assert.equal(bool, true)
+                        done()
+                      })
+                    })
+                  })
+                })
+              })
+            })
+          })
+        })
+      })
+    })
+
+    it('Check that the buffer is reset', done => {
+      d.dropDatabase(err => {
+        if (err) return done(err)
+        // these 3 will hang until load
+        d.insert({ hello: 'world' })
+        d.insert({ hello: 'world' })
+        d.insert({ hello: 'world' })
+        assert.equal(d.getAllData().length, 0)
+        d.dropDatabase(err => {
+          if (err) return done(err)
+          d.insert({ hi: 'world' })
+          d.loadDatabase(err => {
+            if (err) return done(err)
+            assert.equal(d.getAllData().length, 1)
+            assert.equal(d.getAllData()[0].hi, 'world')
+            done()
+          })
+        })
+      })
+    })
+  }) // ==== End of 'dropDatabase' ====
 })
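A minimal usage sketch of the `dropDatabase` behaviour these tests cover (callback style, as in the tests above; `d` is assumed to be a loaded datastore):

// Drops all data in memory and on disk, stops autocompaction, and resets
// indexes; the datastore can be reloaded and reused afterwards.
d.dropDatabase(err => {
  if (err) throw err
  d.loadDatabase(err => {
    if (err) throw err
    d.insert({ hello: 'world' }, () => { /* datastore is usable again */ })
  })
})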

@@ -0,0 +1,46 @@
const { callbackify, promisify } = require('util')
const { promises: fs, constants: fsConstants } = require('fs')

const waterfallAsync = async tasks => {
  for (const task of tasks) {
    await promisify(task)()
  }
}
const waterfall = callbackify(waterfallAsync)

const eachAsync = async (arr, iterator) => Promise.all(arr.map(el => promisify(iterator)(el)))
const each = callbackify(eachAsync)

const apply = function (fn) {
  const args = Array.prototype.slice.call(arguments, 1)
  return function () {
    return fn.apply(
      null, args.concat(Array.prototype.slice.call(arguments))
    )
  }
}

const whilstAsync = async (test, fn) => {
  while (test()) await promisify(fn)()
}
const whilst = callbackify(whilstAsync)

const wait = delay => new Promise(resolve => {
  setTimeout(resolve, delay)
})

// F_OK is the fs constant that tests for existence (FS_OK does not exist)
const exists = path => fs.access(path, fsConstants.F_OK).then(() => true, () => false)
// eslint-disable-next-line node/no-callback-literal
const existsCallback = (path, callback) => fs.access(path, fsConstants.F_OK).then(() => callback(true), () => callback(false))

module.exports.whilst = whilst
module.exports.apply = apply
module.exports.waterfall = waterfall
module.exports.each = each
module.exports.wait = wait
module.exports.exists = exists
module.exports.existsCallback = existsCallback
module.exports.callbackify = callbackify
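These helpers let the callback-style tests keep their old `async`-library shape. A small self-contained sketch of how `waterfall` and `apply` combine, mirroring their use in the test files above:

const { apply, waterfall } = require('./utils.test.js')

waterfall([
  // apply() pre-binds leading arguments, as async.apply did
  apply(function (msg, cb) { console.log(msg); cb(null) }, 'step 1'),
  function (cb) { setTimeout(() => cb(null), 10) } // steps run strictly in order
], function (err) {
  // called once, after all steps completed or with the first error
  if (err) throw err
})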

@@ -1,133 +1,59 @@
 /* eslint-env mocha */
-/* global DEBUG */
 /**
  * Load and modify part of fs to ensure writeFile will crash after writing 5000 bytes
  */
 const fs = require('fs')
-
-function rethrow () {
-  // Only enable in debug mode. A backtrace uses ~1000 bytes of heap space and
-  // is fairly slow to generate.
-  if (DEBUG) {
-    const backtrace = new Error()
-    return function (err) {
-      if (err) {
-        backtrace.stack = err.name + ': ' + err.message +
-          backtrace.stack.substr(backtrace.name.length)
-        throw backtrace
-      }
-    }
-  }
-
-  return function (err) {
-    if (err) {
-      throw err // Forgot a callback but don't know where? Use NODE_DEBUG=fs
-    }
-  }
-}
-
-function maybeCallback (cb) {
-  return typeof cb === 'function' ? cb : rethrow()
-}
-
-function isFd (path) {
-  return (path >>> 0) === path
-}
-
-function assertEncoding (encoding) {
-  if (encoding && !Buffer.isEncoding(encoding)) {
-    throw new Error('Unknown encoding: ' + encoding)
-  }
-}
-
-let onePassDone = false
-
-function writeAll (fd, isUserFd, buffer, offset, length, position, callback_) {
-  const callback = maybeCallback(arguments[arguments.length - 1])
-
-  if (onePassDone) { process.exit(1) } // Crash on purpose before rewrite done
-  const l = Math.min(5000, length) // Force write by chunks of 5000 bytes to ensure data will be incomplete on crash
-
-  // write(fd, buffer, offset, length, position, callback)
-  fs.write(fd, buffer, offset, l, position, function (writeErr, written) {
-    if (writeErr) {
-      if (isUserFd) {
-        if (callback) callback(writeErr)
-      } else {
-        fs.close(fd, function () {
-          if (callback) callback(writeErr)
-        })
-      }
-    } else {
-      onePassDone = true
-      if (written === length) {
-        if (isUserFd) {
-          if (callback) callback(null)
-        } else {
-          fs.close(fd, callback)
-        }
-      } else {
-        offset += written
-        length -= written
-        if (position !== null) {
-          position += written
-        }
-        writeAll(fd, isUserFd, buffer, offset, length, position, callback)
-      }
-    }
-  })
-}
-
-fs.writeFile = function (path, data, options, callback_) {
-  const callback = maybeCallback(arguments[arguments.length - 1])
-
-  if (!options || typeof options === 'function') {
-    options = { encoding: 'utf8', mode: 438, flag: 'w' } // Mode 438 == 0o666 (compatibility with older Node releases)
-  } else if (typeof options === 'string') {
-    options = { encoding: options, mode: 438, flag: 'w' } // Mode 438 == 0o666 (compatibility with older Node releases)
-  } else if (typeof options !== 'object') {
-    throw new Error(`throwOptionsError${options}`)
-  }
-
-  assertEncoding(options.encoding)
-
-  const flag = options.flag || 'w'
-
-  if (isFd(path)) {
-    writeFd(path, true)
-    return
-  }
-
-  fs.open(path, flag, options.mode, function (openErr, fd) {
-    if (openErr) {
-      if (callback) callback(openErr)
-    } else {
-      writeFd(fd, false)
-    }
-  })
-
-  function writeFd (fd, isUserFd) {
-    const buffer = (data instanceof Buffer) ? data : Buffer.from('' + data, options.encoding || 'utf8')
-    const position = /a/.test(flag) ? null : 0
-
-    writeAll(fd, isUserFd, buffer, 0, buffer.length, position, callback)
-  }
-}
-
-fs.createWriteStream = function (path) {
-  let content = ''
-  return {
-    write (data) {
-      content += data
-    },
-    close (callback) {
-      fs.writeFile(path, content, callback)
-    }
-  }
-}
-// End of fs modification
+const { Writable } = require('stream')
+const { callbackify } = require('util')
+
+fs.promises.writeFile = async function (path, data) {
+  let onePassDone = false
+  const options = { encoding: 'utf8', mode: 0o666, flag: 'w' } // we don't care about the actual options passed
+
+  const filehandle = await fs.promises.open(path, options.flag, options.mode)
+  const buffer = (data instanceof Buffer) ? data : Buffer.from('' + data, options.encoding || 'utf8')
+  let length = buffer.length
+  let offset = 0
+
+  try {
+    while (length > 0) {
+      if (onePassDone) { process.exit(1) } // Crash on purpose before rewrite done
+      const { bytesWritten } = await filehandle.write(buffer, offset, Math.min(5000, length)) // Force write by chunks of 5000 bytes to ensure data will be incomplete on crash
+      onePassDone = true
+      offset += bytesWritten
+      length -= bytesWritten
+    }
+  } finally {
+    await filehandle.close()
+  }
+}
+
+class FakeFsWriteStream extends Writable {
+  constructor (filename) {
+    super()
+    this.filename = filename
+    this._content = Buffer.alloc(0)
+  }
+
+  _write (chunk, encoding, callback) {
+    this._content = Buffer.concat([this._content, Buffer.from(chunk, encoding)])
+    callback()
+  }
+
+  _end (chunk, encoding, callback) {
+    this._content = Buffer.concat([this._content, Buffer.from(chunk, encoding)])
+    callback()
+  }
+
+  close (callback) {
+    callbackify(fs.promises.writeFile)(this.filename, this._content, 'utf8', callback)
+  }
+}
+
+fs.createWriteStream = path => new FakeFsWriteStream(path)
+// End of fs monkey patching
 const Nedb = require('../lib/datastore.js')
 const db = new Nedb({ filename: 'workspace/lac.db' })
-db.loadDatabase()
+db.loadDatabaseAsync() // no need to await
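For orientation: the parent test drives this script via `fork` and asserts the deliberate crash, along the lines of this sketch (the filename and exit code match the persistence.test.js hunk above):

const { fork } = require('child_process')
const assert = require('assert')

// The patched fs.promises.writeFile exits the child with code 1 on its second
// write pass, so a successful test run observes the deliberate mid-rewrite crash.
fork('test_lac/loadAndCrash.test').on('exit', code => {
  assert.strictEqual(code, 1)
})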

@@ -1,64 +1,61 @@
 const fs = require('fs')
-const async = require('async')
+const fsPromises = fs.promises
 const Nedb = require('../lib/datastore')
-const db = new Nedb({ filename: './workspace/openfds.db', autoload: true })
 const N = 64
-let i
-let fds
-
-function multipleOpen (filename, N, callback) {
-  async.whilst(function () { return i < N }
-    , function (cb) {
-      fs.open(filename, 'r', function (err, fd) {
-        i += 1
-        if (fd) { fds.push(fd) }
-        return cb(err)
-      })
-    }
-    , callback)
-}
-
-async.waterfall([
-  // Check that ulimit has been set to the correct value
-  function (cb) {
-    i = 0
-    fds = []
-    multipleOpen('./test_lac/openFdsTestFile', 2 * N + 1, function (err) {
-      if (!err) { console.log('No error occured while opening a file too many times') }
-      fds.forEach(function (fd) { fs.closeSync(fd) })
-      return cb()
-    })
-  },
-  function (cb) {
-    i = 0
-    fds = []
-    multipleOpen('./test_lac/openFdsTestFile2', N, function (err) {
-      if (err) { console.log('An unexpected error occured when opening file not too many times: ' + err) }
-      fds.forEach(function (fd) { fs.closeSync(fd) })
-      return cb()
-    })
-  },
-  // Then actually test NeDB persistence
-  function () {
-    db.remove({}, { multi: true }, function (err) {
-      if (err) { console.log(err) }
-      db.insert({ hello: 'world' }, function (err) {
-        if (err) { console.log(err) }
-        i = 0
-        async.whilst(function () { return i < 2 * N + 1 }
-          , function (cb) {
-            db.persistence.persistCachedDatabase(function (err) {
-              if (err) { return cb(err) }
-              i += 1
-              return cb()
-            })
-          }
-          , function (err) {
-            if (err) { console.log('Got unexpected error during one peresistence operation: ' + err) }
-          }
-        )
-      })
-    })
-  }
-])
+
+// A console.error triggers an error of the parent test
+
+const test = async () => {
+  let filehandles = []
+  try {
+    for (let i = 0; i < 2 * N + 1; i++) {
+      const filehandle = await fsPromises.open('./test_lac/openFdsTestFile', 'r')
+      filehandles.push(filehandle)
+    }
+    console.error('No error occurred while opening a file too many times')
+    process.exit(1)
+  } catch (error) {
+    if (error.code !== 'EMFILE') {
+      console.error(error)
+      process.exit(1)
+    }
+  } finally {
+    for (const filehandle of filehandles) {
+      await filehandle.close()
+    }
+    filehandles = []
+  }
+
+  try {
+    for (let i = 0; i < N; i++) {
+      const filehandle = await fsPromises.open('./test_lac/openFdsTestFile2', 'r')
+      filehandles.push(filehandle)
+    }
+  } catch (error) {
+    console.error(`An unexpected error occurred when opening file not too many times with error: ${error}`)
+    process.exit(1)
+  } finally {
+    for (const filehandle of filehandles) {
+      await filehandle.close()
+    }
+  }
+
+  try {
+    const db = new Nedb({ filename: './workspace/openfds.db' })
+    await db.loadDatabaseAsync()
+    await db.removeAsync({}, { multi: true })
+    await db.insertAsync({ hello: 'world' })
+
+    for (let i = 0; i < 2 * N + 1; i++) {
+      await db.persistence.persistCachedDatabaseAsync()
+    }
+  } catch (error) {
+    console.error(`Got unexpected error during one persistence operation with error: ${error}`)
+  }
+}
+
+try {
+  test()
+} catch (error) {
+  console.error(error)
+  process.exit(1)
+}

@@ -20,10 +20,6 @@ db.loadDatabase()
 db = new Datastore({ filename: 'path/to/datafile_2', autoload: true })
 // You can issue commands right away
-// Type 4: Persistent datastore for a Node Webkit app called 'nwtest'
-// For example on Linux, the datafile will be ~/.config/nwtest/nedb-data/something.db
-db = new Datastore({ filename: 'something.db' })
-
 // Of course you can create multiple datastores if you need several
 // collections. In this case it's usually a good idea to use autoload for all collections.
 const dbContainer: any = {}

@@ -32,15 +32,17 @@ module.exports = (env, argv) => {
       process: 'process/browser',
       Buffer: ['buffer', 'Buffer'],
       setImmediate: ['timers-browserify', 'setImmediate'],
-      clearImmediate: ['timers-browserify', 'clearImmediate']
+      clearImmediate: ['timers-browserify', 'clearImmediate'],
+      util: 'util'
     })
   ],
   entry: {
-    Nedb: path.join(__dirname, 'lib', 'datastore.js')
+    Nedb: path.join(__dirname, 'lib', 'datastore.js'),
+    testUtils: path.join(__dirname, 'test', 'utils.test.js')
   },
   output: {
     path: path.join(__dirname, 'browser-version/out'),
-    filename: minimize ? 'nedb.min.js' : 'nedb.js',
+    filename: pathData => `${pathData.chunk.name.toLowerCase()}${minimize ? '.min' : ''}.js`,
     libraryTarget: 'window',
     library: '[name]'
   }
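A quick sketch of what the new `filename` function yields for the two entry points (chunk names come from the `entry` map above; `minimize` is from the config's argv handling):

const filenameFor = (chunkName, minimize) =>
  `${chunkName.toLowerCase()}${minimize ? '.min' : ''}.js`

filenameFor('Nedb', true)       // 'nedb.min.js'
filenameFor('testUtils', false) // 'testutils.js'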
