From f32da096f2758c7754832dc61dae6a18072f5bee Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:29:29 +0300
Subject: [PATCH 1/7] Update README.md

---
 README.md | 52 +++++++++++++++++++++++++++++++++++++++++++++++++---
 1 file changed, 49 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 443b07e..58ab9b9 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,50 @@
-node-embedded-db
-================
+# node-embedded-db
 
-Couldn't find a good embedded datastore for node.js. I'm making one.
+Embedded persistent database for Node.js, with no dependency (except npm modules of course). The API is the same as MongoDB.
+
+## Why?
+I needed to store data from another project ([https://github.com/louischatriot/braindead-ci](Braindead CI)). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
+
+## Installation, tests
+It will be published as an npm module once it is finished. To launch tests: `npm test`. You
+
+## Performance
+### Speed
+Performance is pretty good on the kind of datasets it is designed for (10,000 documents or less). On my machine (3 years old, no SSD), with a collection with 10,000 documents:
+* An insert takes 0.1ms
+* A read takes 5.7ms
+* An update takes 62ms
+* A deletion takes 61ms
+Read, update and deletion times are pretty much non impacted by the number of concerned documents. Inserts, updates and deletions are non-blocking. Read will be soon, too (but they are so fast it is not so important anyway).
+
+Memory
+
+
+## API
+
+
+
+## License
+
+(The MIT License)
+
+Copyright (c) 2013 Louis Chatriot <louis.chatriot@gmail.com>
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+'Software'), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

From 4ad97719d3e04bfede57a5c11b9fabf538e7891b Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:52:20 +0300
Subject: [PATCH 2/7] Update README.md

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 58ab9b9..9c87a0b 100644
--- a/README.md
+++ b/README.md
@@ -10,18 +10,18 @@ It will be published as an npm module once it is finished. To launch tests: `npm
 
 ## Performance
 ### Speed
-Performance is pretty good on the kind of datasets it is designed for (10,000 documents or less). On my machine (3 years old, no SSD), with a collection with 10,000 documents:
+It is pretty fast on the kind of datasets it was designed for (10,000 documents or less). On my machine (3 years old, no SSD), with a collection with 10,000 documents:
 * An insert takes 0.1ms
 * A read takes 5.7ms
 * An update takes 62ms
 * A deletion takes 61ms
 Read, update and deletion times are pretty much non impacted by the number of concerned documents. Inserts, updates and deletions are non-blocking. Read will be soon, too (but they are so fast it is not so important anyway).
 
-Memory
-
+### Memory footprint
+For now, a copy of the whole database is kept in memory. For the kind of datasets expected this should be too much (max 20MB) but I am planning on stopping using that method to free RAM and make it completely asynchronous.
 
 ## API
-
+It's a subset of MongoDB's API.
 
 
 ## License

From 46e6f0f819946f87c2a0f51a29ed8dd1cc75ac02 Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:55:29 +0300
Subject: [PATCH 3/7] Update README.md

---
 README.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 9c87a0b..55e4aaf 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@ Embedded persistent database for Node.js, with no dependency (except npm modules
 I needed to store data from another project ([https://github.com/louischatriot/braindead-ci](Braindead CI)). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
 
 ## Installation, tests
-It will be published as an npm module once it is finished. To launch tests: `npm test`. You
+It will be published as an npm module once it is finished. To launch tests: `npm test`.
 
 ## Performance
 ### Speed
@@ -17,8 +17,10 @@ It is pretty fast on the kind of datasets it was designed for (10,000 documents
 * A deletion takes 61ms
 Read, update and deletion times are pretty much non impacted by the number of concerned documents. Inserts, updates and deletions are non-blocking. Read will be soon, too (but they are so fast it is not so important anyway).
 
+You can run the simple benchmarks I use by executing the scripts in the `benchmarks` folder. They all take an optional parameter which is the size of the dataset to use (default is 10,000).
+
 ### Memory footprint
-For now, a copy of the whole database is kept in memory. For the kind of datasets expected this should be too much (max 20MB) but I am planning on stopping using that method to free RAM and make it completely asynchronous.
+For now, a copy of the whole database is kept in memory. For the kind of datasets expected this should not be too much (max 20MB) but I am planning on stopping using that method to free RAM and make it completely asynchronous.
 
 ## API
 It's a subset of MongoDB's API.

From b2af33e7aa20824990734fdc92e59212b77fc489 Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:55:49 +0300
Subject: [PATCH 4/7] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 55e4aaf..f580714 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@ It is pretty fast on the kind of datasets it was designed for (10,000 documents
 * An insert takes 0.1ms
 * A read takes 5.7ms
 * An update takes 62ms
-* A deletion takes 61ms
+* A deletion takes 61ms 
 Read, update and deletion times are pretty much non impacted by the number of concerned documents. Inserts, updates and deletions are non-blocking. Read will be soon, too (but they are so fast it is not so important anyway).
 
 You can run the simple benchmarks I use by executing the scripts in the `benchmarks` folder. They all take an optional parameter which is the size of the dataset to use (default is 10,000).

From 042e0429f2a69389f50d9d40fb92ac148fcfd204 Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:56:04 +0300
Subject: [PATCH 5/7] Update README.md

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index f580714..9768e82 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,7 @@ It is pretty fast on the kind of datasets it was designed for (10,000 documents
 * A read takes 5.7ms
 * An update takes 62ms
 * A deletion takes 61ms
+
 Read, update and deletion times are pretty much non impacted by the number of concerned documents. Inserts, updates and deletions are non-blocking. Read will be soon, too (but they are so fast it is not so important anyway).
 
 You can run the simple benchmarks I use by executing the scripts in the `benchmarks` folder. They all take an optional parameter which is the size of the dataset to use (default is 10,000).

From 376045386ac9e50b8bc592da4e4a59cf5aa78c5c Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:56:44 +0300
Subject: [PATCH 6/7] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 9768e82..b2901f2 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@
 Embedded persistent database for Node.js, with no dependency (except npm modules of course). The API is the same as MongoDB.
 
 ## Why?
-I needed to store data from another project ([https://github.com/louischatriot/braindead-ci](Braindead CI)). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
+I needed to store data from another project ([Braindead CI](https://github.com/louischatriot/braindead-ci)). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
 
 ## Installation, tests
 It will be published as an npm module once it is finished. To launch tests: `npm test`.

From a9799a3dad70676c2af035f424bec66ec5c01a0d Mon Sep 17 00:00:00 2001
From: Louis Chatriot
Date: Fri, 3 May 2013 19:57:16 +0300
Subject: [PATCH 7/7] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index b2901f2..f3ed84a 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@
 Embedded persistent database for Node.js, with no dependency (except npm modules of course). The API is the same as MongoDB.
 
 ## Why?
-I needed to store data from another project ([Braindead CI](https://github.com/louischatriot/braindead-ci)). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
+I needed to store data from another project (Braindead CI). I needed the datastore to be standalone (i.e. no dependency except other Node modules) so that people can install the software using a simple `npm install`. I couldn't find one without bugs and a clean API so I made this one.
 
 ## Installation, tests
 It will be published as an npm module once it is finished. To launch tests: `npm test`.
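
The README text these patches build up says the API is a subset of MongoDB's, but no example appears anywhere in the series. The sketch below shows what that MongoDB-style usage could look like; it is an illustration only, not part of the patches: the require path, the `Datastore` constructor, its `filename` option, and the `insert`/`find` callback signatures are all assumptions modeled on MongoDB's query-by-example style.

```javascript
// Hypothetical usage sketch (not taken from the patches). The module is not
// yet published to npm per the README, so it is required from a local
// checkout; the constructor name, its options, and the callback signatures
// are assumptions borrowed from MongoDB's API.
var Datastore = require('./node-embedded-db')   // hypothetical require path
  , db = new Datastore({ filename: 'path/to/datafile' });

// MongoDB-style insert: pass a plain document, get the stored document back.
db.insert({ planet: 'Earth', inhabited: true }, function (err, newDoc) {
  if (err) { return console.error(err); }

  // MongoDB-style query by example: find every document matching the query.
  db.find({ inhabited: true }, function (err, docs) {
    if (err) { return console.error(err); }
    console.log(docs);   // e.g. [ { planet: 'Earth', inhabited: true, ... } ]
  });
});
```

If the eventual constructor or option names differ, only the two setup lines should need to change; the query-by-example calls are the part the README's "subset of MongoDB's API" claim actually commits to.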