Overview

Mongo Seeding


The ultimate solution for populating your MongoDB database 🚀

Define MongoDB documents in JSON, JavaScript, or even TypeScript files. Use the JS library, install the CLI, or run the Docker image to import them!

Introduction

Mongo Seeding is a flexible set of tools for importing data into a MongoDB database.

It's great for:

  • testing database queries, automatically or manually
  • preparing a ready-to-go development environment for your application
  • setting an initial state for your application

How does it work?

  1. Define documents for MongoDB import in JSON, JavaScript or TypeScript file(s). To learn how to do that, read the import data definition guide. To see some examples, navigate to the examples directory.

  2. Use one of the Mongo Seeding tools, depending on your needs:

  3. ???

  4. Profit!
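For step 1, the examples lay out one subdirectory per collection inside an input directory, with an optional numeric prefix controlling import order. An illustrative layout (file names here are hypothetical):

```
data/
├── 1-categories/
│   └── categories.json
└── 2-posts/
    ├── posts-one.js
    └── posts-two.ts
```

Each file in a collection directory exports a single document or an array of documents.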

Motivation

There are many tools for MongoDB data import out there, including the official one - mongoimport. Why should you choose Mongo Seeding?

Problem #1: JSON used for import data definition

Every tool I found before creating Mongo Seeding supports only JSON files. In my opinion, that is not the most convenient way of defining data. The biggest problems are data redundancy and the lack of ability to write logic.

Imagine that you want to import 5 very similar documents into the authors collection. Every document is identical - except the name:

{
    "name": "{NAME_HERE}",
    "email": "[email protected]",
    "avatar": "https://placekitten.com/300/300"
}

With every tool I've ever found, you would need to create 5 separate JSON files, or one file with an array of objects. Of course, the latter option is better, but either way you end up with a file looking like this:

[
    {
        "name": "John",
        "email": "[email protected]",
        "avatar": "https://placekitten.com/300/300"
    },
    {
        "name": "Joanne",
        "email": "[email protected]",
        "avatar": "https://placekitten.com/300/300"
    },
    {
        "name": "Bob",
        "email": "[email protected]",
        "avatar": "https://placekitten.com/300/300"
    },
    {
        "name": "Will",
        "email": "[email protected]",
        "avatar": "https://placekitten.com/300/300"
    },
    {
        "name": "Chris",
        "email": "[email protected]",
        "avatar": "https://placekitten.com/300/300"
    }
]

It doesn't look good - you have probably heard about the DRY principle.

Now imagine that you have to change the authors' email. You would probably use search and replace. But what if you needed to change the data shape completely? You could use IDE features like multiple cursors, but hey - it's a waste of time. And what if you had a much more complicated data shape?

If you could use JavaScript to define the authors documents, it would be much easier and faster to write something like this:

const names = ["John", "Joanne", "Bob", "Will", "Chris"];

module.exports = names.map(name => ({
    name,
    email: "[email protected]",
    avatar: "https://placekitten.com/300/300",
}))

Obviously, in JavaScript files you can also import other files - external libraries, helper methods, etc. It's easy to write data randomization rules, which are essential for creating development sample data. Consider the following example for a people collection import:

const { getObjectId } = require("../../helpers/index");

const names = ["John", "Joanne", "Bob", "Will", "Chris"];

const min = 18;
const max = 100;

module.exports = names.map(name => ({
    firstName: name,
    age: Math.floor(Math.random() * (max - min + 1)) + min,
    _id: getObjectId(name),
}))

The difference should be noticeable. This way of defining import data just feels right. And yes, you can do that in Mongo Seeding - but JSON files are supported as well.

Problem #2: No data model validation

In multiple JSON files that contain MongoDB document definitions, it's easy to make a mistake, especially in a complex data structure. Sometimes a typo results in invalid data. See the example below for the people collection definition:

[
    {
        "name": "John",
        "email": "[email protected]",
        "age": 18
    },
    {
        "name": "Bob",
        "emial": "[email protected]",
        "age": "none"
    }
]

Because of a typo, Bob has no email field - just a misspelled emial key. Also, there is a non-number value for the age key. The same problem would exist in a JavaScript data definition. But if you were able to use TypeScript, the situation changes:

export interface Person {
  name: string;
  email: string;
  age: number;
}
// import interface defined above
import { Person } from '../../models/index';

const people: Person[] = [
    {
        name: "John",
        email: "[email protected]",
        age: 18,
    },
    {
        name: "Bob",
        emial: "[email protected]", // <-- error underlined in IDE
        age: "none", //  <-- error underlined in IDE
    },
];

export = people;

If you used types, you would instantly see the mistakes - not only during import, but much earlier, in your IDE.

At this point some may say: "We've had this for years - this is the purpose of mongoose!". The problem is that importing a larger amount of data with mongoose is painfully slow because of the model validation. You can use a faster approach, the Model.collection.insert() method, but then you disable model validation completely!

Also, starting from version 3.6, MongoDB supports JSON Schema validation. Even if you are OK with writing validation rules in JSON, you still have to try inserting a document into a collection to see whether the object is valid. That is slow and cumbersome, isn't it? How can this problem be solved?
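For comparison, such a server-side rule for the people collection above could be written as a $jsonSchema validator - a minimal sketch (the validator object is standard MongoDB syntax; the commented usage line assumes a connected db handle):

```javascript
// A $jsonSchema validator for the `people` collection from the example above.
// Once attached to the collection, MongoDB rejects invalid documents -
// but only when you actually try to insert them, not at edit time.
const peopleValidator = {
  $jsonSchema: {
    bsonType: 'object',
    required: ['name', 'email', 'age'],
    properties: {
      name: { bsonType: 'string' },
      email: { bsonType: 'string' },
      age: { bsonType: 'int', minimum: 0 },
    },
  },
};

// Usage (requires a running MongoDB and a connected `db` handle):
// await db.createCollection('people', { validator: peopleValidator });
```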

It's simple: use TypeScript. Compile-time model validation is much faster, and IDE plugins (or built-in support, as in Visual Studio Code) ensure that you won't make mistakes when modifying sample data files. Oh, and one last thing: if you have an existing TypeScript application that uses MongoDB, you can simply reuse all its models for data import.

The Mongo Seeding CLI and the Mongo Seeding Docker image have a TypeScript runtime built in. This means you can take advantage of static type checking in TypeScript data definition files (.ts extension).

Problem #3: No ultimate solution

Tools like this should be as flexible as possible. Some developers need just a CLI tool, while others want to import data programmatically. Before writing Mongo Seeding, I needed a ready-to-use Docker image and found none. Dockerizing an application is easy, but it takes time.

That's why Mongo Seeding consists of the JS library, the CLI, and the Docker image - all the tools you'll ever need for seeding your MongoDB database.

Contribution

Before you contribute to this project, read the CONTRIBUTING.md file.

Comments
  • Support Extended JSON format


    Hi @pkosiec,

    Awesome job at the library, it's very useful.

    I'm running into an issue. I'm inserting data from a JSON file. In the file I have an "_id" field which should be MongoDB ObjectId.

    The issue is that it gets inserted as plain text.

    This syntax should lead to the creation of ObjectIds:

    { 
     "_id" : {"$oid":"5a68fdc3615eda645bc6bdec"}
    }
    

    But it throws an error locally: { MongoSeedingError: Error: key $oid must not start with '$'

    I found that syntax here: https://stackoverflow.com/questions/51439955/mongoimport-id-is-not-an-objectid

    Is there a way to end up with ObjectIds instead of plain text ids?

    Thank you

    :rocket: enhancement area/core 
    opened by rickitan 15
  • Description of running examples is confusing


    Looks like a great library, however I couldn't get either of the methods to import the example data to work. Am I doing something incorrectly?

    /tmp $ git clone https://github.com/pkosiec/mongo-seeding
    Cloning into 'mongo-seeding'...
    remote: Counting objects: 838, done.
    remote: Compressing objects: 100% (92/92), done.
    remote: Total 838 (delta 63), reused 73 (delta 34), pack-reused 708
    Receiving objects: 100% (838/838), 556.96 KiB | 87.00 KiB/s, done.
    Resolving deltas: 100% (485/485), done.
    /tmp $ cd mongo-seeding-test
    /tmp/mongo-seeding-test $ cat package.json index.js
    {
      "name": "mongo-seeding-test",
      "version": "1.0.0",
      "description": "",
      "main": "index.js",
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1"
      },
      "author": "",
      "license": "ISC",
      "dependencies": {
        "mongo-seeding": "^2.2.0"
      }
    }
    const { seedDatabase } = require('mongo-seeding');
    
    const path = require('path');
    
    const config = {
      database: {
        host: '127.0.0.1',
        port: 27017,
        name: 'mydatabase',
      },
      inputPath: path.resolve(__dirname, '../mongo-seeding/samples/example/data'),
      dropDatabase: true,
    };
    
    (async () => {
      try {
        await seedDatabase(config);
      } catch (err) {
        // Handle errors
        console.error('there was an error:');
        console.error(err);
      }
      // Do whatever you want after successful import
    })()
    /tmp/mongo-seeding-test $ node index.js
    there was an error:
    { MongoSeedingError: Error: Cannot find module 'mongodb'
        at wrapError (/private/tmp/mongo-seeding-test/node_modules/mongo-seeding/dist/index.js:52:19)
        at Object.<anonymous> (/private/tmp/mongo-seeding-test/node_modules/mongo-seeding/dist/index.js:44:15)
        at Generator.next (<anonymous>)
        at /private/tmp/mongo-seeding-test/node_modules/mongo-seeding/dist/index.js:7:71
        at new Promise (<anonymous>)
        at __awaiter (/private/tmp/mongo-seeding-test/node_modules/mongo-seeding/dist/index.js:3:12)
        at exports.seedDatabase (/private/tmp/mongo-seeding-test/node_modules/mongo-seeding/dist/index.js:14:43)
        at __dirname (/private/tmp/mongo-seeding-test/index.js:17:11)
        at Object.<anonymous> (/private/tmp/mongo-seeding-test/index.js:24:3)
        at Module._compile (module.js:660:30) name: 'MongoSeedingError' }
    /tmp/mongo-seeding-test $ cd ..
    /tmp $ cd mongo-seeding
    /tmp/mongo-seeding $ cd samples/example/data/
    /tmp/mongo-seeding/samples/example/data $ seed -u 'mongodb://127.0.0.1:27017/mydb' -d .
      mongo-seeding Starting... +0ms
      mongo-seeding Closing connection... +5ms
    Error MongoSeedingError: Error: Cannot find module 'mongodb'
    
    :rocket: enhancement :book: documentation area/examples 
    opened by mjgs 13
  • TS Compilation errors when using Mongo Seeding along MongoDB driver v4


    I am using mongo-seeding and mongoose in the same (typescript) project. I recently upgraded mongoose to version 6.0.4. This version of mongoose depends on mongodb 4.1.1 which does not play nicely with mongo-seeding right now.

    node_modules/mongo-seeding/dist/config.d.ts:3:10 - error TS2305: Module '"mongodb"' has no exported member 'CollectionInsertManyOptions'.
    
    3 import { CollectionInsertManyOptions, MongoClientOptions } from 'mongodb';
               ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    node_modules/mongo-seeding/dist/database/database.d.ts:1:14 - error TS2305: Module '"mongodb"' has no exported member 'CollectionInsertManyOptions'.
    
    1 import { Db, CollectionInsertManyOptions, MongoClient } from 'mongodb';
                   ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    node_modules/mongo-seeding/dist/database/database.d.ts:32:167 - error TS2694: Namespace 'node_modules/mongodb/mongodb"' has no exported member 'InsertWriteOpResult'.
    
    32     insertDocumentsIntoCollection(documentsToInsert: any[], collectionName: string, collectionInsertOptions?: CollectionInsertManyOptions): Promise<import("mongodb").InsertWriteOpResult<any>>;
    
                             ~~~~~~~~~~~~~~~~~~~
    
    node_modules/mongo-seeding/dist/database/database.d.ts:55:93 - error TS2694: Namespace '"node_modules/mongodb/mongodb"' has no exported member 'DeleteWriteOpResultObject'.
    
    55     removeAllDocumentsIfCollectionExists(collectionName: string): Promise<import("mongodb").DeleteWriteOpResultObject | undefined>;
                                                                                                   ~~~~~~~~~~~~~~~~~~~~~~~~~
    
    

    Do you plan to upgrade mongodb (type) dependencies in the foreseeable future?

    :bug: bug area/core 
    opened by aleneum 11
  • import(..) returns undefined instead of the results of resolution


    When the import(..) function is called, the results are not returned - instead it resolves to undefined. This leads to problems when trying to use this package with Cypress.

    The for loop does not return the results of the promises https://github.com/pkosiec/mongo-seeding/blob/master/core/src/importer/index.ts#L25-L28

    If we used a map and returned the results instead of the for loop, the issue could be solved. What do you think?

    area/core 
    opened by gerwinbrunner 8
  • TS example - ENOENT ERROR


    I am not completely sure if this is an actual issue or just that I'm missing something, but the npm script is not working; I keep getting

    MongoSeedingError: Error: ENOENT: no such file or directory, scandir '/Users/irvingarmenta/Documents/dashb-mongo-ts/data-seed/data-seed/data.ts'
        at wrapError (/Users/irvingarmenta/Documents/dashb-mongo-ts/data-seed/node_modules/mongo-seeding/src/index.ts:125:17)
        at Seeder.readCollectionsFromPath (/Users/irvingarmenta/Documents/dashb-mongo-ts/data-seed/node_modules/mongo-seeding/src/index.ts:58:13)
        at Object.<anonymous> (/Users/irvingarmenta/Documents/dashb-mongo-ts/data-seed/index.js:13:28)
        at Module._compile (internal/modules/cjs/loader.js:777:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:788:10)
        at Module.load (internal/modules/cjs/loader.js:643:32)
        at Function.Module._load (internal/modules/cjs/loader.js:556:12)
        at Function.Module.runMain (internal/modules/cjs/loader.js:840:10)
        at internal/main/run_main_module.js:17:11
    

    I have tried multiple file names in the index.js file, but it keeps saying that it cannot find the data file.

    This is the index.js file:

    require('ts-node').register();
    
    const path = require('path');
    const { Seeder } = require('mongo-seeding');
    
    const config = {
      database: 'mongodb://localhost/dashb',
      dropDatabase: true,
    };
    
    const seeder = new Seeder(config);
    
    const collections = seeder.readCollectionsFromPath(
      path.resolve('./data-seed/data.ts'),
      {
        extensions: ['js', 'json', 'ts'],
        transformers: [Seeder.Transformers.replaceDocumentIdWithUnderscoreId],
      },
    );
    
    seeder
      .import(collections)
      .then(() => {
        console.log('Success');
      })
      .catch(err => {
        console.log('Error', err);
      });
    

    and this is the data.ts file that is in the same directory as index.js

    import { getObjectId } from './helpers';
    import { seedUser } from '../src/api/methods';
    
    const names = ["Hanamichi", "Rukawa", "Haruko", "Akagi", "Mitsuki"];
    
    const users: seedUser[] = names.map((name, i) => {
      return {
        id: getObjectId(name),
        name,
        email: `${name}@email.com`,
        password: `password${i}`,
        role: 'user'
      }
    });
    
    export = users;
    

    I tried following the ts example, any ideas why I keep getting ENOENT ?

    my directory tree is:

    project folder > 
             src 
             data-seed 
             // other stuff
    

    data-seed has its own package.json, and the main project folder has its own package.json too.

    area/core needs more info 
    opened by IrvingArmenta 7
  • Mongo Connection Timeouts because of default DEFAULT_CLIENT_OPTIONS


    Version: 3.4.0

    Hi, great library! I'm getting a connection timeout error. Inspecting a little more, I found out that it is because the current DEFAULT_CLIENT_OPTIONS for the DatabaseConnector are as follows:

    DatabaseConnector.DEFAULT_CLIENT_OPTIONS = {
        ignoreUndefined: true,
        useNewUrlParser: true,
        useUnifiedTopology: true,
        connectTimeoutMS: 1,
    };
    

    where connectTimeoutMS is 1 millisecond, therefore causing a timeout every time.

    I've tried setting mongoClientOptions in the Seeder config, but it still uses the default options when constructing the DatabaseConnector.

    Is there any workaround for this in the current version? Thanks!

    :bug: bug area/core 
    opened by lukasburns 5
  • Add example using Mongo Seeding with Mongoose and Express


    Hi all, I needed a seeder for my project, so I found this helpful package - it's super. I have created an example using JSON, one in Spanish and another in English. I hope it helps the package.

    :book: documentation area/examples 
    opened by D4ITON 5
  • Update DB URI Parsing from CLI Params to Allow mongoDB+srv protocol


    I am deploying a Mongo instance using Mongo Atlas. In one of the newer driver versions, a new protocol was added to allow connecting to a Mongo cluster without having to specify all hosts in the cluster. Details may be found in the MongoDB 3.6 docs. The mongo-seeding-cli works fine with the new protocol if the user specifies the connection address using a DB URI string. However, the user is unable to specify individual parameters, because the core mongo seeding library automatically adds a port to the connection string, which is not allowed when connecting to Mongo using the mongodb+srv protocol.

    mongo-seeding Connecting to mongodb+srv://<myAdmin>:<myPassword>@runewordfinder-0ltsc.mongodb.net:27017/runewordfinder... +9ms
    (node:73572) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.
      mongo-seeding Ports not accepted with `mongodb+srv` URIs
    

    I would like to be able to specify the individual parameters so I may hide my password as an environment variable. I propose that core/src/database/database-connector.ts#getDbConnectionUri(...) is updated to check the provided protocol and create an appropriate string that:

    • Includes the port if protocol === 'mongodb'
    • Does not include the port if protocol === 'mongodb+srv'
    :bug: bug area/core 
    opened by BenedictRasmussen 5
  • Silence logger


    Hello!

    We are using this library in our company, mostly for seeding the database before each test. It works great, but the logging is a terrible feature for us. It may be useful for one-time seeding, but since every one of our tests generates such logs, our test output is bloated.

    Could you please either implement a way to silence logging, or point out how it should properly be done, so I can prepare a PR? I looked into the code and I have at least a few ideas for how to configure the logger (since it's just a simple function now), so perhaps we could discuss that.

    :rocket: enhancement area/core area/cli 
    opened by soanvig 4
  • Leaking credentials in logs


    Just noticed that in the docker logs it will show this:

    mongo_seed_1  | 2019-06-18T03:48:24.527Z mongo-seeding Connecting to mongodb://user:pass@mongodb:27017/db...
    

    I'm using the Dockerfile. I recommend adding an environment variable such as HIDE_CREDS=true.

    If set, I think it would make sense to mask the user:pass like this: mongodb://HIDDEN_USER:HIDDEN_PASS@mongodb:27017/db...

    That way, anybody viewing the logs doesn't automatically have access to the user/pass.

    :bug: bug area/core 
    opened by hongkongkiwi 4
  • Connection string interpreted literally


    First of all, great library! This is such a common use case that I can't believe there aren't go-to solutions out there. Hopefully this library becomes one.

    I noticed that when using a connection string, e.g. /test?retryWrites=true, it should reference an existing database named test in Mongo, but instead a new database is created with the name /test?retryWrites=true. It seems the params from the ? onward are being interpreted literally. I don't know if it's an issue with this library or a dependency.

    :bug: bug area/core 
    opened by murtyjones 4
  • Ability to provide `partialConfig` object when using DB URI in constructor


    There is an old issue from 2019 #73 about being able to switch between databases. I'm currently trying to do something similar in a multi-tenant approach where I want to seed a bunch of different DBs.

    The issue I have is being able to change the database when using a URI. My project reuses a single config file which has my URI string in it, as every library seems to support URIs. This, however, makes it difficult to change the DB name.

    Other libraries I use will connect with the URI to my default DB. Then there is typically a .db(name) function to call on the mongo client to switch the database.

    I think this could be added as an additional parameter to .import(collections, database) or as a separate function to switch DB.

    There was some talk in #73 about how to handle databases with different credentials, and I think that roadblocked the feature. In that case, if you wanted to use a DB with a different user, you could still create a new seeder for that. For development, I think most people are going to have it all under one user, and that wouldn't be an issue.

    :rocket: enhancement area/core 
    opened by jrj2211 4
  • Update MongoDB driver dependency


    The closeConnection function needs to be reworked to check if the db is connected, without using the deprecated method isConnected

    (node:169094) DeprecationWarning: isConnected is deprecated and will be removed in the next major version
        at Database.<anonymous> (/home/user/Programming/some_project/node_services/node_modules/mongo-seeding/src/database/database.ts:105:38)
        at Generator.next (<anonymous>)
        at /home/user/Programming/some_project/node_services/node_modules/mongo-seeding/dist/database/database.js:8:71
        at new Promise (<anonymous>)
        at Object.<anonymous>.__awaiter (/home/user/Programming/some_project/node_services/node_modules/mongo-seeding/dist/database/database.js:4:12)
        at Database.closeConnection (/home/user/Programming/some_project/node_services/node_modules/mongo-seeding/dist/database/database.js:94:16)
        at Seeder.<anonymous> (/home/user/Programming/some_project/node_services/node_modules/mongo-seeding/src/index.ts:144:24)
        at Generator.next (<anonymous>)
        at fulfilled (/home/user/Programming/some_project/node_services/node_modules/mongo-seeding/dist/index.js:15:58)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
    
    :rocket: enhancement area/core 
    opened by ctrlsam 1
  • Handle `export default` in import data files


    I just ran into this issue where all the added data would show up under a single document (with a single ID) in MongoDB. After about 30 minutes of tinkering around, trying stuff, and digging through the source code, I noticed that I had exported my data using export default instead of export = [].

    Why would export default result in this module thinking that I want to add the array as a single document? export = [] doesn't really seem like a pattern many libs use/require.

    ‼️ Wrong way

    export default [
      { name: "My First Item" },
      { name: "My Second Item" },
      { name: "My Third Item" },
      { name: "My Last Item" }
    ];
    

    ✅ Right way

    export = [
      { name: "My First Item" },
      { name: "My Second Item" },
      { name: "My Third Item" },
      { name: "My Last Item" }
    ];
    

    Suggestion

    Maybe it's safe to assume that when the user exports an array, they want it to be imported as separate documents in their database?

    Overall, a very useful tool - saves me a bunch of boilerplate code. 😄

    :rocket: enhancement area/core 
    opened by jessevdp 3
  • Prevent duplicates while seeding


    Like other ORM-based seeders, it would be awesome if the seeding runs were stored in a collection to prevent duplicates within the seeded collections. For example, sequelize (a MySQL ORM) stores timestamps and file names in a separate table to prevent that.

    Keep up the good work!

    :rocket: enhancement area/core 
    opened by dominicrico 3
  • Rework core API


    In the near future, I would like to rewrite the Mongo Seeding core. It may introduce some breaking changes, as I want to support additional features without making the API too complex.

    While rethinking API, I want to consider the following abilities:

    • seed multiple databases with different users (#73)
    • use own MongoDB client
    • minimize custom code as much as possible (for example, by migrating to the built-in reconnect feature in the Mongo client - #97)
    • handle .mjs files (#105)
    :rocket: enhancement area/core 
    opened by pkosiec 4
  • Add a way to add data into multiple databases (specifically admin & app db)


    Let me explain the use for this case.

    Right now, I have Docker spinning up a MongoDB database. Assuming that the database is totally empty, I want to seed it using the "pkosiec/mongo-seeding" Docker image.

    There's just one catch: I also want to seed MongoDB users. These reside in the "admin" database, while my data is in an "app" database.

    I worked around it by using a shell script to seed the DB users when the MongoDB container spins up, then running the seeding container.

    It's really not ideal, though; I'd like to use this image to seed both the DB users in the admin database and my application data.

    Any ideas on this one?

    :rocket: enhancement area/core 
    opened by hongkongkiwi 7
Releases (v3.7.2)

Owner: Paweł Kosiec - Full-stack Cloud Developer, cloud-native & open source enthusiast.