
Getting started with MongoDB and Mongoose

What is Mongoose?

Mongoose describes itself as “elegant mongodb object modeling for node.js”. If you have used MongoDB before and tried basic database operations, you might have noticed that MongoDB is “schemaless”. When you want to implement a more structured database and still leverage the power of MongoDB, Mongoose is one of the ODM (Object Data Modeling) solutions to reach for.

To quickly demonstrate, suppose you run an insert command against a collection named users, like


db.users.insert({ name : 'Arvind', gender : 'male'});

And right after that you can run


db.users.insert({ name : 'Arvind', gender : 'male', password : '!@#$'});

and MongoDB will never complain about the variation in the number of fields (key-value pairs). This is very flexible. But when you want to keep your data more organized and structured, you would need to maintain that structure in your server code, writing validation and making sure nothing irrelevant is stored in a collection. And this is where Mongoose makes life easy.

“Mongoose provides a straight-forward, schema-based solution to modeling your application data and includes built-in type casting, validation, query building, business logic hooks and more, out of the box.”

Install Node.js & MongoDB

To use Mongoose, we need to have Node.js and MongoDB installed; you can find the installation info here.

Start Developing

Let us first create a small playground where we can have fun. Create a new folder named myMongooseApp, open a terminal/prompt there, and run

npm init

This will help us initialize a new node project. Fill it up as required. Next, we will install Mongoose as a dependency of our project (it is a runtime dependency, so we use --save rather than --save-dev). Run

npm install mongoose --save

then start the MongoDB service by running

mongod

Next, create a new file named index.js at the root of the project and open it up in your favorite editor. Then add the below code.

var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/myTestDB');

var db = mongoose.connection;

// log connection errors
db.on('error', function (err) {
    console.log('connection error', err);
});
// runs once, when the connection is first opened
db.once('open', function () {
    console.log('connected.');
});

Here, we require the mongoose package, connect to the DB, and set up the connection event handlers. The name of our database is myTestDB.

Then run

node index.js

and you should see the connected message. You can also use a node package named nodemon to automatically restart the node server on changes.
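
If you want to try nodemon, a typical setup (assuming you install it globally) looks like

npm install -g nodemon
nodemon index.js
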
Now our sandbox is ready to play in!

Mongoose Schemas

Schemas are like skeletons: the bare bones of what your data collection will look like. If you are dealing with a collection of users, your schema would look something like this.

Name - String
Age - Number
Gender - String
Date of Birth - Date

And if you are dealing with a collection of products, your schema will look something like this

SKU - String
Name - String
Price - Number
InStock - Boolean
Quantity - Number

You get the drift. When our data is guarded with a schema like this, the possibility of storing garbage data reduces drastically.

Now that we have an understanding of schemas, let's try to build a user schema using Mongoose. Back in index.js, add the below code

var Schema = mongoose.Schema;
var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
});

These are basic user-related fields and their schema types. You can find the list of schema types here.
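
To illustrate a few more of the available types, here is a hypothetical blogSchema (a sketch for illustration only, not part of our app):

var blogSchema = new Schema({
    title : String,
    views : Number,
    published : Boolean,
    createdOn : { type : Date, default : Date.now },
    tags : [String],                  // array of strings
    author : Schema.Types.ObjectId,   // reference to another document
    meta : Schema.Types.Mixed         // anything goes
});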

Next, we will create a model from the schema. Add

var User = mongoose.model('User', userSchema);

That's it, our User model is ready. We will use this as our base schema to insert users into the database. This way, we know that every document in the User collection will have the fields listed in the schema. Let's create a new user instance and save it to the DB. Add

var arvind = new User({
    name : 'Arvind',
    age : 99,
    DOB : '01/01/1915',
    isAlive : true
});

// save the document; the callback receives the saved data
arvind.save(function (err, data) {
    if (err) console.log(err);
    else console.log('Saved : ', data);
});

And you should see something like

Saved : { __v: 0,
  name: 'Arvind',
  age: 99,
  DOB: Fri Jan 01 1915 00:00:00 GMT+0530 (IST),
  isAlive: true,
  _id: 536a4866dba434390d728216 }

No hassles, no issues. A simple and easy API to interact with models.
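
Reading documents back is just as easy. As a quick sketch (using the User model we just created), you could fetch every user named 'Arvind' like this:

// find all users named 'Arvind'
User.find({ name : 'Arvind' }, function (err, users) {
    if (err) return console.log(err);
    console.log('Found : ', users);
});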

Let's say that we want each user to have a method named isYounger. This method will return true if the age is less than 50, and false otherwise. We could do this by querying the DB for the current user, checking the condition, and then returning true or false.
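
As a rough sketch of that query-based approach (userId here is a hypothetical variable holding the document's _id):

User.findById(userId, function (err, user) {
    if (err) return console.log(err);
    // check the condition on the fetched document
    console.log('isYounger : ', user.age < 50);
});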

But what if we want this method on every instance of the User model? This is how we do it in Mongoose

var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/myTestDB');

var db = mongoose.connection;

db.on('error', function (err) {
    console.log('connection error', err);
});
db.once('open', function () {
    console.log('connected.');
});

var Schema = mongoose.Schema;
var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
});

// instance method: available on every document created from this model
userSchema.methods.isYounger = function () {
    return this.age < 50;
};

var User = mongoose.model('User', userSchema);

var arvind = new User({
    name : 'Arvind',
    age : 99,
    DOB : '01/01/1915',
    isAlive : true
});

arvind.save(function (err, data) {
    if (err) console.log(err);
    else console.log('Saved ', data);
});

console.log('isYounger : ', arvind.isYounger());

We add the isYounger method definition just before creating the model, and the logged result will be false, since the age is 99. This is a simple, handy way of adding methods to your schema, making it more object-oriented-ish.
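
Closely related to instance methods are statics, which hang off the model itself rather than off each document. A small sketch (findByName is our own hypothetical helper, defined before the model is created):

userSchema.statics.findByName = function (name, cb) {
    // 'this' refers to the model here, not a document
    return this.find({ name : name }, cb);
};

// usage, once the User model exists
User.findByName('Arvind', function (err, users) {
    if (err) return console.log(err);
    console.log(users);
});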

In case you have a password field, you can add methods like encryptPassword(), to encrypt the password, and comparePassword(), to compare passwords at login, to the userSchema itself. You can read more about password authentication here.
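
A minimal sketch of what those two methods might look like, assuming a password field on the schema and the bcryptjs package (neither is part of this tutorial):

var bcrypt = require('bcryptjs');

userSchema.methods.encryptPassword = function (password) {
    // hash the plain-text password with 8 salt rounds
    return bcrypt.hashSync(password, 8);
};

userSchema.methods.comparePassword = function (candidate) {
    // compare a login attempt against the stored hash
    return bcrypt.compareSync(candidate, this.password);
};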

Out of the box, Mongoose also provides a few options when the schema is created. For example, if you take a look at the below schema declaration, we are passing an option

var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
}, { strict : false });

strict : false. The strict option is true by default, and it does not allow ‘non-schema’ key-value pairs to be saved. Example:

var arvind = new User({
    name : 'Arvind',
    age : 99,
    DOB : '01/01/1915',
    isAlive : true
});

will be saved, whereas

var arvind = new User({
    name : 'Arvind',
    age : 99,
    DOB : '01/01/1915',
    isAlive : true,
    bucketList : [{...}, {...}, {...}]
});

with the default strict : true, all of the above will be saved minus the bucketList array, because it was not declared as part of the schema (with strict : false, the extra field would be kept). So no client who consumes your services will be able to dump invalid data into your collections.

Another cool option is collection. If you don't want your model name to be the same as your collection name, you can pass the collection name as an option, like

var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
}, { collection : 'appusers' });

You can find a list of other options here.
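
Another one worth knowing is versionKey, which controls the __v field you saw in the saved output earlier. A sketch combining it with collection:

var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
}, { collection : 'appusers', versionKey : false });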

With Mongoose, you can also hook into events on your schemas, like pre save or post save, where you can perform validations, process data, or run other queries. These hooks are called middleware.
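
For instance, a serial pre save hook runs before every save and is a natural place for clean-up or validation (a minimal sketch; trimming the name is our own example):

userSchema.pre('save', function (next) {
    // normalize the name before every save
    if (this.name) this.name = this.name.trim();
    next();
});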

Mongoose also supports parallel middleware, like

var userSchema = new Schema({
    name : String,
    age : Number,
    DOB : Date,
    isAlive : Boolean
});
userSchema.pre('save', true, function (next, done) {
    // calling next kicks off the next middleware in parallel
    next();
    // doAsync is a placeholder for your own async work; call done when finished
    doAsync(done);
});

You can read more about middleware here.


Thanks for reading. Do comment.
@arvindr21

