Introduction to MongoDb with .NET part 4: querying basics with find and findOne

Introduction

In the previous post we discussed some details around insertions in MongoDb. We saw that it was quite a painless and straightforward operation: we take the “db” handle, provide the name of the collection where we’d like to insert a document, call the “insert” function on it and pass in a JSON document. All MongoDb documents must have a unique and immutable ID represented by the “_id” field. By default it is of type ObjectId, which comes from the BSON specification. We can also provide an ID of integer, string or some other type ourselves, but then it is our responsibility to keep the ID unique within the collection, otherwise we’ll get an exception. Auto-incremented integer IDs are difficult to implement in MongoDb, but storing a GUID as a string for the ID is a viable option if you’d like to stick to a familiar ID-setting strategy from relational databases.
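
As a quick recap in the mongo shell – the “people” collection name and the GUID value below are just placeholders for this sketch – an insert with the default ObjectId and one with a custom string ID look roughly like this:

// MongoDb generates an ObjectId for the "_id" field automatically
db.people.insert({ "name": "John" });

// here we supply our own string ID, e.g. a GUID; keeping it unique is our responsibility
db.people.insert({ "_id": "3f2504e0-4f89-41d3-9a0c-0305e82c3301", "name": "Mary" });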

In this post we’ll start looking into querying in MongoDb which will span multiple posts. We’ll mostly see a lot of short examples.

Introduction to MongoDb with .NET part 3: document insertions

Introduction

In the previous post we successfully installed the latest version of MongoDb on Windows. I think you’ll agree that it was a very simple and painless process. We started exploring the two most important tools in the MongoDb installation folder. Mongo.exe starts the client with which you can interface with the database using commands and queries. Mongod.exe in turn starts the database server itself. We saw a couple of command and query examples such as inserting a new record, searching for one and deleting one. The default query language of MongoDb is JavaScript and most parameters to the query functions will be in JSON. A JSON query parameter is in fact also a document.
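
The kind of shell commands we tried there look roughly like the following – the “people” collection is again only an example for this sketch:

// insert a new record
db.people.insert({ "name": "John", "age": 30 });

// search for it - the query parameter is itself a JSON document
db.people.find({ "name": "John" });

// delete it again
db.people.remove({ "name": "John" });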

In this post we’ll take a closer look at insertions. Don’t forget to start both mongo.exe and mongod.exe in two separate command prompts if you want to try the examples yourself.

Introduction to MongoDb with .NET part 2: installation on Windows

Introduction

In the previous post we discussed the basic characteristics of MongoDb. We said that it was a NoSql database that stores its data in binary JSON, i.e. BSON documents. These documents are organised into collections, which are somewhat like tables in relational databases. In theory any data structure that can be represented by JSON can be stored in a BSON document. There’s no database schema and there are no constraints: you can store any (un)structured object graph in a collection as long as it can be represented in JSON. MongoDb is also highly scalable and queries perform well, especially if the database is tuned properly. MongoDb also comes with a couple of drawbacks, such as the lack of stored procedures and transactions. You’ll need to weigh all the pros and cons before you decide which data store technology to use in your project.
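
To illustrate the point about JSON, a single document in a collection can hold nested objects and arrays – the customer below is only a made-up example:

{
    "name": "Great customer",
    "address": { "city": "Birmingham", "country": "UK" },
    "orders": [
        { "item": "Book", "quantity": 2 },
        { "item": "Car", "quantity": 1 }
    ]
}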

In this post we’ll first install the MongoDb executables on the local machine. We’ll then see a couple of basic examples of interacting with the database server from the MongoDb client.

Introduction to MongoDb with .NET part 1: background

Introduction

MongoDb is the most popular NoSql database out there at the time of writing this post. It’s used as a data store by a wide range of companies. Large and small, well-established and freshly started organisations have all embraced this relatively new database technology. The default choice for storing data in a .NET project has most often been SQL Server. While SQL Server is probably still the most popular choice for .NET developers, they can choose from other well-tested alternatives depending on their project needs. MongoDb is very easy to set up and start working with.

In this series we’ll explore a number of features of MongoDb and how it can be used in a .NET project. There is already a series dedicated to MongoDb in .NET on this blog starting here. However, that was written a long time ago and MongoDb, like many other technologies in IT, evolves very quickly, so it’s time to revisit it. Also, in this updated series I’d like to devote more time to the raw queries we can send to the MongoDb server than in the previous one. So first we’ll do some querying and data manipulation through the MongoDb shell and then we’ll move on to MongoDb in .NET.

Domain Driven Design with Web API extensions part 16: testing the MongoDb repository in the DDD application

Introduction

In the previous post we implemented the ITimetableViewModelRepository interface in the WebSuiteDemo.Loadtesting.Repository.MongoDb project. We saw that the implementation was practically the same as in the EF repository layer. The only difference was due to details in the MongoDb query syntax.

In this post we’ll finish off this extension series to the load testing DDD demo project. We’ll wire up the MongoDb repository layer with the rest of the application.

Domain Driven Design with Web API extensions part 15: implementing the ITimetableViewModelRepository interface in MongoDb

Introduction

In the previous post we implemented the ITimetableRepository interface in the MongoDb repository layer. We saw that the code carried out the same type of logic as its EF counterpart. Most of the new things were related to MongoDb query syntax.

In this post we’ll implement the ITimetableViewModelRepository interface. Make sure you have the DDD demo project open.

Domain Driven Design with Web API extensions part 14: implementing the ITimetableRepository interface in MongoDb

Introduction

In the previous post we looked at a couple of basic operations with the MongoDb .NET driver. I decided to include that “intermediate” post so that you won’t get overwhelmed with a lot of new code if you’re new to MongoDb.

In this post we’ll implement the ITimetableRepository interface in the MongoDb repository layer.

Domain Driven Design with Web API extensions part 13: query examples with the MongoDb driver

Introduction

In the previous post we successfully seeded our MongoDb load testing data store. We saw that the Seed() method wasn’t all that different from its EntityFramework equivalent.

In this post we’ll look at a range of examples of using the MongoDb driver. We’ll primarily consider CRUD operations. Originally I wanted to simply present the implementation of the domain repository interfaces. However, I thought it might be too overwhelming for MongoDb novices to be presented with all the new object types and query functions at once. The purpose of this “intermediate” post is therefore to provide a soft start with MongoDb queries.

We’ll be working in the MongoDbDatabaseTester project of the DDD demo solution in this post. Make sure you start up the MongoDb server with the “mongod” command in a command prompt.

Domain Driven Design with Web API extensions part 12: seeding the MongoDb database

Introduction

In the previous post we constructed the database objects that represent the database version of the load testing domain objects. Recall that there’s no automated code generation and mapping tool for MongoDb in .NET – at least not yet. Therefore decoupling database and domain objects with MongoDb as the backing store usually means some extra coding, but in return you’ll have completely independent database and domain objects.

We also inserted an abstraction and a concrete implementation for the connection string repository.

In this post we’ll seed the database with the same initial values as we had for the EF data store.

Building a web service with Node.js in Visual Studio Part 8: connecting to MongoDb

Introduction

In the previous post we gave some structure to our Node.js project by way of a service and a repository. We also discussed the role of callbacks in asynchronous code execution, i.e. the “next” parameter. However, we still return hard-coded JSON to all queries. It’s time to connect to the MongoDb database we set up in part 2. The goal of this post is to replace the following code…

module.exports.getAll = function () {
    return { name: "Great Customer", orders: "none yet" };
};

module.exports.getById = function (customerId) {
    return { name: "Great Customer with id " + customerId, orders: "none yet" }
};

…with real DB access code.

In addition, we’ll operate with callbacks all the way from the controller to the repository. It might be overkill for this small demo project but I wanted to demonstrate something that’s similar to the async-await paradigm in .NET. If you’ve worked with the async and await keywords in .NET then you’ll know that once you decorate a method with “async”, the caller of that method will typically become “async” as well, and so on all the way up the call stack.

MongoDb driver

There’s no built-in library in Node to access data in a database. There are, however, a number of drivers available for download through the Node Package Manager. Keep in mind that we’re still dealing with JSON objects, so forget the mapping convenience you’ve got used to while working with Entity Framework in a .NET project. Some would, on the other hand, say that this is actually a benefit because we can work with data in a raw format without the level of abstraction imposed by an object-relational mapper. So whether this is an advantage or a disadvantage depends on your preferences.

We’ll go for a simple driver for MongoDb which allows us to interact with the database at a low level, much like we did through the command line interface in parts 2 and 3 of this series. Open the project we’ve been working on and right-click the npm node. Select the driver called “mongodb”:

MongoDb driver for NodeJs
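
Alternatively, if you prefer the command line to the Visual Studio UI, the same package can be installed with npm directly from the project folder:

npm install mongodb --save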

The central access

Add a new file called “access.js” to the repositories folder. Insert the following code in the file:

var mongoDb = require('mongodb');
var connectionString = "mongodb://localhost:27017/customers";
var database = null;

module.exports.getDbHandle = function (next) {
    if (!database) {
        mongoDb.MongoClient.connect(connectionString, function (err, db) {
            if (err) {
                next(err, null);
            }
            else {
                database = db;
                next(null, database);
            }
        });
    }
    else {
        next(null, database);
    }
};

The purpose of this file is to provide all our repositories with access to our MongoDb database. We first declare the following:

  • We import the mongodb library
  • We declare the connection string which includes the name of our database, i.e. “customers”
  • We set up a field that will be a reference to the database

The getDbHandle function accepts a callback function called “next”, which we’re familiar with by now. We then check whether the “database” field is null – we don’t want to re-open the database every time we need something; MongoDb handles connection pooling automatically. If “database” has already been set then we simply return it through the “next” callback and pass in null for the error.

Otherwise we use the “connect” function of the mongodb library to connect to the database using the connection string and a callback. The connect function will populate the “err” parameter with any error that occurred during the operation and the “db” parameter with a handle to the database. As we saw before, we call the “next” callback with the error if there is one; otherwise we store the handle in our “database” field and pass it back through the “next” callback.

The new customer repository

The updated customer repository – customerRepository.js – looks as follows:

var databaseAccess = require('./access');

module.exports.getAll = function (next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            db.collection("customers").find().toArray(function (err, res) {
                if (err) {
                    next(err, null);
                }
                else {
                    next(null, res);
                }
            });
        }
    });
};

module.exports.getById = function (customerId, next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            var mongoDb = require('mongodb');
            var BSON = mongoDb.BSONPure;
            var objectId = new BSON.ObjectID(customerId);
            db.collection("customers").find({ '_id': objectId }).toArray(function (err, res) {
                if (err) {
                    next(err, null);
                }
                else {
                    next(null, res);
                }
            });
        }
    });
};

We import access.js to get access to the DB handle. The getAll function accepts a “next” callback and calls the getDbHandle function we’ve seen above. If there’s an error while opening the database we call “next” with the error and pass “null” as the result. Otherwise we can go on and query the database. We reference the “customers” collection within the database, and from part 2 of this series we know that the “find()” function with no parameters will return all documents, i.e. all customers. We’re not done yet, though: find() returns a cursor which we need to convert with “toArray()”, and toArray also accepts a callback with the usual signature of error and result. As usual, if there’s an error we call next with the error and null; otherwise we pass null for the error and hand over the result. If all went well, “res” will contain all customers as JSON.

The getById function follows the same setup. Part 2, referred to above, showed how to pass a query document to the find() method, so this should be familiar. The only somewhat complex part is that we need to turn the incoming “customerId” string parameter into an ObjectId object which MongoDb understands. We then pass the converted object id as the search value for the “_id” field. Note that the ObjectID type is also exposed directly on the mongodb module, and newer versions of the driver drop BSONPure in favour of that route.

Calling the repository from the service

The updated customerService code follows the same callback passing paradigm as we saw above:

var customerRepository = require('../repositories/customerRepository');

module.exports.getAllCustomers = function (next) {
    customerRepository.getAll(function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

module.exports.getCustomerById = function (customerId, next) {
    customerRepository.getById(customerId, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

This doesn’t add much functionality to the service apart from calling the repository. Later on when we have the POST/PUT/DELETE functions in place we’ll be able to add validation rules.

index.js in the services folder will be updated accordingly:

var customerService = require('./customerService');

module.exports.getAllCustomers = function (next) {
    customerService.getAllCustomers(function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

module.exports.getCustomerById = function (id, next) {
    customerService.getCustomerById(id, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

Updating the controller

Finally, we’ll extend the controller function to respond with a 400 in case of an error:

var customerService = require('../services');

module.exports.start = function (app) {
    app.get("/customers", function (req, res) {
        
        customerService.getAllCustomers(function (err, customers) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'application/json');
                res.status(200).send(customers);
            }
        });
    });
    
    app.get("/customers/:id", function (req, res) {
        
        var customerId = req.params.id;        
        customerService.getCustomerById(customerId, function (err, customer) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'application/json');
                res.status(200).send(customer);
            }
        });
        
    });
};

Note that we set the status code using the “status” function and the response body using the “send” function. In a real project you’d probably refine the response codes further, but this will be fine for demo purposes.
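
If you’d like a rough idea of what such a refinement could look like, here’s a sketch – not part of the demo code – where the single-customer route distinguishes a server-side failure from a lookup that simply found no match:

app.get("/customers/:id", function (req, res) {
    customerService.getCustomerById(req.params.id, function (err, customer) {
        if (err) {
            // something went wrong on our side, e.g. the database was unreachable
            res.status(500).send(err);
        }
        else if (!customer || customer.length === 0) {
            // the query ran fine but no customer matched the id
            res.status(404).send({ message: "Customer not found" });
        }
        else {
            res.set('Content-Type', 'application/json');
            res.status(200).send(customer);
        }
    });
});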

Test

Run the application and navigate to /customers. Depending on how closely you followed parts 2 and 3 of this series you may get a different response from the database. In my case I got the following:

[
   {
      "_id": "544cbaf1da8014d9145c85e7",
      "name": "Donald Duck",
      "orders": []
   },
   {
      "_id": "544cb61fda8014d9145c85e6",
      "name": "Great customer",
      "orders": [
         {
            "item": "Book",
            "quantity": 2,
            "itemPrice": 10
         },
         {
            "item": "Car",
            "quantity": 1,
            "itemPrice": 2000
         }
      ]
   }
]

Copy the _id field and enter it as /customers/[id] in the browser, e.g. /customers/544cb61fda8014d9145c85e6 in the above case. The browser shows the following output:

[
   {
      "_id": "544cb61fda8014d9145c85e6",
      "name": "Great customer",
      "orders": [
         {
            "item": "Book",
            "quantity": 2,
            "itemPrice": 10
         },
         {
            "item": "Car",
            "quantity": 1,
            "itemPrice": 2000
         }
      ]
   }
]

Great, we have the getAll and getById functionality in place.

We’ll continue with insertions in the next post.

View all posts related to Node here.
