Building a web service with Node.js in Visual Studio Part 11: PUT and DELETE operations

Introduction

In the previous post we tested the GET operations of our demo web service through the C# console tester application. In this post we'll look at two other HTTP verbs in action. We'll add and test a PUT endpoint to update a customer; in particular, we'll add new orders to an existing customer. We'll also remove a customer through a DELETE endpoint. This post wraps up the series on Node.js.

We’ll extend the CustomerOrdersApi demo application so have it ready in Visual Studio.

Updating a customer

Let’s start with the repository and work our way up to the controller. Add the following method to customerRepository.js:

module.exports.addOrders = function (customerId, newOrders, next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            var collection = db.collection("customers");
            var mongoDb = require('mongodb');
            var BSON = mongoDb.BSONPure;
            var objectId = new BSON.ObjectID(customerId);
            collection.find({ '_id': objectId }).count(function (err, count) {
                if (err) {
                    next(err, null);
                }
                else if (count == 0) {
                    next("No matching customer", null);
                }
                else {
                    collection.update({ '_id': objectId }, { $addToSet: { orders: { $each: newOrders } } }, function (err, result) {
                        if (err) {
                            next(err, null);
                        }
                        else {
                            next(null, result);
                        }
                    });
                }
            });
        }
    });
};

Most of this code should look familiar from the previous posts in this series. We first check whether a customer with the incoming customer ID exists. If not, we return an error through the "next" callback. Otherwise we update the customer. The newOrders parameter holds the orders to be added as an array.

The update statement may look strange at first, but MongoDb's $addToSet operator coupled with the $each modifier lets us push all elements of one array into an existing one. If we simply used the $push operator, it would add the newOrders array as a single element of the customer's orders array, i.e. we'd end up with an array within an array, which is not what we want. The $each modifier goes through the elements of newOrders and adds them to the orders array one by one. If the update succeeds, MongoDb returns the number of documents updated, which is assigned to the "result" parameter. We expect it to be 1, as there's only one customer with a given ID.
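To see the difference in plain JavaScript terms (an analogy only, not driver code): appending an array as a single element nests it, while appending its elements one by one keeps the result flat, which is what $addToSet with $each achieves.

```javascript
// Plain-JS analogy of the MongoDb operators; not actual driver code.
var orders = [{ item: "Book" }];
var newOrders = [{ item: "Food" }, { item: "Drink" }];

// $push with the whole newOrders array: it becomes one nested element.
var nested = orders.concat([newOrders]);
console.log(nested.length);    // 2; the second element is itself an array

// $addToSet with $each: every element is appended individually.
var flat = orders.concat(newOrders);
console.log(flat.length);      // 3; one flat array of order objects
```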

Let’s extend customerService.js:

module.exports.addOrders = function (customerId, orderItems, next) {
    if (!customerId) {
        var err = "Missing customer id property";
        next(err, null);
    }
    else {
        customerRepository.addOrders(customerId, orderItems, function (err, res) {
            if (err) {
                next(err, null);
            }
            else {
                next(null, res);
            }
        });
    }
};

Here comes the new function in index.js within the services folder:

module.exports.addOrders = function (customerId, orderItems, next) {
    customerService.addOrders(customerId, orderItems, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

…and finally we can build the PUT endpoint in customersController.js:

app.put("/customers", function (req, res) {
        var orders = req.body.orders;
        var customerId = req.body.customerId;
        customerService.addOrders(customerId, orders, function (err, itemCount) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'text/plain');
                res.status(200).send(itemCount.toString());
            }
        });
    });

We read the "orders" and "customerId" parameters from the request body, just as we did when inserting a new customer. If the operation succeeds we return HTTP 200, i.e. "OK", and respond with the number of updated items in plain text format.
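For reference, the request body this endpoint expects could look as follows; the property names are the ones the controller reads from req.body, while the ID value is just a sample:

```javascript
// Sample PUT payload for /customers; property names must match the controller.
var payload = {
    customerId: "544cb61fda8014d9145c85e6",
    orders: [
        { item: "Food", quantity: 3, itemPrice: 2 },
        { item: "Drink", quantity: 4, itemPrice: 3 }
    ]
};
var jsonBody = JSON.stringify(payload);
console.log(jsonBody);
```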

Let’s test this from our little tester console application. Add the following method to ApiTesterService.cs:

public int TestUpdateFunction(String customerId, List<Order> newOrders)
{
	HttpRequestMessage putRequest = new HttpRequestMessage(HttpMethod.Put, new Uri("http://localhost:1337/customers/"));
	putRequest.Headers.ExpectContinue = false;
	AddOrdersToCustomerRequest req = new AddOrdersToCustomerRequest() { CustomerId = customerId, NewOrders = newOrders };
	string jsonBody = JsonConvert.SerializeObject(req);
	putRequest.Content = new StringContent(jsonBody, Encoding.UTF8, "application/json");
	HttpClient httpClient = new HttpClient();
	httpClient.Timeout = new TimeSpan(0, 10, 0);
	Task<HttpResponseMessage> httpRequest = httpClient.SendAsync(putRequest,
			HttpCompletionOption.ResponseContentRead, CancellationToken.None);
	HttpResponseMessage httpResponse = httpRequest.Result;
	HttpStatusCode statusCode = httpResponse.StatusCode;

	HttpContent responseContent = httpResponse.Content;
	if (responseContent != null)
	{
		Task<String> stringContentsTask = responseContent.ReadAsStringAsync();
		String stringContents = stringContentsTask.Result;
		if (statusCode == HttpStatusCode.OK)
		{
			return Convert.ToInt32(stringContents);
		}
		else
		{
			throw new Exception(string.Format("No customer updated: {0}", stringContents));
		}
	}
	throw new Exception("No customer updated");
}

…where AddOrdersToCustomerRequest looks as follows:

public class AddOrdersToCustomerRequest
{
	[JsonProperty(PropertyName="customerId")]
	public String CustomerId { get; set; }
	[JsonProperty(PropertyName="orders")]
	public List<Order> NewOrders { get; set; }
}

We set the JSON property names according to the expected values we set in the controller. We send the JSON payload to the PUT endpoint and convert the response into an integer. We can call this method from Program.cs as follows:

private static void TestCustomerUpdate()
{
	Console.WriteLine("Testing item update.");
	Console.WriteLine("=================================");
	try
	{
		ApiTesterService service = new ApiTesterService();
		List<Customer> allCustomers = service.GetAllCustomers();
		Customer customer = SelectRandom(allCustomers);
		List<Order> newOrders = new List<Order>()
		{
			new Order(){Item = "Food", Price = 2, Quantity = 3}
			, new Order(){Item = "Drink", Price = 3, Quantity = 4}
			, new Order(){Item = "Taxi", Price = 10, Quantity = 1}
		};
		int updatedItemsCount = service.TestUpdateFunction(customer.Id, newOrders);
		Console.WriteLine("Updated customer {0} ", customer.Name);
		Console.WriteLine("Updated items count: {0}", updatedItemsCount);
	}
	catch (Exception ex)
	{
		Console.WriteLine("Exception caught while testing PUT: {0}", ex.Message);
	}
	Console.WriteLine("=================================");
	Console.WriteLine("End of PUT operation test.");
}

We first extract all customers, then select one at random using the SelectRandom method we saw in the previous post. We then build an arbitrary orders list and call the TestUpdateFunction of the service. If all goes well then we print the name of the updated customer and the number of updated items which we expect to be 1. Otherwise we print the exception message. Call this method from Main:

static void Main(string[] args)
{
	TestCustomerUpdate();

	Console.WriteLine("Main done...");
	Console.ReadKey();
}

Start the application with F5. As the Node.js project is set as the startup project you’ll see it start in a browser as before. Do the following to start the tester console app:

  • Right-click it in Solution Explorer
  • Select Debug
  • Select Start new instance

You should see output similar to the following:

Testing PUT operation through tester application

If you then navigate to /customers in the appropriate browser window then you should see the new order items. In my case the JSON output looks as follows:

[{"_id":"544cb61fda8014d9145c85e6","name":"Great customer","orders":[{"item":"Food","quantity":3,"itemPrice":2},{"item":"Drink","quantity":4,"itemPrice":3},{"item":"Taxi","quantity":1,"itemPrice":10}]},{"_id":"546b56f1b8fd6abc122cc8ff","name":"hello","orders":[]}]

This was one application of PUT. You can use the same endpoint to update other parts of your domain, e.g. the customer name.

Deleting a customer

We'll create a DELETE endpoint to remove a customer. A DELETE request conventionally carries no body, so we'll instead send the ID of the customer to be deleted in the URL. We saw an example of that when we retrieved a single customer by its ID.

Here’s the remove function in customerRepository.js:

module.exports.remove = function (customerId, next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            var collection = db.collection("customers");
            var mongoDb = require('mongodb');
            var BSON = mongoDb.BSONPure;
            var objectId = new BSON.ObjectID(customerId);
            collection.remove({ '_id': objectId }, function (err, result) {
                if (err) {
                    next(err, null);
                }
                else {
                    next(null, result);
                }
            });
        }
    });
};

As with the update, MongoDb will return the number of deleted documents in the "result" parameter. Let's extend customerService.js:

module.exports.deleteCustomer = function (customerId, next) {
    if (!customerId) {
        var err = "Missing customer id property";
        next(err, null);
    }
    else {
        customerRepository.remove(customerId, function (err, res) {
            if (err) {
                next(err, null);
            }
            else {
                next(null, res);
            }
        });
    }
};

…and index.js in the services folder:

module.exports.deleteCustomer = function (customerId, next) {
    customerService.deleteCustomer(customerId, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

Finally let’s add the DELETE endpoint to customersController:

app.delete("/customers/:id", function (req, res) {
        var customerId = req.params.id;
        customerService.deleteCustomer(customerId, function (err, itemCount) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'text/plain');
                res.status(200).send(itemCount.toString());
            }
        });
    });

Like above, we return HTTP 200 and the number of deleted items if the operation has gone well.

Back in the tester app let’s add the following test method to ApiTesterService:

public int TestDeleteFunction(string customerId)
{
	HttpRequestMessage getRequest = new HttpRequestMessage(HttpMethod.Delete, new Uri("http://localhost:1337/customers/" + customerId));
	getRequest.Headers.ExpectContinue = false;
	HttpClient httpClient = new HttpClient();
	httpClient.Timeout = new TimeSpan(0, 10, 0);
	Task<HttpResponseMessage> httpRequest = httpClient.SendAsync(getRequest,
			HttpCompletionOption.ResponseContentRead, CancellationToken.None);
	HttpResponseMessage httpResponse = httpRequest.Result;
	HttpStatusCode statusCode = httpResponse.StatusCode;
	HttpContent responseContent = httpResponse.Content;
	if (responseContent != null)
	{
		Task<String> stringContentsTask = responseContent.ReadAsStringAsync();
		String stringContents = stringContentsTask.Result;
		if (statusCode == HttpStatusCode.OK)
		{
			return Convert.ToInt32(stringContents);
		}
		else
		{
			throw new Exception(string.Format("No customer deleted: {0}", stringContents));
		}
	}
	throw new Exception("No customer deleted");
}

This method is very similar to its update counterpart. We send our request to the DELETE endpoint and wait for the response. If all goes well then we return the number of deleted items to the caller. The caller can look like this in Program.cs:

private static void TestCustomerDeletion()
{
	Console.WriteLine("Testing item deletion.");
	Console.WriteLine("=================================");
	try
	{
		ApiTesterService service = new ApiTesterService();
		List<Customer> allCustomers = service.GetAllCustomers();
		Customer customer = SelectRandom(allCustomers);

		int deletedItemsCount = service.TestDeleteFunction(customer.Id);
		Console.WriteLine("Deleted customer {0} ", customer.Name);
		Console.WriteLine("Deleted items count: {0}", deletedItemsCount);
	}
	catch (Exception ex)
	{
		Console.WriteLine("Exception caught while testing DELETE: {0}", ex.Message);
	}

	Console.WriteLine("=================================");
	Console.WriteLine("End of DELETE operation test.");
}

Like above, we retrieve all existing customers and select one at random for deletion.

Call the above method from Main:

static void Main(string[] args)
{			
	TestCustomerDeletion();

	Console.WriteLine("Main done...");
	Console.ReadKey();
}

Start both the web and the console application like we did above. You should see output similar to the following:

Testing DELETE through tester application

There you have it. We’ve built a starter Node.js application with the 4 basic web operations: GET, POST, PUT and DELETE. Hopefully this will be enough for you to start building your own Node.js project.

View all posts related to Node here.


Building a web service with Node.js in Visual Studio Part 10: testing GET actions

Introduction

In the previous post of this series we tested the insertion of new customers through a simple C# console application.

In this post we’ll extend our demo C# tester application to test GET actions.

We’ll be working on our demo application CustomerOrdersApi so have it ready in Visual Studio and let’s get to it.

Testing GET

We created our domain objects in the previous post with JSON-related attributes: Customer and Order. We’ll now use these objects to test the GET operations of the web service. We created a class called ApiTesterService to run the tests for us. Open that file and add the following two methods:

public List<Customer> GetAllCustomers()
{
	HttpRequestMessage getRequest = new HttpRequestMessage(HttpMethod.Get, new Uri("http://localhost:1337/customers"));
	getRequest.Headers.ExpectContinue = false;
	HttpClient httpClient = new HttpClient();
	httpClient.Timeout = new TimeSpan(0, 10, 0);
	Task<HttpResponseMessage> httpRequest = httpClient.SendAsync(getRequest,
			HttpCompletionOption.ResponseContentRead, CancellationToken.None);
	HttpResponseMessage httpResponse = httpRequest.Result;
	HttpStatusCode statusCode = httpResponse.StatusCode;
	HttpContent responseContent = httpResponse.Content;
	if (responseContent != null)
	{
		Task<String> stringContentsTask = responseContent.ReadAsStringAsync();
		String stringContents = stringContentsTask.Result;
		List<Customer> allCustomers = JsonConvert.DeserializeObject<List<Customer>>(stringContents);
		return allCustomers;
	}

	throw new IOException("Exception when retrieving all customers");
}

public Customer GetSpecificCustomer(String id)
{
	HttpRequestMessage getRequest = new HttpRequestMessage(HttpMethod.Get, new Uri("http://localhost:1337/customers/" + id));
	getRequest.Headers.ExpectContinue = false;
	HttpClient httpClient = new HttpClient();
	httpClient.Timeout = new TimeSpan(0, 10, 0);
	Task<HttpResponseMessage> httpRequest = httpClient.SendAsync(getRequest,
			HttpCompletionOption.ResponseContentRead, CancellationToken.None);
	HttpResponseMessage httpResponse = httpRequest.Result;
	HttpStatusCode statusCode = httpResponse.StatusCode;
	HttpContent responseContent = httpResponse.Content;
	if (responseContent != null)
	{
		Task<String> stringContentsTask = responseContent.ReadAsStringAsync();
		String stringContents = stringContentsTask.Result;
		List<Customer> customers = JsonConvert.DeserializeObject<List<Customer>>(stringContents);
		return customers[0];
	}
	throw new IOException("Exception when retrieving single customer.");
}

I know there's a lot of duplication here, but it will suffice for demo purposes. GetAllCustomers(), as the name implies, retrieves all customers from the Node.js web service. GetSpecificCustomer(string id) retrieves a single customer by its ID.

You may be wondering why we deserialize a list of customers in GetSpecificCustomer. MongoDb returns all matching documents in an array, so even a single matching document is wrapped in an array. Therefore we extract the first and only element from that list and return it from the function. We saw something similar in the previous post, where we tested the POST operation: MongoDb responded with an array containing a single element, i.e. the one that was inserted.
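In other words, the raw response for a single match is still a JSON array, and the client has to unwrap it; in JavaScript terms:

```javascript
// The service returns an array even for a single match; the client unwraps it.
var responseBody = '[{"_id":"544cbaf1da8014d9145c85e7","name":"Donald Duck","orders":[]}]';
var customers = JSON.parse(responseBody);
var customer = customers[0];
console.log(customer.name);    // Donald Duck
```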

So we’re testing the two GET endpoints of our Node.js service:

app.get("/customers" ...
app.get("/customers/:id" ...

We can call these methods from Program.cs through the following helper method:

private static void TestCustomerRetrieval()
{
	Console.WriteLine("Testing item retrieval.");
	Console.WriteLine("Retrieving all customers:");
	Console.WriteLine("=================================");
	ApiTesterService service = new ApiTesterService();
	try
	{
		List<Customer> allCustomers = service.GetAllCustomers();
		Console.WriteLine("Found {0} customers: ", allCustomers.Count);
		foreach (Customer c in allCustomers)
		{
			Console.WriteLine("Id: {0}, name: {1}, has {2} order(s).", c.Id, c.Name, c.Orders.Count);
			foreach (Order o in c.Orders)
			{
				Console.WriteLine("Item: {0}, price: {1}, quantity: {2}", o.Item, o.Price, o.Quantity);
			}
		}

		Console.WriteLine();
		Customer customer = SelectRandom(allCustomers);
		Console.WriteLine("Retrieving single customer with ID {0}.", customer.Id);
		Customer getById = service.GetSpecificCustomer(customer.Id);

		Console.WriteLine("Id: {0}, name: {1}, has {2} order(s).", getById.Id, getById.Name, getById.Orders.Count);
		foreach (Order o in getById.Orders)
		{
			Console.WriteLine("Item: {0}, price: {1}, quantity: {2}", o.Item, o.Price, o.Quantity);
		}
	}
	catch (Exception ex)
	{
		Console.WriteLine("Exception caught while testing GET: {0}", ex.Message);
	}

	Console.WriteLine("=================================");
	Console.WriteLine("End of item retrieval tests.");
}

The above method first retrieves all customers from the service and prints some information about them: their IDs and orders. Then a customer is selected at random and we test the “get by id” functionality. Here’s the SelectRandom method:

private static Customer SelectRandom(List<Customer> allCustomers)
{
	Random random = new Random();
	int i = random.Next(0, allCustomers.Count);
	return allCustomers[i];
}

Call TestCustomerRetrieval from Main:

static void Main(string[] args)
{			
	TestCustomerRetrieval();

	Console.WriteLine("Main done...");
	Console.ReadKey();
}

Start the application with F5. As the Node.js project is set as the startup project you’ll see it start in a browser as before. Do the following to start the tester console app:

  • Right-click it in Solution Explorer
  • Select Debug
  • Select Start new instance

If all goes well then you’ll see some customer information in the command window depending on what you’ve entered into the customers collection before:

testing GET through tester application

The web service responds with JSON similar to the following in the case of “get all customers”:

[{"_id":"544cbaf1da8014d9145c85e7","name":"Donald Duck","orders":[]},{"_id":"544cb61fda8014d9145c85e6","name":"Great customer","orders":[{"item":"Book","quantity":2,"itemPrice":10},{"item":"Car","quantity":1,"itemPrice":2000}]},{"_id":"546b56f1b8fd6abc122cc8ff","name":"hello","orders":[]}]

…and here’s the raw response body of “get by id”:

[{"_id":"544cbaf1da8014d9145c85e7","name":"Donald Duck","orders":[]}]

Note that this JSON is also an array, which is why we need to read the first element from the customers list in the GetSpecificCustomer function, as noted above.

In the next post, which will finish this series, we’ll take a look at updates and deletions, i.e. the PUT and DELETE operations.


Building a web service with Node.js in Visual Studio Part 9: testing POST actions

Introduction

In the previous post we extended the service and repository to handle GET requests. We also managed to connect to the local MongoDb database. We can read all customers from the customers collection and we can also search by ID.

In this post we’ll set up a little test application that will call the Node.js service. We’ll also see how to insert a new customer in the database.

POST operations

Inserting a new resource is generally performed via either PUT or POST operations. Here we'll adopt the following convention:

  • POST: insert a new resource
  • PUT: update an existing resource

We’ll build up the insertion logic from the bottom up, i.e. we’ll start with the repository. Open the CustomerOrdersApi demo application and locate customerRepository.js. Add the following module.exports statement to expose the insertion function:

module.exports.insertBrandNew = function (customerName, next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            //check for existence of customer name
            var collection = db.collection("customers");
            collection.find({ 'name': customerName }).count(function (err, count) {
                if (err) {
                    next(err, null);
                }
                else {
                    if (count > 0) {
                        err = "Customer with this name already exists";
                        next(err, null);
                    }
                    else {
                        //insert new customer with empty orders array
                        var newCustomer = {
                            name : customerName
                            , orders : []
                        };
                        collection.insert(newCustomer, function (err, result) {
                            if (err) {
                                next(err, null);
                            }
                            else {
                                next(null, result);
                            }
                        });                        
                    }
                }
            });
        }
    });
};

Let’s go through this function step by step. The function accepts the “next” callback which we’re familiar with by now. It also accepts a parameter to hold the name of the new customer. The idea is that we’ll enter a new customer with an empty orders array so there’s no parameter for the orders.

You'll recognise the top section of the function body, i.e. where we get hold of the database. If that process generates an error then we return it. Otherwise we check whether a customer with that name already exists; we don't want to enter duplicates. The "count" function, which also accepts a callback, populates the "count" parameter with the number of customers matching customerName. If "count" is larger than 0 then we return an error.

Otherwise we construct a new customer object and insert it into the customers collection. The "insert" function accepts the customer object in JSON format and, of course, a callback with the error and result parameters. If there were no exceptions, the "result" parameter holds the new customer we inserted into the database. We return that object to the caller through the "next" callback.
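The check-then-insert control flow can be sketched against a plain in-memory array (an illustration only; the real code uses the asynchronous MongoDb calls above):

```javascript
// Illustration of the duplicate check with an in-memory "collection".
var customers = [{ name: "Great customer", orders: [] }];

function insertBrandNew(customerName, next) {
    var count = customers.filter(function (c) {
        return c.name === customerName;
    }).length;
    if (count > 0) {
        next("Customer with this name already exists", null);
    }
    else {
        var newCustomer = { name: customerName, orders: [] };
        customers.push(newCustomer);
        next(null, newCustomer);
    }
}

insertBrandNew("Great customer", function (err, res) {
    console.log(err);    // duplicate, so err is set
});
insertBrandNew("Donald Duck", function (err, res) {
    console.log(res.name);    // Donald Duck
});
```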

customerService.js will also be extended accordingly. Add the following function to that file:

module.exports.insertNewCustomer = function (customerName, next) {
    if (!customerName) {
        var err = "Missing customer name property";
        next(err, null);
    }
    else {
        customerRepository.insertBrandNew(customerName, function (err, res) {
            if (err) {
                next(err, null);
            }
            else {
                next(null, res);
            }
        });
    }
};

We check if customerName is null. If not then we call upon the repository. index.js in the services folder will be extended as well with a new function:

module.exports.insertNewCustomer = function (customerName, next) {
    customerService.insertNewCustomer(customerName, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

Finally, we need to add a new route to the customersController.js:

app.post("/customers", function(req, res) {
        var customerName = req.body.customerName;
        customerService.insertNewCustomer(customerName, function (err, newCustomer) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'application/json');
                res.status(201).send(newCustomer);
            }
        });

    });

POST actions are handled through the "post" method, just like GET actions are handled by "get". We POST to the "/customers" endpoint and send the customer name in the request body. The request body can be retrieved through the "body" property of the request object. If the request body is JSON formatted then the individual JSON properties can be extracted as shown in the example. We then call the appropriate function in customerService. If the customer was inserted we respond with HTTP 201, i.e. "Created", and return the new object in the response.

There's one more thing we need to do. If we tested this code as it is now, req.body would be "undefined". We need to add another middleware from npm to make the request body readable. Right-click "npm" and install the Node.js middleware called "body-parser":

body-parser middleware in NPM

We’ll need to reference this package in server.js as follows:

var http = require('http');
var express = require('express');
var controllers = require('./controllers');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

controllers.start(app);

var port = process.env.port || 1337;
http.createServer(app).listen(port);
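Under the hood, bodyParser.json() buffers the incoming request stream and parses it before our handler runs. A much-simplified stdlib-only sketch of that step (parseJsonBody is illustrative, not the actual middleware):

```javascript
// Simplified sketch of what bodyParser.json() does with each request stream.
function parseJsonBody(req, callback) {
    var chunks = [];
    req.on('data', function (chunk) { chunks.push(chunk); });
    req.on('end', function () {
        try {
            var body = JSON.parse(Buffer.concat(chunks).toString('utf8'));
            callback(null, body);    // this is what ends up in req.body
        }
        catch (err) {
            callback(err, null);
        }
    });
}
```

Any object that emits 'data' and 'end' events works with this sketch, which is also how it could be exercised without a real server.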

Testing with code

Let’s test what we have so far from a simple .NET application. Add a new C# Console application to the solution and call it ApiTester. Add references to the following libraries:

  • System.Net
  • System.Net.Http

These are necessary to make HTTP calls to the Node.js web service. We’ll be communicating a lot using JSON strings so add the following JSON package through NuGet:

json.net nuget

Next we’ll insert two C# classes that represent our thin domain layer, Customer and Order:

public class Customer
{
	[JsonProperty(PropertyName = "_id")]
	public String Id { get; set; }
	[JsonProperty(PropertyName="name")]
	public String Name { get; set; }
	[JsonProperty(PropertyName="orders")]
	public List<Order> Orders { get; set; }
}
public class Order
{
	[JsonProperty(PropertyName = "item")]
	public string Item { get; set; }
	[JsonProperty(PropertyName = "quantity")]
	public int Quantity { get; set; }
	[JsonProperty(PropertyName = "itemPrice")]
	public decimal Price { get; set; }
}

The JsonProperty attributes indicate the name of the JSON property that will be mapped to the C# object property. This mapping is necessary so that, when we read Customer objects from the service, the properties in the JSON response are correctly translated into the properties of our domain objects.
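To make the mapping concrete, here are the JSON keys the service emits next to the C# properties they bind to (the values are samples from the earlier responses):

```javascript
// JSON keys emitted by the service and the C# properties they map to.
var customerJson = {
    _id: "544cb61fda8014d9145c85e6",                   // -> Customer.Id
    name: "Great customer",                            // -> Customer.Name
    orders: [                                          // -> Customer.Orders
        { item: "Book", quantity: 2, itemPrice: 10 }   // -> Order.Item, .Quantity, .Price
    ]
};
console.log(Object.keys(customerJson));    // [ '_id', 'name', 'orders' ]
```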

Next add a new class called ApiTesterService to the console app. Insert the following method, which will call the Node.js web service to insert a new customer:

public Customer TestCustomerCreation(String customerName)
{
	HttpRequestMessage postRequest = new HttpRequestMessage(HttpMethod.Post, new Uri("http://localhost:1337/customers/"));
	postRequest.Headers.ExpectContinue = false;
	InsertCustomerRequest req = new InsertCustomerRequest() { CustomerName = customerName };
	string jsonBody = JsonConvert.SerializeObject(req);
	postRequest.Content = new StringContent(jsonBody, Encoding.UTF8, "application/json");
	HttpClient httpClient = new HttpClient();
	httpClient.Timeout = new TimeSpan(0, 10, 0);
	Task<HttpResponseMessage> httpRequest = httpClient.SendAsync(postRequest,
			HttpCompletionOption.ResponseContentRead, CancellationToken.None);
	HttpResponseMessage httpResponse = httpRequest.Result;
	HttpStatusCode statusCode = httpResponse.StatusCode;

	HttpContent responseContent = httpResponse.Content;
	if (responseContent != null)
	{
		Task<String> stringContentsTask = responseContent.ReadAsStringAsync();
		String stringContents = stringContentsTask.Result;
		if (statusCode == HttpStatusCode.Created)
		{
			List<Customer> customers = JsonConvert.DeserializeObject<List<Customer>>(stringContents);
			return customers[0];
		}
		else
		{
			throw new Exception(string.Format("No customer created: {0}", stringContents));
		}
	}
	throw new Exception("No customer created");
}

Note that you may need to change the port number in the URI if yours is different. We call the web service and send our customer creation request as JSON in the request body. We then check the response message. If the response code is 201, i.e. "Created", then we translate the JSON string into a list of customers: MongoDb responds with an array that includes a single element. We extract the first and only element from the list and return it from the function. Otherwise we throw an exception. InsertCustomerRequest is just a data transfer object to convey our message:

public class InsertCustomerRequest
{
	[JsonProperty(PropertyName="customerName")]
	public String CustomerName { get; set; }
}

We set the JSON property name to "customerName" so that the web service will find it through req.body.customerName, as we saw above.

Insert the following method to Program.cs:

private static void TestCustomerInsertion()
{
	Console.Write("Customer name: ");
	string customerName = Console.ReadLine();
	ApiTesterService service = new ApiTesterService();
	try
	{
		Customer customer = service.TestCustomerCreation(customerName);
		if (customer != null)
		{
			Console.WriteLine("New customer id: {0}", customer.Id);
		}
	}
	catch (Exception ex)
	{
		Console.WriteLine(ex.Message);
	}
}

This is very basic: we enter a customer name and print out the ID of the new customer or the exception that was thrown. Call this method from Main:

static void Main(string[] args)
{			
	TestCustomerInsertion();
	Console.WriteLine("Main done...");
	Console.ReadKey();
}

Start the application with F5. As the Node.js project is set as the startup project you’ll see it start in a browser as before. Do the following to start the tester console app:

  • Right-click it in Solution Explorer
  • Select Debug
  • Select Start new instance

Enter a customer name when prompted. If all goes well then you’ll get the ID of the new customer in MongoDb:

New customer added Id output from MongoDb

Test again with the same name; it should fail:

No customer created error message from nodejs

In the next post we’ll extend our test application to call the GET endpoints of the web service.


Building a web service with Node.js in Visual Studio Part 8: connecting to MongoDb

Introduction

In the previous post we gave some structure to our Node.js project by way of a service and a repository. We also discussed the role of callbacks in asynchronous code execution, i.e. the "next" parameter. However, we still return some hard-coded JSON to all queries. It's time to connect to the MongoDb database we set up in part 2. The goal of this post is to replace the following code…

module.exports.getAll = function () {
    return { name: "Great Customer", orders: "none yet" };
};

module.exports.getById = function (customerId) {
    return { name: "Great Customer with id " + customerId, orders: "none yet" }
};

…with real DB access code.

In addition, we’ll operate with callbacks all the way from the controller to the repository. It might be overkill for this small demo project but I wanted to demonstrate something that’s similar to the await-async paradigm in .NET. If you’ve worked with the await-async keywords in .NET then you’ll know that once you decorate a method with “async” then the caller of that method will be “async” as well, and so on all the way up on the call stack.
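To make the “callbacks all the way up” idea concrete, here’s a minimal standalone sketch of a two-layer chain where each layer just forwards the (err, result) pair; the function names and the data are invented for illustration, they are not the project’s code:

```javascript
// Minimal sketch of the callback chain: repository -> service -> caller.
function repositoryGetAll(next) {
    // Pretend this came from the database.
    next(null, [{ name: "Great Customer" }]);
}

function serviceGetAll(next) {
    repositoryGetAll(function (err, res) {
        if (err) { next(err, null); }
        else { next(null, res); }
    });
}

// The "controller" supplies the final callback.
serviceGetAll(function (err, customers) {
    if (err) { console.log("Error: " + err); }
    else { console.log(customers.length); }  // 1
});
```

Just like with async-await in .NET, once the bottom layer takes a callback, every layer above it ends up taking one too.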

MongoDb driver

There’s no built-in library in Node to access data in a database. There are, however, a number of drivers available for download through the Node Package Manager. Keep in mind that we’re still dealing with JSON objects, so forget the mapping convenience you’ve got used to while working with Entity Framework in a .NET project. Some would on the other hand say that this is actually a benefit because we can work with data in a raw format without the level of abstraction imposed by an object-relational mapper. So whether this is an advantage or a disadvantage depends on your preferences.

We’ll go for a simplistic driver for MongoDb which allows us to interact with the database at a low level, much like we did through the command line interface in parts 2 and 3 of this series. Open the project we’ve been working on, right-click npm and select Install New npm Packages. Install the driver called “mongodb”:

MongoDb driver for NodeJs

The central access

Add a new file called “access.js” to the repositories folder. Insert the following code in the file:

var mongoDb = require('mongodb');
var connectionString = "mongodb://localhost:27017/customers";
var database = null;

module.exports.getDbHandle = function (next) {
    if (!database) {
        mongoDb.MongoClient.connect(connectionString, function (err, db) {
            if (err) {
                next(err, null);
            }
            else {
                database = db;
                next(null, database);
            }
        });
    }
    else {
        next(null, database);
    }
};

The purpose of this file is to provide universal access to our MongoDb database to all our repositories. We first declare the following:

  • We import the mongodb library
  • We declare the connection string which includes the name of our database, i.e. “customers”
  • We set up a field that will be a reference to the database

The getDbHandle function accepts a callback function called “next” which we’re familiar with by now. We then check whether the “database” field is null: we don’t want to re-open the database every time we need something, and MongoDb handles connection pooling automatically. If “database” has been set then we simply return it using the “next” callback and pass in null for the exception.

Otherwise we use the “connect” function of the mongodb library to connect to the database using the connection string and a callback. The connect function will populate the “err” parameter with any error during the operation and the “db” parameter with a handle to the database. As we saw before we call the “next” callback with the error if there’s one otherwise we set our “database” field and pass it back to the “next” callback.
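The caching logic can be illustrated without MongoDb at all. The sketch below uses a fake connect function (a hypothetical stand-in, not the driver) to show the same pattern: the expensive connect call runs once, every later call gets the cached handle:

```javascript
// The same "open once, then reuse" pattern as access.js, with a fake connect.
var database = null;
var connectCalls = 0;

function fakeConnect(connectionString, callback) {
    connectCalls++;  // count how often we actually "open" the database
    callback(null, { name: connectionString });
}

function getDbHandle(next) {
    if (!database) {
        fakeConnect("mongodb://localhost:27017/customers", function (err, db) {
            if (err) { next(err, null); }
            else { database = db; next(null, database); }
        });
    } else {
        next(null, database);
    }
}

getDbHandle(function (err, db) { });
getDbHandle(function (err, db) { });
console.log(connectCalls);  // 1 -- the second call reused the cached handle
```

One caveat: if two requests arrive before the first connect completes, both will trigger a connect. For this demo that is acceptable, but it’s worth knowing about the pattern’s limits.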

The new customer repository

The updated customer repository – customerRepository.js – looks as follows:

var databaseAccess = require('./access');

module.exports.getAll = function (next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            db.collection("customers").find().toArray(function (err, res) {
                if (err) {
                    next(err, null);
                }
                else {
                    next(null, res);
                }
            });
        }
    });
};

module.exports.getById = function (customerId, next) {
    databaseAccess.getDbHandle(function (err, db) {
        if (err) {
            next(err, null);
        }
        else {
            var mongoDb = require('mongodb');
            var BSON = mongoDb.BSONPure;
            var objectId = new BSON.ObjectID(customerId);
            db.collection("customers").find({ '_id': objectId }).toArray(function (err, res) {
                if (err) {
                    next(err, null);
                }
                else {
                    next(null, res);
                }
            });
        }
    });
};

We import access.js to get access to the DB handle. The getAll function accepts a “next” callback and calls upon getDbHandle, which we’ve seen above. If there’s an error while opening the database we populate the error field of “next” and pass “null” as the result. Otherwise we can go on and query the database. We need to reference the “customers” collection within the database. Our goal is to find all customers, and from part 2 of this series we know that the “find()” function with no parameters will do just that. So we call find(). We’re not done yet: we need to turn the resulting cursor into an array with toArray(), which also accepts a callback with the usual signature of error and result. As usual, if there’s an error we call next with the error and null, otherwise we set null as the error and pass the result. If all went well then “res” will include all customers as JSON.

The getById function follows the same setup. Part 2, referred to above, showed how to pass a query to the find() method so this should be familiar. The only somewhat complex thing is that we need to turn the incoming “customerId” string parameter into an ObjectId object which MongoDb understands. We then pass the converted object id as the search parameter of the “_id” field.
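One caveat worth knowing: the ObjectID constructor throws if the string is not a valid 24-character hex id, which would crash the callback chain instead of reporting an error. A guard like the hypothetical helper below (not part of the demo code) can catch that case up front:

```javascript
// Hypothetical guard: validate the incoming id before handing it to the
// driver. new BSON.ObjectID(...) throws on anything that is not 24 hex chars.
function isValidObjectId(candidate) {
    return typeof candidate === 'string' && /^[0-9a-fA-F]{24}$/.test(candidate);
}

console.log(isValidObjectId("544cb61fda8014d9145c85e6"));  // true
console.log(isValidObjectId("123"));                       // false
```

In getById you could then call next with an “invalid customer id” error early, instead of letting the constructor throw.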

Calling the repository from the service

The updated customerService code follows the same callback passing paradigm as we saw above:

var customerRepository = require('../repositories/customerRepository');

module.exports.getAllCustomers = function (next) {
    customerRepository.getAll(function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

module.exports.getCustomerById = function (customerId, next) {
    customerRepository.getById(customerId, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

This doesn’t add much functionality to the service apart from calling the repository. Later on when we have the POST/PUT/DELETE functions in place we’ll be able to add validation rules.

index.js in the services folder will be updated accordingly:

var customerService = require('./customerService');

module.exports.getAllCustomers = function (next) {
    customerService.getAllCustomers(function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

module.exports.getCustomerById = function (id, next) {
    customerService.getCustomerById(id, function (err, res) {
        if (err) {
            next(err, null);
        }
        else {
            next(null, res);
        }
    });
};

Updating the controller

Finally, we’ll extend the controller function to respond with a 400 in case of an error:

var customerService = require('../services');

module.exports.start = function (app) {
    app.get("/customers", function (req, res) {
        
        customerService.getAllCustomers(function (err, customers) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'application/json');
                res.status(200).send(customers);
            }
        });
    });
    
    app.get("/customers/:id", function (req, res) {
        
        var customerId = req.params.id;        
        customerService.getCustomerById(customerId, function (err, customer) {
            if (err) {
                res.status(400).send(err);
            }
            else {
                res.set('Content-Type', 'application/json');
                res.status(200).send(customer);
            }
        });
        
    });
};

Note that we set the status code using the “status” function and the response body using the “send” function. In a real project you’d probably refine the response codes further but this will be fine for demo purposes.
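As a sketch of what refining the response codes could mean (this is not in the demo code, just one possible direction): a tiny helper that distinguishes a genuine error from a valid query that simply matched nothing:

```javascript
// Hypothetical refinement: a blanket 400 hides the difference between a bad
// request and a valid query that found no customers.
function pickStatus(err, result) {
    if (err) { return 400; }                            // something went wrong
    if (!result || result.length === 0) { return 404; } // nothing matched
    return 200;                                         // success with a body
}

console.log(pickStatus("db down", null));               // 400
console.log(pickStatus(null, []));                      // 404
console.log(pickStatus(null, [{ name: "Customer" }]));  // 200
```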

Test

Run the application and navigate to /customers. Depending on how closely you followed parts 2 and 3 of this series you may get a different response from the database. In my case I got the following:

[  
   {  
      "_id":"544cbaf1da8014d9145c85e7",
      "name":"Donald Duck",
      "orders":[  

      ]
   },
   {  
      "_id":"544cb61fda8014d9145c85e6",
      "name":"Great customer",
      "orders":[  
         {  
            "item":"Book",
            "quantity":2,
            "itemPrice":10
         },
         {  
            "item":"Car",
            "quantity":1,
            "itemPrice":2000
         }
      ]
   }
]

Copy the _id field and enter it as /customers/[id] in the browser, e.g. /customers/544cb61fda8014d9145c85e6 in the above case. The browser shows the following output:

[  
   {  
      "_id":"544cb61fda8014d9145c85e6",
      "name":"Great customer",
      "orders":[  
         {  
            "item":"Book",
            "quantity":2,
            "itemPrice":10
         },
         {  
            "item":"Car",
            "quantity":1,
            "itemPrice":2000
         }
      ]
   }
]

Great, we have the findAll and findById functionality in place.

We’ll continue with insertions in the next post.


Building a web service with Node.js in Visual Studio Part 7: service, repository and “next”

Introduction

In the previous post we looked at how to create controllers and routes in our Node.js project. We saw that controllers are plain JS modules that don’t need to follow any naming convention. We still give them names like “customersController” to indicate their purpose. We also saw what “module.exports” does and how it can be used to add functionality to a JS object. Lastly we looked at the role of “index.js” to reference a whole folder using the “require” function.

We ended up with a customers controller that directly sends back a JSON object. In a layered project, however, we should separate the roles as much as possible into controllers, services and repositories. The repository is responsible for data store operations such as insertions and queries. Services are the glue between controllers and repositories as controllers normally do not contact the data store directly.

In a .NET project we’d hide the services and repositories behind abstractions like interfaces. In JS we don’t have interfaces so we’ll go for a simplified version. Still, the goal is to separate the roles and not let the controllers drive data access.

However, before that we need to look at asynchronous code execution in Node.js.

The “next” parameter

Have you checked out OWIN and Katana in .NET 4.5? If not, then you as a .NET developer should. It will help you understand the role of the “next” parameter in Node.js. There’s a course on OWIN on this blog starting here. Skim through it to get the idea. Take special note of the app builder object and the role of the “next” parameter in the following function:

appBuilder.Use(async (env, next) =>
{
    // code that runs before the next component in the pipeline
    await next();
    // code that runs after the next component has finished
});

In short “next” refers to the next action in the execution chain. The “next” parameter usually represents a function which can accept a number of parameters. Often it accepts an error parameter and some other objects that are returned from the callback function.

Note that “next” is only a parameter name. It could be called donaldduck or mickeymouse but apparently it’s quite often called “next” by default in Node.js projects to indicate its role to the caller.
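This err-first callback signature is the standard Node convention. Here’s a minimal self-contained sketch (the divide function is invented purely for illustration): the callee reports either an error or a result, never both:

```javascript
// The conventional Node callback shape: function (err, result).
function divide(a, b, next) {
    if (b === 0) {
        next("Division by zero", null);  // error filled in, result null
    } else {
        next(null, a / b);               // error null, result filled in
    }
}

divide(10, 2, function (err, result) {
    console.log(result);  // 5
});
divide(10, 0, function (err, result) {
    console.log(err);     // Division by zero
});
```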

The service

In this section we’ll first want to move the following dummy data extraction code from the controller to a service:

res.send({ name: "Great Customer with id " + customerId, orders: "none yet" });

Once that’s working we’ll move it further down to a repository. The controller will interact with the service only.

Insert a new folder called “services” into the project and into that folder two JS files: one called index.js and another called customerService.js. Remember index.js from the previous post? It is a default file in a folder so that callers can refer to the folder through the require function without having exact knowledge of the folder’s contents beforehand.

We’ll start with customerService.js which at first won’t call any repository, we’ll do that in the next section. We must first understand how to chain the bits and pieces together. customerService.js is very simple:

module.exports.getAllCustomers = function () {
    return { name: "Great Customer", orders: "none yet" };  
};

module.exports.getCustomerById = function (customerId) {
    return { name: "Great Customer with id " + customerId, orders: "none yet" }
};

This should be fairly straightforward by now: we expose two methods, one for getting all customers and another for getting a single customer by id.

index.js is somewhat more exciting:

var customerService = require('./customerService');

module.exports.getAllCustomers = function (next) {
    next(null, customerService.getAllCustomers());
};

module.exports.getCustomerById = function (id, next) {
    next(null, customerService.getCustomerById(id));
};

We first make a reference to customerService. We then build the two methods that in turn call the service functions in what looks, at first sight, like a funny way. We declare that the getAllCustomers function accepts a parameter called next. By the way we call “next” we imply the signature that must be passed into the function: a first parameter, which we simply set to “null” here, and the result of the customerService.getAllCustomers() operation. The first parameter will be an error parameter which is set to null for now; we’ll keep it as a placeholder.

So in fact we’ll soon be passing a function callback into the getAllCustomers function. By “next” we indicate to the caller that this might need to be executed asynchronously: data retrieval usually means consulting a database or a web service which involves some overhead. While that operation is ongoing the thread is free to perform other work.

Note, however, that in the above case we simply call “next” in a synchronous manner first to keep the example simple. This test method doesn’t involve any database operation yet but this implementation will change in future posts.

The getCustomerById is similar but we also pass in a customer ID, not only the next function to be called.

So how are these used from the controller? Consider the following code in customersController.js:

var customerService = require('../services');

module.exports.start = function (app) {
    app.get("/customers", function (req, res) {
        res.set('Content-Type', 'application/json');
        customerService.getAllCustomers(function (err, customers) {
            if (err) {

            }
            else {
                res.send(customers);
            }
        });
    });
    
    app.get("/customers/:id", function (req, res) {
        var customerId = req.params.id;
        res.set('Content-Type', 'application/json');
        customerService.getCustomerById(customerId, function (err, customer) {
            if (err) {
            }
            else {
                res.send(customer);
            }
        });
        
    });
};

We first reference the services folder. Notice the two dots as we need to leave the controllers folder first. We still set the routes as before but respond in a different manner, which at first can seem confusing. We call getAllCustomers in the /customers endpoint and pass in a function for the “next” parameter. We know that the function needs to follow a certain signature: an error and a result placeholder. In the body of the implementation of “next” we check if there was any error (to be implemented later), otherwise we send back the result from the operation, i.e. “customers”. getCustomerById is the same but we also pass in the customer ID besides the “next” function implementation.

Run the application, navigate to /customers and /customers/123 and both should work as before.

The repositories

This step might be overkill at this stage but let’s see how a repository can be implemented. Add a new folder called “repositories” and a file called customerRepository.js into it:

module.exports.getAll = function () {
    return { name: "Great Customer", orders: "none yet" };
};

module.exports.getById = function (customerId) {
    return { name: "Great Customer with id " + customerId, orders: "none yet" }
};

This is the same as customerService above, only the exposed method names are different. customerService.js can be updated to:

var customerRepository = require('../repositories/customerRepository');

module.exports.getAllCustomers = function () {
    return customerRepository.getAll();  
};

module.exports.getCustomerById = function (customerId) {
    return customerRepository.getById(customerId);
};

There’s no need to change any other file.

Run the application and check if the above routes still work as before, they should.

In the next post we’ll connect to our customers database in MongoDb.


Building a web service with Node.js in Visual Studio Part 6: module.exports and controller basics

Introduction

In the previous post in this series we looked at the Node Package Manager and the basics in Express.js. We added a couple of endpoints to our web app and sent back some responses: HTML and JSON. We could in theory build our entire application on that but it’s of course not very wise to list all the endpoints and responses in server.js.

Instead we’d like to add routes and controllers just like in a “normal” MVC app. We can certainly do that in a Node.js project. We’ll add the routes and controllers in their separate JS files to keep the project modular.

There are no conventions available here. If you call /customers then the call won’t magically be routed to the customers controller. There’s no object called “controller” that you can inherit from to mark your JS object as a controller. We’ll need to declare all of that ourselves. Therefore you can name your controllers as you wish, such as “customersController” or “mickeyMouse”; it won’t make any difference from a programmatic point of view. However, we’ll follow the commonly accepted MVC naming rules so that we know which JS file does what.

module.exports

Before we continue though there’s something else you need to be familiar with in Node: the “module” and the “exports”.

We saw the “require” method before but we haven’t discussed what it returns. We just assigned its return value to a variable, like…

var http = require('http');
var express = require('express');

The require method will return an object that represents the publicly available elements of the module behind the library referenced by the string input variable. Within the module there can be one or more public fields and methods that are exported so that callers of the module will have access to them. You expose the public functions in a module using the “module.exports” statement. So “module.exports” is sort of a bucket where you can add the public methods and properties of a module. Let’s see an example.

Add a new folder called “modules” to the project and add a new JS file called “module_example.js”. Let’s add a public function to it:

module.exports.greatFunction = function()
{
    return "Greetings from great function!";
};

We’re stating that the module exposes – or exports – a function called greatFunction which returns a string.

You can call this function from server.js as follows:

var exampleModule = require('./modules/module_example.js');
var ret = exampleModule.greatFunction();

Note that we referenced “module_example” by a relative file path as opposed to modules installed via the npm which only required a simple name. Once we have a reference to the module in “module_example.js” we can call its exposed “greatFunction” function.

We can assign a default function to our module. Add the following code to module_example.js:

module.exports = function()
{
    return "Greetings from a single method module!";
};

Note that we didn’t give any name to the function. We can call this as follows from server.js:

var ret = exampleModule();

So we simply call exampleModule as if it is a method.

We can also have properties in a module:

module.exports.defaultGreeting = "Hello from property";

…and here’s how you can expose a constructor by modifying the default function above:

module.exports = function () {
    this.greeting = "Hello from constructor!";
};

…and call it from server.js as follows:

var ex = new exampleModule();
var greeting = ex.greeting;

Your own modules can also have their own “require” statements to reference their own dependencies and those dependencies can reference other dependencies as well. Keep in mind that server.js is the entry point of the application, like Global.asax in a .NET web app. It is run once upon application start and executes the dependency code it finds in the form of those require statements. This execution occurs only once upon application start-up, and then the results of the require statements are cached.

Controllers

Add a new folder called “controllers” to the project. Right-click it, select Add… New Item… and add a new JavaScript file called customersController.js. As mentioned in the introduction there’s no special programmatic reason to call a controller a “controller” in Node.js but it’s good to follow the MVC conventions anyway to make navigating the project modules easier.

We’ll expose – or think of this as “export” to make the naming of module.exports easier – our get, post etc. methods for the customersController using module.exports as we’ve seen above. We’ve seen an example of declaring the GET method function for our application in server.js:

var app = express();

app.get("/customers", function (req, res) {
    res.set('Content-Type', 'application/json');
    res.send({name: "Great Customer", orders: "none yet"});
});

The goal is to move that into the controller. The initial code in customersController will be as follows:

module.exports.start = function (app) {
    app.get("/customers", function (req, res) {
        res.set('Content-Type', 'application/json');
        res.send({ name: "Great Customer", orders: "none yet" });
    });
};

We expose a “start” method which accepts the app object and assign the GET method within the method body. You’ll recognise the GET method implementation itself.

Back in server.js we’ll reference the entire “controllers” folder using the require method. Add the following code to server.js just below the require(‘express’) statement:

var controllers = require('./controllers');

This statement will automatically reference and execute all elements within the controllers folder so we don’t need to reference them one by one. For this to work though there will need to be a file called index.js in that folder. We’ll use index.js to initialise all controllers so it works like a routing engine. Add a new file called index.js to the “controllers” folder and add the following content to the file:

var customerController = require('./customersController');

module.exports.start = function (app) {
    customerController.start(app);
};

Index.js will need to be extended as you add new controllers and call their start methods in turn. This way we have one central file that takes care of the initialisation of all controllers. Server.js only needs to know about “controllers” in general through an indirect reference to index.js in the require(‘./controllers’) statement. Server.js will be able to call “start” on the controllers object, which will call the start method in the index.js file, which in turn will initialise the controllers.

Back in server.js we can insert the following code after app = express():

controllers.start(app);

Just to recap we have the following code in server.js where I’ve omitted the example code of the “module.exports” section:

var http = require('http');
var express = require('express');
var controllers = require('./controllers');

var app = express();

controllers.start(app);

var port = process.env.port || 1337;
http.createServer(app).listen(port);

Run the application, navigate to /customers and you should see the same JSON response as before. However, now we have the skeleton of a modularised Node.js project with controllers and a central file for initialising the controllers.

How do we add query parameters to the route? Open customersController again and add a new route within the body of the start function:

app.get("/customers/:id", function (req, res) {
        var customerId = req.params.id;

        res.set('Content-Type', 'application/json');
        res.send({ name: "Great Customer with id " + customerId, orders: "none yet" });
    });

We attach the place of the query parameter with a colon and read it using req.params, easy as that.
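Under the hood Express turns such a pattern into a regular expression and collects the named segments into req.params. Here’s a rough, simplified sketch of that idea (illustrative only, not Express’s actual implementation):

```javascript
// Simplified sketch of ":param" route matching.
function matchRoute(pattern, url) {
    var names = [];
    var regexSource = pattern.replace(/:([^\/]+)/g, function (whole, name) {
        names.push(name);
        return "([^/]+)";  // a named segment matches one path element
    });
    var match = url.match(new RegExp("^" + regexSource + "$"));
    if (!match) { return null; }
    var params = {};
    names.forEach(function (name, i) { params[name] = match[i + 1]; });
    return params;
}

console.log(matchRoute("/customers/:id", "/customers/123"));  // { id: '123' }
console.log(matchRoute("/customers/:id", "/orders/123"));     // null
```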

Run the application, navigate to /customers/123 and you should see the following JSON on the screen:

{"name":"Great Customer with id 123","orders":"none yet"}

In the next post we’ll discuss asynchronous code execution a bit more and we’ll see how to organise our code into services and repositories.


Building a web service with Node.js in Visual Studio Part 5: npm and Express.js basics

Introduction

In the previous post we discussed the absolute basics of a Node web application. It really didn’t have a lot of functions: it listened to every single incoming request and responded with some properties of the request. We also came to the conclusion that those functions were quite limited. We had no routes, no controllers, no nothing, so working with code on that level is really cumbersome.

In this post we’ll discuss the package manager of Node, i.e. the Node Package Manager (npm). We’ll also go through the basics of Express.js, which is a web framework built on top of Node.

We’ll build upon the CustomerOrdersApi so have it ready in Visual Studio – or in the IDE of your preference. You can write Node.js in Notepad if you wish, but Visual Studio provides some extras, like debugging, breakpoints, IntelliSense etc.

Node Package Manager

One element we haven’t discussed so far is the little node called “npm”:

npm node in visual studio

npm stands for Node Package Manager. It is the Node.js equivalent of NuGet in .NET and Maven in Java. It is a tool for managing dependencies that are not part of standard Node.js. E.g. the “http” package we saw previously is included by default so we didn’t have to import it. There are a number of packages that are not installed by default and they need to be imported if you need to use them.

The imported packages will be referenced in the file called package.json, visible in the above screenshot. This function of package.json is similar to pom.xml in a Java Maven project and packages.config in .NET. The IDE reads the dependencies listed in package.json and installs the necessary packages when the project is opened for the first time.

This is valid for Node.js, .NET and Maven as well: it’s not necessary to send the dependencies around to other developers. You can upload the project and package.json to a central source, like SVN or Git. Another developer who downloads it from source control can look into package.json and install the dependencies or let Visual Studio take care of it automatically.

So what external libraries are available? This site lists all of them. Packages are referenced by names such as “grunt”, “express” or “underscore”.

Manual installation

You can call npm in a command prompt. If you followed the Node.js installation in part 1, then opening a command prompt and typing “npm” should print the usage help message of npm:

npm in command window

Close the command prompt, this was just a test to see if “npm” works. Right-click the project name in Visual Studio and click “Open Command Prompt Here”. Enter the following command to install the package called “grunt”:

npm install grunt

If everything goes well then you’ll see that grunt is installed along with all dependencies of grunt:

npm manual install grunt output

The dependencies will be listed under the npm node and stored in a folder called “node_modules”:

grunt visible as dependency in project

However, what’s that “not listed in package json” about? Open package.json and you’ll see nothing about dependencies there, right? This means that you can use grunt in your project but other developers won’t be aware of it just by looking at the package.json file. They will see some reference to it in the code, like require(‘grunt’), and then they’ll need to fetch the dependency via npm themselves.

Let’s uninstall grunt by the following command:

npm uninstall grunt

Grunt will silently disappear from the project.

The --save flag will ensure that package.json is filled in:

npm install grunt --save

Check package.json and you’ll see a field called “dependencies”:

"dependencies": {
    "grunt": "^0.4.5"
}

Uninstall grunt before we continue:

npm uninstall grunt --save

Using the npm tool

There’s a more visual approach to all this. Right-click “npm” in Solution Explorer and select “Install New npm Packages”. This will first load all available packages and then open a window similar to the following:

npm visual tool

Search for “grunt”, click on the package name to select it – the present version of grunt is 0.4.5, you might see something higher – and then click “Install package” leaving all other default values untouched. Grunt will be installed again.

If you right-click npm again then you’ll see options like “Update npm Packages” and “Install Missing npm Packages”. These do exactly what’s expected based on their descriptions: update installed npm packages and install any missing ones based on the package.json file. So another developer who downloads your project can simply run these functions to get the necessary dependencies.

If you right-click the grunt package under npm you’ll be able to uninstall it by selecting “Uninstall npm package(s)”.

Installing Express.js

Express.js is a web framework comparable to ASP.NET on top of IIS. It is a dependency that can be downloaded via npm like we did with grunt. Right-click “npm” and select Install New npm Packages. Search for “express” and select the first result that comes up:

ExpressJs package in NPM

By the time you read this post there will almost certainly be an updated package but hopefully there won’t be large code changes. Click “Install package” and then Express and its dependencies should be installed:

ExpressJs installed with dependencies

Check package.json as well, “express” should appear in the dependencies array. The node_modules folder was also populated with Express with its own node_modules folder for its dependencies:

ExpressJs in node_modules folder with its dependencies

Using Express.js

Now that we have installed Express.js let’s use it. We’ll go through the statements first and re-write server.js at the end.

The first step is to import Express into our project like we did with “http” using the require method:

var express = require('express');

“express” can be executed like a function and it will return a reference to our application, like in a singleton:

var app = express();

We can take this reference and pass it into the createServer method in place of the current callback. Here’s the current state of server.js:

var http = require('http');
var express = require('express');
var app = express();
var port = process.env.port || 1337;
http.createServer(app).listen(port);

The application doesn’t do anything yet but this step was necessary to have Express drive our web app.

Express takes care of requests based on the web methods, like GET, POST etc. and the URL. Express exposes methods to handle the HTTP verbs and the methods are named exactly as those verbs: get, post etc. They accept a URL and a callback function which will be similar to what we had before. Add the following code after app = express():

app.get("/", function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end("<html><body><p>Hello from the application root</p></body></html>");
});

The above code ensures that we respond to GET requests to the root of the web app which is “/”. Run the application and you should see the message in the browser. You can test other URLs, such as /index; they won’t work, and you’ll get an error message on the screen saying something like “Cannot GET [url]”.

Notice that we’re sending HTML back whereas RESTful APIs normally respond with JSON. Let’s see how we can respond with JSON. Insert the following code after the get method shown above:

app.get("/customers", function (req, res) {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end('{"name": "Great customer", "orders": "none yet"}');
});

Run the application and navigate to /customers. You should see the JSON as shown in the code. There’s in fact a more concise way of sending a response using the “send” method:

app.get("/customers", function (req, res) {
    res.set('Content-Type', 'application/json');
    res.send({name: "Great Customer", orders: "none yet"});
});

The writeHead method was replaced by the set method. Also, note the lack of apostrophes around the JSON object, i.e. we’re not specifying the JSON as a string. Re-run the app to make sure that everything works. You’ll see the JSON object as a string in the browser window. In fact we can even leave out the set method in this case and Express will infer from the object type that we’re sending back JSON. However, the set method can be used to add other header types to the response.
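Under the hood, when send receives an object it roughly boils down to serialising the object with JSON.stringify, which we can illustrate with plain Node, no Express required:

```javascript
// Roughly what Express does when send receives an object:
// serialise it to a JSON string and use that as the response body.
var customer = { name: "Great Customer", orders: "none yet" };
var body = JSON.stringify(customer);
console.log(body); // {"name":"Great Customer","orders":"none yet"}
```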

In the next post we’ll look at the module.exports function and controller basics.

View all posts related to Node here.

Building a web service with Node.js in Visual Studio Part 4: Node.js web service basics

Introduction

In the previous post we practiced some basic MongoDb queries that will be useful in our project later on. In this post we’ll start looking at the basic structure of a Node.js web service.

The goal of this post is to go through some code basics in Node.js so we’ll take an easy start.

Keep in mind that Node.js supports asynchronous code execution by default. You don’t need to add any special code for that. In .NET we can add support using the await-async pattern but currently the starting point for an MVC.NET project template is synchronous code execution.

A consequence of asynchronous code execution in Node.js is the ubiquity of callbacks. We’ll see callbacks passed into a large number of methods. Callbacks make sure that the available threads can be put to work serving requests instead of sitting idle waiting for a long-running operation to finish. The end result is a better utilisation of threads and CPU. We saw similar behaviour in .NET in the post on async-await referenced above.
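To make the callback pattern concrete, here is a minimal sketch of the error-first convention that Node methods follow; the function and parameter names are made up for illustration:

```javascript
// Error-first callback convention: the callback's first parameter
// carries an error (or null), the second carries the result.
function findCustomer(name, next) {
    if (!name) {
        next("No name provided", null); // signal failure
    } else {
        next(null, { name: name, orders: [] }); // signal success
    }
}

findCustomer("Donald Duck", function (err, customer) {
    if (err) {
        console.log("Error: " + err);
    } else {
        console.log("Found: " + customer.name);
    }
});
```

We’ll see this exact shape over and over: check err first, then use the result.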

Selecting the template

We installed the ingredients necessary for building a Node.js project in Visual Studio. Open Visual Studio 2013 Pro and let’s see which template can be useful for us:

Available NodeJs templates in Visual Studio

We can immediately rule out all Azure applications. We don’t have any existing Node.js apps so option #2 can be ruled out too. There are two templates that install the “Express” framework. Recall that Express is a web application framework on top of Node, so it’s something like ASP.NET running on IIS. However, those templates implement views and a lot more code that we’d need to discuss. Since we’re building a web service we’d need to start by removing code. I’d like to avoid that. Also, adding new code and providing an explanation is easier at this beginning level than saying “we don’t need that, you can erase it”.

So we have two remaining “blank” applications: a console and a web app. We saw the console app in the first post of this series. We could actually go with that template and build upon it from scratch. However, we’ll take the other template instead so that we have something to start with at least. It won’t install Express.js but we can install it ourselves – it will be a good occasion to take a look at the Node Package Manager tool.

Select the “Blank Node.js Web Application” template, call the project CustomerOrdersApi and press OK.

Starting point

Most elements look familiar from the first part. We have a server.js which is the entry point to the application, similar to Program.cs in a .NET console app. It currently has the following contents:

var http = require('http');
var port = process.env.port || 1337;
http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World\n');
}).listen(port);

Even before going through this code let’s run it. Press F5 as you normally do in Visual Studio. This should open a browser, navigate to http://localhost:1337/ and that will print “Hello World” as plain text on the screen.

Let’s consider the bits of code in server.js which is the entry point of the application, much like Global.asax.cs in an ASP.NET web app:

var http = require('http');

“require” is the Node.js equivalent of a using statement in C#. We import the package called “http” which includes the tools for handling HTTP calls. This package is part of the standard Node library so we didn’t have to do anything special to import it.

http.createServer(function (req, res)

We use the http library to create a server. The createServer accepts a callback which in turn accepts parameters for the HTTP request – req – and HTTP response – res. The “req” parameter will allow us to access the different parts of an incoming HTTP request: the headers, the query, the URL etc.

.listen(port);

The listen(port) method will make sure that we’re listening to requests on the given port. The port is read from the process.env.port environment variable; if that’s not set then we fall back to 1337. Unsurprisingly we’d use port 80 for a public HTTP web server and 443 for HTTPS.

res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Hello World\n');

The body of the callback function is very simple. It sets the response code to 200 OK and the content-type header to text/plain. Then we send a string back saying Hello World.

Let’s output some simple HTML instead. Change the callback function as follows:

res.writeHead(200, { 'Content-Type': 'text/html' });
var url = req.url;
res.end("<html><body><p>Requested URL: " + url + "</p></body></html>");

We read the URL of the request and put it in a paragraph. Run the project and you’ll see the following output:

Requested URL: /

Now extend the URL to e.g. http://localhost:1337/hello/bye and the output will be the following:

Requested URL: /hello/bye

Let’s read some other properties of “req”:

var url = req.url;
res.end("<html><body><p>Request properties: URL: " + url + ", method: " + req.method +
        ", http version: " + req.httpVersion + "</p></body></html>");

You’ll probably understand what those properties mean.

We can indicate in the header section that some resource was not found:

res.writeHead(404, { 'Content-Type': 'text/html' });

The 404 will be visible in the developer tools of your browser. Here’s the output in Chrome:

404 returned by Node server

So you can see that we can read a lot of properties of the request and set the response accordingly. We could in theory use this simple template for a web service and handle all the GET, POST etc. requests based on the incoming URL in a gigantic if-else statement. However, that would be a bad idea. We’d like to build request handlers comparable to the routes and controllers in MVC.NET.
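To see why that’s a bad idea, here is a sketch of what such hand-rolled dispatching could look like; every new endpoint would mean yet another branch:

```javascript
// A naive dispatcher on the HTTP method and URL. This grows into
// an unmaintainable if-else chain as endpoints are added.
function dispatch(method, url) {
    if (method === "GET" && url === "/") {
        return "home page";
    } else if (method === "GET" && url === "/customers") {
        return "customer list";
    } else if (method === "POST" && url === "/customers") {
        return "customer created";
    } else {
        return "404 Not Found";
    }
}

console.log(dispatch("GET", "/customers")); // customer list
```

A routing framework replaces this chain with one declarative registration per route instead.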

That’s where the Express.js library enters the scene. We’ll import it and at the same time discuss the Node Package Manager in the next post.

View all posts related to Node here.

Building a web service with Node.js in Visual Studio Part 3: MongoDb basics cont’d

Introduction

In the previous post we set up MongoDb and looked at the basics of querying against a MongoDb database. We also inserted a couple of customer objects with empty order arrays. Therefore we are familiar with the basics of insertions and querying in MongoDb.

In this post we’ll look at how to perform updates and deletions. Connect to the MongoDb through a command line like we saw in the previous post and get ready for some JavaScript.

Updates

Reference: modifying documents.

In case we’d like to update the name of a customer we can do it as follows using the $set operator:

db.customers.update({name: "Mickey Mouse"}, {$set: {name: "Pluto"}})

This will change the customer name “Mickey Mouse” to “Pluto”. If everything went fine then you’ll get a WriteResult statement in the command prompt with fields like nMatched: 1, nUpserted: 0, nModified: 1. You can probably guess that nMatched means the number of documents the update operation matched. nModified means the number of documents modified. “Upsert” is a blend of “update” and “insert”: an upsert is an update operation where a new document is inserted if there are no matching ones.

Read through the reference material above – it’s not long – and note the following:

  • An update operation will by default only update the first matching document – you can override this with the multi flag
  • By default no upsert will be performed in case the search doesn’t result in any document – you can override this with the upsert flag
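For instance, assuming the same customers collection, both flags can be passed in a third parameter; the “Goofy” document here is a made-up example:

```
// update every matching document, not just the first one
db.customers.update({name: "Mickey Mouse"}, {$set: {name: "Pluto"}}, {multi: true})

// insert {name: "Goofy", orders: []} if no matching document exists
db.customers.update({name: "Goofy"}, {$set: {orders: []}}, {upsert: true})
```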

Updating the orders array of a customer is very similar:

db.customers.update({name: "Great customer"}, {$set: {orders: [{item: "Book", quantity: 2, itemPrice: 10}, {item: "Car", quantity: 1, itemPrice: 2000}]}})

This will update the “orders” property of the customer whose name is “Great customer”. Note that this statement will overwrite any existing orders array, much like the UPDATE statement in SQL. How can we then insert a new item into an existing orders array? The $push operator comes to the rescue:

db.customers.update({name: "Great customer"}, {$push: {orders: {item: "Pen", quantity: 5, itemPrice: 2}}})

Deletions

Reference: removing documents.

To remove all customers with the name Pluto execute the following command:

db.customers.remove({name: "Pluto"})

The console output will show in a property called nRemoved how many matching documents were removed.

If you only want to remove the first matching document then pass 1, the “justOne” flag, as the second parameter:

db.customers.remove({name: "Mickey Mouse"}, 1)

What if you’d like to remove an item from the orders array? You cannot do that with the remove statement. After all it’s not really a deletion of a customer element but an update of a nested array. The $pull operator will perform what we’re after:

db.customers.update({name: "Great customer"}, {$pull: {orders: {item: "Pen"} } })

This will remove every element in the orders array of “Great customer” whose item name is “Pen”. Note that $pull removes all matching elements from the array, not just the first. The multi flag, as with other update operations, controls whether the change is applied to all matching documents rather than only the first:

db.customers.update({name: "Great customer"}, {$pull: {orders: {item: "Pen"} } }, {multi: true})
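In plain JavaScript terms, $pull behaves like filtering the array and dropping every matching element. This is purely an illustration of the semantics, not how MongoDb implements it:

```javascript
// What $pull conceptually does to the orders array:
// keep only the elements that do NOT match the condition.
var orders = [
    { item: "Pen", quantity: 5, itemPrice: 2 },
    { item: "Book", quantity: 2, itemPrice: 10 },
    { item: "Pen", quantity: 1, itemPrice: 2 }
];
var remaining = orders.filter(function (order) {
    return order.item !== "Pen";
});
console.log(remaining.length); // 1 - only the Book order is left
```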

This should suffice for now. It’s good practice to test some queries based on the MongoDb reference manual. Most of it can be directly used in Node.js as we’ll see later.

In the next post we’ll discuss the basics of a Node.js application.

View all posts related to Node here.

Building a web service with Node.js in Visual Studio Part 2: MongoDb basics

Introduction

In the previous post we outlined the goals of this series and discussed the basics of Node.js. In this post we’ll start from the very back of our service and set up the storage, namely the document-based MongoDb. There’s another series on this blog devoted to MongoDb in .NET. If you’ve never worked with MongoDb then I encourage you to read at least the first 2 parts in that series to get the overall idea. Make sure you understand the basic terminology of MongoDb, such as documents, BSON, collections and how objects are stored in documents.

For the sake of this demo we’ll look at JavaScript and JSON in MongoDb in some more detail. The series referred to above doesn’t look at interacting with the MongoDb database directly, only through the MongoDb C# driver which hides those details from us. E.g. it is possible to interact with an SQL Server database from a .NET application without writing a single line of SQL if you go through e.g. LINQ to Entities or LINQ to SQL. The details of opening and querying the database are abstracted away behind the database object context and LINQ statements.

This is, however, not the case with Node.js. There’s no LINQ-to-Node abstraction that hides the query language from you. We’ll later see a Node.js package which enables us to open the connection to MongoDb and send CRUD instructions to it – effectively a Node.js driver for MongoDb – but much of the syntax, especially in the case of filtering queries, will be the same as the bare-bones queries you send directly to MongoDb from the MongoDb console. It doesn’t provide the same high-level services as the C# driver does in a .NET project. Therefore if you’re a staunch believer in relational databases and love using tools such as MS SQL Management Studio then you’ll need to take a deep breath and dive into the unknown 🙂 You’ll need to become familiar with a new query language designed for MongoDb. Don’t worry, the basics are not complex at all and there’s a lot of help on the MongoDb homepage and the language reference pages:

…and a whole lot more. These pages, especially the reference manual, will be good sources as you construct your queries.

This and the next post will be dedicated to MongoDb JavaScript and JSON syntax. This lays the foundations for the “real” stuff, i.e. when we interact with MongoDb through the web service interface. We’ll revisit many of these statements there.

Why MongoDb?

It is certainly possible to establish a connection to relational databases, like MS SQL or Oracle, from Node.js but that’s not the norm. If a project already uses one of those and you need to build a Node.js module on top of it then you’ll need to work with it anyway. It’s, however, seldom the case that a brand new Node.js project will pick a relational database as its backing store. You’ll see that Node.js and document databases go hand in hand in practice. That is a more straightforward choice due to the extensive usage of JavaScript and JSON in both Node.js and MongoDb. Also, it’s trivial to set up MongoDb on a database machine: you’re literally done in a couple of minutes. It is not the same hassle as setting up the more complex relational database applications like MS SQL. Finally, Node.js is free and open-source, hence it’s natural to pick a free and open-source database to accompany it.

Setup

Go through the first two pages in the MongoDb series referred to above. Make sure that you have MongoDb running as a service at the end of the process:

MongoDb running as a service

MongoDb CRUD operations

Connect to the MongoDb database by running mongo.exe in a command prompt. Make sure you navigate to the bin folder of the MongoDb installation folder. In my case this is c:\mongodb\bin:

MongoDb connecting to database

You’ll connect to the default “test” database upon the first connect. Note that connecting to a database doesn’t necessarily mean that the database exists – it won’t exist until you insert the first collection into it.

Type “show dbs” to list the databases. You’ll probably not see “test” among them. You can switch to another database with “use [database name]”, but again, the database won’t be created at first.

In the demo we’ll be working with customers and orders. In a relational database you’d probably create 2 or more tables to store customers, orders and order items and link them with foreign keys. Although you could solve it in a similar manner in a document database it’s better to think of collections as hierarchical representations of your objects. You store the customer and its orders in the same document, adhering to OOP principles.

Let’s look at an example. A customer and its orders can be represented in a JSON string as follows:

{
  "name": "Great customer",
  "orders": [{
	"item": "Book",
	"quantity": 2,
	"itemPrice": 10
  }, {
	"item": "Car",
	"quantity": 1,
	"itemPrice": 2000
  }]
}

Let’s first add “Great customer” to MongoDb. Switch to the customers context by running “use customers” in the MongoDb command prompt and then enter the following command:

db.customers.insert({name: "Great customer", orders: []})

If everything went OK then you’ll see something like “WriteResult…” and nInserted: 1 as a response.

Note that we inserted a new customer with an empty orders array. We will see this pattern later in the demo where we insert new customers through the web service. Another interesting detail is that you don’t need to put quotation marks around the property names: name vs. “name”, orders vs “orders” in the MongoDb JSON.
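The reason quoting is optional is that the mongo shell accepts JavaScript object literals, whereas strict JSON, which our web service will send and receive later, insists on quoted property names. A quick illustration in Node:

```javascript
// A JavaScript object literal accepts unquoted keys...
var literal = { name: "Great customer", orders: [] };
console.log(literal.name); // Great customer

// ...but JSON.parse requires strict JSON with quoted keys.
var strict = JSON.parse('{"name": "Great customer", "orders": []}');
console.log(strict.name); // Great customer

// JSON.parse('{name: "Great customer"}') would throw a SyntaxError.
```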

Let’s see if the object has really been inserted. Enter the following “find” command without parameters:

db.customers.find()

The “find” command with no parameters will select all elements from a collection. The output will be similar to the following:

{ "_id" : ObjectId("544cb61fda8014d9145c85e6"), "name" : "Great customer", "orders" : [ ] }

You’ll recognise “name” and “orders” but “_id” is new. If you haven’t specified an ID field then MongoDb will assign its own ID of type ObjectId to each new object with the property name “_id”. ObjectId is an internal type within MongoDb. You can specify an ID yourself but it’s your responsibility to make it unique, e.g.:

db.customers.insert({_id: 10, name: "Great customer", orders: []})

If there’s already an entry with id 10 then you’ll get an exception.

Let’s insert 2 more new customers:

db.customers.insert({name: "Donald Duck", orders: []})
db.customers.insert({name: "Mickey Mouse", orders: []})

Run the “find” command to make sure we have 3 customers in the customers collection.

You can search by customer name by adding a JSON-like query to the find method:

db.customers.find({name: "Donald Duck"})

Here’s the equivalent of the SQL “IN” clause to provide a range of values to the SELECT WHERE clause:

db.customers.find({name: { $in: ["Donald Duck", "Mickey Mouse"]}})

There’s a whole range of operators in MongoDb prefixed with the “$” character.

You can negate the statements using the $not operator:

db.customers.find({name: {$not: { $in: ["Donald Duck", "Mickey Mouse"]}}})

The above statement will return “Great customer”, i.e. all customers whose name is not listed in the provided string array.

You can limit the returned fields by switching them on and off in a second JSON parameter. E.g. if you only wish to look at the ID of a customer then you can switch off “name” and “orders”:

db.customers.find({name: "Donald Duck"}, {name: 0, orders: 0})

We switch off “name” and “orders” by assigning 0 to them in the second JSON parameter. The _id field will be returned by default. If you’d like to view the orders only then enter the following command:

db.customers.find({name: "Donald Duck"}, {name: 0, _id: 0})

All fields that were NOT switched off by “0” in the selection parameter will be returned, in this case an empty array: “orders” : [].

The below query returns all customers who have not ordered anything yet, i.e. whose “orders” array is of size 0:

db.customers.find({orders: {$size: 0}})

If, however, you’d like to return all customers who have ordered at least 1 product, i.e. whose order array exceeds size 0 then you may test with the $gt, i.e. greater-than operator:

db.customers.find({orders: {$size: {$gt : 0}}})

…except that you’ll get an exception that $size is expecting a number. I’m not sure why this is the case and why it was implemented like this but the following statement with the $where operator will do the job:

db.customers.find({$where:"this.orders.length > 0"})

$where accepts a JavaScript command and we want to return items whose “orders.length” property is greater than 0. In fact the $size: 0 can be rewritten as follows:

db.customers.find({$where:"this.orders.length == 0"})

Note that this solution assumes that every element in the collection has an “orders” field. However, as you can store unstructured objects in a collection it’s not guaranteed that every customer will have an “orders” field. Say that in the beginning of the project you initialised each new customer like this:

db.customers.insert({name: "Donald Duck"})

In this case the above solution will throw an exception as “Donald Duck” has no “orders” field. To make sure this is not the case you can combine $where with $exists:

db.customers.find({orders: {$exists: true}, $where: 'this.orders.length > 0'})
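Since the predicate inside $where is ordinary JavaScript, we can illustrate its logic client-side with a filter over plain objects; the real filtering of course happens inside MongoDb:

```javascript
// A client-side illustration of combining an existence check with
// a length predicate, mirroring $exists plus $where.
var customers = [
    { name: "Great customer", orders: [{ item: "Book", quantity: 2 }] },
    { name: "Mickey Mouse", orders: [] },
    { name: "Donald Duck" } // inserted without an orders field
];
var withOrders = customers.filter(function (c) {
    return c.orders !== undefined && c.orders.length > 0;
});
console.log(withOrders.length); // 1 - only "Great customer" qualifies
```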

We cannot go through all possible query examples here but this should be enough for starters. You can always consult the reference material mentioned in the introduction as you’re refining your queries. You can test your queries in the MongoDb command prompt like we did above to make sure they work as expected and to see how the result set is structured.

In the next post we’ll look at updates and deletions.

View all posts related to Node here.
