What a Nightmare!

NightmareJS

Ever heard of NightmareJS? Me neither, until today. Just in time too, as I was beginning to pull my hair out using PhantomJS to run a suite of JavaScript unit tests.

The problem with PhantomJS

During my Visual Studio Team Services build I was experiencing quite a few sporadic failures where PhantomJS failed to connect. The result was that none of my unit tests were run and my build failed (a good thing, since the tests didn’t run). After many hours of trying to figure out which combination of Node, Karma, Jasmine, and PhantomJS could be causing this, I decided to look for another solution.

The PhantomJS Error of Death (PEoD)

Here is the exact error that we were receiving:

09 03 2017 10:28:54.632:INFO [karma]: Karma v0.13.22 server started at http://localhost:22200/
2017-03-09T16:28:54.6322872Z 09 03 2017 10:28:54.632:INFO [launcher]: Starting browser PhantomJS
2017-03-09T16:28:56.3259069Z 09 03 2017 10:28:56.325:INFO [PhantomJS 2.1.1 (Windows 8 0.0.0)]: Connected on socket -I7m1ca0zjoJo8roAAAA with id 12617843
2017-03-09T16:29:26.3329205Z 09 03 2017 10:29:26.332:WARN [PhantomJS 2.1.1 (Windows 8 0.0.0)]: Disconnected (1 times), because no message in 30000 ms.
2017-03-09T16:29:26.3347996Z 
2017-03-09T16:29:26.3509918Z [10:29:26] 'test-minified' errored after 32 s
2017-03-09T16:29:26.3509918Z [10:29:26] Error: 1
2017-03-09T16:29:26.3509918Z     at formatError (D:\A1\_work\22\s\Source\WebUI\node_modules\gulp\bin\gulp.js:169:10)
2017-03-09T16:29:26.3509918Z     at Gulp.<anonymous> (D:\A1\_work\22\s\Source\WebUI\node_modules\gulp\bin\gulp.js:195:15)
2017-03-09T16:29:26.3509918Z     at emitOne (events.js:82:20)
2017-03-09T16:29:26.3509918Z     at Gulp.emit (events.js:169:7)
2017-03-09T16:29:26.3509918Z     at Gulp.Orchestrator._emitTaskDone (D:\A1\_work\22\s\Source\WebUI\node_modules\orchestrator\index.js:264:8)
2017-03-09T16:29:26.3509918Z     at D:\A1\_work\22\s\Source\WebUI\node_modules\orchestrator\index.js:275:23
2017-03-09T16:29:26.3509918Z     at finish (D:\A1\_work\22\s\Source\WebUI\node_modules\orchestrator\lib\runTask.js:21:8)
2017-03-09T16:29:26.3509918Z     at cb (D:\A1\_work\22\s\Source\WebUI\node_modules\orchestrator\lib\runTask.js:29:3)
2017-03-09T16:29:26.3509918Z     at removeAllListeners (D:\A1\_work\22\s\Source\WebUI\node_modules\karma\lib\server.js:336:7)
2017-03-09T16:29:26.3509918Z     at Server.<anonymous> (D:\A1\_work\22\s\Source\WebUI\node_modules\karma\lib\server.js:347:9)
2017-03-09T16:29:26.3509918Z     at Server.g (events.js:260:16)
2017-03-09T16:29:26.3509918Z     at emitNone (events.js:72:20)
2017-03-09T16:29:26.3509918Z     at Server.emit (events.js:166:7)
2017-03-09T16:29:26.3509918Z     at emitCloseNT (net.js:1537:8)
2017-03-09T16:29:26.3509918Z     at nextTickCallbackWith1Arg (node.js:431:9)
2017-03-09T16:29:26.3509918Z     at process._tickCallback (node.js:353:17)

Nightmare to the rescue

Nightmare is a very nicely documented, human-readable browser automation API. Under the hood it uses Electron, which is supposed to be 2x faster than PhantomJS. I’m sold!

Why did I use it?

I was previously using PhantomJS through Karma to run all of our JavaScript unit tests, but we kept getting sporadic failures where Phantom couldn’t connect.

How to use Nightmare with Karma

  1. Install the nightmare launcher

yarn add -D karma-nightmare
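
If you’re using npm instead of Yarn, the equivalent command is:

npm install --save-dev karma-nightmare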

  2. Add Nightmare as your Karma browser, either within your Karma config or within your Karma API call. Here’s an example using the Karma API directly.
// Server comes from the karma package
var Server = require('karma').Server;

function runKarma(files, reporters, browser, singleRun, done) {
    log('starting karma');

    return new Server({
        port: 9999,
        browsers: ['Nightmare'],
        files: files,
        singleRun: true,
        action: 'run',
        logLevel: 'info',
        captureTimeout: 30000,
        browserNoActivityTimeout: 30000,
        frameworks: ['jasmine']
    }, function (err) {
        handleKarmaError(err, done);
    }).start();
}
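
If you’d rather configure this through a karma.conf.js file than the API, the equivalent configuration might look roughly like the following sketch. It assumes the jasmine framework plugin and the karma-nightmare launcher are installed, and the files glob is just a placeholder.

// karma.conf.js – sketch of the config-file equivalent of the API call above
module.exports = function (config) {
    config.set({
        port: 9999,
        frameworks: ['jasmine'],
        browsers: ['Nightmare'],
        files: ['src/**/*.spec.js'], // placeholder glob – point this at your spec files
        singleRun: true,
        logLevel: config.LOG_INFO,
        captureTimeout: 30000,
        browserNoActivityTimeout: 30000
    });
};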

Microsoft //Build 2016

Why //Build

I am fortunate enough to work for an employer that cares deeply about keeping up with technology and training their developers to do so. In addition to this, my employer is also a Microsoft shop, so we already have an interest in the Microsoft stack and what the future has in store for it.

Personally, I’ve never been to //Build or any other Microsoft-only conference in the past, but I’ve definitely kept up with past conferences via the Channel 9 live stream. Knowing that this is the one big conference where Microsoft typically unveils its latest and greatest developer news, I was all in when approached with the opportunity to attend.

Specific Interest

The team that I currently lead uses the following technologies from Microsoft:

  • Asp.Net Web API 2
  • Asp.Net Web Pages
  • EntityFramework v6
  • SQL Server 2012
  • TypeScript
  • C# 6

With those tools in use, I was definitely interested in hearing/seeing/learning what Microsoft has in mind for Asp.Net Core 1.0 and EntityFramework Core 1.0.

My Schedule

Day 1

Day 2

Day 3

Conclusion

All in all this was an excellent conference that I would love to attend again if given the opportunity. In particular, I see it as an excellent opportunity to be exposed to upcoming tech and to begin building your bank of possible tools for your next projects!

Behavioral Driven Development – Crash Course

Introduction

Behavioral Driven Development (BDD) all started from a simple blog post by Dan North in 2006. I’ll let you read the post for the detailed description, but the tl;dr is that he was looking for a better way to teach and communicate the practices of Test Driven Development (TDD). So, BDD was formed to bridge the gap between what should be tested and how to better communicate requirements between the business and developers.

Steps

The premise of BDD is to simplify the communication between business owners and developers by using examples to explain acceptance criteria. The examples provide clarification about specific criteria that the business owner expects to see from a feature, and in BDD they also provide the “Acceptance Criteria” for the story. In its simplest form, I usually break the BDD process into the following three steps.

  1. Three Amigos story huddle to discuss examples of what should be built
    • Business Analyst
    • Quality Assurance Engineer
    • Developer
  2. Define each example (aka scenario) using the Gherkin Syntax
    GIVEN I navigate to the insurance policy 1234
    WHEN the policy owner information is shown
    THEN I am able to choose my coverage level
    
  3. Automation of the scenarios

Tools

There are many BDD tools available to parse Gherkin scenarios and create test cases out of them. For example, my development team is using SpecFlow because our acceptance tests are written in .Net. SpecFlow simply parses the Gherkin to generate NUnit tests, which can then be run via the typical NUnit console runner. The beauty of using these tools is that they allow non-technical folks to create the acceptance criteria using business (not technical) language.
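
To give a feel for what the automation step looks like, here is a rough sketch of step definitions for the scenario above. Our team does this with SpecFlow in C#, so treat this JavaScript version (using cucumber-js) purely as an illustration of the general idea; loadPolicy and the policy data are stand-ins, not a real API.

// policy.steps.js – illustrative cucumber-js bindings for the Gherkin scenario above
const { Given, When, Then } = require('@cucumber/cucumber');
const assert = require('assert');

// stand-in for however your test harness would really fetch the policy page/data
function loadPolicy(policyNumber) {
    return Promise.resolve({
        policyNumber: policyNumber,
        owner: 'Jane Doe',
        availableCoverageLevels: ['basic', 'standard', 'premium']
    });
}

Given('I navigate to the insurance policy {int}', async function (policyNumber) {
    this.policy = await loadPolicy(policyNumber);
});

When('the policy owner information is shown', function () {
    assert.ok(this.policy.owner, 'expected the policy owner to be present');
});

Then('I am able to choose my coverage level', function () {
    assert.ok(this.policy.availableCoverageLevels.length > 0);
});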

Tool Links

Structuring an Express MVC API – Part 2

Summary

In the first post we learned a bit about the MVC pattern itself. Now that we have the pattern down, let’s see how we can structure our application to follow it. Keep in mind there are many ways to structure an MVC application, and this is just one way. Also, since our application is relatively simple, we will be organizing it by type instead of by feature. In larger projects, organizing your files by feature is a nice alternative, but for a simple project it can be overkill.

However, the most important thing to remember is that whatever structure you choose to follow, you stick to it consistently.

Application Structure

The application structure is best introduced with the image below:

todo-list-app-structure

As you can see the application is nicely laid out into two main folders under the src directory (client and server). Below I will discuss the purpose of each one of the server folders so that you will have a good understanding of how this all fits together.

Server

The server directory is pretty self explanatory. It’s the home to all server side code, and it contains the main server side entry point, server.js. This file is very small, and its purpose is to simply serve as the entry point for our application.

server.js:

process.env.NODE_ENV = process.env.NODE_ENV || 'development';

var express = require('./config/express');
var mongoose = require('./config/mongoose');

var db = mongoose();
var app = express();
app.listen(3000);

console.log('Server started at http://localhost:3000');

module.exports = app;

List of folders under the server directory:

  • Config
  • Controllers
  • Models
  • Routes
  • Views

Config

This directory is home to all of the application configuration stuff. Two main configuration items that I like to have in this directory are:

  • express.js
  • mongoose.js

The express.js file contains all of the express configuration code.

Here is a snippet from express.js.

var express = require('express');
var morgan = require('morgan');
var compress = require('compression');
var bodyParser = require('body-parser');
var methodOverride = require('method-override');

module.exports = function () {
    var app = express();

    if (process.env.NODE_ENV === 'development') {
        // log all requests
        app.use(morgan('dev'));
    } else {
        app.use(compress());
    }

    app.use(bodyParser.urlencoded({
        extended: true
    }));
    app.use(bodyParser.json());

    // support for PUT and DELETE verbs
    app.use(methodOverride());

    app.set('views', './src/server/views');
    app.set('view engine', 'ejs');

    require('../routes/index.route.js')(app);
    require('../routes/todo.route.js')(app);

    return app;
};

As you can see we register all the express middleware and initialize each of our routes from this file.

Mongoose.js

There are two responsibilities of this file:

  • Connect to MongoDB
  • Initialize our Model objects

Here is the file in its entirety.

var config = require('./config');
var mongoose = require('mongoose');

module.exports = function () {
    var db = mongoose.connect(config.db);

    // load models here
    require('../models/todo.model');

    return db;
};

Env

The env directory is where the server configuration will live. At this time we have a development and a production file, which are loaded based upon the process.env.NODE_ENV variable. As you can see from config.js, we are concatenating the node environment with the require path so that when we require('./config') we will have the correct config settings for the currently running environment.

module.exports = require('./env/' + process.env.NODE_ENV + '.js');

Having an environment specific config is a best practice and will help out when deploying our application to a production server.
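
For illustration, a development.js file under env might look something like this. The only property the rest of the code actually reads is db (config/mongoose.js uses config.db); anything else you add here is up to you.

// env/development.js – illustrative example; only `db` is consumed by config/mongoose.js
module.exports = {
    db: 'mongodb://localhost/todo-dev'
};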

Controllers

As I mentioned in the summary section, our application structure is organized by type, so as you have probably figured out the controllers directory contains all of the controllers. Currently our ToDo application has two controllers: index.controller.js and todo.controller.js. The naming convention that I like to use is to add the word controller to the file name so that I can re-use both index and todo in other file names but still be able to use IDE navigation shortcuts to clearly see which file I’m navigating to.
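
To make that concrete, here is a rough sketch of what the list and create handlers in todo.controller.js might look like. The post doesn’t show these two functions, so treat the bodies as assumptions; they simply use the Todo model that gets registered in config/mongoose.js.

// todo.controller.js – hypothetical sketch of the collection handlers
var mongoose = require('mongoose');
var Todo = mongoose.model('Todo');

// GET /api/todos – return every todo as JSON
exports.list = function (req, res, next) {
    Todo.find({}, function (err, todos) {
        if (err) {
            return next(err);
        }
        res.json(todos);
    });
};

// POST /api/todos – create a todo from the JSON body parsed by body-parser
exports.create = function (req, res, next) {
    Todo.create(req.body, function (err, todo) {
        if (err) {
            return next(err);
        }
        res.json(todo);
    });
};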

Models

You can probably guess where this is going by now, but models does just what it says. All models are defined and stored in this directory.

ToDo Model Example:

var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var ToDoSchema = new Schema({
    description: {type: String},
    dueDate: {
        type: Date,
        default: Date.now
    },
    isComplete: {type: Boolean, default: false}
});

mongoose.model('Todo', ToDoSchema);

In another post I will dive a lot deeper into mongoose which is what we are using to create our Model objects.
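
As a quick, hedged example of the schema in action (not code from the app itself): once config/mongoose.js has required todo.model.js, the registered model can be looked up anywhere and the schema defaults kick in automatically.

// illustrative only – saving a ToDo with just a description
var mongoose = require('mongoose');
var Todo = mongoose.model('Todo');

new Todo({ description: 'Write part 3 of this series' }).save(function (err, todo) {
    if (err) {
        return console.error(err);
    }
    // dueDate and isComplete were filled in by the schema defaults
    console.log(todo.dueDate, todo.isComplete); // <current date> false
});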

Routes

Routes are the core of any API, and one of the conventions that I typically use is to begin each route with /api. The reason that I do this is to easily distinguish between API and non-API routes.

In our MVC API we don’t have any logic in the routing files. Instead the routes are in charge of simply delegating to the correct controller functions.

todo.route.js

var controller = require('../controllers/todo.controller.js');

module.exports = function (app) {
    app.route('/api/todos')
        .post(controller.create)
        .get(controller.list);

    app.route('/api/todos/:id')
        .get(controller.read)
        .put(controller.update)
        .delete(controller.delete);

    app.param('id', controller.todoById);
};

As you can see on the final line, we are calling app.param, which will run before any of the other route middleware so that by the time our routing code runs, the :id has already been parsed from the request path and the matching document has been added to the request object as req.todo. Once that is complete, all of the /api/todos/:id routes will have the req.todo object available.

Here is the magic behind the controller.todoById function:

// the Todo model is registered in config/mongoose.js before the routes are wired up
var mongoose = require('mongoose');
var Todo = mongoose.model('Todo');

exports.todoById = function (req, res, next, id) {
    Todo.findOne({_id: id}, function (err, todo) {
        if (err) {
            return next(err);
        } else {
            req.todo = todo;
            next();
        }
    });
};

In the above snippet we are simply using the Mongoose findOne function to find our Todo by _id in Mongo. Once it is successfully found, we add the object to the req object and call the next middleware (which happens to be whichever of the routes requiring the :id was matched).
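
With req.todo already populated by app.param, the handlers for the /api/todos/:id routes can stay tiny. The post doesn’t show them, so the following is just a sketch of the idea:

// todo.controller.js – hypothetical handlers that rely on req.todo from app.param
exports.read = function (req, res) {
    // the document was loaded by todoById before this handler ran
    res.json(req.todo);
};

exports.delete = function (req, res, next) {
    req.todo.remove(function (err) {
        if (err) {
            return next(err);
        }
        res.json(req.todo);
    });
};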

Views

The final directory that we have is the views directory, and as you can guess this is where the view code will live. What view code, you might ask? Well, even in our simple API app we still have one static view for the main index page, and this is where that file is held.

index.ejs

<!doctype html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Todo It!</title>
</head>
<body>
    <h1>It's already <%= now %> better get something done! </h1>
    <div class="todo-container">
        <form>
            <input type="text" placeholder="What should I get done?"/>
            <input type="submit"/>
        </form>
    </div>
</body>
</html>

In the above code you may have noticed the <%= now %>. This is simply the syntax required by our server side view engine, EJS.
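
For context, the now value has to be handed to the template by whichever controller renders it. The post doesn’t include index.controller.js or index.route.js, so here is a minimal sketch of what they might look like; the handler name and the value passed for now are assumptions.

// index.controller.js – hypothetical handler that renders the EJS view above
exports.render = function (req, res) {
    res.render('index', {
        now: new Date().toLocaleTimeString()
    });
};

// index.route.js – hypothetical wiring of the handler to the root path
var index = require('../controllers/index.controller.js');

module.exports = function (app) {
    app.route('/').get(index.render);
};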

Final thoughts

This two part series had quite a bit of content that wasn’t covered in much depth, because topics like MongoDB, Mongoose, Express routing, EJS, etc. could all be several posts in themselves. Hopefully this still gave you an idea of how to structure a simple Express API using MVC.

Last but not least, we didn’t even give AngularJS any love in this post, but rest assured we will be getting into a lot of Angular in future posts.