a bit about Javascript scope

I come from C++ & Java (with a little C# and Perl), where everything about scope is easy to understand. All of them have block scope, meaning that code in a block has access to the variables declared within it and in its outer scopes. Like this:

In Java:

int a = 1;
{
  int b = 2;
  System.out.println(a); // a is accessible to this block
  System.out.println(b); // b is accessible to this block
}
System.out.println(b); // compilation-error

——

Now let’s try to do something similar in Javascript:

var callback = [];
for(var i = 0; i < 2; i++) {
  callback[i] = function() {
    return i*2;
  }
}
console.log(callback[0]()); // 0?
console.log(callback[1]()); // 2? 

If you are also from Java, you probably expect the result to be 0 & 2. Surprisingly, it’s not: it will be 4 & 4.

Why?
Javascript has a concept called ‘hoisting’: variable and function declarations are pulled up to the top of their enclosing function scope. On top of that, both callbacks close over the same variable i and only read it when they are called. So the script is effectively interpreted as:

var i;
var callback;
callback = [];

for(i = 0; i < 2; i++) {
  callback[i] = function() {
    return i*2;
  }
} // at this stage in runtime, i has a value of 2 - the stop condition for the loop
// and both callbacks read this shared i when they are invoked

console.log(callback[0]()); //  will render 4
console.log(callback[1]()); //  will render 4

Strange huh? :-)

So how to ‘fix’ it?
In order to get the behavior you probably want, you need to understand Javascript’s scope rules. There was no block scope in Javascript (not until ES6 with ‘let’), there was only function scope. In our example, if you want each callback to ‘remember’ its own i, you need to introduce a temporary scope within the loop.

Thankfully, Javascript’s functions are quite flexible, and you can do it like this:
var callback = [];

for(var i = 0; i < 2; i++) { 
  (function(i) {
    callback[i] = function() { 
      return i*2; 
    } 
  })(i); 
} 
console.log(callback[0]()); //  0
console.log(callback[1]()); //  2

The anonymous function we have here is defined and immediately invoked at runtime, once per iteration, so the ‘running i’ is captured and passed into our callback function. By doing this, we introduce a new function scope in which the parameter ‘i’ of function(i) is a different variable than the ‘var i’ of the loop.
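With ES6’s ‘let’, mentioned above, the IIFE becomes unnecessary: each loop iteration gets its own binding of i, so each closure captures its own value. The same loop as a sketch:

```javascript
var callback = [];
for (let i = 0; i < 2; i++) {
  // with let, every iteration has its own i,
  // so each closure captures a different value
  callback[i] = function() {
    return i * 2;
  };
}
console.log(callback[0]()); // 0
console.log(callback[1]()); // 2
```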

Happy Coding!

How to get Mocha working with asynchronous calls

I wrote this blog some time ago about testing Node.js with Mocha and Chai. (http://minhhoang.de/test-driven-development-of-node-js-with-mocha-and-chai/)
However, it turns out that Mocha will not wait for asynchronous calls by default, so how can you test things like database access or asynchronous file reads and writes?

The trick is to put the parameter ‘done’ into the callback function of ‘it’ and call it when everything is ‘done’, including the return from your asynchronous call:

it("should return something from the db call", function(done) {

  dbUtils.getSomeRecord(function(result) {
    console.log(result);
    expect(result.getName()).to.equal("test item");
    done();
  });

});

It should work just like that. However, I did have an issue with the timeout: by default, Mocha allows 2 seconds for done() to be called.

There are two ways to fix this:

  1. Execute mocha with the timeout parameter, like this: mocha --timeout 20000 Item.test.js
  2. Set the timeout inside each test case: it(‘some message’, function(done) { this.timeout(20000); … })
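To see why calling ‘done’ matters, here is a toy version of the mechanism in plain node. runTest and getSomeRecord are illustrative names of my own, not Mocha’s real API, and the simulated db call fires its callback synchronously only so the sketch runs top to bottom:

```javascript
// Toy version of Mocha's done/timeout mechanism, to show why calling
// done() matters. runTest and getSomeRecord are illustrative names,
// not Mocha's real API.
var results = [];

function runTest(name, fn, timeoutMs) {
  var finished = false;
  var timer = setTimeout(function() {
    if (!finished) results.push(name + ': timed out');
  }, timeoutMs);
  fn(function done() {          // the test signals completion by calling done()
    finished = true;
    clearTimeout(timer);
    results.push(name + ': passed');
  });
}

// Simulated db call. A real one would be asynchronous; it fires its
// callback synchronously here only so the sketch runs top to bottom.
function getSomeRecord(cb) {
  cb({ name: 'test item' });
}

runTest('db call returns a record', function(done) {
  getSomeRecord(function(result) {
    if (result.name !== 'test item') throw new Error('unexpected record');
    done();
  });
}, 2000);

console.log(results[0]); // db call returns a record: passed
```

If the test never calls done(), only the ‘timed out’ branch ever fires - which is exactly the behavior Mocha reports.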

Hello World with Mongodb on a Mac OS X

Let’s try installing and running mongodb on a Mac computer.

  1. Download a Mac version of mongodb: http://www.mongodb.org/downloads
    I really like the fact that there is only a 64-bit version for Mac, instead of both 64-bit & 32-bit versions as for Windows and Linux. Fewer choices = less confusion.
  2. Unzip the .tgz file and put it in your development folder, or whatever folder you like.
  3. Within the 'bin' folder are all the binary files needed to run mongodb.
  4. Since mongodb is a database, you should create a folder that it can use to store data. Let’s create one under your home folder with: 'mkdir -p ~/data/mongo'
  5. On my Mac, I’ve put the mongodb folder under '/Users/<username>/Development/mongodb'. So when I go into the 'bin' folder within it, I can call './mongod --dbpath ~/data/mongo' to start mongo. If everything goes well, you’ll see the server’s startup logs.
  6. In the 'bin' folder of the downloaded mongodb, you already have a mongodb shell that can be used to interact with the local mongodb server (which is now running on ‘localhost:27017’). Starting the client is very simple: './mongo'.
  7. Mongodb uses the term ‘collections’ for your data. You can think of a collection as a table in a relational database.
    Let’s switch to a new database called ‘minhdb’ by typing: use minhdb.
    Then write a ‘hello world’ to it with: db.users.insert({1:"helloworld"}), and also: db.users.insert({2:"hello minh"}). The first insert implicitly creates the ‘users’ collection.
  8. That’s it! You have successfully installed and run your own mongodb on your computer. Check your data by typing: db.users.find().
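The shell session from steps 6-8, in one place. The database and collection names are the ones from the post; everything after the first line is typed into the mongo shell, not bash:

```shell
./mongo
use minhdb
db.users.insert({1: "helloworld"})
db.users.insert({2: "hello minh"})
db.users.find()
```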

Have fun and stay tuned! More on NoSQL (mongodb & cloudant) is coming!

Test-driven development of node.js with mocha and chai

Test-driven development (TDD) is very important in agile software development. Today we’ll look a bit into TDD of a node.js application with mocha and chai.

  • mocha: a Javascript test framework
  • chai: an assertion library for node.js and browser

Story:
We’ll enhance a class Item with a new attribute named “photos”. “photos” is a key-value Javascript object. Users will be able to upload photos (single or multiple). The function we test will have the signature uploadPhotos = function(photos);. Each uploaded photo should have a unique key.

The idea of TDD is to write test cases first and let them fail until you have a fully functional implementation. So we’re gonna write the test cases for our Item class first and implement the class later.

The first thing to do is to install mocha:

npm install -g mocha

and don’t forget to put mocha & chai to the package.json of your app:

"dependencies": {
  "express": "3.2.6",
  "ejs": "*",
  "node-sass": "0.9.0",
  "winston": "*",
  "passport-bluemix": "*",
  "mocha": "*",
  "chai": "*"
 }

Our Item class will be in the file Item.js and so the test file will be named: Item.test.js.
In Item.test.js, import needed libraries and modules:

var  Item = require('./Item')
, expect = require('chai').expect;

Write our test cases:

describe('Item', function() {

  //test Item constructor
  describe('new Item()', function() {
    it("should create a new Item with the name: test item", function() {
      var newItem = new Item("test item");
      expect(newItem.getName()).to.equal("test item");
    });
  });

  describe('.uploadPhotos()', function() {
    it("should upload an array of photos", function() {
      var photos = ['photo1', 'photo2'];
      var newItem = new Item("test item");
      newItem.uploadPhotos(photos);
      expect(newItem.getAllPhotos().length).to.equal(2);
    });
  });

});

What we have here are only two test cases: one for the class constructor, and the second for our new function. Since we haven’t implemented uploadPhotos([]) yet, its test case will obviously fail. This is the essence of TDD: you write your test cases first, and they fail until the correct functionality is implemented.

Now execute our test case with:
mocha Item.test.js
and watch it fail.

Yep, it failed, now let’s make it right by implementing our function:

function Item(name) {
  this.name = name;
  this.description = '';
  this.photos = {};
}

//return only keys of all photos
Item.prototype.getAllPhotos = function() {
  return Object.keys(this.photos);
}

Item.prototype.getPhoto = function(key) {
  return this.photos[key];
}

Item.prototype.uploadPhotos = function(photos) {
  for(var i = 0; i < photos.length; i++) {
    var now = new Date();
    var key;

    do {
      key = 'photo_' + now.getTime() + '_' + Math.floor((Math.random() * now.getTime()) + 1); //generate a unique key
    } while (this.photos[key]); //if the key's already there, generate a new one
    this.photos[key] = photos[i];
  }
}

module.exports = Item; 

And run the test again.
Ta da! It passed!
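If you want to poke at the class without mocha, here is a standalone sanity check, runnable directly with node; the class body is the same as the implementation above, inlined so the snippet needs no require:

```javascript
// Standalone sanity check of Item outside mocha; the class body is
// copied from the implementation above so this runs as-is.
function Item(name) {
  this.name = name;
  this.description = '';
  this.photos = {};
}

// return only the keys of all photos
Item.prototype.getAllPhotos = function() {
  return Object.keys(this.photos);
};

Item.prototype.uploadPhotos = function(photos) {
  for (var i = 0; i < photos.length; i++) {
    var now = new Date();
    var key;
    do { // generate a unique key; retry on an (unlikely) collision
      key = 'photo_' + now.getTime() + '_' + Math.floor((Math.random() * now.getTime()) + 1);
    } while (this.photos[key]);
    this.photos[key] = photos[i];
  }
};

var item = new Item('test item');
item.uploadPhotos(['photo1', 'photo2']);
console.log(item.getAllPhotos().length); // 2
```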

so…happy coding!

using SASS as an Express middleware in your Node.js application

SASS is a powerful CSS preprocessor. It enhances CSS and eases the effort of writing stylesheets by providing nested rules, variables and mixins. In order to use SASS as middleware in a Node.js Express application, I use node-sass. It seems to be the most popular SASS module for Node.js on GitHub.

The current npm version of node-sass is 0.9.1, which excludes middleware.js and will cause an error in your application, something like: sass.middleware is not a function. This is because middleware.js has just been pulled out into a separate module, node-sass-middleware (the README.md of node-sass 0.9.1 wasn’t updated, and node-sass-middleware is not yet available in npm).

So, let’s work around it: use version 0.9.0, which should work just fine. Specify the version in your package.json:

"dependencies": {
  "express": "3.2.6",
  "ejs": "*",
  "node-sass": "0.9.0"
}

and use it as middleware in your code:

app.use(
  sass.middleware({
    src: __dirname + '/public/sass',
    dest: __dirname + '/public/css',
    debug: true,
    outputStyle: 'compressed'
  })
);
app.use(express.static(path.join(__dirname, 'public')));

When your website requests a stylesheet, the middleware will look in /public/sass for the .scss files, compile them into .css and put the results into /public/css.

Chained MapReduce with Cloudant

Cloudant (with CouchDB in the background) provides a very simple way to aggregate your data by using MapReduce.
For example, say the data coming out of your Map looks like: {[year, month, day], spending.value}. By using the built-in Reduce _stats, you get min, max and average of your spending, and if you set group_level=1 or group_level=2, you get your spending statistics by year or by month respectively.
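The Map/Reduce described above, as a design-document sketch. The field names (doc.date.*, doc.spending.value) and the view name are my assumptions, not from the post - adapt them to your own document shape:

```javascript
// Design-document sketch for the view described above.
// Field names are assumptions; adapt them to your documents.
{
  "_id": "_design/spending",
  "views": {
    "spending_by_day": {
      "map": "function(doc) { emit([doc.date.year, doc.date.month, doc.date.day], doc.spending.value); }",
      "reduce": "_stats"
    }
  }
}
```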

However, what if you ask a question like: on which day did I spend the most? How do you solve that? I’ve seen people do in-memory sorting, or pre-denormalization, to answer this kind of query, but there is a better way: Cloudant’s Chained MapReduce.
Remember: Cloudant sorts by key by default; if you need to sort by value, think of Chained MapReduce.

So how to do that?
At the time of writing this blog, chaining cannot be set up in the Cloudant UI; you have to go to your design document and edit your view there. All you need to do is put "dbcopy":"<name_of_your_chained_db>" in the view you want to chain. This creates a new db and continuously pulls data from your view into that db. One constraint: dbcopy only works with a reduced view. MapReduce with only a Map wouldn’t work.

Ok, what you have now is a new db with “key” and “value”, now create a new simple view out of it:

function(doc) {
  emit(doc.value, doc.key);
 }
reduce: null

Now you have a brand-new sorted view, with key = <your spending> and value = <date of your spending>. Just run a query and get your top 10 spending days out of the view.

So…happy coding 😉

OAuth2 Bluemix strategy for Passport

You can use this module to authenticate users with IBM ID in your Node.js applications. The module can also be used as middleware in Express. Manage your client configurations in Bluemix IDaaS.

Install

$ npm install passport-bluemix

Usage

Authentication Strategy

Use BlueMix as an OAuth2 authentication strategy for Passport. After authenticating with IBM ID, this strategy requires a verify callback which can be used to create/verify a user in your application. Calling done(null, profile) will save the user profile from IBM to the current passport session. You can also write anything else to the passport session, for example a user object.

var passport = require('passport')
, BlueMixOAuth2Strategy = require('passport-bluemix').BlueMixOAuth2Strategy;

passport.use('bluemix', new BlueMixOAuth2Strategy({
    authorizationURL : 'https://idaas.ng.bluemix.net/sps/oauth20sp/oauth20/authorize',
    tokenURL : 'https://idaas.ng.bluemix.net/sps/oauth20sp/oauth20/token',
    clientID : 'your_app_client_id',
    scope: 'profile',
    grant_type: 'authorization_code',
    clientSecret : 'your_app_client_secret',
    callbackURL : 'http://localhost:3000/auth/ibm/callback',
    profileURL: 'https://idaas.ng.bluemix.net/idaas/resources/profile.jsp'
}, function(accessToken, refreshToken, profile, done) {
    ... //find or create new user
    return done(null, ...);
}));

Authenticate Requests

Use passport.authenticate(), specifying the 'bluemix' strategy, to authenticate requests.

For example, as route middleware in an Express application:

app.get('/auth/ibm', passport.authenticate('bluemix', {requestedAuthnPolicy: 'http://www.ibm.com/idaas/authnpolicy/basic'}));
app.get('/auth/ibm/callback',
        passport.authenticate('bluemix'),
        function(req, res) {
          // Successful authentication, redirect home.
          res.redirect('/');
        });


tunnel to your mongodb in BlueMix

Using cf cli version 5 (written in Ruby), you can tunnel into your services running in BlueMix. Unfortunately, cf tunnel has been removed in cf v6, and it seems like it will never come back.

I don’t really understand Pivotal’s argument that all 3rd-party services are now provisioned outside of Pivotal Web Services, so you can get the connection details from the vendors and use them for your local development. That may make sense on Pivotal’s own Cloud Foundry platform, but it doesn’t in IBM’s BlueMix and others.

Anyway, since I really like this functionality, I keep installing it with “gem install cf” (it didn’t work with Ruby 2+; use 1.9.3).

Ok, let’s do it!

1. As usual, target and log in to BlueMix:
cf target api.ng.bluemix.net
then log in: cf login


2. Usage: tunnel [INSTANCE] [CLIENT]
Options:
--client CLIENT        Client to automatically launch
--instance INSTANCE    Service instance to tunnel to
--port PORT            Port to bind the tunnel to

What you need to do now is log in to BlueMix and get your mongodb instance name. In my case, the instance name is: mongodb-4ll3s


Now execute cf tunnel, binding it to port 10000 or whatever port you like:
cf tunnel --instance mongodb-4ll3s --port 10000


It takes 1-2 minutes, and then you will have a tunnel to your MongoDB instance running in BlueMix. That means you can start developing locally against your production db.
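Once the tunnel is up, you can point a local mongo shell at it. The database name and credentials come from the tunnel’s output; the values below are placeholders, not from the post:

```shell
# connect through the tunnel; substitute the values printed by cf tunnel
./mongo localhost:10000/<db_name> -u <username> -p <password>
```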

Have fun!