The Absolute Beginner’s Guide to Node.js

This is a republished blog post by Brandon Cannaday. Brandon is the CTO of Modulus, a Node.js application hosting platform. Brandon organizes the Indianapolis Node.js meetup and enjoys speaking at conferences about Node’s horizontal scalability. Prior to Modulus, Brandon worked in the chemical detection and telecommunications industries.

Modulus is the first company in the industry to offer a dedicated enterprise solution called Curvature. Curvature allows you to take advantage of rapid deployments, easy scaling, and real-time analytics in the environment of your choosing: on-premises, in the cloud, or a hybrid of the two.


There’s no shortage of Node.js tutorials out there, but most of them cover specific use cases or topics that only apply when you’ve already got Node up and running. I see comments every once in a while that sound something like, “I’ve downloaded Node, now what?” This tutorial answers that question and explains how to get started from the very beginning.

What is Node.js?

A lot of the confusion for newcomers to Node is misunderstanding exactly what it is. The description on nodejs.org definitely doesn’t help.

An important thing to realize is that Node is not a webserver. By itself it doesn’t do anything. It doesn’t work like Apache. There is no config file where you point it to your HTML files. If you want it to be an HTTP server, you have to write an HTTP server (with the help of its built-in libraries). Node.js is just another way to execute code on your computer. It is simply a JavaScript runtime.

Installing Node

Node.js is very easy to install. If you’re using Windows or Mac, installers are available on the download page.

I’ve Installed Node, now what?

Once installed you’ll have access to a new command called “node”. You can use the node command in two different ways. The first is with no arguments. This will open an interactive shell (REPL: read-eval-print-loop) where you can execute raw JavaScript code.

$ node
> console.log('Hello World');
Hello World
undefined

In the above example I typed “console.log(‘Hello World’)” into the shell and hit enter. Node will then execute that code and we can see our logged message. It also prints “undefined” because it displays the return value of each command and console.log doesn’t return anything.
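
The REPL also makes the return-value behavior easy to see. An expression that does return something prints its result instead of undefined:

$ node
> 1 + 1
2
> 'Hello'.toUpperCase()
'HELLO'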

The other way to run Node is by providing it a JavaScript file to execute. This is almost always how you’ll be using it.

hello.js

console.log('Hello World');

$ node hello.js
Hello World

In this example, I moved the console.log message into a file then passed that file to the node command as an argument. Node then runs the JavaScript in that file and prints “Hello World”.

File I/O with node.js

Running plain JavaScript is fun and all, but not very useful. This is why Node.js also includes a powerful set of libraries (modules) for doing real things. In this first example I’m going to open a log file and parse it.

example_log.txt

2013-08-09T13:50:33.166Z A 2
2013-08-09T13:51:33.166Z B 1
2013-08-09T13:52:33.166Z C 6
2013-08-09T13:53:33.166Z B 8
2013-08-09T13:54:33.166Z B 5

What this log data means is not important, but basically each message contains a date, a letter, and a value. I want to add up the values for each letter.

The first thing we need to do is read the contents of the file.

my_parser.js

// Load the fs (filesystem) module.
var fs = require('fs');

// Read the contents of the file into memory.
fs.readFile('example_log.txt', function (err, logData) {

  // If an error occurred, throwing it will
  // display the exception and end our app.
  if (err) throw err;

  // logData is a Buffer, convert to string.
  var text = logData.toString();
});

Fortunately Node.js makes file I/O really easy with the built-in filesystem (fs) module. The fs module has a function named readFile that takes a file path and a callback. The callback will be invoked when the file is done being read. The file data comes in the form of a Buffer, which is basically a byte array. We can convert it to a string using the toString() function.
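
As a small aside, readFile also accepts an encoding argument; if you pass one, the callback receives a string and the toString() step goes away. A minimal variation of the snippet above:

// Load the fs (filesystem) module.
var fs = require('fs');

// Passing an encoding makes readFile hand the callback a string
// instead of a Buffer, so no toString() call is needed.
fs.readFile('example_log.txt', 'utf8', function (err, text) {
  if (err) throw err;

  // text is already a string here.
  console.log(text);
});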

Now let’s add in the parsing. This is pretty much normal JavaScript so I won’t go into any details.

my_parser.js

// Load the fs (filesystem) module.
var fs = require('fs');

// Read the contents of the file into memory.
fs.readFile('example_log.txt', function (err, logData) {

  // If an error occurred, throwing it will
  // display the exception and kill our app.
  if (err) throw err;

  // logData is a Buffer, convert to string.
  var text = logData.toString();

  var results = {};

  // Break up the file into lines.
  var lines = text.split('\n');

  lines.forEach(function(line) {
    var parts = line.split(' ');
    var letter = parts[1];
    var count = parseInt(parts[2]);

    if (!results[letter]) {
      results[letter] = 0;
    }

    results[letter] += count;
  });

  console.log(results);
  // { A: 2, B: 14, C: 6 }
});

Now when you pass this file as the argument to the node command it will print the result and exit.

$ node my_parser.js
{ A: 2, B: 14, C: 6 }

I use Node.js a lot for scripting like this. It’s a much easier and more powerful alternative to bash scripts.

Asynchronous Callbacks in node.js

As you saw in the previous example, the typical pattern in Node.js is to use asynchronous callbacks. Basically you’re telling it to do something and when it’s done it will call your function (callback). This is because Node is single-threaded. While you’re waiting on the callback to fire, Node can go off and do other things instead of blocking until the request is finished.
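
To make the ordering concrete, here is a tiny sketch that reuses the example_log.txt file from earlier. The synchronous log line runs first, and the callback runs later, once the read completes:

var fs = require('fs');

// Start the read; Node does not block here waiting for the disk.
fs.readFile('example_log.txt', function (err, logData) {
  if (err) throw err;
  console.log('2. File read finished.');
});

// This line runs immediately, before the callback above fires.
console.log('1. Doing other work while the file is read.');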

This is especially important for web servers. It’s pretty common in modern web applications to access databases. While you’re waiting for the database to return results Node can process more requests. This allows you to handle thousands of concurrent connections with very little overhead, compared to creating a separate thread for each connection.

Create an HTTP Server with node.js

Like I said before, Node doesn’t do anything out of the box. One of the built-in modules makes it pretty easy to create a basic HTTP server, which is the example on the Node.js homepage.

my_web_server.js

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(8080);

console.log('Server running on port 8080.');

When I say basic, I mean basic. This is not a full-featured HTTP server. It can’t serve HTML files or images. In fact, no matter what you request, it will return ‘Hello World’. However, you can run this and hit http://localhost:8080 in your browser and you’ll see the text.

$ node my_web_server.js

You might notice something a little different now. Your Node.js application no longer exits. This is because you created a server and your Node.js application will continue to run and respond to requests until you kill it yourself.

If you want this to be a full-featured web server, then you have to do that work. You have to check what was requested, read the appropriate files, and send the content back. There’s good news, though. People have already done this hard work for you.
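
Just to give a rough idea of the kind of work involved (this is only a sketch, not a safe or complete file server; the path handling and content types are naive), you could inspect req.url and try to read a matching file from disk:

var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  // Naively map the requested URL to a file on disk.
  // A real server would sanitize the path and set proper content types.
  var filePath = '.' + (req.url === '/' ? '/index.html' : req.url);

  fs.readFile(filePath, function (err, data) {
    if (err) {
      res.writeHead(404, {'Content-Type': 'text/plain'});
      res.end('Not Found\n');
      return;
    }

    res.writeHead(200);
    res.end(data);
  });
}).listen(8080);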

Express for node.js

Express is a framework that makes creating most normal websites very simple. The first thing you have to do is install it. Along with the node command, you also have access to a command called “npm”. This tool gives you access to an enormous collection of modules created by the community, and one of them is Express.

$ cd /my/app/location
$ npm install express

When you install a module, npm puts it in a node_modules folder inside your application directory. You can now require it like any built-in module. Let’s create a basic static file server using Express.

my_static_file_server.js

var express = require('express'),
    app = express();

app.use(express.static(__dirname + '/public'));

app.listen(8080);

$ node my_static_file_server.js

You now have a pretty capable static file server. Anything you put in the /public folder can now be requested by your browser and displayed. HTML, images, almost anything. So for example, if you put an image called “my_image.png” inside the public folder, you can access it using your browser by going to http://localhost:8080/my_image.png. Of course, Express has many, many more features, but you can look those up as you continue developing.
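
Routing is one of those features. As a small sketch (using the same Express 3.x style as the rest of this post), you can add a route alongside the static middleware:

var express = require('express'),
    app = express();

// Serve static files from /public as before.
app.use(express.static(__dirname + '/public'));

// A simple route: GET /hello responds with plain text.
app.get('/hello', function (req, res) {
  res.send('Hello from Express!');
});

app.listen(8080);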

NPM

We touched on npm a little in the previous section, but I want to emphasize how important this tool will be to normal Node.js development. There are thousands of modules available that solve almost all typical problems that you’re likely to encounter. Remember to check npm before re-inventing the wheel. It’s not unheard of for a typical Node.js application to have dozens of dependencies.

In the previous example we manually installed Express. If you have a lot of dependencies, that’s not going to be a very good way to install them. That’s why npm makes use of a package.json file.

package.json

{
  "name" : "MyStaticServer",
  "version" : "0.0.1",
  "dependencies" : {
    "express" : "3.3.x"
  }
}

A package.json file contains an overview of your application. There are a lot of available fields, but this is pretty much the minimum. The dependencies section describes the name and version of the modules you’d like to install. In this case I’ll accept any version of Express 3.3. You can list as many dependencies as you want in this section.

Now instead of installing each dependency separately, we can run a single command and install all of them.

$ npm install

When you run this command npm will look in the current folder for a package.json file. If it finds one, it will install every dependency listed.
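
npm can also write this file for you. Assuming a reasonably recent version of npm, the --save flag installs a module and records it in the dependencies section of package.json in one step:

$ npm install express --save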

Code Organization in node.js

So far we’ve only been using a single file, which isn’t very maintainable. In most applications your code will be split into several files. There’s no standard or enforced organization to what files go where. This isn’t Rails. There’s no convention that views go here and controllers go there. You can do whatever you want.

Let’s refactor the log parsing script. It’s much more testable and maintainable if we separate the parsing logic into its own file.

parser.js

// Parser constructor.
var Parser = function() {

};

// Parses the specified text.
Parser.prototype.parse = function(text) {

  var results = {};

  // Break up the file into lines.
  var lines = text.split('\n');

  lines.forEach(function(line) {
    var parts = line.split(' ');
    var letter = parts[1];
    var count = parseInt(parts[2]);

    if (!results[letter]) {
      results[letter] = 0;
    }

    results[letter] += count;
  });

  return results;
};

// Export the Parser constructor from this module.
module.exports = Parser;

What I did was create a new file to hold the logic for parsing logs. This is just standard JavaScript and there are many ways to encapsulate this code. I chose to define a new JavaScript object because it’s easy to unit test.

The important piece to this is the “module.exports” line. This tells Node what you’re exporting from this file. In this case I’m exporting the constructor, so users can create instances of my Parser object. You can export whatever you want.
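
For example, a module doesn’t have to export a constructor at all. A hypothetical helper file (made up here just to illustrate) could export a plain object of functions instead:

// string_helpers.js (a made-up example file)
// Export a plain object instead of a constructor.
module.exports = {
  shout: function(text) {
    return text.toUpperCase() + '!';
  }
};

// Used elsewhere:
// var helpers = require('./string_helpers');
// helpers.shout('hello'); // 'HELLO!'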

Now let’s look at how to import this file and make use of my new Parser object.

my_parser.js

// Require my new parser.js file.
var Parser = require('./parser');

// Load the fs (filesystem) module.
var fs = require('fs');

// Read the contents of the file into memory.
fs.readFile('example_log.txt', function (err, logData) {

  // If an error occurred, throwing it will
  // display the exception and kill our app.
  if (err) throw err;

  // logData is a Buffer, convert to string.
  var text = logData.toString();

  // Create an instance of the Parser object.
  var parser = new Parser();

  // Call the parse function.
  console.log(parser.parse(text));
  // { A: 2, B: 14, C: 6 }
});

Files are included exactly like modules, except you provide a path instead of a name. The .js extension is implied so you can leave it off if you want.

Since I exported the constructor, that is what the require statement returns. I can now create instances of my Parser object and use it.

Summary

Hopefully this tutorial can bridge the gap between downloading Node.js and building your first widget. Node.js is an extremely powerful and flexible technology that can solve a wide variety of problems.

I want everyone to remember that Node.js is only bound by your imagination. The core libraries are very carefully designed to provide the puzzle pieces needed to build any picture. Combine those with the modules available in npm and it’s amazing how quickly you can begin building very complex and compelling applications.

If you have any questions or comments, feel free to drop them below.

Codeship – A hosted Continuous Deployment platform for web applications

Now that you know how to get started with node.js, why don’t you head over to Codeship and set up Continuous Deployment for your first node.js project?

More Stories By Manuel Weiss

I am the cofounder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
