Node.js ABC’s - N is for npm | @DevOpsSummit #Microservices


Whether you are ready to start coding your first Node.js project, or you are a seasoned veteran needing to finish up an exciting new project, odds are you will need some functionality that has been written many times before by others.  One of the beauties of the internet is the ability to create and share things.  Node.js is a great platform in itself, but one key selling point is the vast amount of code that is available for free download.

The Node Package Manager, also known as npm, is a software system that automates the process of installing, upgrading, configuring, and removing Node.js packages.  As of Node.js version 0.6.3, npm is bundled and installed with the environment so once you have access to the node command line tools, you also have access to the vast number of shared packages available to you.
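You can confirm that npm came bundled with your installation by checking both versions from a shell (a quick sanity check, assuming node and npm are on your PATH):

```shell
# Verify that both the Node.js runtime and the bundled npm client are available
node -v   # prints the Node.js version
npm -v    # prints the npm client version
```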

As of this writing, there are over 180,000 packages available in the npm registry.  You can find anything from credit card validators to full language parsers.  If you have a task to perform, odds are there is a third-party package to help you along.  And the best feature I see is that packages are distributed as source script, so you have full access to modify or extend the packages you pull from the repository.


In my article on Loading, I discuss how you use the "require" command to load a module.
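As a quick reminder, require() works the same way for core modules and npm-installed packages.  A minimal sketch using only the built-in "os" module, so nothing needs to be installed first:

```shell
# Load the core "os" module via require() and print the platform name
node -e "const os = require('os'); console.log(os.platform());"
```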

Installing npm

As I mentioned above, npm is included with Node.js versions 0.6.3 and above.  If you are using an older version of node, I'll leave it as an exercise to the reader to find the distribution package.  Better yet, I'd suggest upgrading to a more recent version of Node.js!

npm is itself distributed as a package through npm, so you can have npm upgrade itself.  npm is updated quite frequently, so it's a good idea to upgrade npm once in a while:

$ sudo npm install npm -g

Installing a package

npm has the concept of "local" and "global" packages.  Local packages are installed as part of your current project and are not available to other projects on your system.  Global packages are installed at the system level, so if a distribution includes command line tools (like npm itself) that you will want to use everywhere, you will want to install it in global scope.
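You can see where each kind of installation ends up on your own machine (the exact paths vary by system; this is just an illustrative check):

```shell
# Local packages are installed under ./node_modules in the current project
ls node_modules 2>/dev/null || echo "no local packages yet"

# Global packages live under npm's configured prefix
npm config get prefix
```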

Local Package Installation

To install a package locally, you'll use the "install" npm subcommand.  Most often you will just need to specify the package name as an argument.

$ npm install package-name
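Local installs are typically done inside a project that has a package.json.  If you're starting from scratch, "npm init -y" scaffolds one with defaults (the directory name "demo-app" below is just an example):

```shell
# Create a new project directory and scaffold a default package.json
mkdir demo-app && cd demo-app
npm init -y          # accepts all defaults; writes package.json
cat package.json     # inspect the generated manifest
```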

Global Package Installation

To install a package globally, you'll use the "-g" argument.  Most often you will need to run this under "sudo" to have permission to write to the system folders.

$ sudo npm install -g package-name

Updating Packages

Since packages can be installed locally or globally, there is an update procedure for each as well.  Local updates are the easiest and can be completed with the "update" npm command.  This command will update all the packages in the local installation.

$ npm update

To update packages globally, you add the "-g" option like you did above.

$ npm update -g

But, in some cases you won't want to update all global packages at once.  You can use the "outdated" command to get a listing of global packages that need updating:

$ npm outdated -g --depth=0

Then you can update each outdated global package with the same install command you used above:

$ npm install -g package-name
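Relatedly, you can list everything currently installed at the global level to decide which packages are worth upgrading (a read-only check; output depends on what you have installed):

```shell
# List top-level globally installed packages without their dependency trees
npm ls -g --depth=0
```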

Removing Packages

If the package you recently downloaded isn't living up to your expectations, you can use the "uninstall" npm command to remove the package from a local or global installation.  For a global installation, use the "-g" option as above:

$ npm uninstall package-name
$ npm uninstall -g package-name

Creating Packages

In my article on Node.js Modules, I go over in detail the package creation process.  I'd suggest you read that article for details on creating your own package.
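At a minimum, a package needs a package.json manifest describing it.  A bare-bones sketch (the name, version, and description here are placeholders) looks like this:

```
{
  "name": "my-cool-package",
  "version": "1.0.0",
  "description": "A short description of what the package does",
  "main": "index.js"
}
```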

Publishing Packages

Once you have a great package put together, you may want to share it with the world.  The process for doing so is very easy.  You must first have an npm user account.  You can create a new user with the "npm adduser" command, or log in to an existing account with the "npm login" command.

Once you are authenticated, you can use the "publish" npm command to push your package to the npm registry.  Use the folder name of your package in place of the "my-cool-package" parameter below:

$ npm publish my-cool-package

Conclusion

Check out the Getting Started documentation on the npm website for more information on getting the most out of the Node Package Manager.

More Stories By Joe Pruitt

Joe Pruitt is a Principal Strategic Architect at F5 Networks working with Network and Software Architects to allow them to build network intelligence into their applications.
