Async Waterfall Cannot Read Property 'apply' of Undefined
Since the moment Node.js was unveiled to the world, it has seen its fair share of both praise and criticism. The debate, however, continues, and may not end anytime soon. What we often overlook in these debates is that every programming language and platform is criticized for certain issues, which are created by how we use the platform. Regardless of how difficult Node.js makes writing safe code, and how easy it makes writing highly concurrent code, the platform has been around for quite a while and has been used to build a huge number of robust and sophisticated web services. These web services scale well, and have proven their stability through their endurance of time on the Internet.
However, like any other platform, Node.js is vulnerable to developer problems and issues. Some of these mistakes degrade performance, while others make Node.js appear downright unusable for whatever you are trying to achieve. In this article, we will take a look at ten common mistakes that developers new to Node.js often make, and how they can be avoided to become a Node.js pro.
Mistake #1: Blocking the Event Loop
JavaScript in Node.js (just like in the browser) provides a single-threaded environment. This means that no two parts of your application run in parallel; instead, concurrency is achieved through the asynchronous handling of I/O-bound operations. For example, a request from Node.js to the database engine to fetch some document is what allows Node.js to focus on some other part of the application:
// Trying to fetch a user object from the database. Node.js is free to run
// other parts of the code from the moment this function is invoked..
db.User.get(userId, function(err, user) {
    // .. until the moment the user object has been retrieved here
})
However, a piece of CPU-bound code in a Node.js instance with thousands of clients connected is all it takes to block the event loop, making all the clients wait. CPU-bound code includes attempting to sort a large array, running an extremely long loop, and so on. For example:
function sortUsersByAge(users) {
    users.sort(function(a, b) {
        return a.age < b.age ? -1 : 1
    })
}
Invoking this "sortUsersByAge" function may be fine if run on a small "users" array, but with a large array, it will have a horrible impact on the overall performance. If this is something that absolutely must be done, and you are certain that there will be nothing else waiting on the event loop (for example, if this was part of a command-line tool that you are building with Node.js, and it wouldn't matter if the entire thing ran synchronously), then this may not be an issue. However, in a Node.js server instance trying to serve thousands of users at a time, such a pattern can prove fatal.
If this array of users was being retrieved from the database, the ideal solution would be to fetch it already sorted, directly from the database. If the event loop was being blocked by a loop written to compute the sum of a long history of financial transaction data, it could be deferred to some external worker/queue setup to avoid hogging the event loop.
As you can see, there is no silver-bullet solution to this kind of Node.js problem; rather, each case needs to be addressed individually. The fundamental idea is to not do CPU-intensive work within the front-facing Node.js instances - the ones clients connect to concurrently.
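The original article leaves the exact setup open. As one illustration, here is a minimal sketch that hands the sort to a separate thread using Node's built-in worker_threads module (available since Node.js 10.5); the file name "sort-worker.js" and the helper "sortUsersByAgeAsync" are made up for this example, and since messages are copied between threads, this only pays off when the sort itself dominates:

// sort-worker.js - the sort runs on a worker thread, off the main event loop
const { parentPort } = require('worker_threads')

parentPort.on('message', function(users) {
    users.sort(function(a, b) { return a.age < b.age ? -1 : 1 })
    parentPort.postMessage(users)
})

// main.js - hand the array to the worker and receive the sorted copy back
const { Worker } = require('worker_threads')

function sortUsersByAgeAsync(users, done) {
    const worker = new Worker('./sort-worker.js')
    worker.once('message', function(sorted) {
        worker.terminate()
        done(null, sorted)
    })
    worker.once('error', done)
    worker.postMessage(users)
}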
Mistake #2: Invoking a Callback More Than Once
JavaScript has relied on callbacks since forever. In web browsers, events are handled by passing references to (often anonymous) functions that act like callbacks. In Node.js, callbacks used to be the only way asynchronous elements of your code communicated with each other - up until promises were introduced. Callbacks are still in use, and package developers still design their APIs around callbacks. One common Node.js issue related to using callbacks is calling them more than once. Typically, a function provided by a package to do something asynchronously is designed to expect a function as its last argument, which is called when the asynchronous task has been completed:
module.exports.verifyPassword = function(user, password, done) {
    if(typeof password !== 'string') {
        done(new Error('password should be a string'))
        return
    }
    computeHash(password, user.passwordHashOpts, function(err, hash) {
        if(err) {
            done(err)
            return
        }
        done(null, hash === user.passwordHash)
    })
}
Notice how there is a return statement every time "done" is called, up until the very last time. This is because calling the callback doesn't automatically end the execution of the current function. If the first "return" was commented out, passing a non-string password to this function would still result in "computeHash" being called. Depending on how "computeHash" deals with such a scenario, "done" may be called multiple times. Anyone using this function from elsewhere may be caught completely off guard when the callback they pass is invoked multiple times.
Being careful is all it takes to avoid this Node.js error. Some Node.js developers adopt the habit of adding a return keyword before every callback invocation:
if(err) { return done(err) }
In many asynchronous functions, the return value has almost no significance, so this approach often makes it easy to avoid such a problem.
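If you cannot fully trust the code that will invoke your callback, a defensive wrapper can also help. The following "onlyOnce" helper is a hypothetical sketch (not part of any package mentioned here) that silently ignores every invocation after the first:

// Hypothetical guard: wrap a callback so only its first invocation counts
function onlyOnce(done) {
    var called = false
    return function() {
        if(called) {
            return
        }
        called = true
        done.apply(null, arguments)
    }
}

// Usage sketch: verifyPassword(user, password, onlyOnce(function(err, ok) { /* ... */ }))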
Mistake #3: Deeply Nesting Callbacks
Deeply nesting callbacks, often referred to as "callback hell", is not a Node.js issue in itself. However, it can cause problems, making code quickly spin out of control:
function handleLogin(..., done) {
    db.User.get(..., function(..., user) {
        if(!user) {
            return done(null, 'failed to log in')
        }
        utils.verifyPassword(..., function(..., okay) {
            if(!okay) {
                return done(null, 'failed to log in')
            }
            session.login(..., function() {
                done(null, 'logged in')
            })
        })
    })
}
The more complex the task, the worse this can get. By nesting callbacks in such a manner, we easily end up with error-prone, hard to read, and hard to maintain code. One workaround is to declare these tasks as small functions, and then link them up. However, one of the (arguably) cleanest solutions to this is to use a utility Node.js package that deals with asynchronous JavaScript patterns, such as Async.js:
function handleLogin(done) {
    async.waterfall([
        function(done) {
            db.User.get(..., done)
        },
        function(user, done) {
            if(!user) {
                return done(null, 'failed to log in')
            }
            utils.verifyPassword(..., function(..., okay) {
                done(null, user, okay)
            })
        },
        function(user, okay, done) {
            if(!okay) {
                return done(null, 'failed to log in')
            }
            session.login(..., function() {
                done(null, 'logged in')
            })
        }
    ], function() {
        // ...
    })
}
Similar to "async.waterfall", there are a number of other functions that Async.js provides to deal with different asynchronous patterns. For brevity, we used simpler examples here, but reality is often worse.
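For instance, "async.parallel" runs independent tasks at the same time and hands all their results to a single callback. The following is a minimal sketch, and the "db.User.get" and "db.Settings.get" lookups are only illustrative:

var async = require('async')

async.parallel([
    function(done) { db.User.get(userId, done) },      // result lands in results[0]
    function(done) { db.Settings.get(userId, done) }   // result lands in results[1]
], function(err, results) {
    if(err) { return console.error(err) }
    // both lookups have finished at this point
    console.log(results[0], results[1])
})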
Fault #4: Expecting Callbacks to Run Synchronously
Asynchronous programming with callbacks may not be something unique to JavaScript and Node.js, but they are responsible for its popularity. With other programming languages, we are accustomed to a predictable order of execution, where two statements will execute one after another, unless there is a specific instruction to jump between statements. Even then, these are often limited to conditional statements, loop statements, and function invocations.
However, in JavaScript, with callbacks, a particular function may not run until the task it is waiting on has finished. The execution of the current function will run until the end without any stop:
function testTimeout() {
    var duration = 1 // seconds
    console.log("Begin")
    setTimeout(function() {
        console.log("Done!")
    }, duration * 1000)
    console.log("Waiting..")
}
As you will notice, calling the "testTimeout" function will first print "Begin", then print "Waiting..", followed by the message "Done!" after about a second.
Anything that needs to happen after a callback has fired needs to be invoked from within it.
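For example, here is a sketch of the same "testTimeout" function reworked so that the follow-up work lives inside the callback (the "done" parameter is added purely for this illustration):

function testTimeout(done) {
    console.log("Begin")
    setTimeout(function() {
        console.log("Done!")
        done() // anything that depends on the timeout goes here, not below
    }, 1000)
    console.log("Waiting..")
}

testTimeout(function() {
    console.log("This runs only after the timeout has fired")
})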
Mistake #5: Assigning to "exports", Instead of "module.exports"
Node.js treats each file as a small isolated module. If your package has two files, perhaps "a.js" and "b.js", then for "b.js" to access "a.js"'s functionality, "a.js" must export it by adding properties to the exports object:
// a.js
exports.verifyPassword = function(user, password, done) { ... }
When this is done, anyone requiring "a.js" will be given an object with the property function "verifyPassword":
// b.js
require('./a.js')
// { verifyPassword: function(user, password, done) { ... } }
However, what if we want to export this function directly, and not as the property of some object? We can overwrite exports to do this, but we must not treat it as a global variable then:
// a.js
module.exports = function(user, password, done) { ... }
Notice how we are treating "exports" as a property of the module object. The distinction here between "module.exports" and "exports" is very important, and is often a cause of frustration among new Node.js developers.
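To see why, keep in mind that "exports" starts out as nothing more than a reference to "module.exports". A short sketch of the pitfall, with both forms shown side by side for contrast:

// a.js
exports = function(user, password, done) { ... }        // rebinds the local variable only; nothing is exported
module.exports = function(user, password, done) { ... } // replaces the exported value; require('./a.js') returns this function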
Mistake #6: Throwing Errors from Inside Callbacks
JavaScript has the notion of exceptions. Mimicking the syntax of almost all traditional languages with exception handling support, such as Java and C++, JavaScript can "throw" and catch exceptions in try-catch blocks:
function slugifyUsername(username) {
    if(typeof username !== 'string') {
        throw new TypeError('expected a string username, got '+(typeof username))
    }
    // ...
}

try {
    var usernameSlug = slugifyUsername(username)
} catch(e) {
    console.log('Oh no!')
}
However, try-catch will not behave as you might expect it to in asynchronous situations. For example, if you wanted to protect a large chunk of code with lots of asynchronous activity with one big try-catch block, it wouldn't necessarily work:
try {
    db.User.get(userId, function(err, user) {
        if(err) {
            throw err
        }
        // ...
        usernameSlug = slugifyUsername(user.username)
        // ...
    })
} catch(e) {
    console.log('Oh no!')
}
If the callback to "db.User.get" fired asynchronously, the scope containing the try-catch block would have long gone out of context for it to still be able to catch those errors thrown from inside the callback.
This is why errors are handled differently in Node.js, and it makes it essential to follow the (err, …) pattern on all callback function arguments - the first argument of every callback is expected to be an error, if one happened.
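In other words, instead of wrapping asynchronous calls in try-catch, inspect the error argument inside the callback itself. A minimal sketch based on the example above:

db.User.get(userId, function(err, user) {
    if(err) {
        // handle or propagate the error here - do not throw it
        return console.error('failed to fetch user:', err)
    }
    var usernameSlug = slugifyUsername(user.username)
    // ... continue working with usernameSlug
})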
Mistake #7: Assuming Number to Be an Integer Datatype
Numbers in JavaScript are floating points - there is no integer data type. You wouldn't expect this to be a problem, as numbers big enough to stress the limits of float are not encountered often. That is exactly when mistakes related to this happen. Since floating point numbers can only hold integer representations up to a certain value, exceeding that value in any calculation will immediately start messing it up. As strange as it may seem, the following evaluates to true in Node.js:
Math.pow(2, 53)+1 === Math.pow(2, 53)
Unfortunately, the quirks with numbers in JavaScript don't stop here. Even though numbers are floating points, operators that work on integer data types work here as well:
5 % 2 === 1 // true
5 >> 1 === 2 // true
However, unlike arithmetic operators, bitwise operators and shift operators work only on the trailing 32 bits of such large "integer" numbers. For example, trying to shift "Math.pow(2, 53)" right by one will always evaluate to 0. Trying to do a bitwise-or of one with that same large number will evaluate to 1.
Math.pow(2, 53) / 2 === Math.pow(2, 52) // true
Math.pow(2, 53) >> 1 === 0 // true
(Math.pow(2, 53) | 1) === 1 // true
You may rarely need to deal with large numbers, but if you do, there are plenty of big integer libraries that implement the important mathematical operations on large precision numbers, such as node-bigint.
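As a side note that goes beyond the original article, newer Node.js versions (10.4 and later) also ship a built-in BigInt type, and Number.isSafeInteger can tell you when you are leaving safe territory:

Number.isSafeInteger(Math.pow(2, 53)) // false - already past Number.MAX_SAFE_INTEGER

var big = 2n ** 53n // BigInt literals use the n suffix
big + 1n === big // false - arbitrary precision, no silent rounding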
Mistake #8: Ignoring the Advantages of Streaming APIs
Let's say we want to build a small proxy-like web server that serves responses to requests by fetching the content from some other web server. As an example, we shall build a small web server that serves Gravatar images:
var http = require('http')
var crypto = require('crypto')

http.createServer()
.on('request', function(req, res) {
    var email = req.url.substr(req.url.lastIndexOf('/')+1)
    if(!email) {
        res.writeHead(404)
        return res.end()
    }

    var buf = Buffer.alloc(1024*1024)
    http.get('http://www.gravatar.com/avatar/'+crypto.createHash('md5').update(email).digest('hex'), function(resp) {
        var size = 0
        resp.on('data', function(chunk) {
            chunk.copy(buf, size)
            size += chunk.length
        })
        .on('end', function() {
            res.write(buf.slice(0, size))
            res.end()
        })
    })
})
.listen(8080)
In this particular example of a Node.js problem, we are fetching the image from Gravatar, reading it into a Buffer, and then responding to the request. This isn't such a bad thing to do, given that Gravatar images are not too big. However, imagine if the contents we were proxying were thousands of megabytes in size. A much better approach would have been this:
http.createServer()
.on('request', function(req, res) {
    var email = req.url.substr(req.url.lastIndexOf('/')+1)
    if(!email) {
        res.writeHead(404)
        return res.end()
    }

    http.get('http://www.gravatar.com/avatar/'+crypto.createHash('md5').update(email).digest('hex'), function(resp) {
        resp.pipe(res)
    })
})
.listen(8080)
Here, we fetch the image and simply pipe the response to the client. At no point do we need to read the entire content into a buffer before serving it.
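One caveat worth adding (a hedged addition, not part of the original example): upstream failures should still be handled, otherwise the client request may hang. Here "avatarUrl" stands for the Gravatar URL built above:

http.get(avatarUrl, function(resp) {
    resp.pipe(res)
}).on('error', function(err) {
    // the upstream request failed; answer the client instead of leaving it hanging
    res.writeHead(502)
    res.end()
})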
Mistake #9: Using Console.log for Debugging Purposes
In Node.js, "console.log" allows you to print almost anything to the console. Pass an object to information technology and it will print it as a JavaScript object literal. It accepts any arbitrary number of arguments and prints them all neatly space-separated. There are a number of reasons why a programmer may feel tempted to use this to debug his code; still, it is strongly recommended that you lot avert "console.log" in real lawmaking. You should avoid writing "console.log" all over the code to debug it and then commenting them out when they are no longer needed. Instead, utilize ane of the amazing libraries that are built but for this, such as debug.
Packages like these provide convenient means of enabling and disabling sure debug lines when you lot start the application. For example, with debug it is possible to prevent any debug lines from existence printed to the concluding by not setting the DEBUG surround variable. Using it is unproblematic:
// app.js
var debug = require('debug')('app')
debug('Hello, %s!', 'world')
To enable debug lines, simply run this code with the environment variable DEBUG set to "app" or "*":
DEBUG=app node app.js
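The debug package also supports namespaces, so different parts of an application can be toggled independently; the names "app:db" and "app:http" below are only illustrative:

var debugDb = require('debug')('app:db')
var debugHttp = require('debug')('app:http')

debugDb('connected to %s', 'mongodb://localhost/test')
debugHttp('listening on port %d', 8080)

// Enable them selectively, e.g. DEBUG=app:db node app.js, or all at once with DEBUG=app:* node app.js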
Mistake #10: Not Using Supervisor Programs
Regardless of whether your Node.js code is running in production or in your local development environment, a supervisor program that can orchestrate your program is an extremely useful thing to have. One practice often recommended by developers designing and implementing modern applications is that your code should fail fast. If an unexpected error occurs, do not try to handle it; rather, let your program crash and have a supervisor restart it in a few seconds. The benefits of supervisor programs are not just limited to restarting crashed programs. These tools also allow you to restart the program when certain files change. This makes developing Node.js programs a much more pleasant experience.
There is a plethora of supervisor programs available for Node.js. For example:
- pm2
- forever
- nodemon
- supervisor
All these tools come with their pros and cons. Some of them are good for handling multiple applications on the same machine, while others are better at log management. However, if you want to get started with such a program, all of these are fair choices.
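As a rough illustration of how little setup these tools need (standard usage only; exact flags and versions may vary):

# Restart on file changes during development
npm install -g nodemon
nodemon app.js

# Keep the process alive (and restart it on crashes) in production
npm install -g pm2
pm2 start app.js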
Conclusion
As you can tell, some of these Node.js problems can have devastating effects on your program. Some may be the cause of frustration while you're trying to implement the simplest of things in Node.js. Although Node.js has made it extremely easy for newcomers to get started, it still has areas where it is just as easy to mess up. Developers coming from other programming languages may be able to relate to some of these issues, but these mistakes are quite common among new Node.js developers. Fortunately, they are easy to avoid. I hope this short guide will help beginners to write better code in Node.js, and to develop stable and efficient software for us all.
Source: https://www.toptal.com/nodejs/top-10-common-nodejs-developer-mistakes