Monday, April 20, 2015

Duplicate key error index Mongoose

I'm trying to create multiple objects that contain empty arrays. Unfortunately, I get an error message when I try to create more than one element like that.

Here is my object schema:

var groupSchema = mongoose.Schema({
id: mongoose.Schema.ObjectId,
name: { type: String, required: true },
section: { type: mongoose.Schema.ObjectId, ref:"Section", childPath:"groups" },
users: [ {type : mongoose.Schema.ObjectId, ref : 'User', childPath: "groups"} ],
invitations: [{
    _id:false,
    email: { type: String, required: true },
    isRegistered: { type: Boolean, required: true }
}],

});

Simple create function:

//Create new group
exports.createGroup = function(req, res){
Group.create(req.body, function(err, group){
    if(err){
        console.log(err);
        res.json(err);
        return false;
    }
    res.json({status: true, group: group});
    return true;
});
};

And error message:

{ [MongoError: insertDocument :: caused by :: 11000 E11000 duplicate key error index: skaud_up.groups.$invitations.email_1  dup key: { : null }]
name: 'MongoError',
code: 11000,
err: 'insertDocument :: caused by :: 11000 E11000 duplicate key error index: skaud_up.groups.$invitations.email_1  dup key: { : null }' }

Honestly, I don't know why I can't have multiple documents with empty arrays in a MongoDB database.

Can someone explain the cause of this issue and the proper way of using this sort of object?
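For what it's worth, the error message itself points at a likely culprit: a leftover unique index on invitations.email in the groups collection (perhaps from an earlier schema revision that declared that field unique). Two groups with empty invitations arrays both index the field as null, which collides. A sketch of the usual cleanup, run in the mongo shell (the index name is taken from the error message):

```
// Inspect and drop the stale unique index on the groups collection
db.groups.getIndexes()                       // confirm invitations.email_1 is listed
db.groups.dropIndex('invitations.email_1')

// If unique invitation emails are still wanted, a sparse unique index
// skips documents where the field is missing:
db.groups.createIndex({ 'invitations.email': 1 }, { unique: true, sparse: true })
```

This is an assumption worth verifying against getIndexes() output before dropping anything.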

How to work around the npm 'Error: Invalid version: "0.1"' bug?

I am trying to build a Node.js package. When I run npm install, I get an Error: Invalid version: "0.1" message and the npm installation fails.

I tried to fix the error manually by replacing "version": "0.1", with "version": "0.0.1", in the package.json files in the module directories, but there are many modules that contain the invalid 0.1 version. It's very hard to fix manually.

Is there a simpler way to fix it? Or maybe an awk, sed or other bash script that searches for package.json files recursively and replaces "version": "0.1", with "version": "0.0.1",?

EDIT: I already checked out the thread npm: Why is a version "0.1" invalid? and lots of others before asking this question.

npm with nonProxyHost



I've searched the internet for the correct way to set up npm when some sites must be accessed through a proxy and some without, but I haven't found an answer.

For example, we have an npm registry mirror within our corporate infrastructure and therefore have to access it without a proxy setting. On the other hand, some npm modules access public URLs on their own, especially when they need to download a few things, and therefore need the proxy set. A very good example is the phantomjs binary.
So configuring npm with the proxy option and the registry pointing to our internal infrastructure is currently not possible, because when the proxy is set, all URLs (even the registry one) are resolved through the proxy server.

So I would need something similar to what Java has:
-Dhttp.proxyHost="gate.company.com" -Dhttp.proxyPort=1234 -Dhttp.nonProxyHosts="10.*|*.company.com"

Is it possible with npm?

I've checked npm config, but there are only 'proxy' and 'https-proxy' settings, nothing in the direction I require.

Thanks for any answers.

Can't use req.user in another controller

I'm using the angular-fullstack generator with Sequelize.

I need to use the logged-in user object (req.user) in another controller (with another model), not in the User controller.

But req.user is undefined in the other controller. Any idea?

Thanks

Javascript Grunt passing arguments for Tasks

I'm just getting started with Grunt and I'm facing this issue: I have to pass two paths as arguments, but although the execution seems to be OK, the task doesn't really do anything... Any hints?

grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),

    copy: {
        backup: {
            files: [
                {expand: true, src: [process.argv[2]], dest: [process.argv[3]]},
            ],
        },
    },
});

I run

grunt copy:backup:sourcefolder:destinationfolder

but the code executes without producing any result.

Thanks in advance for the help!
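For what it's worth, positional process.argv indices are fragile here: argv[2] is the whole task string copy:backup:sourcefolder:destinationfolder, not a path, so the copy task gets a src that matches nothing. A common alternative is named flags via grunt.option; this is a sketch, and the --src/--dest flag names are mine:

```
// Gruntfile.js fragment; invoke as:
//   grunt copy:backup --src=sourcefolder --dest=destinationfolder
grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),

    copy: {
        backup: {
            files: [
                // dest should be a string, not an array
                { expand: true, src: [grunt.option('src')], dest: grunt.option('dest') }
            ]
        }
    }
});
```

grunt.option reads --key=value flags from the command line, so the config stays declarative and the colon-args parsing problem disappears.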

nodejs' ldapauth module install failure

I have a Node.js application running on Windows 7 64-bit. Now I want to install ldapauth (http://ift.tt/1O5mjzZ), but when I do, I get the following error during install. Please help!

C:\Programs\nodejsCloudant++>npm install ldapauth
npm WARN package.json make@0.0.0 No repository field.

> bcrypt@0.7.5 install C:\Programs\nodejsCloudant++\node_modules\ldapauth\node_modules\bcrypt
> node-gyp rebuild
C:\Programs\nodejsCloudant++\node_modules\ldapauth\node_modules\bcrypt>if not defined npm_config_node_gyp (node "C:\Programs\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild )  else (rebuild)
gyp ERR! configure error
gyp ERR! stack Error: Command failed: CreateProcessW: Access is denied.
gyp ERR! stack

Singleton MongoDB connection in Node

What is the best way to set up a singleton MongoDB connection in Node? I tried the following code, but it does not work when making a lot of calls rapidly.

The singleton does not get set up before subsequent calls come in, so the code tries to open too many connections and eventually fails. It works fine for infrequent calls.

Anyone have suggestions on the best practice here?

var db_singleton;

var getConnection= function getConnection(callback)
{
    if (db_singleton)
    { 
      callback(null,db_singleton);
    }
    else
    {
        var connURL = mongoURI; //set in env variables
        mongodb.connect(connURL,function(err,db){
            if(err)
                console.error("Error creating new connection "+err);
            else
            {
                db_singleton=db;    
                console.error("created new connection");
            }
            callback(err,db_singleton);
            return;
        });
    }
}
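The root cause is a race: every call that arrives before the first connect finishes takes the else branch and opens its own connection. A sketch of a queue-based fix (the function and variable names are mine, and the connect argument stands in for mongodb.connect):

```javascript
// Queue callbacks that arrive while the first connect is in flight,
// and flush them all once it finishes, so only one connection opens.
function makeSingleton(connect) {
  var db = null;
  var waiting = [];        // callbacks queued during the initial connect
  var connecting = false;
  return function getConnection(callback) {
    if (db) return callback(null, db);
    waiting.push(callback);
    if (connecting) return;  // a connect is already in flight
    connecting = true;
    connect(function (err, conn) {
      if (!err) db = conn;
      var cbs = waiting;
      waiting = [];
      connecting = false;
      cbs.forEach(function (cb) { cb(err, db); });
    });
  };
}
```

Usage would be var getConnection = makeSingleton(mongodb.connect.bind(mongodb, mongoURI)); every caller then shares the single connection regardless of timing.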

How to make multiple HTTP calls from a Node.js server with a common callback?

I want to make multiple HTTP calls from a Node.js server with a common callback. Are there any modules available for doing this?
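The async module's async.parallel and native Promise.all both do exactly this. The underlying pattern is small enough to sketch without dependencies (each task below would wrap a real http.get in practice):

```javascript
// Fire all requests, collect the results in order, and invoke one
// shared callback when the last request finishes (or on first error).
function inParallel(tasks, done) {
  var results = new Array(tasks.length);
  var remaining = tasks.length;
  if (remaining === 0) return done(null, results);
  tasks.forEach(function (task, i) {
    task(function (err, res) {
      if (err) {               // report the first error only once
        var d = done;
        done = function () {};
        return d(err);
      }
      results[i] = res;
      if (--remaining === 0) done(null, results);
    });
  });
}
```

With async it is simply async.parallel([fn1, fn2], commonCallback), and with promises Promise.all([p1, p2]).then(commonCallback).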

understanding require() with module.exports with javascript and browserify

I am a C++ programmer at heart and I'm currently being thrown into the deep end with JavaScript and asked to swim very quickly. I am using browserify, so I am able to use the require function that Node.js uses to get access to code in other files. Everything I have done seems to work fine; I am just unsure that I am doing it correctly.

//a.js
module.exports = function(){
    alert("hello world");
}

//b.js
var myClass = new MyClass();
module.exports = myClass;

//c.js
var a = require("./a.js");
a();
var b = require("./b.js");
b.prototype.test = 555;

//d.js
function () {
    var a = require("./a.js");
    a();
    var b = require("./b.js");
    assert(b.test === 555);
}
function () { // duplicated require called inside same file but different function
    var a = require("./a.js");
    a();
}

So in every function and every file where I want to use a.js, do I have to put the require call? It seems like it will get a bit convoluted. Is there a better way to do this? Also, assuming that c.js is run before d.js, will the assert pass, or does each require create a copy of myClass so that C and D get different objects?

Thanks for any help.

apr_socket_recv: connection reset by peer (104) nodejs Ubuntu 14.10 x64 8GB RAM 4 Core (VPS)

I am working on a project in Node.js for location (GPS) tracking. The ultimate target for my project is to serve 1M concurrent requests. What I have done so far:

Created a server in node.js listing on port 8000
A html document with google map to locate user positions/GPS locations
A single socket connection between server and html document to pass location information
An API to get user location from client devices ( it can be a mobile app )
Once the server receives a user location via the API mentioned, it emits that information to the client (the HTML document) via the socket.

It's working well and good.

Problem
I am using apachebench to load test my server. When I increase the concurrency, benchmarking breaks frequently with the error

apr_socket_recv: Connection reset by peer (104)  

How can I solve this? What is the actual cause of this problem?

Note: if I run the same server on my local Windows machine, it serves 20K requests successfully.

I have changed

ulimit -n 9999999

and the soft and hard open-file limits to 1000000; neither solves my problem.

Please help me understand the problem clearly. How do I increase the concurrency to 1M? Is it possible with some more hardware/network tuning?

Edit:

I am using socketCluster on the server with the number of workers equal to the number of cores/CPUs, i.e. 4 workers in my case
CPU usage (htop in the server terminal) is 45%
Memory usage was around 4GB / 8GB, and swap space is not used
The ab command I used to load the server was

ab -n 20000 -c 20000 "http://IP_ADDRESS:8000/API_INFO"

How do I render data on Android from a Node.js/MongoDB server

The POST method works successfully; it's just that I cannot see the string that I am sending from Android showing up on the server.

This is my code

protected String doInBackground(String... params) {
    postData(params[0]);
    return null;
}

    @Override
    protected void onPostExecute(String s) {
        pb.setVisibility(View.GONE);
        Toast.makeText(getApplicationContext(), "Your Post has been successfully posted",
                Toast.LENGTH_SHORT).show();
    }

    public void postData(String titleToSend){
    HttpClient client = new DefaultHttpClient();
    HttpPost post = new HttpPost("http://ift.tt/1HloeL3");

    try{
        List<NameValuePair> nameValuePairs = new ArrayList<NameValuePair>();
        nameValuePairs.add(new BasicNameValuePair("title", titleToSend));
        post.setEntity(new UrlEncodedFormEntity(nameValuePairs));

        HttpResponse response = client.execute(post);
    }catch (ClientProtocolException e){
        e.printStackTrace();
    }catch (IOException e){
        e.printStackTrace();
    }
}

The onClick listener code

public void onClick(View v) {
            String tit = null;

            if(title.getText().toString().length()<1){
                Toast.makeText(getApplicationContext(), "Please Fill in all fields", Toast.LENGTH_SHORT).show();
            }else{
                pb.setVisibility(View.VISIBLE);
                myTask task = new myTask();
                task.execute(title.getText().toString());
            }
}

The field name on the server is title, so whatever the user enters in the EditText should go to the title field.

The backend api.js

api.route('/')
        // GET method to get list of ads
        .get(function(req, res) {
            Ad.find({}, function(err, ads) {
                if(err) {
                    res.send(err);
                    return;
                }
                res.json(ads);
            });
        }) 

        // POST method to post an ad to the database
        .post(function(req, res) {
            var ad = new Ad({
                title: req.body.title,
                photos: req.body.photos,
            });

            ad.save(function(err) {
                if(err) {
                    res.send(err);
                    return;
                }
                res.json(ad);
            });
        });

Can anyone point out what I am doing wrong here, or guide me on how I should do it, please?

Express js: How to download a file using POST request

When I use GET, everything works fine. However, I struggle to achieve the same effect using POST. Here is the code I have tried:

1.

app.post("/download", function (req, res) {
    res.download("./path");
});

2.

app.post("/download", function (req, res) {
    res.attachment("./path");
    res.send("ok");
});

3.

app.post("/download", function (req, res) {
    res.sendFile("./path");
});

None of them work. What is the correct way to do this?

EDIT: I submit a POST request through an HTML form to /download. ./path is a static file. When I use the code in method 1, I can see the correct response header and response body in the developer tools, but the browser does not prompt a download.

Sequelize hooks not fired

I'm trying to use the Sequelize beforeCreate hook to create a Card instance for my User. For some reason the hook is not fired:

"use strict";

module.exports = function(sequelize, DataTypes) {
  var User = sequelize.define("User", {
    email: DataTypes.STRING,
    hash: DataTypes.STRING(1024),
    salt: DataTypes.STRING(512),
    activationKey: DataTypes.STRING,
    resetPasswordKey: DataTypes.STRING
  }, {
    hooks: {
      beforeCreate: function(user, options, cb) {
        console.log('[ORM]: hook `beforeCreate` for model User...')
        models.Card
          .create({
            name: user.name,
            url: null
          })
          .then(function(card) {
            user.addCard(card).then(function() {
              cb(null, user)
            })
          })
          .catch(function(err) {
            console.log(err)
            cb(null, user)
          })
      }
    },
    classMethods: {
      associate: function(models) {
        // associations can be defined here
        User.hasMany(models.Card)
      }
    }
  });

  return User;
};

Testing:

db.sequelize.sync().then(function() {
  console.log('[INFO]: database synced')
  User.create({
    email: "helloworld@example.com",
    name: "John Dow"
  })
});

The User is created, but the Card is not; the hook is never fired:

[INFO]: database synced
Executing (default): INSERT INTO "Users" ("id","email","updatedAt","createdAt") VALUES (DEFAULT,'helloworld@example.com','2015-04-20 09:22:50.388 +00:00','2015-04-20 09:22:50.388 +00:00') RETURNING *;

Any ideas what I'm doing wrong? Thank you!

async.waterfall bind context

I am currently working on a web application with Node.js and I can't figure out a context problem with the async library.

Here is a example of code of my application :

notification.prototype.save = function (callback) {

    async.parallel([
        // Save the notification and associate it with the doodle
        function _saveNotification (done) {

            var query = 'INSERT INTO notification (notification_id, user_id, doodle_id, schedule_id) values (?, ?, ?, ?)';
            notification.db.execute(query, [ this.notification_id, this.user_id, this.doodle_id, this.schedule_id ], { prepare : true }, function (err) {
                return done(err);
            });

            console.log("SAVE NOTIFICATION");
            console.log("doodle_id", this.doodle_id);

        }.bind(this),

        // Save notification for the users with the good profile configuration
        function _saveNotificationForUsers (done) {
            this.saveNotificationForUsers(done);
        }.bind(this)

    ], function (err) {
        return callback(err);
    });
};

So in this code, I have to use the bind method to bind my object's context (this), because otherwise async changes it. I get that. But what I don't understand is why the code of this.saveNotificationForUsers does not work the same way:

notification.prototype.saveNotificationForUsers = function (callback) {

    console.log("SAVE NOTIFICATION FOR USERS");
    console.log("doodle id : ", this.doodle_id);

    async.waterfall([
        // Get the users of the doodle
        function _getDoodleUsers (finish) {
            var query = 'SELECT user_id FROM users_by_doodle WHERE doodle_id = ?';
            notification.db.execute(query, [ this.doodle_id ], { prepare : true }, function (err, result){
                if (err || result.rows.length === 0) {
                    return finish(err);
                }

                console.log("GET DOODLE USERS");
                console.log("doodle id : ", this.doodle_id);

                return finish(err, result.rows);
            });
        }.bind(this)
    ], function (err) {
        return callback(err);
    });
};

When I call the previous code, the first console.log is able to show me the this.doodle_id variable, which means the function knows the this context. But the functions inside the waterfall call do not, even if I bind this to them.

I found a way to make it work by creating a me variable equal to this just before calling waterfall, and binding the functions to me instead of this, but I would like to understand why I am forced to do this when I use async.waterfall and not when I use async.parallel.

I hope I was clear in describing my problem; if someone can help me understand it, it will be a great pleasure!
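The difference is not waterfall vs parallel but nesting depth: .bind(this) fixes this only for the function it is called on, not for callbacks created inside it. In the parallel example, the console.log sits in the bound function itself; in the waterfall example, it sits inside the anonymous callback passed to db.execute, which was never bound. A runnable sketch (names are mine; fakeExecute stands in for notification.db.execute):

```javascript
// fakeExecute just invokes its callback, like db.execute would.
function fakeExecute(query, params, opts, cb) { cb(null, { rows: [1] }); }

var obj = {
  doodle_id: 42,
  run: function (results) {
    var outer = function () {
      // `this` is obj here: the surrounding function was bound below
      results.push(this.doodle_id);
      fakeExecute('q', [], {}, function () {
        // but this inner anonymous callback was never bound, so `this`
        // is the global object (or undefined in strict mode) here
        results.push(this && this.doodle_id);
      });
      fakeExecute('q', [], {}, () => {
        // an arrow function (or a second .bind(this)) keeps the context
        results.push(this.doodle_id);
      });
    }.bind(this);
    outer();
  }
};

var out = [];
obj.run(out);
console.log(out); // [ 42, undefined, 42 ]
```

So the me/self variable, a second .bind on the inner callback, or an arrow function are all equivalent fixes; async itself is not the culprit.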

Meteor JS: include dynamic name template in layout

I have this basic layout. I want to include a dynamic header in the template: +header should become something like +{{get_header_name}}, where get_header_name is a helper function. I tried this idea but jade throws an error. Any ideas how to make it dynamic?

basic.jade

template(name="basicLayout")
    #main
        header
            +header // <--- make this a dynamic using helper (get_header_name)
            +search
        else
            +yield
        footer
            +footer

What is a fast & scalable way of identifying nearby places from a defined set?

Given a set of places with predefined lat/long (locations that are not defined in Google Maps etc. should be included), I need to get the top N places within a given distance radius of a given location (also with lat/long defined). An example of the places list would be restaurants in a country. Considering processing time and scalability (assuming the number of predefined places will grow over time), what is the best way to do this?

To narrow things down, assume the places list and the location can be filtered by an area code. The location will be sent from a mobile device to a Node.js Express service, where I'm hoping to do the processing on the server end and return a list of nearby places.

I found the link below, which calculates the direct distance between two lat/long points, but I need to know if there is a better way, since I may need to consider driving distance too.

http://ift.tt/1DCVkUf
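A common first cut is straight-line (haversine) distance, then sort and take the top N; a routing service can be applied to that shortlist afterwards if true driving distance matters. At scale, a geospatial index (e.g. MongoDB's 2dsphere with $near) avoids scanning every place. A sketch of the in-process version (the place shape { lat, lon } is my assumption):

```javascript
// Great-circle distance between two lat/long points, in km.
function haversineKm(lat1, lon1, lat2, lon2) {
  var R = 6371; // mean Earth radius, km
  var toRad = function (d) { return d * Math.PI / 180; };
  var dLat = toRad(lat2 - lat1);
  var dLon = toRad(lon2 - lon1);
  var a = Math.pow(Math.sin(dLat / 2), 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.pow(Math.sin(dLon / 2), 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Top N places within radiusKm of (lat, lon), nearest first.
function nearest(places, lat, lon, radiusKm, n) {
  return places
    .map(function (p) {
      return { place: p, dist: haversineKm(lat, lon, p.lat, p.lon) };
    })
    .filter(function (p) { return p.dist <= radiusKm; })
    .sort(function (a, b) { return a.dist - b.dist; })
    .slice(0, n)
    .map(function (p) { return p.place; });
}
```

The area-code pre-filter mentioned above keeps the candidate list small enough that this linear scan stays cheap for a while; the index-based approach is what removes the eventual scaling limit.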

Mocha-casperjs is not creating the xunit file

I am using mocha-casperjs to run my tests, via the command make test. My Makefile looks like this:

Makefile:

test:
    # Clear console log before we start.
    @clear

    # Make sure we are not having too much modules.
    @npm prune

    # Make sure we have the required modules.
    @npm install

    # Clear the console, so we only see the test results.
    @clear

    # Run the test.
    ./node_modules/.bin/mocha-casperjs sample.js --xunit=xmllog.xml

.PHONY: test

But the xmllog.xml is never created. I tried touching xmllog.xml before running the tests. I tried forcing errors in the tests, to rule out the failing tests as the cause; I have already commented out the failing tests. But no xmllog.xml is created. Does someone have a clue?

I am running mocha-casperjs version 1.1.0-beta3.

Thanks!

Answer:

Thanks to @vanadium23 I was able to solve this all-too-easy issue. I was confusing the documentation of CasperJS itself with that of mocha-casperjs. His answer was:

In documentation there is no such option as --xunit. Instead of this, you need to use option --file=xmllog.xml

Thanks @vanadium23

Javascript to render charts not working with .getJSON

Correction: /devicedata should be /products; that was a typo.

I have built a Node-Express app which I am using to plot a graph from data retrieved from a MongoDB database. I have tried two external libraries, Chart.js and CanvasJS. Both work perfectly fine when the data is hard-coded in the JavaScript. The moment I use $.getJSON to retrieve the data from the database, it stops working. The server-side code is as below:

app.get('/products', function(req, res) {
    var db = req.db;
    db.collection('products').find().toArray(function (err, items) {
       res.json(items);
    });
});

The client-side code is as below:

<script type="text/javascript">
$(document).ready(function () {

    $.getJSON("/products", function (result) {

        var chart = new CanvasJS.Chart("chartContainer", {
            title:{
                text: "Temperatures recorded this year"
                },
            data: [
                {
                    type: "column",
                    dataPoints: result,
                }
            ]
        });

        chart.render();
    });
});
</script>
<div id="chartContainer" style="height: 300px; width: 100%;">
</div>

Is there an alternative to .getJSON for retrieving data from the database (Mongo in this case)? The chart renders as a blank canvas.
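A guess worth checking before swapping out .getJSON: CanvasJS expects dataPoints shaped like { label, y } with numeric y, while the documents coming back from Mongo carry _id and other fields, often with string values. A sketch of the mapping (the field names name and temperature are hypothetical stand-ins for the real schema):

```javascript
// Convert raw Mongo documents into CanvasJS-style data points.
function toDataPoints(docs) {
  return docs.map(function (d) {
    return { label: d.name, y: Number(d.temperature) };
  });
}
```

Passing dataPoints: toDataPoints(result) to the chart, and inspecting the raw /products payload in the browser's network tab, are cheap first diagnostics.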

How to trigger error on fs.rename command if file already exists?

I have the following code that moves files from one directory to another:

var fs = require('fs'),
    oldPath = 'firstfile.txt',
    newPath = 'temp/firstfile.txt';

fs.rename(oldPath, newPath, function (err) {
    console.log('rename callback ', err); 
});

Is it possible to trigger an error if the newPath file already exists?

Plotly on Node.js - "socket hang up and read ECONNRESET"

I get errors when using Plotly on Node.js; can you help me?

I wrote a demo to generate images from the Plotly API: every 20 seconds the server creates a chart image. Some images come out fine, but then it throws errors like these:

(screenshots of the "socket hang up" and "read ECONNRESET" errors were attached here)

My code:

setInterval(function() {
  plotly.getImage(figure, imgOpts, function(error, imageStream) {
    if (error) return console.log(error);
    var time = new Date();
    console.log("1--" + new Date());
    var fileStream = fs.createWriteStream(time + '.png');
    console.log("2--" + new Date());
    imageStream.pipe(fileStream);

    //            setTimeout(function(){
    //                console.log("3--"+new Date());
    //                //createPdfkit(time);
    //            },10000);


    //            imageStream.on('end',function(){
    //
    //            });
  });

}, 20000);

Check package version at runtime in nodejs?

I have some entries in my package.json defined with "*":

"dependencies": {
    "express": "4.*",
    "passport": "*",
    "body-parser": "*",
    "express-error-handler": "*"
},

I want to freeze those values to the currently installed versions. How can I find out what version my packages are at run time? I don't mind checking one by one since I don't have many of them :)

BTW: I cannot run npm list --depth=0 because I cannot access the VM directly (PaaS restriction), just the logs.

How to generate a shell script in Eclipse when building the program?

I would like Eclipse to generate a shell script when I build the program; the shell script should contain all the JAR dependencies.

That way, I can invoke the shell script file from Node.js to run the action.

On Google, everyone talks about the Manifest/plug-ins; I am a little bit confused about the steps.

Sunday, April 19, 2015

Using logstash and elasticseach

I'm using node-bunyan to manage log information through Elasticsearch and Logstash, and I'm facing a problem.


In fact, my log file has the information I need and fills up fine.

The problem is that Elasticsearch doesn't find anything at

http://localhost:9200/logstash-*/

so I get an empty object, and I can't deliver my logs to Kibana.


Here's my logstash conf file :



input {
    file {
        type => "nextgen-app"
        path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
        codec => "json"
    }
}

output {
    elasticsearch {
        host => "localhost"
        protocol => "http"
    }
}


And my js code :



log = bunyan.createLogger({
    name: 'myapp',
    streams: [
        {
            level: 'info',
            path: './app/logs/nextgen-info-log.log'
        },
        {
            level: 'error',
            path: './app/logs/nextgen-error-log.log'
        }
    ]
})

router.all('*', (req, res, next) => {
    log.info(req.url)
    log.info(req.method)
    next()
})


NB: the logs are written correctly to the log files. The problem is between Logstash and Elasticsearch. :-/

EDIT: querying http://localhost:9200/logstash-*/ gives me "{}", an empty JSON object. Thanks in advance.


Incompatible node.js version

I tried to install node-png: npm install node-png -g


I got a message says:



npm WARN engine node-png@0.4.3: wanted: {"node":"0.8.x"} (current: {"node":"0.12.2","npm":"2.7.4"})



How do I fix it? Note: the package wants an earlier Node version than the one I use.


Azure SQL Read timeout

The following code



var query = {
    identifier: identifier,
    state: commons.OTP_TOKEN_LIFECYCLE_STATES.UNUSED
};

this.store.where(query).take(1).read({
    success: function(results){
        console.log('Got some kickass randomness');
        resolve(results);
    },

    error: function(err){
        console.log("Error while reading the data");
        console.error(err);
        reject({});
    }
});


fails with a timeout on Azure mobile backend services. this.store is a reference to the table "otp tokens".


I have tried many things, from changing the Node.js version to updating the mssql driver installed on the server. But IT NEVER WORKS.


More interestingly, writes are extremely fast.


I also added a logger to log "read" events on the "otp tokens" table; to my surprise, it never fires.


Anybody got an insight into what is going on?


How to limit the number of nodes? (SpriteKit)

Hey guys, I have a little problem; check it out:



override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {

    for touch in (touches as! Set<UITouch>) {

        var Location: CGPoint = touch.locationInNode(self)
        var Node: SKNode = self.nodeAtPoint(Location)

        if (Location.x < self.size.width/2) {
            node2.addChild(sprite2)
            node1.removeFromParent()
        }
        if (Location.x > self.size.width/2) {
            node1.addChild(sprite1)
            node2.removeFromParent()
        }
    }
}


The problem is that when I tap, for example, 10 times on the same side of the screen (let's say the right side), 10 sprite nodes appear, and I can't use hidden because of the physicsBody. Can you guys help me?


Running and debugging Karma tests in a Docker container

I want to dockerize my entire Node.js app and run everything inside a Docker container, including tests.


That sounds easy if you're using PhantomJS; I actually tried that and it worked.

One thing I like about running tests in Chrome, though, is easy debugging. You can start the Karma server, open devtools, set a breakpoint in a test file (using a debugger statement) and run Karma; it will connect to the server, run the tests, and stop at the breakpoint, allowing you to do all sorts of things from there.


Now how do I do that in a docker container?




  • Should I start the Karma server (with Chrome) on the host machine and somehow tell the Karma runner inside the container to connect to it to run the tests? (How would I do that, anyway?)

  • Is it possible to run Chrome in a Docker container? (It sounds like a silly question, but when I tried docker search desktop a bunch of things came up, so I assume it is possible?)

  • Maybe it's possible to debug tests in PhantomJS (although I doubt it would be as convenient as Chrome devtools).
Would you please share your experience of running and debugging Karma tests in a docker container?


mongoose deep populate based on query

I'm populating a Mongoose model using the deepPopulate plugin like so:



User.findOne({_id: req.params.user_id})
    .deepPopulate('classes.assignments.submissions')
    .exec(function (err, user) {
        res.send(user);
    })


And I want the submissions to be populated based on a query (only submissions for the current user, something like .where(submissions.user == req.params.user_id)). I can't find a solution for this, and I end up in a situation where the current user can see submissions from other users.


I tried setting other users' submissions to null manually inside the exec:



for (var i = 0; i < user.classes.length; i++) {
    var cl = user.classes[i];
    for (var i = 0; i < cl.assignments.length; i++) {
        var assign = cl.assignments[i];
        for (var i = 0; i < assign.submissions.length; i++) {
            var subm = assign.submissions[i];
            if (subm.user != req.params.user_id) {
                subm = null
            };
        };
    };
};


But it still returns the unfiltered submissions.
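Two things bite in the manual attempt above: all three loops reuse the same var i, and assigning subm = null only rebinds a local variable, it never changes the array. A plain-JS sketch of the in-memory fallback that filters each submissions array in place (the helper name is mine):

```javascript
// Keep only the submissions belonging to userId, at every nesting level.
function keepOwnSubmissions(user, userId) {
  user.classes.forEach(function (cl) {
    cl.assignments.forEach(function (assign) {
      assign.submissions = assign.submissions.filter(function (subm) {
        return String(subm.user) === String(userId);
      });
    });
  });
  return user;
}
```

Some versions of deepPopulate also accept per-path populate options (which could carry a match condition); checking its docs for that is worthwhile, with the in-memory filter as the safe fallback.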


I need help combining 2 elements in an array together

Here I am using CSS selectors to scrape values from a web site; the values are mapped into an array. I need to merge the first element and the second element together, and similarly the third and fourth elements.


This is how the result comes out:



[ 'Mon',
' 7:30 AM to 11 PM',
' Tue',
' 7:30 AM to 11 PM',
' Wed',
' 7:30 AM to 11 PM',
' Thu',
' 7:30 AM to 11 PM',
' Fri',
' 7:30 AM to 11 PM',
' Sat',
' 7:30 AM to 11 PM',
' Sun',
' 7:30 AM to 11 PM']


Here I need to merge the day and the timing into a single array element.
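Since the scraped values alternate day, hours, pairing adjacent elements is enough. A sketch (the ': ' separator is my choice; an odd trailing element is dropped):

```javascript
// Fold consecutive (day, hours) pairs into one entry each.
function mergePairs(items) {
  var out = [];
  for (var i = 0; i + 1 < items.length; i += 2) {
    out.push(items[i].trim() + ': ' + items[i + 1].trim());
  }
  return out;
}

console.log(mergePairs([' Mon', ' 7:30 AM to 11 PM', ' Tue', ' 7:30 AM to 11 PM']));
// → [ 'Mon: 7:30 AM to 11 PM', 'Tue: 7:30 AM to 11 PM' ]
```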


nodeJS too many child processes?

I am using Node to recursively traverse a file system and make a system call for each file, using child_process.exec. It works well when tested on a small structure with a couple of folders and files, but when run on the whole home directory it crashes after a while:



child_process.js:945
throw errnoException(process._errno, 'spawn');
^
Error: spawn Unknown system errno 23
at errnoException (child_process.js:998:11)
at ChildProcess.spawn (child_process.js:945:11)
at exports.spawn (child_process.js:733:9)
at Object.exports.execFile (child_process.js:617:15)
at exports.exec (child_process.js:588:18)


Does this happen because it uses up all resources? How can I avoid it?


EDIT: Code improvement and best practices suggestions always welcome :)



function processDir(dir, callback) {
    fs.readdir(dir, function (err, files) {
        if (err) {...}
        if (files) {
            async.each(files, function (file, cb) {
                var filePath = dir + "/" + file;
                var stats = fs.statSync(filePath);
                if (stats) {
                    if (stats.isFile()) {
                        processFile(dir, file, function (err) {
                            if (err) {...}
                            cb();
                        });
                    } else if (stats.isDirectory()) {
                        processDir(filePath, function (err) {
                            if (err) {...}
                            cb();
                        });
                    }
                }
            }, function (err) {
                if (err) {...}
                callback();
            });
        }
    });
}
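The crash is consistent with launching an exec per file all at once and exhausting process/file-descriptor limits as the tree grows. Since async is already in use, async.eachLimit(files, 10, ...) is a drop-in cap on that. The mechanism it provides looks roughly like this sketch (names are mine):

```javascript
// Run callback-style tasks with at most `limit` in flight at once.
function runLimited(tasks, limit, done) {
  var running = 0, next = 0, finished = 0;
  if (tasks.length === 0) return done();
  function launch() {
    while (running < limit && next < tasks.length) {
      running++;
      tasks[next++](function () {
        running--;
        finished++;
        if (finished === tasks.length) return done();
        launch();  // a slot freed up: start the next task
      });
    }
  }
  launch();
}
```

Each finished task frees a slot and pulls in the next one, so the process never holds more than `limit` child processes open.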

Mongoose many-to-many complement set

I'm attempting to use Mongoose to build a many-to-many relationship between users.


There are two types of users: teachers and students. A teacher can be in charge of an arbitrary number of students. On the other hand, a student can correspond to multiple teachers.


I have created a schema Relation, which has two entries, teacher and student, each referring to an ObjectId in the User schema.



var RelationSchema = new Schema({
    teacher: {
        type: Schema.Types.ObjectId,
        ref: 'User'
    },
    student: {
        type: Schema.Types.ObjectId,
        ref: 'User'
    }
});


It is trivial to query all the students that one teacher is in charge of. ---- (1)


When it comes to querying its complement, things become less obvious. The first thought is to query all the users minus the results from (1).



User.find({}, function(err, users) {
    if (err) { /* handle and return */ }
    Relation.find({teacher: teacherId})
        .populate({path: 'student'})
        .exec(function(err, relations) {
            if (err) { /* handle and return */ }
            // map relations to students
            var students = _.map(relations, function(relation) {
                return relation.student;
            });
        });
    // Return the difference of users and students
    // i.e., complement = users - students
});


However, the above method might cause huge overhead.


What is the most efficient way of querying the complement set of the students that one teacher is in charge of?
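One way to avoid pulling every user into the app (a sketch, with identifiers assumed from the snippets above) is to fetch only the student ids for the teacher and let MongoDB compute the complement with $nin: User.find({ _id: { $nin: studentIds } }). If the difference is ever done in JS, a Set makes it linear rather than quadratic:

```javascript
// Complement in plain JS: all users minus the teacher's students.
// Ids are stringified so ObjectId and string forms compare equal.
function complement(allIds, memberIds) {
  var members = new Set(memberIds.map(String));
  return allIds.filter(function (id) { return !members.has(String(id)); });
}
```

The $nin route also avoids transferring populated student documents when only their ids are needed (a .select('student') on the Relation query helps there too).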


Why is the command script generated for my npm package different than other packages'?

I published an npm package myself: makeTimeTable


I installed it happily in Cygwin:



npm install -g maketimetable


But when I try to run the command line, an error is thrown:



/node_modules/maketimetable/cli.js: No such file or directory


Finally I figured out that the command script file (C:\Users\xxx\AppData\Roaming\npm\maketimetable) generated when installing my package is different than that of the other global command-line tools I installed. Below is mine:



"$basedir/node_modules/maketimetable/cli.js" "$@"
exit $?


Other global command-line tools' scripts look like this:



#!/bin/sh
basedir=`dirname "$0"`

case `uname` in
    *CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
    "$basedir/node" "$basedir/node_modules/eslint/bin/eslint.js" "$@"
    ret=$?
else
    node "$basedir/node_modules/eslint/bin/eslint.js" "$@"
    ret=$?
fi
exit $ret


So other tools' scripts detect whether I'm running under Cygwin and treat the path and directory a little differently. Why is my package's script different from those, and how can I make npm generate the same kind of script for my package?
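For what it's worth, npm decides which wrapper to generate by reading the first line of the bin target: when the script starts with a node shebang, npm's cmd-shim emits the full sh wrapper (including the Cygwin branch) that locates and invokes node; without one, it falls back to the bare two-line script shown above. So a likely fix (an assumption from the symptoms, not confirmed against your repo) is to make the first line of cli.js a shebang:

```javascript
#!/usr/bin/env node
// cli.js -- the shebang line above is what npm's cmd-shim looks for
// when generating wrapper scripts during `npm install -g`.
console.log('maketimetable cli loaded');
```

After republishing and reinstalling, the generated wrapper should match the eslint-style script.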


How can I run pm2 on a certain node version?

There are several different versions of node running on our Linux server, and my service is based on node v0.11.14. However, other people's code has to run on a lower version of node (lower than v0.11), otherwise their services break. So I can't set the global node version to v0.11. I just want to run pm2 to monitor my service on node v0.11.


Is there any way to run my pm2 on node v0.11 without changing the global node version? Thanks.
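pm2 has a flag for exactly this. A sketch (the binary path is an assumption; point it at wherever your v0.11.14 install lives, e.g. an nvm-managed directory):

```shell
# Run this one app under a specific node binary, leaving the
# system-wide default node untouched.
pm2 start app.js --interpreter=$HOME/.nvm/versions/node/v0.11.14/bin/node
```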


Attaching object to Node.js process

I am using the environment-variable and argument-parsing module nconf for my node.js Express web server.


http://ift.tt/1G8l3EY


I decided that the best way to make the nconf data global was to simply attach it to the process object (as with process.env). Is this a good or a bad idea? Will it slow down execution by weighing down "process"?


Here is my code:



var nconf = require('nconf');

nconf.argv()
     .env()
     .file({ file: './config/config.json' });

nconf.defaults({
    'http': {
        'port': 3000
    }
});

process.nconf = nconf;

// Now I can retrieve config settings anywhere, like so: process.nconf.get('key');


Frankly, I kind of like this solution. Now I can retrieve the config data anywhere, without having to require a module. But there may be downsides to this, and it could quite possibly be a very bad idea. I don't know.


Is JWT authentication secure? How does it protect against CORS attacks?

I have implemented token-based authentication in my project instead of cookie-session based authentication. With JWT (JSON Web Tokens), every time I send a request to the server I attach the token in the headers; the server validates it against the secret that was used to generate the token in the first place, and sends me the response. Now I have some concerns about it. First, the token is saved in local storage in the browser. Although the token is signed, what if a hacker just takes that token from storage and uses it? And can anyone tell me how it stops CORS attacks? I'm confused and cannot find any reliable answer online.


NodeJs on DigitalOcean cannot connect to MySQL database on Amazon RDS

We deployed a NodeJS web service on DigitalOcean that connects to a MySQL database on Amazon RDS. I can use Navicat Premium to connect remotely to the MySQL RDS instance, and it also works fine with NodeJS on localhost (meaning my application can connect to the RDS database from my machine).


But when I deployed my app to the DigitalOcean server, it cannot connect to MySQL RDS; the connection times out.


I used this command to view the open ports:



iptables -L


And the result is:



Chain INPUT (policy ACCEPT)
target prot opt source destination

ACCEPT tcp -- anywhere anywhere tcp dpt:3000
ACCEPT tcp -- anywhere anywhere tcp dpt:mysql
ACCEPT tcp -- anywhere anywhere tcp dpt:3000
ACCEPT tcp -- anywhere anywhere tcp dpt:mysql
ACCEPT tcp -- anywhere anywhere tcp dpt:mysql

Chain FORWARD (policy ACCEPT)
target prot opt source destination

Chain OUTPUT (policy ACCEPT)
target prot opt source destination

ACCEPT tcp -- anywhere anywhere tcp dpt:mysql


The INPUT and OUTPUT chains appear open for port 3306, but NodeJS still cannot connect. It seems the DigitalOcean droplet is not letting the NodeJS application connect out via this port.


(P.S.: The Amazon RDS side is also opened on all ports for any source IP.)


Please help me, thanks
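One way to separate a firewall/network problem from an application problem (a generic diagnostic, not specific to this setup) is to test the raw TCP path from the droplet itself:

```shell
# From the DigitalOcean droplet: can we open a TCP connection to RDS at all?
# Replace the hostname with your actual RDS endpoint.
nc -zv your-instance.xxxxx.us-east-1.rds.amazonaws.com 3306
# If nc also times out, the block is in the network path (droplet egress
# rules or the RDS security group), not in the NodeJS code.
```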


How can I use Stanford's NER in my Meteor project?

I'd like to use Stanford's NLP (specifically its NER) in my Meteor project, so I tried using the node-ner node.js package:



var node_ner = require('node-ner');

var ner = new node_ner({
    install_path: '/path/to/stanford-ner'
});


However, I don't know what the install path should be for executing the Java .jar. If I put the NLP folder in my Meteor project under /private, what's the path that I should use to access that?


Any help would be greatly appreciated!


What is the correct way to extend a grunt task based on a few others?

I currently am creating a custom Grunt task for my application. It does several things, and in order to achieve these, I use a few existing Grunt tasks.


Below is how I have chosen to create my task, by aggregating these others. What I would like to know is: is this a common way of extending a task, or is there a better way?


Functionally it works, but I somehow feel this is a hack.



module.exports = function (grunt) {

    grunt.loadNpmTasks('grunt-shell');
    grunt.loadNpmTasks('grunt-contrib-clean');
    grunt.loadNpmTasks('grunt-contrib-copy');
    grunt.loadNpmTasks('grunt-mkdir');

    var _ = require('lodash');

    grunt.registerTask('custom-task-name', 'Custom task description', function () {

        // this.options() is only available inside a task body.
        var options = this.options();

        // Config omitted for brevity.
        grunt.initConfig({
            clean: {
            },
            mkdir: {
            },
            copy: {
            },
            shell: {
            }
        });

        var tasks = ['clean', 'mkdir:target', 'copy:target', 'shell:target'];

        grunt.task.run(tasks);
    });
};

Mongoose ODM findOneAndUpdate sub-document

I currently have a Message schema which has a sub-document of replies as such:



message: String,
replies: [{
    message: String,
    userFrom: {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'User'
    }
}]


I am attempting to make a POST request that will allow me to findOneAndUpdate the message to attach a reply. I've attempted the following:



Message.findOneAndUpdate(req.params.id, {
    replies: {
        message: req.body.reply,
        userFrom: req.user._id
    }
}, function(err, data){
    ...
});


What is happening is I am overwriting any reply that is currently in the replies array. I believe I need to use the $push operator but I'm not quite sure how.


Is it possible to use findOneAndUpdate as well as $push in a single request?
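Yes: findOneAndUpdate accepts any update document, including one built with `$push`, which appends to the array instead of replacing it wholesale. A sketch (the Message model and Express-style `req` are the ones from the question):

```javascript
// Build the update document separately so its shape is easy to see:
// $push appends a new reply to the replies array.
function buildReplyUpdate(replyMessage, userId) {
    return {
        $push: {
            replies: {
                message: replyMessage,
                userFrom: userId
            }
        }
    };
}

// Then, inside the route handler (not runnable here without a DB):
// Message.findOneAndUpdate(
//     { _id: req.params.id },
//     buildReplyUpdate(req.body.reply, req.user._id),
//     { new: true },   // return the updated document
//     function (err, data) { /* ... */ }
// );
```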


In PHP, get a value from Node.js

I have a js file running in Node. This js file reads data coming from a Bluetooth device. I also have a PHP file running on an Apache server, which displays a website interface.


Now, in the php file, I want to use the data from the js file. What are the possible methods to achieve this?


CSS and JavaScript files don't work in an HTML file in the root folder - OpenShift

I'm using the OpenShift free tier for my term paper. My issue is that in my root folder I have the files rankAplus.html, todo.js and dashboard.css. When I reference todo.js and dashboard.css from rankAplus.html, I get a 404 Not Found browser error. When I inspect the dashboard element in the browser, the file's content type has changed from text/css to text/html. The same thing happens with my JavaScript file. The files are in the same folder. What do I need to do? Does someone know how to solve this issue?


I'm using Node.js to create the server at OpenShift.


How to use externally generated textures with three.js

I am using three.js hosted on node.js, together with a camera that can generate OpenGL textures as output frames. I would like to use this live stream of textures within three.js. I have been able to create shared contexts between the camera and three.js, and I can confirm that the texture exists in both the camera context and the three.js context. I am not quite sure, though, how to get three.js to respect an externally maintained texture. The existing VideoTexture object is not the right way; it copies over frame data from a browser DOM object, and I would like to use the camera-generated texture directly.


Since this is akin to the three.js render-to-texture support, only with the rendering happening outside three.js, I tried to modify that code to support external textures, but it seems like a bit of a slog.


I'm currently thinking I should implement a variant of the ShaderMaterial object that understands how to work with an externally generated texture, but I was wondering if there is a better path.


How to control the number of nodes when I tap? (SpriteKit)

Hi guys, I'm new to making iOS games (SpriteKit) and I'm making a game where I'm using two sprite nodes (left, right) for the same hero. The game is about evading obstacles that come from two sides, so when I tap one of the sides the opposite node should disappear and the hero appears in the right place. The problem is that when I tap more than once on the same side of the screen, more nodes appear, and I don't want that. Here is the code:



override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {

    for touch in (touches as! Set<UITouch>) {

        var Location: CGPoint = touch.locationInNode(self)
        var Node: SKNode = self.nodeAtPoint(Location)

        if (Location.x < self.size.width/2) {
            node2.addChild(sprite2)
            node1.removeFromParent()
        }
        if (Location.x > self.size.width/2) {
            node1.addChild(sprite1)
            node2.removeFromParent()
        }
    }
}

Running into error "Failed at the proj@0.0.1 watch script './lib/watchscript.js'." when trying to run a new Pex project

I followed every step in the 'How to use' section here: http://ift.tt/1GaVwOd, but I still cannot get the project to run when I run the command npm run watch. Below is the output:


enter image description here


Does modelObject.save() only update an existing database document when the modelObject was obtained from the database itself?

To use an example that demonstrates the question, assume I have a User model defined by the following schema:



var UserSchema = new Schema({
    username: String,
    email: String
});

mongoose.model('User', UserSchema);


I know that to update a user using the save method, I could query the user and then save changes like so:



User.findOne({username: usernameTofind}, function(err, user) {
    // ignore errors for brevity
    user.email = newEmail;
    user.save(function(err) { console.log('User email updated') });
});


But if I try to create a new User object with the exact same field values (including the _id), is there any possibility of overwriting the database document? I would assume not, because in theory this would mean that a malicious user could exploit an insecure API and overwrite existing documents (for instance via a 'Create a New Account' request, which wouldn't/couldn't rely on the user already being authenticated). More importantly, when I try to do this using a request tool (I'm using Postman, but I'm sure a similar curl command would suffice), I get a duplicate _id error:



MongoError: insertDocument :: caused by :: 11000 E11000 duplicate key error index


So I just want to clarify that the only way to update an existing document is to query for the document, modify the returned instance, then call the save method on that instance, OR use the static update(). Both of these could be secured by requiring authentication.


If it helps, my motivation for this question is mentioned above, in that I want to make sure a user is not able to overwrite an existing document if a method such as the following is exposed publicly:



userCtrl.create = function(req, res, next) {
    var user = new User(req.body);

    user.save(function(err) {
        if (err) {
            return next(err);
        } else {
            res.json(user);
        }
    });
};


Quick Edit: I just realized, if this is the case, then how does the database know the difference between the queried instance and a new User object with the exact same keys and properties?


Is it possible to share text on Facebook from the server side using node.js?

I have done some work on sharing a URL, but is it possible to share plain text? Currently I cannot find any APIs that make sense for this. Does anyone know?


Part of the code for sharing a URL using node.js on the server side:



parameters.access_token = this.params.access_token;
Fb.api('me/feed', 'post', parameters, function(res){
    if (!res || res.error){
        console.log(!res ? 'error occurred' : res.error);
    }
    console.log('Post Id: ' + res.id);
});

How to publish a module written in ES6 to NPM?

I was about to publish a module to NPM when I thought about rewriting it in ES6, to both future-proof it and learn ES6. I've used Babel to transpile to ES5 and to run tests. But I'm not sure how to proceed:



  1. Do I transpile, and publish the resulting /out folder to NPM?

  2. Do I include the result folder in my Github repo?

  3. ...or do I maintain 2 repos, one with the ES6 code + gulp script for Github, and one with the transpiled results + tests for NPM?


In short: what steps do I need to take to publish a module written in ES6 to NPM, while still allowing people to browse/fork the original code?
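A common pattern (a sketch; the script names and folder layout below are assumptions, adjust to taste) is a single repo that keeps the ES6 source in Git and ships only the transpiled output to npm, via the package.json:

```json
{
  "main": "out/index.js",
  "scripts": {
    "build": "babel src --out-dir out",
    "prepublish": "npm run build"
  },
  "files": ["out"]
}
```

With this, `npm publish` transpiles on the fly, npm consumers get ES5 from `out/`, and GitHub visitors browse the ES6 in `src/`; adding `out/` to `.gitignore` keeps the compiled code out of the repo, so no second repo is needed.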


Globally set node_modules location for project

If I write



var moment = require('moment');


in my project, Node wastes a lot of time looking in places that do not actually contain the file, as this dtruss output shows.



PID/THRD RELATIVE SYSCALL(args) = return
7079/0x7cf313: 1244530 stat64("/Users/burke/code/api/api/models/node_modules/moment\0", 0x7FFF5FBFE5D8, 0x9) = -1 Err#2
7079/0x7cf313: 1244575 stat64("/Users/burke/code/api/api/models/node_modules/moment.js\0", 0x7FFF5FBFE578, 0x9) = -1 Err#2
7079/0x7cf313: 1244595 stat64("/Users/burke/code/api/api/models/node_modules/moment.json\0", 0x7FFF5FBFE578, 0x9) = -1 Err#2
7079/0x7cf313: 1244612 stat64("/Users/burke/code/api/api/models/node_modules/moment.node\0", 0x7FFF5FBFE578, 0x9) = -1 Err#2
7079/0x7cf313: 1244628 stat64("/Users/burke/code/api/api/models/node_modules/moment.coffee\0", 0x7FFF5FBFE578, 0x9) = -1 Err#2
7079/0x7cf313: 1244663 open("/Users/burke/code/api/api/models/node_modules/moment/package.json\0", 0x0, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244694 stat64("/Users/burke/code/api/api/models/node_modules/moment/index.js\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244713 stat64("/Users/burke/code/api/api/models/node_modules/moment/index.json\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244729 stat64("/Users/burke/code/api/api/models/node_modules/moment/index.node\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244745 stat64("/Users/burke/code/api/api/models/node_modules/moment/index.coffee\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244767 stat64("/Users/burke/code/api/api/node_modules/moment\0", 0x7FFF5FBFE5D8, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244788 stat64("/Users/burke/code/api/api/node_modules/moment.js\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244805 stat64("/Users/burke/code/api/api/node_modules/moment.json\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244821 stat64("/Users/burke/code/api/api/node_modules/moment.node\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244837 stat64("/Users/burke/code/api/api/node_modules/moment.coffee\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244862 open("/Users/burke/code/api/api/node_modules/moment/package.json\0", 0x0, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244887 stat64("/Users/burke/code/api/api/node_modules/moment/index.js\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244904 stat64("/Users/burke/code/api/api/node_modules/moment/index.json\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244920 stat64("/Users/burke/code/api/api/node_modules/moment/index.node\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244936 stat64("/Users/burke/code/api/api/node_modules/moment/index.coffee\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1244964 stat64("/Users/burke/code/api/node_modules/moment\0", 0x7FFF5FBFE5D8, 0x1B6) = 0 0
7079/0x7cf313: 1244990 stat64("/Users/burke/code/api/node_modules/moment.js\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1245015 stat64("/Users/burke/code/api/node_modules/moment.json\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1245038 stat64("/Users/burke/code/api/node_modules/moment.node\0", 0x7FFF5FBFE578, 0x1B6) = -1 Err#2
7079/0x7cf313: 1245488 madvise(0x1008AE000, 0x21000, 0x9) = 0 0
7079/0x7cf313: 1245503 stat64("/Users/burke/code/api/node_modules/moment.coffee\0", 0x7FFF5FBFE578, 0x9) = -1 Err#2
7079/0x7cf313: 1245612 open("/Users/burke/code/api/node_modules/moment/package.json\0", 0x0, 0x1B6) = 11 0


Is there a way to make Node not waste so much time looking in places that don't contain a node_modules directory? Like, I could set some kind of CHECK_HERE_FIRST=$HOME/code/api/node_modules environment variable and if the require is not for a relative path, that would be the first place that Node checked.


I could change all of my require lines to load via relative paths, but this seems cumbersome for a large project.
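Node does have an environment variable along these lines, though with a caveat: NODE_PATH directories are consulted only after the normal node_modules walk fails, so it guarantees resolution as a fallback rather than short-circuiting the stat calls shown above.

```shell
# Fallback search path for bare (non-relative) requires; checked after
# the usual parent-directory node_modules walk comes up empty.
export NODE_PATH="$HOME/code/api/node_modules"
node app.js
```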


Step-by-step Install Iframely

I want to use this oEmbed service, http://ift.tt/1evmxJr, but I don't know how to set it up.


I already tried following this documentation to set it up: http://ift.tt/1aJ9717.


What should I do after running the command "node server"?


Can someone give me step-by-step instructions for using this oEmbed service?


Install multiple Node.js Modules based on dependencies (list)

I have a dependencies list of nearly 40 node modules, and while I was able to install all of them using one (or two) commands:



npm install module-1 module-2 ... module-N --save-dev



I am still looking for a way to create a file that installs all dependencies when executed. That way, anyone could download a single file for their system (Mac / Linux in my case) and install all modules by simply running it. Any suggestions or methods? Thanks a lot!
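For what it's worth, npm already generated that file for you: the `--save-dev` flag in the command above wrote each module into the devDependencies section of package.json, so the single file to distribute is package.json itself.

```shell
# On any new Mac/Linux machine, from the directory containing package.json:
npm install    # reads (dev)dependencies and installs all ~40 modules
```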


Redirect UI page from nodejs for OAuth

I am working with the Pocket API to get an access token. I have obtained the request token using a POST request.


Now I have to redirect the UI page to show:



http://ift.tt/1yGSsGI



I tried:



res.writeHead(301, {Location: redirectUrl} );
res.end();


And



res.redirect(redirectUrl)


But they make a POST request without redirecting the UI page to the authentication page.


How can I get the access token in this case?


Why doesn't this test with promises pass?

I stepped into the wonderful world of Promises a few days ago, and I was just thinking I had been enlightened. Promises look simple, but they can be confusing.


Could you please tell me why the following test doesn't pass?



var Promise = require('bluebird');
var expect = require('chai').expect;
var request = Promise.promisifyAll(require('request'));

describe('Promise', function() {
    it('should work again', function() {

        var final_result;

        function first_promise() {
            return new Promise(function(resolve, reject) {
                resolve("http://www.google.com");
            });
        }

        function second_promise() {
            return new Promise(function(resolve, reject) {
                resolve("This is second promise!");
            });
        }

        function inner_async_request(url_from_first_promise) {
            return new Promise(function(resolve, reject) {
                return request.getAsync(url_from_first_promise).spread(function(response, content) {
                    final_result = content;
                    resolve(content);
                });
            });
        }

        return request.getAsync('http://127.0.0.1:3000/').spread(function(result, content) {
            // do something with content and then return first_promise
            console.log(content);
            return first_promise;
        })
        .then(function(url) {
            inner_async_request(url).then(function(result) {
                console.log(result);
                final_result = result;
            });
            return second_promise;
        })
        .then(function(result) {
            // result should be "This is second promise!"
            console.log(result);
            // final_result should be google's html
            expect(final_result).not.to.be.undefined;
        });
    });
});


Currently the error is: Unhandled rejection Error: options.uri is a required argument, where the uri should have been received from first_promise, I guess?


With this test I actually want to understand how to use promises that depend on each other, and how to use promises as asynchronous functions inside promises (which should just work separately).


Thank you
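The likely culprit is visible in miniature below: the test returns `first_promise` (the function object itself) rather than `first_promise()` (the promise it would create), so the next `.then` receives a function instead of a URL string, and `request.getAsync` ends up called with no uri. A stripped-down sketch of the difference, using plain promises:

```javascript
function makeUrl() {
    return Promise.resolve('http://www.google.com');
}

// Bug: hands the *function itself* to the next .then
var buggy = Promise.resolve()
    .then(function () { return makeUrl; })
    .then(function (url) { return typeof url; });   // 'function'

// Fix: *call* it, so the chain waits on the returned promise
var fixed = Promise.resolve()
    .then(function () { return makeUrl(); })
    .then(function (url) { return typeof url; });   // 'string'
```

The same applies to `return second_promise;` further down the chain.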


Socket.io not recognizing event inside Leap.loop

I am working on an interface for a Leap Motion controlled robot. From the server-side node.js, I emit an event 'action' that contains the data to be displayed on an HTML page. I have implemented a modified version of the Leap rigged hand (original code: http://ift.tt/1HGVe0V).



<html>
<head>
<title>Locomotion</title>
<script src="http://ift.tt/1JVcsqq"></script>
<script src="http://ift.tt/1HGVfSq"></script>
<script src="http://ift.tt/1JVcq1X"></script>
<script src="http://ift.tt/1HGVe0X"></script>
<script src="jquery.js"></script>
<script src="/http://ift.tt/1aeIZU4"></script>

<style>
body {
font-family: 'Myriad Pro', Helvetica, Arial, 'Lucida Grande', sans-serif;
font-size: 24pt;
color: white;
background-color: #348cb2;
}

#display {
position: fixed;
height: 50px;
width: 300px;
background: #fdd;
bottom: 10;
left: 10;
padding: 5px;
}
#direction {
color: #f77;
display: block;
height: 100%;
width: 200px;
padding-top: 5px;
margin: 0 auto;
text-align: center;
}
</style>
</head>

<body>
<div id="display"><span id="direction">undefined</span></div>
</body>

<script type="text/javascript">
var riggedHandPlugin,
socket = io();
socket.on('action', function(data) {
$('#display').html(data);
});
/*
Leap.loop({
hand: function(hand){
var handMesh = hand.data('riggedHand.mesh');
handMesh.material.opacity = 1;
handMesh.material.color.set('#7b4b2a');
handMesh.material.ambient = handMesh.material.color;
var screenPosition = handMesh.screenPosition(
hand.palmPosition,
riggedHandPlugin.camera
);
}
})
.use('riggedHand', {
scale: 1.5,
opacity : 1
});

riggedHandPlugin = Leap.loopController.plugins.riggedHand; */
</script>


With the above commented-out version, socket.io displays the 'action' data; however, the Leap rigged hand is not displayed.


Whereas uncommenting it shows the Leap rigged hand, but then the 'action' data in the #display element no longer updates.


I've been stuck on this for a couple of days and can't think of what to do.


Kindly help


How to get response data from the server in angularjs

This is my scenario: a server in nodejs handles the authentication procedure, while on the frontend we have angularjs. When the user clicks the button, he signs in with Facebook; the server then handles all aspects of the authentication and at the end redirects to the URI of the angularjs application. We have something like this on the server:



module.exports = function(request, reply) {

    if (request.auth.isAuthenticated) {

        var profile = request.auth.credentials.profile.raw;

        // set to cookie
        request.auth.session.set(profile);

        // Perform any account lookup or registration, setup local session,
        // and redirect to the application. The third-party credentials are
        // stored in request.auth.credentials. Any query parameters from
        // the initial request are passed back via request.auth.credentials.query

        // here we should redirect the app flow somewhere
        return reply({ profile: profile.id }).redirect('http://localhost:8080/app');
    }

    return reply('Unauthorized').code(401);
};


My issue is that I don't know how to retrieve the profile object in angularjs. I know the $http provider exists, but in this case the request doesn't start from angularjs. Summarizing the flow: the server replies with the SPA if the user signs in successfully.


Node.js: cannot publish module, getting error Invalid name 'math_example'

I have been trying to publish my module using the command prompt command npm publish, and am getting the error below. I have an account on the npm website and I also log in before I publish.


enter image description here


What is standard practice for handling user forms and file uploads?

What is the standard practice for handling data in text fields and file upload fields?


The question is similar to one I asked previously, but this one is slightly more general.


If we borrow the example of a user registering an account, which includes a name, email, and several file upload fields, the actions taken after form submission amount to:


(1) Validate all text fields name, email


(2) If validation is success, create and save User instance into DB.


(3) Save images to disk


(4) Update User instance to include filepaths of saved images.


The files uploaded aren't very big, roughly 5mb or less, so problems associated with uploading large 1GB+ files aren't really an issue for this question.


From what I've read, there are two ways of handling this.




  • Submit everything all together.


    There are several unanswered threads about this: http://ift.tt/1asfAxd


    Node.js Busboy parse fields and files seperatly


    I know that the text fields should come before the file fields when submitting the form thanks to mscdex's comment in my other question.


    But there are other problems I can see:


    (a) If validations fail for text fields, everything will have to be resent in another form submission. This could potentially lead to a DoS attack/bandwidth issue by having a malicious user continually submit a form with bad text fields but with lots of files.




  • Submit files when first selected, then when form submits, upload only file hash.


    (a) A potential DOS attack may happen by having a malicious user upload a ton of images that just sits on the server. Even with an independent bash script that cleans up the /tmp folder after X minutes, a user could still clog the disk space in the X minutes before cleanup by continually sending files.


    (b) Having an independent script for cleanup creates timing issues. What if a legitimate user keeps sending a form that fails validations, but then after X minutes, the user finally sends the correct form. By that time, the images would have been wiped since X minutes has passed even though the validations passed.




  • Some other way that I don't know




I feel the first way may be easier since I could potentially rate-limit the connections using nginx. Since the files are never hitting disk until validations are complete, I won't have any cleanup issues with files in /tmp. But I've searched the net and can't find anybody really doing this, which leads me to believe that file uploading is not really done this way.


What's the best way to handle file uploads with form data?


Port C++ to NodeJS - Compare XML files

I have some C++ code to compare XML files. I'm looking at moving my application to nodeJS because it's mostly based on JSON. However I'm concerned that nodeJS will struggle with this bit so I want to try it. This is how far I have got.


I'm new to C++ and nodejs and am struggling to find the equivalent way to load the data into a map to do the comparison in nodejs like I did in C++, so any help would be really appreciated.



// Access FileSystem
var fs = require('fs');
// Add XML2JS module
var xml2js = require('xml2js');

// Assume this returns a fully qualified XML file path
var filePath = GetFilePath();
try {
    var fileData = fs.readFileSync(filePath, 'ascii');

    var parser = new xml2js.Parser();
    parser.parseString(fileData.substring(0, fileData.length), function (err, result) {
        var json = JSON.stringify(result);
    });

    console.log("File '" + filePath + "' was successfully read.\n");
} catch (ex) {
    console.log("Unable to read file '" + filePath + "'.");
    console.log(ex);
}


This is the original code I had in C++:



#include "pugi/pugixml.hpp"

#include <iostream>
#include <string>
#include <map>

int main()
{
    pugi::xml_document doca, docb;
    std::map<std::string, pugi::xml_node> mapa, mapb;

    if (!doca.load_file("a.xml") || !docb.load_file("b.xml")) {
        std::cout << "Can't find input files";
        return 1;
    }

    for (auto& node: doca.child("jobsite_vacancies").children("job")) {
        const char* id = node.child_value("id");
        mapa[id] = node;
    }

    for (auto& node: docb.child("data").children("entry")) {
        const char* idcs = node.child_value("id");
        if (!mapa.erase(idcs)) {
            mapb[idcs] = node;
        }
    }

    for (auto& ea: mapa) {
        std::cout << "Removed:" << std::endl;
        ea.second.print(std::cout);
    }

    for (auto& eb: mapb) {
        std::cout << "Added:" << std::endl;
        eb.second.print(std::cout);
    }
}

Can I chain npm configuration entries?

Use of the npm config section is simple and cool, but I've run into one restriction with it: a config entry does not get expanded, so one cannot chain them, nor even access non-config values such as the package version within the config.


Sample:



{
    "name": "myproj",
    "version": "0.1.2",

    "//": "Here, '$npm_package_version' is not expanded",
    "config": {
        "target": "dist/myproj-$npm_package_version.js"
    },

    "scripts": {
        "echo": "echo $npm_package_config_target"
    }
}


This gives:


dist/myproj-$npm_package_version.js


instead of:


dist/myproj-0.1.2.js


Is there anything I can do about it? Chaining values like this is a useful feature; I'm surprised npm doesn't do it. Is there a reason not to?
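The workaround I know of (and it is a workaround, not real chaining support): npm exports every package field as an environment variable to scripts, and it is the shell that expands those at run time, so moving the composite value out of config and into the script line itself works:

```json
{
  "name": "myproj",
  "version": "0.1.2",
  "scripts": {
    "echo": "echo dist/myproj-$npm_package_version.js"
  }
}
```

`npm run echo` then prints `dist/myproj-0.1.2.js`, because the expansion happens in the shell when the script runs, not when npm reads the config section.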




Mongoose query for subdocuments

I have the following schema; it's a conversation schema with a subdocument structure containing a username and an enabled or disabled state:



conversation = {title: "title",
                users: [{"name": "a", "enabled": true}, {"name": "b", "enabled": false}]}


I want to select all the users that have the enabled state set to true. I found that $elemMatch in the projection does just that, but it only returns the first element... Putting $elemMatch in the query part does not do anything...


Any help is appreciated
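This matches $elemMatch's documented behavior: as a projection it returns only the first matching array element. To get every enabled user you generally either filter on the application side or use the aggregation pipeline; a sketch of both (the aggregation stage names are standard MongoDB, and the Conversation model name is an assumption):

```javascript
// Application-side: fetch the conversation, then filter the embedded array.
function enabledUsers(conversation) {
    return conversation.users.filter(function (u) { return u.enabled; });
}

// Server-side alternative (not runnable here without a DB): unwind the
// array so $match can keep every enabled element, not just the first.
// Conversation.aggregate([
//     { $match: { title: 'title' } },
//     { $unwind: '$users' },
//     { $match: { 'users.enabled': true } }
// ], function (err, results) { /* one document per enabled user */ });
```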


How to compile node.js module canvas on RedHat OpenShift servers?

On RedHat OpenShift servers it is not possible to compile the node.js module canvas, because the cairo libraries, and the related libraries it requires, are missing from the system.


Getting a value from JSON, Node.js

I am having trouble getting a value from inside a JSON file retrieved from the Steam API (http://ift.tt/16KNZkA).


The problem occurs when I attempt to print the data; the Node.js code is below:



var data = JSON.parse(body); // Stores the JSON data which has been retrieved
console.log(data.result.toString(350462890).market_hash_name); // Attempts to grab the value of the market_hash_name from the JSON data and display it to screen


I get the following response: "undefined".


JSON data used below:



{
"result": {
"350462890": {
"icon_url": "fWFc82js0fmoRAP-qOIPu5THSWqfSmTELLqcUywGkijVjZYMUrsm1j-9xgEObwgfEh_nvjlWhNzZCveCDfIBj98xqodQ2CZknz5-OOqhNQh0fTvSAK5KVPAoywXpDS4n5YliBtazruNQfgrssNfPN-IqYtkdSpTZU_OCYAir70luiaAPfZOIqHznw223bZvDH3kW",
"icon_url_large": "fWFc82js0fmoRAP-qOIPu5THSWqfSmTELLqcUywGkijVjZYMUrsm1j-9xgEObwgfEh_nvjlWhNzZCveCDfIBj98xqodQ2CZknz5-OOqhNQh0fTvSAK5KVPAoywXpDS4n5fhvVcWx8vUHe126vYrANLYvNI1FG5LWCPfXM1304048hqALKpffqSu9jyvoMjgCRVO1rexMsCC1",
"icon_drag_url": "",
"name": "Dual Berettas | Panther",
"market_hash_name": "Dual Berettas | Panther (Field-Tested)",
"market_name": "Dual Berettas | Panther (Field-Tested)",
"name_color": "D2D2D2",
"background_color": "",
"type": "Mil-Spec Grade Pistol",
"tradable": "1",
"marketable": "1",
"commodity": "0",
"fraudwarnings": "",
"descriptions": {
"0": {
"type": "html",
"value": "Exterior: Field-Tested",
"app_data": ""
},
"1": {
"type": "html",
"value": " ",
"app_data": ""
},
"2": {
"type": "html",
"value": "Firing two large-mag Berettas at once will lower accuracy and increase load times. On the bright side, you'll get to fire two large-mag Berettas at once. It has been painted in a black, grey and red color scheme.",
"app_data": ""
},
"3": {
"type": "html",
"value": " ",
"app_data": ""
},
"4": {
"type": "html",
"value": "The Arms Deal 3 Collection",
"color": "9da1a9",
"app_data": {
"def_index": "65535",
"is_itemset_name": "1"
}
},
"5": {
"type": "html",
"value": " ",
"app_data": ""
}
},
"owner_descriptions": "",
"actions": {
"0": {
"name": "Inspect in Game...",
"link": "http://steamrungame/730/76561202255233023/+csgo_econ_action_preview%20S%owner_steamid%A%assetid%D14429613083935122456"
}
},
"market_actions": {
"0": {
"name": "Inspect in Game...",
"link": "http://steamrungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D14429613083935122456"
}
},
"tags": {
"0": {
"internal_name": "CSGO_Type_Pistol",
"name": "Pistol",
"category": "Type",
"category_name": "Type"
},
"1": {
"internal_name": "weapon_elite",
"name": "Dual Berettas",
"category": "Weapon",
"category_name": "Weapon"
},
"2": {
"internal_name": "set_weapons_iii",
"name": "The Arms Deal 3 Collection",
"category": "ItemSet",
"category_name": "Collection"
},
"3": {
"internal_name": "normal",
"name": "Normal",
"category": "Quality",
"category_name": "Category"
},
"4": {
"internal_name": "Rarity_Rare_Weapon",
"name": "Mil-Spec Grade",
"category": "Rarity",
"color": "4b69ff",
"category_name": "Quality"
},
"5": {
"internal_name": "WearCategory2",
"name": "Field-Tested",
"category": "Exterior",
"category_name": "Exterior"
}
},
"classid": "350462890"
},
"success": true
}


}


So, does anyone have any idea how I can return the market_hash_name? Also, please note that I am fairly new to using Node.js.
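Once the response body is parsed, the item description (including `market_hash_name`) sits under its classid key ("350462890" in the dump above), next to the top-level `success` flag. A minimal sketch, with a trimmed stand-in for the real payload:

```javascript
// The inlined body below is a trimmed stand-in for the real Steam response;
// in practice `body` would be the accumulated response string.
var body = JSON.stringify({
  "350462890": { market_hash_name: "Dual Berettas | Panther (Field-Tested)" },
  success: true
});

var parsed = JSON.parse(body);

// Walk the keys rather than hard-coding the class id, since it varies per item.
var names = Object.keys(parsed)
  .map(function (key) { return parsed[key]; })
  .filter(function (item) { return item && item.market_hash_name; })
  .map(function (item) { return item.market_hash_name; });

console.log(names); // -> [ 'Dual Berettas | Panther (Field-Tested)' ]
```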


parse JSON containing an array in NodeJS

I have to parse a JSON file containing an array in a Node.js program:


My json file:





{
"name":"M"
"types":[
"al",
"pouf",
"k"
]
}



I read the JSON file into my NodeJS program using:



var my_json = require('./json');


But I have this error message:



Unexpected string
at Object.parse (native)
at Object.Module._extensions..json (module.js:486:27)
at Module.load (module.js:310:12)
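The trace comes from `require` running JSON.parse over the file: a comma is missing after `"name":"M"`, which is exactly what "Unexpected string" means here (the `require(./json)` call also needs a quoted path, e.g. `require('./json')`). A small demonstration:

```javascript
// The original file is missing the comma after "name":"M" — precisely what
// makes JSON.parse (and therefore require) throw.
var broken = '{ "name":"M" "types":["al","pouf","k"] }';
var fixed  = '{ "name":"M", "types":["al","pouf","k"] }';

try {
  JSON.parse(broken);
} catch (e) {
  console.log(e.message); // the same kind of parse error the question shows
}

var my_json = JSON.parse(fixed);
console.log(my_json.types.length); // 3
```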

Bluebird: Waiting for one promise to settle

At the risk of sounding stupid: What's the most efficient way to wait for one promise to settle? Say I have promise A and I want to create promise B. The fulfillment of B does not depend on the final value of A. It should proceed even if A is rejected. The process shouldn't start, however, until A is settled one way or the other.


What I have currently looks like the following:



var prevResult;

function doStuff() {
    if (prevResult) {
        prevResult = Promise.settle([ prevResult ]).then(function() {
            return doStuff();
        });
    } else {
        prevResult = updateDB().finally(function() {
            prevResult = null;
        });
    }
    return prevResult;
}


The code is a bit non-obvious. I'm also a bit worried about chaining a bunch of promises from settle() together, since that function is meant for less trivial kinds of coordination. It seems like there ought to be a simpler way of doing this.
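With Bluebird, the idiomatic tool is `.reflect()` (or `Promise.settle` on a single-element array, as above): it yields a promise that always fulfills once the original settles, so B can be chained off it without caring whether A succeeded. The same idea with plain promises, as a dependency-free sketch:

```javascript
// Equivalent of Bluebird's A.reflect().then(startB): swallow the outcome of
// `promise` either way, then proceed with `next`.
function afterSettled(promise, next) {
  return Promise.resolve(promise)
    .then(function () {}, function () {}) // settled — fulfilled or rejected
    .then(next);
}

// Usage: doStuff runs only after A settles, even if A was rejected.
afterSettled(Promise.reject(new Error('boom')), function () {
  return 'started after A settled';
});
```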


How to setup yeoman test for subgenerator that reads package.json

I have a subgenerator that uses the name from the package.json. Now I want to test that function, and I wrote a before() hook that is supposed to create a dummy package.json for the test.


Problem is that the subgenerator cannot read the dummy json file.


test file:



before(function (done) {

    helpers.run(path.join(__dirname, '../addcomponent'))
        .inDir(path.join(__dirname, './tmp'), function (dir) {

            fs.copyTpl(
                path.join(__dirname, '../app/templates/_package.json'),
                dir + 'package.json',
                { ProjectName: 'foo' }
            );

            var test = fs.readJSON(dir + 'package.json');
            console.log('test: ' + test); // returns the object
            console.log('test.name: ' + test.name); // returns the correct name

        })
        .withArguments(['foo'])
        .withPrompts(prompts)
        .withOptions(options)
        .on('end', done);

});


but in my sub-generator:



var memFs = require('mem-fs');
var editor = require('mem-fs-editor');
var store = memFs.create();
var fs = editor.create(store);

...

init: function() {
    this.pkg = this.fs.readJSON('package.json');
    console.log('this.pkg: ' + this.pkg); // returns undefined
}
// or
init: function() {
    this.on('ready', function() {
        this.pkg = this.fs.readJSON('package.json');
        console.log('this.pkg: ' + this.pkg); // returns undefined
    });
}
// or
anyOther: function() {
    this.pkg = this.fs.readJSON('package.json');
    console.log('this.pkg: ' + this.pkg); // returns undefined
}


The whole setup can be found here: http://ift.tt/1JaLOcd


Thanks for any help!


error npm install -g iron-meteor on windows 8.1

I cannot install iron-meteor with the command: npm install -g iron-meteor



Installed:

1. OS: Win 8.1 Pro

2. node -v: v0.12.2

3. npm -v: 2.7.4

4. python --version: Python 2.7.9

5. I also have Microsoft Visual Studio Ultimate 2013 on my machine, and later installed Microsoft Visual C++ 2012 for Windows Desktop.


When I


run: npm install -g iron-meteor



I got the error below:


Building the projects in this solution one at a time. To enable parallel build,please add the "/m" switch.

C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V120\Microsoft.Cpp.Platform.targets(64,5): error MSB8020: The build tools for Visual Studio 2010 (Platform Toolset = 'v100') cannot be found. To build using the v100 build tools, please install Visual Studio 2010 build tools. Alternatively, you may upgrade to the current Visual Studio tools by selecting the Project menu or right-click the solution, and then selecting "Upgrade Solution...". [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]


gyp ERR! build error

gyp ERR! stack Error: C:\Program Files (x86)\MSBuild\12.0\bin\msbuild.exe failed with exit code: 1

gyp ERR! stack at ChildProcess.onExit (D:\nodejs\node_modules\npm\node_modules\node-gyp\lib\build.js:269:23)

gyp ERR! stack at ChildProcess.emit (events.js:110:17)

gyp ERR! stack at Process.ChildProcess._handle.onexit (child_process.js:1074:12)

gyp ERR! System Windows_NT 6.3.9600

gyp ERR! command "node" "D:\nodejs\node_modules\npm\node_modules\node-gyp\bin\node-gyp.js" "rebuild" "--release"

gyp ERR! cwd C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers

gyp ERR! node -v v0.12.2

gyp ERR! node-gyp -v v1.0.3

gyp ERR! not ok

Build failed

npm ERR! Windows_NT 6.3.9600

npm ERR! argv "D:\nodejs\\node.exe" "D:\nodejs\node_modules\npm\bin\npm-cli.js" "install" "-g" "iron-meteor"

npm ERR! node v0.12.2

npm ERR! npm v2.7.4

npm ERR! code ELIFECYCLE

npm ERR! fibers@1.0.5 install: node ./build.js

npm ERR! Exit status 1

npm ERR!

npm ERR! Failed at the fibers@1.0.5 install script 'node ./build.js'.

npm ERR! This is most likely a problem with the fibers package,

npm ERR! not with npm itself.

npm ERR! Tell the author that this fails on your system:

npm ERR! node ./build.js

npm ERR! You can get their info via:

npm ERR! npm owner ls fibers

npm ERR! There is likely additional logging output above.


And when I


run: npm install -g iron-meteor --msvs-version=2012 or npm install -g iron-meteor --msvs-version=2013



I got this result:


Building the projects in this solution one at a time. To enable parallel build, please add the "/m" switch.

fibers.cc

coroutine.cc

C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\include\xlocale(336): warning C4530: C++ exception handler used, but unwind semantics are not enabled. Specify /EHsc (..\src\fibers.cc) [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

..\src\fibers.cc(703): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

..\src\fibers.cc(707): warning C4244: '=' : conversion from 'double' to 'size_t', possible loss of data [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

..\src\fibers.cc(714): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

..\src\coroutine.cc(109): warning C4552: '!' : operator has no effect; expected operator with side-effect [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

..\src\coroutine.cc(185): warning C4267: 'argument' : conversion from 'size_t' to 'unsigned int', possible loss of data [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\include\vector(1066): warning C4530: C++ exception handler used, but unwind semantics are not enabled. Specify /EHsc (..\src\coroutine.cc) [C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\fibers.vcxproj]

C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\include\vector(1059) : while compiling class template member function 'void std::vector<_Ty::resize(unsigned __int64)' with [_Ty=void *]

..\src\coroutine.cc(146) : see reference to function template instantiation 'void std::vector<_Ty::resize(unsigned __int64)' being compiled with [_Ty=void *] coro.c

Creating library C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\Release\fibers.lib and object C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\Release\fibers.exp

Generating code

Finished generating code

fibers.vcxproj -C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\build\Release\fibers.node

Installed in C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\node_modules\fibers\bin\win32-x64-v8-3.28\fibers.node

C:\Users\Gary\AppData\Roaming\npm\iron C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor\bin\iron

iron-meteor@1.1.2

C:\Users\Gary\AppData\Roaming\npm\node_modules\iron-meteor

├── single-line-log@0.4.1

├── xtend@4.0.0

├── minimist@0.0.8

├── underscore@1.3.3

├── ejs@0.8.5

├── cli-table@0.3.0 (colors@0.6.2)

├── cli-color@0.2.3 (es5-ext@0.9.2, memoizee@0.2.6)

├── shell-source@1.0.1 (concat-stream@1.4.8)

└── fibers@1.0.5


Can anyone help me?


NodeJS Error read ECONNRESET

If I send data from PHP to Node.js, I get this error on my console: http://ift.tt/1yGtpDN


How can I fix that error? Thanks !


That's my code in nodejs:



var Server = net.createServer(function(Sock) {
    Sock.on('data', function(data) {
        var dataObj = JSON.parse(data); // declared locally instead of leaking a global
        logger.info("received data");
    });

    Sock.pipe(Sock);
});


and that's my PHP code:



$data = array('xxx' => 'xxx');

// Prepares to transmit it
$encdata = json_encode($data);

socket_write($socket, $encdata, strlen($encdata));

socket_close($socket);

Incompatible node.js version

I tried to install node-png: npm install node-png -g


I got a message that says:



npm WARN engine node-png@0.4.3: wanted: {"node":"0.8.x"} (current: {"node":"0.12.2","npm":"2.7.4"})



How can I fix it? Note: the package wants an earlier Node version than the one I have installed.


My sailsjs service return “undefined” to the calling controller action

I created a service called CategorieService. Its function getAllCategorie() is supposed to return an object:



//CategorieService.js
module.exports = {
    getAllCategorie: function()
    {
        Categorie.find({where: {statut: 'A'}, sort: 'libelle ASC'}).exec(function(err, categories)
        {
            if (categories)
            {
                categories.forEach(function(categorie) {
                    console.log('categorie libelle => ' + categorie.libelle);
                });
                return categories;
            }
        });
    }
};


The log in the console shows the result I want:



categorie libelle => car
categorie libelle => clothes
categorie libelle => rent


But in my controller, categorie is undefined. Why? And how can I fix it? Below is my controller:



//ArticleController.js
var categorie = require('../services/CategorieService');
module.exports = {

    /*view*/
    indexArticle: function(req, res)
    {
        var title = req.__('gestiondesarticles.title');
        res.view('article', {categories: categorie.getAllCategorie(), title: title, page_name: 'article'});
    },
}
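The root cause: `getAllCategorie()` returns (implicitly `undefined`) before `exec`'s callback ever fires, so the `return categories;` inside the callback goes nowhere. The usual fix is to hand the result back through a callback (or a promise). A runnable sketch, with `Categorie.find` stubbed so it works outside Sails; in the real service the stub disappears and the Waterline model is used directly:

```javascript
// Stand-in for the Waterline model's asynchronous find().exec() round trip.
var Categorie = {
  find: function (criteria) {
    return {
      exec: function (cb) {
        setImmediate(function () {
          cb(null, [{ libelle: 'car' }, { libelle: 'clothes' }, { libelle: 'rent' }]);
        });
      }
    };
  }
};

// The service delivers results via a callback instead of a return value.
function getAllCategorie(done) {
  Categorie.find({ where: { statut: 'A' }, sort: 'libelle ASC' })
    .exec(function (err, categories) {
      if (err) { return done(err); }
      done(null, categories || []);
    });
}

// Controller side: render only after the data arrives, instead of passing
// the (undefined) return value of the service straight into res.view().
getAllCategorie(function (err, categories) {
  if (err) { throw err; }
  console.log(categories.length); // 3
});
```

In the controller, the `res.view(...)` call moves inside that callback, so the view is rendered only once `categories` is actually populated.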

node.js link routes each other

I have a small problem with routes in Node.js. I manage these routes (get and post) in a folder called routes. Now I'd like to call one route from another route in a different file. How can I do that?
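An Express route handler is just a function of (req, res), so one routes file can require another and either call its handler directly or redirect to its URL. A self-contained sketch (in practice `usersRoute` and `adminRoute` would be separate modules under routes/, exported with `module.exports` and wired up with app.get):

```javascript
// routes/users.js would export this object:
var usersRoute = {
  list: function (req, res) {
    res.json([{ name: 'alice' }]); // hypothetical payload
  }
};

// routes/admin.js would require('./users') and delegate:
var adminRoute = {
  dashboard: function (req, res) {
    // Option 1: reuse the other route's handler directly
    return usersRoute.list(req, res);
    // Option 2: res.redirect('/users'); — let the router dispatch it
  }
};
```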




nginx and node.js optimisation - setTimeout having adverse affect

I'm having a little trouble getting a node.js app fronted by nginx to perform as expected. I've stripped everything back to its basics to ensure I have optimised things correctly.


For every request, the node.js app will wait 10 seconds and then return a response. The timeout is required for the purposes of this app (it's going to be used for load/performance testing). The index.js I have for the app is:



var http = require('http');
var config = require('./config'); // exposes serverPort

var requestCount = 0;
var responseCount = 0;

http.createServer(function (request, response) {
    requestCount++;
    console.log('Dealing with request ' + requestCount);
    setTimeout(function () {
        responseCount++;
        console.log('Sending back response ' + responseCount);
        response.writeHead(200, {"Content-Type": "text/plain"});
        response.end('Hello World\n');
    }, 10000);
}).listen(config.serverPort);


I have then fronted the node.js app with nginx. The intention is to have a few instances of the node.js app, with nginx handling SSL and acting as a load balancer. My nginx.conf is below:



worker_processes 8;
pid /var/run/nginx.pid;

events {
    worker_connections 8192;
    multi_accept on;
    use kqueue;
}

worker_rlimit_nofile 65536;

http {

    include mime.types;
    default_type application/octet-stream;
    server_names_hash_bucket_size 128;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    types_hash_max_size 2048;

    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    gzip on;
    gzip_disable "msie6";

    upstream backend {
        server 127.0.0.1:8001;
    }

    server {
        # proxy to backend
        location / {
            proxy_redirect off;
            proxy_connect_timeout 10;
            proxy_send_timeout 15;
            proxy_read_timeout 20;
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $remote_addr;
            proxy_pass http://backend;
        }
    }

    include /etc/nginx/sites-enabled/*;
}


Now, my application fires 300 concurrent requests in one go. When fired directly at the node.js app, everything is fine and I get 300 responses back. However, when I try to fire the requests through nginx, only 248 responses come back; the other 52 requests time out. The HTTP client I am using in the application has a requestTimeout of 60 seconds and is set to 'Connection: Keep-Alive'.


When I look at the log file for the node.js app, nginx has not sent the outstanding 52 requests to it. They seem to be blocked somewhere. In the nginx error log, the only thing written is:



*559 setsockopt(TCP_NODELAY) failed (22: Invalid argument) while keepalive, client: 127.0.0.1, server: 0.0.0.0:80


This error does not worry me per se, as I understand it to mean that the client has closed the connection as per its 60-second timeout.


The weird thing is, if I set the timeout on the node.js app to 50ms (instead of 10000ms), everything works fine! The timeout seems to be having an adverse effect on nginx's ability to pass all 300 requests upstream.


Is anyone able to point out any missing nginx optimisations or errors in the node.js app I have written? I am a little stumped on why setTimeout would have this effect. Is there something about setTimeout(...) that blocks a socket? I appreciate any help offered.


Thanks!
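Not a confirmed diagnosis, but one thing stands out: by default nginx speaks HTTP/1.0 to the upstream and opens a fresh connection per request, so 300 requests each held open for the full 10 seconds put far more simultaneous sockets in flight than the fast 50ms case. Enabling upstream keepalive is worth trying (a sketch; the directive values here are guesses to tune, not a verified fix):

```nginx
upstream backend {
    server 127.0.0.1:8001;
    # reuse sockets to the node process instead of one new connection
    # per request
    keepalive 64;
}

server {
    location / {
        proxy_pass http://backend;
        # upstream keepalive requires HTTP/1.1 and a cleared Connection header
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        # stay comfortably above the app's 10 s artificial delay
        proxy_read_timeout 30;
    }
}
```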


Less - import .css file using nodejs gulp

According to this question, it's possible to import a CSS file from Less v1.4.0 onwards.

I tried to use the gulp-less task, but it's not working.

Is there any way to import a CSS file like this: @import "../spritesheet.css"; with Node.js gulp?
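For reference, newer Less (1.5+) added explicit import options that control how a .css file is handled; assuming the gulp-less in use bundles such a version, either of these in the .less source should work (a plain `@import "../spritesheet.css";` with the .css extension is left as a literal CSS import by default):

```less
// copy the file's contents into the compiled output
@import (inline) "../spritesheet.css";

// or keep a literal CSS @import statement in the output
@import (css) "../spritesheet.css";
```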


node.js/express chaining multiple get commands

I have a route that, in order to get all the data, needs to access the API server multiple times (according to the data that was given).


Now I need to add a third access to the server, and it's getting rather unreadable.

The following code works, but I have a feeling I'm not doing it right (promises?). I couldn't figure out what exactly is recommended in this case.


The code: (stripped down to emphasise the point)



router.get('/', function(req, main_response) {
    http.get(FIRST_API_COMMAND, function (res) {
        var moment_respose_content = '';
        res.on("data", function (chunk) {
            moment_respose_content += chunk;
        });
        res.on('end', function() {
            if (res.statusCode < 200 || res.statusCode > 299) {
                main_response.send('error in getting moment');
                return;
            }
            var response = JSON.parse(moment_respose_content);
            if (response.success)
            {
                var data = response.data;
                //doing something with the data
                var comment_respose_content = ''; // accumulator for the second response
                http.get(SECOND_API_COMMAND, function (res) {
                    res.on("data", function (chunk) {
                        comment_respose_content += chunk;
                    });

                    res.on('end', function() {
                        var response = JSON.parse(comment_respose_content);
                        if (response.success)
                        {
                            var comments = response.data;
                            main_response.render('the page', {data: data});
                            return;
                        }
                    });
                }).on('error', function (e) {
                    console.log("Got error: " + e.message);
                    main_response.send('Error in getting comments');
                });
                return;
            }
        });
    }).on('error', function (e) {
        console.log("Got error: " + e.message);
        main_response.send('Error in getting moment');
    });
});