Tuesday, 31 March 2015

Front-end workflow tools and Node.js

As an example:



  • Package manager (Bower)

  • JavaScript task runners (Grunt, Gulp)

  • Web scaffolding tool (Yeoman)


Why are these types of front-end workflow tools only available in Node.js, and not in any other language?


Find inside callback of another find (...), how to escape from callback hell?

(First: I'm sorry, I don't speak English very well!)


I want to return the results of three finds in one array. My code (below) runs well, but I'm in callback hell!





_Schema
  .static('retrieveAll', function(cb) {
    var query = {};
    this.find(query, function(err, data) {
      if (err) {
        cb(err, null);
        return;
      }

      var all;
      if (data)
        all = data;
      else
        all = [];

      _StoresModel.find(query).select('contact address').exec(function(err, data) {
        if (err) {
          cb(err, null);
          return;
        }

        if (data) {
          all = data.reduce(function(coll, item) {
            coll.push(item);
            return coll;
          }, all);
        }

        _CustomersModel.find(query).select('contact address').exec(function(err, data) {
          if (err) {
            cb(err, null);
            return;
          }

          if (data) {
            all = data.reduce(function(coll, item) {
              coll.push(item);
              return coll;
            }, all);
          }

          cb(null, all);
        });
      });
    });
  });



I have a find inside a find inside a find. Is there any way to improve this?
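One way to flatten the nesting (a sketch, assuming the async library and the same models as above) is to run the three queries in parallel and concatenate their results once:

var async = require('async');

_Schema.static('retrieveAll', function(cb) {
  var query = {};
  var self = this;
  async.parallel([
    function(done) { self.find(query, done); },
    function(done) { _StoresModel.find(query).select('contact address').exec(done); },
    function(done) { _CustomersModel.find(query).select('contact address').exec(done); }
  ], function(err, results) {
    if (err) return cb(err, null);
    // results = [ownDocs, storeDocs, customerDocs]; flatten into one array
    cb(null, [].concat(results[0] || [], results[1] || [], results[2] || []));
  });
});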


How do I include dependencies for dr.js?

I am trying to install dr.js as my JavaScript documentation tool. Here is a link: http://ift.tt/1EBecG3


I use Node.js and npm, so that's how I installed it. I can see there are dependencies. What are they? What do I do with them? Does anyone know how to install dr.js?


How do I include dependencies?


I did



npm install dr.js@0.1.1


and I got



dr.js@0.1.1 ../node_modules/dr.js
├── eve@0.5.0
├── dot@1.0.3
├── markdown@0.5.0 (nopt@2.1.2)
└── topcoat@0.7.5 (topcoat-utils@0.1.3, topcoat-range-base@0.0.3, topcoat-list@0.5.0, topcoat-textarea@0.3.0, topcoat-switch-base@0.1.0, topcoat-search-input-base@0.1.2, topcoat-tab-bar@0.1.0, topcoat-radio-button-base@0.1.1, topcoat-input-base@0.4.1, topcoat-checkbox-base@0.1.3, topcoat-checkbox@0.4.0, topcoat-notification@0.1.1, topcoat-icon-button@0.3.4, topcoat-button-bar-base@0.1.5, topcoat-notification-base@0.0.1, topcoat-textarea-base@0.3.2, topcoat-button-base@0.6.1, topcoat-list-base@0.4.1, topcoat-button-bar@0.1.1, topcoat-radio-button@0.1.2, topcoat-text-input@0.3.4, topcoat-theme@0.5.24, topcoat-search-input@0.3.3, topcoat-button@0.5.5, topcoat-switch@0.1.4, topcoat-range@0.1.0, topcoat-navigation-bar-base@0.4.0, topcoat-navigation-bar@0.4.2)


What does this output mean? Does it mean the installation is done? I am very new to Node.js and npm.
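For context, npm installs a package's declared dependencies automatically; the tree above is simply npm's confirmation of what it installed (dr.js plus its dependencies eve, dot, markdown, and topcoat), so the installation is done and nothing more is needed for the dependencies. One way to double-check is npm's own listing command:

npm ls dr.js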


Using custom Machinepack in treeline.io

I'm currently testing the beta of treeline.io; firstly, it's awesome.


Secondly, I'm struggling with how I would add a custom machinepack. I have one that's so specific to an application I'm testing that it doesn't make sense to publish it to npm/GitHub.


If this were a Sails app, I could manually add it to api/machines, but that gets overwritten locally whenever I restart my treeline.io application.


I have the option of creating a new machine directly on treeline.io, but I can't then sync it locally to edit it, or actually get at the underlying code of the machinepack (I know that's kind of the point, but I'm still always going to need to do some custom things).


It's perfectly possible this just isn't quite there yet given the beta nature of things; I just wanted to check I wasn't missing something!


Thanks


Gareth


Error creating two Redis To Go clients in NodeJS on Heroku

I've got a single web dyno running on a Heroku app built with Node.js, and it uses a single Redis To Go database on the Nano (free) tier. That tier is supposed to support up to ten connections, yet if I try to connect to it in two different modules using:



var redis_url = require('url').parse(process.env.REDISTOGO_URL);
var redis = require('redis').createClient(redis_url.port, redis_url.hostname);


I get this error when trying to start the app:



Error: Ready check failed: NOAUTH Authentication required. Mar 31 21:52:18 <> app/web.1: at RedisClient.on_info_cmd (/app/node_modules/redis/index.js:380:35)



The REDISTOGO_URL environment variable is set correctly, and if I remove the code from one of the modules then it starts fine with no errors. I could create one client and pass it to each module, but I'd really prefer to understand what's actually causing the problem here.


Can somebody explain what's going on?
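For what it's worth, the NOAUTH error means the server rejected a command because the client never authenticated: the Redis To Go URL carries the password in its auth portion, and createClient(port, hostname) alone discards it. A sketch of parsing and passing it explicitly (assuming the node_redis API of the time):

var url = require('url');
var redis = require('redis');

var redisUrl = url.parse(process.env.REDISTOGO_URL);
var client = redis.createClient(redisUrl.port, redisUrl.hostname);
// redisUrl.auth is "user:password"; Redis only needs the password part
client.auth(redisUrl.auth.split(':')[1]);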


How do I get an octet-stream in Node.js using http (or request)?

I'm writing a Node.js script to parse Wikidata database dumps and insert them into my own database. I'm having trouble downloading the files, as it seems both Node.js's http module and npm's request module end the response stream too early. I'm getting the following error:



events.js:85
throw er; // Unhandled 'error' event
^
SyntaxError: Unexpected end of input
at parse (native)
at emit (/var/app/current/node_modules/split/index.js:27:17)
at next (/var/app/current/node_modules/split/index.js:48:7)
at Stream.<anonymous> (/var/app/current/node_modules/split/index.js:53:5)
at Stream.stream.write (/var/app/current/node_modules/split/node_modules/through/index.js:26:11)
at Gunzip.ondata (_stream_readable.js:540:20)
at Gunzip.emit (events.js:107:17)
at readableAddChunk (_stream_readable.js:163:16)
at Gunzip.Readable.push (_stream_readable.js:126:10)
at Gunzip.Transform.push (_stream_transform.js:140:32)


I'm using the following code to download the database dump:



var zlib = require('zlib');
var split = require('split');
var request = require('request');

var streamer = request
  .get("http://dumps.wikimedia.org/other/wikidata/20150330.json.gz")
  .pipe(zlib.createGunzip())
  .pipe(split(JSON.parse));

streamer.on('data', function(obj){
  ...
});


The error is the same whether I use the request module or the built-in http module.
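A detail worth noting, with a possible fix sketched under the assumption that the dump is newline-delimited JSON with array decoration: .pipe() does not forward 'error' events, so each stage of the pipeline needs its own handler, and lines that are truncated or end with a comma will crash a bare JSON.parse. A mapper that skips unparseable lines avoids the crash (split ignores mapper results that are undefined):

var streamer = request
  .get("http://dumps.wikimedia.org/other/wikidata/20150330.json.gz")
  .on('error', function(err) { console.error('request error:', err); })
  .pipe(zlib.createGunzip())
  .on('error', function(err) { console.error('gunzip error:', err); })
  .pipe(split(function(line) {
    // Strip a trailing comma and skip lines that are not complete JSON
    try { return JSON.parse(line.replace(/,$/, '')); } catch (e) { return undefined; }
  }));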


How do you serve static files from an nginx server acting as a reverse proxy for a Node.js server?

My current nginx config is this:



upstream nodejs {
    server 127.0.0.1:3000;
}

server {
    listen 8080;
    server_name localhost;
    root ~/workspace/test/app;
    index index.html;

    error_log /usr/local/var/log/nginx/error.log;
    access_log /usr/local/var/log/nginx/access.log;

    location / {
        proxy_pass http://nodejs;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}


I'm very, very new to nginx, but at the very least I know that nginx is better than Node/Express at serving static files. How can I configure the server so that nginx serves the static files?
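One common approach (a sketch; the extension list and the absolute root path are assumptions, and note that nginx does not expand ~ in root): add a location that matches static assets, so nginx serves them from disk and only proxies everything else to Node:

    location ~* \.(?:css|js|jpe?g|png|gif|ico|svg|woff2?)$ {
        root /home/youruser/workspace/test/app;
        expires 30d;
        access_log off;
    }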


Node.js HTTPS request using Elliptic Curve key

I want to build an HTTPS client that connects to an HTTPS server requiring mutual authentication. Additionally, the client key is an Elliptic Curve key instead of an RSA key. To support Elliptic Curve keys, I have recompiled Node.js with OpenSSL 1.0.2a.


In my node.js program, I set the options to specify a key and certificate,



var options = {
  // These are necessary only if using client certificate authentication
  key: fs.readFileSync('client-key.pem'),
  cert: fs.readFileSync('client-cert.pem')
};


and when I run it, I get this error:



Error: error:0906D06C:PEM routines:PEM_read_bio:no start line
at Error (native)
at Object.createSecureContext (_tls_common.js:110:19)
at Object.exports.connect (_tls_wrap.js:854:21)
at Agent.createConnection (https.js:84:14)
at Agent.createSocket (_http_agent.js:196:16)
at Agent.addRequest (_http_agent.js:168:23)
at new ClientRequest (_http_client.js:156:16)
at Object.exports.request (http.js:51:10)
at exports.request (https.js:138:15)
...


This indicates that Node.js is not able to read the EC key. The error message is similar to the one openssl gives when it attempts to read the key as an X.509 cert:



openssl x509 -text -in sample.key
unable to load certificate
140735234208608:error:0906D06C:PEM routines:PEM_read_bio:no start
line:pem_lib.c:701:Expecting: TRUSTED CERTIFICATE


How can I force Node.js to load this key as an EC key?
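One thing worth checking (an assumption about the cause, not a confirmed fix): "no start line" from the PEM routines usually means the file's PEM header is not one the reader expects, so re-encoding the key into a standard PEM form often resolves it:

# Re-emit the key with a standard "EC PRIVATE KEY" PEM header
openssl ec -in client-key.pem -out client-key-sec1.pem

# Or convert it to PKCS#8, which uses the generic "PRIVATE KEY" header
openssl pkcs8 -topk8 -nocrypt -in client-key.pem -out client-key-pkcs8.pem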


Only Find and Copy the Images referenced in the CSS File using Grunt

I've got a single-page web app (index.html). I built it using a responsive template I purchased, so the template came with hundreds of icons and images that I could use.


I pick and choose images during the development stage, but when it comes to building a version of the app to deploy, I have to manually go through the main CSS and HTML files, make a list of the images used, and copy them over. This is of course not ideal, and I'd like to automate this "find and build" stage.


The structure of the app is like so:



images/
------ icons/
------ logos/
------ tab/
------ slides/

css/
------ compiled/main.css

js/
------ main.js

index.html


I've been looking for a Grunt plugin that can go through my only main CSS file (compiled/main.css) and find all the images I ended up using. This CSS file is fairly large. It should then copy only those images into a "build" folder, which looks like so:



build/
------ images/
------ css/
------ js/

index.html


Within the "build/images" folder it should also maintain the folder structure as referenced in the CSS, so the references don't break.


E.g., when I run the task, it finds this snippet in the main.css file:



#hero.bg {
  background-image: url("images/slides/hello.jpeg");
}


and makes a folder like so: "build/images/slides"


and copies the 'hello.jpeg' into it.


Does anyone know if such a Grunt plugin is already available? If not, I could write it, but I don't want to reinvent the wheel.


If possible, I'd like to take it one step further and also look in the index.html file for image links and copy them over too.


Thanks very much,
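In case nothing ready-made turns up, a minimal custom task along these lines could work (a sketch; the url() regex, the paths, and the task name are assumptions):

module.exports = function(grunt) {
  grunt.registerTask('copy-used-images', 'Copy only the images referenced in main.css', function() {
    var css = grunt.file.read('css/compiled/main.css');
    var re = /url\(["']?(images\/[^"')]+)["']?\)/g;
    var match;
    while ((match = re.exec(css)) !== null) {
      var img = match[1];
      if (grunt.file.exists(img)) {
        // grunt.file.copy creates intermediate folders, preserving the images/... structure
        grunt.file.copy(img, 'build/' + img);
      }
    }
  });
};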


How can I do a Google image search by image file (not URL) with Node.js?

I've looked around for a long time, but never solved it.


This comes with one condition: not using the Google Custom Search API.


I did search for a way, but I couldn't make it work yet.


This is what I tried:



var client = require('cheerio-httpcli');
var word = 'http://ift.tt/1Cvk400';
client.fetch('http://ift.tt/1DqMWbU',
  { q: word },
  function (err, $, res) {
    console.log(res.headers);
    console.log($('title').text());
    $('a').each(function (idx) {
      console.log($(this).attr('href'));
    });
  });


How can I do this? Is there any way? If so, can anybody point me in the right direction? I just can't seem to get started.
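For reference, one unofficial approach used at the time (loudly an assumption: this upload endpoint is undocumented, unsupported, and may be blocked or changed by Google at any point) is to POST the image file as multipart form data to the search-by-image upload URL and follow the redirect to the results page:

var fs = require('fs');
var request = require('request');

request.post({
  url: 'https://images.google.com/searchbyimage/upload',
  formData: { encoded_image: fs.createReadStream('photo.jpg'), image_content: '' },
  followAllRedirects: true
}, function (err, res, body) {
  if (err) return console.error(err);
  // res.request.uri.href is the final (redirected) results-page URL
  console.log('Results page:', res.request.uri.href);
});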


How to embed any website or open it in a new tab (client side)

I'm trying to find a way to display any external URL (user-submitted) without leaving the current URL.


I originally tried using iframes, but it's becoming very tedious to handle common websites that don't allow their site to be embedded. So then I figured I might be able to open the external URLs in a new tab, but since the links are not initiated by a user click, I have no control over tab vs. window, and windows get blocked by most pop-up blockers.


Does anyone have any ideas?


If it matters, I am trying to maintain access to a node.js (socket.io) session while a user browses an external URL through my web app.
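One detail that may help (a general browser behavior, sketched with a hypothetical element id): pop-up blockers generally allow window.open only when it is called synchronously inside a genuine user-initiated event handler, so deferring the open until the user's own click usually gets through:

document.getElementById('visit-link').addEventListener('click', function () {
  // Called synchronously from a user gesture, so blockers allow it;
  // whether it opens a tab or a window is ultimately the browser's choice.
  window.open(externalUrl, '_blank');
});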


How to call async.each in an external function, with async.waterfall inside async.each

I have an exports module in Node.js:



exports.doSomethingImportant = function(req, res) {
  var id = req.params.id;
  Demo.findOne({'_id': id})
    .exec(function(err, demosReturned) {
      async.waterfall([
        function(outerCallBack) {
          console.log("In the First Call Back");
          firstOrderFunction(demosReturned, outerCallBack);
        },
        function(x, outerCallBack) {
          var y = 3;
          var z = x * y;
          console.log("In the Second Call Back");
          outerCallBack(null, z);
        }
      ], function(err, z) {
        if (err) {
          console.log("Error is == " + err);
        } else {
          console.log("The Returned Value is == " + z);
        }
      });
    }); //End Demo.findOne
};


Now, my firstOrderFunction in turn has an async.each embedding an async.waterfall:



function firstOrderFunction(demosReturned, outerCallBack) {
  console.log("Called the External Function");

  async.each(demosReturned.locations, function(location, innerCallBack) {
    console.log('Computing Location #');

    async.waterfall([
      function(internalCallBack) {
        console.log("Computing Inner First Waterfall");
        a = 14;
        innerCallBack(null, a);
      },
      function(a, internalCallBack) {
        console.log("Computing Inner Second Waterfall");
        b = 14;
        c = a * b;
        innerCallBack(null, c);
      }
    ], function(err, c) {
      if (err) {
        console.log("Error is == " + err);
      } else {
        d = c;
        console.log("The Returned Value is == " + c);
      }
    }); //End Async.Waterfall
  }, function(err, d) {
    if (err) {
      console.log("The Error in Async.Each === " + err);
    } else {
      console.log("The Returned Value is Processed ");
      outerCallBack(null, d);
    }
  }); //End Async.Each
}


The output I get is:

In the First Call Back
Calculating Payments Called in First CallBack
Computing Location #
Computing Location #
Computing Inner First Waterfall
Computing Inner First Waterfall
The Returned Value is Processed
In the Second Call Back
The Returned Value is == NaN



I want everything to run sequentially, in the following order:

  1. Call async.waterfall in the exec callback of Demo.findOne.

  2. Call firstOrderFunction.

  3. Call async.each inside firstOrderFunction.

  4. Call async.waterfall inside async.each.

  5. Call the first callback function, returning a = 14.

  6. Call the second callback function, returning c = 14 * 14 = 196.
How do I achieve this using async?


Thanks in advance, and apologies for such a long question.
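For what it's worth, the out-of-order output is consistent with one specific slip in the code above: the inner waterfall steps invoke innerCallBack (the async.each callback) instead of internalCallBack (the waterfall callback), which tells async.each that each item is finished before its waterfall has actually run. Also, async.each's final callback receives only err, never a value; collecting per-item results calls for async.map. A sketch of a corrected inner function under those assumptions:

function firstOrderFunction(demosReturned, outerCallBack) {
  // async.map collects one result per location; its final callback gets (err, results)
  async.map(demosReturned.locations, function(location, innerCallBack) {
    async.waterfall([
      function(internalCallBack) {
        internalCallBack(null, 14);          // a = 14
      },
      function(a, internalCallBack) {
        internalCallBack(null, a * 14);      // c = a * b = 196
      }
    ], innerCallBack);                       // the waterfall's (err, c) completes this item
  }, outerCallBack);                         // (err, [196, 196, ...]) goes back to the caller
}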


_iteratorError undefined is not a function

The following code takes the content of a file, replaces some characters, and outputs the result.


test.txt:



# Title

## Title 2

Paragraph


index.js:



#!/usr/bin/env node

'use strict'

var fs = require('fs')
  , filename = process.argv[2]

if (process.argv.length < 3) {
  console.log('Usage: node ' + process.argv[1] + ' FILENAME')
  process.exit(1)
}

function massReplace(text, replacementArray) {
  let results = text
  for (let [regex, replacement] of replacementArray) {
    results = results.replace(regex, replacement)
  }
  return results
}

function transformHeadings(text, orig) {
  return massReplace(text,
    [/^## (.*)/gm, '<h2>$1</h2>'],
    [/^# (.*)/gm, '<h1>$1</h1>']
  )
}

fs.readFile(filename, 'utf8', function(err, data) {
  if (err) throw err

  data = data.split(/\n\n/gm)
  var tree = data.slice()

  console.log(transformHeadings(data, tree))
})


I get this error:



alex@alex-K43U:~/node/m2n$ babel-node index4.js test.txt
/home/alex/node/m2n/index4.js:41
throw _iteratorError;
^
TypeError: undefined is not a function
at massReplace (/home/alex/node/m2n/index4.js:17:4)
at transformHeadings (/home/alex/node/m2n/index4.js:30:2)
at /home/alex/node/m2n/index4.js:39:3
at fs.js:336:14
at FSReqWrap.oncomplete (fs.js:99:15)


I have no idea what the problem is, nor what throw _iteratorError means.


I'm using Babel to parse the ES6 code.


What could be the problem?
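A likely cause (an assumption based on the code shown): massReplace expects a single array of [regex, replacement] pairs, but transformHeadings passes the pairs as two separate arguments. replacementArray is then just the first pair, so the for...of loop tries to destructure a RegExp, which is not iterable, and Babel's iterator plumbing surfaces that as the _iteratorError throw. A sketch of the corrected call (separately, note that data is an array after the split, so joining it back into a string before replacing may also be needed):

function transformHeadings(text) {
  return massReplace(text, [
    [/^## (.*)/gm, '<h2>$1</h2>'],
    [/^# (.*)/gm, '<h1>$1</h1>']
  ])
}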


Strongloop deploy to Heroku doesn't work

I followed all the steps but can't figure out how to get this done. I keep getting bash: slc: command not found.


My Procfile contains: web: slc run



2015-03-26T02:43:13.998982+00:00 heroku[web.1]: State changed from crashed to starting
2015-03-26T02:43:18.757072+00:00 heroku[web.1]: Starting process with command slc run
2015-03-26T02:43:20.054559+00:00 app[web.1]: Detected 512 MB available memory, 512 MB limit per process (WEB_MEMORY)
2015-03-26T02:43:20.054946+00:00 app[web.1]: bash: slc: command not found
2015-03-26T02:43:20.054584+00:00 app[web.1]: Recommending WEB_CONCURRENCY=1
2015-03-26T02:43:20.786104+00:00 heroku[web.1]: Process exited with status 127
2015-03-26T02:43:20.807164+00:00 heroku[web.1]: State changed from starting to crashed
2015-03-26T02:44:16.130969+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=salty-journey-3310.herokuapp.com request_id=48c6c94a-22d7-4c5e-9a6c-2384c5d37cdc fwd="216.165.95.72" dyno= connect= service= status=503 bytes=
2015-03-26T02:44:50.716616+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/worker-signin" host=salty-journey-3310.herokuapp.com request_id=e12abfd7-4d0a-4869-93b3-5139f8d4e34c fwd="216.165.95.72" dyno= connect= service= status=503 bytes=
2015-03-26T02:46:07.463936+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/client-signup" host=salty-journey-3310.herokuapp.com request_id=be1015fc-a9b4-4b46-b858-b59ed1702d60 fwd="216.165.95.72" dyno= connect= service= status=503 bytes=
2015-03-26T02:48:38.323367+00:00 heroku[web.1]: State changed from crashed to starting
2015-03-26T02:48:44.211997+00:00 heroku[web.1]: Starting process with command slc run
2015-03-26T02:48:46.005432+00:00 app[web.1]: Detected 512 MB available memory, 512 MB limit per process (WEB_MEMORY)
2015-03-26T02:48:46.005458+00:00 app[web.1]: Recommending WEB_CONCURRENCY=1
2015-03-26T02:48:46.024208+00:00 app[web.1]: bash: slc: command not found
2015-03-26T02:48:46.871183+00:00 heroku[web.1]: Process exited with status 127
2015-03-26T02:48:46.882329+00:00 heroku[web.1]: State changed from starting to crashed
2015-03-26T02:55:07.351372+00:00 heroku[api]: Add strongloop:test add-on by tejas.vj.bhatt@gmail.com
2015-03-26T02:55:07.351372+00:00 heroku[api]: Release v8 created by tejas.vj.bhatt@gmail.com
2015-03-26T02:55:07.460254+00:00 heroku[web.1]: State changed from crashed to starting
2015-03-26T02:55:12.262442+00:00 heroku[web.1]: Starting process with command slc run
2015-03-26T02:55:13.558424+00:00 app[web.1]: Detected 512 MB available memory, 512 MB limit per process (WEB_MEMORY)
2015-03-26T02:55:13.558444+00:00 app[web.1]: Recommending WEB_CONCURRENCY=1
2015-03-26T02:55:13.558648+00:00 app[web.1]: bash: slc: command not found
2015-03-26T02:55:14.284209+00:00 heroku[web.1]: State changed from starting to crashed
2015-03-26T02:55:14.278656+00:00 heroku[web.1]: Process exited with status 127
2015-03-26T02:57:35.737008+00:00 heroku[api]: Deploy b6cac37 by tejas.vj.bhatt@gmail.com
2015-03-26T02:57:35.737008+00:00 heroku[api]: Release v9 created by tejas.vj.bhatt@gmail.com
2015-03-26T02:57:35.974640+00:00 heroku[web.1]: State changed from crashed to starting
2015-03-26T02:57:41.438704+00:00 heroku[web.1]: Starting process with command slc run
2015-03-26T02:57:42.881097+00:00 app[web.1]: Detected 512 MB available memory, 512 MB limit per process (WEB_MEMORY)
2015-03-26T02:57:42.881117+00:00 app[web.1]: Recommending WEB_CONCURRENCY=1
2015-03-26T02:57:42.881843+00:00 app[web.1]: bash: slc: command not found
2015-03-26T02:57:43.606052+00:00 heroku[web.1]: Process exited with status 127
2015-03-26T02:57:43.633805+00:00 heroku[web.1]: State changed from starting to crashed
2015-03-26T02:58:01.953397+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=salty-journey-3310.herokuapp.com request_id=5b405b2b-3e81-4140-8ed6-205868383854 fwd="216.165.95.72" dyno= connect= service= status=503 bytes=
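A common cause of exit status 127 here (an assumption about this app's setup): the slc binary simply is not installed in the dyno. Heroku only installs what package.json declares, so adding the strongloop package as a dependency puts slc into node_modules/.bin, which is on the dyno's PATH; alternatively the Procfile can bypass slc entirely:

// package.json (sketch; the version pin is an assumption)
"dependencies": {
  "strongloop": "*"
}

# Procfile alternative, bypassing slc (entry-point path assumed):
web: node server/server.js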



Node Redis connection working locally but not on Azure

I am hosting both a Web App and a Redis Cache instance on Microsoft Azure. The connection should be pretty straightforward:



var redis = require('redis');
var client = redis.createClient(6379, 'somedomain.redis.cache.windows.net');
client.auth('asecret');


I can connect fine locally, but whenever I connect on Azure I get the following error:



Unaught exception: Error: Redis connection to somedomain.redis.cache.windows.net:6379 failed - connect EADDRNOTAVAIL
at RedisClient.on_error (D:\home\site\wwwroot\node_modules\redis\index.js:196:24)


This problem hasn't occurred in the past and I've deployed to Azure countless times; it seems to have just randomly appeared. What on earth am I missing? I'm tearing my hair out over this problem.
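One thing worth checking (an assumption, since EADDRNOTAVAIL can have several causes): Azure Redis Cache ships with the non-SSL port 6379 disabled by default, so a client that works fine against a local Redis can fail against Azure until either the non-SSL port is enabled in the portal or the client connects over TLS on port 6380. TLS support depends on the redis client version, so treat this sketch as an assumption too:

var redis = require('redis');
var client = redis.createClient(6380, 'somedomain.redis.cache.windows.net', {
  auth_pass: 'asecret',
  // Requires a node_redis version with TLS support
  tls: { servername: 'somedomain.redis.cache.windows.net' }
});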


Error in OpenShift: "phantomjs-node: You don't have 'phantomjs' installed"

I successfully created a script using phantomjs-node locally, and I would like to host it on OpenShift.


The thing is, when I run my script hosted there, I get this strange error:



phantom stderr: execvp(): No such file or directory phantomjs-node: You don't have 'phantomjs' installed



But as you can see, I put the dependencies in the package.json file:



"dependencies": {
  "express": "~3.4.4",
  "phantom": "*",
  "phantomjs": "*"
},


Any suggestions?


Edit:


This is how I initialize the phantomjs script:



var options = {
  port: 16000,
  hostname: "127.2.149.1",
  path: "/phantom_path/"
};
phantom.create(function(ph) {
  visitUrl(ph, 0, 0);
}, options);
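A fix that has worked for others hitting this exact message (a sketch; the binary option and require('phantomjs').path, the executable the npm phantomjs package downloads, should both be verified against the installed versions):

var phantom = require('phantom');
var phantomjs = require('phantomjs'); // the npm package that ships the binary

phantom.create(function (ph) {
  visitUrl(ph, 0, 0);
}, {
  port: 16000,
  hostname: "127.2.149.1",
  path: "/phantom_path/",
  binary: phantomjs.path // point phantomjs-node at the downloaded executable
});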

Why is my stream error handler not being called?

I have the following restify handler:



var assert = require('assert-plus');
var request = require('request');

function postParseVideo(req, res, next) {
  assert.string(req.body.videoSourceUrl, 'videoSourceUrl');

  var stream = request.get({uri: req.body.videoSorceUrl});
  stream.on('response', function(parseResponse) {
    fnThatTakesAReadableStream(parseResponse, function(err, data) {
      if (err) {
        console.log(err);
        next(err);
      } else {
        res.send(201, null, {Location: data.location});
        next();
      }
    });
  });
  stream.on('error', function(err) {
    console.log(err);
    next(err);
  });
};


When I run this, neither of the stream event handlers is ever called. Instead, an error bubbles up to the restify server: {"code":"InternalError","message":"options.uri is a required argument"}. I looked at the source for request v2.54.0, and that error message must be coming from line 406 of request.js, which reads:



return self.emit('error', new Error('options.uri is a required argument'))


I have used streams successfully in a number of other languages, but I'm new to using them in JavaScript. It seems to me like request.get is throwing a synchronous error when it should be emitting an 'error' event. Is there something I'm fundamentally misunderstanding about how streams are implemented in Node?
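A detail worth checking in the handler above (an observation, not from the original post): the assert checks req.body.videoSourceUrl, but the request uses req.body.videoSorceUrl, so uri is undefined. request emits that 'error' while the request object is being constructed, before .on('error', ...) has been attached, which is why neither handler fires and the error surfaces as an unhandled event. A sketch of the one-character fix:

var stream = request.get({uri: req.body.videoSourceUrl}); // Source, not Sorce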


Strange issue: model.find undefined in Mongoose

My model is



user.js




module.exports = function (mongoose) {

  var schema = new mongoose.Schema({
    name: String,
    emailAddress: String,
    lastActivity: { type: Date, default: Date.now },
    isActive: Boolean
  });

  return mongoose.model('User', schema);
}



db helper class db.js




var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/MyDatabase');
var db = mongoose.connection;

console.log('Try to connect to MongoDB via Mongoose ...');

db.on('error', console.error.bind(console, 'Mongoose connection error:'));
db.once('open', function callback() {
  console.log('Connected to MongoDB !');
});

module.exports = {
  User: require('../models/user.js')(mongoose),
}


And I use this in sails js controller



var db = require('../services/db.js');

module.exports = {
  save: function(req, res) {
    var user = db.User({
      name: "govind",
      emailAddress: "govind@govind.com",
      isActive: true
    });
    user.save(function() {
      res.json({"response": "successfully saved sample data"});
    });
  },
  get: function(req, res) {
    console.log(db.User.find); // this prints undefined
    db.User.find(function (err, s) {
      res.json({"response": s});
    });
  }
};


When I call save, the controller works fine, but when I call get, it throws an exception in the console on the line where db.User.find is called. I also logged the find method to the console, and it prints undefined. I am using Mongoose version 4.0.1.
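One common pitfall with helper modules like db.js (a sketch of a defensive pattern, not necessarily the cause here): if the model file ends up being evaluated more than once, mongoose.model('User', schema) can behave differently across calls, so guarding against re-registration keeps the export stable:

module.exports = function (mongoose) {
  var schema = new mongoose.Schema({
    name: String,
    emailAddress: String,
    lastActivity: { type: Date, default: Date.now },
    isActive: Boolean
  });

  // Reuse the already-compiled model if this module is evaluated twice
  return mongoose.models.User || mongoose.model('User', schema);
}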


Failed to load c++ bson extension on Windows

I know this has been asked many times, but I still get this error after installing Mongoose on Windows 8. I don't even see anything bson-related in this path:



node_modules\mongoose\node_modules\mongodb\node_modules

2015-03-31 07:08 PM <DIR> .
2015-03-31 07:08 PM <DIR> ..
2015-03-31 07:08 PM <DIR> mongodb-core
2015-03-31 07:08 PM <DIR> readable-stream


Am I doing the installation wrong?
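For context (an assumption about this setup): the native bson addon is compiled by node-gyp at install time, and on Windows that requires Python 2.7 and the Visual Studio C++ build tools. Without them, npm silently falls back to the pure-JS parser and the "Failed to load c++ bson extension" warning appears, which is harmless for correctness. With the build tools in place, a rebuild retries the compile:

npm rebuild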


Node.js Agenda - jobs do not run after restart

After defining and setting up several jobs to run at intervals, they work fine. But if the process is killed and restarted, the jobs are still in the agenda collection, yet they do not continue running. Do they need to be individually rescheduled after a restart? I have included the graceful() function.
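For reference (a sketch; the database address and job name are assumptions): Agenda persists schedules in Mongo, but job definitions live only in code, so every process start has to re-register each definition before agenda.start() is called. Once the definitions exist again, previously scheduled jobs resume without being rescheduled:

var Agenda = require('agenda');
var agenda = new Agenda({db: {address: 'mongodb://localhost/agenda-db'}});

// Must run on every boot; only the schedule itself is persisted.
agenda.define('send report', function (job, done) {
  // ... job body ...
  done();
});

agenda.on('ready', function () {
  agenda.start(); // picks up jobs already stored in the collection
});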


The memory distribution of Node.js

When I start a new Node.js process and type process.memoryUsage(), it shows:


> process.memoryUsage()
{ rss: 11296768, heapTotal: 7195904, heapUsed: 2964776 }


So Node.js uses about 11 MB of memory, and V8's heap accounts for 7 MB of it.


What else consumes the remaining 11 - 7 = 4 MB? The C++ part of Node.js? libuv? V8 itself?


Are there any methods or tools to see the memory distribution?


PS: I don't need node-heap/node-memwatch to inspect the memory in the V8 heap. They mainly measure the memory used by the JS code itself. I want to know the memory used by Node itself: which parts use the remaining 4 MB, and how much each part uses.
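One way to look outside the V8 heap (a sketch, on Linux): rss covers everything resident, including the node binary's code segments, thread stacks, and off-heap allocations such as Buffers, and the per-mapping breakdown is visible with pmap:

# Start a node process and inspect its address-space mappings
node -e 'setInterval(function () {}, 1000)' &
pmap -x $!        # $! is the PID of the node process just started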


Node.js object properties - find empty elements

I have an object called obj which has 20 properties. Because I'm building XML with the xml2js JSON-to-XML builder, I want to insert only the properties that are NOT empty. I have the following code (the variable result holds SQL query results from the DB, and it sometimes happens that a property in the object is undefined, which I want to avoid).



var obj = [];
for (var i = 0; i < Object.keys(result[0]).length; i++) {
  var value = (result[0])[Object.keys(result[0])[i]];
  obj[i] = value;
}


What would be the fastest solution? Would using try/catch make the app slower?
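No try/catch is needed for this; a simple filter over the keys does it (a sketch over the same result[0] shape):

var row = result[0];
var obj = {};
Object.keys(row).forEach(function (key) {
  // Copy only properties that are neither undefined nor null
  if (row[key] !== undefined && row[key] !== null) {
    obj[key] = row[key];
  }
});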


How to include required files in Istanbul that are not in same directory as test case?

I'm trying to do something simple, but it's not working. I must be doing something dumb.


I am using Istanbul with Mocha for code coverage + unit testing.


The code being tested uses functions from modules that are require'd, and I want those imported modules to be included in the code coverage, but they're not.


I am explicitly including a library via require with a full path to it (it is not in the same directory as the test case):



var d = require(srcroot + '/scripting/wf_daemon/daemon_lib');


And then later, the test case makes a call to a function in that module, startWorkflow:



d.startWorkflow(workflow, function (msg) { // do something })


However, Istanbul does not go into the referenced function startWorkflow; it only gives me coverage for the test file.


What I need is code coverage to extend into all functions from the modules require'd by the test case.


I am calling Istanbul like this:



istanbul cover --include-all-source --dir C:\Build\buildarea --print none "C:\Program Files\nodejs\node_modules\mocha/bin/_mocha" -- --reporter mocha-teamcity-reporter ./test.js


Is there any way to get Istanbul to instrument the files which are not in the directory (or subdirectories) where the test case resides? What simple mistake am I making?


Cheers!
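Two things that may help (assumptions about this particular setup, though the flag names are Istanbul's own): the option is spelled --include-all-sources (plural), and --root controls which directory tree Istanbul instruments, so pointing it at the source root pulls in modules that live outside the test directory:

istanbul cover --root <srcroot> --include-all-sources --dir C:\Build\buildarea --print none "C:\Program Files\nodejs\node_modules\mocha/bin/_mocha" -- --reporter mocha-teamcity-reporter ./test.js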


npm install causes errors like npm ERR! tar.unpack untar error on Debian

Installing Grunt as shown in the Bootstrap documentation, I first installed grunt-cli globally with npm install -g grunt-cli, and now I'm trying to execute npm install, but I'm only getting errors:



root@devvm:/var/www/axit/portfolio/public/bower_components/bootstrap# npm install
npm ERR! tar.unpack untar error /root/.npm/wrappy/1.0.1/package.tgz
npm ERR! tar.unpack untar error /root/.npm/wrappy/1.0.1/package.tgz
npm ERR! tar.unpack untar error /root/.npm/brace-expansion/1.1.0/package.tgz
npm ERR! tar.unpack untar error /root/.npm/delayed-stream/0.0.5/package.tgz
npm WARN optional dep failed, continuing form-data@0.1.4
npm ERR! tar.unpack untar error /root/.npm/is-property/1.0.2/package.tgz
npm WARN optional dep failed, continuing request@2.54.0
npm ERR! Linux 3.2.0-4-amd64
npm ERR! argv "node" "/usr/local/bin/npm" "install"
npm ERR! node v0.10.35
npm ERR! npm v2.7.4
npm ERR! path /var/www/axit/portfolio/public/bower_components/bootstrap/node_modules/grunt-saucelabs/node_modules/sauce-tunnel/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/test/integration/test-delayed-http-upload.js
npm ERR! code EPERM
npm ERR! errno 50

npm ERR! Error: EPERM, open '/var/www/axit/portfolio/public/bower_components/bootstrap/node_modules/grunt-saucelabs/node_modules/sauce-tunnel/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/test/integration/test-delayed-http-upload.js'
npm ERR! { [Error: EPERM, open '/var/www/axit/portfolio/public/bower_components/bootstrap/node_modules/grunt-saucelabs/node_modules/sauce-tunnel/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/test/integration/test-delayed-http-upload.js']
npm ERR! errno: 50,
npm ERR! code: 'EPERM',
npm ERR! path: '/var/www/axit/portfolio/public/bower_components/bootstrap/node_modules/grunt-saucelabs/node_modules/sauce-tunnel/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/test/integration/test-delayed-http-upload.js' }
npm ERR!
npm ERR! Please try running this command again as root/Administrator.
npm ERR! Linux 3.2.0-4-amd64
npm ERR! argv "node" "/usr/local/bin/npm" "install"
npm ERR! node v0.10.35
npm ERR! npm v2.7.4
npm ERR! path npm-debug.log.ba707f2a7a688e388708bbe88e2dd4ed
npm ERR! code ETXTBSY
npm ERR! errno 62

npm ERR! ETXTBSY, rename 'npm-debug.log.ba707f2a7a688e388708bbe88e2dd4ed'
npm ERR!
npm ERR! If you need help, you may report this error at:
npm ERR! <http://ift.tt/1NicLxs;

npm ERR! Please include the following file with any support request:
npm ERR! /var/www/axit/portfolio/public/bower_components/bootstrap/npm-debug.log


I've tried deleting node_modules and rerunning npm install, but it hasn't helped.


The environment: VirtualBox with a Debian guest system and a Windows 7 host system.


I suspected shared-folder permission issues, since there were some problems with that when I was installing Bower. But symlinks are enabled for this shared folder, and I started the VM as admin.


What is causing these errors, and how do I solve this problem?
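For what it's worth, EPERM and ETXTBSY inside a VirtualBox shared folder is a classic vboxsf limitation: the shared-folder filesystem restricts symlinks and some rename/open operations that npm performs. Two common workarounds (assumptions about this setup):

# Avoid the symlinked .bin entries that vboxsf cannot create:
npm install --no-bin-links

# Or run npm install in a directory on the guest's native filesystem
# and point the build at the result, keeping node_modules off the share.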


Getting started with hapi.js

I am looking at making a simple hello world with the hapi.js tutorial.


I have installed hapi:



  1. npm init

  2. npm install hapi --save

  3. I get a large set of folders with files


I tried running node index.js and that gave me errors. So I cd'd into node_modules and got another error when running node. I cd'd again into hapi and again got an error when running node index.js. I added all of the syntax from the tutorial:



var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({
  host: 'localhost',
  port: 8000
});

// Add the route
server.route({
  method: 'GET',
  path: '/hello',
  handler: function (request, reply) {
    reply('hello world');
  }
});

// Start the server
server.start();


I'm not sure where I should be running index.js.
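For reference: the index.js holding the tutorial code belongs in the project root, next to package.json; node_modules is only for installed packages, and nothing should be run from inside it. A sketch of the expected layout:

my-hapi-app/
├── package.json      (created by npm init)
├── node_modules/     (created by npm install hapi --save)
└── index.js          (the tutorial code above)

Then, from my-hapi-app/, run: node index.js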


Node mocha array should contain an element

I want to do a simple assertion of something like



knownArray.should.include('known value')


The array is correct, but I simply can't figure out the proper assertion to check whether the array contains this value (the index doesn't matter). I also tried should.contain, but both of these throw an error that Object #<Assertion> has no method 'contain' (or 'include').


How can I check that an array contains an element using should?
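If the assertion library is should.js (an assumption; chai's should-style API does have include), the matcher for array membership is containEql:

knownArray.should.containEql('known value');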


Efficient Ruby and Node.js communication/IPC

I have a main Node.js API application that needs to generate a PDF file; the only mature PDF generator I've found is Prawn, which is written in Ruby.


I basically need to spawn a Ruby process from Node.js, pass it an arbitrary JSON payload, then listen for contents returned by the Ruby process, and then download it in the browser.




  1. What would be the most efficient method for Node.js to spawn/start a Ruby process? (See the sketch after this list.)




  2. How should Node.js pass information efficiently to the Ruby process? Currently I am using a JSON payload; however, Ruby has to parse it, and I don't believe this is the fastest option. Is there something more efficient I can use to pass information between the two processes?
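A minimal sketch for the spawning side (assumptions: a hypothetical pdf_worker.rb script, and newline-delimited JSON over stdin/stdout): spawning one long-lived Ruby worker via child_process and streaming requests to it avoids paying Ruby's startup cost on every PDF:

var spawn = require('child_process').spawn;

// One long-lived worker; stdin carries requests, stdout carries results
var ruby = spawn('ruby', ['pdf_worker.rb']);

ruby.stdout.on('data', function (chunk) {
  // Collect the generated PDF bytes (or a path/handle the worker reports)
});

ruby.stdin.write(JSON.stringify({ title: 'Invoice #1' }) + '\n');

On the second question: for payloads of this size, JSON parsing is rarely the bottleneck compared to the PDF rendering itself, so measuring before switching to a binary format (e.g. MessagePack) seems prudent.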




At what number of columns should you make another table?

I had a table that nearly reached 20 columns.


I find each one necessary, since it's a "project" table and the project has columns indicating things such as when it was created, when it was updated, its id, who started it, who finished it, some metadata such as keywords, the content of the project itself, a brief description, and some other stuff.


I was happy with my table, but then I browsed some questions through stackoverflow and saw stuff like "your table should never have more than 10 columns" and suggestions that if it was the case, you should split your table into smaller ones.


Following Stack Overflow's advice, I split my table into two tables, but I'm finding it more complicated to manipulate. Now, when each project is created, I have to create two new records, one in each table. I have to handle errors on the creation of either record, which means that if the creation of the record in the second table fails, I have to run yet another query to roll back the creation of the first record. Data retrieval and record deletion have also become more complex, since I now have to do them on two tables.


I'm using the Sails.js framework, and trying to use associations, but I find that it's pretty much the same, I still have to repeat tasks for each table.


Is it really worth it to split your table into smaller ones if it gets that big? Or should I just keep my 20-column table? My project is new and not even online, so I can't measure performance yet. I've never understood associations/joins or databases in general, as in, I've never understood why people use them, so what are the benefits?


Thanks.


How to access the target socket in a Socket.io handler

I'm working with a Socket.io server and I'm trying to access the socket that emitted the event from the client.


There's this example in the docs:



socket.on('private message', function (from, msg) {
  console.log('I received a private message by ', from, ' saying ', msg);
});


Trying to use this, I've noticed that the parameter order has changed in the latest version and the 'from' parameter is actually a function.


I'm having trouble using this to get information about who emitted the event.


Is there another way? Or perhaps a way of using the parameter to get the info?
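For reference, the emitting socket is the one whose handler is running: event handlers are registered per connection, so the socket object in the enclosing connection callback identifies the sender (a sketch):

io.on('connection', function (socket) {
  socket.on('private message', function (msg) {
    // `socket` is the client that emitted this event
    console.log('message from', socket.id, ':', msg);
  });
});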


Dynamically attach an event to a socket (socket.io)

In this example code, the "offer" event transmits some data from one client to another through the server. However, it doesn't work right now.



//server side
var ioSpace = io.of('/channel_name');
ioSpace.on('connection', function(socket) {
  var calleeID = '1e-hBUBhwhUUQo8pAAAC';

  socket.on('offer', function(data) {
    var callee = io.sockets.connected[calleeID];
    callee.nsp.name = '/channel_name';

    callee.once('answer', function(data) {
      //To do something
    });

    callee.emit('offer', data);
  });
});


This data is transmitted to the server from the second client:



42/channel_name,["answer",<some data>"type":"answer"}]


but the 'answer' event doesn't fire.


And this code



callee.nsp.name = '/channel_name';


looks dirty. Does anybody know the best way to do this?


The task could be solved with other flags/if/else logic, but I'm interested in this concrete approach: get a socket with its namespace by id, attach a once event to it, send data, and wait for the answer.


socket.io 1.3.5, node.js 0.12.0.




So, after some code and manual reading, this works:



//server side
var ioSpace = io.of('/channel_name');
ioSpace.on('connection', function(socket) {
  var calleeID = '1e-hBUBhwhUUQo8pAAAC';

  socket.on('offer', function(data) {
    var callee = ioSpace.connected[calleeID];

    callee.once('answer', function(data) {
      //To do something
    });

    callee.emit('offer', data);
  });
});

Possible to populate two levels?

Say I have collections/documents like below:


question collection:



{
  _id: ObjectId("0000"),
  title: "test question",
  survey: ObjectId("1234") // abbreviated for the question; assume this is a legit ObjectId
}




survey collection:



{
  _id: ObjectId("1234"),
  title: "survey title!",
  user: ObjectId("5678")
}




user collection:



{
  _id: ObjectId("5678"),
  Name: "Abe"
}


Is there a way for me to call something like:



questions.findOne({_id: "0000"}).populate("survey.user")


to get the user information along with the question? I understand that I can populate the immediate parents of question, but I'm curious whether it can be done for grandparents.
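Mongoose can do this with a nested populate descriptor (a sketch; populating across collections this way is a Mongoose 4 populate-options feature, so treat the exact shape as version-dependent):

questions.findOne({_id: "0000"})
  .populate({
    path: 'survey',
    populate: { path: 'user' } // populate the survey's user in the same query
  })
  .exec(function (err, question) {
    // question.survey.user is now a full user document
  });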


Is sha1 necessary in cookie-signature on GitHub?

I don't know why sha1 is used at line 42 in http://ift.tt/1Nl5CN8


At line 42 in index.js you can see



return sha1(mac) == sha1(val) ? str : false;


I've tried changing it to



return mac == val ? str : false;


And it seems that the sign and unsign functions still work correctly.


What is the reason to use sha1 here? Is it a kind of security issue? Is sha1 necessary here?
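Context that may help: comparing the digests instead of the raw strings is a defense against timing attacks. A plain == compares character by character and returns early at the first mismatch, so an attacker who can measure response times learns how many leading characters of the expected MAC they have guessed; hashing both sides first makes the compared bytes unpredictable, destroying that signal. The same goal can be met with an explicit constant-time comparison (a sketch):

function constantTimeEqual(a, b) {
  if (a.length !== b.length) return false;
  var mismatch = 0;
  for (var i = 0; i < a.length; i++) {
    // Accumulate differences without ever exiting early
    mismatch |= a.charCodeAt(i) ^ b.charCodeAt(i);
  }
  return mismatch === 0;
}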


MongoDB: which data model should I choose for my project?

I have started learning MongoDB + Node.js. For practice, I want to create a web app which collects data and draws infographics. But I'm having a very big problem choosing a data model for the MongoDB database.


Here is how it works now, though I think it's a bad structure.


My service retrieves all data from the API every hour and collects it in the "accounts" collection. Every hourly experience value goes into a special object, "expStat", under a unique key generated from the request time. Here is the structure of one document from the "accounts" collection:



{
  "_id" : ObjectId("54bd56cb699f4890160aacc9"),
  "name" : "Shubiii",
  "characters" : [
    {
      "online" : false,
      "rank" : 562,
      "dead" : false,
      "name" : "ShrupShurp",
      "league" : "hardcore",
      "level" : 93,
      "class" : "Ranger",
      "experience" : 2515197599,
      "expStat" : {
        "dd2015_1_19_19_9" : 122120,
        "dd2015_1_19_20_11" : 45222
      }
    },
    {
      "online" : false,
      "rank" : 563,
      "dead" : false,
      "name" : "ShrupShurp2",
      "league" : "hardcore",
      "level" : 93,
      "class" : "Ranger",
      "experience" : 2515197599,
      "expStat" : {
        "dd2015_1_19_19_9" : 3122120,
        "dd2015_1_19_20_11" : 21212
      }
    }
  ],
  "challenges" : {
    "total" : 3
  },
  "twitch" : {
    "name" : ""
  }
}


This is the API data structure (http://ift.tt/1mtPFnj):



{
  "total": 15000,
  "entries": [
    {
      "online": false,
      "rank": 2,
      "dead": false,
      "character": {
        "name": "iamgodyi",
        "level": 100,
        "class": "Ranger",
        "experience": 4250334444
      },
      "account": {
        "name": "TW_James",
        "challenges": {
          "total": 0
        },
        "twitch": {
          "name": "destiny601"
        }
      }
    },
    {
      "online": true,
      "rank": 3,
      "dead": false,
      "character": {
        "name": "xVisco",
        "level": 100,
        "class": "Templar",
        "experience": 4250334444
      },
      "account": {
        "name": "xVisco",
        "challenges": {
          "total": 0
        }
      }
    }
  ]
}


Sorry for my English.
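One commonly recommended alternative for data like this (a sketch, not the only valid model): keep the hourly samples in their own time-series collection, one small document per character per hour, instead of growing an embedded expStat object forever. Documents stay a fixed size, there is no need to invent key names from timestamps, and history queries become plain indexed range scans:

{
  "account": "Shubiii",
  "character": "ShrupShurp",
  "ts": ISODate("2015-01-19T19:09:00Z"),
  "experience": 122120
}

// Index supporting per-character history queries:
db.expSamples.createIndex({ character: 1, ts: 1 })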


npm ssh error doesn't replicate from shell

When trying npm install, I get this error:



1206 error node v0.12.0
1207 error npm v2.7.4
1208 error code 128
1209 error Command failed: git clone --template=/root/.npm/_git-remotes/_templates --mirror ssh://git@git.spindle.factfiber.com/schematist-postgres.git /root/.npm/_git-remotes/ssh-git-git-spindle-factfiber-com-schematist-postgres-git-8e4b2071
1209 error ssh: Could not resolve hostname git.spindle.factfiber.com: Name or service not known


However, when I run the very same command from the shell:



git clone --template=/root/.npm/_git-remotes/_templates --mirror ssh://git@git.spindle.factfiber.com/schematist-postgres.git /root/.npm/_git-remotes/ssh-git-git-spindle-factfiber-com-schematist-postgres-git-8e4b2071


It works:



Cloning into bare repository '/root/.npm/_git-remotes/ssh-git-git-spindle-factfiber-com-schematist-postgres-git-8e4b2071'...
remote: Counting objects: 47, done.
remote: Compressing objects: 100% (39/39), done.
remote: Total 47 (delta 20), reused 0 (delta 0)
Receiving objects: 100% (47/47), 13.05 KiB, done.
Resolving deltas: 100% (20/20), done.


I tried this a couple of times in case it was a temporary DNS glitch. Does anyone know what could be happening?


UPDATE: when I create a new repo and



npm install git+ssh://git@git.spindle.factfiber.com:schematist-postgres.git


Then it failed once, and the second time it succeeded. Either way, npm takes a really long time (20-30 seconds?), while shell git succeeds in just a second.


This is a Docker image, by the way. I decided to throw some command-line tools into the image (screen, jove) and rebuild -- before that it was building fine, but still taking a really long time. So I don't think that just including those packages was the problem. Could the problem be some DNS misconfiguration? But if so, why is it npm-specific?


UPDATE -- the Dockerfile. This is being built on Mac OS X Yosemite.



FROM debian:wheezy
RUN DEBIAN_FRONTEND=noninteractive apt-get update
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y curl wget \
openssh-client build-essential unicode-data screen jove

# ------------------------------------------------
# node
RUN curl -sL http://ift.tt/1BMRHvi | bash -
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y nodejs
RUN npm install -g coffee-script && npm -g install npm@latest

# ------------------------------------------------
# git
# setup keys so we can pull from github
# mounted "sw-pull-rsa" contains key

RUN DEBIAN_FRONTEND=noninteractive apt-get install -y git
ADD ./sw-pull-rsa /root/.ssh/id_rsa
ADD ./sw-pull-rsa.pub /root/.ssh/id_rsa.pub
RUN chmod 0600 /root/.ssh/id_rsa
RUN ssh-keyscan github.com >> /root/.ssh/known_hosts
RUN ssh-keyscan git.spindle.factfiber.com >> /root/.ssh/known_hosts

# ------------------------------------------------

RUN npm install -g npm
RUN npm install -g npm-cache

RUN mkdir -p /tmp/common
COPY ./common/package.json /tmp/common/
RUN cd /tmp/common && npm install

Testing promises and sync functions that throw errors

I'm trying to build and test a function at the same time. Testing makes sense and I love it in theory, but when it comes down to it, it's always a pain in the behind.


I have a function that takes a string and throws errors when something goes awry. If all goes well, it returns the original text argument (and therefore a truthy value); if not, the error should be caught by the promise it's either wrapped in or that it itself returns.


This is the test / what I actually want to do (which doesn't work):



var main = require("./index.js")
var Promise = require("bluebird")
var moment = require("moment")
var mocha = require("mocha")
var chai = require("chai")
var chaiPromise = require("chai-as-promised")
chai.use(chaiPromise)

var shouldThrow = [
"random", // invalid non-flag
"--random", // invalid flag
"--random string", //invalid flag
"--wallpaper", // invalid flag w/ match
"--notify", // invalid flag w/ match
"wallpaper", // valid non-flag missing option(s) image
"wallpaper image.jpg", // invalid flag value
"wallpaper http://ift.tt/1G3L9LT", // invalid flag value
"wallpaper //cdn.shopify.com/s/files/1/0031/5352/t/28/assets/favicon.ico?12375621748379006621", // invalid flag value
"wallpaper http://ift.tt/1FecoQO", // invalid flag value
"wallpaper http://ift.tt/1G3L9LV", // invalid flag value
"wallpaper http://ift.tt/1Fecqs1", // invalid flag value
"wallpaper http://ift.tt/1G3L76O --queue", // invalid flag value
"wallpaper http://ift.tt/1G3L76O --queue "+moment().subtract(1, "month").format("YYYY-MM-DD-HH-mm"), // invalid flag value
"wallpaper http://ift.tt/1G3L76O --queue "+moment().add(1, "month").format("YY-MM-DD-HH"), // invalid flag value
"wallpaper --image http://ift.tt/1G3L9LT", // invalid flag value not https
"wallpaper --image //cdn.shopify.com/s/files/1/0031/5352/t/28/assets/favicon.ico?12375621748379006621", // invalid flag no protocol
"wallpaper --image http://ift.tt/1FecoQO", // invalid flag value not https
"wallpaper --image http://ift.tt/1G3L9LV", // invalid flag value not valid image
"wallpaper --image http://ift.tt/1Fecqs1", // invalid flag image not found
"wallpaper --image http://ift.tt/1G3L76O --queue", // invalid subflag queue missing value
"wallpaper --image http://ift.tt/1G3L76O --queue "+moment().subtract(1, "month").format("YYYY-MM-DD-HH-mm"), // invalid subflag queue date value is past
"wallpaper --image http://ift.tt/1G3L76O --queue "+moment().add(1, "month").format("YY-MM-DD-HH"), // invalid subflag queue date value format
"--wallpaper --image http://ift.tt/1G3L76O", //no action non-flag
"--wallpaper --image http://ift.tt/1G3L76O --queue "+moment().add(1, "month").format("YYYY-MM-DD-HH-mm"), //no action non-flag
"notify", // valid non-flag missing option(s) message, open
'notify --message "Hello world"', // valid flag missing params open
'notify --open "https://www.holstee.com"', // valid flag missing params message
'notify --message "Hello world" --open "http://www.holstee.com"', // invalid subflag value `open` should be https
'notify --message "Hello world" --open "https://www.holstee.com" --queue', // invalid subflag queue missing value
'notify --message "Hello world" --open "https://www.holstee.com" --queue '+moment().subtract(1, "month").format("YYYY-MM-DD-HH-mm"), // invalid subflag queue date value is past
'notify --message "Hello world" --open "https://www.holstee.com" --queue '+moment().add(1, "month").format("YY-MM-DD-HH"), // invalid subflag queue date value format
'--notify --message "Hello world" --open "https://www.holstee.com"', //no action non-flag
'--notify --message "Hello world" --open "https://www.holstee.com --queue "'+moment().add(1, "month").format("YYYY-MM-DD-HH-mm"), //no action non-flag
]

var shouldNotThrow = [
'notify --message "Hello world" --open "https://www.holstee.com"',
'notify --message "Hello world" --open "https://www.holstee.com --queue "'+moment().add(1, "month").format("YYYY-MM-DD-HH-mm"),
"wallpaper --image http://ift.tt/1G3L76O",
"wallpaper --image http://ift.tt/1G3L76O --queue "+moment().add(1, "month").format("YYYY-MM-DD-HH-mm"),
]

describe('Process Text', function(){
  return Promise.map(shouldThrow, function(option){
    it('throw error', function(){
      return main.processText(option).should.throw()
    })
  })
  return Promise.map(shouldNotThrow, function(option){
    it('throw error', function(){
      return main.processText(option).should.not.throw()
    })
  })
})


Here's a snapshot of the non-working function I'm trying to test.



main.processText = function(text){
  var args = minimist(text.split(" "))
  var actions = _.keys(actionsFlags)
  var flags = _.chain(_.map(actionsFlags, _.keys)).flatten().uniq().value()
  var extraUnparsed = _.extra(actions, args._)
  var providedFlags = _.chain(args).keys().without("_").value()
  var extraParsed = _.extra(flags, providedFlags)
  var validActions = _.intersection(actions, args._)
  var requiredFlags = _.mapObject(actionsFlags, function(flags){
    return _.filterObject(flags, function(flag){
      return flag
    })
  })
  if(extraUnparsed.length) throw new Error("invalid unparsed argument(s): "+extraUnparsed.join(", "))
  if(extraParsed.length) throw new Error("invalid parsed argument(s): "+extraParsed.join(", "))
  if(validActions.length > 1) throw new Error("too many actions: "+validActions.join(", "))
  if(validActions.length == 0) throw new Error("no action: "+actions.join(", "))
  _.each(actions, function(action){
    var missingFlags = _.missing(_.keys(requiredFlags[action]), providedFlags)
    var extraFlags = _.extra(_.keys(requiredFlags[action]), providedFlags)
    if(_.contains(args._, action)){
      if(missingFlags.length) throw new Error(util.format("missing required flags for %s: %s", action, missingFlags.join(", ")))
      if(extraFlags.length) throw new Error(util.format("extra flags for %s: %s", action, extraFlags.join(", ")))
    }
  })
  return text
}


Note it's not a promise and doesn't return any promises yet. One of the validation features I want is to check whether a URL responds with a 200 status code; that's going to be a request promise. If I update this function, does the whole function body need to be nested within a Promise.resolve(false).then()? Perhaps the promise shouldn't be in this block of code, and all async validation operations should live somewhere else?


I don't know what I'm doing and I'm a little frustrated. I'm of course looking for some silver bullet or whatever that will make sense of all this.


Ideally I could use some help on how to test this kind of function. If I make it into a promise later on I still want all my tests to work.




Here's some example code of what I mean by sync functions and promises.



function syncFunction(value){
  if(!value) throw new Error("missing value")
  return value
}

function asyncFunction(url){
  return requestPromise(url)
}

// Both of these will throw errors the same way; they will be caught
// by the promise, and then you can use `.catch` (in bluebird).

Promise.resolve(false).then(function(){
  return syncFunction()
})

Promise.resolve(false).then(function(){
  return asyncFunction("http://404.com")
})


I want this to reflect the way that I test for errors and whether something should or should not throw an error in my test.




I left the promises out of it; it's a sync function, and I'm testing it like this:



describe('Process Text', function(){
  _.each(shouldThrow, function(option){
    it('throw error ('+option+')', function(){
      expect(function(){
        main.textValidation(option)
      }).to.throw()
    })
  })
  _.each(shouldNotThrow, function(option){
    it('not throw error ('+option+')', function(){
      expect(function(){
        main.textValidation(option)
      }).to.not.throw()
    })
  })
})
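One pattern that keeps these tests stable if the function later becomes async (a sketch using the bluebird and chai-as-promised modules already required above): wrap the call in Promise.try, which turns a synchronous throw into a rejected promise, so the same assertion works whether textValidation throws or rejects:

describe('Process Text', function(){
  _.each(shouldThrow, function(option){
    it('rejects ('+option+')', function(){
      // Promise.try converts a sync throw into a rejection, so this
      // keeps passing if textValidation returns a promise later.
      return Promise.try(function(){
        return main.textValidation(option)
      }).should.be.rejected
    })
  })
})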

MEAN with MySQL

I'm trying to learn AngularJS, and I want to use MySQL instead of MongoDB in my MEAN stack. Coming from a LAMP background, I'm having trouble understanding the organizational/functional structure of the MEAN stack. I have a query result I would like to obtain from a MySQL database and display as a filterable table with user-resizable columns on my web page, below a header and navigation bar. From examples I've seen, this is my current folder structure.



├── app/
│ ├── controllers/
│ ├── models/
│ ├── routes/
│ └── views/
├── config/
│ └── env/
├── mysql.conf
├── package.json
├── public/
│ ├── css/
│ ├── index.html
│ └── js/
└── Server.js


I know this isn't much to work with. In PHP, I basically had all my code in one .php file:

# mySQL connection info
$con = mysqli_connect("$host", "$user", "$pw", "myDatabase");
if (mysqli_connect_errno($con)) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
$sqlSelect = "SELECT * FROM myTable";
$queryResult = mysqli_query($con, "$sqlSelect") or die(mysqli_error($con));
$queryRowNum = mysqli_num_rows($queryResult);



$columnNameArr = [];
$i = 0;
while ($nameField = mysqli_fetch_field($queryResult)) {
    $columnNameArr[$i] = $nameField->name;
    $i++;
}
?>


Then I could use the $queryRowNum variable to print out the rows.



<table>
<?php // populate data rows
foreach ($queryResult as $row) {
    echo "<tr>";
    foreach ($row as $col) {
        echo "<td>$col</td>";
    }
    echo "</tr>";
}
?>
</table>


So the data fetching and HTML output is all done in the same file. How do I make this more modular and more MEAN compliant?
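A minimal Node-side equivalent (a sketch, assuming the mysql npm package; in the structure above the pieces would be split between app/routes and app/controllers): an Express route returns the rows as JSON, and Angular fetches that and renders the table client-side, which replaces the inline PHP echo loop:

var mysql = require('mysql');
var express = require('express');
var app = express();

var pool = mysql.createPool({
  host: 'localhost', user: 'user', password: 'pw', database: 'myDatabase'
});

app.get('/api/rows', function (req, res) {
  pool.query('SELECT * FROM myTable', function (err, rows) {
    if (err) return res.status(500).json({ error: err.message });
    res.json(rows); // Angular consumes this and renders the filterable table
  });
});

app.listen(3000);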


Node base64 encode doesn't give whole string

Good day,


I am having a weird issue with Node. I am encoding a file as Base64, and although it works for most of the PDFs I am encoding, one in particular doesn't output the whole Base64 string.


The actual Base64 string starts like this: "JVBERi0xLjMKJf////8KNiAwIG9i..." but I only get "JVBERi0xLjMK".


Here is my code:



function sendPDF() {
  // Grab the final PDF
  require('fs').readFile(transaction.deliverable, function (err, data) {
    if (err) {
      console.log(err);
      log(2, "Couldn't read: " + transaction.deliverable);
    } else {
      transaction.deliverable = new Buffer(data, 'binary').toString('base64');
      //transaction.deliverable = data.toString('base64');
      console.log(transaction.deliverable);
    }
  });
}


The commented out line was another attempt. The transaction structure is:



function Transaction(snapshot) {
  var data = snapshot.val();
  this.tid = snapshot.key();
  this.request = data.request;
  this.pages = [];
  this.fileCount = 0;
  this.deliverable = null;
  this.fileName = "";
}


This transaction simply stores some information that I pull from Firebase; the important variable, .deliverable, is a string holding the path of the PDF I need to encode and send.


I don't get any read errors when this happens, and the next transaction goes through this code block just fine, giving a full base64 string.


I was curious whether my toString() was truncating the Base64 string, but then I figured I would have had larger problems earlier.


Any ideas? I can put this on hold and move on with my work but I would love to fix this.. Thank you.
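One observation that may narrow this down (not from the original post): "JVBERi0xLjMK" is exactly the Base64 encoding of "%PDF-1.3\n", the first line of a PDF, which suggests the data was already short before encoding rather than being cut by toString. A quick sanity check (a sketch against the same transaction shape):

require('fs').readFile(transaction.deliverable, function (err, data) {
  if (err) return console.log(err);
  console.log('bytes read:', data.length);        // compare with the file's size on disk
  var b64 = data.toString('base64');
  console.log('base64 length:', b64.length);      // should be roughly 4/3 of the byte count
});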


npm forever equivalent in Rails?

With Node, I can use forever to keep a process running indefinitely. What's the most popular equivalent in Rails?


I'm trying to avoid any overly complex or expensive setup right now so I'm really looking for a simple, cheap solution.


Multi-schema XSDs with node-soap

This may be along the same lines as this: node-soap multiple import schemas. But I couldn't find the fix there.


I have a WSDL which refers to one XSD, and this XSD refers to another one:


Parent.wsdl --> a.xsd --> b.xsd


node-soap is not able to create a namespace for b.xsd, and as a result the SOAP request packet has no entry for the types from b.xsd.


Please let me know how I can address this.


I have tried node-soap with a single-XSD scenario and it works perfectly fine.


Node: Cannot pipe PDF response

Trying to write a test (Mocha) to check that the PDF returned from my API endpoint holds the correct data and looks like it should. The PDF is generated on the server. It returns correctly when I hit the endpoint manually, but I would like to write some tests. I uploaded an example 'correct' PDF to my test suite, which I am able to parse with HummusJS to pull out the necessary aspects for comparison.


I would like to make a request to my endpoint (with superagent) and then pipe the response (a PDF) into a temp PDF. Then I'll parse both PDFs (the uploaded perfect one and the one returned from my endpoint) and make sure they match.


My code for the request:



it('Should look like the proposed pdf', function (done) {

  request(app)
    .get(url) // var that sets the path earlier
    .expect(200)
    .end(function (err, res) {
      if (err) return done(err);

      var writeStream = fs.createWriteStream('./test/materials/tmp.pdf');
      writeStream.pipe(res); // ERROR: can't pipe
      writeStream.on('end', function () {
        console.log('complete');
        done();
      })
    });

});


When I run my tests I get: Uncaught Error: Cannot Pipe. Not readable. I am very new to Node, so I am not sure what is causing the error. When I log res to the console, I get a large jumble of binary-encoded data, so maybe that is the issue? I've tried a couple of things: using the Hummus pdfWriter, and trying to decode with



new Buffer(res, 'base64')



etc... but still no luck. I believe I have all the necessary packages installed for these operations and it seems to be a piping/decoding/superagent issue. Thanks for the help!
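Two observations on the snippet above (not from the original post): pipe flows from a readable into a writable, so writeStream.pipe(res) is backwards, and a write stream isn't readable, which is exactly what the error says; also, write streams emit 'finish' rather than 'end'. A sketch piping the response before .end() consumes it (an assumption about superagent/supertest: piping bypasses the .expect assertions, so status checks would be done separately):

it('Should look like the proposed pdf', function (done) {
  var writeStream = fs.createWriteStream('./test/materials/tmp.pdf');
  request(app)
    .get(url)
    .pipe(writeStream)          // response (readable) into the file (writable)
    .on('finish', done);        // 'finish' fires once all data is flushed to disk
});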


CouchbaseError: Client-Side timeout exceeded for operation

I am running Couchbase Server 3.0.1 with the Node.js SDK. I have set up a server with two buckets and no documents. I opened a connection via:


var dbCluster = new couchbase.Cluster([my_host]);
var db = dbCluster.openBucket([my_bucket_name]);


When I try to execute db.upsert, I get the following error:


Error: CouchbaseError: Client-Side timeout exceeded for operation. Inspect network conditions or increase the timeout


Now the strangest part is that the document still gets added. So the command executes successfully, but still returns the error. Also, the call executes very fast, definitely not taking 2.5 seconds (the default operation timeout).


Following this: Couchbase times out after a few seconds, I tried increasing the timeout via:


db.operationTimeout = 5000;


This did not work. Again the call executed successfully, but a timeout error was returned even though the specified timeout had not passed (the call was very fast, much faster than 5 seconds).


The only other mention of this I could find is in this Google Groups thread, but it seems dead: http://ift.tt/1EzGlgP


I am running OSX Yosemite 10.10.2. Any help would be really appreciated, as I do not know where to look next.


Thank you,


Here is the code for reference:



var couchbase = require('couchbase');

var dbCluster = new couchbase.Cluster("mylocalhostname");
var db = dbCluster.openBucket("mybucketname", function(err) {
  // no errors here
  if (err) {
    console.log("Can't open bucket");
    throw err;
  } else {
    console.log("Successfully opened bucket");
  }
});

db.operationTimeout = 5000;

db.upsert("docname", someJSON, function(err, result) {

  if (err) {
    // get an error here, even though the call successfully executes
    return console.log("Error: " + err);
  }
});

HTTPS using a .pfx file


var https = require('https');
var fs = require('fs');
var atob = require('atob');

var options = {
pfx: atob(fs.readFileSync('server.pfx').toString())
};

https.createServer(options, function(req, res) {
res.writeHead(200);
res.end("Hello world!\n");
}).listen(1337);


I'm pretty sure I didn't screw up the certificate or anything (though I'm a newbie at SSL)… I copied the code pretty much from the official HTTPS module docs. I got this:



crypto.js:176
c.context.loadPKCS12(pfx);
^
Error: wrong tag
at Object.exports.createCredentials (crypto.js:176:17)
at Server (tls.js:1127:28)
at new Server (https.js:35:14)
at Object.exports.createServer (https.js:54:10)
at Object.<anonymous> (/Users/brian/Desktop/DevDoodle/https.js:9:7)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)


I assume the problem has to do with crypto.createCredentials, which is deprecated. ಠ_ಠ


Am I using the HTTPS module correctly? What can I do to make SSL work?
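For comparison, a minimal sketch that hands the pfx option the raw file contents - the option expects the binary PKCS#12 data as a Buffer, so no base64/atob decoding step should be needed (the "wrong tag" error is typical of mangled input):


var https = require('https');
var fs = require('fs');

var options = {
// readFileSync with no encoding returns a Buffer, which is what pfx expects;
// a passphrase property would be added here if the file is password-protected
pfx: fs.readFileSync('server.pfx')
};

https.createServer(options, function (req, res) {
res.writeHead(200);
res.end("Hello world!\n");
}).listen(1337);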


socket io not found - 404

I ran into this error when following a tutorial about creating a chat service. Here is the code (most of it is automatically generated):



var port = normalizePort(process.env.PORT || '3000');
app.set('port', port);

/**
* Create HTTP server.
*/

var server = http.createServer(app);

/**
* Listen on provided port, on all network interfaces.
*/

var io = require('socket.io').listen(app.listen(port));
io.sockets.on('connection', function (socket) {
socket.emit('message', { message: 'welcome to the chat' });
socket.on('send', function (data) {
io.sockets.emit('message', data);
});
});


Client side:



script(src='/javascripts/chat.js')
script(src='/http://ift.tt/1aeIZU4')


Do I actually have to put socket.io.js into the folder manually? I don't get it...
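For reference, a minimal sketch of the usual wiring in socket.io 1.x: attach socket.io to the HTTP server you already create, and it serves its own client script at /socket.io/socket.io.js - no manual copying into a public folder (the client script tag would then point at that path):


var server = http.createServer(app);

// attaching to the server makes socket.io serve /socket.io/socket.io.js itself
var io = require('socket.io')(server);

io.sockets.on('connection', function (socket) {
socket.emit('message', { message: 'welcome to the chat' });
socket.on('send', function (data) {
io.sockets.emit('message', data);
});
});

server.listen(port);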


How to configure jshint in a meanjs project with socket.io?

I followed exactly that tutorial (obviously replacing the vexxhost domain name with localhost:3000 for my tests).


And when calling grunt there is an error ('io' is not defined), but the server starts without any other complaint.


If I have understood the problem correctly, jshint scans the whole project to validate the code and finds a reference to an undefined variable! But 'io' is defined when the whole application starts (because the scripts are loaded). In fact, that error is more a warning than an error.


If I am right (and I hope people here will correct me if I am not), that brings us to my question: how can I refactor the code or configure jshint to avoid that warning?


If possible I would prefer to have an explicit reference to 'io'.


Thanks in advance, everyone.
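One way to address exactly this, assuming a standard .jshintrc at the project root: declare io as a known global so JSHint stops flagging it (false marks it read-only); the per-file alternative is a /* global io */ directive at the top of the offending script:


{
"globals": {
"io": false
}
}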


Kiip web sdk integration showing bad request error in console

I just integrated the KIIP web SDK into my Node.js application and it works correctly. The problem is that it shows an error in the browser console on each page refresh; the error does not affect the SDK, which works perfectly.


The error message shown is:


'POST http://ift.tt/1ad1XTE 400 (Bad Request)'.


My kiip code integration is as follows,


1) Included the script file in the head tag


2) Declared the app key as a global variable:


kiip_app_key='app-key from kiip site';


3) Initialized the Kiip instance and invoked the method:



var kiipInstance = new Kiip(kiip_app_key);
kiipInstance.setTestMode();
kiipInstance.postMoment('received offer');

NodeJS Variable Scope

I'm very, very new to the whole NodeJS stack, and I'm trying to rough out a simple login system for practice.


Jumping to my question,


app.js



...
var mongoose = require( 'mongoose' );
var templates = require( './data/inc.js' ); // includes schema structures
...




user.js - included in inc.js



...
module.exports =
{
"Schema" : new exports.mongoose.Schema({
"uid": mongoose.Schema.Types.ObjectId,
"username": { type:String, unique:true },
"alias": String,
"credentials":
{
"salt": String,
"password": String,
"key": String
},
"profile":
{
"age": { type: Number, min: 18 }
},
"last_login": Date,
"updated": { type: Date, default: Date.now }
})
}
...


The 'user.js' script above will not work because it doesn't have access to the mongoose object instantiated in 'app.js'. In PHP, any included/required script can access variables from the parent script, but in NodeJS, as far as I know, I have to re-require the mongoose variable in order to create my schema tree.


user.js



...
var mongoose = require( 'mongoose' ); // must include in this script to use the mongoose object

module.exports =
{
...
}
...


Is there any work-around that will allow me the same scope access as PHP?
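A minimal sketch of the idiomatic workaround: require() caches modules by resolved path, so requiring mongoose again in user.js returns the very same instance (and connection) that app.js uses - there is no duplication, just an explicit import in each file that needs it:


// user.js - same cached mongoose instance as in app.js
var mongoose = require( 'mongoose' );

module.exports =
{
"Schema": new mongoose.Schema({
"username": { type: String, unique: true }
// ...remaining fields exactly as in the schema above
})
};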


Sails Hook route forbidden

I'm trying to make an installable hook that does some security verification, so in my hook.js I set:



routes : {
before : {
"/" : function (req, res, view)
{
.....
res.forbidden();
}
}
},


And I get this error when I try to send the forbidden page to the user:



error: Sending 500 ("Server Error") response:
TypeError: Object #<ServerResponse> has no method 'view'
at Object.forbidden (/Users/jaumard/Documents/workspaceIDE/KikiLib/api/responses/forbidden.js:56:19)
at ServerResponse.bound [as forbidden] (/usr/local/lib/node_modules/sails/node_modules/lodash/dist/lodash.js:729:21)
at isRouteAllowed (/Users/jaumard/Documents/workspaceIDE/KikiLib/api/hooks/acl.js:66:18)
at routeTargetFnWrapper (/usr/local/lib/node_modules/sails/lib/router/bind.js:179:5)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:164:37)
at param (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:138:11)
at pass (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:145:5)
at nextRoute (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:100:7)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:167:11)
at /usr/local/lib/node_modules/sails/lib/router/bind.js:187:7
at /usr/local/lib/node_modules/sails/lib/hooks/i18n/index.js:35:7
at Object.i18nInit [as init] (/usr/local/lib/node_modules/sails/node_modules/i18n/i18n.js:89:5)
at addLocalizationMethod (/usr/local/lib/node_modules/sails/lib/hooks/i18n/index.js:33:11)
at routeTargetFnWrapper (/usr/local/lib/node_modules/sails/lib/router/bind.js:179:5)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:164:37)
at param (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:138:11) [TypeError: Object #<ServerResponse> has no method 'view']
error: Server Error:
error: TypeError: Object #<ServerResponse> has no method 'view'
at Object.forbidden (/Users/jaumard/Documents/workspaceIDE/KikiLib/api/responses/forbidden.js:56:19)
at ServerResponse.bound [as forbidden] (/usr/local/lib/node_modules/sails/node_modules/lodash/dist/lodash.js:729:21)
at isRouteAllowed (/Users/jaumard/Documents/workspaceIDE/KikiLib/api/hooks/acl.js:66:18)
at routeTargetFnWrapper (/usr/local/lib/node_modules/sails/lib/router/bind.js:179:5)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:164:37)
at param (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:138:11)
at pass (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:145:5)
at nextRoute (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:100:7)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:167:11)
at /usr/local/lib/node_modules/sails/lib/router/bind.js:187:7
at /usr/local/lib/node_modules/sails/lib/hooks/i18n/index.js:35:7
at Object.i18nInit [as init] (/usr/local/lib/node_modules/sails/node_modules/i18n/i18n.js:89:5)
at addLocalizationMethod (/usr/local/lib/node_modules/sails/lib/hooks/i18n/index.js:33:11)
at routeTargetFnWrapper (/usr/local/lib/node_modules/sails/lib/router/bind.js:179:5)
at callbacks (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:164:37)
at param (/usr/local/lib/node_modules/sails/node_modules/express/lib/router/index.js:138:11) [TypeError: Object #<ServerResponse> has no method 'view']


I'm on Sails 0.11.0. I don't use policies because policies apply only to controller actions, and I want ACL for views, controllers and more...


All source code available here : http://ift.tt/19zr8P8


For now I use res.status(403).send("<h1>" + req.__("Forbidden") + "</h1>"); instead of res.forbidden(), but it would be nice to send a view.


How to set up code coverage and unit tests for express functions?

I have a route defined like


app.post '/v1/media', authentication.hasValidApiKey, multipart, mediaController.create, mediaController.get


I want to write tests for the individual components of the route. So starting with authentication.hasValidApiKey, that's a function defined in another file:



exports.hasTokenOrApi = (req, res, next) ->
if not req.headers.authorization
return res.status(403).end()

doOtherStuff...


In my test, I have:



authentication = require '../src/middlewares/authentication'

describe 'Authentication Middleware', ->
before (done) ->
done()

it 'should check for authentication', (done) ->
mock_req = null
mock_res = null
mock_next = null

authentication.hasTokenOrApi mock_res, mock_req, mock_next
done()


How do I deal with req, res and next? And how can I set up code coverage? I am running my tests with: export NODE_ENV=test && ./node_modules/.bin/mocha --compilers coffee:'./node_modules/coffee-script/lib/coffee-script/register'
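For the req/res/next part, a minimal sketch in plain JavaScript (everything here is illustrative): stub only the members the middleware actually touches, then assert on how they were used. Note the argument order is (req, res, next):


var mock_req = { headers: {} }; // no authorization header present
var mock_res = {
status: function (code) {
this.statusCode = code; // record what the middleware sent
return this; // allow the chained .end()
},
end: function () { this.ended = true; }
};
var mock_next = function () { mock_next.called = true; };

authentication.hasTokenOrApi(mock_req, mock_res, mock_next);
// afterwards: expect mock_res.statusCode to be 403 and mock_next.called to be falsy


For coverage, one common option from that era is istanbul's mocha integration, e.g. ./node_modules/.bin/istanbul cover _mocha -- --compilers coffee:'./node_modules/coffee-script/lib/coffee-script/register'.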


TypeScript / NodeJS: variable not defined, using internal ref paths for separate files

I have been testing TypeScript with Node; all was going well until I tried splitting things out into separate files.


Am I forced to use modules?


I have 2 files: app.ts, which has a reference path to hellofile.ts,



/// <reference path="hellofile.ts" />
var testme = new Hello()
console.log(testme.testMe());


and hellofile.ts, which contains



class Hello {
testMe():string {
return "hello";
}
}


Now, running the program (I am using WebStorm), I get the following error:




/usr/local/bin/node app.js

/Users/tst/WebstormProjects/NodeJsWithTypescript/app.js:2
var testme = new Hello();
^
ReferenceError: Hello is not defined
at Object.<anonymous> (/Users/tst/WebstormProjects/NodeJsWithTypescript/app.js:2:18)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:906:3

Process finished with exit code 8


Any help or ideas really appreciated


thanks
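For reference, a likely explanation: a /// <reference> path only informs the compiler; at runtime node loads the compiled app.js alone, so Hello is never defined unless the two files are either concatenated (e.g. compiling with tsc --out) or turned into external modules. A minimal external-module sketch in TypeScript 1.x syntax, compiled with --module commonjs:


// hellofile.ts
class Hello {
testMe(): string {
return "hello";
}
}
export = Hello;

// app.ts
import Hello = require('./hellofile');
var testme = new Hello();
console.log(testme.testMe());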


lundi 30 mars 2015

Glob on mean.js v4.0 does not get module name "resume"

Using the meanjs.org stack to create a portfolio site.


Created a vertical module named "resume"; apparently it changes to "resumes" for whatever reason.


Decided to be picky and renamed everything back to "resume". Routing doesn't work. I checked the globs; it looks like the module was missed entirely, while every other module in the same folder hierarchy was picked up.


Changed the modules/resume folder name to random gibberish, e.g. "jskdhfkjsdhfkjsd": it is detected. Changed back to "resume": missing.


Changed to "resumé" and it works.


My config/default.js is this:



server: {
allJS: ['gruntfile.js', 'server.js', 'config/**/*.js', 'modules/*/server/**/*.js'],
models: 'modules/*/server/models/**/*.js',
routes: ['modules/*[!core]/server/routes/**/*.js','modules/core/server/routes/**/*.js'],
sockets: 'modules/*/server/sockets/**/*.js',
config: 'modules/*/server/config/*.js',
policies: 'modules/*/server/policies/*.js',
views: 'modules/*/server/views/*.html'
}


So it really should pick up the folder if it's named "resume". The only thing I can think of is that something is preventing "resume" from being used, e.g. if it's a reserved keyword somewhere.


node.js version v0.10.35.


package.json dependencies:



"dependencies": {
"express": "~4.10.0",
"express-session": "~1.9.1",
"serve-favicon": "~2.1.6",
"body-parser": "~1.9.0",
"cookie-parser": "~1.3.2",
"compression": "~1.2.0",
"method-override": "~2.3.0",
"morgan": "~1.4.1",
"multer": "0.1.6",
"connect-mongo": "~0.4.1",
"connect-flash": "~0.1.1",
"helmet": "~0.4.0",
"consolidate": "~0.10.0",
"swig": "~1.4.1",
"mongoose": "~3.8.8",
"passport": "~0.2.0",
"passport-local": "~1.0.0",
"passport-facebook": "~1.0.2",
"passport-twitter": "~1.0.2",
"passport-linkedin": "~0.1.3",
"passport-google-oauth": "~0.1.5",
"passport-github": "~0.1.5",
"acl": "~0.4.4",
"socket.io": "~1.1.0",
"lodash": "~2.4.1",
"forever": "~0.11.0",
"bower": "~1.3.8",
"grunt-cli": "~0.1.13",
"chalk": "~0.5.1",
"glob": "~4.0.5",
"async": "~0.9.0",
"nodemailer": "~1.3.0"
},
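For what it's worth, the routes glob in the config above would explain this exactly: in minimatch syntax, *[!core] is not a name exclusion but a wildcard followed by a character class - it matches any folder name whose last character is not c, o, r or e. "resume" ends in "e", so it never matches; "resumé" ends in "é", so it does. A sketch of a corrected pattern, assuming the glob version in use supports minimatch-style extglobs:


routes: ['modules/!(core)/server/routes/**/*.js',
'modules/core/server/routes/**/*.js'],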

How to handle callbacks in a for loop (Node.JS)

I am trying to write code with NodeJS where I grab data from an external API and then populate it in MongoDB using Mongoose. Along the way, I check whether that particular record already exists in Mongo. Below is my code.



router.route('/report') // the REST api address
.post(function(req, res) // calling a POST
{
console.log('calling report API');
var object = "report/" + reportID; // related to the API
var parameters = '&limit=100' // related to the API
var url = link + object + apiKey + parameters; // related to the API

var data = "";
https.get(url, function callback(response)
{
response.setEncoding("utf8");
response.on("data", function(chunk)
{
data += chunk.toString() + "";
});

response.on("end", function()
{
var jsonData = JSON.parse(data);
var array = jsonData['results']; // data is return in array of objects. accessing only a particular array
var length = array.length;
console.log(length);

for (var i = 0; i < length; i++)
{
var report = new Report(array.pop()); // Report is the schema model defined.
console.log('^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^');
console.log(i);
console.log('*****************************');
console.log(report);
console.log('*****************************');
// console.log(report['id']);

/*report.save(function(err)
{
if(err)
res.send(err);
});*/

Report.find({id:report['id']}).count(function(err, count) // checks if the id of that specific data already exists in Mongo
{
console.log(count);
console.log('*****************************');
if (count == 0) // if the count = 0, meaning not exist, then only save
{
report.save(function(err)
{
console.log('saved');
if(err)
res.send(err);
});
}
});
};
res.json({
message: 'Grabbed Report'
});
});
response.on("error", console.error);
});
})


My problem is that since NodeJS callbacks run asynchronously, they are not called sequentially. My end result is something like this:



  1. Calling report API

  2. console.log(length) = 100

  3. ^^^^^^^^^^^^^^^^^^^^^^^^

  4. console.log(i) = starts with 0

  5. *******************************

  6. console.log(report) = the data which will be stored inside Mongo

  7. *******************************

  8. number 3 - 7 repeats 100 times as the length equals 100

  9. console.log(count) = either 0 or 1

  10. number 9 repeats 100 times

  11. console.log('saved')

  12. number 11 repeats 100 times

  13. Lastly, only the last of the 100 records is stored in Mongo


What I need is some technique or method to make these callbacks execute one after the other, sequentially following the loop. I am pretty sure this is the problem, as my other REST APIs are all working.


I have looked into async methods, promises, recursive functions and a couple of others, none of which I could really understand well enough to solve this problem. I really hope someone can shed some light on this matter.


Feel free also to correct me if I made any mistakes in the way I'm asking the question. This is my first question posted on StackOverflow.
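A minimal sketch of one common approach, using the async module's eachSeries (assumed installed from npm): it would replace the for loop inside response.on("end", ...), processing one item at a time and calling res.json only once everything has been saved:


var async = require('async');

async.eachSeries(array, function (item, next) {
var report = new Report(item);
Report.find({ id: report.id }).count(function (err, count) {
if (err) return next(err);
if (count > 0) return next(); // already in Mongo, skip it
report.save(next); // save, then move on to the next item
});
}, function (err) {
if (err) return res.send(err);
res.json({ message: 'Grabbed Report' }); // runs after every item is done
});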


SLC Command not found

I'm using Mac OS X 10.10.2.

node -v gives v0.12.1 and npm -v gives 2.5.1. Installing the package worked only with sudo, even after I fixed the permission rights via:

$ sudo chown -R $USER /usr/local/bin
$ sudo chown -R $USER /usr/local/lib/node_modules

Without sudo it throws EACCES errors: http://ift.tt/19u54p1


/usr/local/bin/npm/node_modules/strongloop/bin/slc exists. Any thoughts?


node.js website deployed to Azure - 404 failed to load a .json resource

I have a node.js Web App in Azure,


the site loads the index.html, the css, images, etc., but the JS search functionality doesn't initialize. I did an F12 inspection in Chrome and saw this error:



[domain]http://.azurewebsites.net/data/policies.json Failed to load resource: the server responded with a status of 404 (Not Found)


in the Azure console I can see the file list



> cd public
D:\home\site\wwwroot\public

> cd data
D:\home\site\wwwroot\public\data

> ls
D:\home\site\wwwroot\public\data

policies.json


according to the folder/file structure (everything is in the /public folder) I have made a configuration change as follows


/ = "site/webroot/public"


the folders are laid out like this



/public/index.html
/public/data
/public/js
/public/css


etc


Without the config setting the website doesn't see /public as the root folder, so it doesn't find the index.html and nothing loads.


So the site loads, which is great; the images and css load, which is great;


but it says it can't find the .json file in the data folder? (and using the console the file is definitely there!)


please advise.
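One frequent cause, offered as a hedged guess: Azure Web Apps serve static files through IIS, and IIS returns 404 for file extensions with no registered MIME type - .json is not registered by default, even though the file exists on disk. A sketch of a web.config addition (placed in the site root) that registers it:


<configuration>
<system.webServer>
<staticContent>
<mimeMap fileExtension=".json" mimeType="application/json" />
</staticContent>
</system.webServer>
</configuration>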


How do I install, and connect to mongodb in node.js on my remote server (ubuntu)?

I have spent at least 3 hours failing to connect to mongodb on my server. I have managed to install it, create my database, and create a new user in the console:



use admin


then



db.createUser(
{
user: "admin",
pwd: "xxx",
roles: [ { role: "userAdminAnyDatabase", db: "admin" } ]
}
)


And all seems good. Then, to connect as that user in app.js, I do:



var databaseUrl = "mongodb://admin:xxx@108.61.221.63:27018/mydatabasename";
var db = require("mongodb").connect(databaseUrl);


At this point, when I run the server (npm start), this is what I get:



{ [Error: Cannot find module '../build/Release/bson'] code: 'MODULE_NOT_FOUND' }
js-bson: Failed to load c++ bson extension, using pure JS version
{ [Error: Cannot find module '../build/Release/bson'] code: 'MODULE_NOT_FOUND' }
js-bson: Failed to load c++ bson extension, using pure JS version
{ [Error: Cannot find module '../build/Release/bson'] code: 'MODULE_NOT_FOUND' }
js-bson: Failed to load c++ bson extension, using pure JS version
{ [Error: Cannot find module '../build/Release/bson'] code: 'MODULE_NOT_FOUND' }
js-bson: Failed to load c++ bson extension, using pure JS version
/home/adminftp/public_html/globular/node_modules/mongodb/lib/mongo_client.js:92
throw new Error("no callback function provided");
^
Error: no callback function provided
at Function.MongoClient.connect (/home/adminftp/public_html/globular/node_modules/mongodb/lib/mongo_client.js:92:11)
at Object.<anonymous> (/home/adminftp/public_html/globular/app.js:31:29)
at Module._compile (module.js:462:26)
at Object.Module._extensions..js (module.js:480:10)
at Module.load (module.js:357:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:503:10)
at startup (node.js:132:16)
at node.js:817:3


...I have no idea what to make of it.


If anyone has any ideas how I can resolve it, I would appreciate it. Thank you!
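For what it's worth, two separate things appear in that output: the js-bson lines are only warnings (the driver falls back to a pure-JS BSON parser when the C++ addon wasn't built), while the thrown error is the driver insisting on a callback - connect is asynchronous. A minimal sketch using MongoClient:


var MongoClient = require('mongodb').MongoClient;

// If the user was created in the admin database, appending
// ?authSource=admin to the URL may also be needed (an assumption
// about this setup, not something visible in the question).
var databaseUrl = "mongodb://admin:xxx@108.61.221.63:27018/mydatabasename";

MongoClient.connect(databaseUrl, function (err, db) {
if (err) throw err; // auth and network failures surface here
console.log('Connected to', db.databaseName);
// hand db to the rest of the app from inside this callback
});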


node scheduler / mongo monitoring tool

I have a Node scheduler that scrapes data and inserts it into Mongo. Sometimes the source that I scrape from dies, sometimes the server dies, etc.


I am monitoring the server with NewRelic.


Is there a way I can monitor the source, OR a package that can monitor whether the scheduler is inserting records, OR a way to monitor whether a db in Mongo is growing, sending an alert if something is wrong?


Any help is appreciated!
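If no package turns up, a minimal self-check sketch (assumptions: the mongodb driver is available, sendAlert is a hypothetical notification helper, and the URL and collection name are placeholders): periodically count documents and alert when the collection stops growing:


var MongoClient = require('mongodb').MongoClient;
var lastCount = null;

setInterval(function () {
MongoClient.connect('mongodb://localhost/scraper', function (err, db) {
if (err) return sendAlert('mongo unreachable: ' + err.message);
db.collection('records').count(function (err, count) {
if (err) return sendAlert('count failed: ' + err.message);
if (lastCount !== null && count <= lastCount) {
sendAlert('collection is not growing'); // scheduler may be stuck
}
lastCount = count;
db.close();
});
});
}, 60 * 60 * 1000); // check hourly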


Protractor and Fitnesse

Is it possible to run Protractor with FitNesse, i.e. specify the test case in FitNesse and have it call Protractor in the fixturing?


I want to be able to run Protractor like Selenium Grid with remote webdriver, where we have a node and hub setup, and instantiate a session by defining capabilities for a new remote webdriver instance. Is this sort of thing possible if you want to use FitNesse instead of node.js?


Thanks, Rahul


Node.js Heroku Facebook Developer

I have managed to successfully implement a 'log in using Facebook' button into my website/Heroku app. It works fine when used locally, with the Site URL and App Domain set to localhost:8080 within the app settings on the Facebook developer page.


I have now tried to push the app to Heroku to go live. I have changed the Site URL and App Domain to myapp.herokuapp.com, and I have set my heroku config using: heroku config:set FACEBOOK_APP_ID=133333333463066 \ FACEBOOK_SECRET=a7244e333333333a7a2bf9492a6089a0, but when I attempt to use the button I receive this response:



Given URL is not permitted by the Application configuration: One or more of the given URLs is not permitted by the App's settings. It must match the Website URL or Canvas URL, or the domain must be a subdomain of one of the App's domains.



How to collaborate with multiple npm publishers on one module

How can we organize our file system and processes if there are multiple publishers on an npm module? Do we need a common repository (e.g. Git), or is there a smart way to use npm's own publishing & updating process?


The main issue I can't get my head around is that the initial publisher of the package is not able to get the latest version from within the package itself, is he? Unless he installs it as a dependency of another package and then updates & publishes from within that dependency.
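For the publishing side specifically, npm has multi-maintainer support built in; a quick sketch of the relevant commands (package and user names are placeholders):


npm owner add alice some-package # grant alice the right to publish
npm owner ls some-package # list current maintainers
npm install some-package@latest # anyone, including the original publisher, pulls the latest published version


Coordinating the source itself is still what a shared repository (e.g. Git) is for - npm only distributes published snapshots, it is not a collaboration workflow.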


Saving Images of Detected Faces using AR Drone

I am writing code for the Parrot AR Drone such that when a face is detected through the drone's front camera, an image of the face is saved to the controller. It is similar to the Copterface code, but instead of following the face, an image of it is saved. I originally wrote the code in C++ and it worked; now I am writing the drone code in node.js/javascript, but I can't get it to work. Note that I used the Copterface code as a basis for mine. Can anyone help me?



// Run this to receive a png image stream from your drone.

var arDrone = require('ar-drone');
var cv = require('opencv');
var http = require('http');
var fs = require('fs');

console.log('Connecting png stream ...');

//var stream = arDrone.createUdpNavdataStream();
var client = arDrone.createClient();
var pngStream = client.getPngStream();
var processingImage = false;
var lastPng;
var navData;
var flying = false;
var startTime = new Date().getTime();
var log = function(s){
var time = ( ( new Date().getTime() - startTime ) / 1000 ).toFixed(2);

console.log(time+" \t"+s);
}

pngStream
.on('error', console.log)
.on('data', function(pngBuffer) {
//console.log("got image");
lastPng = pngBuffer;
});

// Detect faces
face_cascade.detectMultiScale(frame_gray, faces, 1.1, 2, 0 | CASCADE_SCALE_IMAGE, Size(30, 30));

// Set Region of Interest
var roi_b;
var roi_c;

var ic = 0; // ic is index of current element
var ac = 0; // ac is area of current element

var ib = 0; // ib is index of biggest element
var ab = 0; // ab is area of biggest element

for (ic = 0; ic < faces.size(); ic++) // Iterate through all current elements (detected faces)

{
var roi_cqx = faces[ic].x;
var roi_cqy = faces[ic].y;
var roi_cqwidth = (faces[ic].width);
var roi_cqheight = (faces[ic].height);

var ac = roi_cqwidth * roi_cqheight; // Get the area of current element (detected face)

var roi_bqx = faces[ib].x;
var roi_bqy = faces[ib].y;
var roi_bqwidth = (faces[ib].width);
var roi_bqheight = (faces[ib].height);


var crop = frame(roi_b);
resize(crop, res, Size(128, 128), 0, 0, INTER_LINEAR); // This will be needed later while saving images
cvtColor(crop, gray, CV_BGR2GRAY); // Convert cropped image to Grayscale

// Form a filename
filename = "/home/iram4/Desktop/Faces";
var ssfn;
ssfn << filename.c_str() << filenumber << ".jpg";
filename = ssfn.str();
imwrite(filename, res); filenumber++;

}

var faceInterval = setInterval( detectFaces, 150);

client.takeoff();
client.after(5000,function(){
log("going up");
this.up(1);
}).after(1000,function(){
log("stopping");
this.stop();
flying = true;
});


client.after(60000, function() {
flying = false;
this.stop();
this.land();
});

client.on('navdata', function(navdata) {
navData = navdata;
})


var server = http.createServer(function(req, res) {
if (!lastPng) {
res.writeHead(503);
res.end('Did not receive any png data yet.');
return;
}

res.writeHead(200, {'Content-Type': 'image/png'});
res.end(lastPng);
});

server.listen(8080, function() {
console.log('Serving latest png on port 8080 ...');
});
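The detection block above is still mostly C++ (detectMultiScale, Size(...), the ssfn stream), and detectFaces itself is never defined, so the setInterval has nothing to call. A minimal sketch of that function against node-opencv's JavaScript API - readImage, detectObject and cv.FACE_CASCADE are the module's real entry points, while the roi cropping, the output path and the guard flags are assumptions for illustration:


var filenumber = 0;

function detectFaces() {
if (!lastPng || processingImage) return; // no frame yet, or still busy
processingImage = true;

cv.readImage(lastPng, function (err, im) {
if (err) { processingImage = false; return log(err); }

im.detectObject(cv.FACE_CASCADE, {}, function (err, faces) {
if (!err && faces && faces.length) {
faces.forEach(function (face) {
// crop the detected region out of the frame and save it
var crop = im.roi(face.x, face.y, face.width, face.height);
crop.save('/home/iram4/Desktop/Faces/face' + (filenumber++) + '.jpg');
});
}
processingImage = false;
});
});
}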