I am trying to read an XML/JSON document from MarkLogic using Node.js. I have downloaded the Node.js API code base from the following URL.
It works fine for small documents of around 500 KB, but our requirement is to read large documents of 2 MB or 10 MB.
We have two cases, described below:
Case 1: When I read a document using the MarkLogic Node.js API, I expect to receive more than one chunk, but I get only a single chunk as the response, so this does not work for large documents.
var marklogic = require('marklogic');

// Connection details are placeholders; substitute your own values.
var db = marklogic.createDatabaseClient({
  host: '<Host server>',
  port: 8007,
  user: '<Username>',
  password: '<password>',
  authType: 'DIGEST'
});

var chunks = 0;
var length = 0;
db.documents.read('test.xml').stream('chunked')
  .on('data', function(chunk) {
    // Each chunk arrives as a Buffer; count it and accumulate its size.
    console.log(chunk);
    console.log(chunk.length);
    chunks++;
    length += chunk.length;
  })
  .on('error', function(error) {
    console.log(JSON.stringify(error));
  })
  .on('end', function() {
    console.log('read ' + chunks + ' chunks of ' + length + ' length');
    console.log('done');
  });
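For contrast, the same read can be done without streaming through the client's promise-style result() call, which buffers the whole response before delivering it. This is a minimal sketch; it reuses the db client above and assumes the document fits in memory:

// Non-streaming read for comparison: result() delivers the complete document.
db.documents.read('test.xml').result(
  function(documents) {
    // For an XML document, documents[0].content is the full body as a string.
    console.log('read ' + documents[0].content.length + ' characters in one call');
  },
  function(error) {
    console.log(JSON.stringify(error));
  }
);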
Case 2: When I read the same large document (2 MB or 10 MB) using the "http-digest-client" package, it works fine and I get the complete XML as the response.
var digest = require('http-digest-client')('<Username>', '<password>');

digest.request({
  host: '<Host server>',
  path: '/v1/documents?uri=test.xml',
  port: 8007,
  method: 'GET'
}, function (res) {
  // 'reply' is presumably the surrounding framework's response callback;
  // it forwards the HTTP response to the caller.
  reply(res);
});
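To make the comparison with Case 1 explicit, the same request can count the raw HTTP chunks instead of forwarding the response. This is a minimal sketch using Node's standard readable-stream events on the response object:

// Count raw HTTP chunks from the REST endpoint for comparison with Case 1.
var rawChunks = 0;
var rawLength = 0;
digest.request({
  host: '<Host server>',
  path: '/v1/documents?uri=test.xml',
  port: 8007,
  method: 'GET'
}, function (res) {
  res.on('data', function (chunk) {
    rawChunks++;
    rawLength += chunk.length;
  });
  res.on('end', function () {
    console.log('raw HTTP: ' + rawChunks + ' chunks, ' + rawLength + ' bytes');
  });
});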
I have also tested this with a large document using the example below (please refer to the URL below), but I got the same response described in Case 1 above.
As per my requirement, I would like to read the large document using the MarkLogic Node.js API (Case 1).
- How can I read a large document using the MarkLogic Node.js API?
- Is there any option to increase the pool memory size or any other memory setting?
- Is this issue related to memory size?