Exporting large data in CSV for users

I am trying to build a system where users can export data in CSV format. Formatting the data as CSV is no problem; the problem is processing the request for the CSV export. Suppose a user requests an export of all their data and it is huge. It would not be the best approach to ask the user to wait until the request is complete, right? Should I tell users that the export is in progress and notify them once it is complete? If so, I should use a background process for it, right?

If possible, define some kind of configurable setting, which might default to 10 MB. If you can tell the export will be more than 10 MB, ask the user whether they really want to export such a large rowset; if you can estimate the size, let them know how large it will be. If you show them a progress bar while a background export is running, you will need some form of inter-process communication to pass progress back to the user's form. You could also add an option on the form to change that warning threshold to a number closer to the one the user wants.
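The size check suggested above could look something like this (a minimal sketch; the helper name and the 10 MB default are illustrative assumptions, not part of any framework):

```javascript
// Rough size estimate for the warning described above: multiply the expected
// row count by an average serialized row size and compare against a
// configurable threshold. All numbers here are illustrative.
var EXPORT_WARN_BYTES = 10 * 1024 * 1024; // default warning level: 10 MB

function shouldWarnBeforeExport(rowCount, avgRowBytes) {
    return rowCount * avgRowBytes > EXPORT_WARN_BYTES;
}

console.log(shouldWarnBeforeExport(1000, 200));   // false (~0.2 MB)
console.log(shouldWarnBeforeExport(200000, 200)); // true  (~40 MB)
```

The average row size can come from sampling a few rows or from table statistics; the point is only to decide whether to prompt before starting the export.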

I finally went with streaming the data to users in CSV format using Node.js.
function csvExport(req, res) {
    mysqlPool.getConnection(function(err, connection) {
        if (err) {
            // Don't throw inside an async callback; report the failure instead
            return res.status(500).send('Could not get a database connection');
        }
        res.header('Content-disposition', 'attachment; filename=connects.csv');
        res.header('Content-Type', 'text/csv');
        var csv_header_row = "Email,First Name,Last Name,Status,Created\n";
        res.write(csv_header_row);
        var query = connection.query('SELECT * FROM contacts WHERE user_id = ? AND deleted = 0', [req.params.user_id]);
        query
            .on('error', function(err) {
                // Release the connection and end the response so the client is not left hanging
                connection.release();
                res.end();
            })
            .on('result', function(row) {
                var csv_row = row.email + ',' + row.first_name + ',' + row.last_name + ',' + row.status + ',' + row.created + "\n";
                res.write(csv_row);
            })
            .on('end', function() {
                connection.release();
                res.end(function() {
                    console.log('done');
                });
            });
    });
}
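One caveat with building rows by plain string concatenation as above: a field containing a comma, quote, or newline will corrupt the CSV. A small escaping helper (a generic sketch, not part of any library) handles that:

```javascript
// Quote a single CSV field per RFC 4180: wrap in double quotes when the
// value contains a comma, quote, or newline, and double any embedded quotes.
function escapeCsvField(value) {
    var s = String(value == null ? '' : value);
    if (/[",\n]/.test(s)) {
        return '"' + s.replace(/"/g, '""') + '"';
    }
    return s;
}

// Build one CSV row from an array of field values.
function toCsvRow(fields) {
    return fields.map(escapeCsvField).join(',') + '\n';
}

console.log(toCsvRow(['a', 'b,c'])); // a,"b,c"
```

Each row then becomes `toCsvRow([row.email, row.first_name, ...])` instead of manual concatenation.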

Get undefined variable trying to update the path to an uploaded file within MongoDB using angular-file-upload and MEAN.js

If someone could help me I would be eternally grateful. I have been slamming my head against a brick wall for weeks trying to get images to upload the way it is demonstrated out of the box with the MEAN.js users module. In the generated users module the file is uploaded into a directory and the path to that file is stored in a field in the mongodb document. I can get the file to upload to where it needs to go using multer and the fileupload function. However, I cannot save the path to the field within the document. I cannot figure out how to avoid getting an 'undefined' variable. I've tried creating a $window service and passing data to it as a global variable and a bunch of other things and I'm totally stuck.
I have commented the code below to demonstrate what is going awry in my server controller changeShoePicture function.
// This is the boilerplate code from the mean.js "users" module.
// I cannot create a $window service or global variable to store the
// shoe data below so that I can update the shoe.shoeImageURL field
// in MongoDB with the path to the successfully uploaded file.
exports.changeShoePicture = function (req, res) {
    var message = null;
    var shoe = req.shoe;
    var upload = multer(config.uploads.shoeUpload).single('newProfilePicture');
    var profileUploadFileFilter = require(path.resolve('./config/lib/multer')).profileUploadFileFilter;
    console.log('i am here', shoe); // shoe is defined here.

    // Filtering to upload only images. This works and proceeds to the else condition!
    upload.fileFilter = profileUploadFileFilter;

    upload(req, res, function (uploadError) {
        if (uploadError) {
            return res.status(400).send({
                message: 'Error occurred while uploading profile picture'
            });
        } else {
            // The shoe image file is successfully uploaded to the location on the server.
            // However, the following fails because the shoe variable is undefined.
            shoe.shoeImageURL = config.uploads.shoeUpload.dest + req.file.filename;
        }
    });
};
To make sure I've got this right: the upload function is being called with the parameters passed by your route, req and res, and you set the shoe var from req.shoe.
What are the chances that upload() is messing with your req?
Drop a console.log(req) in right after you call upload and report back.

Loki.js loses data in Ionic app

I am developing an Ionic app using Loki.js, but every time I refresh the app (press F5) I lose all the data stored in the Loki database.
Why does this happen?
I am using no cache in my Ionic app.
It could be that when you press F5 the data saved in memory has not been written to the target JSON file yet.
You can try setting an explicit save interval when you instantiate Loki:
var _db = new Loki('./database/db.json', {
    autoload: true,
    autosave: true,
    autosaveInterval: 5000 // 5 secs
});

function add(newPatient) {
    return $q(function(resolve, reject) {
        try {
            var _patientsColl = GetCollection(dbCollections.PATIENTS);
            if (!_patientsColl) {
                _patientsColl = _db.addCollection(dbCollections.PATIENTS, { indices: ['firstname', 'lastname'] });
            }
            _patientsColl.insert(newPatient);
            console.log('Collection data: ', _patientsColl.data);
            resolve();
        }
        catch (err) {
            reject(err);
        }
    });
}

function GetCollection(collectionName) {
    return _db.getCollection(collectionName);
}
With "autosaveInterval" the data in memory will be written to the JSON file every 5 seconds (you can adjust this value as you prefer).
EDIT
I added the code I use to save a new document into my collection, and even with a page refresh the data is stored correctly. I use "autosave" among the db settings; maybe you can set it as well, in case there is a code path that is not correctly caught when you explicitly trigger saving.
I put the file in the /Users/xxx directory, and then, magically, the program ran normally.

Azure Mobile Services Node.js update column field count during read query

I would like to update a column in a specific row in Azure Mobile Services using server-side code (Node.js).
The idea is that column A (which stores a number) will increase its count by 1 every time a user runs a read query from my mobile apps.
Please, how can I accomplish that from the read script in Azure Mobile Services?
Thanks in advance,
Check out the examples in the online reference. In the table Read script for the table you're tracking you will need to do something like this. It's not clear whether you're tracking in the same table the user is reading, or in a separate counts table, but the flow is the same.
Note that if you really want to track this you should log read requests to another table and tally them after the fact, or use an external analytics system (Google Analytics, Flurry, MixPanel, Azure Mobile Engagement, etc.). This way of updating a single count field in a record will not be accurate if multiple phones read from the table at the same time -- they will both read the same value x from the tracking table, increment it, and update the record with the same value x+1.
function read(query, user, request) {
    var myTable = tables.getTable('counting');
    myTable.where({
        tableName: 'verses'
    }).read({
        success: updateCount
    });

    function updateCount(results) {
        if (results.length > 0) {
            // tracking record was found. update and continue normal execution.
            var trackingRecord = results[0];
            trackingRecord.count = trackingRecord.count + 1;
            myTable.update(trackingRecord, { success: function () {
                request.execute();
            }});
        } else {
            console.log('error updating count');
            request.respond(500, 'unable to update read count');
        }
    }
};
Hope this helps.
Edit: fixed function signature and table names above, adding another example below
If you want to track which verses were read (if your app can request one at a time) you need to do the "counting" request and update after the "verses" request, because the script doesn't tell you up front which verse records the user requested.
function read(query, user, request) {
    request.execute({ success: function(verseResults) {
        request.respond();
        if (verseResults.length === 1) {
            var countTable = tables.getTable('counting');
            countTable.where({
                verseId: verseResults[0].id
            }).read({
                success: updateCount
            });

            function updateCount(results) {
                if (results.length > 0) {
                    // tracking record was found. update and continue normal execution.
                    var trackingRecord = results[0];
                    trackingRecord.count = trackingRecord.count + 1;
                    countTable.update(trackingRecord);
                } else {
                    console.log('error updating count');
                }
            }
        }
    }});
};
Another note: make sure your counting table has an index on the column you're selecting by (tableName in the first example, verseId in the second).
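The lost-update problem described above can be simulated in a few lines of plain JavaScript; the store object and helpers here are toy stand-ins, not Azure Mobile Services APIs:

```javascript
// Toy simulation of the lost update: two "phones" both read the current
// count, increment locally, then write back. The second write clobbers the
// first, so one increment disappears.
var store = { count: 5 };

function readModifyWrite(s) {
    var snapshot = s.count;       // both clients read the same value...
    return function commit() {
        s.count = snapshot + 1;   // ...and both write back snapshot + 1
    };
}

var commitA = readModifyWrite(store); // phone A reads 5
var commitB = readModifyWrite(store); // phone B reads 5 before A commits
commitA();                            // count becomes 6
commitB();                            // count becomes 6 again, not 7
console.log(store.count);             // 6
```

This is why tallying a log table after the fact (or an atomic increment at the database level) is more reliable than read-modify-write in the script.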

Upload Data to Meteor / Mongo DB

I have a Meteor app and would like to upload data (from csv) to a meteor collection.
I have found:
solutions (e.g. Collectionfs) which deal with file uploads
methods for uploading directly to the underlying mongo db from the shell
references to meteor router - but I am using the excellent iron-router, which does not appear to provide this functionality
My requirement is that the app user be able to upload csv data to the app from within the app. I do not need to store the csv file anywhere within the app file structure, I just need to read the csv data to the collection.
It is possible that I cannot figure out how to do this because my terms of reference ('upload data to meteor') are ambiguous or incorrect. Or that I am an idiot.
ChristianF's answer is spot on and I have accepted it as the correct answer. However, it provides even more than I need at this stage, so I am including here the code I have actually used - which is largely taken from Christian's answer and other elements I have found as a result:
HTML UPLOAD BUTTON (I am not including drag and drop at this stage)
<template name="upload">
<input type="file" id="files" name="files[]" multiple />
<output id="list"></output>
</template>
JAVASCRIPT
Template.upload.events({
    "change #files": function (e) {
        var files = e.target.files || e.dataTransfer.files;
        for (var i = 0, file; file = files[i]; i++) {
            if (file.type.indexOf("text") == 0) {
                var reader = new FileReader();
                reader.onloadend = function (e) {
                    var text = e.target.result;
                    console.log(text);
                    var all = $.csv.toObjects(text);
                    console.log(all);
                    _.each(all, function (entry) {
                        Members.insert(entry);
                    });
                };
                reader.readAsText(file);
            }
        }
    }
});
NB there is a jquery-csv library for Meteor here: https://github.com/donskifarrell/meteor-jquery-csv
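For reference, the core of what $.csv.toObjects does can be sketched without any dependency (simple unquoted CSV only; no quoting, escaping, or \r\n handling):

```javascript
// Minimal CSV-to-objects parser: first line is the header row, every other
// line becomes an object keyed by those headers. A sketch of the idea only;
// use a real CSV library for quoted or escaped fields.
function csvToObjects(text) {
    var lines = text.trim().split('\n');
    var headers = lines[0].split(',');
    return lines.slice(1).map(function (line) {
        var cells = line.split(',');
        var obj = {};
        headers.forEach(function (h, i) { obj[h] = cells[i]; });
        return obj;
    });
}

var rows = csvToObjects('email,name\nbob@example.com,Bob');
console.log(rows[0].name); // 'Bob'
```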
I've solved this problem in the past using this gist of mine, together with this code (using the jquery-csv plugin to parse the csv data). This is done on the client side and is independent of using iron-router or not. It would be fairly straightforward to move the insertion code into a Meteor method, uploading the csv file first and then parsing and inserting the data on the server. I've tried that, too, but didn't see any performance improvement.
$(document).ready(function() {
    var dd = new dragAndDrop({
        onComplete: function(files) {
            for (var i = 0; i < files.length; i++) {
                var f = files[i];
                // Only process csv files.
                if (!f.type.match('text/csv')) {
                    continue;
                }
                var reader = new FileReader();
                reader.onloadend = function(event) {
                    var all = $.csv.toObjects(event.target.result);
                    // do something with file content
                    _.each(all, function(entry) {
                        Items.insert(entry);
                    });
                };
                reader.readAsText(f);
            }
        }
    });
    dd.add('upload-div'); // add to an existing div, turning it into a drop container
});
Beware though that if you are inserting a lot of entries, then you are better off turning all reactive rerendering off for a while, until all of them are inserted. Otherwise, both node on the server and the browser tab will get really slow. See my suggested solution here: Meteor's subscription and sync are slow

How to stream MongoDB Query Results with nodejs?

I have been searching for an example of how to stream the result of a MongoDB query to a Node.js client. All the solutions I have found so far seem to read the query result at once and then send it back to the client.
Instead, I would (obviously) like to supply a callback to the query method and have MongoDB call that when the next chunk of the result set is available.
I have been looking at mongoose - should I probably use a different driver?
Jan
node-mongodb-native (the underlying layer that every MongoDB client uses in Node.js), besides the cursor API that others mentioned, has a nice stream API (#458). Unfortunately I did not find it documented elsewhere.
Update: there are docs now.
It can be used like this:
var stream = collection.find().stream();
stream.on('error', function (err) {
    console.error(err);
});
stream.on('data', function (doc) {
    console.log(doc);
});
It actually implements the ReadableStream interface, so it has all the goodies (pause/resume etc)
Streaming in Mongoose became available in version 2.4.0, which appeared three months after you posted this question:
Model.where('created').gte(twoWeeksAgo).stream().pipe(writeStream);
More elaborated examples can be found on their documentation page.
mongoose is not really a "driver"; it's actually an ORM wrapper around the MongoDB driver (node-mongodb-native).
To do what you're doing, take a look at the driver's .find and .each methods. Here's some code from the examples:
// Find all records. find() returns a cursor
collection.find(function(err, cursor) {
    sys.puts("Printing docs from Cursor Each");
    cursor.each(function(err, doc) {
        if (doc != null) sys.puts("Doc from Each " + sys.inspect(doc));
    });
});
To stream the results, you're basically replacing that sys.puts with your "stream" function. Not sure how you plan to stream the results. I think you can do response.write() + response.flush(), but you may also want to check out socket.io.
Here is the solution I found (please correct me, anyone, if that is the wrong way to do it).
(Also excuse the bad coding; it's too late for me now to prettify this.)
var sys = require('sys');
var http = require("http");
var Db = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Db,
    Connection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Connection,
    Collection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Collection,
    Server = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Server;

var db = new Db('test', new Server('localhost', Connection.DEFAULT_PORT, {}));
var products;
db.open(function (error, client) {
    if (error) throw error;
    products = new Collection(client, 'products');
});

function ProductReader(collection) {
    this.collection = collection;
}

ProductReader.prototype = new process.EventEmitter();

ProductReader.prototype.do = function() {
    var self = this;
    this.collection.find(function(err, cursor) {
        if (err) {
            self.emit('e1');
            return;
        }
        sys.puts("Printing docs from Cursor Each");
        self.emit('start');
        cursor.each(function(err, doc) {
            if (err) { // emit an error event if the cursor fails mid-iteration
                self.emit('e2');
                self.emit('end');
                return;
            }
            if (doc != null) {
                sys.puts("doc:" + doc.name);
                self.emit('doc', doc);
            } else {
                self.emit('end');
            }
        });
    });
};

http.createServer(function(req, res) {
    var pr = new ProductReader(products);
    pr.on('e1', function() {
        sys.puts("E1");
        res.writeHead(400, {"Content-Type": "text/plain"});
        res.write("e1 occurred\n");
        res.end();
    });
    pr.on('e2', function() {
        sys.puts("E2");
        res.write("ERROR\n");
    });
    pr.on('start', function() {
        sys.puts("START");
        res.writeHead(200, {"Content-Type": "text/plain"});
        res.write("<products>\n");
    });
    pr.on('doc', function(doc) {
        sys.puts("A DOCUMENT" + doc.name);
        res.write("<product><name>" + doc.name + "</name></product>\n");
    });
    pr.on('end', function() {
        sys.puts("END");
        res.write("</products>");
        res.end();
    });
    pr.do();
}).listen(8000);
I have been studying MongoDB streams myself; while I do not have the entire answer you are looking for, I do have part of it.
You can set up a socket.io stream.
This uses JavaScript with socket.io and socket.io-streaming, available on NPM, and MongoDB for the database, because using a 40-year-old database that has issues is incorrect and it's time to modernize (also, the 40-year-old DB is SQL, and SQL doesn't do streams to my knowledge).
So although you only asked about data going from server to client, I also want to cover client to server in my answer, because I can never find that anywhere when I search; I wanted to set up one place with both the send and receive elements via streams so everyone can get the hang of it quickly.
Client side, sending data to the server via streaming:
stream = ss.createStream();
blobstream = ss.createBlobReadStream(data);
blobstream.pipe(stream);
ss(socket).emit('data.stream', stream, {}, function(err, successful_db_insert_id) {
    // if you get back the id, it went into the db and everything worked
});
Server side, receiving the stream from the client and replying when done:
ss(socket).on('data.stream', function(stream, o, c) { // event name must match the client's emit
    var buffer = [];
    stream.on('data', function(chunk) { buffer.push(chunk); });
    stream.on('end', function() {
        buffer = Buffer.concat(buffer);
        db.insert(buffer, function(err, res) {
            // ack the client with the insert result (exact shape depends on your driver)
            c(err, res);
        });
    });
});
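The chunk accumulation pattern used above, in isolation: push each chunk into an array and join them with a single Buffer.concat at the end, which is cheaper than concatenating buffers per chunk:

```javascript
// Collect incoming chunks, then join them once at the end.
var chunks = [];
['hel', 'lo ', 'world'].forEach(function (piece) {
    chunks.push(Buffer.from(piece)); // each "chunk" arrives as a Buffer
});
var whole = Buffer.concat(chunks);
console.log(whole.toString()); // 'hello world'
```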
//This is the other half of that the fetching of data and streaming to the client
Client side, requesting and receiving stream data from the server:
stream = ss.createStream();
binarystring = '';
stream.on('data', function(chunk) {
    for (var i = 0; i < chunk.length; i++) {
        binarystring += String.fromCharCode(chunk[i]);
    }
});
stream.on('end', function() { data = window.btoa(binarystring); c(null, data); });
ss(socket).emit('data.stream.get', stream, o, c);
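As an aside, the character-by-character base64 conversion above is the browser approach (window.btoa does not exist in Node); server side, Buffer can do the same in one step:

```javascript
// Node-side equivalent of building a base64 string from raw bytes.
var chunk = Buffer.from([72, 105]); // the bytes for "Hi"
var asBase64 = chunk.toString('base64');
console.log(asBase64); // 'SGk='
```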
Server side, replying to the request for streaming data:
ss(socket).on('data.stream.get', function(stream, o, c) {
    stream.on('end', function() {
        c(null, true);
    });
    db.find().stream().pipe(stream);
});
The very last one there is the only one I am kind of just guessing at, because I have not yet tried it, but it should work. I actually do something similar, but I write the file to the hard drive and then use fs.createReadStream to stream it to the client. So I'm not 100% sure, but from what I've read it should be fine; I'll get back to you once I test it.
P.S. If anyone wants to bug me about my colloquial way of talking: I'm Canadian, and I love saying "eh". Come at me with your hugs and hits, bros/sis' :D