How to avoid crashing the browser? - Facebook

I need a way to get the friend IDs of a user.
I wrote this code, but my browser crashes (maybe because I have 2500 friends):
var query = FB.Data.query('SELECT uid2 FROM friend WHERE uid1 = me()');
query.wait(function(rows) {
  var i;
  for (i = 0; i < rows.length; i++) {
    document.getElementById('friends').innerHTML += i + ') ' + rows[i].uid2 + '<br />';
  }
});
Is there a less CPU-consuming approach?

At the VERY least, build all your HTML in a variable and then pass it to the DOM in one .innerHTML assignment. Right now you're forcing the page to be reprocessed twice for each of your 2500 friends, because the browser needs to serialize its internal representation of the page when innerHTML is read by +=, and then parse it all again when the result is written back.
query.wait(function(rows) {
  var i;
  var html = "";
  for (i = 0; i < rows.length; i++) {
    html += i + ') ' + rows[i].uid2 + '<br />';
  }
  document.getElementById('friends').innerHTML = html;
});
You can also use other approaches, like storing the generated strings in an array and then joining them together and assigning the result, but that is just optimizing around JS's immutable strings and the garbage collector; some JS engines may eventually do this better than you would by hand. Doing the innerHTML assignment once, however, is almost guaranteed to be a huge speed increase, because innerHTML += inside a loop literally means "regenerate everything, apply a little bit, and encode everything back again" on every iteration, and there is hardly any way to optimize that automatically at the junction of the JS and HTML engines.
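For reference, a minimal sketch of the array-join variant mentioned above, using the same query shape as the question:
query.wait(function(rows) {
  // Collect the fragments in an array...
  var parts = [];
  for (var i = 0; i < rows.length; i++) {
    parts.push(i + ') ' + rows[i].uid2);
  }
  // ...then join and touch the DOM exactly once.
  document.getElementById('friends').innerHTML = parts.join('<br />');
});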

Related

Properly attaching pictures to an email with google scripts

I'm trying to make a script with Google Forms and Sheets to help automate and track our technicians' pictures from the jobsite.
The setup is that they take pictures of the jobsite and fill out a Google Form with the information, attaching the pictures there. When the form gets submitted, it runs this script to send an email to a predetermined address that everyone in the office can see.
So far I am able to get the email to send the information from the form, except for the pictures.
The information for the attached pictures comes in as Drive URLs that are all dumped into one cell as a string:
"https://drive.google.com/open?id=xxxxxxxx, https://drive.google.com/open?id=yyyyyyyy, https://drive.google.com/open?id=zzzzzzzz"
I convert this string to an array using .split(", "), which outputs this:
[https://drive.google.com/open?id=xxxxxxxx, https://drive.google.com/open?id=yyyyyyyy, https://drive.google.com/open?id=zzzzzzzz]
I then iterate through the array and use .slice(33) to get rid of the URL prefix so that all I'm left with is the Drive ID (there is probably a better way of doing this, but it works for now):
[xxxxxxxx, yyyyyyyy, zzzzzzzz]
This is the part where I'm having trouble.
I then iterate again through that array, grab the Drive ID, and get the file as a JPEG.
I then use .push to put it into another array that I'm using to attach them to the email.
The issue is that I think I'm not doing this step properly: I may not be pushing the correct thing into the array, and/or I'm wrongly assuming that MailApp.sendEmail can even take an array for attachments.
I'm also not entirely sure how Blobs work and how to use them properly, and that's probably where I'm getting stuck.
Again, this code is made with very little experience and could probably be optimized further, but at the moment I just need to have it attach the pictures properly to show that it works.
function onFormSubmit(e) {
  // for testing purposes
  var values = e.namedValues;
  // gets the form's values
  var pureValues = e.values;
  // sets the values
  var email = pureValues[1];
  var woNum = pureValues[2];
  var firstN = pureValues[3];
  var lastN = pureValues[4];
  var desc = pureValues[5];
  var superDuperRawPics = pureValues[6];
  // splits the picture urls into an array
  var superRawPics = superDuperRawPics.split(", ");
  // slices the url part off to get the drive ID
  var i, rawPics = [];
  for (i = 0; i < superRawPics.length; ++i) {
    rawPics.push(superRawPics[i].slice(33));
  }
  // takes the array of IDs and gets the drive file
  var j, picAttach = [];
  for (j = 0; j < rawPics.length; ++j) {
    var driveID = DriveApp.getFileById(rawPics[j]);
    var drivePic = driveID.getAs(MimeType.JPEG);
    picAttach.push(drivePic);
  }
  // sets the subject of the email to be Jobsite Pictures and the work number
  var subject = "Jobsite Pictures" + " " + woNum;
  // sets the body of the email
  var body = "Technician: " + email + " \n" +
             "WO#: " + woNum + " \n" +
             "Customer: " + firstN + " " + lastN + " \n" +
             "Description: " + desc;
  // for checking if the vars are set correctly
  Logger.log(superDuperRawPics);
  Logger.log(superRawPics);
  Logger.log(rawPics);
  Logger.log(picAttach);
  Logger.log(subject);
  Logger.log(body);
  // sends email to me with the new info
  MailApp.sendEmail('example@domain.com', subject, body, {attachments: [picAttach]});
}
If you just want to attach them, then use the attachments option.
I was being dumb and added brackets around the attachments value when it didn't need them.
The correct way is this:
MailApp.sendEmail('example@domain.com', subject, body, {attachments: picAttach});
With this change the script sends emails with the pictures attached.
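As an aside, a hypothetical, more robust alternative to the fixed-offset .slice(33) is to pull the id query parameter out of each Drive URL with a regex, so the extraction keeps working even if the URL prefix changes length:
// Sketch (assumes URLs of the form https://drive.google.com/open?id=...)
var superRawPics = superDuperRawPics.split(", ");
var picAttach = [];
for (var i = 0; i < superRawPics.length; i++) {
  var match = superRawPics[i].match(/[?&]id=([^&]+)/);
  if (match) {
    // match[1] is the Drive file ID.
    picAttach.push(DriveApp.getFileById(match[1]).getAs(MimeType.JPEG));
  }
}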

Select a random node on every reload

I want to select a random node on every reload. My Fusion file looks like this:
randomInt = ${Math.randomInt(0, q(node).children(Neos.Neos:Document).count())}
randomNode = ${q(node).children(Neos.Neos:Document).get(this.randomInt)}
Unfortunately the result is stored in the cache. That means a new node is only returned after the cache gets flushed. How can I prevent this? I have already experimented with the cache rules a little bit, but I haven't come up with a solution yet.
The element that I want to use is on every page. That's why something like the uncached mode would be a really bad idea.
In my situation the output is only an array of strings, so I did the following in my Fusion.
Generate "almost" an array in Fusion:
allMyStrings = Neos.Fusion:Loop {
  items = ${q(node).children(Neos.Neos:Document).get()}
  itemName = 'node'
  itemRenderer = ${"'" + q(node).property('testString') + "'"}
  @glue = ','
}
Pick a random element from the array in JS:
<p id='replaceMe'></p>
<script>
  var quoteArray = [{allMyStrings -> f:format.raw()}];
  var randomIndex = Math.floor(Math.random() * quoteArray.length);
  var randomElement = quoteArray[randomIndex];
  document.getElementById('replaceMe').outerHTML = '<p>' + randomElement + '</p>';
</script>
A bit hacky, but it works and it doesn't harm the performance of the website.

Where can I find the emit() function implementation used in MongoDB's map/reduce?

I am trying to develop a deeper understanding of map/reduce in MongoDB.
I figure the best way to accomplish this is to look at emit's actual implementation. Where can I find it?
Even better would just be a simple implementation of emit(). In the MongoDB documentation, they show a way to troubleshoot emit() by writing your own, but the basic implementation they give is really too basic.
I'd like to understand how the grouping is taking place.
I think the definition you are looking for is located here:
https://github.com/mongodb/mongo/blob/master/src/mongo/db/commands/mr.cpp#L886
There is quite a lot of context needed, though, to fully understand what is going on. I confess I do not.
1. Mongo's emit JS source is no longer at O.Powell's URL, which is dead. I cannot find it.
2. The code below seems to be the snippet of most interest. This C++ function, switchMode, chooses the emit function to use. It is currently at:
https://github.com/mongodb/mongo/blob/master/src/mongo/db/commands/mr.cpp#L815
3. I was trying to see if emit defaults to including the _id key, which seems to occur via _mrMap, not shown here. Elsewhere it is initialized to {}, the empty map.
void State::switchMode(bool jsMode) {
    _jsMode = jsMode;
    if (jsMode) {
        // emit function that stays in JS
        _scope->setFunction("emit",
            "function(key, value) {"
            " if (typeof(key) === 'object') {"
            " _bailFromJS(key, value);"
            " return;"
            " }"
            " ++_emitCt;"
            " var map = _mrMap;"
            " var list = map[key];"
            " if (!list) {"
            " ++_keyCt;"
            " list = [];"
            " map[key] = list;"
            " }"
            " else"
            " ++_dupCt;"
            " list.push(value);"
            "}");
        _scope->injectNative("_bailFromJS", _bailFromJS, this);
    }
    else {
        // emit now populates C++ map
        _scope->injectNative("emit", fast_emit, this);
    }
}
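To make the grouping concrete, here is a minimal standalone JS sketch (not MongoDB's actual code) of what the embedded emit above does: each emitted value is appended to a per-key list, and reduce later runs once per key over that list.
var _mrMap = {};  // key -> list of values emitted for that key

function emit(key, value) {
  var list = _mrMap[key];
  if (!list) {
    list = [];           // first time this key has been seen
    _mrMap[key] = list;
  }
  list.push(value);      // group the value under its key
}

// After all map() calls have run, reduce(k, _mrMap[k]) is invoked for each key k.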

How to continue loop in Map( /Reduce ) function?

I have a Map function in MongoDB on which I'm later using Reduce. I use a collection which has a bunch of users in it, and users own some channels. However, there are users that do not have any channels, and the Map/Reduce function raises an error in my script.
map = Code("function () {"
" if(!this.channels) continue;"
" this.channels.forEach(function(z) {"
" emit(z, 1);"
" });"
"}")
When I use return instead of continue to quit the function, it works flawlessly, except that I don't want to end the loop. Is there any smart way around this?
Thanks for your advice and better wisdom.
If you return from map, it returns only from the map call for the current document. map will still be executed for the other documents regardless.
I suggest rewriting your map to this form:
function () {
if(this.channels) {
this.channels.forEach(function(z) {
emit(z, 1);
});
}
}
I think this form is clearer. It will emit something for users that have channels, and skip those that don't have any.
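For completeness, the early-return form the question stumbled on is equally valid, because returning only ends the map call for the current document (a sketch using the question's field names):
function () {
  if (!this.channels) return;  // skip users without channels; other documents still get mapped
  this.channels.forEach(function(z) {
    emit(z, 1);
  });
}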

Javascript Database Mass Insert

I am trying to insert over 70,000 rows into a JavaScript database (using Chrome 5.0.317.2). The inserts are taking a very long time. The actual page loads in a few seconds, and I can see progress as the percentage increases very slowly while each row is inserted. It took about an hour to finish inserting all the records. Is there a way to optimize the inserts, or to somehow start out with a preloaded SQLite database?
<script src="jquery.1.3.2.min.js" type="text/javascript" charset="utf-8"></script>
<script type="text/javascript" charset="utf-8">
  // Truncated to 1 row for example. There are really 76547 rows.
  var zipcodes = [{"city_name":"AMHERST","city_alias":"AMHERST","zipcode":"01002"}];
  var db;
  function openMyDatabase() {
    var shortName = 'mydb';
    var version = '1.0';
    var displayName = 'mydb';
    var maxSize = 65536;
    db = openDatabase(shortName, version, displayName, maxSize);
    db.transaction(
      function(transaction) {
        transaction.executeSql(
          'CREATE TABLE IF NOT EXISTS zipcode ' +
          ' (id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, ' +
          '  city_name TEXT NOT NULL, ' +
          '  city_alias TEXT NOT NULL, ' +
          '  zipcode TEXT NOT NULL)'
        );
      }
    );
    $.each(zipcodes, function(i, zipcode) {
      insertZipcode(zipcode.city_name, zipcode.city_alias, zipcode.zipcode, i);
    });
  }
  function errorHandler(transaction, error) {
    alert('Oops. Error was ' + error.message + ' (Code ' + error.code + ')');
    return true;
  }
  function insertZipcode(cityName, cityAlias, zipcode, i) {
    db.transaction(
      function(transaction) {
        transaction.executeSql(
          'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);',
          [cityName, cityAlias, zipcode],
          function() {
            $('#counter').html((100 * i / zipcodes.length) + '%');
          },
          errorHandler
        );
      }
    );
    return false;
  }
  $(function() {
    openMyDatabase();
  });
</script>
Solution: On the PHP side, I made an associative array and used the zip code as the key and an array of cities as the value, and I ran it through json_encode and passed that to the javascript. On the javascript side I was able to very quickly get a list of cities for a particular zip code by using the following code:
var zipcodes = {"55437":["MINNEAPOLIS","BLOOMINGTON"]}; //truncated
alert('Cities in 55437: ' + zipcodes['55437'].join(', '));
One problem I can see is that you are trying to insert one row at a time; this can cause a lot of overhead in making connections etc.
It would be faster if you could insert multiple rows (maybe 20 or 50) in one go, using an efficient procedure or a multi-row INSERT INTO.
If you can't move it to something server-side (JavaScript is really not the tool for a job like that), then definitely bundle multiple inserts together, as Suraj suggests.
90% of the work is starting the connection, starting the transaction, ending the transaction, and closing the connection. 10% is the actual DB operations.
transaction.executeSql('
  INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
  INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
  INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
  ... // 20-50 lines like this, maybe generated by a loop.
', [
  cityName1, cityAlias1, zipcode1,
  cityName2, cityAlias2, zipcode2,
  cityName3, cityAlias3, zipcode3,
  ... // a matching table, generated by a loop as well.
],
...
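A sketch that is closer to runnable Web SQL, using the names from the question (note: executeSql typically runs only a single statement per call, so the safer way to batch is to loop inside one transaction callback and pay the transaction overhead once):
function insertAllZipcodes() {
  db.transaction(function(transaction) {
    // One transaction for all rows; each executeSql runs one statement.
    for (var i = 0; i < zipcodes.length; i++) {
      transaction.executeSql(
        'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?)',
        [zipcodes[i].city_name, zipcodes[i].city_alias, zipcodes[i].zipcode]
      );
    }
  }, function(error) {
    alert('Transaction failed: ' + error.message);
  }, function() {
    $('#counter').html('100%');  // success callback: all rows inserted
  });
}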
Why not use a preloaded XML file instead of creating all the fields when the webpage loads? That way you will reduce the loading time, and the searching time could be reduced by some type of indexing, maybe hashtable indexing or binary search.
This would reduce flexibility, meaning that you would have to change the XML and compile it with the help of a tool - though I don't know if such a tool exists; but it will allow for better performance, especially if you are working on a limited device like an iPhone.
What I did to overcome this problem was to first create a string containing one transaction with all its executes, and then run it using JavaScript's eval method:
jsonResponse = Ext.util.JSON.decode(result.responseText);
jsonIndex = 0;
var consulta = "DB.transaction(function (transaction){";
while (jsonResponse[jsonIndex] != null) {
  var ins = jsonResponse[jsonIndex].instruccion;
  ins = ins.replace(/"/gi, "\"");
  consulta += "transaction.executeSql('" + ins + "'); ";
  jsonIndex++;
}
consulta += "});";
eval(consulta);
I had the exact same problem. I found a blog post providing a possible solution.
Here's the link: http://blog.heldes.com/html5/sqlite-class-for-html5-database/
Good luck.
An hour is probably too long in any case, but even if you reduce that by a lot you will still have a significant wait. It will probably pay to spawn a new thread (e.g. a Web Worker) to handle this process separately from your UI thread, to preserve responsiveness for the user.