I am trying to insert over 70,000 rows into a javascript database (using Chrome 5.0.317.2). The inserts are taking a very long time. The actual page loads in a few seconds, and I can see progress as the percent increases very slowly as each row is inserted. It took about an hour to finish inserting all the records. Is there a way to optimize the inserts, or somehow start out with a preloaded SQLite database?
<script src="jquery.1.3.2.min.js" type="text/javascript" charset="utf-8"></script>
<script type="text/javascript" charset="utf-8">
    // Truncated to 1 row for example. There are really 76547 rows.
    var zipcodes = [{"city_name":"AMHERST","city_alias":"AMHERST","zipcode":"01002"}];
    var db;

    function openMyDatabase() {
        var shortName = 'mydb';
        var version = '1.0';
        var displayName = 'mydb';
        var maxSize = 65536;
        db = openDatabase(shortName, version, displayName, maxSize);
        db.transaction(
            function(transaction) {
                transaction.executeSql(
                    'CREATE TABLE IF NOT EXISTS zipcode ' +
                    '  (id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, ' +
                    '   city_name TEXT NOT NULL, ' +
                    '   city_alias TEXT NOT NULL, ' +
                    '   zipcode TEXT NOT NULL)'
                );
            }
        );
        $.each(zipcodes, function(i, zipcode) {
            insertZipcode(zipcode.city_name, zipcode.city_alias, zipcode.zipcode, i);
        });
    }

    function errorHandler(transaction, error) {
        alert('Oops. Error was ' + error.message + ' (Code ' + error.code + ')');
        return true;
    }

    function insertZipcode(cityName, cityAlias, zipcode, i) {
        db.transaction(
            function(transaction) {
                transaction.executeSql(
                    'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);',
                    [cityName, cityAlias, zipcode],
                    function() {
                        $('#counter').html((100 * i / zipcodes.length) + '%');
                    },
                    errorHandler
                );
            }
        );
        return false;
    }

    $(function() {
        openMyDatabase();
    });
</script>
Solution: On the PHP side, I built an associative array with the zip code as the key and an array of cities as the value, ran it through json_encode, and passed the result to the JavaScript. On the JavaScript side I was then able to very quickly get the list of cities for a particular zip code using the following code:
var zipcodes = {"55437":["MINNEAPOLIS","BLOOMINGTON"]}; //truncated
alert('Cities in 55437: ' + zipcodes['55437'].join(', '));
One problem I can see is that you are inserting one row at a time, which causes a lot of overhead in opening connections, starting transactions, and so on.
It would be faster to insert multiple rows (maybe 20 or 50) in one go, for example by using some efficient procedure or a multi-row INSERT.
If you can't move it to something server-side (JavaScript is really not the tool for a job like that), then definitely bundle multiple inserts together like Suraj suggests.
Roughly 90% of the work is opening the connection, starting the transaction, ending the transaction, and closing the connection; only about 10% is the actual DB operations.
transaction.executeSql('
    INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
    INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
    INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);
    ... // 20-50 lines like this, maybe generated by a loop.
', [
    cityName1, cityAlias1, zipcode1,
    cityName2, cityAlias2, zipcode2,
    cityName3, cityAlias3, zipcode3,
    ... // a matching list of parameters, generated by a loop as well.
],
...
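If the implementation will not accept several statements in a single executeSql call, the same bundling effect can be had by issuing every INSERT inside one transaction callback, so the per-transaction overhead is paid once instead of 76,547 times. A minimal sketch reworking the code from the question; the single success callback at the end is my addition, not part of the original post:

// Sketch: all inserts inside one Web SQL transaction instead of one transaction per row.
// Assumes db is already open and the zipcode table exists (see openMyDatabase above).
function insertAllZipcodes() {
    db.transaction(
        function(transaction) {
            for (var i = 0; i < zipcodes.length; i++) {
                var z = zipcodes[i];
                transaction.executeSql(
                    'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);',
                    [z.city_name, z.city_alias, z.zipcode]
                );
            }
        },
        function(error) {
            alert('Transaction failed: ' + error.message);
        },
        function() {
            // Fires once, after the whole batch has committed.
            $('#counter').html('100%');
        }
    );
}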
Why not use preloaded XML instead of inserting all the rows when the page loads? That way you reduce the loading time, and the search time can be cut down with some kind of indexing, such as a hashtable lookup or a binary search.
This would reduce flexibility, in that you would have to change the XML and rebuild it with the help of a tool (I don't know whether such a tool exists), but it would allow for better performance, especially if you are working on a limited device like an iPhone.
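For what it's worth, the lookup table described in the solution at the top can also be built entirely on the client from the original zipcodes array, so no database is involved at all. A small sketch; the buildZipIndex name is just illustrative:

// Sketch: build a zipcode -> [city names] map from the preloaded array.
function buildZipIndex(rows) {
    var index = {};
    for (var i = 0; i < rows.length; i++) {
        var row = rows[i];
        if (!index[row.zipcode]) {
            index[row.zipcode] = [];
        }
        index[row.zipcode].push(row.city_name);
    }
    return index;
}

var zipIndex = buildZipIndex(zipcodes);
// e.g. zipIndex['01002'] -> ['AMHERST']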
What I did to overcome this problem was to first build a string containing one transaction with all of its executeSql calls, and then run it with the JavaScript eval method:
jsonResponse = Ext.util.JSON.decode(result.responseText);
jsonIndex = 0;
var consulta = "DB.transaction(function (transaction){";
while (jsonResponse[jsonIndex] != null) {
    var ins = jsonResponse[jsonIndex].instruccion;
    ins = ins.replace(/"/gi, "\"");
    consulta += "transaction.executeSql('" + ins + "'); ";
    jsonIndex++;
}
consulta += "});";
eval(consulta);
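A variant of the same idea that avoids eval keeps the statements as data and executes them inside one transaction callback. A rough sketch under the same assumptions (an Ext-decoded JSON response whose items carry an instruccion string):

// Sketch: run every statement from the response in a single transaction, no eval needed.
var statements = Ext.util.JSON.decode(result.responseText);
DB.transaction(function (transaction) {
    for (var i = 0; statements[i] != null; i++) {
        transaction.executeSql(statements[i].instruccion);
    }
});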
I had the exact same problem. I found a blog post providing a possible solution.
Here's the link: http://blog.heldes.com/html5/sqlite-class-for-html5-database/
Good luck!
An hour is probably too long in any case, but even if you reduce that by a lot you will still have a significant wait. It will probably pay to spawn a new thread to handle this process, separate from your UI thread, to preserve responsiveness for the user.
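Browser JavaScript on the page cannot literally spawn a thread (short of a Web Worker), but a similar effect is possible by chunking the inserts and yielding back to the UI between batches. A rough sketch along those lines; the batch size of 500 and the inline error alert are arbitrary choices, not from the question:

// Sketch: insert in batches, yielding to the UI thread between batches.
function insertInChunks(rows, start) {
    var BATCH = 500; // arbitrary batch size
    var end = Math.min(start + BATCH, rows.length);
    db.transaction(function(transaction) {
        for (var i = start; i < end; i++) {
            transaction.executeSql(
                'INSERT INTO zipcode (city_name, city_alias, zipcode) VALUES (?, ?, ?);',
                [rows[i].city_name, rows[i].city_alias, rows[i].zipcode]
            );
        }
    }, function(error) {
        alert('Insert failed: ' + error.message);
    }, function() {
        // Transaction committed: update progress and schedule the next batch.
        $('#counter').html(Math.round(100 * end / rows.length) + '%');
        if (end < rows.length) {
            setTimeout(function() { insertInChunks(rows, end); }, 0);
        }
    });
}

insertInChunks(zipcodes, 0);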
Related
I am trying to test a dynamic web table using Protractor and to find the count of headers, rows, and columns, but I always get 0 as the count.
var x = element.all(by.xpath('//table//thead//tr//th'));
x.count().then(function(c) {
    console.log(c);
});
I tried using element.all(by.css(...)) as well and it returns the same; can anyone help?
I used Selenium and was able to retrieve the value, so the XPath is not wrong, but I have to use Protractor to fetch the same data.
Selenium script which is working:
List<WebElement> col = driver.findElements(By.xpath("//div[@class='table-wrapper']//table//thead//tr/th"));
System.out.println(col.size());
Try the below code
var x = element.all(by.css('table[title="status"]'));
// Add a wait if the table takes more time to load.
x.count().then(function(c) {
    console.log(c);
});
In general, you should avoid xpath since it's very inefficient.
This should work for you:
var table = element(by.css('table.table'));

table
    .element(by.css('thead'))
    .all(by.css('tr th'))
    .count()
    .then(function(count) {
        console.log('count:', count);
    });
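If the suite runs with the WebDriver control flow disabled (SELENIUM_PROMISE_MANAGER: false), the same count can be written with async/await; this is only a sketch, and the it() wrapper is illustrative:

// Sketch: same header count using async/await instead of .then chaining.
it('counts the header cells', async function() {
    var headerCells = element(by.css('table.table thead')).all(by.css('tr th'));
    var count = await headerCells.count();
    console.log('count:', count);
});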
I'm trying to copy (duplicate) a record in a ServiceNow incident table, but I cannot make this line work: gr.sys_id[key] = current.getValue(glideElement.getName());
The goal is to copy all field values except sys_id.
Take a look at the UI Action Insert & Stay, which is a kind of duplicate script.
You can use the same functionality in your Business Rule or any other server-side script:
doInsertAndStay();

function doInsertAndStay() {
    var saveMe = current;
    if (typeof current.number != 'undefined' && current.number) {
        current.number = ""; // generate a new number
    }
    current.insert();
    action.setRedirectURL(saveMe);
}
The GlideRecord function insert() duplicates a record, and of course a new sys_id is used for the new record. As far as I know you are not able to define the sys_id yourself.
I'm in Zend Framework 2, trying to get the last inserted id after an insert using the PostgreSQL PDO driver. The insert works fine unless I add a SequenceFeature, like this:
class LogTable extends AbstractTableGateway
{
    protected $table = 'log';

    public function __construct(Adapter $adapter)
    {
        $this->adapter = $adapter;
        $this->featureSet = new Feature\FeatureSet();
        $this->featureSet->addFeature(new Feature\SequenceFeature('id', 'log_id_seq'));
        $this->resultSetPrototype = new ResultSet();
        $this->resultSetPrototype->setArrayObjectPrototype(new Log());
        print_r($this->getFeatureSet());
        $this->initialize();
    }
When I later do an insert like this:
$this->insert($data);
It fails because the generated query is INSERT INTO "log" () VALUES (), so for some reason ZF2 is nulling out the columns and values to insert, but only if I add that SequenceFeature.
If I don't add that feature, the insert works fine, but then I can't get the last sequence value. Debugging Zend/Db/Sql/Insert.php, I found that the values() function is called twice when the SequenceFeature is present, but only once when it is not. For some reason, when the SequenceFeature is there, all the insert columns and values are nulled out, possibly because of this double call? I haven't investigated further yet, but maybe it is updating the sequence and then losing the data when making the insert.
Is this a bug, or is there just something I'm missing?
Screw it! We'll do it live!
Definitely not the best solution, but this works. I just cut and pasted the appropriate code from Zend/Db/TableGateway/Feature/SequenceFeature.php and added it as a function to my LogTable class:
public function nextSequenceId()
{
    $sql = "SELECT NEXTVAL('log_id_seq')";
    $statement = $this->adapter->createStatement();
    $statement->prepare($sql);
    $result = $statement->execute();
    $sequence = $result->getResource()->fetch(\PDO::FETCH_ASSOC);
    return $sequence['nextval'];
}
Then I called it before my insert in my LogController class:
$data['id'] = $this->nextSequenceId();
$id = $this->insert($data);
Et voila! Hopefully someone else will explain to me how I'm really supposed to do it, but this will work just fine in the interim.
I need a way to get the friend ids of a user.
I wrote this code, but my browser crashes (maybe because I have 2500 friends):
var query = FB.Data.query('SELECT uid2 FROM friend WHERE uid1 = me()');
query.wait(function(rows) {
    var i;
    for (i = 0; i < rows.length; i++) {
        document.getElementById('friends').innerHTML += i + ') ' + rows[i].uid2 + '<br />';
    }
});
Is there a less CPU-consuming approach?
At the VERY least, compile all your HTML into a variable and then pass it to the DOM in one .innerHTML assignment. Right now you're forcing the page to be rebuilt twice for each of your 2500 friends, because the browser has to update its internal representation of the page when innerHTML is read by +=, and then again when the result is written back.
query.wait(function(rows) {
    var i;
    var html = "";
    for (i = 0; i < rows.length; i++) {
        html += i + ') ' + rows[i].uid2 + '<br />';
    }
    document.getElementById('friends').innerHTML = html;
});
You can also use other approaches, like storing the generated strings in an array and then joining them and assigning the result to innerHTML, but that is mostly optimizing around JavaScript's immutable strings and the garbage collector; some JS engines may eventually do this better than you would by hand. Doing the innerHTML assignment once, however, is almost guaranteed to be a huge speed increase, because each += in the loop literally means "serialize the whole thing, append a little bit, and parse it all back again", and there is hardly any way for the JS and HTML engines to optimize that away.
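For completeness, here is a sketch of the array-join variant mentioned above, under the same assumptions as the snippet before it:

// Sketch: collect the fragments in an array, join once, assign once.
query.wait(function(rows) {
    var parts = [];
    for (var i = 0; i < rows.length; i++) {
        parts.push(i + ') ' + rows[i].uid2 + '<br />');
    }
    document.getElementById('friends').innerHTML = parts.join('');
});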
I have a table with eighty fields; anywhere from none to seventy of them can change depending on an update process I have. For example:
if (process.result == 1)
    cmd.CommandText = "UPDATE T SET f1=1, f6='S'";
else if (process.result == 2)
    cmd.CommandText = string.Format("UPDATE T SET f1=2, f12={0}, f70='{1}'", getData(), st);
else if ..... // etc.
I can optimize the building of the UPDATE statement, but I would like to use SqlParameter; is that possible and convenient given the variability of the data to update?
Thanks.
For each if branch where you currently use an inline string format, you could just as well add SQL parameters instead.
The string formatting of the UPDATE statement could be replaced with the selected SQL parameters for the fields you need to update.