Script to update 5500 fields in a telephone type column with random numbers - mysqli

I need to write a PHP 7 script that generates 5500 random telephone numbers starting with 3, e.g. "3471239900", and overwrites the data already present in the column. This is what I have so far:
/**
 * Generate a random phone number that starts with 3.
 */
function telefono()
{
    $telefono = '';
    for ($k = 0; $k < 9; $k++) {
        // nine random digits
        $telefono .= rand(0, 9);
    }
    // leading 3
    return '3' . $telefono;
}

$res = mysqli_query($conn, 'SELECT id_com FROM commesse ORDER BY id_com');
while ($riga = mysqli_fetch_assoc($res)) {
    $id = (int)$riga['id_com'];
    mysqli_query($conn, "UPDATE commesse SET cliente_tel='" . telefono() . "' WHERE id_com=" . $id);
}

You don't need any PHP code at all to fill a single column of a database table with random numbers.
The following UPDATE statement populates the cliente_tel column of the commesse table with 10-digit random numbers, all beginning with 3:
UPDATE `commesse`
SET `cliente_tel` = CONCAT('3', ROUND(RAND() * (999999999 - 100000000) + 100000000));
Using ROUND() is necessary here since RAND() returns a float between 0 and 1.
Good to remember: running any kind of UPDATE/INSERT statement in a loop is expensive and slow. Avoid issuing SQL queries inside loops whenever possible.
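As a quick sanity check of the arithmetic, here is the same logic sketched in Python (the `fake_phone` name is mine): ROUND(RAND() * (999999999 - 100000000) + 100000000) yields a random 9-digit integer, and prefixing "3" gives a 10-digit number.

```python
import random

def fake_phone() -> str:
    """Mimic CONCAT('3', ROUND(RAND()*(999999999-100000000)+100000000)):
    a fixed leading 3 followed by a random 9-digit number."""
    return "3" + str(random.randint(100_000_000, 999_999_999))

numbers = [fake_phone() for _ in range(5500)]
assert all(len(n) == 10 and n.startswith("3") for n in numbers)
```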

Related

SQL: printing common factors of two numbers?

Let's say I want to find all the common factors of two numbers, 20 and 40, and write a script that prints them to the screen. I know I can use MOD to test divisibility, so I could check every number against both 20 and 40 and print the ones that divide both, but that would take a lot of lines. Is there a quicker way?
If you have a numbers table in your database (always a good tool to have) then you can easily do this with the T-SQL below:
declare @num1 int = 20;
declare @num2 int = 40;

select n.num as commonFactor
from dbo.Nums as n
where n.num <= (case when @num1 < @num2 then @num1 else @num2 end)
  and @num1 % n.num = 0
  and @num2 % n.num = 0;
If you don't have a numbers table, it is easy to create one; take a look at "SQL, Auxiliary table of numbers" for a few examples.
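Outside of SQL, the same computation is a short loop: a common factor cannot exceed the smaller number, so it suffices to test divisibility up to min(a, b). A Python sketch for comparison:

```python
def common_factors(a: int, b: int) -> list[int]:
    """All integers that divide both a and b evenly,
    the same set the numbers-table query produces."""
    limit = min(a, b)  # a common factor cannot exceed the smaller number
    return [n for n in range(1, limit + 1) if a % n == 0 and b % n == 0]

print(common_factors(20, 40))  # [1, 2, 4, 5, 10, 20]
```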

Highcharts: Comparing 2 lines with different dates

I'm comparing two Twitter users on who receives the most tweets on a particular day, and putting the result in a Highcharts line graph. I have the following PHP/MySQL code:
<?php
require('mysql_connect.php');

$artist1 = $_POST['dj1'];
$artist2 = $_POST['dj2'];

$dates_result = mysqli_query($con, "SELECT DISTINCT(tweetDate) FROM tb_tweetDetails");
while ($row = mysqli_fetch_array($dates_result)) {
    $dates[] = $row['tweetDate'];
}

$artist1_tweetsADay_result = mysqli_query($con, "SELECT tweetDate, COUNT(tweetId) AS amountTweets
    FROM tb_tweetDetails
    WHERE tweetId IN (SELECT tweetId FROM tb_tweetLink
                      LEFT JOIN tb_artists ON (tb_tweetLink.artistId = tb_artists.artistId)
                      WHERE artistName = '$artist1')
    GROUP BY tweetDate");
while ($row = mysqli_fetch_array($artist1_tweetsADay_result)) {
    $artist1_tweetsADay_date[] = "'" . $row['tweetDate'] . "'";
    $artist1_tweetsADay_amount[] = $row['amountTweets'];
}

$artist2_tweetsADay_result = mysqli_query($con, "SELECT tweetDate, COUNT(tweetId) AS amountTweets
    FROM tb_tweetDetails
    WHERE tweetId IN (SELECT tweetId FROM tb_tweetLink
                      LEFT JOIN tb_artists ON (tb_tweetLink.artistId = tb_artists.artistId)
                      WHERE artistName = '$artist2')
    GROUP BY tweetDate");
while ($row = mysqli_fetch_array($artist2_tweetsADay_result)) {
    $artist2_tweetsADay_date[] = "'" . $row['tweetDate'] . "'";
    $artist2_tweetsADay_amount[] = $row['amountTweets'];
}
?>
I use the first query to collect all the dates on which I collected tweet data (including dates that only have data for artists other than the selected two).
Then for each artist I get the number of tweets received per day, together with the date.
This all works nicely, and the output is as I expected.
When I build the graph, I use the dates array as the xAxis categories and the per-artist tweet counts as the data for the two lines.
The problem is:
Artist 1 has data on 06-04-2013, 08-04-2013 and 10-04-2013.
Artist 2 has data on 07-04-2013, 08-04-2013, 09-04-2013 and 10-04-2013 (every date actually in my database).
So Artist 2's value for 07-04-2013 is plotted under the 06-04-2013 category (since that value comes first in his array), both artists' 08-04-2013 values end up under 07-04-2013 (the second available date), and so on.
Is there a way to use the dates array to fix the tweet-count arrays, so that every missing date gets 0 assigned and each line stays aligned with its date?
The two things I would do to accomplish this:
1) Use a datetime x-axis.
2) While pulling the data from your table, build an array of every date returned, in addition to the arrays you are already building. Then loop through the array of dates, and for each date check the individual per-artist arrays for a value; if no value exists, create it with 0.
You will then have two arrays covering the same dates, and you can plot them on a datetime axis without worrying about category index matching.
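Step 2 boils down to a lookup with a default of 0 against the master date list. A Python sketch with made-up counts (the variable names and values are mine, shaped like the arrays built in the PHP above):

```python
# All dates for which any artist has data (the master list).
all_dates = ["06-04-2013", "07-04-2013", "08-04-2013", "09-04-2013", "10-04-2013"]

# One artist's per-day counts, with gaps on dates they received no tweets.
artist1 = {"06-04-2013": 5, "08-04-2013": 3, "10-04-2013": 7}

# One pass over the master list; missing dates default to 0,
# so the series stays aligned with the axis.
artist1_series = [artist1.get(d, 0) for d in all_dates]
print(artist1_series)  # [5, 0, 3, 0, 7]
```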

sum two values from different datasets using lookups in report builder

I have a report that should read values from two datasets by currency:
Dataset1: Production Total
Dataset2: Net Total
I've tried to use:
Lookup(Fields!Currency_Type.Value,
Fields!Currency_Type1.Value,
Fields!Gross_Premium_Amount.Value,
"DataSet2")
This returns only the first amount from DataSet2.
I've tried the LookupSet function as well, but it didn't SUM the retrieved values.
Any help would be appreciated.
Thanks Jamie for the reply.
This is what I have done and it worked perfectly.
From Report Properties > Code, add the function below:
Function SumLookup(ByVal items As Object()) As Decimal
    If items Is Nothing Then
        Return Nothing
    End If
    Dim suma As Decimal = 0
    Dim ct As Integer = 0
    For Each item As Object In items
        suma += Convert.ToDecimal(item)
        ct += 1
    Next
    If ct = 0 Then Return 0 Else Return suma
End Function
Then you can call the function:
=Code.SumLookup(LookupSet(Fields!Currency_Type.Value, Fields!Currency_Type1.Value, Fields!Gross_Premium_Amount.Value, "DataSet2"))
Yes, Lookup will only return the first matching value. Three options come to mind:
Change your query so that you only need to get one value: use a GROUP BY and SUM(...) to combine your two rows in the query. If you are using this query in other places, make a copy and change the copy.
Is there some difference in the rows? Such as one is for last year and one is for this year? If so, create an artificial lookup key and lookup the two values separately:
=Lookup(Fields!Currency_Type.Value & ","
& YEAR(DATEADD(DateInterval.Year,-1,today())),
Fields!Currency_Type1.Value & ","
& Fields!Year.Value,
Fields!Gross_Premium_Amount.Value,
"DataSet2")
+
Lookup(Fields!Currency_Type.Value & ","
& YEAR(today()),
Fields!Currency_Type1.Value & ","
& Fields!Year.Value,
Fields!Gross_Premium_Amount.Value,
"DataSet2")
Use the LookupSet function as mentioned. With this you'll get a collection of the values back, and then need to add those together. The easiest way to do this is with embedded code in the report. Add this function to the report's code:
Function AddList(ByVal items As Object()) As Double
    If items Is Nothing Then
        Return 0
    End If
    Dim Total As Double
    Total = 0
    For Each item As Object In items
        Total = Total + CDbl(item)
    Next
    Return Total
End Function
Now call that with:
=Code.AddList(LookupSet(Fields!Currency_Type.Value,
Fields!Currency_Type1.Value,
Fields!Gross_Premium_Amount.Value,
"DataSet2"))
(Note: this code was not tested. I just composed it in the Stack Overflow edit window & I'm no fan of VB. But it should give you a good idea of what to do.)
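For readers who don't work in VB, the null-safe summation that both embedded functions perform is the same few lines in any language. A Python sketch of the idea (the `add_list` name is mine):

```python
def add_list(items):
    """Sum a collection of lookup results, treating a missing
    result set (None) as zero, like the VB AddList above."""
    if items is None:
        return 0.0
    return sum(float(x) for x in items)

print(add_list([10.5, 2.5, 7.0]))  # 20.0
```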

Dataset capacities

Is there any limit to the number of rows in a DataSet? Basically I need to generate Excel files with data extracted from SQL Server and add formatting. I have two approaches: either take the entire data set (around 450,000 rows) and loop through it in .NET code, or loop through around 160 records, pass each record as an input to a proc, get the relevant data, generate the file, and move on to the next 160. Which is the best way? Is there any other way this can be handled?
If I take 450,000 records at a time, will my application crash?
Thanks,
Rohit
You should not try to read 450,000 rows into your application at one time. You should instead use a DataReader or another cursor-like method and look at the data one row at a time. Otherwise, even if your application does run, it'll be extremely slow and use up all of the computer's resources.
Basically I need to generate excel files with data extracted from SQL server and add formatting
A DataSet is generally not ideal for this. A process that loads a DataSet, loops over it, and then discards it means that the memory from the first row processed won't be released until the last row is processed.
You should use a DataReader instead. It discards each row once it is processed, on the subsequent call to Read.
Is there any limit of rows for a dataset
At the very least, since the DataRowCollection.Count property is an Int32, it is limited to 2,147,483,647 rows; in practice there may be other constraints that make the real limit smaller.
From your comments, this is an outline of how I might construct the loop:
using (connection)
{
    SqlCommand command = new SqlCommand(
        @"SELECT Company, Dept, [Emp Name] AS EmpName
          FROM Table
          ORDER BY Company, Dept, [Emp Name]", connection);
    connection.Open();
    SqlDataReader reader = command.ExecuteReader();
    string CurrentCompany = "";
    string CurrentDept = "";
    string LastCompany = "";
    string LastDept = "";
    SomeExcelObject xl = null;
    if (reader.HasRows)
    {
        while (reader.Read())
        {
            CurrentCompany = reader["Company"].ToString();
            CurrentDept = reader["Dept"].ToString();
            // start a new workbook whenever the Company/Dept group changes
            if (CurrentCompany != LastCompany || CurrentDept != LastDept)
            {
                xl = CreateNewExcelDocument(CurrentCompany, CurrentDept);
            }
            LastCompany = CurrentCompany;
            LastDept = CurrentDept;
            AddNewEmpName(xl, reader["EmpName"].ToString());
        }
    }
    reader.Close();
}

Out of memory while iterating through rowset

I have a "small" table of 60,400 rows of zipcode data, 6 MB in total. I want to iterate through all of them, update a column value, and then save each row.
The following is part of my Zipcodes model, which extends My_Db_Table; the base class has a totalRows() function that (you guessed it) returns the total number of rows in the table (60,400).
public function normalizeTable() {
    $this->getAdapter()->setProfiler(false);
    $totalRows = $this->totalRows();
    $rowsPerQuery = 5;
    for ($i = 0; $i < $totalRows; $i = $i + $rowsPerQuery) {
        // note: Zend_Db_Select::limit() takes ($count, $offset)
        $select = $this->select()->limit($i, $rowsPerQuery);
        $rowset = $this->fetchAll($select);
        foreach ($rowset as $row) {
            $row->{self::$normalCityColumn} = $row->normalize($row->{self::$cityColumn});
            $row->save();
        }
        unset($rowset);
    }
}
My row class contains a normalize function (basically a metaphone wrapper doing some extra magic).
At first I tried a plain old $this->fetchAll(), but got an out-of-memory error (128 MB) right away. Then I tried splitting the rowset into chunks; the only difference is that some rows actually get updated, but I still get the out-of-memory error.
Any ideas on how I can accomplish this, or should I fall back to ye olde mysql_query()?
I suggest using the Zend_Db_Statement::fetch() function here, so rows are pulled one at a time instead of materialising the whole rowset in memory:
http://files.zend.com/help/Zend-Framework/zend.db.statement.html
I also suggest rebuilding the select statement so that only the columns that need to be updated are selected, e.g. $select->from($table, (array)$normalCityColumn).
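The same streaming idea, sketched in Python's DB-API for illustration (sqlite3 and the trivial normalize() stand-in are mine, just to keep the sketch self-contained and runnable): read rows in bounded batches and write updates per batch, so memory use stays constant regardless of table size.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE zipcodes (id INTEGER PRIMARY KEY, city TEXT, city_normal TEXT)")
conn.executemany("INSERT INTO zipcodes (city) VALUES (?)",
                 [("Copenhagen",), ("Aarhus",), ("Odense",)])

def normalize(city: str) -> str:
    # Stand-in for the metaphone-based normalize() in the question.
    return city.strip().upper()

# Stream the rows in small batches instead of fetchAll()-ing everything.
read = conn.execute("SELECT id, city FROM zipcodes")
while True:
    chunk = read.fetchmany(1000)   # bounded memory per batch
    if not chunk:
        break
    conn.executemany("UPDATE zipcodes SET city_normal = ? WHERE id = ?",
                     [(normalize(city), row_id) for row_id, city in chunk])
conn.commit()
```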