What causes a row (Zend_Db_Table_Row) to be set to "readOnly?" I'm having trouble deleting rows in a loop:
// this is set to some integers
$ids = array();
// get the results
$results = $table->fetchAll($select);
foreach ($results as $result)
{
    $value = $result->value;
    if (!in_array($value, $ids))
    {
        // throws a "row is read-only" error
        $result->delete();
    }
}
Here's my select:
$table = $options->joinModel;
$select = $table->select();
$select->from($table->getTableName(), array("id", "value" => $options->joinForeignKey))
    ->where("`{$options->foreignKey}` = ?", $row->id)
    ->group($options->joinForeignKey);
I want to delete the rows that aren't in the $ids array, but it throws an error saying the row is read-only. I haven't set that flag or done anything with the row. Any idea why it's read-only?
A row is readOnly if the $select is such that it prevents you from mapping fields directly back to a single origin row.
For example, if the $select involves a JOIN or a GROUP BY, it's not clear which row(s) would be affected if you change a value of a field in the row object.
You might say "I know which row is the source, why can't Zend_Db_Table_Row tell?" But there are many corner cases, so it's a hard problem to solve in general.
Keep in mind all of Zend_Db is under 3000 lines of code. It can't have a lot of magic in it.
A row object can also be readOnly if you've serialized and then deserialized it.
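If you still need to remove those rows, one workaround (a minimal sketch, assuming the primary key column is id, as selected in the query above) is to delete through the table object instead of the row object, so the read-only restriction on the grouped row never applies. ZF1 rows also expose setReadOnly(false), but that only makes sense if you are sure the row maps back to a single table record.
// Sketch: delete via the table instead of the read-only row object.
// Assumes "id" is the primary key column included in the select above.
foreach ($results as $result) {
    if (!in_array($result->value, $ids)) {
        $where = $table->getAdapter()->quoteInto('id = ?', $result->id);
        $table->delete($where);
    }
}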
The concept is that, after a successful save of my object, it should update a text field in the database (with a hook). Let's call the field 'succText'. The table I would like to access is sys_file, but I only get the sys_file_reference id when I save the object. So I thought I could use the ConnectionPool to select the sys_file row of this file reference and then insert the data into the field 'succText'.
I tried this:
public function processDatamap_preProcessFieldArray(array &$fieldArray, $table, $id, \TYPO3\CMS\Core\DataHandling\DataHandler &$pObj) {
    $queryBuilder = GeneralUtility::makeInstance(ConnectionPool::class)->getQueryBuilderForTable('sys_file_reference');
    $findItemsId = $queryBuilder
        ->select('*')
        ->from('sys_file_reference')
        ->join(
            'sys_file_reference',
            'sys_file',
            'reference',
            $queryBuilder->expr()->eq('reference.uid', $queryBuilder->quoteIdentifier('uid_local'))
        )
        ->where(
            $queryBuilder->expr()->eq('uid_local', $queryBuilder->createNamedParameter($fieldArray['downloads'], \PDO::PARAM_INT))
        )
        ->execute();
}
But this gives me back the sys_file_reference id and not the id and field values of the sys_file table.
As for the update, I haven't tried it yet, because I haven't figured out how to get the row that needs to be updated. I guess it needs a subquery after the row is found, but I don't really know.
The processDatamap_preProcessFieldArray hook is going to be renamed to the post-processing one; I only have it this way in order to see the results in the backend.
Thanks in advance,
You might want to make use of the FileRepository class here.
$fileRepository = GeneralUtility::makeInstance(\TYPO3\CMS\Core\Resource\FileRepository::class);
$fileObjects = $fileRepository->findByRelation('tablename', 'fieldname', $uid);
Where $uid is the ID of the record that the files are connected to via file reference.
You will get back an array of file objects to deal with.
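As a rough sketch (assuming the standard FAL API), each returned object is a file reference whose getOriginalFile() hands you the underlying sys_file record:
foreach ($fileObjects as $fileObject) {
    // uid of the sys_file row behind this reference
    $sysFileUid = $fileObject->getOriginalFile()->getUid();
    // path of the file inside its storage, e.g. /MyPdf.pdf
    $identifier = $fileObject->getOriginalFile()->getIdentifier();
}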
I resolved my problem by removing the first query and using a FileRepository instance instead.
$fileRepository = GeneralUtility::makeInstance(FileRepository::class);
$fileObjects = $fileRepository->findByRelation('targetTable', 'targetField', $uid);
VERY IMPORTANT!
If you are creating a new element, TYPO3 assigns a temporary UID with a name that looks like NEW45643476. In order to get the real $uid inside processDatamap_afterDatabaseOperations, you need to add this code before you get the instance of the FileRepository.
if (GeneralUtility::isFirstPartOfStr($uid, 'NEW')) {
    $uid = $pObj->substNEWwithIDs[$uid];
}
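For context, a rough sketch of where that substitution sits, assuming the usual afterDatabaseOperations hook signature ($status, $table, $id, $fieldArray, $pObj):
public function processDatamap_afterDatabaseOperations($status, $table, $id, array $fieldArray, \TYPO3\CMS\Core\DataHandling\DataHandler $pObj)
{
    $uid = $id;
    // $id is still the "NEW..." placeholder for freshly created records,
    // so map it to the real uid before asking the FileRepository
    if (GeneralUtility::isFirstPartOfStr($uid, 'NEW')) {
        $uid = $pObj->substNEWwithIDs[$uid];
    }
    // ... fetch the related file objects for $uid here
}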
Now, as far as the text is concerned, I extracted it from a PDF. First I had to get the basename of the file in order to find its storage location and its name. Since I only have one file, I don't really need a foreach loop and could use index [0] directly. So the code looked like this:
$fileID = $fileObjects[0]->getOriginalFile()->getProperties()['uid'];
$fullPath[] = [
    PathUtility::basename($fileObjects[0]->getOriginalFile()->getStorage()->getConfiguration()['basePath']),
    PathUtility::basename($fileObjects[0]->getOriginalFile()->getIdentifier())
];
This gives me back an array that looks like this:
array(1 item)
   0 => array(2 items)
      0 => 'fileadmin' (9 chars)
      1 => 'MyPdf.pdf' (9 chars)
Now I need to save the text from every page in a variable. The code looks like this:
$getPdfText = '';
foreach ($fullPath as $file) {
    $parser = new Parser();
    $pdf = $parser->parseFile(PATH_site . $file[0] . '/' . $file[1]);
    $pages = $pdf->getPages();
    foreach ($pages as $page) {
        $getPdfText .= $page->getText();
    }
}
Now that I have my text, I want to add it to the database so that I can use it in my search action. I use the ConnectionPool again, this time to update the file row in sys_file.
$queryBuilder = GeneralUtility::makeInstance(ConnectionPool::class)->getQueryBuilderForTable('sys_file');
$queryBuilder
    ->update('sys_file')
    ->where(
        $queryBuilder->expr()->eq('uid', $queryBuilder->createNamedParameter($fileID))
    )
    ->set('pdf_text', $getPdfText)
    ->execute();
Now, every time I choose a PDF in my extension, its text is saved to the database.
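If you want to double-check the stored text, here is a quick sketch with the same QueryBuilder API (assuming the custom pdf_text column exists on sys_file, as added by the extension):
$qb = GeneralUtility::makeInstance(ConnectionPool::class)->getQueryBuilderForTable('sys_file');
$row = $qb
    ->select('uid', 'pdf_text')
    ->from('sys_file')
    ->where(
        $qb->expr()->eq('uid', $qb->createNamedParameter($fileID, \PDO::PARAM_INT))
    )
    ->execute()
    ->fetch();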
EXTRA CONTENT
If you want to include the PdfParser as well and you are in composer mode, then add this to your composer.json:
"smalot/pdfparser" : "*"
and this to the autoload section:
"Smalot\\PdfParser\\" : "Packages/smalot/pdfparser/src/"
Then, in yourExtension/Classes/Hooks/DataHandler.php, add the namespace import:
use Smalot\PdfParser\Parser;
Now you are able to use the getPages() and getText() functions.
The Documentation
If I missed something, let me know and I will add it.
I'm in Zend Framework 2, trying to get the last inserted id after an insert, using the PostgreSQL PDO driver. The insert works fine unless I add a SequenceFeature, like this:
class LogTable extends AbstractTableGateway
{
    protected $table = 'log';

    public function __construct(Adapter $adapter)
    {
        $this->adapter = $adapter;
        $this->featureSet = new Feature\FeatureSet();
        $this->featureSet->addFeature(new Feature\SequenceFeature('id', 'log_id_seq'));
        $this->resultSetPrototype = new ResultSet();
        $this->resultSetPrototype->setArrayObjectPrototype(new Log());
        print_r($this->getFeatureSet());
        $this->initialize();
    }
}
When I later do an insert like this:
$this->insert($data);
It fails because the generated SQL is INSERT INTO "log" () VALUES (), so for some reason ZF2 is nulling out the columns and values to insert, but only if I add that SequenceFeature.
If I don't add that feature, the insert works fine, but then I can't get the last sequence value. Debugging Zend/Db/Sql/Insert.php, I found that the values() function is called twice when the SequenceFeature is there, but only once when it's not. For some reason, with the SequenceFeature, all the insert columns and values end up nulled out, possibly because of this double call? I haven't investigated further yet, but maybe it's updating the sequence and then losing the data when making the insert?
Is this a bug, or is there just something I'm missing?
Screw it! We'll do it live!
Definitely not the best solution, but this works. I just cut and pasted the appropriate code from Zend/Db/TableGateway/Feature/SequenceFeature.php and added it as a function to my LogTable class:
public function nextSequenceId()
{
    $sql = "SELECT NEXTVAL('log_id_seq')";
    $statement = $this->adapter->createStatement();
    $statement->prepare($sql);
    $result = $statement->execute();
    $sequence = $result->getResource()->fetch(\PDO::FETCH_ASSOC);
    return $sequence['nextval'];
}
Then I called it before my insert in my LogController class:
$data['id'] = $this->nextSequenceId();
$id = $this->insert($data);
Et voila! Hopefully someone else will explain to me how I'm really supposed to do it, but this will work just fine in the interim.
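For what it's worth, the same cut-and-paste approach also works after the fact. A hedged sketch of a lastSequenceId() companion that reads back the value PostgreSQL just used (CURRVAL is only valid in a session that has already advanced that sequence, e.g. via the column's nextval() default or the nextSequenceId() call above):
public function lastSequenceId()
{
    $sql = "SELECT CURRVAL('log_id_seq')";
    $statement = $this->adapter->createStatement();
    $statement->prepare($sql);
    $result = $statement->execute();
    $sequence = $result->getResource()->fetch(\PDO::FETCH_ASSOC);
    return $sequence['currval'];
}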
I'm using insertRow to populate an empty spreadsheet. It starts off taking about 1 second to insert a row and then slows down to around 5 seconds per row after 150 rows or so.
Has anyone experienced this kind of behaviour?
There aren't any calculations on the data in the spreadsheet that could be getting longer with more data.
Thanks!
I'll try to be precise.
If you take a look at the class Zend_Gdata_Spreadsheets, you'll see that the method insertRow() is written in a very suboptimal way:
public function insertRow($rowData, $key, $wkshtId = 'default')
{
    $newEntry = new Zend_Gdata_Spreadsheets_ListEntry();
    $newCustomArr = array();
    foreach ($rowData as $k => $v) {
        $newCustom = new Zend_Gdata_Spreadsheets_Extension_Custom();
        $newCustom->setText($v)->setColumnName($k);
        $newEntry->addCustom($newCustom);
    }
    $query = new Zend_Gdata_Spreadsheets_ListQuery();
    $query->setSpreadsheetKey($key);
    $query->setWorksheetId($wkshtId);
    $feed = $this->getListFeed($query);
    $editLink = $feed->getLink('http://schemas.google.com/g/2005#post');
    return $this->insertEntry($newEntry->saveXML(), $editLink->href, 'Zend_Gdata_Spreadsheets_ListEntry');
}
In short, it loads your whole list feed just to learn the value of $editLink->href before it can post the new row into your spreadsheet; that is why each insert gets slower as the spreadsheet grows.
The cure is to avoid using insertRow() altogether.
Instead, get your $editLink->href once in your code and then insert new rows by reproducing the rest of this method's behaviour. I.e., in your code, instead of $service->insertRow(), use the following:
// get your $editLink once:
$query = new Zend_Gdata_Spreadsheets_ListQuery();
$query->setSpreadsheetKey($key);
$query->setWorksheetId($wkshtId);
$query->setMaxResults(1);
$feed = $service->getListFeed($query);
$editLink = $feed->getLink('http://schemas.google.com/g/2005#post');

// ...

// instead of $service->insertRow:
$newEntry = new Zend_Gdata_Spreadsheets_ListEntry();
$newCustomArr = array();
foreach ($rowData as $k => $v) {
    $newCustom = new Zend_Gdata_Spreadsheets_Extension_Custom();
    $newCustom->setText($v)->setColumnName($k);
    $newEntry->addCustom($newCustom);
}
$service->insertEntry($newEntry->saveXML(), $editLink->href, 'Zend_Gdata_Spreadsheets_ListEntry');
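To make the gain concrete, here is a short sketch of the intended usage (assuming $rows is an array of column => value arrays): the list feed is fetched only once, and every row is posted straight at the cached $editLink.
// the list feed (and $editLink) was fetched once above this loop
foreach ($rows as $rowData) {
    $newEntry = new Zend_Gdata_Spreadsheets_ListEntry();
    foreach ($rowData as $k => $v) {
        $newCustom = new Zend_Gdata_Spreadsheets_Extension_Custom();
        $newCustom->setText($v)->setColumnName($k);
        $newEntry->addCustom($newCustom);
    }
    $service->insertEntry($newEntry->saveXML(), $editLink->href, 'Zend_Gdata_Spreadsheets_ListEntry');
}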
Don't forget to encourage this great answer; it cost me a few days to figure out. I think ZF is great, however sometimes you don't want to rely on its code too much when it comes to resource optimization.
Can anyone tell me how to format the query below correctly in my controller?
Currently it gives me nothing in my FilteringSelect. However, if I change it to >=, I get back all the kennelIDs, which is also incorrect, but at least I'm getting something.
I've tested that the session variable is set and can confirm that there are kennels with the matching capacity.
// Create autocomplete selection for the service of this booking
public function servkennelAction()
{
    $sessionKennelBooking = new Zend_Session_Namespace('sessionKennelBooking');

    // disable layout and view rendering
    $this->_helper->layout->disableLayout();
    $this->getHelper('viewRenderer')->setNoRender(true);

    // get list of grooming services for dogs from the table
    $qry = Doctrine_Query::create()
        ->from('PetManager_Model_Kennels k');

    // This should be set by default and narrows down the search criteria
    if (isset($sessionKennelBooking->numPets)) {
        $b = (int)$sessionKennelBooking->numPets;
        $qry->addWhere('k.capacity = ?','$b');
    }

    $result = $qry->fetchArray();

    // generate and return JSON string using the primary key of the table
    $data = new Zend_Dojo_Data('kennelID', $result);
    echo $data->toJson();
}
Many thanks in Advance.
Graham
I think that addWhere condition is wrong. It has to be:
$qry->addWhere('k.capacity = ?', $b);
i.e. $b without quotes.
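PHP does not interpolate variables inside single-quoted strings, so '$b' binds the literal two-character string rather than the integer. A tiny sketch of the contrast (the value 2 is just an example):
$b = 2;

// wrong: single quotes prevent interpolation, the literal string '$b' is bound -> no matches
$qry->addWhere('k.capacity = ?', '$b');

// right: the integer value of $b is bound -> matches kennels with capacity 2
$qry->addWhere('k.capacity = ?', $b);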
Before I describe the details, the problem is: I run a $c->model('ResultName')->search({k=>v}), and when I loop over the results of its has_many relation, there's only one matching row in the database, yet it loops forever. I've tried googling and found one person who solved the problem, but with too brief an explanation for me. His post was here.
Basically I have 3 tables
Orders, OrderItems and Items. Items are what's available. Orders are collections of Items that one person wants. So I can tie them all together with something like
select oi.order_item_id,oi.order_id,i.item_id from orders as o inner join order_items as oi on oi.order_id = o.order_id inner join items as i on i.item_id = oi.item_id where blah blah blah....
I ran DBIx::Class::Schema::Loader and got what seemed like proper relationships
MyApp::Schema::Result::Order->has_many('order_items'...)
MyApp::Schema::Result::Items->has_many('order_items'...)
MyApp::Schema::Result::OrderItems->belongs_to('items'...)
In a test I try:
my $orders = $schema->resultset('Order')->search({
    'user_id' => 1
});

while (my $o = $orders->next) {
    while (my $oi = $o->order_items->next) {
        warn('order_item_id: ' . $oi->order_item);
    }
}
It loops infinitely on the inner loop
Your solution works, but it loses the niceties of next as an iterator: you are in effect loading all the rows into memory as objects and looping over them.
The issue, as you said, is that $o->order_items->next recreates the order_items resultset each time. You should do this:
my $orders = $schema->resultset('Order')->search({
    'user_id' => 1
});

while (my $o = $orders->next) {
    my $oi_rs = $o->order_items;
    while (my $oi = $oi_rs->next) {
        warn('order_item_id: ' . $oi->order_item);
    }
}
Reading more carefully in the ResultSet documentation for "next", I found:
"Note that you need to store the resultset object, and call next on it. Calling resultset('Table')->next repeatedly will always return the first record from the resultset."
from here
When I changed the loops to
for my $o ($orders->all) {
    for my $oi ($o->order_items->all) {
        # stuff
    }
}
all is well.