I have to read a big file in TYPO3 (version 6.2.10) in a plugin we wrote. The file is uploaded via the backend, and whenever it changes, it is uploaded anew.
Currently I use:
$file->getOriginalResource()->getContents();
$file is a \TYPO3\CMS\Extbase\Domain\Model\FileReference.
That works fine as long as the file in question is small enough. The problem is that the entire content of the file is read into memory. With bigger files I reach the point at which this fails. So my question is: how can I read the contents of the file line by line?
You can copy it to a temporary local path with
$path = $file->getOriginalResource()->getForLocalProcessing(false);
Then you can use fgets as usual to loop through the file line by line.
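For example, a minimal sketch of that loop (error handling kept to a bare minimum):
$handle = fopen($path, 'r');
if ($handle === false) {
    throw new \RuntimeException('Could not open ' . $path);
}
while (($line = fgets($handle)) !== false) {
    // process one line here; only this line is held in memory
}
fclose($handle);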
I need to read a file inside an extension controller. As an example, my extension key is myext_key and the file I want to open is a JSON file, data.json, in the Resources/Private/JSON directory. My research suggested that the best way to open a file is not file_get_contents($path) but \TYPO3\CMS\Core\Utility\GeneralUtility::getURL($path).
So I tried it with the following code, but that didn't work:
$content = \TYPO3\CMS\Core\Utility\GeneralUtility::getURL('EXT:myext_key/Resources/Private/JSON/data.json');
Thanks for any help!
Not sure that anything is wrong with file_get_contents(), other than the memory implications: file_get_contents() reads the whole file content into a variable. As long as the .json file you are dealing with is small, you should be safe.
The method to get the absolute file name of a file inside an extension directory is getFileAbsFileName():
$fileContent = file_get_contents(
\TYPO3\CMS\Core\Utility\GeneralUtility::getFileAbsFileName('EXT:myext_key/Resources/Private/JSON/data.json')
);
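Since the file is JSON, you will usually want to decode the raw string afterwards, for example:
$data = json_decode($fileContent, true); // true = decode into an associative array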
I am having the same issue as described in "SharePoint::SOAPHandler perl script works for display not copy".
In my case I have been able to narrow the issue down to file size: I can successfully copy a 47332-byte file to SharePoint, but I cannot copy a 53689-byte file.
Note that using the web UI I can successfully upload much larger files.
Any thoughts on why uploading using SharePoint::SOAPHandler is failing in this manner?
I have a script in MATLAB that outputs various files, including NIFTI (MRI images) files and text log files. Sometimes the output files that are created have their file permissions set to no read or write for anyone, including the file owner, and this appears to occur randomly.
This normally isn't a problem unless it occurs with the logs, as it leads MATLAB to endlessly recurse as it tries to write the error to the logs. Unfortunately, I haven't been able to find anyone who has experienced similar behavior from any program, including MATLAB.
The script, input files, and output files are all located on a NAS drive connected to a server running Ubuntu 14.04, so I'm wondering if this is a problem with the script (probably not, since it happens intermittently regardless of input), the MATLAB environment, or the NAS drive.
I'm not sure why your file attributes/permissions are changing, but I know the solution: you want MATLAB's fileattrib function. If you know chmod from Unix this should be familiar, and if not, you will still be fine.
Something like this will make your files writable for all users on a Unix (Ubuntu) system:
fileattrib('/home/work/results/my_file.log','+w','a')
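If you want to verify that the change actually took effect, fileattrib also returns a status flag and a message, for example:
[status, msg] = fileattrib('/home/work/results/my_file.log', '+w', 'a');
if ~status
    warning('Could not change permissions: %s', msg);
end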
Edit:
Since these files are non-existent, this may work better: simply use fopen with the 'w+' flag, which opens the file for reading and writing and discards any existing contents. fopen will also create the file if it doesn't exist (and you have permission to create files in the specified directory).
fid = fopen('my_new_log.txt','w+');
fprintf(fid,'some strings for my file');
fclose(fid);
It is also important that you make sure to close the file after you are done.
Easy question.
I want to create a set of records (database is an overstatement) on disk. I can open the file with "rb+", move to a random location with fseek, and then fread and fwrite, all interspersed with fflush. All good.
Now I want to delete one record. Easy: move the last record to the spot of the record I want to delete, and then shorten the file by one record.
But... how do I shorten my existing file?
/iaw
1. Copy the contents before and after the record to be deleted to a temporary file.
2. Delete the original file (which had the record to be deleted).
3. Rename the temporary file as the original file.
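A minimal C sketch of those three steps (the record size, temp-file name, and function name are all made up for illustration):
#include <stdio.h>

#define RECORD_SIZE 128  /* hypothetical fixed record size */

/* Copy every record except the one at index skip into a temp file,
   then replace the original file with the temp file.
   Returns 0 on success, -1 on failure. */
int delete_record(const char *path, long skip)
{
    FILE *in = fopen(path, "rb");
    FILE *out = fopen("records.tmp", "wb");
    char buf[RECORD_SIZE];
    long i = 0;

    if (in == NULL || out == NULL) {
        if (in) fclose(in);
        if (out) fclose(out);
        return -1;
    }
    while (fread(buf, RECORD_SIZE, 1, in) == 1) {
        if (i++ == skip)
            continue;                       /* drop the deleted record */
        if (fwrite(buf, RECORD_SIZE, 1, out) != 1) {
            fclose(in);
            fclose(out);
            return -1;
        }
    }
    fclose(in);
    fclose(out);
    if (remove(path) != 0)
        return -1;
    return rename("records.tmp", path) == 0 ? 0 : -1;
}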
To truncate the existing file that you have opened, seek to the desired file size and then set a new EOF at that position. The problem is that standard C has no function for that purpose. You would have to use platform-specific APIs, like SetEndOfFile() on Windows.
There is no ANSI C solution to this problem, short of copying the whole file anew. There are OS-specific solutions only; see the comments under the answers.
Windows: SetEndOfFile()
POSIX: ftruncate() and truncate()
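On a POSIX system, the "shorten by one record" step could then look roughly like this (the function name and record-size parameter are assumptions for illustration):
#include <stdio.h>
#include <unistd.h>   /* ftruncate, fileno */

/* Shorten an open record file by one fixed-size record. Returns 0 on success. */
int shrink_by_one_record(FILE *fp, long record_size)
{
    long new_size;

    fflush(fp);                    /* flush buffered writes before truncating */
    fseek(fp, 0, SEEK_END);
    new_size = ftell(fp) - record_size;
    if (new_size < 0)
        return -1;
    return ftruncate(fileno(fp), new_size);
}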
How can I either delete the contents of a .txt file so that it becomes a blank file, or delete the file altogether? I'm reading in from the .txt file, and if I don't delete the contents from the previous run, it will omit results that I want to account for.
Suppose your file is named to_be_deleted.txt; you can simply use the following command to delete the file altogether:
delete('to_be_deleted.txt');
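If the file may not exist yet when the script runs, you can guard the call, for example:
if exist('to_be_deleted.txt', 'file') == 2
    delete('to_be_deleted.txt');
end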
On the other hand, if you simply want to clear its contents, open it using fopen with the write flag 'w', which truncates the file to zero length, and close it again:
fid = fopen('to_be_deleted.txt','w');
fclose(fid);