Reading / parsing text files - qpython3

I have a script that reads data, processes it, and prints an output. The script is not being run from the directory where it is saved. I tried to change directory via os.chdir, but I still get a "file not found" error. I placed the scripts in the phone storage by drag and drop from my PC.
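One likely culprit is the module name: Python's module is lowercase os, so the call is os.chdir(...), after which any relative open() resolves against the new directory. A minimal sketch, using a temporary directory and a hypothetical data.txt as a stand-in for the real input file:

```python
import os
import tempfile

# Create a stand-in data file in a temporary directory
# (in the real script this would be the folder on phone storage).
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "data.txt"), "w") as f:
    f.write("42\n")

os.chdir(workdir)            # note: os.chdir, not OS.chdir
with open("data.txt") as f:  # relative path now resolves in workdir
    value = f.read().strip()
print(value)
```

If the error persists, print os.getcwd() and os.listdir('.') immediately before the open to confirm the script really is looking where you think it is.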

Related

Renaming objects in Google Cloud Storage

I am backing up footage for my video company in Google Cloud Storage. I made a mistake: my object name has spaces and an & in it. I need to download this 700 GB project, and PowerShell won't do it because it treats the text after the & as a separate command and fails. What are my options for renaming this object? Or if there's something else I'm missing, please let me know.
Other info: I'm downloading using the command that the download button gives me. I pasted it into my command prompt and get the error "'Joe' is not recognized as an internal or external command, operable program or batch file."
I have also tried "gsutil -m mv gs://my_bucket/oldprefix gs://my_bucket/newprefix" with the paths changed, but this fails too because of the spaces in the object path.
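The underlying issue is shell quoting, and it can be sketched like this (hypothetical bucket and object names): as long as the object URL is quoted, the shell hands gsutil one intact argument instead of splitting on the space and treating & as a command separator:

```shell
# Hypothetical names; the real rename would then be:
#   gsutil -m mv "$src" "$dst"
src='gs://my_bucket/Joe & Co footage'
dst='gs://my_bucket/joe_co_footage'
printf '%s\n' "$src"   # one intact string; '&' is not interpreted
```

The same applies to the download command: wrapping the full gs:// URL in quotes should stop the prompt from choking on 'Joe'.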

PostgreSQL 14.5 pg_read_binary_file could not open file for reading: Invalid argument

Yesterday I installed PostgreSQL 14.5 on a Windows 10 laptop.
I then ran an old script to load images into a table.
The script uses the pg_read_binary_file function.
Some of the images are .jpg files and some are .png files.
Of the 34 files, only 5 were successfully processed (1 .jpg and 4 .png). The other 29 failed with the following error:
[Exception, Error code 0, SQLState XX000] ERROR: could not open file "file absolute path" for reading: Invalid argument
For instance, the following statement executes without errors
select pg_read_binary_file('C:\Users\Jorge\OneDrive\Documents\000\020-logos\adalid.png') as adalid_png;
... and the following statement fails
select pg_read_binary_file('C:\Users\Jorge\OneDrive\Documents\000\020-logos\oper.png') as oper_png;
... with the following error message
[Exception, Error code 0, SQLState XX000] ERROR: could not open file "C:/Users/Jorge/OneDrive/Documents/000/020-logos/oper.png" for reading: Invalid argument
So far, I have not been able to identify any difference in the files that could be the cause of the error. Also, I'm pretty sure the script works on earlier releases of version 14. Unfortunately I have not been able to find a website to download any of those earlier releases to test it again.
Has anyone else found this problem, and its solution?
I think the issue is somehow caused by OneDrive. This laptop is new; when I logged in with my Microsoft account, the OneDrive directory was automatically created and synced. Apparently this operation only creates the directory entries, leaving the contents of the files in the cloud until they are opened. When I zipped the directory that contains all my images, a message from OneDrive appeared saying it was restoring some files. After that, all the commands in my scripts worked.
My theory is that pg_read_binary_file gets the file entry from the directory, so it doesn't give the "No such file or directory" message; but then fails reading the contents, giving the "Invalid argument" message instead.
The unanswered question would be: why does 7-Zip make OneDrive restore the files but pg_read_binary_file does not?
UPDATE
After more testing, and reading Save disk space with OneDrive Files On-Demand for Windows, I am now sure that pg_read_binary_file can fail with "Invalid argument" when the OneDrive file is not locally available. In Windows File Explorer, such a file has a blue cloud icon next to it.
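If Files On-Demand is indeed the cause, one possible workaround (an assumption based on the documented OneDrive pin attributes, using the folder path from the question) is to pin the image folder so every file is kept locally before the script runs, e.g. from a Windows command prompt:

```shell
attrib -U +P "C:\Users\Jorge\OneDrive\Documents\000\020-logos\*" /S /D
```

Here +P marks the files "always keep on this device" and -U clears the "free up space" state; afterwards pg_read_binary_file should see ordinary local files rather than cloud placeholders.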

Skipping the errors and recording downloads in kdb

I am trying to use a kdb+ q script to download files from a remote source.
How can I make the downloads keep going if there is an error?
Also, how can I record on Linux which files have been downloaded, given that there are other files in the same directory?
Here is my code:
file:("abc.csv";"def.csv");
dbdir:"/home/terry/";
dlFunc:{system "download.sh abc.com user \"get /remote/path/",x,"\" ",dbdir};
dlFunc each file;
If you're asking how to continue downloading the other files when one file fails, you can put a protected eval (@[f;x;errhandler]) around your dlFunc each file, e.g.
@[dlFunc;;()]each file;
You could capture the list of failed files using something like:
badfiles:();
{@[dlFunc;x;{y;badfiles,:enlist x}x]}each file;
Then inspect the badfiles list afterwards. The ones that succeeded would be:
file except badfiles

How can I copy a file in Powershell without locking the source file which is being copied?

Is there a way to use the "Copy" command in PowerShell to copy a source file to a destination folder without locking the file being copied?
The reason for asking is that the file being copied is also an input file for a separate process; if that process starts while the PowerShell script is running, it will fail if the script has locked the input file.
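Copy-Item generally opens the source with shared read access, but if you still see conflicts, one hedged sketch (with hypothetical paths) is to open the source yourself with FileShare ReadWrite, so another process can open the file for reading or writing even mid-copy, and stream the bytes out:

```powershell
# Open the source allowing other readers AND writers during the copy.
$src = [System.IO.File]::Open('C:\data\input.csv', 'Open', 'Read', 'ReadWrite')
try {
    $dst = [System.IO.File]::Create('C:\backup\input.csv')
    try     { $src.CopyTo($dst) }
    finally { $dst.Dispose() }
}
finally { $src.Dispose() }
```

Note the trade-off: because writers are not excluded, the copy may capture a half-written file if the other process writes while the copy is in flight.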

make Restore-DfsrPreservedFiles continue after long path error

I am trying to get PowerShell's Restore-DfsrPreservedFiles to restore about 100 GB of files that got placed into the DfsrPrivate\PreExisting folder. I've got the command all worked out, and it does work until it hits a file that causes a "path name too long" error. At that point, having 99.9 GB restored with one file missing that I can get from backups later is not a problem.
I can't figure out how (if it's possible) to make it skip the one or two files it's having problems with and keep going.