I am using Oracle SQL Developer on macOS and I am trying to save a zip file into a BLOB field. I know how to load any other file type by clicking the "Load" button and then selecting the file.
The problem is that when I select the zip file and click Open, it does not select that file, but shows me the contents of the zip file, and then I can select only one file from inside that zip. This is not what I want, because I want to upload the whole zip.
Is there any setting in Oracle SQL Developer, or any other way, to do this?
I do not have this problem on the same table when using PL/SQL Developer on a Windows machine.
I have found out that when I have a zip file in a folder, Oracle SQL Developer's file chooser automatically shows a folder for that file as well, even though it does not exist on disk. The folder and the file share the same name, including the extension; only the icon differs. If you select the archive, loading into the BLOB works fine. This is the behaviour on macOS; I did not test it on other systems. The picture below shows an example of a folder containing only two zip files and how it looks when you try to select a file from that folder.
In a scheduler task I need to add files to FAL (to sys_file) which are already in the storage, uploaded via FTP. The normal way,
storage->addFile(....)
which copies the file from a temporary folder into the file storage and adds it to the sys_file table, does not work, because the file is already in the fileadmin directory. If I try, I get this error message:
[ERROR] Cannot add a file that is already part of this storage.
How is it possible to add a file to sys_file which is already in fileadmin?
Thanks!
AFAIK addFile() is to be used for files uploaded from a local disk. If the files are already available on the remote server, you should go for addUploadedFile() instead; a sketch follows below.
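A minimal sketch of that call, assuming a TYPO3 version where ResourceStorage::addUploadedFile() accepts a $_FILES-style array; $storage, the folder, the paths, and the file name here are made-up placeholders:
<?php
// Hedged sketch: $storage is assumed to be a
// \TYPO3\CMS\Core\Resource\ResourceStorage obtained elsewhere
// (e.g. from a StorageRepository); all names and paths are placeholders.
$uploadedFileData = [
    'name'     => 'import.csv',              // desired file name
    'type'     => 'text/csv',                // MIME type
    'tmp_name' => '/tmp/uploads/import.csv', // where the file currently sits
    'size'     => filesize('/tmp/uploads/import.csv'),
];
$targetFolder = $storage->getFolder('imports/');
// Moves the file into the storage and creates the sys_file record.
$file = $storage->addUploadedFile($uploadedFileData, $targetFolder, 'import.csv');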
I have a scheduled script that outputs a bunch of HTML files with static names to a remote location. I noticed that if I have one of those files selected in Windows Explorer, so that its contents are shown in the Preview Pane, then PowerShell cannot overwrite that file and skips updating it.
This only happens if the output files are in a remote location. It works just fine if the files are local.
How do I force PowerShell to overwrite remote files in this situation? Lots of users work with those reports, and if one of them leaves a Windows Explorer window with one of those files highlighted overnight when the script runs, the file is not going to be updated.
Move the HTML files to a web server; that will solve your problem entirely. IIS setup on Windows Server is just Next, Next, Next. You can leave a link to the new file location (https://....) in the old place, so users can easily navigate to the new place. Possibly this link can be automated (I am not sure, because of modern security standards).
Try [System.IO.File]::Delete($path) just before writing the file. This removes the file entry from the filesystem but leaves the file open for anyone who currently has it open, so your script writes to a new file with the same name. The old file exists without a name (deleted) but stays open until everyone closes it. Check that it was actually deleted with a refresh!
Try [System.IO.File]::Move($path, $someTrashFullName) just before writing the file; $someTrashFullName probably must be on the same drive. This is the same idea as Delete, but renames the file instead. Some self-updating software uses this strategy: the file is renamed, but it is still kept open under its new name. A sketch of both approaches follows below.
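A minimal sketch of the delete-before-write approach (with the move-before-write variant commented out); the UNC paths, file names, and report content are made-up placeholders:
# Hypothetical paths for illustration only.
$path  = '\\server\share\report.html'
$trash = '\\server\share\report.html.old'

# Option 1: unlink the old directory entry; existing open handles
# keep the old data alive until they are closed.
[System.IO.File]::Delete($path)

# Option 2 (alternative): rename the old file out of the way instead.
# [System.IO.File]::Move($path, $trash)

# Write the new report under the original name.
Set-Content -Path $path -Value '<html><body>new report</body></html>'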
Try replacing the file with a shortcut to some file. You can generate files with different names and change the shortcut programmatically.
HTML files that change location using JS? They read a nearby JSON file (generated by the export script) and look up the new filename there. So the user opens a static, unchanged A.html; the JS inside looks up the new name in A.json and redirects the user to A-2020-08-11.html. I'm not sure browsers allow JS to read JSON files when the page was opened from a network drive.
The only way left is to stop the network share and/or close the open files server-side.
Maybe have some fun with disabling the preview in this folder, or disabling it completely?
Try with -Force. But to me, it seems to be more of a permissions issue.
Remove-Item -Path '\\server\share\file' -Force
This guide indicates that you need both a file directory and a SQL file to accomplish this. Does anyone know a workaround?
https://localwp.com/help-docs/how-to-import-a-wordpress-site-into-local/
You can retrieve the backup archives from the starting-site folder. Within your WordPress folder, navigate to wp-content -> uploads -> backwpup-xxxxxx-backups. Open the archive; inside you'll find a .sql file (local.sql).
I am using macOS. SQL Developer is not installed; only the zip file is in the application folder. And when I click on the zip file, it extracts only one file, SQLDeveloper.exe. No other files or folders. Where do I find the .conf file? And where do I add my timezone?
In previous versions of Xcode I could simply go to Organizer -> MyDevice -> Applications, select the app I wanted to look at, and download its app data as a folder with all the app's content. Now I only get a .xcappdata file.
How can I access this file to take a look at a .sqlite file?
Under the Data files in Sandbox pane in the Organizer, you'll find all the individual files that the selected app stores on the device and uses, displayed in a hierarchical view.
To view the files in Finder, download the .xcappdata file, go to where you saved it in Finder, Control-click on it, and choose Show Package Contents. The directory structure is identical to what you see in the Organizer, and you can open and/or copy out the files as usual.
I'm constantly checking my app's database, so to speed things up I always download the .xcappdata to the same folder. I then run the following script, which sits in that folder, to look at the latest version of the database in sqliteman (a SQLite program available through MacPorts):
#!/bin/bash
# Expand to nothing instead of the literal pattern when nothing matches.
shopt -s nullglob
# Glob results are sorted, so the last match is the most recently
# downloaded package when the names embed a timestamp.
for PACKAGE in *.xcappdata; do
  CURRENT=$PACKAGE
done
# "yourapphere" is a placeholder; substitute your app's database name.
sqliteman "$CURRENT/AppData/Documents/yourapphere.sqlite"