I am trying to save a JSON value to a .json file in Ionic 3. I have this code but I can't get it to work: I get the error "A URI supplied to the API was malformed", even though I am not connecting to any API. Please see my code below:
fileName: string = "tickets.json";
saveTickets(obj: any) {
return this.file.writeFile('src/assets/data/', this.fileName, JSON.stringify(obj), {replace: false});
}
Thank you very much for your help.
Look into the docs
writeFile(path, fileName, text, options)
path is not a file-system path; it's the name of one of the Base FileSystems listed below.
path | string | Base FileSystem.
Base FileSystems:
applicationDirectory Read-only directory where the application is
installed.
applicationStorageDirectory Read-only directory where the application
is installed.
dataDirectory Where to put app-specific data files.
cacheDirectory Cached files that should survive app restarts. Apps
should not rely on the OS to delete files in here.
externalApplicationStorageDirectory Android: the application space on
external storage.
externalDataDirectory Android: Where to put app-specific data files on
external storage.
externalCacheDirectory Android: the application cache on external
storage.
externalRootDirectory Android: the external storage (SD card) root.
tempDirectory iOS: Temp directory that the OS can clear at will.
syncedDataDirectory iOS: Holds app-specific files that should be
synced (e.g. to iCloud).
documentsDirectory iOS: Files private to the app, but that are
meaningful to other applications (e.g. Office files)
sharedDirectory BlackBerry10: Files globally available to all apps
I am working on a webapp for transferring files with Aspera. We are using AoC for the transfer server and an S3 bucket for storage.
When I upload a file to my s3 bucket using aspera connect everything appears to be successful, I see it in the bucket, and I see the new file in the directory when I run /files/browse on the parent folder.
I am refactoring my code to use the /files/{id}/files endpoint to list the directory because the documentation says it is faster compared to the /files/browse endpoint. After the upload is complete, when I run the /files/{id}/files GET request, the new file does not show up in the returned data right away. It only becomes available after a few minutes.
Is there some caching mechanism in place? I can't find anything about this in the documentation. When I make a transfer in the AoC dashboard everything updates right away.
Thanks,
Tim
Yes, the file-id base system uses an in-memory cache (redis).
This cache is updated when a new file is uploaded via Aspera. But for files moved directly on the storage, there is a daemon that periodically scans for and finds new files.
If you want to bypass the cache, and have the API read the storage, you can add this header in the request:
X-Aspera-Cache-Control: no-cache
Another possibility is to trigger a scan by reading /files/{id} for the folder id.
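To illustrate the header approach, here is a minimal sketch of building the request with Python's standard library. The node URL, file id, and token are placeholders, not real values:

```python
import urllib.request

# Placeholder endpoint and credentials for illustration only.
url = "https://node.example.com/files/12345/files"
req = urllib.request.Request(url, headers={
    "Authorization": "Bearer <access-token>",   # placeholder token
    "X-Aspera-Cache-Control": "no-cache",       # bypass the cache, read the storage
})
# with urllib.request.urlopen(req) as resp:     # would perform the listing request
#     listing = resp.read()
```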
After upgrading to Android 11, access to my app’s files under /sdcard/Android/data/<packageId>/files is no longer possible. I get the following error:
type=1400 audit(0.0:672): avc: denied { read } for name="sdcard" dev="tmpfs" ino=6474 scontext=u:r:untrusted_app_29:s0:c244,c256,c512,c768 tcontext=u:object_r:mnt_sdcard_file:s0 tclass=lnk_file permissive=0 app=com.example.myapp
This behavior persists even after I request MANAGE_EXTERNAL_STORAGE in the manifest and grant it to the app.
I understand that, even with maximum storage permissions, Android 11 restricts access to certain paths, including /sdcard/Android/data. However, the directory I am trying to access is the app’s own data dir.
How do I get access to this path while using File and the like, rather than SAF?
Closer examination of the issue revealed that the app tried to access its own data dir through /mnt/sdcard/Android/data/<packageId>/files. While (on my device) /mnt/sdcard and /sdcard are both symlinks pointing to the same target, apparently they are not created equal.
Removing the path setting, causing the app to fall back to the data dir as reported by the API, fixed things for me.
I ran some more tests and found the following:
path as reported by the API: works
/sdcard/Android/data/<packageId>/files: works
/mnt/sdcard/Android/data/<packageId>/files: does not work
/storage/emulated/0/Android/data/<packageId>/files (replacing /sdcard with its symlink target): works
It is also worth noting that the docs mention /sdcard specifically.
Conclusion: an app's access to its external private storage area depends on the path used. Different ways of addressing the same location may yield different results.
I would like to download publicly available data from google cloud storage. However, because I need to be in a Python3.x environment, it is not possible to use gsutil. I can download individual files with wget as
wget http://storage.googleapis.com/path-to-file/output_filename -O output_filename
However, commands like
wget -r --no-parent https://console.cloud.google.com/path_to_directory/output_directoryname -O output_directoryname
do not seem to work, as they just download an index file for the directory. Initial attempts with rsync and curl did not work either. Any idea how to download publicly available data on Google Cloud Storage as a directory?
The approach you mentioned above does not work because Google Cloud Storage doesn't have real "directories". As an example, "path/to/some/files/file.txt" is the entire name of that object. A similarly named object, "path/to/some/files/file2.txt", just happens to share the same naming prefix.
As for how you could fetch these files: The GCS APIs (both XML and JSON) allow you to do an object listing against the parent bucket, specifying a prefix; in this case, you'd want all objects starting with the prefix "path/to/some/files/". You could then make individual HTTP requests for each of the objects specified in the response body. That being said, you'd probably find this much easier to do via one of the GCS client libraries, such as the Python library.
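The "prefix, not directory" model can be sketched in a few lines. The object and bucket names below are made up for illustration; a real listing would come from the GCS API or the Python client library's `list_blobs(prefix=...)`:

```python
# Object names in GCS are flat strings; a "directory" download is really an
# object listing filtered by a shared name prefix.
objects = [
    "path/to/some/files/file.txt",
    "path/to/some/files/file2.txt",
    "path/to/other/readme.md",
]

def list_with_prefix(names, prefix):
    """Mimic a GCS object listing with the `prefix` query parameter."""
    return [n for n in names if n.startswith(prefix)]

matches = list_with_prefix(objects, "path/to/some/files/")

# Each matching object could then be fetched individually, e.g. via its
# public URL (bucket name "my-bucket" is hypothetical):
urls = [f"https://storage.googleapis.com/my-bucket/{name}" for name in matches]
```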
Also, gsutil currently has a GitHub issue open to track adding support for Python 3.
I am currently trying to use the FTPHook in Airflow in order to upload and download files to/from a remote FTP server. But I'm not sure whether I can use a gs:// path as part of the source/destination path.
I currently don't want to use local folder within the AF pod since the file size might get big, so I would rather use gcs path directly or gcs file stream.
conn = FTPHook(ftp_conn_id='ftp_default')
conn.store_file('in', 'gs://bucket_name/file_name.txt')
link to the FTPHook code:
here
Thanks for any help!
I found a simple streaming solution to upload/download between GCS and an SFTP server using pysftp, which I'd like to share with you.
First, I found this solution, which was working great, but its only issue was that it didn't support uploading a file from GCS to the FTP server. So I kept looking for something else.
I then looked into a different approach and found this Google document, which basically allows you to stream to/from a blob file, which was exactly what I was looking for.
import pysftp
from airflow.hooks.base_hook import BaseHook
from google.cloud import storage

params = BaseHook.get_connection(self.ftp_conn_id)
cnopts = pysftp.CnOpts()
cnopts.hostkeys = None  # disables host-key checking; not recommended for production
sftp = pysftp.Connection(host=params.host, username=params.login,
                         password=params.password, port=params.port,
                         cnopts=cnopts)
# bucket name is assumed to be configured on the operator
bucket = storage.Client().bucket(self.bucket_name)

# This will stream a file from the SFTP server to the GCS location
with sftp.open(self.ftp_folder + '/' + file_to_load, 'r') as remote_file:
    blob = bucket.blob(self.gcs_prefix + file_to_load)
    blob.upload_from_file(remote_file)

# This will stream a file from GCS to the SFTP server
with sftp.open(self.ftp_folder + '/' + file_name, 'w') as remote_file:
    blob = bucket.blob(self.gcs_prefix + file_name)
    blob.download_to_file(remote_file)
GCS does not implement FTP support, so this won't work.
It looks like the FTPHook only knows how to deal with a local file path or buffer, not one of the GCS APIs.
You might be able to find (or write) some code that reads from FTP and writes to GCS.
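The "read from FTP, write to GCS" glue can be as simple as buffering the download in memory and handing the buffer to the GCS client. A minimal sketch, with the FTP callback mechanism stood in by a list of chunks (with a real connection you would call `ftp.retrbinary("RETR " + name, buf.write)` from `ftplib`):

```python
import io

def buffer_chunks(chunks):
    """Collect downloaded chunks into an in-memory buffer, rewound for reading."""
    buf = io.BytesIO()
    for chunk in chunks:
        buf.write(chunk)
    buf.seek(0)  # rewind so upload_from_file reads from the start
    return buf

buf = buffer_chunks([b"hello ", b"world"])
# blob.upload_from_file(buf)  # with a google-cloud-storage blob object
```

Note this holds the whole file in memory, so it only suits files that fit comfortably in the worker's RAM.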
I have an online/offline project.
I need to load a wav/ogg/mp3 file from Application.persistentDataPath on the WebGL platform.
I tried WWW/UnityWebRequest.
For example - WWW("file://" + Application.persistentDataPath + filePath);
But always get error: Failed to load: Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https.
Could you help me?
P.S. From remote server works fine.
You cannot load local files in a browser, as that would be a security risk: if you could, any webpage could read your hard drive and steal your files.
If you're just testing, you can run a local server.
If you want to let the user supply a file, you can let them choose one via a file picker and read the selection in the page.
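For the local-server route, Python's standard library is enough for testing. A minimal sketch that serves the current directory on an ephemeral port (the directory and index file names are whatever your WebGL build produced):

```python
import functools
import http.server
import threading

# Serve the current directory over HTTP on an ephemeral port.
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=".")
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
# The build can now be loaded from http://127.0.0.1:<port>/index.html
```

From the command line, `python -m http.server` in the build folder does the same thing.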