Swift, Mac OS X: uploading large files to an AWS S3 bucket - swift

I am building my first app in Swift and I need to upload large files to an S3 bucket. I tried uploading via Alamofire, but S3 has a maximum object size of 5 GB for a single PUT request. I also haven't found an AWS SDK for Swift. Does anyone have any ideas? Thank you!
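Objects larger than 5 GB cannot be sent with a single PUT; the S3 API requires a multipart upload for those. Below is a minimal sketch of that flow in Python with boto3, purely to illustrate the API (the bucket name and file path are assumptions); the same multipart calls can be driven from any HTTP client, including one written in Swift.

```python
# Minimal sketch: multipart upload to S3 with boto3 (bucket/key names are
# hypothetical). boto3 switches to multipart automatically above the
# configured threshold, which is how objects larger than the 5 GB
# single-PUT limit are uploaded.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # switch to multipart above 100 MB
    multipart_chunksize=100 * 1024 * 1024,  # upload in 100 MB parts
)

s3.upload_file(
    Filename="/path/to/large_file.bin",
    Bucket="my-bucket",
    Key="uploads/large_file.bin",
    Config=config,
)
```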

Related

Upload image error -> blob:http://localhost:3000/48c7da66-42c0-4ed3-8691-2dedd5ce4984:1 Failed to load resource: net::ERR_FILE_NOT_FOUND [duplicate]

I built a MERN app and hosted it on Heroku.
I save users' images on the server with multer, and it works fine for a while, i.e. an uploaded image is fetched successfully.
But after the application has been closed for a long time, that image is no longer available on the server.
On searching I found that each dyno on Heroku boots with a clean copy of the filesystem from the most recent deploy.
But then how and where should I save images?
The dyno filesystem is ephemeral, so you need to store files on external storage (e.g. S3, Dropbox) or use a Heroku add-on (e.g. for FTP).
Check Files on Heroku to understand (free) options for storing/managing files (the examples are in Python, but the concept is valid for other stacks too).
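For example, instead of writing uploads to the dyno's filesystem, stream them straight to S3 and serve them from there. A minimal sketch in Python with boto3 (the bucket name, key prefix, and helper names are assumptions for illustration; a MERN app would do the same with the AWS SDK for Node):

```python
# Minimal sketch: store an uploaded image in S3 instead of the ephemeral
# dyno filesystem. Bucket name and key prefix are hypothetical; credentials
# come from AWS_* config vars on Heroku.
import boto3

s3 = boto3.client("s3")

def save_upload(file_obj, filename):
    """Upload a file-like object and return its S3 key."""
    key = f"user-images/{filename}"
    s3.upload_fileobj(file_obj, "my-app-uploads", key)
    return key

def image_url(key):
    """Serve the image via a time-limited presigned URL instead of a dyno path."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-app-uploads", "Key": key},
        ExpiresIn=3600,  # one hour
    )
```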

How to edit files directly on Google Cloud Storage using VS Code?

Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
We have a GCS bucket with about 20 config files that we edit for various reasons. We would really just like to load the bucket into VS Code and then browse between updating the files and saving edits.
I have tried the vscode-bucket-explorer extension for VS Code but this just seems to provide viewing capability with no editing/saving capability. Unless I am missing something?
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
No, objects (blobs) in Google Cloud Storage cannot be edited in place.
As with buckets, existing objects cannot be directly renamed. Instead, you can copy an object, give the copied version the desired name, and delete the original version of the object. See Renaming an object for a step-by-step guide, including instructions for tools like gsutil and the Google Cloud Console, which handle the renaming process automatically.
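As a concrete illustration of that copy-then-delete flow, here is a minimal sketch with the google-cloud-storage Python client (bucket and object names are assumptions):

```python
# Minimal sketch: "rename" a GCS object by copying it to the new name and
# deleting the original (names are hypothetical).
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-config-bucket")

src = bucket.blob("configs/app.old.yaml")
bucket.copy_blob(src, bucket, new_name="configs/app.yaml")  # server-side copy
src.delete()  # remove the original
```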
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
You can use Cloud Storage FUSE, where the mounted bucket will behave similarly to a persistent disk.
Cloud Storage FUSE is an open source FUSE adapter that allows you to mount Cloud Storage buckets as file systems on Linux or macOS systems. It also provides a way for applications to upload and download Cloud Storage objects using standard file system semantics. Cloud Storage FUSE can be run anywhere with connectivity to Cloud Storage, including Google Compute Engine VMs or on-premises systems.
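Once the bucket is mounted (for example with gcsfuse; the bucket name and mount point below are assumptions), the objects behave like ordinary files, so VS Code or any script can edit them directly. A minimal sketch of that standard file-system access:

```python
# Minimal sketch: with the bucket mounted via Cloud Storage FUSE at ~/gcs
# (mount point and file names are hypothetical), objects read and write
# like regular files.
from pathlib import Path

cfg = Path.home() / "gcs" / "configs" / "app.yaml"

text = cfg.read_text()  # reads the GCS object through the mount
cfg.write_text(text.replace("debug: true", "debug: false"))  # writes it back
```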

Why does Azure CDN return the old version of a file on a custom domain?

I have a file uploaded to my Azure storage, and I have now replaced it with another version of this file.
The old file size was 22 MB; the new version is about 10 MB.
After the replacement, when I try to download the file through my custom domain, it still downloads the old file (22 MB).
But when I try to download it with its original URL (storageName.blob.core.windows.net), I get the correct file.
I have tried to set the cache-control header to 1 minute (max-age=1) using Microsoft Azure Storage Explorer, but it didn't help.
Why does this behavior happen, and how can I solve this problem?
When you have a CDN configured in front of Azure Storage and you update the file in Storage, the CDN will still serve the cached old file until the TTL expires.
So you should either do a purge or configure the caching rules to get the desired behavior.
You can read more about caching rules in CDN here.
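As a complement to the caching rules, you can also set a short Cache-Control header on the blob itself so the CDN re-fetches it sooner (note that max-age is expressed in seconds, so one minute is max-age=60). A minimal sketch with the azure-storage-blob Python package, where the connection string, container, and blob names are placeholder assumptions; a one-time purge of the endpoint is still needed to evict what is already cached:

```python
# Minimal sketch: set a short Cache-Control on the blob so the CDN refreshes
# it more often (names and connection string are hypothetical; max-age is
# in seconds, so one minute is max-age=60).
from azure.storage.blob import BlobClient, ContentSettings

blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="assets",
    blob_name="downloads/big-file.bin",
)

blob.set_http_headers(
    content_settings=ContentSettings(cache_control="max-age=60")
)
```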

Compress / zip multiple files on google cloud storage without downloading

I want to compress/zip multiple files in a Google Cloud Storage bucket into a single zip file without downloading them.
Is there any gsutil CLI method that takes multiple paths as input and copies a zip/compressed archive of all those input files?
Thank you in advance.
Nope, there's no functionality in GCS that supports this. And if the API doesn't support it, no tools or client libraries can, as they're simply making API calls under the hood.
Here is a tool that does it, though not natively; you can run it on your machine or host it on Google Cloud:
https://www.npmjs.com/package/zip-bucket
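If you prefer to script it yourself, the same approach can be expressed with the google-cloud-storage Python client: stream each object into an in-memory zip and upload the archive back to the bucket, so nothing touches local disk. A minimal sketch (bucket and object names are assumptions, and the archive is built in memory, so this only suits modest sizes):

```python
# Minimal sketch: zip several GCS objects into one archive and upload it
# back to the bucket without writing to local disk (names are hypothetical;
# the archive is built in memory, so keep the total size modest).
import io
import zipfile
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")

paths = ["reports/a.csv", "reports/b.csv", "reports/c.csv"]

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for path in paths:
        data = bucket.blob(path).download_as_bytes()  # pull object into memory
        zf.writestr(path, data)                       # add it to the archive

buf.seek(0)
bucket.blob("reports/archive.zip").upload_from_file(buf, content_type="application/zip")
```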

Auto upload remote files into Google Cloud Storage via FTP?

I download a lot of CSV files via FTP from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/APIs/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally. Something that I can deploy on Google Compute Engine, so I don't need to run local programs like FileZilla/CrossFTP. The program/tool would keep checking the remote location on a regular basis and load new files into Google Cloud Storage, ensuring a checksum match.
I apologize in advance if this is too vague/generic a question.
Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
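That said, it is straightforward to roll your own poller and run it on a Compute Engine VM or as a scheduled job: fetch new files over FTP into memory, upload them to GCS, and verify checksums. A minimal sketch, with the FTP host, credentials, and bucket name as placeholder assumptions:

```python
# Minimal sketch: poll an FTP server and copy new CSV files into GCS,
# verifying the MD5 reported by GCS against a locally computed one
# (host, credentials, and bucket names are hypothetical).
import base64
import hashlib
import io
from ftplib import FTP

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-ingest-bucket")

with FTP("ftp.example.com") as ftp:
    ftp.login("user", "password")
    for name in ftp.nlst():
        if not name.endswith(".csv"):
            continue
        blob = bucket.blob(f"ftp/{name}")
        if blob.exists():
            continue  # already ingested

        buf = io.BytesIO()
        ftp.retrbinary(f"RETR {name}", buf.write)  # fetch into memory
        data = buf.getvalue()

        blob.upload_from_string(data, content_type="text/csv")

        # GCS exposes the object's MD5 as base64; compare with our own.
        blob.reload()
        local_md5 = base64.b64encode(hashlib.md5(data).digest()).decode()
        if blob.md5_hash != local_md5:
            raise RuntimeError(f"Checksum mismatch for {name}")
```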