Why does Azure CDN return the old version of a file with a custom domain?

I have a file uploaded to my Azure Storage account, and I have now replaced it with a new version of the file.
The old file was 22 MB; the new version is about 10 MB.
After the replacement, when I try to download the file through my custom domain, I still get the old 22 MB file.
But when I download it from its original URL (storageName.blob.core.windows.net), I get the correct file.
I have tried setting a short cache-control header (max-age=1) using Microsoft Azure Storage Explorer, but it didn't help.
Why does this happen, and how can I solve the problem?

When you have a CDN configured in front of Azure Storage and you update a file in Storage, the CDN will keep serving the cached old file until the TTL expires.
So you should either purge the CDN endpoint or configure the caching rules to get the behavior you want.
You can read more about caching rules in CDN here.
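If you just need the new version served right away, purging is the quickest fix (for example, az cdn endpoint purge --content-paths '/*' with the Azure CLI). Longer term, a shorter Cache-Control on the blob keeps the CDN TTL down, provided no caching rule on the CDN profile overrides origin headers. A minimal sketch with the azure-storage-blob Python SDK, using a placeholder connection string, container, and blob name:

    from azure.storage.blob import BlobServiceClient, ContentSettings

    # Placeholder connection string, container and blob names for illustration.
    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob = service.get_blob_client(container="assets", blob="myfile.bin")

    # Ask downstream caches (including the CDN, if it honors origin headers)
    # to keep the object for at most 60 seconds.
    blob.set_http_headers(content_settings=ContentSettings(cache_control="public, max-age=60"))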

Related

MAUI: Persisting access to Nextcloud files via the file picker in connection with an iOS File Provider

I currently have a problem, specifically on iOS: when I select a file through the FilePicker that is located in Nextcloud and integrated into the file manager through the File Provider, I get a shared path to the file, and at that point access to the file works.
However, if I save the path and want to access it again after restarting the application, this is no longer possible. I ran this on a local device.
Access to the path '/private/var/mobile/Containers/Shared/AppGroup/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/File Provider Storage/abcd.../test.xyz' is denied.
I couldn't figure it out from Apple's documentation:
https://developer.apple.com/documentation/fileprovider
Is the selected file only temporarily accessible? How can I permanently access, e.g., a file in Nextcloud via the FilePicker without implementing a WebDAV client in the app and constantly keeping the selected file up to date?
On Android, as far as I have read, a permanent copy of the file is created, and access to the file also works when the application is restarted. There, however, the problem is that the copy is not updated.
The permission checks and requests are handled automatically by .NET MAUI, but when you use the FilePicker you need to enable the iCloud capabilities.
For how to enable the capabilities, you can check the Microsoft docs: https://learn.microsoft.com/en-us/dotnet/maui/ios/deployment/entitlements

How to edit files directly on Google Cloud Storage using VS Code?

Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
We have a GCS bucket with about 20 config files that we edit for various reasons. We would really just like to open the bucket in VS Code and browse between the files, updating them and saving the edits.
I have tried the vscode-bucket-explorer extension for VS Code, but it only seems to provide viewing capability, with no editing/saving capability. Unless I am missing something?
Is there a way to mount a bucket as a drive on a Mac, with read/write ability?
Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
No, objects in Google Cloud Storage cannot be edited in place.
As with buckets, existing objects cannot be directly renamed. Instead, you can copy an object, give the copied version the desired name, and delete the original version of the object. See Renaming an object for a step-by-step guide, including instructions for tools like gsutil and the Google Cloud Console, which handle the renaming process automatically.
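Because objects are immutable, any "edit" is really a download, a local change, and a rewrite of the object under the same name, which is what any editor integration has to do under the hood. A minimal sketch with the google-cloud-storage Python client (the bucket, object name, and edit are made up):

    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket("my-config-bucket").blob("app/config.yaml")  # hypothetical names

    # "Editing" = download the current contents, change them locally,
    # and upload a replacement object under the same name.
    text = blob.download_as_text()
    text = text.replace("log_level: info", "log_level: debug")
    blob.upload_from_string(text, content_type="text/plain")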
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
You can use Cloud Storage FUSE, where the mounted bucket will behave similarly to a persistent disk.
Cloud Storage FUSE is an open source FUSE adapter that allows you to mount Cloud Storage buckets as file systems on Linux or macOS systems. It also provides a way for applications to upload and download Cloud Storage objects using standard file system semantics. Cloud Storage FUSE can be run anywhere with connectivity to Cloud Storage, including Google Compute Engine VMs or on-premises systems.
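Once the bucket is mounted (for example with gcsfuse my-bucket ~/gcs, where the bucket name and mount point are placeholders), ordinary file I/O works against it, so you could open the mount point as a folder in VS Code and each save rewrites the underlying object. A quick sketch of those standard file system semantics:

    from pathlib import Path

    # Assumes the bucket has already been mounted, e.g. `gcsfuse my-bucket ~/gcs`
    # (bucket name and mount point are placeholders).
    mount = Path.home() / "gcs"

    # Each path under the mount point maps to an object in the bucket,
    # so plain file I/O reads and rewrites the corresponding objects.
    note = mount / "notes.txt"
    note.write_text("edited through the Cloud Storage FUSE mount\n")
    print(note.read_text())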

Compress / zip multiple files on google cloud storage without downloading

I want to compress / zip multiple files in a Google Cloud Storage bucket into a single zip file without downloading them.
Is there any gsutil CLI method that takes multiple paths as input and copies a zip / compressed archive of all those input files?
Thank you in advance.
Nope, there's no functionality in GCS that supports this. And if the API doesn't support it, no tools or client libraries can, as they're simply making API calls under the hood.
Here is an option, though it is not native to GCS; you can run it on your own machine or, better, on Google Cloud:
https://www.npmjs.com/package/zip-bucket
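Since GCS has no server-side zip operation, whatever does the zipping still has to stream the bytes through some machine (for example a small Compute Engine VM), even if nothing touches local disk. A rough sketch of that pattern with the google-cloud-storage Python client, using made-up bucket and object names:

    import io
    import zipfile
    from google.cloud import storage

    def zip_objects(bucket_name, object_names, dest_object_name):
        """Stream several objects into a single zip object, held in memory."""
        bucket = storage.Client().bucket(bucket_name)

        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as archive:
            for name in object_names:
                archive.writestr(name, bucket.blob(name).download_as_bytes())

        buf.seek(0)
        bucket.blob(dest_object_name).upload_from_file(buf, content_type="application/zip")

    # Hypothetical call:
    # zip_objects("my-bucket", ["reports/a.csv", "reports/b.csv"], "archives/reports.zip")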

How do I exclude a sub folder/directory from the Azure Backup for an App Service?

How do I exclude a sub folder/directory from the Azure Backup for an App Service?
Our backup system seems to fail because this folder makes the site exceed the backup limit. So I'd like to exclude only that folder.
The website + database size exceeds the 10 GB limit for backups. Your content size is 10 GB.
#AjayKumar-MSFT's link to Partial Backups works, but here are the details:
Create a file called _backup.filter.
List one directory or file to exclude per line.
Upload the _backup.filter file to the D:\home\site\wwwroot\ directory of your site.
Example:
\site\wwwroot\Images\brand.png
\site\wwwroot\Images\2014
\site\wwwroot\Images\2013
You may filter out unnecessary files from the backup by configuring Partial Backups. You can exclude static content that does not change often, such as videos/images, or unneeded log files (directories).
Partial backups allow you to choose exactly which files you want to back up.
First, I suggest you check your app's disk usage (make sure your web app really uses more than 10 GB). You can determine folder content sizes using the Azure Web Apps Disk Usage site extension.
You can follow these steps to see the disk usage:
Browse the Kudu site for your web app: https://sitename.scm.azurewebsites.net. Click on Site extensions and then click on Gallery.
Search for the Azure Web Apps Disk Usage site extension. Click the + icon to install it.
Click Run to start the Disk Usage site extension.
If your web site does not exceed 10 GB, I suggest you create a new App Service plan, move your web app to it, and test again; maybe something is wrong with the web app's server.
If this still doesn't solve your issue (you don't exceed 10 GB but still see this error), I suggest you create an Azure support request.
How do I exclude a sub folder/directory from the Azure Backup for an App Service?
As Ajay says, you could use Partial backups.

Auto upload remote files into Google cloud storage via FTP?

I download a lot of CSV files via FTP from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/APIs/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally first: something I can deploy on Google Compute Engine so I don't need to run local programs like FileZilla/CrossFTP. The program/tool would keep checking the remote location on a regular basis, load new files into Google Cloud Storage, and ensure a checksum match.
I apologize in advance if this is too vague/generic a question.
Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
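Since GCS will not pull from an FTP server on its own, the usual workaround is a small job on a Compute Engine VM (or run on a schedule elsewhere) that polls the FTP server and streams new files into the bucket. A minimal sketch, assuming Python's standard ftplib plus the google-cloud-storage client; the host, credentials, and bucket name are placeholders:

    import base64
    import hashlib
    import io
    from ftplib import FTP

    from google.cloud import storage

    BUCKET = "my-ingest-bucket"                                           # placeholder
    FTP_HOST, FTP_USER, FTP_PASS = "ftp.example.com", "user", "password"  # placeholders

    def sync_once():
        bucket = storage.Client().bucket(BUCKET)
        with FTP(FTP_HOST) as ftp:
            ftp.login(FTP_USER, FTP_PASS)
            for name in ftp.nlst():              # list files in the current FTP directory
                blob = bucket.blob(name)
                if blob.exists():                # skip files that were already uploaded
                    continue
                buf = io.BytesIO()
                ftp.retrbinary(f"RETR {name}", buf.write)
                data = buf.getvalue()
                blob.upload_from_string(data)
                blob.reload()                    # fetch the MD5 recorded by GCS
                local_md5 = base64.b64encode(hashlib.md5(data).digest()).decode()
                assert blob.md5_hash == local_md5, f"checksum mismatch for {name}"

    if __name__ == "__main__":
        sync_once()   # run on a schedule (cron / Cloud Scheduler) to keep checking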