Where does Azurite store blobs, queues and tables on Mac?

I'm developing an Azure Function in VS Code. I see that a bunch of files are created in my workspace folder. However, even after I delete them, I still see a bunch of containers etc. when I open Azure Storage Explorer. How can I delete all of them in one command?

Folders in Azure Storage aren't really created or deleted. Azure Blob storage has no concept of folders; everything inside a container is a blob, and what Storage Explorer presents as a folder is just a name prefix (which is why you can "delete a folder" there together with its contents). A folder exists only as long as there are blobs stored under its prefix. The way to delete a folder programmatically is to retrieve all blobs in it using ListBlobsSegmentedAsync and call DeleteIfExists() on each of them, as in the sketch below.
Ref: There are similar discussion threads elsewhere; refer to the suggestions mentioned in this Q&A thread and SO thread.
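For illustration, here is a minimal sketch of that delete loop using the legacy Microsoft.Azure.Storage .NET SDK (the one that exposes ListBlobsSegmentedAsync), pointed at the local emulator; the container and folder names are hypothetical placeholders:

```csharp
using System.Linq;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

// "UseDevelopmentStorage=true" targets the local Azurite/Storage Emulator.
// The container name and "folder" prefix are hypothetical placeholders.
var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var container = account.CreateCloudBlobClient().GetContainerReference("my-container");

BlobContinuationToken token = null;
do
{
    // A flat listing returns every blob under the prefix, i.e. the "folder" contents.
    var segment = await container.ListBlobsSegmentedAsync(
        "my-folder/", useFlatBlobListing: true, BlobListingDetails.None,
        maxResults: null, currentToken: token, options: null, operationContext: null);

    foreach (var blob in segment.Results.OfType<CloudBlockBlob>())
        await blob.DeleteIfExistsAsync();

    token = segment.ContinuationToken;
} while (token != null);
```

Once the last blob under the prefix is deleted, the "folder" itself disappears from Storage Explorer.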

Related

How to restore DatabricksRoot(FileStore) data after workspace is decommissioned?

My Azure Databricks workspace was decommissioned. I forgot to copy files stored in the DatabricksRoot storage (dbfs:/FileStore/...).
Can the workspace be recommissioned/restored? Is there any way to get my data back?
Unfortunately, an end user cannot restore a Databricks workspace.
It can only be done by raising a support ticket here.
It is best practice not to store any data elements in the root Azure Blob storage that is used for root DBFS access for the workspace; the root DBFS storage is not supported for production customer data. You might, however, store other objects there such as libraries, configuration files, and init scripts. Either develop an automated process to replicate these objects (a sketch follows below) or keep processes in place to update the secondary deployment manually.
Refer to https://learn.microsoft.com/en-us/azure/databricks/administration-guide/disaster-recovery#general-best-practices
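Since the workspace cannot be restored once decommissioned, any such replication has to run while it is still reachable. As a rough, hedged sketch of an automated copy, the DBFS REST API (/api/2.0/dbfs/list and /api/2.0/dbfs/read) can be walked recursively; the workspace URL, token variable, and paths below are placeholders:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Threading.Tasks;

// Placeholder workspace URL; the personal access token is read from an
// environment variable.
var host = "https://adb-1234567890123456.7.azuredatabricks.net";
using var http = new HttpClient { BaseAddress = new Uri(host) };
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Bearer", Environment.GetEnvironmentVariable("DATABRICKS_TOKEN"));

await CopyDirAsync("/FileStore", "./dbfs-backup");

async Task CopyDirAsync(string dbfsPath, string localPath)
{
    Directory.CreateDirectory(localPath);
    using var listing = JsonDocument.Parse(
        await http.GetStringAsync($"/api/2.0/dbfs/list?path={Uri.EscapeDataString(dbfsPath)}"));
    // An empty directory has no "files" property in the response.
    if (!listing.RootElement.TryGetProperty("files", out var files)) return;

    foreach (var f in files.EnumerateArray())
    {
        var path = f.GetProperty("path").GetString();
        var target = Path.Combine(localPath, Path.GetFileName(path));
        if (f.GetProperty("is_dir").GetBoolean())
        {
            await CopyDirAsync(path, target);
            continue;
        }

        // dbfs/read returns at most 1 MB of base64-encoded data per call,
        // so page through larger files.
        using var output = File.Create(target);
        long offset = 0;
        while (true)
        {
            using var chunk = JsonDocument.Parse(await http.GetStringAsync(
                $"/api/2.0/dbfs/read?path={Uri.EscapeDataString(path)}&offset={offset}&length=1048576"));
            var bytesRead = chunk.RootElement.GetProperty("bytes_read").GetInt64();
            if (bytesRead == 0) break;
            var data = Convert.FromBase64String(chunk.RootElement.GetProperty("data").GetString());
            output.Write(data, 0, data.Length);
            offset += bytesRead;
        }
    }
}
```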

Is it possible to revert the action in Firebase Cloud Storage?

Folders with images in my bucket are almost all gone. I do have a backup, but I don't know if it's possible to upload multiple folders with files at once.
Is there any way to revert the accidental deletion of folders in a bucket in the Cloud Console, or are there any snapshots of the storage being kept?
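If re-uploading the backup turns out to be the way forward, uploading multiple folders at once can be scripted. A minimal sketch with the Google.Cloud.Storage.V1 .NET client, where the bucket name and backup path are placeholders:

```csharp
using System;
using System.IO;
using Google.Cloud.Storage.V1;

// Placeholder bucket name and local backup folder; adjust for your project.
var bucketName = "my-app.appspot.com";
var backupRoot = "/path/to/backup";

var storage = StorageClient.Create(); // uses Application Default Credentials

// Walk the local backup tree and upload every file, preserving the
// folder structure in the object names.
foreach (var file in Directory.EnumerateFiles(backupRoot, "*", SearchOption.AllDirectories))
{
    var objectName = Path.GetRelativePath(backupRoot, file).Replace('\\', '/');
    using var stream = File.OpenRead(file);
    storage.UploadObject(bucketName, objectName, contentType: null, source: stream);
    Console.WriteLine($"Uploaded {objectName}");
}
```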

How to Use Cloud Storage Fuse to backup files when deleted in local machine

I installed gcsfuse on my local macOS system and mounted a folder to a Cloud Storage bucket.
Everything works fine, but if I delete a file from the mounted folder, it is also deleted in the bucket.
I don't want this to happen: whenever I delete a file, it should only be deleted on my local machine.
Can anyone help me do this?
Thanks.
You can't do this with the official version of gcsfuse.
As a workaround, you can activate object versioning. That way, even if you delete a file, a versioned copy still lives in your bucket; you lose nothing.
This video is also great for explaining versioning.
If you really want gcsfuse with this special behaviour, you can fork the open-source project and remove the delete handling from its code.
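For reference, a minimal sketch of enabling object versioning with the Google.Cloud.Storage.V1 .NET client (the bucket name is a placeholder; the Cloud Console can do the same):

```csharp
using Google.Apis.Storage.v1.Data;
using Google.Cloud.Storage.V1;

// Placeholder bucket name; authenticates with Application Default Credentials.
var storage = StorageClient.Create();
var bucket = storage.GetBucket("my-bucket");

// Turn on object versioning so a delete through gcsfuse keeps a
// noncurrent copy instead of destroying the object.
bucket.Versioning = new Bucket.VersioningData { Enabled = true };
storage.UpdateBucket(bucket);
```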

copy files from azure file storage to azure website after release

I have files that need to be copied over to my website (an Azure website) after a deployment has been made. These files are usually server specific (I have multiple servers for different releases), and in the past, before I used Azure, I just kept a backup folder with these files and a PowerShell script that I ran after deployment to copy them over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure File storage, and then, in my release task after the Azure website deployment, copying them from that file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Yes, you can use the Azure File Copy task. I did a demo copying a zip file to Azure Storage, and it works correctly on my side.
Note: if you don't want to zip the files, you can remove the Archive Files task.
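If you prefer to script the second hop yourself (file storage into site\wwwroot), here is a minimal, hedged sketch using the legacy Microsoft.Azure.Storage.File .NET SDK; the share name, connection string variable, and target path are placeholders:

```csharp
using System;
using System.IO;
using System.Linq;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.File;

// Placeholder connection string and share name; in a release these would
// come from pipeline variables.
var account = CloudStorageAccount.Parse(
    Environment.GetEnvironmentVariable("FILE_STORAGE_CONNECTION"));
var rootDir = account.CreateCloudFileClient()
    .GetShareReference("release-files")
    .GetRootDirectoryReference();

FileContinuationToken token = null;
do
{
    var segment = await rootDir.ListFilesAndDirectoriesSegmentedAsync(token);
    foreach (var file in segment.Results.OfType<CloudFile>())
    {
        // Overwrite the copy the deployment left in wwwroot.
        await file.DownloadToFileAsync(
            Path.Combine(@"D:\home\site\wwwroot", file.Name), FileMode.Create);
    }
    token = segment.ContinuationToken;
} while (token != null);
```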

How to deploy only worker/web role in Azure

If you have a web AND a worker role in an Azure solution, all the waiting for the update package to be published, uploaded to cloud storage, and deployed can be exhausting and waste a lot of time.
How can you upload/deploy only the worker or web role of a Microsoft Azure solution that contains both roles, and save both internet traffic and time?
There is no option to build a package for only one of the two roles, but if you have limited bandwidth or traffic and want to cut the upload time (which can be quite a big portion if you have a lot of static content; look here for an example), there is one option.
As you may know, the package Visual Studio generates for deployment (the .cspkg file) is nothing more than an archive file.
Suppose you want to update the WORKER role only. The steps are:
1. Create the update package as normal.
2. Open it with the best archive manager (7zfm).
3. Inside, besides the other files, are two .cssx files, one for each role. Delete the unnecessary .cssx file (see the sketch after this list).
4. Upload the package to Azure Blob Storage (optional).
5. Update the instances from the Azure Management Portal using the 'local' or 'storage' source as normal.
6. In the Role dropdown, select only the role you want to update.
7. Press OK :)
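Step 3 can also be automated. A minimal sketch, assuming a recent SDK where the .cspkg is a standard zip archive (the package and role file names are hypothetical):

```csharp
using System.IO.Compression;
using System.Linq;

// Open the package in place and drop the web role's payload, leaving
// only the worker role to be updated. Names are placeholders.
using (var package = ZipFile.Open("MyCloudService.cspkg", ZipArchiveMode.Update))
{
    foreach (var entry in package.Entries
                 .Where(e => e.FullName.EndsWith("WebRole1.cssx"))
                 .ToList()) // snapshot: deleting invalidates the live collection
    {
        entry.Delete();
    }
}
```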
Hope this helps.
It is a lot easier to just add two additional cloud projects to your solution: in one project, reference only your web role; in the other, reference only your worker role.
You can keep the cloud project that references both roles and use it for local debugging, but when it is time to deploy, right-click the cloud project that references only the role you wish to deploy and click "Publish".
You will end up maintaining configuration files for each cloud project, but that sounds a lot easier than manually messing around with editing the package file each time.