How to use Cloud Storage FUSE to keep a backup of files deleted on the local machine - google-cloud-storage

I installed gcsfuse on my local macOS system and mounted a folder to a Cloud Storage bucket.
Everything works fine, but if I delete a file from the mounted folder, it is also deleted from the bucket.
I don't want this to happen.
Whenever I delete a file, it should only be deleted on my local machine.
Can anyone help me do this?
Thanks.

You can't do this with the official version of gcsfuse.
As a workaround, you can activate Object Versioning. That way, even if you delete a file, a versioned copy still lives in your bucket, so you lose nothing.
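If you prefer to script it, enabling versioning is a single call. Here is a minimal sketch with the google-cloud-storage Python client; the bucket name is a placeholder, and `gsutil versioning set on gs://my-mounted-bucket` does the same thing from the CLI:

```python
# Sketch: enable Object Versioning on a bucket so that deletes made
# through gcsfuse leave a noncurrent version behind.
# "my-mounted-bucket" is a placeholder name.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-mounted-bucket")

bucket.versioning_enabled = True  # turn on Object Versioning
bucket.patch()                    # push the change to Cloud Storage

print(f"Versioning enabled: {bucket.versioning_enabled}")
```

Keep in mind that noncurrent versions still count toward storage costs, so you may also want a lifecycle rule that limits how many versions are kept.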
This video is also great for explaining versioning.
If you really want gcsfuse to behave this way, you can fork the open source project and remove the delete logic from its code.

Related

Where does Azurite store blobs, queues and tables on Mac?

I'm developing an Azure Function on VSCode. I see that a bunch of files are created in my workspace folder. However, even if I delete them, when I open Azure Storage Explorer, I still see a bunch of containers etc. How can I delete all of them in one command?
Folders in Azure Storage aren't really created or deleted; they exist only as long as there are blobs stored in them. (Azure Blob Storage does not have a concept of folders: everything inside the container is considered a blob, including the "folders". You can easily delete a folder, including all its contents, in Storage Explorer.) The way to delete a folder is to retrieve all blobs in it using ListBlobsSegmentedAsync and call DeleteIfExists() on each of them.
Ref: There are similar discussion threads; refer to the suggestions mentioned in this Q&A thread and this SO thread.
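If you'd rather script the cleanup than click through Storage Explorer, a rough Python equivalent using the azure-storage-blob package looks like the sketch below; the connection string targets the local emulator, and the container and folder names are placeholders:

```python
# Sketch: delete every blob under a "folder" prefix, which effectively
# removes the folder. Container name and prefix are placeholders.
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="UseDevelopmentStorage=true",  # Azurite / local emulator shortcut
    container_name="my-container",
)

# list_blobs(name_starts_with=...) plays the role of the
# ListBlobsSegmentedAsync loop mentioned above.
for blob in container.list_blobs(name_starts_with="my-folder/"):
    container.delete_blob(blob.name)
    print(f"Deleted {blob.name}")
```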

Deploying web app from Visual Studio Code to Azure but leave out a data folder

I am building a very small Node/Express API app in Azure, using Twilio to route communication for a small group. I initially built out a data structure for users in CosmosDB, but found out it costs a minimum of $24 per month, which is way over budget for something that will likely hold 20 or so records. Because of this, it seems much more reasonable to just build this into a JSON file that sits in a ./json subfolder. However, it has occurred to me that whenever I deploy, I would be overwriting this file with the default file I have locally. I have been working via the Azure App Service tool in Visual Studio Code and can't figure out a way to make it ignore the file.
I can go into Kudu and copy the file down each time before I deploy, but I will eventually forget and this sounds like a very brittle process.
I added a json/ line to .gitignore, but that has no effect on the deployment (as expected).
I also added "appService.zipIgnorePattern": ["json{,/**}"] to the settings.json file, but instead of just ignoring that folder on the server, it erases it on deploy (the zip ignores it and then the deployment wipes/replaces the whole wwwroot folder). Looking for the file gives me {"Message":"'D:\\home\\site\\wwwroot\\json\\users.json' not found."}
I was hoping there is a setting that would deploy, replacing all folders in the package, and ignoring all content in the ./json folder. Does this exist?
Alternative solution, 2021:
Instead of excluding folders, select the folder that you do want to deploy. Data in other folders will not be affected.
Deploy from: edit .vscode/settings.json in your local project and add "appService.deploySubpath": "./folderToDeploy"
Deploy to: In the Azure Portal go to your app service. Under Configuration / Application Settings add a new Application Setting with name SCM_TARGET_PATH and value ./folderToDeployTo
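For reference, the local side of that setup might look roughly like this in .vscode/settings.json; "./folderToDeploy" is just the placeholder name carried over from the answer above:

```jsonc
{
  // Deploy only the contents of this folder; the path is a placeholder.
  "appService.deploySubpath": "./folderToDeploy"
}
```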
Using the VS Code right-click deploy will then deploy only the contents of that folder.
I was able to work around this by adding Azure as a remote branch and using .gitignore. I placed my json file inside a random folder (content/json) and then added /content/json to my .gitignore file.

Copy files from Azure File Storage to an Azure website after release

I have files that need to be copied over to my website (an Azure website) after a deployment has been made. Usually these files are server specific (I have multiple servers for different releases). In the past, before I used Azure, I just had a backup folder with these files and a PowerShell script that I ran after deployment to copy them over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure File Storage and then, in my release task after the Azure website deployment, copying them from file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Is there a release task I can use with this in mind?
Yes, you can use the Azure File Copy task. I also did a demo copying a zip file to Azure storage, and it works correctly on my side. For more information, please refer to the screenshot.
Note: If you don't want to zip the files, you can remove the Archive Files task.
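For illustration, a YAML pipeline version of that copy step could look roughly like this; the service connection, storage account, and container names are placeholders, not values from the demo:

```yaml
# Sketch of an Azure File Copy step; all names below are placeholders.
- task: AzureFileCopy@4
  displayName: Copy release files to blob storage
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/*.zip'
    azureSubscription: 'my-arm-service-connection'  # ARM service connection name
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'
    ContainerName: 'releasefiles'
```

This only covers pushing the files up to storage; pulling them down into site\wwwroot during the release would still need its own step, such as a script task.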
Test result:

How do I speed up an EB deploy using ebignore?

I'm deploying my app to Elastic Beanstalk. I'm using an .ebignore file because there are files that I do not want to check into git but do want deployed with the app (like application secrets, config vars, etc.). The issue I'm facing is that when using an .ebignore, the deploy takes FOREVER. I've used the --verbose flag, and I can see that it is recursing my entire node_modules directory and skipping each file individually. When I deploy using .gitignore, it becomes very fast.
Has anyone else experienced this? How do I speed up this process?

Regarding the database in PythonAnywhere

I am following the Django Girls tutorial, according to which I added new posts to the blog in the Django admin. I created a template using Django templates to display this dynamic data. I checked it by opening 127.0.0.1:8000 in the browser and I was able to see the data. Then, to deploy this site on PythonAnywhere, I pushed the data to GitHub from my local repo using git push and did a git pull on PythonAnywhere from GitHub. All the files, including the db.sqlite3 (database) file, were updated properly on PythonAnywhere, but I still could not see the data after running my web app on PythonAnywhere. Then I manually removed the db.sqlite3 file from PythonAnywhere and uploaded the same file from my local desktop, and it worked. Why did this work? And is there an alternative to this?
That's kind of odd; if the SQLite DB was in the git repository and was uploaded correctly, I'd expect it to work. Perhaps the database is in a different directory? On PythonAnywhere, the working directory of your running web app might be (actually, probably is) different from the one on your local machine. And if you're specifying the database using a relative path (which you probably are), that might mean the database you created locally ends up somewhere different from where the site looks for it on PythonAnywhere.
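One thing worth checking is how the database path is built in settings.py. The standard Django layout (which, as far as I recall, the Django Girls tutorial follows) anchors it to the project directory, roughly like this:

```python
# Typical settings.py snippet: the SQLite file path is built from BASE_DIR,
# so it resolves relative to the project folder both locally and on
# PythonAnywhere rather than relative to the process's working directory.
import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": os.path.join(BASE_DIR, "db.sqlite3"),
    }
}
```

If NAME is just "db.sqlite3" with no BASE_DIR prefix, the file is looked up relative to wherever the process starts, which can differ between your machine and PythonAnywhere.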
BTW, from my memories of the Django Girls tutorial (I coached for one session a few months ago) you're not actually expected to put the database in your Git repository. It's not how websites are normally managed. You'd normally have one database locally, for testing, where you'd be able to put random testing data, and then a completely different one on your live site, with posts for public consumption.