Azure Function temp file - PowerShell

When getting content from an endpoint in Azure Functions, I used to be able to save it locally, process it, and then pass it to the function's output.
Now I have this setup:
1: What happens when I call my endpoint
2: My function code
3: My call to the function
4: The contents of C:\Local\Temp AFTER the function has been called
According to my function code (2) the file C:\local\Temp\cspcustomer.parquet should exist, but when trying to read the file I obviously get an error.
Furthermore, when looking at the actual contents of C:\local\Temp in Kudu (4), the file is not really there.
My question is, where is my file, so I can continue my work?

An Azure Function app has different file system storage locations. These are:
D:\local
This is local temporary storage, and files here cannot be shared between instances. It is temporary in the sense that it goes away as soon as your Azure Function is removed from the virtual machine. You can store up to 500 MB of data here.
If your Azure Function has, say, 3 instances, each instance actually runs on its own virtual machine, so each instance gets its own D:\local storage of up to 500 MB.
D:\Home
This is shared storage: all instances of your Function app have access to it. If you delete your Function app or move it somewhere else, this storage does not go away like D:\local does; it remains as it is.
You can use the Home directory instead of Local.
I am able to see the parquet file in the Home directory.
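For illustration, a minimal PowerShell sketch of writing the downloaded content to the shared D:\home storage instead of the local temp folder (the endpoint URL, folder, and file name below are placeholders):
# Sketch only: persist the payload on the shared D:\home file system,
# which all instances (and Kudu) can see, instead of the per-instance temp storage
$outputDir  = 'D:\home\data'
$outputFile = Join-Path $outputDir 'cspcustomer.parquet'
# Make sure the target folder exists
if (-not (Test-Path $outputDir)) { New-Item -ItemType Directory -Path $outputDir | Out-Null }
# Download the content and write it to shared storage
Invoke-RestMethod -Uri 'https://example.com/api/customers' -OutFile $outputFile
# The file can now be read back in this or a later invocation
$bytes = [System.IO.File]::ReadAllBytes($outputFile)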

Turns out this issue will be fixed once the PowerShell worker is updated: https://github.com/Azure/azure-functions-powershell-worker/issues/733
The PowerShell version in Azure Functions is a bit behind; it will be updated to 7.2, as required by the module.

Related

Firestore: Copy/Import data from local emulator to Cloud DB

I need to copy Firestore DB data from my local Firebase emulator to the cloud instance. I can move data from the Cloud DB to the local DB fine, using the EXPORT functionality in the Firebase admin console. We have been working on the local database instance for 3-4 months and now we need to move it back to the Cloud. I have tried to move the local "--export-on-exit" files back to my storage bucket and then IMPORT from there to the Cloud DB, but it fails every time.
I have seen one comment by Doug at https://stackoverflow.com/a/71819566/20390759 that this is not possible, but the best solution is to write a program to copy from local to cloud. I've started working on that, but I can't find a way to have both projects/databases open at the same time, both local and cloud. They are all the same project ID, app-key, etc.
Attempted IMPORT: I copied the files created by "--export-on-exit" in the emulator to my cloud storage bucket. Then, I selected IMPORT and chose the file I copied up to the bucket. I get this error message:
Google Cloud Storage file does not exist: /xxxxxx.appspot.com/2022-12-05 from local/2022-12-05 from local.overall_export_metadata
So, I renamed the metadata file to match the directory name from the local system, and the IMPORT claims to initiate successfully, but then fails with no error message.
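For reference, a rough sketch of that upload-and-import sequence using gsutil and gcloud (the bucket and folder names are placeholders; as noted above, the import looks for a .overall_export_metadata file whose name matches the export folder name):
# Copy the emulator's --export-on-exit output folder up to the Cloud Storage bucket
gsutil -m cp -r ./firestore-export gs://xxxxxx.appspot.com/firestore-export
# Import it into the cloud Firestore instance; the command expects
# gs://<bucket>/<folder>/<folder>.overall_export_metadata to exist
gcloud firestore import gs://xxxxxx.appspot.com/firestore-export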
I've been using Firestore for several years, but we just started using the emulator this year. Overall I like it, but if I can't easily move data back to the Cloud, I can't use it. Thanks for any ideas/help.

How to update files whenever script is scheduled to run in Heroku app

I have a simple python script that is hosted on Heroku and I'm using the Heroku Scheduler to run the script every hour/day. The script will possibly update a simple .txt file (could also be a config var if possible) when it runs. When it does run and conditions are met, I need that value stored and used when the next scheduled script runs. The value changed is simply a date.
However, since the app is containerized based on the most recent code I have on GitHub, it doesn't store those changes anywhere to be used again. Is there any way I can update the file and reuse it every time the script runs? Any simple add-ons or other solutions I can use?
Heroku dynos have a local file system that does not survive an application restart or redeployment, so it cannot be used to persist data.
Typically you have 2 options:
use a database: on Heroku you can use Postgres (there is also a free tier)
save the file on external storage (S3, Dropbox, even GitHub); see Files on Heroku for details and examples

Copy Items from one resource group to another in Azure data lake store using powershell

All I want is to copy data from a development environment to a production environment in Azure Data Lake Store. There is no QA environment.
These are .CSV files; the environments are just different resource groups.
I tried copying the data within the same resource group using the command
Move-AzureRmDataLakeStoreItem -AccountName "xyz" -Path "/Product_Sales_Data.csv" -Destination "/mynewdirectory"
This worked fine; however, I want the data movement to take place between two different resource groups.
A possible solution I have come across is to use the Export command, which downloads the files to the local machine, and then the Import command to upload them to a different resource group.
Import-AzureRmDataLakeStoreItem
Export-AzureRmDataLakeStoreItem
The reason for using PowerShell is to automate the process of importing/copying the files across environments, i.e. automating the entire deployment process with PowerShell.
The solution mentioned above might take care of the process, but I am looking for a better solution that does not require a local machine or a VM.
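For clarity, a rough sketch of that export/import round trip, assuming the same parameter style as the Move-AzureRmDataLakeStoreItem call above (the account names, paths, and staging folder are placeholders):
# Download the CSV from the development Data Lake Store to a local staging folder
Export-AzureRmDataLakeStoreItem -AccountName "dev-adls" -Path "/Product_Sales_Data.csv" -Destination "C:\staging\Product_Sales_Data.csv"
# Upload the staged file to the production Data Lake Store
Import-AzureRmDataLakeStoreItem -AccountName "prod-adls" -Path "C:\staging\Product_Sales_Data.csv" -Destination "/Product_Sales_Data.csv"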
You have a number of options; all of the below can accomplish what you are looking to achieve. Keep in mind that you need to check the limitations of each and weigh the costs. For example, Azure Functions have a limited execution time (a default maximum of 5 minutes) and local storage limitations.
Azure Logic Apps (drag-and-drop configuration)
Azure Data Factory (using the Data Lake linked service)
Azure Functions (using the Data Lake REST API)
You could use Azure Automation and PowerShell to automate your current approach.
Use ADLCopy to copy between lakes (and other stores)
Choosing between them is opinionated and subjective.

Redirecting output to a text file located on Azure Storage - Using Powershell

Using PowerShell, what is the best way of writing output to a text file located in an Azure storage container? Thank you.
Simply, you can't.
While this capability exists within Azure Storage with the (relatively) new append blob, it hasn't yet filtered down to PowerShell.
To implement this, you would either need to create a new C# cmdlet that encapsulates the functionality, or redirect the output to a standard file and then use the usual Azure Storage cmdlets to upload that file to Azure Storage.
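As a sketch of the second option, using the classic Azure storage cmdlets (the storage account, key, container, and file paths are placeholders):
# Write the output to a local text file first
Get-Process | Out-File -FilePath "C:\temp\output.txt"
# Build a storage context for the target account
$context = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage-account-key>"
# Upload the local file as a blob in the target container
Set-AzureStorageBlobContent -File "C:\temp\output.txt" -Container "logs" -Blob "output.txt" -Context $context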

Best way to stage file from cloud storage to windows machine

I want to store a QuickBooks data file in the cloud. I understand that the data file is more of a database-in-a-file, so I know I don't want to simply keep the data file itself in a cloud directory.
When I say 'cloud', I'm meaning something like Google Drive or box.com.
What I see working is that I want to write a script (bat file, or do they have something new and improved for Windows XP, like some .net nonsense or something?)
The script would:
1) Download the latest copy of the data file from cloud storage and put it in a directory on the local machine
2) Launch Quickbooks with that data file
3) When the user exits Quickbooks, copy the data file back up into the cloud storage.
4) Rejoice.
So, my question(s)... Is there something that already does this? Is there an easily scriptable interface to work with the cloud storage options? In my ideal world, I'd be able to say 'scp google-drive://blah/blah.dat localdir' and have it copy the file down, and do the opposite after running QB. I'm guessing I'm not going to get that.
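For what it's worth, a rough PowerShell sketch of the workflow described above, assuming the cloud storage is exposed as a locally synced folder and that QuickBooks accepts the company file path as an argument (all paths, including the QuickBooks executable, are placeholders):
# 1) Pull the latest copy of the data file down from the synced cloud folder
$cloudCopy = 'C:\Users\Me\Google Drive\Quickbooks\company.qbw'
$localCopy = 'C:\QuickbooksData\company.qbw'
$qbExe     = 'C:\Program Files\Intuit\QuickBooks\QBW32.exe'
Copy-Item -Path $cloudCopy -Destination $localCopy -Force
# 2) Launch QuickBooks against the local copy and wait for the user to exit
Start-Process -FilePath $qbExe -ArgumentList "`"$localCopy`"" -Wait
# 3) Copy the (possibly modified) data file back up to the synced cloud folder
Copy-Item -Path $localCopy -Destination $cloudCopy -Force
# 4) Rejoice.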
Intuit already provides a product to do this. It is called Intuit Data Protect and it backs up your QuickBooks company file to the cloud for you.
http://appcenter.intuit.com/intuitdataprotect