I have been trying to create a function with an HTTP trigger, but unfortunately, while creating the Azure Function, VS Code doesn't show the pop-up to choose the trigger for the function; I can see that the Function App is created, but without an __init__.py file.
(Screenshot attached.)
The Resources section is used to create the applications in the Azure Portal through VS Code, but not the triggers directly from VS Code.
If you want to create an HTTP trigger or any other trigger in the specified Function App, you have several options: VS Code, the CLI, PowerShell, and the Portal.
I believe you are using VS Code, so you'll have the Workspace option in the Azure Explorer to create the function triggers locally and then deploy them to the Azure Function App.
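For reference, the same local-then-deploy flow can be done with the Azure Functions Core Tools from a terminal; a minimal sketch, assuming a Python project (the project, function, and app names are placeholders):

```bash
# Scaffold a local Python Functions project, then add an HTTP-triggered function.
func init MyFunctionProject --python
cd MyFunctionProject
func new --name HttpExample --template "HTTP trigger"   # generates HttpExample/__init__.py

# Run locally to verify, then publish to your existing Function App in Azure.
func start
func azure functionapp publish <YourFunctionAppName>
```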
Related
I need to deploy an Azure Function App via Azure DevOps.
If I deploy via visual studio, it asks me to create a publish profile, where storage is specified.
I'm unsure how this works however with DevOps.
I have a build pipeline that builds the (.net core) function app, but on the release, I'm unsure how to proceed.
The Microsoft documentation is quite poor in my opinion, so would appreciate any expertise.
Thanks!
You have to create the underlying infrastructure prior to deploying the Azure Function to it.
There are steps you could use here, run as an inline script job/stage within your pipeline:
https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-cli-csharp?tabs=azure-cli%2Cin-process#create-supporting-azure-resources-for-your-function
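Condensed from that doc, the inline script boils down to something like the following; a sketch, assuming names, location, and runtime that you would adapt:

```bash
# Create the supporting resources the Function App needs before the code deploy.
az group create --name MyResourceGroup --location westeurope
az storage account create --name mystorageacct123 \
  --resource-group MyResourceGroup --location westeurope --sku Standard_LRS
az functionapp create --name MyFunctionApp --resource-group MyResourceGroup \
  --storage-account mystorageacct123 --consumption-plan-location westeurope \
  --runtime dotnet --functions-version 4
```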
Alternatively, you could use an ARM template or Terraform to provision the App Service and storage account as required.
I've got a sample github actions deployment of a function which uses the inline script method here:
https://github.com/brettmillerb/testfunctionapp/blob/master/.github/workflows/main.yml
I am trying to set up dbt on Google Cloud using this tutorial. The idea is to have a Cloud Build trigger fire on a GitHub repo merge request. However, starting from the first deployment, Cloud Run uses a template image (and it obviously fails to deploy):
(Screenshot: container image URL selection window.)
When I manually select the built image, all is fine. But on trigger, it still keeps the one I've selected previously, and I have to manually select the new one.
What am I doing wrong? Thanks!
I would like to set up one GitHub repo which will hold all backend Google Cloud Functions. But I have 2 questions:
How can I set it up so that GCP knows that there are multiple Cloud Functions to be deployed?
If I only change the code for one Cloud Function, how can I set it up so that GCP will only deploy the modified Cloud Function, and NOT redeploy the unchanged ones?
When saving your Cloud Functions to a GitHub repo, just make sure your .gitignore file has the proper settings: exclude node_modules and similar folders and files you don't want to commit.
Usually all Cloud Functions are deployed through a single index.js file, so you need to make sure you have that file and all the files you import into it.
If you want to deploy a single function you can use this command:
firebase deploy --only functions:your_function_name
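If I'm not mistaken, you can also list several functions in a single command (comma-separated) when a change touches more than one of them, e.g.:

```bash
firebase deploy --only functions:function_one,functions:function_two
```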
If you want a more structured solution, you can read this article to learn more about it.
As I understand it, you would like to store Cloud Functions code in GitHub, and I assume you would like to trigger deployment on some action - push or pull request. At the same time, you would like only the modified Cloud Functions to be redeployed, leaving the others without redeployment.
Every Cloud Function in your set is (re)deployed separately - I mean that for each of them you might need to provide plenty of specific parameters, and those parameters are different for different functions. Details of those parameters are described in the gcloud functions deploy command documentation page.
Most likely the --entry-point - the name of the Cloud Function as defined in source code - will be different for each of them. Thus, you might have some kind of a "loop" through all Cloud Functions for deployment with different parameters (including the name, entry point, etc.).
Such a "loop" (or "set") may be implemented by using Cloud Build or Terraform, or both of them together, or some other tool.
An example of how to deploy only modified Cloud Functions is provided in the How can I deploy google cloud functions in CI/CD without redeploying unchanged SO question/answer. That example can be extended to an arbitrary number of Cloud Functions. If you don't want to use Terraform, a similar mechanism (based on the same idea) can be implemented using pure Cloud Build.
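To make the "loop" concrete, here is a minimal bash sketch, assuming each function lives in its own top-level directory named after its entry point (the layout, runtime, and region are assumptions to adapt):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Collect the top-level directories touched by the last commit,
# then redeploy only the functions whose directories changed.
for dir in $(git diff --name-only HEAD~1 HEAD | cut -d/ -f1 | sort -u); do
  [ -f "$dir/main.py" ] || continue   # skip anything that is not a function directory
  gcloud functions deploy "$dir" \
    --source="$dir" \
    --entry-point="$dir" \
    --runtime=python311 \
    --trigger-http \
    --region=europe-west1
done
```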
I have a logic app that connects to an SFTP server (a virtual machine that I created on Azure) and performs actions when a file is added to that SFTP server:
When a file is added, I create a new blob in the blob storage.
Delete the file from the SFTP server.
I have also created a blob-trigger-based Azure Function that, every time a blob is created, performs some actions (like blob content decryption and parsing).
The next steps will be chaining some other Azure Function executions in my logic app (like sending an e-mail after executing an Azure Function, etc.).
Now, I have two main questions:
In order to have the best CI/CD pipeline suited for this workflow, should I create the logic app from the portal or from Visual Studio, and why?
Should I put the Azure Function and the logic app in the same solution/repo? The same project?
Then, how can I create the CI/CD pipeline (which template type and which steps, please)?
PS: I want to add unit tests to check that my logic app and Azure Function are working correctly, so I want to integrate a test step in my build definition.
For more details about the logic app, please see this Stack Overflow question in which I detailed the process,
and here is the logic app (screenshot).
Please find the below points:
I would recommend using Visual Studio. The main advantage is that it gives you the same designer experience, and you can make use of an ARM template and parameters to deploy your logic app to multiple environments (dev, staging, prod, etc.), making a robust CI/CD pipeline. It also gives you the advantage of using Azure Key Vault, via the ARM template parameter syntax, to store any sensitive data.
Visual Studio also lets you connect to the cloud using Cloud Explorer, where you can mimic resubmits, view run history, etc. A sketch of the multi-environment deployment is shown below.
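For context, deploying the same ARM template with a different parameter file per environment could look roughly like this (the resource group, file names, and parameters are placeholders):

```bash
# Deploy the logic app ARM template with environment-specific parameters.
az deployment group create \
  --resource-group rg-logicapp-dev \
  --template-file LogicApp.json \
  --parameters @LogicApp.parameters.dev.json
```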
If you are using the Azure Function for only one process, then you can put it in the same solution, but keeping Azure Functions in a separate repo gives you more flexibility for re-usability, so that other applications can also make use of it.
You can utilise SpecFlow to automate logic app testing; Automated testing logic apps with SpecFlow - this link explains it in detail.
I created an Azure Functions 2.0 (C#) project in VS 2017 and put it in GitHub. If I publish to Azure directly from VS, it works just fine. Then I accessed the Azure Portal in order to configure Azure Functions, and there is this option to deploy from GitHub. I configured this option, and when I commit something to GitHub, the Azure Portal detects it and starts some process (in the Deployment Center there are logs with "success" status for each change I made in GitHub), but the code isn't deployed.
Any ideas?
Thanks, guys! I found the problem! I first published my solution directly from Visual Studio to Azure. Then all functions became read-only, so the build process executed successfully, but the files weren't updated.
I erased my Function App, recreated it manually, and configured deployment with Kudu, pulling from GitHub, and then everything works like a charm! Each commit in GitHub updates my app!
Make sure Visual Studio is connected to GitHub to push the Azure Function.
In the Deployment Center, you need to check that deployment is connected to GitHub.
You also need to check that the Azure Functions runtime version is 2.x.
A step-by-step guide is referenced from my article:
Continuous Azure Function deployment from GitHub using the Kudu build server
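If you'd rather wire this up from a terminal, the GitHub repo can also be attached as the deployment source with the Azure CLI; a minimal sketch, assuming the Function App already exists (the names and repo URL are placeholders):

```bash
# Connect the Function App to a GitHub repo so Kudu builds and deploys on each push.
az functionapp deployment source config \
  --name MyFunctionApp \
  --resource-group MyResourceGroup \
  --repo-url https://github.com/<user>/<repo> \
  --branch master
```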