Use a CI/CD pipeline in GitLab to periodically retrieve data from a REST endpoint and save it to a GitLab repo

I've got a Dash Enterprise app that uses some monthly data reports. Every month I manually upload the new data files, which are stored at a REST endpoint. I want to automate this process using GitLab. Can a GitLab CI/CD pipeline retrieve data from a REST endpoint and upload it to the GitLab repo?
I was thinking something like this:
The YAML file specifies a stage in the pipeline that runs a Python script.
The Python script pulls the data from the REST endpoint, for example:

import requests

data = requests.post('<some rest endpoint url>', params=dict(criteria='foo')).json()
Somehow, Python or the YAML file pushes this data back to the GitLab repo.
Is this possible or am I using the wrong tool for the job?
I've created a GitLab repo and set up a CI/CD pipeline. I created a job in the YAML file that calls a Python script which pulls data from the REST endpoint. I just don't know how this data can be pushed back to the GitLab repo that the YAML and Python files reside in. One possible approach is sketched below.
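One way to do the push (a minimal sketch, not from the original post): commit the file back through the GitLab Commits API from inside the job, using GitLab's predefined CI variables plus a project access token stored as a masked CI/CD variable. The variable name GITLAB_API_TOKEN and the file path data/monthly_data.json are assumptions for illustration.

import json
import os

import requests

# CI_API_V4_URL, CI_PROJECT_ID, and CI_COMMIT_BRANCH are predefined GitLab CI
# variables; GITLAB_API_TOKEN is an assumed masked CI/CD variable holding a
# project access token with `api` scope.
api_url = os.environ["CI_API_V4_URL"]
project_id = os.environ["CI_PROJECT_ID"]
branch = os.environ.get("CI_COMMIT_BRANCH", "main")
token = os.environ["GITLAB_API_TOKEN"]

# Pull the monthly data from the REST endpoint (placeholder URL, as above).
data = requests.post('<some rest endpoint url>', params=dict(criteria='foo')).json()

# Commit the file back to the repo via the Commits API.
payload = {
    "branch": branch,
    "commit_message": "Automated monthly data refresh",
    "actions": [{
        "action": "update",  # use "create" the first time the file is added
        "file_path": "data/monthly_data.json",
        "content": json.dumps(data),
    }],
}
resp = requests.post(
    f"{api_url}/projects/{project_id}/repository/commits",
    headers={"PRIVATE-TOKEN": token},
    json=payload,
)
resp.raise_for_status()

For the "periodically" part, a GitLab pipeline schedule (CI/CD > Schedules) can run this job monthly without any manual trigger.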

Related

How to authenticate to an Azure DevOps private package feed from .npmrc using a Jenkins pipeline

I have a React application whose deployment is done through a Jenkins pipeline.
Its package.json uses a private feed from Azure DevOps Artifacts.
I want to authenticate the .npmrc present in GitLab to Azure DevOps using my Jenkins pipeline.
Could you please advise how to do that?
I would like to know if there is a way to authenticate Azure DevOps private feeds using a service principal from a Jenkins pipeline.
You can add the .npmrc file to the same path as the package.json file.
Then add the following content to the .npmrc file:
; begin auth token
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/registry/:username=xx
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/registry/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/registry/:email=npm requires email to be set but doesn't use the value
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/:username=xxx
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/:_password=[BASE64_ENCODED_PERSONAL_ACCESS_TOKEN]
//pkgs.dev.azure.com/orgname/projectname/_packaging/feedname/npm/:email=npm requires email to be set but doesn't use the value
; end auth token
You need to generate a PAT in Azure DevOps, then Base64-encode it and add it to the .npmrc file to authenticate against the feed.
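For reference, a quick way to produce the Base64 value (shown here in Python; the PAT string is a placeholder):

import base64

pat = '<your Azure DevOps personal access token>'  # placeholder; never commit a real token
encoded = base64.b64encode(pat.encode('utf-8')).decode('ascii')
print(encoded)  # use this as the _password value in .npmrc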

How to configure Azure DevOps with SQL DB

We have automated scripts that we would like to build and test on Azure DevOps, but our pipeline cannot run our test scripts on Azure.
We have a database service account that we want to configure on Azure, but we don't know how to go about it. Please assist.
Here is a well-explained video (by Hassan Habib from Microsoft) on exactly how to run a console app (that you create) in an Azure pipeline that securely gets credentials to immediately do stuff in Azure: https://youtu.be/ht0xhQyF1x4?t=1688
In a handful of minutes, he shows exactly how to:
Link Pipeline Variables to Key Vault Secrets, so that when accessed, the variables do a get() from Key Vault and return that value.
Securely link Pipeline Variables to Azure Environment Variables.
As a step in the release pipeline, have the console app read the Azure Environment Variables to get credentials to do stuff in Azure.
In his case, he created an Azure Resource Group.
In your case, if I'm understanding correctly, you could make a simple console app that runs in the pipeline, gets credentials/connection strings for your database, does whatever you need in the DB, and possibly tests your scripts. A sketch of that idea follows.
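The video uses a .NET console app; here is the same idea sketched in Python, where the environment variable names and the use of pyodbc are assumptions for illustration:

import os

import pyodbc  # assumption: SQL Server reached via ODBC; any driver follows the same pattern

# Assumption: the pipeline maps the Key Vault-backed variables into these
# environment variables (names invented for illustration).
server = os.environ["DB_SERVER"]
database = os.environ["DB_NAME"]
user = os.environ["DB_USER"]
password = os.environ["DB_PASSWORD"]

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER={server};DATABASE={database};UID={user};PWD={password}"
)
cursor = conn.cursor()
cursor.execute("SELECT 1")  # replace with whatever your test scripts need to run
print(cursor.fetchone())
conn.close()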

Is it possible to send an Azure DevOps service hook webhook to a Log Analytics custom table

What I want to do is generate Azure Monitor alerts for Azure DevOps pipeline failures.
We think we can achieve this without having to modify our DevOps YAML pipelines, so we focused on using Azure DevOps service hooks to send the pipeline log data to the Log Analytics HTTP Data Collector API.
I can send the data with PowerShell. However, I got a "Forbidden" failure when testing the service webhook in Azure DevOps.
So I wonder if I am missing a step?
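For comparison, here is the request shape the HTTP Data Collector API expects, sketched in Python (the workspace ID, shared key, and table name are placeholders; the HMAC-SHA256 signature scheme is the documented one):

import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<workspace id>"          # placeholder
SHARED_KEY = "<workspace primary key>"   # placeholder
LOG_TYPE = "AdoPipelineFailure"          # lands in the custom table AdoPipelineFailure_CL

def post_to_log_analytics(records):
    body = json.dumps(records).encode("utf-8")
    rfc1123 = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    # Signature input: method, content length, content type, x-ms-date, resource.
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123}\n/api/logs"
    )
    signature = base64.b64encode(
        hmac.new(base64.b64decode(SHARED_KEY),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()
    resp = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
        headers={
            "Content-Type": "application/json",
            "Log-Type": LOG_TYPE,
            "x-ms-date": rfc1123,
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
        },
        data=body,
    )
    resp.raise_for_status()

A plain service hook cannot compute this per-request signature, which is one likely cause of the "Forbidden" response; a common workaround is to point the service hook at an intermediary (such as an Azure Function or Logic App) that signs and forwards the payload.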

How to deploy/filter the respective server base endpoint in Swagger

I have YAML/JSON files with the base server endpoint defined, as seen in the screenshot below.
How do we filter only the respective base URL for a specific environment?
For instance:
Server: dev files should be deployed to the DEV environment, stage files to the Stage environment, and so on.
Note: I'm using Azure pipeline for deployment.
In your current situation, the DevOps pipeline does not have a built-in function/option to do this. We recommend trying to create a new Generic service connection for each environment and using it in your different deploy steps.
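If a script step is acceptable instead, one alternative (a sketch under stated assumptions) is to rewrite the servers list of the OpenAPI document per environment before deploying; the environment variable name, file name, and URL mapping below are invented for illustration:

import os

import yaml  # assumption: PyYAML is available on the build agent

# Hypothetical mapping from environment name to base URL.
SERVER_URLS = {
    "dev": "https://api-dev.example.com",
    "stage": "https://api-stage.example.com",
    "prod": "https://api.example.com",
}

env = os.environ.get("DEPLOY_ENV", "dev")  # e.g. set once per pipeline stage

with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)

# Keep only the server entry for the target environment.
spec["servers"] = [{"url": SERVER_URLS[env]}]

with open("openapi.yaml", "w") as f:
    yaml.safe_dump(spec, f, sort_keys=False)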

YAML Pull Request Security

When reading the documentation, it says that for a pull request, the "source" branch's Azure Pipelines file is read when running the PR check.
How is this in any way secure? Any developer who opens a pull request can now use the service connections the build might use and do whatever they want with them.
In other systems, the target branch's CI configuration is always used for pull requests. Is there any way to configure Azure DevOps for this behavior?
What's the best practice here?