Docker-Compose with GitLab CI: managing sensitive data

I want to create a ci/cd pipeline with gitlab. Currently I set my sensitive data as environment variables in the docker-compose file. I don't want this data to be visible in the repository.
What can I do to prevent this from happening?

You can add Variables to GitLab using the GitLab UI.
See https://docs.gitlab.com/ee/ci/variables/#masked-variables
See also https://gitlab.com/gitlab-com/support-forum/issues/1452

You should paste your sensitive data into the "CI/CD" menu available under "Settings". In your jobs, you can then use them as environment variables.
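As a sketch of how the pieces fit together (the variable name `DB_PASSWORD` is an assumption; use whatever you define under Settings > CI/CD > Variables):

```yaml
# .gitlab-ci.yml -- the job never contains the secret itself; the runner
# injects DB_PASSWORD (defined as a masked variable under Settings >
# CI/CD > Variables) into the job's environment.
deploy:
  stage: deploy
  script:
    - docker-compose up -d

# docker-compose.yml -- reference the variable instead of hard-coding it:
#
# services:
#   db:
#     image: postgres:15
#     environment:
#       POSTGRES_PASSWORD: ${DB_PASSWORD}
```

Because the value lives only in GitLab's settings, nothing sensitive ever appears in the repository.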

Related

Is there a way to auto-pull environmental variables like a .env file when sharing a repository amongst team members?

So… if you want environment variables, you create the .env file, right? Well… what if you want to share a repo with team members? Do they each have to manually create a .env file when working on the repository locally, or is there a better solution? I'm just trying to think through the security of not letting everyone access the private strings in a .env file while still being able to use the repo as if a .env file were there.
I’m hosting my code in GitHub for reference. Is this a GitHub actions thing or is there another solution? Any help would earn my eternal gratitude in this dire time, lol
There is a similar Stack Overflow thread, but I don't want the environment variables for deployment, just for team members to do a git pull and get the repository working locally without having to hand-create the .env file and fill in the environment variable strings: React app build/deploy using github actions with secrets
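One common convention (not specific to GitHub, and the filenames here are just the usual convention, not anything enforced by git): commit a `.env.example` containing only placeholder values, gitignore the real `.env`, and have each teammate copy the template once and fill in the secrets from wherever your team shares them (a password manager, for instance). A minimal sketch:

```shell
#!/bin/sh
set -e

# Committed template with placeholders only -- safe to keep in the repo:
cat > .env.example <<'EOF'
API_KEY=replace-me
DB_URL=replace-me
EOF

# Keep the real file out of version control:
grep -qx '.env' .gitignore 2>/dev/null || echo '.env' >> .gitignore

# Each teammate bootstraps a local copy once, then fills in real values:
[ -f .env ] || cp .env.example .env
```

This keeps the repo usable out of the box while the actual private strings never enter version control.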

Is it possible to default to self-hosted runners?

I am planning to move from GitLab to GitHub and just started to experiment with GH Actions.
One thing I see is that I have to indicate that I want to run on a self-hosted runner for each job.
Is there a way to set a default for the Organization, or have it defined only once in the workflow YAML file?
I don't believe so. This is just another attribute that you need to consider when specifying a place for a job to run. For example, you already need to specify what type of system (e.g., OS and version) you want to use for each job via the runs-on directive, since there's no reasonable way to guess, so specifying self-hosted instead isn't exceptionally burdensome.
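For reference, a minimal sketch of what that looks like per job (the extra labels and the build step are illustrative assumptions):

```yaml
# .github/workflows/ci.yml -- runs-on has to be stated on every job;
# there is no organization-wide default.
on: push

jobs:
  build:
    runs-on: [self-hosted, linux, x64]   # extra labels narrow the runner pool
    steps:
      - uses: actions/checkout@v4
      - run: make build
```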

How to maintain hundred different Terraform configs?

We created a Terraform template which we will deploy many dozens of times in different workspaces; each workspace will have a separate configuration file.
Now we would like to automate this procedure and are thinking about keeping the configuration files in a Git repo.
Does anyone have a best practice for storing the configuration files in a Git repo and triggering a CI/CD workflow (Azure DevOps)?
In general we only want to apply changes for workspaces whose configuration has changed.
The terraform plan and apply commands have an option for passing in the tfvars file you want to use, for example:
terraform apply --var-file=workspace.tfvars
So in the pipeline you would grab your Terraform template artifacts and your config files. I would then set a TF_WORKSPACE variable to force your workspace, and I would also make your tfvars file names match the workspace names so you can reuse the variable in your apply command. This forces your workspace and configuration file to match.
Triggering this only when those files have changed would require a path trigger that fires on those changes.
That said, I don't see any harm in running Terraform every time, regardless of whether anything changed. The worst possible outcome is that someone made a change outside Terraform and it gets undone.
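Putting the steps above together, a hedged Azure DevOps sketch (the `configs/` directory layout, the `main` branch, and the `staging` workspace name are all assumptions for illustration):

```yaml
# azure-pipelines.yml -- fires only when a config file changes, and forces
# the workspace and the tfvars file to share a name.
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - configs/*

steps:
  - script: |
      terraform init
      terraform apply -auto-approve -var-file="configs/${TF_WORKSPACE}.tfvars"
    env:
      TF_WORKSPACE: staging   # Terraform selects this workspace automatically
```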

Reusing PowerShell Scripts in Azure DevOps

I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in a pipeline of any other repository. You could then reference the script directly on the pipeline workspace.
More info here.
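A sketch of that multi-repo checkout (the repository and script names are hypothetical):

```yaml
# azure-pipelines.yml in the consuming repository
resources:
  repositories:
    - repository: scripts              # alias used by the checkout step
      type: git
      name: MyProject/shared-scripts   # hypothetical project/repo

steps:
  - checkout: self
  - checkout: scripts
  # With multiple checkouts, each repo lands in a folder named after it
  # under the pipeline workspace:
  - pwsh: ./shared-scripts/build-helpers.ps1
```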
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script inline. But this only works at project scope.
Another approach I'd try would be to create a repo containing your PowerShell scripts, add that repo as a submodule in the repository you are trying to build, and then call the scripts from the submodule folder. But this only works when using Git repos.
Or you could create a custom build task that contains your script.
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, save the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script everywhere. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary)
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell in a NuGet package and installing that in dependent projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think they may help in this case (depending on how large your file is): you can put an inline PowerShell script in that template YAML and reuse it from your main YAML.
Documentation is pretty straightforward.
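A minimal sketch of that template approach (file names and step contents are illustrative):

```yaml
# shared-steps.yml -- the reusable template, with the script inline
steps:
  - pwsh: |
      Write-Host "Shared build logic goes here"

# azure-pipelines.yml -- every pipeline that needs the script includes it:
#
# steps:
#   - template: shared-steps.yml
```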

Tool for storing per environment configuration

I have a requirement to store configuration information on a per environment basis in a tool.
This is a tool with a GUI for adding/updating configuration values (e.g. connection strings). Each value should have a default and be overridable per environment.
There should be an API to retrieve these configuration values during deployment to a particular environment to add to the application.
I have googled for a while and can't see any tools that would fit this bill. Are there any suggestions?
It would be great to know more about where you will be using the configs (i.e. local, cloud, etc.), but consider using Puppet with a source-control repo like GitHub.
There are GUIs, and you can find lots of template examples to get started. If you want to change the connection settings in the config code yourself, you can commit them to your repo; or set them as variables, so you only ever have to change the variables file.
Doing it this way will also let you eventually start running automated builds.
If a GUI is not a must, then configrd is exactly this: a central repository for config values, environment variables, and secrets.
You can structure your config values per whatever axis you want, including by environment with values inheriting and overriding from environment to environment.
Configrd also handles on the fly encryption and decryption of secrets so that you can keep your plain text and secrets side by side versioned in git if you choose.
It's all accessible over a simple API.