Basically, I have a .env file in my project, but git ignores it on commit because of .gitignore, so it never reaches GitHub and isn't there when Netlify deploys the project. I then have to set my environment variables on Netlify manually.
Is there another way to do this, so that I don't have to manually set the environment variables in Netlify and it will read my .env file?
Netlify supports environment variables, though not through .env files. You'll need to configure them either through your dashboard or in the Netlify configuration file (netlify.toml).
File
In the Netlify configuration file. File-based configuration allows you to set different environment variables for different deploy contexts. Variable values set in the configuration file will override values set in the UI.
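For the file-based approach, a minimal netlify.toml sketch could look like this (the variable name and values are just placeholders):

    # Applied to all builds
    [build.environment]
      API_URL = "https://api.example.com"

    # Override for deploy previews only
    [context.deploy-preview.environment]
      API_URL = "https://staging.example.com"

Keep in mind that anything placed in netlify.toml is committed to the repository, so values you deliberately kept out of git in the .env file are still better set in the dashboard.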
Dashboard
In your site dashboard under Settings > Build & deploy > Environment > Environment variables. Variable values set under site settings will override the team-level settings.
Related
I have a Next.js project I want to deploy to Vercel. The server needs a config file, which is a TypeScript file containing an object and is ignored by version control. Obviously, when Vercel clones my repo it doesn't get the config file. Is there any way to sideload this config file into Vercel, or do I need to fork my own repo privately so I can include the config file?
I've done some research, and the quickest way I found is to push directly to Vercel using the CLI.
Here's the doc: https://vercel.com/docs/cli
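A minimal sketch of that flow, run from the project directory (assuming Node.js is available and you want to deploy the current local directory):

    npm i -g vercel     # install the CLI once
    vercel login        # authenticate
    vercel --prod       # deploy the current directory to production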
Another way could be to create two repositories: a private one that your Vercel project is linked to, and a public one without your config file (as you said).
We ran into an issue when we deployed to production: we had to update manage.py, changing os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local") to "config.settings.production". Of course, this broke local settings when we pulled the change back to our dev branch.
We're running our containers via the docker-compose local.yml commands recommended in the documentation.
Am I missing something? Is this by design?
This environment variable should be set via a .env file; the production one is located under .envs/.production/.django and is not in source control (for security reasons). So yes, it is by design.
Depending on how you start your server, that file might be missing and the variable will end up unset.
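In other words, leave manage.py with the local default and let production override it through the environment. Since os.environ.setdefault only sets the variable when it isn't already defined, a sketch like this works for both environments (assuming your production compose setup passes .envs/.production/.django in via env_file):

    #!/usr/bin/env python
    # manage.py (sketch): config.settings.local is only a fallback.
    # In production, DJANGO_SETTINGS_MODULE is already set from
    # .envs/.production/.django, so setdefault() leaves it untouched.
    import os
    import sys

    if __name__ == "__main__":
        os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
        from django.core.management import execute_from_command_line
        execute_from_command_line(sys.argv)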
I have a different config file for each deployment environment: Dev, QA, and Live.
How do I get Bamboo to use the right config file for each environment when deploying?
So when deploying to the Dev environment, Bamboo should use the Dev config file, rename it, and put it in the right place.
I assume artifacts can fix this, but how?
I solved it. I simply used artifacts and pointed directly to the file. Then, in the deployment project, I made a PowerShell script that deploys the right artifact to the right deployment environment.
I also made the script change the file depending on the environment, so I don't need separate config files.
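A rough PowerShell sketch of that idea; the paths, environment names, token format, and the use of the bamboo_deploy_environment variable are assumptions for illustration, not the exact script:

    # Deploy a templated config for the current Bamboo deployment environment.
    param(
        [string]$ArtifactDir = ".\artifacts",                 # where the artifact lands
        [string]$TargetPath  = "C:\inetpub\myapp\app.config"  # placeholder target
    )

    $environment = $env:bamboo_deploy_environment             # e.g. Dev, QA, Live

    # One template config containing tokens such as #{ApiUrl}#
    $config = Get-Content "$ArtifactDir\app.template.config" -Raw

    # Per-environment values (placeholders)
    $valuesByEnv = @{
        "Dev"  = @{ ApiUrl = "https://dev.example.com" }
        "QA"   = @{ ApiUrl = "https://qa.example.com" }
        "Live" = @{ ApiUrl = "https://www.example.com" }
    }
    $values = $valuesByEnv[$environment]

    foreach ($key in $values.Keys) {
        $config = $config -replace "#\{$key\}#", $values[$key]
    }

    Set-Content -Path $TargetPath -Value $config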
Just hit me up if you are having the same problem; the Atlassian documentation on this is terrible.
I know how to make a shared config file for traditional projects and add it to each project with the following tag:
<appSettings file="../other_project/foo.config">.
How do I share application settings in VSTS, ensuring every role can access the shared config settings? I assume you can't directly reference other projects' config files using relative path names, like in my example above.
I would like to centralize my configuration and make my config transform file relatively short, as there are a lot of projects.
I assume you can't directly reference other projects' config files using relative path names, like in my example above.
You can keep the config file in the solution directory or at the root of your git repo.
Then you can add the config file to each project separately (Add -> Existing Item).
And keep the config file as an artifact, so even when deploying different projects to different machines, the config file will always be accessible.
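For illustration, a minimal sketch of that layout (the file name Shared.config and the keys are placeholders, not part of your setup). The shared file sits at the solution root:

    <!-- Shared.config at the solution root -->
    <appSettings>
      <add key="ApiBaseUrl" value="https://example.internal/api" />
    </appSettings>

Each project's own config then points at it with a relative path, keeping its local settings alongside:

    <!-- app.config / Web.config in each project -->
    <configuration>
      <appSettings file="..\Shared.config">
        <add key="ProjectSpecificSetting" value="only-for-this-project" />
      </appSettings>
    </configuration>

The relative path has to resolve next to the deployed config at runtime, so make sure the shared file travels with the build output (e.g. Copy to Output Directory) or is placed there by the release.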
We have a couple of cloud services and do continuous delivery to a test environment via Team Build.
For the production environment, we have our own deployment PowerShell script. This script needs a .cspkg file for deployment.
My problem is that I haven't found a way to make Team Build just create a .cspkg file without publishing it to Azure.
I've used the AzureContinuousDeployment.11.xaml template, and it insists on publishing the package.
I've tried to set the "Deployment Settings Name" to an empty string. The build runs without errors, but that way, no package is created.
Is there a way to stop it somewhere in between?
Maybe something I could change in the .azurePubxml file to achieve that?
Environment: VS2012, Team Foundation Service (visualstudio.com)...
On the Process tab of the build definition:
If you clear out the Deployment Settings Name and all the values under Publishing - Azure Cloud Service and Publishing - Azure Website (you can leave Allow Upgrade set to True), the build template won't know where to publish the project.
Also, on the Build Defaults tab of the build definition:
Select the "Copy build output to the following Source Control folder (this folder will be created if it does not exist):"
Add some name and path like: "$/your cloud service/drops"
That will cause all your builds to be copied to that drops folder after a successful build. In those drop folders there will be an app.publish directory that contains your *.cspkg and *.cscfg files.
I have my custom build template XAML invoke the MSBuild activity (Microsoft.TeamFoundation.Build.Workflow.Activities.MSBuild) on the Azure project (.ccproj file) with the "Publish" target.
(Despite having the same name as the UI command that pushes the package to Azure, the "Publish" target just means "generate the package without pushing it anywhere".)
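For reference, the rough command-line equivalent would be something like the following, with the project file name and configuration as placeholders; the generated .cspkg and .cscfg typically land under bin\<Configuration>\app.publish:

    msbuild MyCloudService.ccproj /t:Publish /p:Configuration=Release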