I have a Visual Studio project. Three people are working against different databases, so each person keeps his own version of appsettings.json locally without pushing the file to git, but sometimes we forget and push it, causing issues for the others.
Is there any way to set things up so we can all work together without interfering with each other?
For example, this is my file using MYDB:
{
  "ConnectionStrings": {
    "MdmDb": "Server=XXX;Database=MYDB;Trusted_Connection=Yes; MultipleActiveResultSets=true;"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "AllowedHosts": "*"
}
My colleague will have the same file, but the database name will be different.
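For what it's worth, the usual pattern for this - a sketch, assuming everyone runs locally with ASPNETCORE_ENVIRONMENT=Development - is to commit an appsettings.json with no developer-specific values and let each person override the connection string in an untracked appsettings.Development.json; the default host builder loads the environment-specific file on top of appsettings.json.
appsettings.Development.json (one per developer, never committed):
{
  "ConnectionStrings": {
    "MdmDb": "Server=XXX;Database=MYDB;Trusted_Connection=Yes; MultipleActiveResultSets=true;"
  }
}
.gitignore:
appsettings.Development.json
With that in place, a pushed appsettings.json can no longer carry anyone's personal database name.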
[warning] Via 'product.json#extensionEnabledApiProposals' extension
'github.vscode-pull-request-github' wants API proposal
'commentsResolvedState' but that proposal DOES NOT EXIST. Likely, the
proposal has been finalized (check 'vscode.d.ts') or was abandoned.
I don't know what this error message means. Is it a bug in the current version of VS Code, 1.75.1?
I don't know why it's called an extension, even though I never installed any extension named github.vscode-pull-request-github.
I also tried to check vscode.d.ts on my laptop, but I can't find any such file; there's only vscode.d. My code and project still work fine, but this error message appears every time I open VS Code and is kind of annoying to see. I use Windows 11.
This is what I found on github:
https://github.com/microsoft/vscode-pull-request-github/pull/4447/commits/f36acaff7b81f077db18e74a7c673cf249eba996
I tried to put the code in settings.json, but it doesn't seem to work. This is the code:
{
  "name": "vscode-pull-request-github",
  "displayName": "%displayName%",
  "description": "%description%",
  "icon": "resources/icons/github_logo.png",
  "repository": {
    "type": "git",
    "url": "https://github.com/Microsoft/vscode-pull-request-github"
  },
  "bugs": {
    "url": "https://github.com/Microsoft/vscode-pull-request-github/issues"
  },
  "enabledApiProposals": [
    "tokenInformation",
    "contribShareMenu",
    "treeItemCheckbox",
    "contribCommentPeekContext",
I've created an AWS SAM project using nodejs14.x. I've been able to get debugging working, but every time I run my program it stops in an AWS wrapper index.js file.
See this link for a picture of the actual file. It looks like the file is located at \var\runtime\index.js. If I press Continue, my program runs fine, and breakpoints in it are hit. But it's really annoying to have to press Continue to get past this file every single time I debug.
Does anyone know a way to "ignore" this file when debugging? In this help page about debugging in VS Code, it talks about a stopOnEntry variable you can set in launch.json. However, no IntelliSense suggestions appear for stopOnEntry, and I'm not sure whether aws-sam supports it.
If it helps, here's my launch.json file:
{
  "configurations": [
    {
      "type": "aws-sam",
      "request": "direct-invoke",
      "aws": {
        "credentials": "profile:<myemail>"
      },
      "name": "MyLambdaFunction:src/handlers/my-lambda-file.myLambdaFunction (nodejs14.x)",
      "invokeTarget": {
        "target": "code",
        "projectRoot": "${workspaceFolder}/",
        "lambdaHandler": "src/handlers/my-lambda-file.myLambdaFunction"
      },
      "lambda": {
        "runtime": "nodejs14.x",
        "payload": {},
        "environmentVariables": {}
      }
    }
  ]
}
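For what it's worth, VS Code's built-in Node.js debugger has a skipFiles option that prevents it from pausing inside matching files; whether the aws-sam debug type forwards it to the underlying Node debugger is something to verify, but if it does, the configuration above could gain one entry (the glob here is an assumption based on the \var\runtime\index.js path):
{
  "type": "aws-sam",
  "request": "direct-invoke",
  // ...same configuration as above...
  // skipFiles is a standard VS Code Node.js debugger option; it is not
  // documented for aws-sam, so treat this as an experiment
  "skipFiles": ["/var/runtime/**"]
}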
I am struggling with my VS Code setup in connection with the container extension.
My project structure has one project folder and several libraries at the same level (i.e. the libraries are not subfolders of my project folder). The key thing is that I would like to keep all config files in my project folder, so the information on how to bring up the project is version-controlled with the project.
If I specify the workspace file as follows (using relative paths) and open the workspace file, things work fine locally.
{
  "folders": [
    {
      "path": "."
    },
    {
      "path": "../library1"
    },
    {
      "path": "../library2"
    }
  ]
}
However, when I try to bring this up in my development container, I get the error message:
The workspace cannot be opened in a container. Folder c:\..\library1 is not a subfolder of shared root folder c:\..\project.
I could pull the project definition (and the devcontainer.json file) one level up, but then they are not under source control of my project folder.
Any ideas how to resolve this?
It seems to be a design limitation. Even if you set the workspace root of the container accordingly, it still doesn't seem to be possible to reference workspace folders outside of the folder containing the workspace definition.
https://github.com/microsoft/vscode-remote-release/issues/387
To start, we could support an open workspace command that is the
equivalent of doing "Open Folder in Container" followed by "Open
Workspace" but does not resolve these two limitations. Specifically it
would:
Look for a .devcontainer/devcontainer.json or .devcontainer.json file
in the same folder as the .codeworkspace file. Mount this folder into
the container and open the workspace. The .codeworkspace file would
only be able to reference sub-folders.
It's 2022 - now using the updated Dev Containers extension.
I have the following structure (in short):
├───deployment
├───documentation
├───service1
│   ├───.vscode
│   └───sources
├───service2
│   ├───.vscode
│   └───sources
└───workspace
    ├───acme.app.code-workspace
    └───.devcontainer
        └───devcontainer.json
acme.app.code-workspace
{
  "folders": [
    {
      "path": "."
    },
    {
      "name": "Service 1",
      "path": "../service1"
    },
    {
      "name": "Service 2",
      "path": "../service2"
    },
    {
      "name": "Deployment",
      "path": "../deployment"
    },
    {
      "name": "Documentation",
      "path": "../documentation"
    }
  ]
}
devcontainer.json
Nothing really of interest here. Just put in what you need.
{
  "name": "Docker in Docker",
  "image": "mcr.microsoft.com/devcontainers/base:bullseye",
  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "version": "latest",
      "enableNonRootDocker": "true",
      "moby": "true"
    },
    "ghcr.io/devcontainers/features/kubectl-helm-minikube:1": {},
    "ghcr.io/devcontainers/features/python:1": {
      "version": "3.11"
    }
  }
}
So when you start the container, all should be fine.
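If the extension does not work out the shared root on its own, devcontainer.json also has explicit workspaceMount and workspaceFolder settings. A sketch, assuming you want to mount the repository root one level above the workspace folder (the target path is an arbitrary choice):
{
  // both properties are standard devcontainer.json settings; the paths
  // below are assumptions about this particular repo layout
  "workspaceMount": "source=${localWorkspaceFolder}/..,target=/workspaces/repo,type=bind",
  "workspaceFolder": "/workspaces/repo"
}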
The only drawback is the confirmation prompt that appears when starting the devcontainer; you have to decide whether that is acceptable.
This does not work if I bring the .devcontainer folder into its own workspace folder.
{
  "folders": [
    {
      "path": "."
    },
    {
      "path": "../.devcontainer"
    },
    ...
I need a workaround pretty quickly - this was a late surprise in the dev process when we added an Azure Function to our development ADF pipeline.
When you use a function app in ADF V2 and generate the ARM template, it does not parameterize the key references, unlike other linked services. Ugh!
So for CI/CD scenarios, when we deploy, we now have a fixed function app reference. What we'd like to do is the same as with other linked services - override the key parameters to point to the correct Dev/UAT/Production environment versions of the functions.
I can think of dirty hacks using PowerShell to overwrite them (does PowerShell support ADF functions yet? I don't know - in January it didn't).
Any other ideas on how to override function app linked service settings?
The key parameters are under typeProperties (assuming the function key is in Key Vault):
{ "functionAppUrl": "https://xxx.azurewebsites.net" }
{ "functionKey": { "store": { "referenceName": "xxxKeyVaultLS" } } }
{ "functionKey": { "secretName": "xxxKeyName" } }
Right now these are hard-coded from the UI settings - no parameter and no default.
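One avenue that may be worth checking: ADF supports a custom parameterization template (arm-template-parameters-definition.json) that can force additional properties into the generated ARM template parameters. A sketch of what that might look like for an AzureFunction linked service - the exact syntax ("=" keeps the current value as the parameter default) should be verified against the ADF documentation:
{
  "Microsoft.DataFactory/factories/linkedServices": {
    "AzureFunction": {
      "properties": {
        "typeProperties": {
          "functionAppUrl": "=",
          "functionKey": {
            "store": {
              "referenceName": "="
            },
            "secretName": "="
          }
        }
      }
    }
  }
}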
OK, eventually got back to this.
The solution looks like a lot, but it is pretty simple.
In my DevOps release, I create a PowerShell task after both the Data Factory ARM template has been deployed and the PowerShell task for deployment.ps1 with the "predeployment=$false" setting has run (see ADF CI/CD here).
I have a JSON file for each environment (dev/uat/prod) in my git repo. (I actually use a separate "common" repo to store scripts apart from the ADF git repo; its alias in DevOps is "_Common" - you'll see this below in the -File parameter of the script.)
The JSON file that replaces the deployed function linked service is a copy of the function linked service JSON in ADF and looks like this for DEV:
(scripts/Powershell/dev.json)
{
  "name": "FuncLinkedServiceName",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "annotations": [],
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://myDEVfunction.azurewebsites.net",
      "functionKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyvault_LS",
          "type": "LinkedServiceReference"
        },
        "secretName": "MyFunctionKeyInKeyvault"
      }
    },
    "connectVia": {
      "referenceName": "MyintegrationRuntime",
      "type": "IntegrationRuntimeReference"
    }
  }
}
...and the PROD file would be like this:
(scripts/Powershell/prod.json)
{
  "name": "FuncLinkedServiceName",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "annotations": [],
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://myPRODfunction.azurewebsites.net",
      "functionKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyvault_LS",
          "type": "LinkedServiceReference"
        },
        "secretName": "MyFunctionKeyInKeyvault"
      }
    },
    "connectVia": {
      "referenceName": "MyintegrationRuntime",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Then in the DevOps pipeline, I use a PowerShell script block that looks like this:
Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -File "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/$(varEnvironment).json" -Force
or for Az
Set-AzDataFactoryV2LinkedService -ResourceGroupName "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -DefinitionFile "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/Converter/$(varEnvironment).json" -Force
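A hedged sketch of how this can sit in the pipeline step, with a sanity check that the definition file parses before overwriting (the Write-Host line is just an illustration; the $(varXxx) macros are the pipeline variables described below):
# resolve the environment-specific definition file and fail early if it is not valid JSON
$defFile = "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/$(varEnvironment).json"
$definition = Get-Content -Path $defFile -Raw | ConvertFrom-Json
Write-Host "Overriding linked service '$($definition.name)' -> $($definition.properties.typeProperties.functionAppUrl)"
# force-overwrite just this linked service in the already-deployed factory
Set-AzDataFactoryV2LinkedService -ResourceGroupName "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -DefinitionFile $defFile -Force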
Note:
The $(varXxx) values are defined in my pipeline variables, e.g.
varFuncLinkedServiceName = FuncLinkedServiceName
varEnvironment = "DEV", "UAT" or "PROD", depending on the target release
-Force is used because the linked service must already exist in the Data Factory ARM deployment; we then need to force the overwrite of just the function linked service.
Hopefully MSFT will release a function app linked service that uses parameters, but until then, this has got us moving with the release pipeline.
HTH. Mark.
Update: Added the Az cmdlet version of the AzureRM command and changed to Set- ("New-Az..." worked, but in the new Az module there is only Set- for V2 linked services).
I am trying to publish a private extension for Azure DevOps on the Visual Studio Marketplace. It is a .vsix package. The packaging goes well and I upload the package, but it doesn't pass verification. I get the following error:
Extension validation error The task.json file was not found in contribution xxx
I don't know why I get this one, as I do have a task.json file. It is the first time I am trying to upload a package, so I really have no idea where the problem comes from.
As Shayki mentioned, that is one possible cause of the issue. Another possible cause is the folder/path name.
Make sure you give the files/folders the same name as the name in the properties:
"contributions": [
{
"id": "..."
"types": "..."
"targets": "..."
"properties": {
"name": "buildAndReleaseTask"
}
}
],
"files": [
{
"path": "buildAndReleaseTask"
}
]
For anyone who stumbles upon this question: the JSON file with your task configuration literally needs to be named "task.json". In your extension manifest, you give the name of each of your task folders, and each folder must contain its own task.json file.
In the vss-extension.json you have this section:
"contributions": [
{
"id": "..."
"types": "..."
"targets": "..."
"properties": {
"name": "buildAndReleaseTask"
}
}
]
In my case the task.json was in the buildAndReleaseTask folder, but the name in the properties was something else (the name you got in the error message). When I changed the name to buildAndReleaseTask (the folder where the task.json exists), the error disappeared.
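For completeness, a minimal sketch of the task.json that would live at buildAndReleaseTask/task.json (all field values here are illustrative placeholders; check the Azure DevOps task schema for the required fields):
{
  "id": "00000000-0000-0000-0000-000000000000",
  "name": "buildAndReleaseTask",
  "friendlyName": "Build and Release Task",
  "description": "Example task",
  "category": "Utility",
  "author": "you",
  "version": {
    "Major": 1,
    "Minor": 0,
    "Patch": 0
  },
  "instanceNameFormat": "Run buildAndReleaseTask",
  "execution": {
    "Node10": {
      "target": "index.js"
    }
  }
}
The folder containing this file is what the contribution's properties.name must match, as described above.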