I would like to use a .MDF SQL Server database file for the unit tests that run in a Visual Studio Test task when deploying via an Azure DevOps pipeline.
I added the .MDF and .LDF files to the unit test project, and the tests run locally. I have confirmed that the files are properly deployed to the pipeline using a PowerShell script that lists the folder contents, and I have ruled out case-sensitivity issues by ensuring everything is upper-case.
When adding a Visual Studio Test step to an existing working pipeline, I initially received this error when attempting to save to the database during test initialization:
Microsoft.Data.SqlClient.SqlException: Failed to update database "D:\A\1\S\APPCORETESTS\BIN\RELEASE\NETCOREAPP3.1\MYTESTDATABASE.MDF" because the database is read-only
I added the following call during test initialization:
ALTER DATABASE [MYTESTDATABASE.MDF] SET READ_WRITE
and am now getting the following error:
Unable to open the physical file "D:\a\1\s\AppCoreTests\bin\Release\netcoreapp3.1\MYTESTDATABASE.mdf". Operating system error 5: "5(Access is denied.)"
Unable to open the physical file "D:\a\1\s\AppCoreTests\bin\Release\netcoreapp3.1\myTestDatabase_log.ldf". Operating system error 5: "5(Access is denied.)".
I am able to open a connection to the .MDF file using the following connection string (taken from a config file, hence the doubled backslashes):
"Server=(LocalDB)\\MSSQLLocalDB;AttachDbFileName='|DataDirectory|\\MYTESTDATABASE.MDF'"
and I am able to run a query that returns several databases, including mine (as well as tempdb, master, etc.):
select name from sys.databases
If this error were occurring locally, I would set the permissions on the folder in which the .MDF / .LDF files reside, but I do not know whether that is possible in a DevOps pipeline or whether it is the correct approach to solving this problem.
Which agent are you using, a hosted agent or a self-hosted agent?
An Azure DevOps pipeline accesses files via a service account, so you should check that the service account has sufficient permissions.
If you are using a hosted agent, it accesses files via the account {Project Name} Build Service ({Org Name}). Open Project Settings -> Repositories, select the repo, and check that build service account's permissions; also check the settings under Pipelines -> Settings.
If you are using a self-hosted agent accessing a local file, you should check the file permissions. Steps: select the file -> Properties -> Security; the service account should appear as Administrators ({Agent.ComputerName}\Administrators) or Users ({Agent.ComputerName}\Users).
By the way, you can change the agent service account to your own account.
Steps: open Services on the agent machine, find the agent service, and change the logon account name and password to yours; the agent will then use that account to perform the operation.
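For the pipeline case, here is a minimal sketch of an inline PowerShell step that makes the database files writable before the tests run; the folder path and the Everyone grant are assumptions based on the error messages above:

# Inline PowerShell task; $(Build.SourcesDirectory) is expanded by Azure DevOps before the script runs.
$dbDir = "$(Build.SourcesDirectory)\AppCoreTests\bin\Release\netcoreapp3.1"
# Clear the read-only attribute that the checkout or copy step may have left on the files.
Get-ChildItem -Path $dbDir -Recurse -Include *.mdf,*.ldf | ForEach-Object { $_.IsReadOnly = $false }
# Grant modify rights on the folder and its contents so LocalDB can attach the files.
icacls $dbDir /grant "Everyone:(OI)(CI)M"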
Related
Our pipeline mounts some SQL Server .mdf files as LocalDBs in order to run some tests.
After switching the pipeline from windows-2019 to windows-2022, our tests requiring the DB generate the following errors:
Database 'xxxxxxxxxxx' cannot be upgraded because it is read-only, has read-only files or the user does not have permissions to modify some of the files. Make the database or files writeable, and rerun recovery
The .mdf files are not read-only databases and mount fine locally.
I have tried recursively turning off the read-only flags on the directory, in case the 2022 version of Windows sets different defaults on the build directories, but that has not helped. Any suggestions for why this might have happened, or a fix?
An Azure DevOps pipeline accesses files via a service account, so you should check that the service account has sufficient permissions.
If you are using a Microsoft-hosted agent, it accesses files via the account {Project Name} Build Service ({Org Name}). Open Project Settings -> Repositories, select the repo, and check that build service account's permissions; also check the settings under Pipelines -> Settings.
If you are using a self-hosted agent accessing a local file, check the file permissions and make sure the service account has write permission on the files.
Select the file -> Properties -> Security; the service account should appear as Administrators ({Agent.ComputerName}\Administrators) or Users ({Agent.ComputerName}\Users) with the correct permissions granted.
Try setting the file permissions of the .mdf / .ldf files to writable for BUILTIN\Users or for the service account.
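For example, a hedged icacls sketch (file names are taken from the errors above; run it from the folder containing the files):

icacls "MYTESTDATABASE.mdf" /grant "BUILTIN\Users:M"        # M grants modify access
icacls "myTestDatabase_log.ldf" /grant "BUILTIN\Users:M"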
I have a Windows 10 Enterprise VM running an Azure DevOps agent in interactive mode. The agent runs using the only user the machine has, which is an Administrator with UAC disabled. However, when executing tasks that require an elevated command prompt, such as registering DLLs, the command fails with the following error message:
##[error]Cmd.exe exited with code '5'.
And this message appears when I'm trying to copy files into Windows\SysWow64:
##[error]Error: Failed cp: cp: copyFileSync: could not write to dest file (code=EPERM):C:\windows\SysWow64\test.txt
My tests also fail with the following error message:
Test method TestesRegressaoPGB.Autenticacao.AutenticarNoPGBL02 threw exception:
System.AggregateException: One or more errors occurred. (The requested operation requires elevation.) ---> System.ComponentModel.Win32Exception: The requested operation requires elevation.
TestCleanup method TestesRegressaoPGB.Autenticacao.Cleanup threw exception. System.NullReferenceException: System.NullReferenceException: Object reference not set to an instance of an object..
How do I run all commands while elevated?
In Azure DevOps, if you access a local file in the pipeline via a self-hosted agent, the pipeline accesses the file via the agent's service account instead of your personal account.
Workaround
Check the file permissions and grant the service account access. The service account has the form {Agent.ComputerName}\User, {Agent.ComputerName}\Administrator, or {Agent.ComputerName}\Administrators.
Alternatively, you can change the agent service account to your own account.
Steps: open Services on the agent machine, find the agent service, and change the logon account name and password to yours; the agent will then use that account to perform the operation.
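A hedged command-line equivalent using sc.exe; the service name follows the common vstsagent.{org}.{agent} pattern, and the account and password are placeholders:

# Note: sc.exe requires a space after obj= and password=.
sc.exe config "vstsagent.MyOrg.MyAgent" obj= "MYPC\MyUser" password= "MyPassword"
sc.exe stop "vstsagent.MyOrg.MyAgent"
sc.exe start "vstsagent.MyOrg.MyAgent"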
I just added permissions on each and every folder the pipeline agent needed to open. As I said to Vito, the agent was not running under a service account.
I have a Python script that executes an automation script on a remote SUT, and the script works when executed locally with user tester and password xxx.
When I build the Azure DevOps pipeline, I check out the project from Git onto the agent and then try to execute the code from the command line:
cd .\MatrixPro\TestFramework
python .\main.py -t profaund_tests.matrix_pro_rf_energy_across_impedances
This code gives me an error:
E PermissionError: [WinError 5] Access is denied:
'//192.168.1.100\c$\'
It seems that the script tries to create a report file on the SUT and doesn't have permission.
Moreover, the Azure agent user has admin permission, but I suspect that I need to switch to the local user before executing the command.
Note: I'm working on Windows 10.
What is the right method to solve this issue? How can I figure out why this error occurs?
Is there a simple way to change the pipeline permissions to work on the local agent with a local user and password?
When you run the build pipeline on Azure DevOps, it is actually the build service account that runs the script. You should make sure the build service account has sufficient permission to access '//192.168.1.100\c$\'.
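To confirm which account the script actually runs under, a quick diagnostic step could be added to the pipeline (a sketch; the UNC path is copied from the error above with the slashes normalized):

whoami                            # prints the agent's logon account, e.g. NT AUTHORITY\NETWORK SERVICE
Test-Path '\\192.168.1.100\c$\'   # returns False or errors if that account cannot reach the share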
To change the identity of the build agent, go into Windows Services and change the identity of the related build service (service name "Azure Pipelines Agent"):
In the Services pane, right-click Azure Pipelines Agent.
Under Service Status, click Stop.
Click the Log On tab.
Specify the account you want to use for the service in the This account text box.
Type the new password in the Password text box, and then type the new password again in the Confirm password text box.
Under Service Status, click Start.
Alternatively, remote into the server holding the build agent as a candidate user and manually run the script to perform the deploy process. If that user is able to deploy without any permission issue, simply use that user as the build service account of the Azure DevOps agent.
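A hedged way to test a candidate user without a full remote session; runas prompts for the password, and the user name tester comes from the question:

runas /user:tester "powershell -NoProfile -Command Test-Path '\\192.168.1.100\c$\'"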
Hope this helps.
When building an artifact using Azure DevOps, the "Publish artifact" task failed with this error:
Publishing build artifacts failed with an error: EPERM: operation not permitted
When you are using the "A file share" option as the artifact publish location for the publish task, make sure the agent/service account has access to that server.
In my case, I added the agent account to the Administrators group so it could perform write operations.
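For example, a sketch run from an elevated prompt on the file server (the account name is a placeholder):

net localgroup Administrators "SERVER\AgentAccount" /add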
If you use a self-hosted agent, by default it runs as the local NETWORK SERVICE account, which doesn't have access to remote shares. So you should either:
Change the agent's logon account. Microsoft recommends not changing it from the Services snap-in, but rather reconfiguring the agent from its own scripts (see the sketch after this list).
Grant the local NETWORK SERVICE account access to the remote share. To do this, add the computer account of the agent machine (i.e. "DOMAIN\ServerA$") to the share and NTFS permissions on the shared folder.
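A hedged sketch of both options; the URL, pool, account names, share name drop, and folder path are placeholders, and --windowsLogonAccount / --windowsLogonPassword are configuration flags of the Windows agent:

# Option 1: reconfigure the agent (run from the agent folder) to use a specific logon account.
.\config.cmd --unattended --url https://dev.azure.com/MyOrg --auth pat --token <PAT> `
    --pool Default --agent MyAgent --runAsService `
    --windowsLogonAccount "DOMAIN\AgentUser" --windowsLogonPassword "P@ssw0rd"

# Option 2: on the file server, grant the agent machine's computer account access to the share.
Grant-SmbShareAccess -Name "drop" -AccountName "DOMAIN\ServerA$" -AccessRight Change -Force
icacls "D:\drop" /grant "DOMAIN\ServerA$:(OI)(CI)M"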
I am using visualstudio.com Team Services to build and deploy an ASP.NET website to two Azure VMs.
I have a build which, on completion, triggers a release to my two servers in a deployment group. When you configure a deployment group for Visual Studio Team Services, you create an agent that by default runs as NT AUTHORITY\SYSTEM.
If I publish my build artifacts to Azure (the server option), then everything works fine and deployment succeeds to both of my VMs. However, when using a file drop I get the following error:
The artifact directory does not exist: \\MACHINE1\drop\RRStore\20170517.20. It can happen if the password of the account NT AUTHORITY\SYSTEM is changed recently and is not updated for the agent.
This is basically saying MACHINE2 cannot access \\MACHINE1\drop due to permissions. In Windows I can bring up this folder just fine, but since the agent is running as NT AUTHORITY\SYSTEM it cannot access it.
I want to use a file drop because my website is about 250 MB (although in the meantime I am using the 'publish to server' option and deploying via Team Services).
I am unclear how to grant permissions on the file drop, though, as the agent is running as SYSTEM. The machines are in a WORKGROUP, and giving permissions to 'Everyone' does not seem to work.
What is the correct way to configure access to a VSTS drop folder so that the deployment agent can access it?
A few possible options:
Set up a domain (I tried doing this but then I need a new network interface and it sounds clunky)
Continue using teamservices to deploy the artifacts (or reduce the website size!)
Save to a storage account, but again I'm not sure how to configure that.
Run as a different user account
I have similar problems when deploying with VSTS. Instead I chose to:
Run the VSTS agent on the deployment group VM as a local user with limited access.
Impersonate that account on the deployment group VM to test its access to the drop folder (see the sketch after this list).
Save/cache a different credential to access the drop folder if applicable, so the sensitive information stays on the VM.
The cached credential can be a different local user account created on the drop server just for this purpose.
Grant the local user explicit access only to the parts of the file system it needs, to limit the access permissions of this VSTS agent service account.
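A hedged sketch of the impersonation and credential-caching steps; machine names, user names, and the password are placeholders:

# Test drop-folder access while impersonating the agent's local user (runas prompts for the password).
runas /user:MACHINE2\vstsagent "powershell -NoProfile -Command Test-Path '\\MACHINE1\drop'"
# Inside that user's session, cache a dedicated credential for the drop server.
cmdkey /add:MACHINE1 /user:MACHINE1\dropuser /pass:S3cret!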
This should work in most cases; in fact, I use this same approach in my VSTS, Jenkins, and TFS instances. It should save you from having to set up a domain to solve this problem.
This may not be the best practice, but at least it should get you started in the right direction.