If you want to configure your VS "Load Tests" to write their results to a database server, you use the following instructions.
If you want to run your "Load Tests" through PowerShell on a separate machine (think TFS 2018 release step), you use the following instructions.
I would like to do both, on multiple machines, in an automated manner, but there's not a great deal of documentation on this. I can run my tests like this:
.\mstest /testcontainer:"C:\XXX\ABC.loadtest"
But the results are kicked out to a "TRX" file rather than being placed into a database (there is some discussion on this). How do I put the results into an external database like when I run it locally (per the instructions above)?
Note: @AdrianHHH points out that the "TRX" file is only a summary and that most of the info is stored locally (in an MDF/LDF file) in the user folder of the current user running the load tests.
Update 1
Hmm, I wonder where this is persisted:
(Curiously, clicking the "?" icon in the "Manage Test Controller" box also yields nothing...)
It's not in the saved XML:
<RunConfigurations>
<RunConfiguration Name="Run Settings1" Description="" ResultsStoreType="Database" TimingDetailsStorage="AllIndividualDetails" SaveTestLogsOnError="true" SaveTestLogsFrequency="0" MaxErrorDetails="200" MaxErrorsPerType="1000" MaxThresholdViolations="1000" MaxRequestUrlsReported="1000" UseTestIterations="false" RunDuration="10" WarmupTime="0" CoolDownTime="0" TestIterations="100" WebTestConnectionModel="ConnectionPerUser" WebTestConnectionPoolSize="50" SampleRate="5" ValidationLevel="High" SqlTracingConnectString="" SqlTracingConnectStringDisplayValue="" SqlTracingDirectory="" SqlTracingEnabled="false" SqlTracingFileCount="2" SqlTracingRolloverEnabled="true" SqlTracingMinimumDuration="500" RunUnitTestsInAppDomain="true" CoreCount="0" ResourcesRetentionTimeInMinutes="0" AgentDiagnosticsLevel="Warning">
<CounterSetMappings>
<CounterSetMapping ComputerName="[CONTROLLER MACHINE]">
<CounterSetReferences>
<CounterSetReference CounterSetName="LoadTest" />
<CounterSetReference CounterSetName="Controller" />
</CounterSetReferences>
</CounterSetMapping>
<CounterSetMapping ComputerName="[AGENT MACHINES]">
<CounterSetReferences>
<CounterSetReference CounterSetName="Agent" />
</CounterSetReferences>
</CounterSetMapping>
</CounterSetMappings>
<LoadGeneratorLocations>
<GeoLocation Location="Default" Percentage="100" />
</LoadGeneratorLocations>
</RunConfiguration>
</RunConfigurations>
They're not persisted in my default "testsettings" file either:
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Local" id="02cad612-043b-447d-993e-a9b9b0547c9d"
xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
<Description>These are default test settings for a local test run.</Description>
<Deployment enabled="false" />
<Execution hostProcessPlatform="MSIL">
<TestTypeSpecific />
<AgentRule name="Execution Agents">
</AgentRule>
</Execution>
<Properties>
<Property name="TestSettingsUIType" value="UnitTest" />
</Properties>
</TestSettings>
So I need to find wherever this configuration information is being persisted; then maybe I can find a way to feed it to MSTest. Does anyone else understand how this works?
Update 2
My TRX file does contain a "connection string", but I don't think it points to my database; my database is empty. Running via PowerShell completes, but all I see is the "TRX" file.
Update 3
This one is tricky. I keep trying various ways to determine where this "Manage Test Configuration" data/credentials are being stored. One of the ways I did this was to use Microsoft's Process Monitor. You can actually see where it is initially being populated from:
It's from an Application Hive. Of course, that begs the question of where the "Application Hive" gets populated from, and that's where things get a bit murky; there are a lot of different calls to many files. A common trend is that the "Temp\Local" folder is often referenced.
I deleted the entire "Temp" folder for my user account (in the process losing all my VS configuration), and upon reopening my solution it appears this had an effect. When I pull up my "LoadTest" file, the "Load test results store" line is now empty. In fact, the entire "Manage Test Controller" window has been restored back to its default (empty) state.
I now believe that the configuration for this "Manage Test Controller" window is persisted in the temp folder. However, I've yet to locate where it is and/or how to change/automatically populate that information with a PowerShell script.
Finally figured this out. Basically, I used several tools to check which files were being modified when I changed the connection string, and the results made it obvious:
privateregistry.bin
Once I found this it was pretty obvious that VS was maintaining its own little registry hive. It's clearly stated in this post, so I opened it in the way described in the article and found the connection string:
This indicated that:
"The SQL Connection String is NOT stored in the loadtest files. The
setting seems to be PC specific so I had to change it on the build
server - in one loadtest file (address.loadtest) as shown, then all
the other loadtests adopt the same connection string."
So that's basically what I did: I logged into each build server and configured them so that they write all their results to my database rather than locally.
Load tests are clearly not designed to make this process easy, and I don't think many people have attempted to do what I've done. All the articles just tell you to use their cloud service, and I'm pretty sure that only covers web tests. If you're using load testing to test unit tests, you're pretty much out of luck (without this workaround). I really hope this gets official support in the future; it would be really nice to both run and view all types of load tests from TFS. For now, though, I'm going to have to keep using this workaround.
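For anyone who wants to script this instead of clicking through VS on every build server, the sketch below is one speculative way to peek inside that private hive. It is not part of the workaround above: it assumes the Win32 RegLoadAppKey API, that Visual Studio is closed (it locks the hive while running), and an illustrative hive path that will differ per machine and VS instance.

using System;
using System.Runtime.InteropServices;
using Microsoft.Win32;
using Microsoft.Win32.SafeHandles;

class PrivateHiveDump
{
    // RegLoadAppKey loads a hive file as an "application hive" without
    // mounting it under HKLM/HKU.
    [DllImport("advapi32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern int RegLoadAppKey(string file, out SafeRegistryHandle hKey,
                                    int samDesired, int options, int reserved);

    const int KEY_READ = 0x20019;

    static void Main(string[] args)
    {
        // Illustrative path only; the "15.0_xxxxxxxx" instance folder differs per machine.
        string hivePath = args.Length > 0
            ? args[0]
            : @"C:\Users\me\AppData\Local\Microsoft\VisualStudio\15.0_xxxxxxxx\privateregistry.bin";

        int rc = RegLoadAppKey(hivePath, out SafeRegistryHandle handle, KEY_READ, 0, 0);
        if (rc != 0) throw new System.ComponentModel.Win32Exception(rc);

        using (handle)
        using (RegistryKey root = RegistryKey.FromHandle(handle))
        {
            Dump(root);
        }
    }

    // Recursively print string values that look like SQL connection strings.
    static void Dump(RegistryKey key)
    {
        foreach (string valueName in key.GetValueNames())
        {
            if (key.GetValue(valueName) is string s &&
                s.IndexOf("Initial Catalog", StringComparison.OrdinalIgnoreCase) >= 0)
            {
                Console.WriteLine(key.Name + "\\" + valueName + " = " + s);
            }
        }

        foreach (string subKeyName in key.GetSubKeyNames())
        {
            using (RegistryKey sub = key.OpenSubKey(subKeyName))
            {
                if (sub != null) Dump(sub);
            }
        }
    }
}

From there you could, in principle, locate and edit the relevant value, although simply setting the results store once through the VS UI on each build server (as described above) is what I actually did.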
Related
I am doing some initial one-off setup using the [BeforeTestRun] hook for my SpecFlow tests. It checks whether certain users exist and creates them with specific roles and permissions if they don't, so the automated tests can use them. The function that does this prints a lot of useful information via Console.WriteLine.
When I run the tests on my local system I can see the output from this hook function on the main feature file, and the output of each scenario under each of them. But when I run the tests via an Azure DevOps pipeline, I am not sure where to find the output for the [BeforeTestRun] hook, because it is not bound to a particular test scenario. The console of the Run Tests task has no information about this.
Can someone please help me get this output to show up somewhere so I can act accordingly?
I tried System.Diagnostics.Debug.Print, System.Diagnostics.Debug.WriteLine and System.Diagnostics.Trace.WriteLine, but nothing seems to show up in the pipeline console.
[BeforeTestRun]
public static void BeforeRun()
{
    Console.WriteLine(
        "Before Test run analyzing the users and their needed properties for performing automation run");
}
I want my output to be visible somewhere so I can act on that information if needed.
It's not possible for the console logs.
The product currently does not support printing console logs for passing tests and we do not currently have plans to support this in the near future.
(Source: https://developercommunity.visualstudio.com/content/problem/631082/printing-the-console-output-in-the-azure-devops-te.html)
However, there's another way:
Your build will have an attachment with the file extension .trx. This is an XML file and contains an Output element for each test (see also https://stackoverflow.com/a/55452011):
<TestRun id="[omitted]" name="[omitted] 2020-01-10 17:59:35" runUser="[omitted]" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2020-01-10T17:59:35.8919298+01:00" queuing="2020-01-10T17:59:35.8919298+01:00" start="2020-01-10T17:59:26.5626373+01:00" finish="2020-01-10T17:59:35.9209479+01:00" />
  <Results>
    <UnitTestResult testName="TestMethod1">
      <Output>
        <StdOut>Test</StdOut>
      </Output>
    </UnitTestResult>
  </Results>
</TestRun>
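If you want to pull that output out automatically rather than reading the attachment by hand, a minimal sketch along these lines should work. It only assumes the TRX schema shown above; the file name and how you download the attachment are up to you:

using System;
using System.Xml.Linq;

class TrxOutputReader
{
    static void Main(string[] args)
    {
        // Path to the downloaded .trx attachment; "testresults.trx" is just an example name.
        var doc = XDocument.Load(args.Length > 0 ? args[0] : "testresults.trx");
        XNamespace ns = "http://microsoft.com/schemas/VisualStudio/TeamTest/2010";

        // Print the captured StdOut of every test result that has one.
        foreach (var result in doc.Descendants(ns + "UnitTestResult"))
        {
            var stdOut = result.Element(ns + "Output")?.Element(ns + "StdOut")?.Value;
            if (!string.IsNullOrEmpty(stdOut))
                Console.WriteLine(result.Attribute("testName")?.Value + ": " + stdOut);
        }
    }
}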
Going through MSDN and various other posts online, I understand that we can do away with the older method of INI files for writing and retrieving configuration details such as the database path/connection string. (refer)
I am able to get the connection string data from an external file ("abcConnStr.config") and am able to read from and write to the database tables as one would normally expect.
However, I am unable to work out how to change/update the connection string value in the external file ("abcConnStr.config"). If I try to pass the value manually through a text box, I get a message saying "Your step-into request resulted in an automatic step-over of a property or operator...". When I click "Yes", the sub completes without any further message or error. However, when I open the config file ("abcConnStr.config"), I find that the value of the connection string has not changed.
Could you guide me on how to change the connection string parameter in the external config file ("abcConnStr.config")?
Additional info:
Here is the external config file that I am using:
File name: connStr.config
<?xml version="1.0" encoding="utf-8" ?>
<connectionStrings>
<add name="NewConStr"
connectionString="Provider=Microsoft.ACE.OLEDB.12.0;Data Source=&
quot;D:\CATALOG\CATALOG_DB\CAT_DB.accdb""
providerName="System.Data.OleDb" />
</connectionStrings>
I am using the following line in the application.exe.config (app.config) file to access the database:
<connectionStrings configSource="connStr.config" />
I am able to connect to the database if it resides in the file location defined in the external config file. However, when I change the database location (put it in another folder or change the name), the connection to the DB fails, as expected. At this point I catch the exception and try to call the sub that should change the path value assigned to the "connectionString".
In the sub I want to run code to change the DB file path to the one selected by the user through "OpenFileDialog".
Kindly advise a possible way to achieve this.
I got the solution here.
However, I have had to do away with the external config file (for now, until I find a solution for it) and have ended up making changes to the application.exe.config file instead. It is working seamlessly.
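For reference, here is a minimal sketch of the approach (C# shown; the VB equivalent uses the same System.Configuration types and needs a reference to System.Configuration). The "NewConStr" name comes from the config above; how the OLE DB string is rebuilt after the user picks a new file is my own assumption:

using System.Configuration;

static class ConnectionStringUpdater
{
    public static void SetDatabasePath(string newDbPath)
    {
        // Open the application.exe.config of the running executable.
        var config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
        var setting = config.ConnectionStrings.ConnectionStrings["NewConStr"];
        if (setting == null) return; // name not found; nothing to update

        // Rebuild the OLE DB connection string around the newly selected .accdb file.
        setting.ConnectionString =
            "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\"" + newDbPath + "\"";

        config.Save(ConfigurationSaveMode.Modified);
        ConfigurationManager.RefreshSection("connectionStrings");
    }
}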
I am using VB 2010 Pro.
Thanks.
I have two folders for my migrations (AuthContext and UserProfileContext); each has its own migration and some custom SQL to run afterwards for data migrations and whatnot.
This works fine when using the Package Manager Console. I:
Restore from production
Run Update-Database -ConfigurationTypeName Migrations.Auth.Configuration
Run Update-Database -ConfigurationTypeName Migrations.UserProfile.Configuration
Then everything is very happy in the new database: migrations executed, data shuffled where it needs to be.
I tried to test out the migrations on publish piece by:
Restore production on dev database
Single connection string (all contexts use the same) pointed to dev database
Publish to Azure web site
Checked the box for Apply Code First Migrations
Selected that single connection string
Okay, it published fine; however, when I went to look at the database, nothing had happened! It did not create the necessary tables or columns, or perform the data moves.
TL;DR: Code First migrations are not running after publishing to Azure.
Update 1
I've tried every combination of the below: there is only one single connection string, so I'm guessing that's not the issue, and "Execute Code First Migrations" is checked.
On publish, the API runs but no database changes are made. I thought perhaps I needed to hit it first, but I just get random errors when I try to use the API (which now of course relies on the new database setup), and the database is still not changed.
I've seen a couple of references out there about needing to add something to my Startup class, but I'm not sure how to proceed.
Update 2
I solved one issue by adding "Persist Security Info=True" to my connection string. Now it actually connects to the database and calls my API; however, no migrations are running.
I attached a debugger to the Azure dev environment and stepped through... on the first database call it steps into the Configuration class for the migration in question, then barfs, and I can't track down the error.
public Configuration()
{
    AutomaticMigrationsEnabled = false;
    MigrationsDirectory = @"Migrations\Auth";
    ContextKey = "AuthContext";
}
Update 3
Okay, I dug down, and the first time it hits the database we're erroring. Yes, this makes sense since the model has changed, but I have migrations in place, enabled, and checked! Again, it works fine when running "Update-Database" from the Package Manager Console, but not when using Execute Code First Migrations during publish to Azure:
The model backing the 'AuthContext' context has changed since the database was created. Consider using Code First Migrations to update the database (http://go.microsoft.com/fwlink/?LinkId=238269).
Update 4
Okay, I found the root issue here. VS sets up the additional web.config databaseInitializer attribute for only one of my database contexts, and the one not mentioned is in fact hit first by my app.
So now I have to figure out how to get it to include multiple contexts, or combine all of my stuff into a single context.
The answer to this post is not very detailed.
This article explains what I had to do to fix a similar problem to this:
https://blogs.msdn.microsoft.com/webdev/2014/04/08/ef-code-first-migrations-deployment-to-an-azure-cloud-service/
I'll roughly describe the steps I had to take below:
Step 1
Add your connection strings to your DbContexts; in my situation, they were both the same.
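As a rough illustration of this step (the context and connection string names are borrowed from the question above, so treat them as assumptions), each context points at a named connection string in web.config:

using System.Data.Entity;

public class UserProfileContext : DbContext
{
    // "UserProfileContext" is the name of a connection string in web.config.
    public UserProfileContext() : base("name=UserProfileContext") { }
}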
Step 2
Add this to your web.config
<appSettings>
<add key="MigrateDatabaseToLatestVersion" value="true"/>
</appSettings>
Step 3
And add this to the bottom of your global.asax.cs / Startup.cs (OWIN startup):
var configuration = new Migrations.Configuration();
var migrator = new DbMigrator(configuration);
migrator.Update();
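With two configurations like the ones in this question (Migrations.Auth.Configuration and Migrations.UserProfile.Configuration), I'd expect the same idea to look roughly like the sketch below; gating on the appSetting from step 2 is my own assumption about how that flag is meant to be used:

using System.Configuration;
using System.Data.Entity.Migrations;

public static class StartupMigrations
{
    public static void Run()
    {
        // Only migrate when the web.config flag from step 2 is present and true.
        bool migrate;
        bool.TryParse(ConfigurationManager.AppSettings["MigrateDatabaseToLatestVersion"], out migrate);
        if (!migrate) return;

        // Run each context's migrations in turn.
        new DbMigrator(new Migrations.Auth.Configuration()).Update();
        new DbMigrator(new Migrations.UserProfile.Configuration()).Update();
    }
}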
Solved! To summarize the solution for posterity:
Enabling Code First Migrations only enables them for one base connection string per checkbox checked, regardless of how many contexts have migrations against that base connection string. So in my case I broke the two in question out into two different connection strings.
Then I was hitting other errors and identified that if you're changing the base connection string for the model backing ASP.NET Identity, you need to include (for a one-time publish) the additional flag base("AuthContext", throwIfV1Schema: false).
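For context, that flag sits on the identity-backed context's constructor, roughly like this (ApplicationUser is the usual ASP.NET Identity template class and is assumed here):

using Microsoft.AspNet.Identity.EntityFramework;

public class ApplicationUser : IdentityUser { }

public class AuthContext : IdentityDbContext<ApplicationUser>
{
    public AuthContext()
        : base("AuthContext", throwIfV1Schema: false)
    {
    }
}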
For anyone who has this issue and may have overlooked the following: be sure to check that you have correctly set the connection string in your Web.config file and/or Application settings on Azure. This includes DefaultConnection and DefaultConnection_DatabasePublish.
In our case the former was correct but the latter contained the wrong database instance because it had been carried over from an App Service clone operation. Therefore the wrong database was being migrated.
I want to run Spring XD with Oracle (11g), which I already have in my environment. Currently my first concern is the jobs UI (my database has existing data from job executions that were performed by Spring Batch, and I simply want to display the details of those executions).
I'm using spring-xd-1.0.0.M5. I followed the instructions in the reference guide and changed application.yml to have the following:
spring:
  datasource:
    url: jdbc:oracle:oci:MY_USERNAME/MYPWD@//orarmydomain.com:1521/myservice
    username: MY_USERNAME
    password: MYPWD
    driverClassName: oracle.jdbc.OracleDriver
  profiles:
    active: default,oracle
I also modified batch-jdbc.properties to have a database configuration similar to the above.
Yet, when I start xd-singlenode.bat (or xd-admin.bat), it seems to ignore my Oracle configuration and still uses the default HSQLDB.
What am I doing wrong?
Thanks
The likely reason is that we did not upgrade the Windows .bat scripts to take advantage of the property overriding via xd-config.yml. If you go into the Unix script for xd-singlenode, you will see that when java is invoked there is an option:
-Dspring.config.location=$XD_CONFIG
For now you can hardcode your location of that file; use file: as the prefix.
Also, the UI right now is very primitive; you will not be able to see many details about the job execution. There are, however, many job-related commands you can execute in the shell, and there is only one gap regarding step execution information as compared to what is available via spring-batch-admin.
The issue to watch for this is https://jira.springsource.org/browse/XD-1209 and it is scheduled for the next milestone release.
Let me know how it goes, thanks!
Cheers,
Mark
When I add "" to my ServiceDefinition.csdef, it fails to start when publishing:
<WebRole name="xxx" vmsize="Small" enableNativeCodeExecution="true">
<Runtime executionContext="elevated" />
<Sites>...
Everything works with it gone. I need to add it to specify the machine key in Azure SDK 1.3, as described here: http://msdn.microsoft.com/en-us/library/gg494983.aspx
Anyone else run into this?
Would running under admin privileges cause other code to break? When I RDP in, the error I find when running on localhost is not related to this, but it's code that works when this Runtime line is removed.
I just checked the Service Definition Schema and the Runtime element does not exist inside WebRole.
Actually it does.
<WebRole name="<web-role-name>" vmsize="[ExtraSmall|Small|Medium|Large|ExtraLarge]" enableNativeCodeExecution="[true|false]">
...
<Runtime executionContext="[limited|elevated]">
I just checked the Service Definition Schema and the Runtime element does not exist inside WebRole.
If you would like to have Full Trust, you should go to your CloudService project properties and set it there. But I really don't know if you need that to specify the machine key.
Best regards