NServiceBus not creating queues automatically (MSMQ)

I'm working through the examples from the NServiceBus website (http://docs.particular.net/samples/step-by-step/). When I run the solution in Visual Studio everything works and the queues are created automatically (even if I delete them and re-run the solution, they get created again). The dev machine runs Windows 8.
I have a virtualized Windows Server 2008 R2 machine - when I take my solution's bin folder to the server machine and try to run NServiceBus.Host.exe, it eventually crashes with: "The queue does not exist or you do not have sufficient permissions to perform the operation" - even when I run with administrative privileges. When looking at the queues I see that they were not created, and if I manually create the queues then it works.
Why don't the queues get created automatically when I run on a different machine?
Update
If I run NServiceBus.Host.exe /install, then it creates the queues and also installs it as a Windows service. But what I would like to do is run it normally. If I just run the exe, is it considered self-hosted?

You can run NServiceBus.Host.exe with the NServiceBus.Integration argument; that should create your queues...
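For example, from the endpoint's bin folder you could run something like the following (a minimal sketch; profile support and exact host behaviour depend on your NServiceBus version):

```powershell
# Run the endpoint with the Integration profile so queues are created at startup.
.\NServiceBus.Host.exe NServiceBus.Integration

# Or install it as a Windows service; installation also creates the queues.
.\NServiceBus.Host.exe /install
```

Running the exe directly starts the endpoint as a console process; /install is what registers it as a Windows service.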
For more info on profiles take a look here and here.
HTH

Related

Why won't .NET MAUI Run as "Windows Machine" On Different Development PCs

I have a MAUI application that I put into source control, then pulled down to develop on a different machine. On the second machine, in Visual Studio 2022, I no longer had the option to run "Windows Machine"; it was replaced with the name of the project, and the app wouldn't run. It attempts to start but breaks in App.g.i.cs with the message
System.DllNotFoundException: 'Unable to load DLL 'Microsoft.ui.xaml.dll' or one of its dependencies: The specified module could not be found. (0x8007007E)'
How do I set up a MAUI application so that I can run it on different development PCs?
Steps to Replicate / Steps to Confirm Issue
To confirm the issue and attempt to see what the difference is, I did the following:
Machine 1: Create MAUI App from Template (no changes); confirm "Windows Machine" option for debugging; and it runs as expected
Machine 2: Create MAUI App from Template (no changes); confirm "Windows Machine" option for debugging; and it runs as expected
Copy Code from Machine 1 to Machine 2: Windows Machine option not available; application won't start up with above exception
Copy Code from Machine 2 to Machine 1: Windows Machine option not available; application won't start up with above exception
Use Git to compare the two applications: it looks like the only differences are OpenSans-Regular.ttf, OpenSans-Semibold.ttf, and the ApplicationIdGuid ... note that if I change those values (copy the files, change the GUID) it still won't permit Machine 1's code to have the "Windows Machine" option on Machine 2
Assumption: I'm assuming there is some difference between the two machines that is causing this, but I'm not sure how to determine that difference and resolve it; both are running VS2022 17.3.6 64-bit; Xamarin 17.3.0.296; .NET Core 6
It seems the symptom that shows the issue is whether the MAUI app offers "Windows Machine" as an option to run the application: if the app was created on that machine you get it, but if you move that code to another machine you see the csproj name instead and get this exception.
Desired Outcome: My goal here is to have a single code base, committed in Git, that I can develop on two different machines as a MAUI app. I don't mind if there are settings or similar needed to support this, or if one machine needs updates; how to determine and get to that state is the issue / question.
Any help on a resolution, or on why this would happen, is appreciated (note: all non-MAUI apps work across these two machines: console apps, Windows services, web APIs, web apps, etc. Commit on one, run on the other; it is only MAUI apps having this difficulty).
Resolution: launchSettings.json was not put into source control; copying launchSettings.json over made everything work as expected.
It looks like the issue here was less about the machine this was running on and more that launchSettings.json was not on the target machine when transferring the code.
I copied the launchSettings.json from the template-created project to the target machine and things worked as expected from there.
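For reference, a quick way to restore the file on the target machine; the JSON body below is an approximation of what the MAUI template generates, so treat it as a sketch rather than the exact file from this project:

```powershell
# Recreate Properties/launchSettings.json if it went missing when the code was transferred.
# The profile content is an approximation of the default MAUI template output.
$launchSettings = @'
{
  "profiles": {
    "Windows Machine": {
      "commandName": "MsixPackage",
      "nativeDebugging": false
    }
  }
}
'@

$path = Join-Path (Get-Location) 'Properties\launchSettings.json'
if (-not (Test-Path $path)) {
    New-Item -ItemType Directory -Path (Split-Path $path) -Force | Out-Null
    Set-Content -Path $path -Value $launchSettings
}
```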

Get console outputs from a script running inside VM on Pipeline

I have a pipeline with the steps below:
Create a Resource Group
Create a Virtual Machine from an image
Copy a Python project to VM created in the previous step
This Python project is an RPA (Robotic Process Automation) robot. Now, I need to execute this Python project inside the VM and get its output to know what is going on and what the robot is doing. This robot reads some sites and internal software.
Is there a way to execute a script inside a Virtual Machine from a Pipeline and get its console outputs? Any clue?
I saw this running on Jenkins.
If you are having Microsoft host the VM for you, there may be a way to "talk" to said VM as it is hosted on Microsoft's platform. However, the only way that I know of is to deploy an agent to the VM and select it as a resource target to run tasks on, such as copying your Python project and even running it.
Have you attempted this yet?
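If the VM is an Azure VM, one possible approach (not the agent-based one described above, so treat this as an assumption) is the Az PowerShell run-command support, which executes a script inside the VM and returns its console output; the resource group, VM name, and script path below are placeholders:

```powershell
# Execute a script inside the VM and capture its console output.
# Resource group, VM name, and script path are placeholders for this sketch.
$result = Invoke-AzVMRunCommand `
    -ResourceGroupName 'rpa-resource-group' `
    -VMName 'rpa-vm' `
    -CommandId 'RunPowerShellScript' `
    -ScriptPath '.\run-robot.ps1'

# The script's stdout/stderr come back in the Message field.
$result.Value[0].Message
```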

How to run a foreground task through WinRM / remote PowerShell

I'm trying to incorporate a test suite that runs nightly on an Azure VM.
As of now, I have a Build process using TFS2015 that publishes my test files, starts the VM, and copies the files.
I'm trying to then use the "PowerShell on Target Machine" task to execute a script that launches a batch file. The reason it'll be executing a batch file is because I can't have the build process wait until that script is finished (it takes around 3 hours for the tasks in the batch file to complete).
My initial logic was to have the PowerShell script create a task using schtasks. This part works and the task itself is created on the virtual machine; however, it never runs at the scheduled time.
The other issue is that if I manually create these tasks, the task is executed, but everything is executed in the background. I need everything to be executed in the foreground.
I'm aware that this is by design since you should not be able to run foreground processes/applications remotely since it isn't "your session".
So the question remains: are there any workarounds to this?
I'm trying to launch Selenium web servers and then execute Protractor automation tests on the virtual machine. So one batch file starts the Selenium server and the second launches Protractor with a defined suite. If these are run in the background (essentially headless), all my tests break.
Any insight would be helpful, or if I need to expand on my question or provide further details please let me know. Thanks.
I'm aware that this doesn't answer your specific question, but have you looked at moving your Selenium tests into VSTS? They're officially supported in the build/release pipelines and other people are taking this approach with protractor here and elsewhere. People are also making it work with TFS.
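If you do stay with the scheduled-task approach, one thing worth checking is that the task is registered to run interactively in the logged-on user's session; a sketch of such a registration is below (user, password, time, and paths are placeholders, and auto-logon for that user is assumed):

```powershell
# Register a nightly task that runs in the interactive session of the logged-on user.
# /IT only takes effect when that user is actually logged on (e.g. via auto-logon).
schtasks /Create /TN "NightlyProtractorRun" `
    /TR "C:\tests\run-suite.bat" `
    /SC DAILY /ST 02:00 `
    /RU "DOMAIN\testuser" /RP "P@ssw0rd" /IT
```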

Visual Studio Online / Azure stopping and starting web applications using Powershell

I'm using Visual Studio Online's build tools to deploy web applications from a single solution. I've occasionally been running into file locking issues.
Error: Web Deploy cannot modify the file 'Microsoft.CodeAnalysis.CSharp.dll' on the destination because it is locked by an external process.
After some Googling, I believe the "fix" is to stop the web applications on Azure before deployment and start them back up after. Sounds legit.
However, there does not seem to be a straightforward way to do this directly in VSO's build definitions. I've created an "Azure PowerShell" build task, but it wants a PS1 file from the repository. It doesn't seem to let me just run Azure PowerShell commands (e.g. Stop-AzureWebsite) from there. My team has created a work-around where we have a "run.ps1" that just executes the command you pass as a parameter, but none of us are satisfied by that.
What are we missing? There has got to be an easier way to do this without having a PS1 script checked into source control.
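For context, the commands we wanted to run inline are roughly the following (a sketch; the site name is a placeholder). At the time, the only way we found to run them was to wrap them in a checked-in script such as the run.ps1 mentioned above:

```powershell
# Stop the web app so Web Deploy can replace locked files, then start it again afterwards.
# "my-web-app" is a placeholder site name.
Stop-AzureWebsite -Name "my-web-app"

# ... Web Deploy / publish step runs here ...

Start-AzureWebsite -Name "my-web-app"
```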
I solved this by installing the Azure App Services - Start and Stop extension from the Visual Studio Marketplace.
When installed, it will allow you to wrap the Deploy Website to Azure task in your Release definition with Azure AppServices Stop and Azure AppServices Start tasks, effectively eliminating the lock issues.
Check whether you are using "/" as the folder separator in the "Web Deploy Package" path instead of "\".
i.e. change
$(System.DefaultWorkingDirectory)/My Project/drop/MyFolder/MyFile.zip
to
$(System.DefaultWorkingDirectory)\My Project\drop\MyFolder\MyFile.zip
I noticed that was the only difference between the one where I was getting the error and the others (the Restart step I added was not helping). Once I modified the path, I got it working.
Sounds crappy, but fixed my issue.
Did you use the Build Deployment Template that sets the correct msbuild parameters for your package? You can see how here. I would create a build using that template and see if you have the same issues. If so, ping me on Twitter #DonovanBrown and I will see if I can figure out what is going on.
As a rule, it is good practice to have any scripts or commands required to deploy your software checked into source control as part of your build. They can then be easily run repeatedly with little configuration at the build level. This provides consistency and transparency.
Even better is to have deployment scripts output as part of the build and use a Release Management tool to control the actual deployment.
Regardless, having configuration as code is a mantra that all Dev and Ops teams should live by.

How to run scheduled Coded UI Tests on Virtual Machine without having a RDP connection

Situation in short:
Virtual Machine with Visual Studio 2013 installed. A PowerShell script runs on the VM to execute Get Latest, Build, and Execute Coded UI Tests. A Windows Scheduled Task executes the PowerShell script nightly.
Auto-logon is enabled (or am I doing something wrong?).
Yes, I've read the post "Is it possible to run Coded UI tests without having to connect via remote desktop?"
I've seen posts about TCM. Does this help, and how can I use it in my situation?
I created some tests in Microsoft Test Manager and also executed and recorded them.
I've loaded these tests into a test project (and changed the script to provide categories and custom checks).
I then categorized these (as Development or Acceptance).
I executed them using a PowerShell script on a VM (with Visual Studio 2013 installed) with the following actions:
Get Latest
Build
Run the latest build with a selected set (aka category) using mstest.exe (see the sketch below)
So far everything is going perfectly. All the tests pass.
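The mstest.exe call for the category-filtered run looks roughly like this (a sketch; the mstest path assumes a default Visual Studio 2013 install, and the assembly and category names are placeholders):

```powershell
# Run only the tests in the chosen category from the latest build output.
# Paths and the category name are placeholders for this sketch.
& "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\mstest.exe" `
    /testcontainer:"C:\Builds\Latest\CodedUITests.dll" `
    /category:"Acceptance" `
    /resultsfile:"C:\Builds\Latest\TestResults.trx"
```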
However, when I create a Scheduled Task on my VM to run the PowerShell script, everything fails because of a missing session.
Do I then have to unlock the VM or something (I have little knowledge of virtual machines)?
Side-Note:
I also tried to fix this with a Test Agent and Controller, but once I had installed these, all other users of TFS lost their rights, so I prefer not to do this again.
I would be very grateful if you know something that can solve this.
I spent hours on Google finding a solution for this issue, but no solution helped me.
Do I need to provide more information?
The problem you seem to be having is that the test agent is not set up correctly. You need an active desktop session for Coded UI to be able to run (it needs one to perform all the actions, such as clicks).
Microsoft has some nice info about setting up your test agent here.
But to tackle your exact problem of the test failing because of a missing session I'd suggest the following:
Run Autologon.exe from the Sysinternals suite (it can be found at http://live.sysinternals.com/). It will automatically log in with the specified user when the machine starts and keep the desktop session active.
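A sketch of that step, assuming Autologon's usual command-line form (user name, domain, and password are placeholders):

```powershell
# Configure Windows auto-logon for the test user (placeholder credentials).
# After a reboot the machine logs straight into an interactive desktop session.
.\Autologon.exe testuser MYDOMAIN P@ssw0rd
```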
Alright...I'm making progress.
I've installed test agent and controller. It's all running fine.
Next, I opened Lab Center in my own MTM to create a new environment.
The Test Controller is found, but when creating a new Lab Center environment I receive the message from this post: "Microsoft Test Manager cannot install test agent on these machines".
The File and Printer Sharing exception is enabled. I don't get the other message.
I don't understand what is wrong.
Maybe I'm completely on the wrong track and it's not necessary to use the Lab Center.
Then the only remaining issue is the non-active desktop issue.