Is there a way to reset an existing dev cluster with PowerShell, similar to how the Service Fabric tray tool does it?
There sure is (requires Admin privileges):
& "C:\Program Files\Microsoft SDKs\Service Fabric\ClusterSetup\DevClusterSetup.ps1" -PathToClusterDataRoot c:\SfDevCluster\data -PathToClusterLogRoot c:\SfDevCluster\log
You can change the \data and \log paths to wherever you want. Also, if you want the new one-node dev cluster, add the -CreateOneNodeCluster option:
& "C:\Program Files\Microsoft SDKs\Service Fabric\ClusterSetup\DevClusterSetup.ps1" -PathToClusterDataRoot c:\SfDevCluster\data -PathToClusterLogRoot c:\SfDevCluster\log -CreateOneNodeCluster
Note that when you run this, any existing cluster will be removed and replaced.
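If you want to confirm that the fresh cluster is up afterwards, here is a minimal sketch using the Service Fabric PowerShell module that ships with the SDK (it assumes the default local endpoint localhost:19000):
# Connect to the freshly created local dev cluster (default endpoint assumed)
Connect-ServiceFabricCluster -ConnectionEndpoint "localhost:19000"
# Check overall cluster health and list the nodes
Get-ServiceFabricClusterHealth
Get-ServiceFabricNode | Select-Object NodeName, NodeStatus, HealthState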
I am trying the "run PowerShell on remote machines" task to restart my Tomcat (Java) service on a Windows server.
It just keeps printing useless info to the console (the target machine name).
Here are the details of the PowerShell script:
Stop the Tomcat service (call a .bat file)
Move the .jar files to the right location and replace the old files
Start the Tomcat service (call a .bat file)
& D:\MY\PATH\stop.bat;
......
Copy-Item -Path "D:/s-1.0.jar" -Destination $sqs_path -Force;
......
& D:\MY\PATH\start.bat;
When I run the same commands directly on the target Windows server, the PowerShell part triggers the .bat script job and then returns to the PowerShell console successfully.
Later, a new window pops up. The new window is the Tomcat server showing the logs of my service.
However, when I do the same job with an Azure release pipeline, the Tomcat window does not show up on the target machine.
And the release job console keeps hanging (it just prints the name of the target machine).
I guess the output of the pop-up window has somehow been redirected to the console in the release pipeline.
In addition, if I cancel the release job, my Tomcat service is still working (just without a console to debug).
Or, another thought: can I achieve my goal with another task in the release pipeline? (PowerShell is not a must.)
Any suggestions would be appreciated.
[Update1]
I changed the service start command to run the .bat in another window.
& D:\MY\PATH\stop.bat;
......
Copy-Item -Path "D:/s-1.0.jar" -Destination $sqs_path -Force;
......
Start-Process cmd.exe -ArgumentList "/C D:\sources\SQS.Dev\start.bat;"
And it turns out that "nothing happens"...
The remote PowerShell task finishes immediately.
The PowerShell on Target Machines task uses WinRM to connect to and access the remote target machine.
Normally, WinRM requires that the agent machine and the remote target machine be joined to the same domain or workgroup.
Please check the following:
Ensure the agent machine and the remote target machine are joined to the same domain or workgroup.
Ensure you have followed the steps here to configure WinRM.
If you have configured WinRM, log in to the agent machine and make sure you can connect to and access the remote target machine when you manually call the remote PowerShell. You can use the following script on the agent machine to test whether WinRM can connect to and access the remote target machine.
Param(
    # The IP address or FQDN of the remote machine
    [string]$computerIp = "{computerIp}"
)

# Username and password of the admin account on the remote machine
$Username = "{Username}"
$Password = ConvertTo-SecureString "{Password}" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($Username, $Password)

# Call the remote PowerShell script
Invoke-Command -ComputerName $computerIp -Credential $cred -ErrorAction Stop -ScriptBlock {
    Invoke-Expression -Command:"powershell.exe /c '{absolute path of the remote PowerShell script}'"
}
If it does not work when you manually call the remote PowerShell script from the agent machine, then the task in the pipeline of course cannot work either. In that case the issue is with WinRM; it is probably not configured correctly.
If it works well when you try manually on the agent machine, the issue is likely with the self-hosted agent. Try setting up a new self-hosted agent running under the admin account to see if that works.
[UPDATE]
As I mentioned above, please try manually calling the remote PowerShell script from the agent machine to see whether the commands that start the Tomcat service work (see the sketch after the list below).
The PowerShell on Target Machines task runs on the agent machine. If the remote PowerShell script does not work as expected when you run it manually on the agent machine, it will of course not work in the pipeline task either.
In that case, the possible reasons for the issue are the following:
The connection between the machines has problems.
The PowerShell script has issues; maybe some settings required to call the script remotely are missing.
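As a concrete sketch of that manual test, you could run something like the following from the agent machine; the target machine name and credentials are placeholders, and the .bat path is taken from the question:
# Manual test from the agent machine: does starting Tomcat behave the same
# over a WinRM session as it does when run locally on the target machine?
$cred = Get-Credential
Invoke-Command -ComputerName "{target machine IP or FQDN}" -Credential $cred -ScriptBlock {
    & "D:\MY\PATH\start.bat"
}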
I cannot find any PowerShell script to restart Splunk services.
I just wonder whether it is doable to restart the Splunk service.
Usually it is run from the command line by changing to the Splunk folder first:
C:\program files\splunkuniversalforwarder\bin
then running the command below:
splunk restart
I would try one of the following:
Invoke-Expression "& 'C:\program files\splunkuniversalforwarder\bin\splunk.exe' restart"
OR
& "C:\program files\splunkuniversalforwarder\bin\splunk.exe restart"
I have tried to execute the commands below from Jenkins, consecutively:
Import-Module -Name "C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
cd IPL:
But it seems like Jenkins is releasing the session after executing each command. Hence we tried adding a delay between the two commands, but no luck:
Import-Module -Name "C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
Start-Sleep -s 5
cd IPL:
While executing the script, Jenkins takes each command from its workspace directory, D:\jenkins\workspace\<JobName>. I was looking to modify the configuration so that Jenkins executes the entire script (with all the commands in it) from the drive where the script is located, not from the Jenkins workspace.
But no such material is available on Google. I have looked into the Jenkins workspace modification area (Jenkins -> Manage Jenkins -> Configure System, then click Advanced).
(Screenshot: Jenkins workspace modification settings)
But that will not help, as it only changes the workspace path; whenever we execute the script, Jenkins again takes each individual command and executes it in the workspace instead of the directory where the script is located.
Is there any way to execute all the commands (from a single PowerShell script) without terminating the session after each command, so that the PowerShell script runs from its own directory?
Anyway, to answer this question (not touching the SCCM part):
Jenkins will execute all the commands that are in the SAME step in the same PowerShell session, so if you have one step in your Jenkins job, all the commands you put in there will be executed in the same PowerShell shell.
As for the second question, you can use Set-Location to change the current working directory of PowerShell.
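For example, a single Jenkins PowerShell build step along these lines keeps everything in one session (the IPL: site drive comes from the question; $PSScriptRoot only resolves when the code runs as a saved .ps1 file):
# Everything below runs in one PowerShell session because it is one Jenkins step
Import-Module -Name "C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
# Switch to the SCCM site drive
Set-Location -Path "IPL:"
# Alternatively, run from the directory the script itself lives in:
# Set-Location -Path $PSScriptRoot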
Windows PowerShell was running under the \Administrator account while Jenkins was running under the System account, so we changed the Jenkins service logon from the System account to a domain account:
First, we downloaded PsExec.exe and executed the following command from a command prompt: "C:\Users\Administrator.DUMMYIPSL\Desktop\PsExec.exe" -i -s cmd.exe
Second, we opened the Jenkins service properties and, on the Log On tab, changed the privileges from the System account to the Domain Admin account, then restarted the Jenkins service to execute the script.
Third, we triggered the build from Jenkins, and it worked.
Windows 10 Pro & PowerShell v5.1
I used the Enable-WindowsOptionalFeature cmdlet to enable Active Directory Lightweight Directory Services.
When I try to use Set-ADDomain, it doesn't work, because Active Directory Web Services isn't running. I know I could use the AD LDS Setup Wizard to start this service. Is there a cmdlet that can achieve that?
When you have enabled the Windows feature Active Directory Lightweight Directory Services, no AD LDS instance has been added yet. You can add an AD LDS instance by running %systemroot%\ADAM\adaminstall, which can also be run in silent mode if you need to script it.
See https://technet.microsoft.com/en-us/library/cc816778(v=ws.10).aspx and https://technet.microsoft.com/en-us/library/cc816774(v=ws.10).aspx for more information on how to add an AD LDS instance.
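As a rough sketch of a scripted (silent) install, adaminstall.exe accepts an answer file via the /answer: switch; the instance name, port numbers, and file paths below are placeholders, not values from the question:
# Write a minimal AD LDS answer file (all values are placeholders)
$answer = @"
[ADAMInstall]
InstallType=Unique
InstanceName=MyLdsInstance
LocalLDAPPortToListenOn=50000
LocalSSLPortToListenOn=50001
"@
Set-Content -Path "C:\Temp\adamanswer.txt" -Value $answer
# Run the unattended install from an elevated prompt
& "$env:systemroot\ADAM\adaminstall.exe" /answer:C:\Temp\adamanswer.txt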
The Set-ADDomain command that you mention is meant for an AD DS domain, which is not the same thing as AD LDS.
sc [\\dc.contoso] start adws
run in an elevated command prompt can do the trick.
Get-Service -Name adws [-ComputerName dc.contoso] | Start-Service
is the PowerShell equivalent.
In short:
No, there isn't a special cmdlet to start ADWS, but it's easy to use PowerShell (or cmd) to start that service.
I have a cloud service with the following line of code in startup.cmd:
net use n: \\<storage-account>.file.core.windows.net\scorm /u:<storage-account> <storage-password>
This successfully creates a mapped drive pointing to the Azure File Services share, but it shows up in Windows Explorer as a disconnected drive, and any attempt to remove it using the 'Disconnect' option results in a "This network connection does not exist" error, although if I double-click the folder I can successfully access the files.
If I run the same command from a cmd prompt, the drive shows as connected, with the name of the share and the path displayed. Do I need to do anything different in the PowerShell startup command to get the same results as the cmd prompt?
The "net use" command only connects to the share in the context you are running. So you will have to run the "net use" in the same context your role will run.
For web roles this will be "NT AUTHORITY\NETWORK SERVICE". To run "net use" in that context, you need a tool like psexec.exe, which you can download from Windows Sysinternals.
Place psexec.exe into your role's bin directory, and set up an elevated startup script with this command:
psexec -accepteula -u "NT AUTHORITY\NETWORK SERVICE" net use n: \\<storage-account>.file.core.windows.net\test /u:<storage-account> <storage-password>
Drives are mapped to your user token, and administrators have two tokens: limited and elevated. Make sure you are using consistent tokens; i.e., if the drive was mapped while running as administrator, then only programs running elevated can access that mapping.
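As a quick way to see this in practice, a small sketch: run the same check from a normal and from an elevated PowerShell window and compare the results.
# Mapped drives are tied to the token of the session that created them,
# so an elevated and a non-elevated window may show different mappings.
net use
Get-PSDrive -PSProvider FileSystem | Select-Object Name, DisplayRoot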