I am getting confused with this plugin.
Basically, my goal is to deploy files from Server1 to Server2.
The build output directory is in a specific location on Server1,
for example: E:\BuildOutput\Apps\Application1\Bin\Release\
I need to deploy the files to Server2 at: C:\Program Files\Tools\Application1\Bin\
How do I set up this plugin to do what I need?
I am getting stressed by the number of files that need to be deployed to another server; I just wish a simple xcopy to the other server could work.
If not this plugin, I am looking for one that deploys only the files that have changed to another server, for automated feature testing.
Any other method would do too, if possible.
XCOPY should work fine. You need to create a share on Server2 in the desired location.
Go to the Jenkins configuration and click "Add build step"->"Execute Windows batch command".
You should be able to execute any DOS commands you need there.
XCOPY E:\BuildOutput\Apps\Application1\Bin\Release\my_app.exe \\SERVER2\Share
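If you need the whole Release folder rather than a single file, a minimal sketch (assuming the share on Server2 maps to the target Bin directory; /E copies subdirectories, /I treats the target as a directory, /Y suppresses overwrite prompts):
XCOPY E:\BuildOutput\Apps\Application1\Bin\Release\* \\SERVER2\Share /E /I /Y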
If you don't want to share your application's bin directory:
Make a different share on Server2
Configure the build to XCOPY to the new share
Add Server2 as a build node (Manage Jenkins->Nodes)
Create a new build job to move the files where you want them (see the sketch after this list)
Tie the new job to the Server2 build node (check the box "Restrict where this project can be run" in the job config)
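A minimal batch sketch for that second job, assuming the new share is backed by D:\IncomingShare on Server2 and the target is the Bin directory from the question (both paths are hypothetical):
REM Runs on the Server2 node; adjust the share-backing folder and install location to your environment
XCOPY D:\IncomingShare\* "C:\Program Files\Tools\Application1\Bin" /E /I /Y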
If your account has admin rights on Server2, you can just connect to the admin share of the C: drive like this:
XCOPY E:\BuildOutput\Apps\Application1\Bin\Release\* "\\SERVER2\c$\Program Files\Tools\Application1\Bin"
I have a use case where I need to copy some binaries from one Windows server to another into a specified directory and then restart the IIS server. This involves the steps below.
create backup of existing files on win server 1 -> stop IIS -> copy new files from win server 2 -> start IIS
The app is running on an EC2 instance.
Is there a way to automate this in the release phase of VSTS? How can I achieve this if it is not supported out of the box?
I need to copy some binaries from one Windows server to another into a specified directory and then restart the IIS server.
You could use the Windows Machine File Copy task to copy those binaries from one Windows server to another:
- task: WindowsMachineFileCopy@2
  inputs:
    sourcePath:
    #machineNames: # Optional
    #adminUserName: # Optional
    #adminPassword: # Optional
    targetPath:
    #cleanTargetBeforeCopy: false # Optional
    #copyFilesInParallel: true # Optional
    #additionalArguments: # Optional
Or you could create a shared folder on the remote server, and then just use a copy task to copy those binaries there.
To restart IIS, as Krzysztof Madej said, you could use the PowerShell on Target Machines task to execute a PowerShell script that restarts IIS.
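A minimal sketch of the backup/stop/copy/start sequence such a script could run; all paths are hypothetical, and xcopy and iisreset are standard executables, so the same lines work from cmd or from a PowerShell script:
REM Back up the current site, stop IIS, copy the new binaries, start IIS (adjust paths to your environment)
xcopy "C:\inetpub\MyApp\*" "C:\Backups\MyApp" /E /I /Y
iisreset /stop
xcopy "\\WINSERVER2\Drop\MyApp\*" "C:\inetpub\MyApp" /E /I /Y
iisreset /start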
It should be doable. What you should do is:
enable PS remoting on the EC2 instance to be able to stop and start IIS. You can find how to do this here. On Azure DevOps you may use the PowerShell on Target Machines task (see the sketch after this list)
copy files to and from the EC2 instance over SCP. More details are here, and for that you can use the Copy Files Over SSH task
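A minimal sketch of the one-time WinRM/PS remoting setup on the EC2 instance, run from an elevated prompt (this assumes the instance's firewall and security group already allow WinRM traffic):
REM One-time setup on the EC2 instance; this prompts for confirmation before making changes
winrm quickconfig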
I am trying to deploy an EAR file and an error comes up saying:
"A composition unit with name ace-ear already exists. Select a
different application name"
which is not there. What else could be the problem?
1. Check the following locations to see if the application directories exist. If they do, delete the application folder 'your_app':
<profile root>/config/cells/cellname/applications/your_app
<profile root>/config/cells/cellname/blas/your_app
<profile root>/config/cells/cellname/cus/your_app
2. Clear the contents of the profile/wstemp directory (see the sketch after this list).
3. Clear the contents of the profile/temp directory.
4. Restart the application server.
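A minimal batch sketch of steps 2 and 3 for a Windows install; the profile path is hypothetical, and on Unix the equivalent rm/mkdir commands apply:
REM PROFILE_ROOT is hypothetical - point it at your actual profile directory
set "PROFILE_ROOT=C:\IBM\WebSphere\AppServer\profiles\AppSrv01"
rd /s /q "%PROFILE_ROOT%\wstemp" & md "%PROFILE_ROOT%\wstemp"
rd /s /q "%PROFILE_ROOT%\temp" & md "%PROFILE_ROOT%\temp"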
Though not recommended, you can manually remove the references to the EAR from the server configuration files.
To check whether the EAR exists inside the profile, run the command below from the Dmgr/config folder. Delete the EAR files manually, if present.
find . -name '*ace-ear*'
To check whether there are EAR references in the configuration XMLs, run the command below from the Dmgr/config folder, and then remove those entries from the XML files manually, if present.
find . -name '*.xml' | xargs grep -i ace-ear
After this, restart the deployment manager, sync the nodes, restart the JVMs, and try deploying the application again.
NOTE: Be very careful when updating the server configuration files manually, as any mistake can corrupt the server configuration. Taking a profile backup before applying any changes to the server configuration files is recommended.
My problem was in a remote environment. With the FileZilla FTP client I searched for all occurrences of my application name inside the appserver folders (Server --> Find remote files), then deleted all folders and files with the name of the application, restarted the server, deployed the application again, and it succeeded.
I use the VSTS online build and release process. I have two servers with firewalls between them (I can't just run a script on Server1 to move a file between servers). I have installed the VSTS agent on both servers and each server is assigned to its own pool in VSTS. I can release to either server without any issues.
What I can't figure out (or whether it's even possible) is how I can copy files FROM an agent in one pool to VSTS' release working directory/temp path (or, even better, from one pool directly to another agent pool).
For example, I have Server1 in Pool1 and Server2 in Pool2. For my release steps, I have selected "Run on agent" and selected Pool1. I then have a task that copies files over to the agent and it does its thing. What I need to do is then pull down a zip file from a path accessible by the Pool1 server(s) to VSTS and send that zip file to a path accessible by the Pool2 server(s).
Is it possible to download a file from an agent pool? I assume that if I were able to have the "run on Pool1" steps store that zip file somewhere in the release temp path/working dir, I would be able to do a Windows file copy to send it from the working directory to the "run on Pool2" steps.
If the other machine can be accessed by the Windows Machine File Copy task, you can use that task to copy files from one machine to another. The settings for the task are as below:
Source: path for the files you want to copy, such as $(System.DefaultWorkingDirectory)/build/drop.
Machines: IP of the other machine you want to copy to.
Admin Login: ComputerName\AdminUserName.
Password: password for Admin user name.
Destination Folder: path for the other machine you want to copy files to.
If you still can't access the other machine because of the firewall settings, you can upload the files to a place that both machines can access (such as your own website or GitHub), and then use a PowerShell task to download the files from there.
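A minimal sketch of such a download step as a Windows batch command; the URL and destination path are hypothetical:
REM Download the published zip to a local folder (adjust the URL and target path)
powershell -NoProfile -Command "Invoke-WebRequest -Uri 'https://example.com/drop/files.zip' -OutFile 'C:\Temp\files.zip'"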
No, there isn't a task to download files from one agent to another, and a middle server is needed if these servers can't connect to each other to upload/download files (e.g. FTP, HTTP).
On the other hand, the files aren't stored in the agent pool, they are on the agent. The agent pool is used to organize agents: different servers/agents can be in the same agent pool, and a server can be in multiple agent pools. See Agent pools and queues.
It sounds like you are going to end up with a very brittle, spaghetti-type build. Can the files that are needed not be held in source control or as an artifact of a build? That way you can clone/pull the repo or just use the download artifact task that is out in preview now.
I want to run the sdbinst command on a .sdb database file as well as open it in the Compatibility Administrator. I have no problem doing this locally when the .sdb is stored on the machine I'm using, but I'd like to be able to open it and run sdbinst on it when the file is stored in a network storage location.
Is this possible?
Yes, according to the MS Help files within the MS Compatibility Toolkit.
See: "Mitigating Issues by using Compatibility Fixes". There is an example of a network deployment workflow: "Deploying the Contoso.sdb Database to your environment".
The basic pattern is to place the .sdb on a network share, create a one-line deployment script that references a path to that share (sdbinst "\\SomePath\Ex.sdb" -q), and either push the deployment script to, or execute it on, each target computer in your environment.
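A minimal sketch of such a deployment script, reusing the hypothetical share path from the example above:
REM Deploy.cmd - run on each target computer; the share path is hypothetical
sdbinst "\\SomePath\Ex.sdb" -q
if errorlevel 1 echo sdbinst failed with exit code %errorlevel%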
I have followed this guide to install a Jenkins slave on Windows 8 as a service:
https://wiki.jenkins-ci.org/display/JENKINS/Installing+Jenkins+as+a+Windows+service#InstallingJenkinsasaWindowsservice-InstallSlaveasaWindowsservice%28require.NET2.0framework%29
I need to run a job that interacts with the desktop (runs an application that opens a browser, etc.). So after installing the slave as a service (running the JNLP downloaded from the master), I changed the service "Log on" setting to "Allow to interact with display".
For some reason it's only possible to enable this for the "Local System account", even though it's recommended to run the service as a specific user, e.g. jenkins.
But nothing happens when I execute the job; the browser is not opened. If I instead stop the service and just launch the slave through the JNLP file, the job runs fine and the browser is opened.
Has anybody had any luck interacting with the desktop when running a Jenkins Windows slave as a service?
Since Vista, services run in Session 0 and the first user is now in Session 1, so you can no longer interact with the desktop. This is called Session 0 Isolation.
Microsoft explains this here and here. You have to use a second program which uses IPC to communicate with the service.
I had lots of issues running Jenkins on Windows as a service.
Instead I now disable the service and run it from CMD.
So open CMD.
cd C:\Program Files (x86)\Jenkins
java -Xrs -Xmx256m -Dhudson.lifecycle=hudson.lifecycle.WindowsServiceLifecycle -jar jenkins.war --httpPort=9091
To resolve it, first create Windows auto-logon as I explain here:
https://serverfault.com/questions/269832/windows-server-2008-automatic-user-logon-on-power-on/606130#606130
Then create a startup batch file for the Jenkins agent (place it in the Jenkins directory). This will launch the agent console on the desktop, and should allow Jenkins to interact with the Windows GUI:
java -jar slave.jar -jnlpUrl http://{Your Jenkins Server}:8080/computer/{Your Jenkins Node}/slave-agent.jnlp
(slave.jar can be downloaded from http://{Your Jenkins Server}:8080/jnlpJars/slave.jar)
EDIT:
If you're getting black screenshots (when using Selenium or Sikuli, for example), create a batch file that disconnects Remote Desktop, instead of closing the RDP session with the regular X button:
%windir%\system32\tscon.exe %SESSIONNAME% /dest:console
Consider running the Java slave server directly at startup and then using something to monitor and restart it should the server go down (e.g., Kiwi Restarter).
Please check the services on the test node and make sure the "Interactive Services Detection" service is STARTED. By default the startup type is set to Manual; you may want to set it to Automatic as well.
After the service has started, when you run your test on the test node, you will see something like the below:
Click on it and choose to view the message.
You will see the activity happening there. Hope this helps :D
Note: If you log in with another account and cannot view the Interactive Services Detection prompt, restart the service again.
My Jenkins service runs as user "jenkins", and all I did was create Desktop folders in C:\Windows\System32\config\systemprofile\Desktop and, on 64-bit Windows, also in C:\Windows\SysWOW64\config\systemprofile\Desktop. Then it runs perfectly.
Make sure that Desktop folders are created as such:
%WINDIR%\System32\config\systemprofile\Desktop
%WINDIR%\SysWOW64\config\systemprofile\Desktop
The presence of these folders can sometimes be mandatory when running some Java software as a service.
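A minimal batch sketch for creating those folders, run from an elevated prompt (the SysWOW64 line only applies on 64-bit Windows):
REM Create the Desktop folders that the service account may need
mkdir "%WINDIR%\System32\config\systemprofile\Desktop"
mkdir "%WINDIR%\SysWOW64\config\systemprofile\Desktop"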