Xcopy when source folder name is incremented each time

I am trying to copy files via a .bat file on a scheduled basis. I know how to handle all of it except one piece: the source folder name is different each time. The first time it's C:\folder\new, the second time it will be C:\folder\new-0001, then C:\folder\new-0002. How do I account for the source folder changing?
This is Certbot SSL renewal as standalone, with Tomcat on Windows.
I plan to stop the web service, renew the cert (which creates the new folder of new files), copy the files to Tomcat, then start the web service.
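One way to handle the changing name is to always pick the most recently modified folder matching C:\folder\new*. A minimal batch sketch along those lines follows; the service name, renewal command, and Tomcat destination path are placeholders to adjust for your setup:
@echo off
setlocal

:: Service name is a placeholder; adjust to your installation.
net stop Tomcat9

:: (run your Certbot renewal command here; it creates the new C:\folder\new-* folder)

:: Pick the newest folder matching C:\folder\new* (dir /o-d lists newest first).
set "SRC="
for /f "delims=" %%D in ('dir /b /ad /o-d "C:\folder\new*"') do (
    if not defined SRC set "SRC=C:\folder\%%D"
)

:: Copy the fresh files to Tomcat (destination path is a placeholder).
xcopy "%SRC%\*.*" "C:\Tomcat\conf\ssl" /s /e /v /i /y

net start Tomcat9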

Related

Eclipse immediately undeletes deleted file

When attempting to delete log4j.properties, Eclipse restores it the very next moment. The file is not in use, no build process is running, and the project containing the file is not being deployed, yet the file is restored immediately. Even when the IDE and all shell (i.e. Git Bash) sessions are closed and the file is not open anywhere, it can be deleted successfully using Windows Explorer or Total Commander, but the moment the IDE is started the file is there again.
The intention was to replace the old properties structure with an XML file, but I couldn't even overwrite it: upon saving, the original content was restored. Any advice on this?

drag and drop ear file on wildfly to deploy project

I'm trying to deploy my project on WildFly by drag and drop.
When I drag and drop the ear project onto the WildFly server, I get myProject-ear.ear.dodeploy in wildfly-10.0.0.Final\standalone\deployments.
I want to end up with myProject-ear.ear.deployed instead of myProject-ear.ear.dodeploy after dragging and dropping the ear project onto the server.
Do you have any idea how to solve this? Thanks a lot.
Whether drag & drop (or actually creating the war/jar/ear/... file in the deployments directory) is sufficient can be configured in the WildFly configuration file (standalone.xml in your case). But the fact that you create that file and see a ...dodeploy file pop up tells you the deployment scanner has found your file and is acting on it.
Once the deployment has finished, you should instead see a file named .deployed or .failed. In case of failure, a log snippet inside the file may hint at the reason.
But be aware of something: a drag & drop usually triggers a copy operation. Depending on the size of your file, that copy may take some time. WildFly's deployment scanner checks the directory every XXX seconds (configurable), so if your copy has started but the scanner identifies the file before the copy is complete, WildFly tries to deploy an incomplete archive. This should result in an error message, but may cause what you are experiencing.
So it may be better to first copy the file to another directory (on the same disk), then move/rename it into the deployments folder - this operation is atomic, and the deployment scanner will immediately see the complete file.
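For example (a sketch with made-up paths; the staging folder just needs to be on the same volume as deployments so the move is a rename):
copy "C:\build\myProject-ear.ear" "C:\wildfly-10.0.0.Final\standalone\staging\"
move /y "C:\wildfly-10.0.0.Final\standalone\staging\myProject-ear.ear" "C:\wildfly-10.0.0.Final\standalone\deployments\"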
Another way would be to stop the deployment scanner completely and stop/start JBoss after every change to the deployments directory. This is advisable anyway if you are running short on PermGen space.

Copy batch files to Jenkins Slaves with Different OS versions

I am running automated tests of our application on different versions of an OS build (Windows 7, Windows 10, etc.). My testing suite requires that I copy files to the Slave computers when the tests change (external to the build application). The test files are not in the Jenkins workspace, as they do not change frequently and therefore do not need to be copied to the Slave with each execution.
I am looking to be able to update the files on the Slaves, but not under the workspace directory, so, as I understand it, the Copy-To-Slave plugin will not work.
I am looking to have batch files, testing resource files, DB generation scripts and others copied to the Slave computer by a Jenkins job. This job may monitor Git, but not everything being copied comes from Git.
In essence, I want to execute the following, but targeting the Slave computer:
xcopy C:\Testing\*.* C:\Resources\Testing /s /v /e
The reason for this is that our testing scripts look for certain files to execute (DB scripts for building the database for the current platform/DB engine), and as these do not change frequently, we only need to copy them when they change and can leave them in place for subsequent test runs. There is a large number of files and GBs of data that does not need to be copied with each test run. There are also multiple executions of the application with the same testing files, where the application has different configurations but should produce the same results, so the test files do not need to be copied for each of those executions.
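A batch build step along these lines, run on the Slave, would copy only changed files (a sketch using the paths from the question; the /d switch skips files that are not newer than the destination copy):
xcopy "C:\Testing\*.*" "C:\Resources\Testing" /s /e /v /d /y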
I found a configuration option in the Copy-To-Slave plugin to add additional directories as destinations, relative to the file system root (C:\ in my case), which solves my problem.

Keep files when deploying .war in GlassFish 3.1.2

I've got a bit of a problem with deployments on my project, and after hours of searching the web I can't find an answer to this.
Situation:
I am working on a web application that lives off uploads and other files that get generated during use.
To keep things simple I store these into: .../mywebapp/web/some subfolders/*
So far, so good.
My Problem:
Every time I redeploy my project on the actual server (after updating classes/JSPs), Glassfish deletes the entire content of .../mywebapp/ during redeployment.
My Procedure so far:
Export the latest version of my webapp as .war.
Add the changed files into the .war file on the server (rename to .zip, then back to .war)
Redeploy the .war on my server using the admin console (localhost:4848)
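(Side note: if a JDK is installed on the server, the rename-to-zip step can be replaced with the JDK's jar tool; the archive name and entry path below are made up.)
jar -uf mywebapp.war WEB-INF/classes/com/example/Updated.class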
My question:
This current procedure is very prone to data loss (I could lose the files!).
Is there a straightforward way to upload changes to my server without the risk of losing the files that have been added during runtime?
I see two choices:
move the data 'out of harm's way' (find some place for it that isn't in the deployment directory, like a database)
Switch to directory deployment instead of archive deployment.
The better of these two choices is the first one... It is more portable than the other; every server out there supports deploying archives. A lot of servers support directory-based deployment, but they all do it a bit differently, so a directory structure that deploys on A may not deploy on B.
I had this same issue and solved it using XCOPY and the Windows Task Scheduler.
Effectively, you continuously sync two folders:
Run a scheduled task for the following batch file every X minutes
sync.bat:
xcopy "domain1\applications\%YOUR_APP_NAME%l\path\to\folder" "D:\folder\to\sync" /D /I /Y
xcopy "D:\folder\to\sync" "domain1\applications\%YOUR_APP_NAME%l\path\to\folder" /D /I /Y
Switches:
/D - Copy only files whose source time is newer than the destination time
/I - If the destination does not exist, and you are copying more than one file, this switch assumes that the destination is a folder.
/Y - Overwrite without prompting
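To schedule it, one option is Task Scheduler's command-line front end (the task name, script path, and 5-minute interval below are placeholders):
schtasks /create /tn "SyncWebappFiles" /tr "D:\scripts\sync.bat" /sc minute /mo 5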

Synchronizing with live server via FTP - how to FTP to different folder then copy changes

I'm trying to think of a good solution for automating the deployment of my .NET website to the live server via FTP.
The problem with using a simple FTP deployment tool is that FTPing the files takes some time. If I FTP directly into the website application's folder, the website has to be taken down while I wait for all the files to be transferred. What I do instead is manually FTP to a separate folder, then once the transfer is complete, manually copy and paste the files into the real website folder.
To automate this process I am faced with a number of challenges:
I don't want to FTP all the files - I only want to FTP those files that have been modified since the last deployment. So I need a program that can manage this.
The files should be FTPed to a separate directory, then copied and pasted into the correct destination once the transfer is complete.
Correct security permissions need to be retained on the directories. If a directory is copied over, I need to be sure that the permissions will be retained (this could probably be solved by rerunning a script that applies the correct permissions).
So basically, I think the tool I'm looking for would do an FTP sync via a temporary directory.
Are there any tools that can manage these requirements in a reliable way?
I would prefer to use rsync for this purpose, but since you seem to be on Windows, some more effort is needed: Cygwin or something similar.
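If you go that route, the call would look something like this (a sketch assuming cwRsync or Cygwin's rsync is installed; the paths and host are placeholders):
rsync -avz --delete /cygdrive/c/site-staging/ user@liveserver:/var/www/mysite/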