Currently I need to use PowerShell and Robocopy to copy from a source directory to a target directory. The source is a library consisting of many .dll files.
I need to overwrite the .dll files that already exist in the target directory. However, some of the .dlls in the target directory are newer versions, so before overwriting I need to prompt a warning so users are aware of it. How can I check and compare the versions/properties of the .dlls before running Robocopy?
I have checked, and there is no such switch for Robocopy.
Any solution for this? Thank you.
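Not from the thread, but here is a minimal sketch of one approach, assuming PowerShell 5+, a flat directory layout, and placeholder paths: compare each source DLL's file version against the existing copy in the target, warn about any target files that are newer, and only then run Robocopy.

$src = 'C:\Source'   # placeholder: source directory
$dst = 'C:\Target'   # placeholder: target directory

# Build a comparable System.Version from the numeric parts of a file's version resource
function Get-DllVersion([System.IO.FileInfo]$File) {
    $v = $File.VersionInfo
    [version]::new($v.FileMajorPart, $v.FileMinorPart, $v.FileBuildPart, $v.FilePrivatePart)
}

$newerInTarget = foreach ($dll in Get-ChildItem -Path $src -Filter *.dll) {
    $existing = Join-Path $dst $dll.Name
    if ((Test-Path $existing) -and ((Get-DllVersion (Get-Item $existing)) -gt (Get-DllVersion $dll))) {
        $existing   # collect target DLLs that are newer than the source copy
    }
}

if ($newerInTarget) {
    Write-Warning ("These target DLLs are newer than the source copies:`n" + ($newerInTarget -join "`n"))
    if ((Read-Host 'Overwrite anyway? (y/n)') -ne 'y') { return }
}

robocopy $src $dst *.dll

The numeric-parts comparison avoids the cast failures you can hit when parsing the FileVersion string directly, since that string sometimes contains non-numeric text.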
I have a script that I've created to prep our customers' servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run with PowerShell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations. If I just right-click the PowerShell script and select "Run with PowerShell," it is able to find the files and copy them without issue. Unfortunately, if I open the script in the ISE, it opens with a default directory of C:\users\user, and I can't copy those .ini files without first running a change-directory command to get to the folder containing the script and the .ini files. I'd like our installation techs to be able to run this without worrying about the exact location where they initially drop these folders, and without having to change the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C: drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file without knowing where that file is beforehand... Anyone have a suggestion?
You can use $PSScriptRoot, which holds the path of the directory the script is located in.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
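As a quick illustration (the destination path is a placeholder; $PSScriptRoot requires PowerShell 3.0+ and is only populated inside a running script):

# Make the script's own folder the working directory, wherever the techs drop it
Set-Location -Path $PSScriptRoot

# Copy the .ini files sitting next to the script; the destination is a placeholder
Copy-Item -Path (Join-Path $PSScriptRoot '*.ini') -Destination 'D:\ServerConfig'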
I need to create a script (or command line) that uses 7-Zip to archive all files last modified in a specific month and year, e.g. compress all files in a specified directory whose modified date falls in May 2018 and create the archive in another specified directory.
There are plenty of archiving scripts around, but none I've found let you specify a month and year.
Any help much appreciated.
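One possible sketch in PowerShell (the 7z.exe path, directories, month, and year are all placeholders to adjust; assumes PowerShell 3+ for the -File switch):

$source = 'C:\Data'      # placeholder: directory to scan
$dest   = 'C:\Archives'  # placeholder: where the archive goes
$month  = 5              # May
$year   = 2018

# Files whose last-modified date falls in the given month and year
$files = Get-ChildItem -Path $source -File |
    Where-Object { $_.LastWriteTime.Year -eq $year -and $_.LastWriteTime.Month -eq $month }

# 'a' is 7-Zip's add-to-archive command; hand it the matching files
if ($files) {
    & 'C:\Program Files\7-Zip\7z.exe' a (Join-Path $dest "archive-$year-$month.zip") $files.FullName
}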
Is there a way to make Progress Developer Studio 3.7 (Eclipse) generate all the .wrx files (from the OCXs) and place them in, for example, the rcode folder?
Clarification:
I don't even know how to make one .wrx file. I have heard that "they get automatically created as soon as you drop an OCX control onto an ABL frame". But if you have removed that file, can you create it anew without having to re-drop the control? And how do you automatically place it in a certain folder?
.wrx files contain the properties of an ActiveX control that you set in the AppBuilder.
If you lose the .wrx, those properties revert to their default values. You should check the .wrx files into your version control system together with the source.
To copy the .wrx files to the rcode directory, I use Robocopy.
Suppose your sources are in a directory named src; then you can copy them with:
robocopy src rcode *.wrx /s
The .wrx file is generated when compiling in the AppBuilder.
See this entry in the Progress Knowledge Base.
In my project under "Resource Files" I have some properties files that I'd like to be copied to the output directory. The idea is that I could just give my output directory to someone else and they'd automatically read the properties files within the bin/output directory.
I believe the way to go about this is to add a build-event command line that uses the XCOPY or COPY command. From the XCOPY help, the command is just
XCOPY src dest
And I used the command: XCOPY $(InputDir)/properties.conf $(OutDir)/properties.conf
but it says it cannot find the file. So I tried to find out what $(InputDir) points to, since other people got it to work, but the 'set' command in the VS command-line tool only shows system environment variables, not the ones available to vcprojects.
Any ideas on how to get this to work? Maybe there's a different way to do it?
Solution: I just used COPY properties.conf $(OutDir)\properties.conf (evidently the build event runs with the project directory as the working directory, so the relative source path resolved).
I've been desperately looking for the answer to this and I feel I'm missing something obvious.
I need to copy a folder full of data files into the TARGETDIR of my deployment project at compile time. I can see how I would add individual files (i.e. right-click in File System and go to Add -> File), but I have a folder full of data files that constantly gets added to, and I'd prefer not to have to add the new files each time I compile.
I have tried using a PreBuildEvent to copy the files:
copy $(ProjectDir)..\Data\*.* $(TargetDir)Data\
which fails with error code 1 when I build. I can't help but feel I'm missing the point here though. Any suggestions?
Thanks in advance.
Graeme
I went this route:
Created a new project (deleted the default source file Class1)
Added the files/folders necessary to the project.
Added the project as a project output in the installer, choosing the Content Files option.
This removes the complexity of having to zip/unzip the files as suggested earlier.
Try
xcopy $(ProjectDir)..\Data\*.* $(TargetDir)Data /e /c /i [/f] [/r] /y
/e to copy the whole directory tree, including empty folders (use /s if you want to skip empty folders)
/c to continue on error (lets the build process finish)
/i necessary to create the destination folder if none exists
/y assume "yes" for overwrite of previously existing files
[optional]
/f if you want to see the full paths resulting from the copy
/r if you want to overwrite even previously copied read-only files
The project method is simpler than the file method, yes. On the other hand, the file method copies only the modified/missing files on each build, whereas the project method forces you to maintain the project on every data-pack modification. Which is better depends on the overall data size and how often your data pack changes.
Also beware of leftover files in the target if you remove some files from your data pack and rebuild without emptying the target folder first.
Good luck.
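On the leftover-files caveat just above: one option the answer doesn't mention is Robocopy's mirror mode, which also deletes target files that no longer exist in the source. A one-line build event along these lines (macros as in the xcopy example):

robocopy "$(ProjectDir)..\Data" "$(TargetDir)Data" /mir

/mir is shorthand for /e plus /purge, so only use it when the build owns the target folder outright. Note also that Robocopy returns non-zero exit codes on success (1 means files were copied), which build events treat as a failure unless you remap the exit code.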
I solved the problem with a workaround:
Add a build action that packages the entire directory (it can be filtered) into a ZIP file (a sketch of this step follows below).
Add a reference to an empty ZIP file to the deployment project.
Add a custom action to the deployment project that extracts the ZIP to the destination folder.
It's simple and stable.
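As an illustration of the packaging step, a minimal sketch using PowerShell's Compress-Archive (PowerShell 5+; the paths are placeholders, and the original answer did not specify a tool):

# Package the data directory (the -Path filter could be narrowed) into the ZIP
# that the deployment project references; -Force overwrites the previous archive
Compress-Archive -Path 'C:\Project\Data\*' -DestinationPath 'C:\Project\Data.zip' -Force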
Your error is probably because your path has spaces in it and you don't have the paths in quotes.
e.g. copy "$(ProjectDir)..\Data\*.*" "$(TargetDir)Data\"
I need to do a similar thing. Thinking a custom action...
I found a different workaround for this. I added a web project to my solution that points at the data directory I want included in the deployment project. The web project automatically picks up any new files in the data directory and you can refer to the project content in the deployment project.