I have a legacy Windows application (no source code) that processes files in a given directory, say C:\Pickup. The directory path is hard-coded into the application and cannot be changed. If I run multiple instances of this application, the instances will compete for the same files in C:\Pickup, which is not good.
This application does not have a GUI. I launch it from Task Scheduler many times a day, and it runs for anywhere from 1 minute to about 20 minutes depending on the number of files it needs to process in C:\Pickup.
I am wondering if there is an easy-to-use virtualization technology that will allow me to launch instances of this application in some virtual space where each instance gets its own C:\Pickup folder.
EDIT 1: I am thinking of a solution like the one IE uses for plug-ins (ActiveX controls) that run inside of it. Somehow, when a plug-in accesses the file system, it gets its own view of the file system. Does anyone know how IE does this?
You can just spin up a series of VMs with something like VirtualBox. Create a share and mount it as D:\ on all of the VMs, then run a batch script to copy the files from your share to C:\Pickup.
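A minimal sketch of that copy step, run inside each VM (PowerShell here, though a plain .bat with xcopy works just as well; the D:\incoming path and batch size are illustrative, and moving rather than copying is one way to make sure two VMs never grab the same file):

# Claim a batch of files from the shared drop folder and hand them to the local C:\Pickup.
# Move-Item (rather than Copy-Item) means each file ends up on only one VM; if another VM
# wins the race for a file, the move simply fails and is ignored.
Get-ChildItem D:\incoming -File |
    Select-Object -First 50 |
    Move-Item -Destination C:\Pickup -ErrorAction SilentlyContinue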
I am having trouble packaging applications to get them to run on Azure Batch compute nodes. I am using a user subscription with the VM configuration, so I can't use application packages. I have been uploading my executable files and DLLs as resource files. Currently, I have a task that requires a lot of DLLs, but it seems that I can't upload more than 10 resource files through the Azure portal.
What is the best way to package an application and all its required DLLs so it can run on a Batch compute node without using the built-in application package feature? Is there a way other than going through all its DLLs and adding them individually, manually, as resource files?
How do I get around the limit of 10 resource files per task?
Thanks!
Application package functionality for the Virtual Machine configuration should be available now (the documentation may be out of date). With that being said, answers to your questions:
Without using application packages, you can do one of the following: (1) Create a self-extracting (SFX) archive with your archiver of choice; make sure it can extract silently without a GUI pop-up (7-Zip can do this, for example) and run the SFX archive as part of your start task. (2) Zip up your files, add the zip file and unzip.exe as your two resource files, and run the unzip command as part of your start task.
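For option (2), the unpack step can be a one-liner; on recent Windows pool images, PowerShell's built-in Expand-Archive can even stand in for unzip.exe (a sketch only; app.zip is a placeholder, and AZ_BATCH_NODE_SHARED_DIR is the environment variable Batch sets to the node's shared directory):

# Run by the start task after Batch has downloaded the resource files into its working directory.
Expand-Archive -Path .\app.zip -DestinationPath "$env:AZ_BATCH_NODE_SHARED_DIR\app" -Force

Subsequent tasks can then reference the exe and DLLs under that shared path instead of listing every DLL as a resource file.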
The service limit is not 10 (although that may be the limit in the portal). You can add resource files up to the service limit, which varies depending on the length of your URLs. For a large number of dependencies, follow the recommendation in option (1) above, or use application packages if possible.
I'm opening this one as it's related to another thread I've opened, but it is not the same problem.
I currently have 2 scripts to monitor a folder. In this folder (a SharePoint site folder) is a PowerPoint presentation running on a remote laptop. The goal is to be able to change the presentation without having to go to the laptop itself (as there will eventually be a large number of them in remote locations).
So what I have so far is script 1, which monitors the folder for a file being dropped in. This script then shuts down the presentation currently running.
Script 2 monitors the folder for a .ppt file being renamed (the instructions will be: 1. drop your file; 2. delete Slide.pptx; 3. rename your new file to Slide.pptx). It then kicks off PowerPoint with the presentation file.
But I don't know how to have both scripts running at the same time, or even how to have script 1 call script 2 and then have script 2 call script 1 again once it has run.
Any ideas?
Thanks
Don't reinvent this. You can use PowershellGuard; the original Ruby Guard is also what I have used for this purpose (it is far more mature).
Guard lets you run multiple scripts, each mapped to a file specification.
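If you do want to see roughly what Guard is doing for you, the same mapping can be approximated in a single plain PowerShell script with one FileSystemWatcher and two event subscriptions (a rough sketch, not PowershellGuard syntax; the folder path and slide name are illustrative):

# One process watches the folder for both events, so neither script has to call the other.
$watcher = New-Object System.IO.FileSystemWatcher "C:\SharePointFolder", "*.pptx"
$watcher.EnableRaisingEvents = $true

# A new file dropped into the folder: shut down the running presentation.
Register-ObjectEvent $watcher Created -Action {
    Stop-Process -Name POWERPNT -ErrorAction SilentlyContinue
} | Out-Null

# A file renamed to Slide.pptx: kick off PowerPoint with it.
Register-ObjectEvent $watcher Renamed -Action {
    if ($Event.SourceEventArgs.Name -eq 'Slide.pptx') {
        Start-Process $Event.SourceEventArgs.FullPath
    }
} | Out-Null

while ($true) { Wait-Event -Timeout 5 }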
I am running automated tests of our application on different OS versions (Windows 7, Windows 10, etc.). My testing suite requires that I copy files to the slave computers whenever the tests change (the test files live outside the application build). The test files are not in the Jenkins workspace, as they do not change frequently and therefore do not need to be copied to the slave with each execution.
I am looking to be able to update the files on the slaves, but not under the workspace directory, so the Copy-To-Slave plugin will not work, from my understanding.
I am looking to have batch files, testing resource files, DB generation scripts, and other files copied to the slave computer by a Jenkins job. This job may monitor Git, but not everything being copied comes from Git.
In essence, I want to execute the following, but targeting the slave computer:
xcopy C:\Testing\*.* C:\Resources\Testing /s /v /e
The reason for this is that our testing scripts look for certain files to execute (DB scripts for building the database for the current platform/DB engine), and as these do not change frequently, we only need to copy them when they change and leave them in place for subsequent test runs. There is a large number of files and GBs of data that does not need to be copied with each test run. There are also multiple executions of the application with the same test files, where the application has different configurations but should produce the same results, so the test files do not need to be copied for each of those executions either.
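The copy itself can be kept incremental so the GBs of unchanged data are skipped. For example, run from a PowerShell or batch step on the slave (a sketch; robocopy already skips files whose size and timestamp match the destination, and /XO additionally excludes anything older than what the slave already has):

# Copy only new/changed test resources; everything already on the slave is left in place.
robocopy C:\Testing C:\Resources\Testing /E /XO /NP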
I found a setting in the Copy-To-Slave plugin for adding additional destination directories, relative to the file system root (C:\ in my case), which solves my problem.
I am looking to use ClickOnce to deploy an application for internal use. When publishing to the network share, it creates several files and folders (manifest, Application Files, etc.).
Is there a way to bundle this up as a single file? I do not fancy the idea of allowing other users access to the Application Files folder that is created; I would rather just give them the exe and have it take care of everything else.
Does anyone have experience with this, or am I stuck with the application folder, application manifest, and setup file all being in the same directory for installation?
There is no way to package the whole application folder and its files into one file, like an MSI, with ClickOnce.
You could code something of your own: a shell app that uses ClickOnce and whose only file is your app, compressed. The shell would download that compressed file to the client's machine, unzip it, and so on.
You could also use InstallShield Limited Edition, which comes with VS 2012/2013 (under Other Projects, Setup and Deployment), but that does not give you ClickOnce's ease-of-deployment features. Alternatively, you could use the InstallShield setup as the compressed file in your ClickOnce shell app and then just use Process.Start to launch the InstallShield setup. It should work.
I require a small space online (free) where I can upload/download a few files automatically using a script. The space requirement is around 50 MB. It should be automatable so I can set it to run without manual interaction, i.e. no GUI. I have a dynamic IP and no technical knowledge of setting up a server.
Any help would be appreciated. Thanks.
A number of online storage services provide 1-2 GB of space for free, and several of them have command-line clients. For example, SpiderOak, which I use, has a client that can run in headless (non-GUI) mode to upload files, and there is even a way to download files from it with wget or curl.
You just set things up once in GUI mode, then put files into the configured directory and run SpiderOak with the right options; the files get uploaded. Then you either download ('restore') all or some of the files via another SpiderOak call, or get them over HTTP.
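Roughly, the scripted part then looks like this (a sketch only, assuming the Windows SpiderOakONE client is installed and already configured through the GUI; the install path is the usual default and the --batchmode behaviour may differ by client version, so check the client's --help):

# Drop the files into the folder you configured SpiderOak to back up (illustrative path),
# then run one batch-mode pass: the client processes the upload queue and exits, no GUI needed.
Copy-Item C:\outbox\* C:\SpiderOakData\
& "C:\Program Files\SpiderOakONE\SpiderOakONE.exe" --batchmode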
About the same applies to Dropbox, but I have no experience with that.
www.bshellz.net gives you a free shell running Linux. I think everyone gets 50 MB, so you're in luck!