I'm looking for a way to move files from my machine to several servers on a web farm.
I currently use Beyond Compare to move the files over: one BC session for each server on the farm. I'm fine with this because BC is fast and I like the control I have.
Our business gave me a new requirement: automatic distribution of image files. I've read a little about DFS, but I'm not sure that's the route I want to take; I want the files to actually end up on the servers.
Are there any tools out there for this? A scripting option perhaps?
We use Robocopy to copy identical configuration files to all servers in a Citrix farm; it works great.
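For reference, a minimal sketch of the kind of call we use (the server names and paths here are made up):

# Mirror a local folder to each farm server's admin share (hypothetical names/paths).
# /MIR mirrors the source tree (copies new/changed files, removes orphans on the target);
# /R:2 /W:5 limit retries on locked files; /NP suppresses per-file progress; /LOG+ appends to a log.
$servers = "web01", "web02", "web03"
foreach ($server in $servers) {
    robocopy "C:\deploy\images" "\\$server\d$\WebFarm\images" /MIR /R:2 /W:5 /NP /LOG+:"C:\deploy\robocopy-$server.log"
}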
Have a look into MSDeploy and MSBuild; there may also be a newer version of MSDeploy available.
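I haven't got your environment to test against, but an MSDeploy content sync looks roughly like this (server name, paths and the install location are placeholders; the target machine needs the Web Deploy agent or handler):

# -verb:sync compares source and destination and copies only what differs.
& "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" `
    -verb:sync `
    -source:contentPath="C:\deploy\site" `
    -dest:contentPath="D:\WebFarm\site",computerName="web01"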
At my place I think we use WanSync, though I couldn't be 100% sure because I'm not in that department.
I'm struggling to find any method that works with current Unity.
This is for a conventional Windows build (not a Windows Universal build via VS).
So, there are the separate data, DLL, etc. files of a build: how do you create a civilian-usable "single exe" for Windows with current Unity?
As said, AFAIK this has actually always been the case.
See e.g. Windows standalone Player build binaries for a list of the resulting output of a build; the page goes back to version 2017.2.
So the short answer is:
It is how it is. You will always get multiple files and the data folder as output.
What you can do, however, is use a packing tool which simply packs all your folder content into one single exe file.
One example is Appacker
BUT unfortunately there is one known issue: Windows Defender recognizes it, and every exe created with it, as malware. The reason for that is actually mentioned by the author in the link:
Spoiler: A self-extracting .exe file? Windows Defender hates that trick!
So with this tool, or any similar one, there is no real way around that, except that you need to trust the tool and your users need to trust you ^^
(The icon is also only used for the process window, not for the exe file itself ^^)
The long and correct way would probably be to create an actual installer for your final app which is then allowed to extract all the files to a certain location.
So in the end the user will again have an exe and the corresponding data and DLL files, e.g. in the Program Files folder, but gets a registered shortcut in the Start Menu, which is just how any other application on Windows usually works.
Just to add to the answer: in 2020, if it's a game, you should just use Steam. It makes auto-updates way easier for your users.
https://partner.steamgames.com/doc/gettingstarted
I run some Control-M jobs which generate files and place them under various folders on a UNIX box.
These files need to be sent to different users who don't have access to the system.
Each time, I have to copy these files from the Unix folder (based on which Control-M job was run) to my local directory and then send them to the users.
I am looking for a way to automate this. I want to create an interface where users can specify parameters (job names), which in turn will copy the file from the particular folder on Unix to a location the users have access to.
The way I think I might have to approach this problem is:
Share a directory on any Windows virtual machine which everyone has access to. (This will be my landing zone)
Create a script which transfers files from various folders on Unix to the Windows directory, based on the parameters that are passed (see the sketch after this list).
Create an HTA interface where users can specify parameters, which in turn will trigger the script and transfer the file the user is looking for to the Windows directory.
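For step 2, something like this rough PowerShell sketch is what I have in mind (the job-to-folder mapping, host name and paths are made up, and it assumes the Windows OpenSSH scp client is available):

param(
    # Name of the Control-M job whose output files the user wants
    [Parameter(Mandatory = $true)]
    [string]$JobName
)

# Hypothetical mapping from job name to the Unix folder that job writes into
$jobFolders = @{
    "JOB_A" = "/data/output/job_a"
    "JOB_B" = "/data/output/job_b"
}

if (-not $jobFolders.ContainsKey($JobName)) {
    throw "Unknown job name: $JobName"
}

$landingZone = "\\winvm01\LandingZone\$JobName"   # shared Windows folder everyone can read
New-Item -ItemType Directory -Path $landingZone -Force | Out-Null

# Pull the files from the Unix box (scp.exe ships with the Windows 10+ OpenSSH client)
scp.exe -r "svcaccount@unixbox01:$($jobFolders[$JobName])/*" $landingZone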
I am not a programmer but I would like to develop something which will make everyone's life easier.
Could someone please advise whether this approach is correct, or whether this can be achieved in a better way?
Moreover, which language would be a good choice to write this script in? I know a bit of shell scripting and PowerShell, and I'm willing to learn anything else if that solves my problem.
Please advise.
Here is one solution:
Obtain empty Windows server
Install chocolatey
cinst winscp (to copy files)
Use https://github.com/tomohulk/WinSCP to automate the file copy via a PowerShell script, providing adequate parameters for it (a sketch is shown after this list).
cinst rundeck --params /Service to provide a graphical web interface for users
Manually create a Rundeck job and expose parameters for users so they have a nice web GUI. You can let users specify a folder or let them choose from a list.
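The linked module wraps WinSCP's .NET assembly; a minimal sketch using that assembly directly could look like this (host, credentials and paths are placeholders you would replace, and the DLL path assumes a default WinSCP install):

Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "unixbox01"                      # placeholder host
    UserName              = "svcaccount"                     # placeholder user
    Password              = "secret"                         # better: key auth or a stored credential
    SshHostKeyFingerprint = "ssh-ed25519 255 xxxxxxxx..."    # placeholder fingerprint
}

$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)
    # Download everything from the requested remote folder into the shared landing zone
    $session.GetFiles("/data/output/job_a/*", "D:\LandingZone\job_a\").Check()
}
finally {
    $session.Dispose()
}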
Background: I have a .NET application, which must be deployed and auto-configured to work in multiple third-party environments. Currently, it gets deployed via posting a customer-compiled MSI to the intranet. The reason why MSI needs to be customer compiled is to specify deployment parameters, such as internal web service URLs to connect to.
Problem statement: both MSI and ClickOnce deployment installations must be signed; otherwise a security popup shows. I have a signing key, but cannot sign them, since there is no information about which customer environment they will be used in. The customer has information about the environment, so he can deploy as ClickOnce or build the MSI, but cannot sign them because he's got no key.
Question: Is it possible to launch a pre-built MSI, executable, or ClickOnce application from a web page while supplying it parameter(s), such as a URL? Alternatively, is it possible to generate the deployment package such that it can determine from which URL it was downloaded (so that this is used to discover the environment)?
Example solution: One way of addressing the problem is renaming the file itself. For example: rename mysetup.msi to aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi. This does not break the digital signature because the file itself is not altered, but the name of the MSI is accessible to custom actions, so an action could convert the file name back to text and learn that it should read its configuration from http://myhost.local/config.xml (a sketch of the idea is below). That would work, but is pretty ugly. I'm looking for a more elegant solution.
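For illustration, the encoding is plain Base64 of the URL, which a custom action could reverse from its own file name (note that a Base64 string can contain '/' or '+', which would have to be replaced with filename-safe characters):

# Encode the config URL into the file name (done once, before publishing the MSI)
$url     = "http://myhost.local/config.xml"
$encoded = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($url))
Rename-Item "mysetup.msi" "$encoded.msi"   # -> aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi

# Decode it again (this is what the custom action would do with its own file name)
$name    = [IO.Path]::GetFileNameWithoutExtension("aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi")
$decoded = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($name))
# $decoded now holds http://myhost.local/config.xml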
I have some difficulty following the reason for the "customer-compiled" MSI strategy. Quite an uncommon thing, this :-)
Yes, if you mean that with web deployment the user shall just double-click the MSI, then you cannot pass parameters.
Solution 1: Of course you could ask the user to fill in the data in setup dialogs. There was an earlier answer than mine covering this.
Solution 2: Hardcode dependencies. You ask all your customers for their web URLs, try to find a domain name, special registry key or environment variable for the machines of each customer, and put this logic into the MSI. The MSI will be started in that specific environment and can find the correct data. You can even ping different URLs or IPs to verify the environment. Takes some time, but...
Not very beautiful, but you could get money every time they want to change something :-)
Solution 3: Prepared machines at the customer
Tell the customer: OK, a customer-compiled MSI is not a good solution; I have to deliver one signed setup, so please prepare all your machines via Group Policy with defined registry value(s) where all your specific URL data is captured. These are read either by the setup or by the application itself (a sketch of such a read follows). Or let him put a custom .config file in a specific place on his own.
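Just to illustrate the registry variant, the application-side read could be as small as this (the key path and value name are examples the customer's Group Policy would have to create):

# Read the customer-specific service URL that Group Policy placed in the registry
# (the key path and value name here are hypothetical).
$key        = "HKLM:\SOFTWARE\Contoso\MyApp"
$serviceUrl = (Get-ItemProperty -Path $key -Name "ServiceUrl").ServiceUrl
Write-Output "Using service URL: $serviceUrl"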
Solution 4: Two-step-deployment
Deploy the .config file with predefined custom URLs or other configs separately from your MSI; in your MSI you only check whether it already exists at the same path. If you choose an .ini file format (instead of .xml), MSI can read it into MSI properties with standard methods; .xml is supported by tools like InstallShield and others.
Solution 5: Somewhat similar: don't handle configuration in the setup at all. Install without the URL/config information and, the first time the app starts, ask the user to provide the data or a path to a config file containing the information.
Solution 6: If the other solutions are not for your case, let the customer sign the MSI with their own certificate, and make a batch script to help him (see the sketch below). If the company is not capable of buying its own certificate, buy a separate certificate for each customer and include the price in your price for the product and the support :-)
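The helper script could be little more than a signtool call, something like this (the certificate file, password and timestamp URL are placeholders):

# Sign the customer-built MSI with the customer's own code-signing certificate.
# signtool.exe comes with the Windows SDK; the .pfx path, password and timestamp URL are examples.
signtool.exe sign `
    /f "C:\certs\customer-codesign.pfx" `
    /p "PfxPassword" `
    /fd SHA256 `
    /tr "http://timestamp.digicert.com" `
    /td SHA256 `
    "CustomerBuild.msi"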
I think 4 and 5 are my favourites, actually.
An MSI supports passing parameters to it using public properties. The limitation you have comes from the VS setup project, i.e. I don't know if it has the support to help you configure the package to accept the parameters.
The following tutorial about dialog editing, made with Advanced Installer, shows you what a standard MSI can do. A more advanced example is importing an XML file, like a web.config, into your setup and configuring it to be updated at install time with connection parameters entered by the user during the install.
Of course all of these parameters can also be passed on the command line during a silent installation; it goes something like this: msiexec /i [msi-path] /qn MY_URL="http://www.example.com" USERNAME="John Doe"
Basically any column in an MSI table that accepts formatted data can reference properties set by the user at install time, from the command line or the installer UI. The only limitation comes from the tool you are using to build the setup package.
I'm trying to think of a good solution for automating the deployment of my .NET website to the live server via FTP.
The problem with using a simple FTP deployment tool is that FTPing the files takes some time. If I FTP directly into the website application's folder, the website has to be taken down whilst I wait for all the files to be transferred. What I do instead is manually FTP to a separate folder, then, once the transfer is completed, manually copy and paste the files into the real website folder.
To automate this process I am faced with a number of challenges:
I don't want to FTP all the files - I only want to FTP those files that have been modified since the last deployment. So I need a program that can manage this.
The files should be FTPed to a separate directory, then copied into the correct destination once complete.
Correct security permissions need to be retained on the directories. If a directory is copied over, I need to be sure that the permissions will be retained (this could probably be solved by rerunning a script that applies the correct permissions).
So basically I think that the tool that I'm looking for would do a FTP sync via a temporary directory.
Are there any tools that can manage these requirements in a reliable way?
I would prefer to use rsync for this purpose, but since you seem to be on Windows, some more effort is needed: Cygwin or something similar.
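If you go that route, the core of it would be roughly the following (run from Cygwin or anything else that provides rsync; the host, user and paths are made up). rsync only transfers files that have changed, which covers your first requirement, and by syncing into a staging folder first you keep the swap into the live folder short:

# Sync only the changed files into a staging folder on the server; --delete removes files
# that no longer exist locally.
rsync.exe -avz --delete "/cygdrive/c/build/site/" "deploy@webserver:/staging/site/"
# A second, server-side step then copies /staging/site into the live folder and
# re-applies the directory permissions, so the live site is only touched briefly.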
I require a small space online (free) where I can upload/download a few files automatically using a script.
The space requirement is around 50 MB.
This should be set up so that it can be automated and run without manual interaction, i.e. no GUI.
I have a dynamic IP and no technical knowledge of setting up a server.
Any help would be appreciated. Thanks.
A number of online storage services provide 1-2 GB of space for free, and several of those have command-line clients. E.g. SpiderOak, which I use, has a client that can run in headless (non-GUI) mode to upload files, and there's even a way to download files from it with wget or curl.
You just set things up in GUI mode, then put files into the configured directory and run SpiderOak with the right options; the files get uploaded. Then you either download ('restore') all or some of the files via another SpiderOak call, or get them via HTTP.
About the same applies to Dropbox, but I have no experience with that.
www.bshellz.net gives you a free shell running Linux. I think everyone gets 50 MB, so you're in luck!