Packaging Applications for Azure Batch

I am having trouble packaging applications to run on Azure Batch compute nodes. I am using a user subscription with VM configuration, so I can't use application packages. I have been uploading my executables and DLLs as resource files. Currently, I have a task that requires a lot of DLLs, but it seems that I can't upload more than 10 resource files through the Azure portal.
What is the best way to package an application and all its required DLLs to run on a Batch compute node without using the built-in application packages? Is there a way other than going through all the DLLs and adding each one manually as a resource file?
How do I work around the limit of 10 resource files per task?
Thanks!

Application package functionality for Virtual Machine configuration should be available now (the documentation may be out of date). That said, to answer your questions:
Without using application packages, you can do one of the following: (1) Create a self-extracting archive (SFX) with your archiver of choice. Ensure that it can extract silently without a GUI pop-up (7-Zip can do this, for example) and run the SFX as part of your start task. (2) Zip up your files, add the zip file and unzip.exe as your two resource files, and run the unzip command as part of your start task. A sketch of both follows.
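For concreteness, a minimal sketch of both options as Windows command lines (file names, paths, and the extraction directory are assumptions; AZ_BATCH_NODE_SHARED_DIR is the environment variable Batch sets to the node's shared directory):

rem (1) Build the self-extracting archive locally with 7-Zip, upload myapp.exe
rem     as a resource file, then run it silently on the node as the start task
rem     (-y suppresses prompts, -o sets the extraction directory):
7z a -sfx myapp.exe C:\build\MyApp\*
myapp.exe -y -o"%AZ_BATCH_NODE_SHARED_DIR%\myapp"

rem (2) Or ship myapp.zip plus unzip.exe as the two resource files and run:
cmd /c "unzip.exe -o myapp.zip -d %AZ_BATCH_NODE_SHARED_DIR%\myapp"

Extracting into the shared directory keeps the binaries available to subsequent tasks on the same node.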
The service limit is not 10 (although that may be the limit in the portal). You can add resource files up to the service limit, which varies depending on the length of your URLs. For a large number of dependencies, follow recommendation (1) above or use application packages (if possible); the portal limit can also be sidestepped by creating the task through the API or CLI, as sketched below.
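As a hedged example of the CLI route, you can define the task in a JSON file and create it with the Azure CLI (myjob and task.json are placeholder names; task.json carries a resourceFiles array with one blob URL/filePath pair per DLL):

az batch task create --job-id myjob --json-file task.json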

Related

Deploying config files to PLC

Is it possible to include arbitrary files (in this case a .csv) from a TwinCAT project directly in the Boot directory of a PLC?
By using PATH_BOOTPATH in the file open/read FBs, it is possible to load files from this directory in a convenient manner, regardless of whether a CE or Windows deployment is used. However, deploying files to this location seems to be the sticking point.
I know that a copy of the project code is included within the CurrentConfig<Project>.tpzip file, but this file is neither easily accessible from code nor updateable.
I've found the 'Additional Files' section within the system configuration, but it makes little sense:
- Adding a file from inside the project as a 'Relative' path doesn't seem to do anything.
- Adding a file from inside the project as an external path includes the file (via symbolic links?) in the 'CurrentConfig.tszip' file, which has the same issues as the .tpzip.
- Adding an external file as an external path again includes the file inside the .tszip.
I'm willing to accept that this might not be possible, but it just feels odd that the PATH_BOOTPRJ and PATH_BOOTPATH roots are there and not accessing useful paths.
Deployment
To quote Beckhoff:
Deployment is used to set up commands that are to be executed during the installation and startup of an application.
The event types essentially specify at which stage of the deployment process the command is executed; the command can be either a file copy or the execution of a script/program.
I haven't performed extensive testing, but between absolute/relative pathing and command execution, this should solve nearly all deployment configuration issues.
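As a minimal sketch (not tested on real hardware): a file-copy command attached to one of those deployment events could be as simple as the line below, assuming a Windows target whose default boot directory is C:\TwinCAT\3.1\Boot, so the copied CSV becomes reachable via PATH_BOOTPATH:

copy "C:\ProjectData\MyData.csv" "C:\TwinCAT\3.1\Boot\MyData.csv"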

Packaging SF service into a single file

I am working through how to automate the build and deployment of my Service Fabric app. Currently I'm working on the package step; while it creates files within the pkg subfolder, it always produces a folder hierarchy of files, not a true package in a single file. I could swear I've seen a .sfpkg file (or something similarly named) that has everything in one file (a zip, maybe?). Is there some way to create such a file with msbuild?
Here's the command line I'm using currently:
msbuild myservice.sfproj "/p:Configuration=Dev;Platform=AnyCPU" /t:Package /consoleloggerparameters:verbosity=minimal /maxcpucount
I'm concerned about not having a single file because it seems inefficient for sending a new package up to my clusters, and it's harder for me to manage a bunch of files on a build automation server.
I believe you read about the .sfpkg at
https://azure.microsoft.com/documentation/articles/service-fabric-get-started-with-a-local-cluster
Note that internally we do not yet support provisioning a .sfpkg file. This feature is coming soon (date TBD). Instead, we upload each file in the application package individually.
Update (SF 6.1 - April 2018)
Since 6.1 it is possible to create a ZIP file (*.sfpkg) and upload it to an external store. Service Fabric executes a GET operation to download the sfpkg application package. For more info, see https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-package-apps#create-an-sfpkg
NOTE: This only works with external provisioning, the Azure image store still doesn't support sfpkg files.
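Per that documentation, the .sfpkg is simply the packaged application folder zipped and given the .sfpkg extension. A sketch of producing one after the msbuild /t:Package step above (the pkg\Dev output path follows from Configuration=Dev and is an assumption):

powershell -Command "Compress-Archive -Path .\pkg\Dev\* -DestinationPath .\MyService.zip -Force"
ren MyService.zip MyService.sfpkg

The resulting file is then uploaded to the external store that the provisioning GET operation downloads from.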

Copy batch files to Jenkins Slaves with Different OS versions

I am running automated tests of our application on different Windows versions (Windows 7, Windows 10, etc.). My test suite requires that I copy files to the Slave computers when the tests change (external to the application being built). The test files are not in the Jenkins workspace, as they do not change frequently and therefore do not need to be copied to the Slave on every execution.
I am looking to be able to update the files on the Slaves, but not under the workspace directory, so the Copy-To-Slave plugin will not work, from my understanding.
I am looking to have batch files, testing resource files, DB generation scripts and others copied to the Slave computer by a Jenkins job. This job may monitor GIT, but not everything being copied is from GIT.
In essence, I want to execute the following, but targeting the Slave computer:
xcopy C:\Testing\*.* C:\Resources\Testing /s/v/e
The reason for this is that our testing scripts look for certain files to execute (DB scripts for building the database for the current platform/DB engine). Since these do not change frequently, we only need to copy them when they change and leave them in place for subsequent test runs. There is a large number of files and GBs of data that does not need to be copied with each test run. There are also multiple executions of the application with the same test files but different configurations, which should produce the same results, so the test files do not need to be copied for each of those executions either.
I found a configuration option on the Copy-To-Slave plugin to add additional destination directories, relative to the file system root (C:\ in my case), which solves my problem.
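For anyone who cannot use the plugin: a hedged alternative is a job step that runs on the Slave and mirrors only new or changed files with robocopy, so the unchanged GBs are skipped on every run (\\buildserver\Testing is a placeholder for wherever the master publishes the test files):

robocopy \\buildserver\Testing C:\Resources\Testing /E /XO /R:2 /W:5

/E includes subdirectories, /XO skips files that are not newer than the copy already on the Slave, and /R and /W bound the retry behavior.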

Deploy ClickOnce as a single file?

I am looking to use ClickOnce to deploy an application for internal use. When publishing to the network share, it creates several files and folders (manifest, Application Files, etc.).
Is there a way to bundle this up as a single file? I do not fancy the idea of allowing other users access to the Application Files folder that is created; I would rather just give them the exe and have it take care of everything else.
Does anyone have experience with this, or am I stuck with the Application Files folder, application manifest, and setup file all being in the same directory for installation?
There is no way to package the whole application folder and its files into one file (like an MSI) with ClickOnce.
You could code something of your own: a shell app deployed via ClickOnce whose only payload is your app, compressed. The shell would download that compressed file to the client's machine and unzip it, etc.
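As a rough sketch of what that shell app would do on first run (the URL and target folders are hypothetical), expressed as the equivalent command lines:

powershell -Command "Invoke-WebRequest https://intranet/MyApp.zip -OutFile %TEMP%\MyApp.zip"
powershell -Command "Expand-Archive %TEMP%\MyApp.zip -DestinationPath %LOCALAPPDATA%\MyApp -Force"

The shell would then launch the unzipped exe, and ClickOnce's update mechanism would only ever have to manage the single compressed payload.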
You could also use InstallShield Limited Edition, which comes with VS 2012/2013 (under Other Projects > Setup and Deployment), but that does not give you ClickOnce's ease-of-deployment features. You could make the InstallShield setup the compressed file in your ClickOnce shell app and then just use Process.Start to launch it. It should work.

.NET Application deployment (via MSI or ClickOnce) with dynamically specified parameters

Background: I have a .NET application which must be deployed and auto-configured to work in multiple third-party environments. Currently, it is deployed by posting a customer-compiled MSI to the intranet. The reason the MSI needs to be customer-compiled is to specify deployment parameters, such as the internal web service URLs to connect to.
Problem statement: both MSI and ClickOnce deployment installations must be signed; otherwise a security popup shows. I have a signing key but cannot sign them, since there is no information about which customer environment they will be used in. The customer has the environment information, so he can deploy as ClickOnce or build the MSI, but he cannot sign them because he has no key.
Question: Is it possible to launch a pre-built MSI, executable, or ClickOnce application from a web page while supplying it parameters, such as the URL? Alternatively, is it possible to generate the deployment package such that it can determine which URL it was downloaded from (so that the URL can be used to discover the environment)?
Example solution: One way of addressing the problem is renaming the file itself, for example renaming mysetup.msi to aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi. This does not break the digital signature, because the file itself is not altered. But the name of the MSI is accessible to custom actions, so an action could convert the file name back to text and learn that it should read its configuration from http://myhost.local/config.xml. That would work, but it is pretty ugly; I'm looking for a more elegant solution.
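To make the rename trick concrete (the URL and file names are the ones from the example; renaming does not touch the signed file contents, though a real implementation would need a filename-safe encoding, since base64 output can contain '/'):

rem Encode the config URL as the file name:
powershell -Command "[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes('http://myhost.local/config.xml'))"
rem prints: aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s
ren mysetup.msi aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi

A custom action then reads its own file name, base64-decodes it, and fetches the configuration from the resulting URL.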
I have some difficulty following the reason for the "customer-compiled" MSI strategy. Quite an uncommon thing, this :-)
Yes — if you mean web deploy, where the user just double-clicks the MSI, you cannot pass parameters.
Solution 1: Of course, you could ask the user to fill in the data in setup dialogs. There is an earlier answer than mine covering this.
Solution 2: Hardcode the dependencies. Ask all your customers for their web URLs, try to find a domain name, a special registry key, or an environment variable on your customers' machines, and put this logic into the MSI. The MSI then runs in that specific environment and can find the correct data. You can even ping different URLs or IPs to verify the environment. Takes some time, but..
Not very beautiful, but you could charge money every time they want to change something :-)
Solution 3: Prepared machines at the customer.
Tell the customer: OK, custom compiling is not a good solution; if I have to deliver one signed setup, please prepare all your machines via Group Policy with defined registry value(s) in which all your specific URL data is captured. These are read either by the setup or by the application itself. Or let him put a custom .config file in a specific place on his own.
Solution 4: Two-step deployment.
Deploy the .config file with the predefined custom URLs or other configuration separately from your MSI. In your MSI, you only check whether it already exists in the expected path. If you choose an .ini file format (instead of .xml), MSI can read it into MSI properties with standard methods; .xml is supported by tools like InstallShield and others.
Solution 5: Somewhat similar: don't handle configuration in the setup at all. Install without the URL/config information and, the first time the app starts, ask the user to provide the data or a path to a config file containing it.
Solution 6: If the other solutions are not for your case, let the customer sign the MSI with their own certificate. Make a batch script to help him. If the company is not capable of buying its own certificate, buy a separate certificate for each customer and include the price in your price for the product and the support :-)
I think 4 and 5 are my favourites, actually.
An MSI supports passing parameters using public properties. The limitation you have comes from the VS setup project; I don't know if it has the support to let you configure the package to accept parameters.
The following tutorial about dialog editing, made with Advanced Installer, can show you what a standard MSI can do. A more advanced example is importing an XML file into your setup, like a web.config, and configuring it to be updated at install time with connection parameters entered by the user during the install.
Of course, all of these parameters can also be passed on the command line during a silent installation; it goes something like this: msiexec /i [msi-path] /qn MY_URL="http://www.example.com" USERNAME="John Doe"
Basically, any column of an MSI table that accepts formatted data can refer to properties set by the user at install time, from the command line or the installer UI. The only limitation comes from the tool you use to build the setup package.