Problem
I have a website hosted on Azure as an Azure Website (not in a separate virtual machine).
I am going to create a job that will run on a schedule to crawl and index the site content.
I want to be able to run PowerShell on my PC, have it work with the remote data, and see progress and results on my PC.
I also want to create reports in PowerShell. I do not have a boss who needs Excel reports to understand the data, and PowerShell is more convenient for an IT-oriented person like me.
Idea
I have an idea and need validation from the community.
Is it a good idea to create a PowerShell module DLL and add it as a reference to the Web Application project?
In that case I might be able to connect to the site's machine remotely (not sure), import the module, and work with it.
Web Jobs now natively support PowerShell scripts. You can upload a zip file containing a .ps1 file and it will be executed without further configuration.
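By convention the entry point is a file named run.ps1 inside the zip. A minimal sketch of what such a crawler script could look like (the site URL and page list are placeholders, not from the question):

    # run.ps1 -- entry point of the WebJob; anything written to stdout lands in the WebJob log
    $siteUrl = 'http://mysite.azurewebsites.net'    # placeholder site
    $pages   = '/', '/about', '/products'           # placeholder pages to crawl

    foreach ($page in $pages) {
        $response = Invoke-WebRequest -Uri ($siteUrl + $page) -UseBasicParsing
        Write-Output ('{0} -> {1}, {2} chars' -f $page, $response.StatusCode, $response.Content.Length)
    }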
Have you tried using the Kudu PowerShell interface? It lets you run PowerShell commands on your site directly from the browser.
You can find it at:
https://[yoursitename].scm.azurewebsites.net/DebugConsole/?shell=powershell
You'll need to log in with your FTP user name and password.
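Kudu also exposes a REST command endpoint, so you can drive the same console from PowerShell on your own PC, which fits the original goal of seeing progress and results locally. A sketch, assuming a hypothetical site name and your deployment credentials:

    # Run a command on the Azure Website from your local PC via Kudu's /api/command.
    # $user/$pass are the deployment (FTP) credentials; 'mysite' is a placeholder.
    $user = 'mysite\$mysite'
    $pass = 'deployment-password'
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$user`:$pass"))

    $body = @{
        command = 'powershell -NoProfile -Command Get-ChildItem'  # command to run remotely
        dir     = 'site\wwwroot'                                  # working directory on the site
    } | ConvertTo-Json

    $result = Invoke-RestMethod -Uri 'https://mysite.scm.azurewebsites.net/api/command' `
        -Method Post -ContentType 'application/json' `
        -Headers @{ Authorization = "Basic $auth" } -Body $body
    $result.Output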
Related
I run some Control-M jobs which generate files and place them under various folders on a UNIX box.
These files need to be sent to different users who don't have access to the system.
Each time, I have to copy these files from the UNIX folder (based on which Control-M job was run) to my local directory and then send them to the users.
I am looking for a way to automate this. I want to create an interface where users can specify parameters (job names), which in turn will copy the file from the particular folder on UNIX to a location the users do have access to.
The way I think I might have to approach this problem is:
Share a directory on a Windows virtual machine which everyone has access to. (This will be my landing zone.)
Create a script which transfers files from various folders on UNIX to the Windows directory, based on the parameters that are passed in.
Create an HTA interface where users can specify parameters, which in turn will trigger the script and transfer the file the user is looking for to the Windows directory.
I am not a programmer but I would like to develop something which will make everyone's life easier.
Could someone please advise whether this approach is correct or whether this can be achieved in a better way?
Moreover, which language would be a good choice to write this script in? I know a bit of shell scripting and PowerShell, and I'm willing to learn anything else if that solves my problem.
Please advise.
Here is one solution:
Obtain an empty Windows server
Install Chocolatey
cinst winscp (to copy files)
Use https://github.com/tomohulk/WinSCP to automate the file copy via a PowerShell script, providing appropriate parameters (see the sketch after this list).
cinst rundeck --params /Service to provide a web-based graphical interface for users
Manually create a Rundeck job and expose its parameters so users have a nice web GUI. You can let users type a folder or choose one from a list.
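A sketch of what the transfer script from step 4 might look like. The cmdlet names and parameters are from my reading of that module and may differ between versions; the host name, credentials, and paths are placeholders:

    # Copy a Control-M job's output files from the UNIX box to the Windows landing zone.
    # Requires the WinSCP PowerShell module (github.com/tomohulk/WinSCP).
    param(
        [Parameter(Mandatory)]
        [string]$JobName    # passed in by the Rundeck job
    )

    Import-Module WinSCP

    $sessionOption = New-WinSCPSessionOption -HostName 'unixbox.example.com' `
        -Protocol Sftp -Credential (Get-Credential) `
        -GiveUpSecurityAndAcceptAnySshHostKey

    $session = New-WinSCPSession -SessionOption $sessionOption
    try {
        # Each Control-M job writes to its own folder, so the job name selects the source.
        Receive-WinSCPItem -WinSCPSession $session `
            -Path "/data/jobs/$JobName/*" `
            -Destination '\\landingzone\share\'
    }
    finally {
        Remove-WinSCPSession -WinSCPSession $session
    }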
I have a SharePoint site and I would like to download all files and their metadata (specifically, the name of the creator and the name of the modifier for each time the file was modified, as well as the date of modification). I can see this metadata in the user interface, so I know it is available.
I have set up access to the SharePoint site via SharePoint Online Management Shell, and I am running it as an admin. I've followed all the steps for connecting to SharePoint, and I've confirmed that these steps have executed correctly.
However, when I try to deploy this script (substituting the 13th line with the name of my site), it errors out on the 'Get-SPWeb' cmdlet. Do I need to make other modifications to this script?
I am aware that there are other scripts for downloading SharePoint files, but I need to use this one because a SharePoint administrator told me that it would work in conjunction with a second script which extracts the metadata, found here: https://gallery.technet.microsoft.com/scriptcenter/get-file-meta-data-function-f9e8d804
Background: I have a .NET application, which must be deployed and auto-configured to work in multiple third-party environments. Currently, it gets deployed by posting a customer-compiled MSI to the intranet. The reason the MSI needs to be customer-compiled is to specify deployment parameters, such as internal web service URLs to connect to.
Problem statement: both MSI and ClickOnce deployment installations must be signed; otherwise a security popup shows. I have a signing key, but I cannot sign the packages, since there is no information about which customer environment they will be used in. The customer has information about the environment, so he can deploy as ClickOnce or build the MSI, but cannot sign them because he has no key.
Question: Is it possible to launch a pre-built MSI, executable, or ClickOnce application from a web page while supplying it parameter(s), such as a URL? Alternatively, is it possible to generate a deployment package such that it can determine which URL it was downloaded from (so that the URL is used to discover the environment)?
Example solution: One way of addressing the problem is renaming the file itself. For example: rename mysetup.msi to aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi. This does not break the digital signature, because the file itself is not altered. But the name of the MSI is accessible to custom actions, so an action could convert the file name back to text and learn that it should read its configuration from http://myhost.local/config.xml. That would work, but it is pretty ugly. I'm looking for a more elegant solution.
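For illustration, the encode/decode round trip behind that file-name trick (a sketch; note that raw Base64 can contain '/' and '+', which are not valid in file names, so a real implementation would have to substitute them):

    # Vendor side: encode the config URL into the MSI file name.
    $url   = 'http://myhost.local/config.xml'
    $token = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($url))
    Rename-Item 'mysetup.msi' "$token.msi"   # -> aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi

    # Custom-action side: recover the URL from the installer's own file name.
    $name      = [IO.Path]::GetFileNameWithoutExtension('aHR0cDovL215aG9zdC5sb2NhbC9jb25maWcueG1s.msi')
    $configUrl = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($name))
    # $configUrl -> http://myhost.local/config.xml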
I have some difficulty following the reason for the "customer-compiled" MSI strategy. Quite an uncommon thing, this :-)
Yes: if you mean web deployment where the user just double-clicks the MSI, you cannot pass parameters.
Solution 1: Of course you could ask the user to fill in the data in setup dialogs. There was an earlier answer than mine covering this.
Solution 2: Hardcode dependencies. Ask all your customers for their web URLs, try to find a domain name, a special registry key, or an environment variable on your customers' machines, and put this logic into the MSI. The MSI will be started in that specific environment and can find the correct data. You can even ping different URLs or IPs to verify the environment. Takes some time, but...
Not very beautiful, but you could get money every time they want to change something :-)
Solution 3: Prepared machines at the customer's site
Tell the customer: OK, custom compiling is not a good solution; I have to deliver one signed setup. Please prepare all your machines via Group Policy with defined registry value(s) in which all your specific URL data is captured. These are read either by the setup or by the application itself. Or let him put a custom .config file in a specific place on his own.
Solution 4: Two-step-deployment
Deploy the .config file with the predefined custom URLs or other settings separately from your MSI. In your MSI you only check whether it already exists at the expected path. If you choose an .ini file format (instead of .xml), MSI can read it into MSI properties with standard methods; .xml is supported by tools like InstallShield and others.
Solution 5: Somewhat similar: don't handle configuration in the setup at all. Install without the URL/config information and, the first time the app starts, ask the user to provide the data or give a path to a config file containing the information.
Solution 6: If the other solutions are not for your case, let the customer sign the MSI with their own certificate. Make a batch script to help him (see the sketch below). If the company is not capable of buying its own certificate, buy a separate certificate for each customer, and include the price in your price for the product and the support :-)
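That helper could be little more than a signtool call (a sketch; the signtool path, certificate file, password handling, and timestamp URL are placeholders the customer would fill in):

    # Sign the vendor-supplied MSI with the customer's own code-signing certificate.
    # signtool.exe ships with the Windows SDK; the path below is an assumption.
    $signtool = 'C:\Program Files (x86)\Windows Kits\10\bin\x64\signtool.exe'
    & $signtool sign /f 'customer-cert.pfx' /p 'pfx-password' `
        /fd SHA256 /tr 'http://timestamp.digicert.com' /td SHA256 `
        'mysetup.msi'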
I think 4 and 5 are my favourites, actually.
An MSI supports passing parameters to it using public properties. The limitation you have comes from the VS setup project, i.e. I don't know whether it has the support to help you configure the package to accept parameters.
The following tutorial about dialog editing, made with Advanced Installer, can show you what a standard MSI can do. A more advanced example is importing an XML file into your setup, like a web.config, and configuring it to be updated at install time with connection parameters entered by the user during the install.
Of course, all of these parameters can also be passed on the command line during a silent installation; it goes something like this:
msiexec /i [msi-path] /qn MY_URL="http://www.example.com" USERNAME="John Doe"
Basically, any column from an MSI table that accepts formatted data can be used to refer to properties set by the user at install time, from the command line or the installer UI. The only limitation comes from the tool you are using to build the setup package.
I'm trying to setup Microsoft reporting on a shared hosted server. I've set up the web.config files with the necessary entries and uploaded the assemblies Microsoft.ReportViewer.Common.dll and Microsoft.ReportViewer.WebForms.dll as well as the file Microsoft.ReportViewer.xml via FTP.
The site loads OK, but when I try to load a report I get a missing reference to Microsoft.ReportViewer.ProcessingObjectModel.dll. If I can get hold of a copy of this DLL, can I expect the report viewer to work? If so, what's the best way to get a copy? Or should I start trying to cajole the server administrator into running ReportViewer.exe?
This project is using Visual Studio 2008.
It appears that the answer is yes. I had never extracted a file from the GAC before, but it was pretty easy by following the command line method described in this question. Once the correct version of Microsoft.ReportViewer.ProcessingObjectModel.dll was extracted from the GAC on my development machine and uploaded to the site, the reports started working.
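For reference, the extraction is just a file copy out of the GAC's directory structure, which Explorer hides but the file system exposes (a sketch; the exact version and public-key-token subfolder will differ per machine, hence the recursive search):

    # Pull the assembly out of the GAC on the development machine.
    $gacPath = 'C:\Windows\assembly\GAC_MSIL\Microsoft.ReportViewer.ProcessingObjectModel'
    Get-ChildItem -Path $gacPath -Recurse -Filter '*.dll' |
        Copy-Item -Destination 'C:\temp\'   # then upload this copy to the site's bin folder via FTP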
We've started using TFS2010 over at the company I work at. We create e-commerce web applications (shopping sites). I'm creating a custom template to deploy web projects after a build using a build template.
I've looked at the web deploy tool, but MSDN seems to indicate that it can only do initial deployments, and I need to be able to do incremental deployments with the same script.
I'm thinking of using the InvokeProcess activity in the template to have PowerShell do the job, by specifying an FTP script which automatically copies the output of a build to a designated FTP site and then runs the SQL (upgrade) scripts, if needed, via SSH or a PowerShell remoting interactive session (possibly specified in a separate SQL script).
There are some unknowns for me which I can't clear up through Google:
When queuing a build, will I be able to let the user specify a script present in source control (e.g. $(source)\scripts\ftpscript.ps1) as the script to be used? Will PowerShell be able to access/use that file, or should I copy it to the build directory and specify it when I run it? (I don't know how to set up the template to get files from source control, so a pointer to some helpful info on how to do that would be very much appreciated.)
If the previous just doesn't work at all, could I create a folder \scripts\ in my website project, commit that to source control, and then use BuildDetail.DropLocationRoot & "\scripts\" as the location for the script, forcing a copy of the script files by enabling the force-copy option?
To run a PowerShell script I think you can use the InvokeProcess activity, which would trigger something like this:
%windir%\system32\windowspowershell\v1.0\powershell.exe "$(SolutionRoot)\test.ps1"
And yes, you can reach a script file present in source control using the "SourcesDirectory" keyword.
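For example, to pick up the \scripts\ folder from the question, the command line could look like this (a sketch; -NoProfile, -ExecutionPolicy, and -File are standard powershell.exe switches, and the folder layout is assumed):

    %windir%\system32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy Bypass -File "$(SourcesDirectory)\scripts\ftpscript.ps1"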