Create QlikView task through command line - powershell

I want to be able to create a task for qvw files with a command (cmd, powershell, etc), just as you would through the QlikView Management Console. We would like to be able to automate some of the task creation remotely which would require this functionality.
I know that there are arguments that can be passed through the qv.exe to reload the document, but I want to actually create a task through a command line. Is this possible?

I don't think this is possible out of the box. You can control QlikView Server through the QlikView Management Services (QMS) API, but for that you need to build a .NET command-line app that does whatever you want.
If you are interested, follow this link for more info.
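For a rough idea of the shape of such a tool, here is a heavily hedged PowerShell sketch. The endpoint, port, and member names (GetTimeLimitedServiceKey, SaveDocumentTask) are assumptions based on the QMS API documentation; verify them against your QlikView version:

# The QMS API is a SOAP service, by default on port 4799 (assumption).
$qms = New-WebServiceProxy -Uri "http://qvserver:4799/QMS/Service" -UseDefaultCredential
# Every call must carry a service key in a SOAP header. New-WebServiceProxy
# will not attach that header for you - the .NET SDK samples wire it up with
# a message inspector, which is why a small .NET console app is the usual
# vehicle for this.
$key = $qms.GetTimeLimitedServiceKey()
# With the key attached, you would build a DocumentTask object for your .qvw
# and persist it with SaveDocumentTask($task).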

Can I re-run a Power Automate flow instance from history?

Is there any way to find and re-run an earlier instance of a Power Automate workflow programmatically?
I can do this manually: download the .csv file containing the instances, search the Trigger output column for the one I want, get its id, copy-paste the run URL, and click resubmit.
I tried with Power Automate itself:
The built-in Flow Management connector only supports finding a specific flow by name; it does not expose the run history at all.
PowerShell:
With the PowerApps module installed, I can list the instances with
Get-FlowRun -FlowName {flow name}
But I don't see the same properties as in the exported .csv file, and there's also no Run-Flow command that would let me run it.
So, I am a little stuck here; could someone please help me out?
We cannot yet resubmit a Flow run from the history programmatically, with PowerShell or any other API method.
But we can avoid some of the manual work: using the workflow() function in a Flow compose step, we can automate composing the Flow run history URL. Read more:
https://xxx.flow.microsoft.com/manage/environments/07aa1562-fea6-4583-8d76-9a8e67cbf298/flows/141e89fb-af2d-47ac-be25-f9176e64e9a0/runs/08586722084717816659969428791CU12?backUrl=%2Fflows%2F141e89fb-af2d-47ac-be25-f9176e64e9a0%2Fdetails&runStatus=Failed
There are three identifiers that I need to find so that I can build up the flow history URL.
The first is my environmentName (07aa1562-fea6-4583-8d76-9a8e67cbf298), then the flow name (141e89fb-af2d-47ac-be25-f9176e64e9a0), and finally the run ID (08586722084717816659969428791CU12).
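As a sketch (assuming the standard output shape of the workflow() function; the regional "emea" prefix is a placeholder for your tenant's region), a Compose step can build that URL like this:

concat('https://emea.flow.microsoft.com/manage/environments/',
  workflow()?['tags']?['environmentName'],
  '/flows/', workflow()?['name'],
  '/runs/', workflow()?['run']?['name'])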
There is a command in the Microsoft 365 CLI to resubmit a flow run:
m365 flow run resubmit --environment flowEnvironmentID --flow flowGUID --name flowRunID --confirm
You can also resubmit a flow run using the Power Automate REST API:
https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/{FlowEnvironment}/flows/{FlowGUID}/triggers/manual/histories/{FlowRunID}/resubmit?api-version=2016-11-01
For the Power Automate REST API, you will have to pass an authorization token.
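A sketch of that call from PowerShell (the three IDs and the $token acquisition are placeholders; the post linked below covers obtaining the token):

# Resubmit is a POST with an empty body.
$uri = "https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/$envId/flows/$flowId/triggers/manual/histories/$runId/resubmit?api-version=2016-11-01"
Invoke-RestMethod -Method Post -Uri $uri -Headers @{ Authorization = "Bearer $token" }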
For more information, go through the following post
https://ashiqf.com/2021/05/09/resubmit-your-failed-power-automate-flow-runs-automatically-using-m365-cli-and-rest-api/

Automation using powershell

I want to create a script that will launch an application and configure it completely with provided input values.
e.g.: launch Outlook -> enter default address -> choose Connect -> proceed with the next options and finish
How can this be achieved using powershell script?
I am able to launch the applications but not sure how to input values to it.
This is a very broad topic, and the answer will depend on the specifics of any given application.
Here is a TechNet article that discusses various ways to launch an application, with the benefits and drawbacks of each: https://social.technet.microsoft.com/wiki/contents/articles/7703.powershell-running-executables.aspx
There are several approaches depending on the app and its installer, but the key point is that you must be able to configure the app manually using nothing but command-line tools.
If you can do it manually, i.e., one command at a time, then scripting the full install becomes easy.
Most installers have switches for a "silent install"; Outlook is a complicated setup, but start there.
Many installers will show their options when you type: InstallerName /?
Usually the options you need are /silent or /q or /quiet.
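For example, a minimal sketch (the installer path and switches are placeholders; check your installer's own /? output first):

# Run an installer unattended and wait for it to finish.
$proc = Start-Process -FilePath ".\setup.exe" -ArgumentList "/quiet", "/norestart" -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    Write-Error "Installer exited with code $($proc.ExitCode)"
}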
Office has the concept of .MST files (the T stands for transform) to provide additional configuration, so search for "creating an Office transform file".
For some apps, you can monitor the registry with a "registry watcher" app and then write a script to make the same or similar settings for each install.
With Outlook, I would first pursue the MST/transform route and ask this question in places where Office and Outlook are discussed.

How to run a powershell script on Amazon EC2 instance at Startup?

I have to think this is a solved issue, but I am just not getting it to work. So I have come to you, Stack Overflow, with this issue:
I have a Windows Server 2016 machine running in Amazon EC2. I have a machine.ps1 script in a config directory.
I create an image of the box. (I have tried both checking and unchecking NoReboot.)
When I create a new instance of the image I want it to run machine.ps1 at launch to set the computer name and then set routes and some config settings for the box. The goal is to do this without logging into the box.
I have read and tried:
Running Powershell scripts at Start up
and used this to ensure user data was getting passed in:
EC2 Powershell Launch Tools
I have tried setting up a scheduled task that runs the machine.ps1 on start up (It just hangs)
I see the InitializeInstance.ps1 start-up task and have even tried to co-opt that, replacing the line that runs user data with the line that runs my script. Nothing.
If I log into the box and run machine.ps1, it will restart the computer and set the computer name and then I need to run it once more to set routes. This works manually. I just need to find a way to do it automagically.
I want to launch these instances from powershell not with launch configurations and auto scale.
You can use User data
Whenever you deploy a new server, workstation, or virtual machine, there is nearly always a requirement to make final changes to the system before it's ready for use. Typically this is done with a post-deployment script that might be triggered manually on start-up, or it might be a final step in a Configuration Manager task sequence; if you're using Azure, you may use the Custom Script Extension. So how do you achieve similar functionality with EC2 instances in Amazon Web Services (AWS)? If you've created your own Amazon Machine Image (AMI), you can set the script to run from the RunOnce registry key, but that can be a cumbersome approach, particularly if you want to make changes to a script that has been embedded into the image. AWS offers a much more dynamic method of injecting a script to run at start-up through a feature called user data.
Please refer to the following link for the same:
Powershell User data
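As a sketch, Windows user data is simply a PowerShell script wrapped in <powershell> tags; the computer name and route below are placeholders:

<powershell>
# EC2Launch (Server 2016) runs this once, at first boot.
Rename-Computer -NewName "APP-01" -Force
New-NetRoute -DestinationPrefix "10.10.0.0/16" -InterfaceAlias "Ethernet" -NextHop "10.0.0.1"
Restart-Computer -Force
</powershell>

Note that an instance launched from a custom AMI only runs user data if initialization is still scheduled; with EC2Launch on Server 2016, the documented way to re-arm it before creating the image is running C:\ProgramData\Amazon\EC2-Windows\Launch\Scripts\InitializeInstance.ps1 -Schedule.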
Windows typically won't let a PowerShell script call another PowerShell script unless it is being run as Administrator. It is a weird 'safety' feature. But it is perfectly okay to load the .ps1 files and use any functions inside them.
The UserData script is typically run as "system". You would THINK that would pass muster. But it fails...
The SOLUTION: Make ALL of your scripts into powershell functions instead.
In your machine.ps1 - wrap the contents with function syntax
function MyDescriptiveName { <original script contents> }
Then in UserData - use the functions like this
# To use a relative path
Set-Location -Path <my location>
# Load script file into process memory
. <full-or-relpath>/machine.ps1
# Call function
MyDescriptiveName <params-if-applicable>
If the function needs to call other functions (aka scripts), you'll need to make those scripts into functions and load the script file into process memory in UserData also.

Is it possible to remotely process an SSAS cube through script?

I have a SQL Server Analysis Services (SSAS) cube (developed with BIDS 2012), and I would like to give the users (who access the cube through PowerPivot) the ability to process the cube from their local machines.
I found some material on how to set up a scheduled job on the server through PowerShell, SQL Agent, or SSIS, but no material on remotely processing the cube. Any advice?
There are several possibilities to trigger a cube processing. The low-level method is issuing an XMLA statement to the database containing the cube. To see what this looks like, open SQL Server Management Studio, connect to the AS instance, right-click on an AS database, and select "Process". Configure the processing settings, but instead of hitting OK, select "Script" from the top toolbar to have the XMLA process command generated for you. Leave the dialog with Cancel.
All methods that process a cube end, in one way or another, in sending a command like this to the AS database.
There are several options to trigger a cube processing:
In Management Studio, by clicking OK in the above mentioned dialog.
In PowerShell (see http://technet.microsoft.com/en-us/library/hh510171.aspx); a sketch follows after this list.
In Integration Services, there is an Analysis Services processing task (http://msdn.microsoft.com/en-us/library/ms141779.aspx).
You can set up a SQL Server Agent job, job steps could either be a direct XMLA step, or an Integration Services step containing the process task (among possibly other tasks).
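As a sketch of the PowerShell option above (the database ID and server name are placeholders; Invoke-ASCmd comes with the SQL Server Analysis Services PowerShell components, so verify availability in your setup):

# Send a ProcessFull XMLA command to the AS instance.
$xmla = @"
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDb</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
"@
Invoke-ASCmd -Server "MyAsServer" -Query $xmla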
The question, however, is how the setups described above can be accessed by end users. An important issue here is of course that the user executing the process task needs permission to process the cube. As you might not want to grant this permission directly, it might make sense to use some impersonation along the way. With Management Studio - and as far as I am aware with PowerShell - this cannot easily be achieved.
Integration Services and Agent jobs offer the possibility of impersonation. Integration Services packages are executed by the dtexec command-line tool (part of the SQL Server client tools). There is also a tool called dtexecui (available as "Execute Package Utility" in a standard SQL Server client tool installation), which lets you configure all settings in a dialog and then execute a package; it can also display the corresponding dtexec command line for your settings.
And to call a SQL Server Agent job, an easy interface is the set of job stored procedures (http://msdn.microsoft.com/en-us/library/ms187763.aspx), especially sp_start_job (note that this is asynchronous: it starts the job and returns without waiting for the job to complete), sp_help_jobactivity to ask for job status, and sp_help_jobhistory for details of jobs that have run.
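A sketch of starting such a job from PowerShell (server and job names are placeholders):

# sp_start_job returns as soon as the job has been started; poll
# sp_help_jobactivity afterwards if you need completion status.
Invoke-Sqlcmd -ServerInstance "MySqlServer" -Database "msdb" -Query "EXEC dbo.sp_start_job @job_name = N'Process OLAP Cube';"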
All in all I think there is no final solution available, but I mentioned some building blocks that you could use to code your own solution, depending on the preferences in your environment.

Application Deployment with Powershell

I've developed a Powershell script to deploy updates to a suite of applications; including SQL Server database updates.
Next I need a way to execute these scripts on 100+ servers without manually connecting to each server. "PowerShell v2 with remoting" is not an option, as it is still in CTP.
PowerShell v1 with WinRM looks the most promising, but I can't get feedback from my scripts. The scripts execute, but I need to know about exceptions. The scripts create a log file; is there a way to send the contents of the log file back to the "client" (the local computer making the remote calls)?
Quick answer is no. The long version: it's possible, but it will involve lots of hacks. I developed a very similar deployment script/system using PowerShell 2 last year. The remoting feature is the primary reason we put up with the CTP status. PowerShell 1 with WinRM is flaky at best and, as you said, gives no real feedback apart from OK or failed.
An alternative that I considered was PsExec, which is very much non-standard and may be blocked by firewalls. The other approach involves system management tools such as Microsoft's System Center, but that's a big hammer for a tiny nail. So you have to pick your poison...
Just a comment on this: the easiest way to capture PowerShell output is to use the Start-Transcript cmdlet to pipe console output to a file. We have a small snippet at the start of all our scripts that sends a log file with the console output from each script to a central file share, and names the log file with the script name and execution date so that we'll have an idea of what happened. It's not too hard to pipe all those log files into a database for further processing either. It probably won't solve all your problems, but it would definitely help with the "getting data back" part.
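A minimal sketch of such a header (the central share path is a placeholder):

# Name the transcript after the script and the execution date.
$log = "\\fileserver\logs\{0}_{1:yyyyMMdd_HHmmss}.log" -f $MyInvocation.MyCommand.Name, (Get-Date)
Start-Transcript -Path $log
# ... deployment work, console output is captured ...
Stop-Transcript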
best regards,
Trond