Cannot remove file from Data Lake Store using runbook - PowerShell

I am trying to run a runbook on Azure that contains the following command:
Remove-AzureRmDataLakeStoreItem
When the runbook runs, it fails with the following error:
"Remove-AzureRmDataLakeStoreItem : The term 'Remove-AzureRmDataLakeStoreItem' is not recognized as the name of a cmdlet,..."
What should I do?

This issue typically happens when the Azure PowerShell modules in your Azure Automation account are out of date or missing the module that ships the cmdlet your runbook calls. To resolve it, update the Azure PowerShell modules within the Azure Automation account; the update steps are published in the Azure Automation documentation.
Important note:
"Because modules are updated regularly by the product group, changes can occur with the included cmdlets, which may negatively impact your runbooks depending on the type of change, such as renaming a parameter or deprecating a cmdlet entirely. To avoid impacting your runbooks and the processes they automate, it is recommended that you test and validate before proceeding. If you do not have a dedicated Automation account intended for this purpose, consider creating one so that you can test many different scenarios and permutations during the development of your runbooks, in addition to iterative changes such as updating the PowerShell modules. After the results are validated and you have applied any changes required, proceed with coordinating the migration of any runbooks that required modification and perform the following update as described in production."

Related

Azure DevOps - Manage, Run and Track one-time SQL Scripts

We have a database project that uses a dacpac to deploy schema changes and also allows a pre-deployment and post-deployment script.
However, we frequently have to run one-off scripts, and security would prefer that developers not have write access in prod (we do not have a DBA role at this time). I'm trying to find a solution that works with Azure DevOps: store one-time scripts in Git, run each script if it has not been run before, and skip it the next time the pipeline runs. We'd like this done through DevOps so the service principal, not the developer, has access to run the queries; anything flowing through the pipeline has been through our peer review process, and we have a record of what was executed.
I'm looking for suggestions from anyone who has done this or is aware of any product which can do this.
Use Liquibase. I would have it as part of my code base, but you can also use it from the CLI and run your scripts with that tool.
Liquibase keeps track of which SQL files you have published across deployments, so you can have multiple stages (say DIT, UAT, STAGING, PROD) and it will apply the remaining one-off SQL changes to each over time.
Generally, unless you really need support, I doubt you'd need the commercial version; the open-source version is more than sufficient for my needs, and I have a relatively complex system already.
The main reason I like Liquibase over other technologies is that it allows SQL-based changesets, so the learning curve is a lot lower.
Two tips:
Don't rely on the automatic computation of logicalFilePath; set it explicitly even if that means repeating yourself. This lets you refactor your scripts later, so instead of lumping everything into a single folder you can group them.
Name your scripts with the date first. That way you can leverage the natural sorting order.
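If you drive it from an Azure DevOps pipeline, a PowerShell step can simply call the Liquibase CLI. This is only a sketch; the changelog path, JDBC URL, and credential variables are placeholders, and the newer flag spelling (--changelog-file) is assumed:
# Applies any changesets that have not yet been run against the target database.
liquibase `
  --changelog-file="changelog/changelog.xml" `
  --url="jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=MyDb" `
  --username="$env:SQL_USER" `
  --password="$env:SQL_PASSWORD" `
  update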
I've faced a similar problem in the past:
Option 1
If you can afford an additional table in your database to keep track of what has been executed, your problem can be easily solved; there is a tool that helps with exactly this: https://github.com/DbUp/DbUp
Then you would have a new repository, let's call it OneOffSqlScriptsRepository, and your pipeline would consume it:
resources:
  repositories:
    - repository: OneOffSqlScriptsRepository
      endpoint: OneOffSqlScriptsEndpoint
      type: git
Thus you'd create a pipeline that runs this DbUp application, consuming the scripts from the OneOffSqlScripts repository; DbUp takes care of executing each script only once (it records executed scripts in a journal table, and this behaviour is configurable).
The username/password for the database can be stored safely in the Library (combined with Azure Key Vault), so only people with the right access can read them (apart from the pipeline).
Option 2
This option assumes that you want to do everything using only the native resources that Azure Pipelines provides.
Create a OneOffSqlScripts repository as in option 1.
Create a ScriptsRunner repository
In the ScriptsRunner repository, you'd create a folder containing a .json file listing the names of the scripts and the number of times (or a boolean indicating whether) they have been run.
e.g.:
[{
    "id": 1,
    "scriptName": "myscript1.sql",
    "runs": 0
}]
(or use a boolean such as "hasRun": false instead of a run counter)
Then write a Python script that reads the JSON file, runs any scripts that have not been run yet, and writes the file back with the updated run counts; you'd then need to update the repository after each pipeline run. This means your pipeline will perform a git commit / push operation after each run whenever there were new scripts to execute.
That's the algorithm in outline; the implementation can be tuned.
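A minimal PowerShell equivalent of that tracking logic, just as a sketch (paths, server, and database names are placeholders, and Invoke-Sqlcmd from the SqlServer module is assumed to be available):
# Read the tracking file, run anything not yet executed, then write it back.
$tracking = Get-Content .\scripts.json -Raw | ConvertFrom-Json
foreach ($entry in $tracking | Where-Object { $_.runs -eq 0 }) {
    Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" `
                  -Database "MyDb" -InputFile ".\scripts\$($entry.scriptName)"
    $entry.runs++
}
$tracking | ConvertTo-Json -Depth 5 | Set-Content .\scripts.json
# The pipeline would then git add/commit/push scripts.json so the next run skips these.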

Is there an easy way to run Azure DevOps PowerShell scripts on my local machine?

I tried to find anything on this but I didn't succeed. Maybe I am using the wrong words for the search.
What I am trying to achieve is a script that can run in an Azure DevOps environment as well as on my local machine for debugging purposes. As far as I can see, to execute it locally I would need some kind of wrapper for the script that behaves the way the Azure DevOps task does. Does anything like that exist?
If you want more control over building your code and want to be able to see intermediate results, you can install a self-hosted agent on your machine; the Azure DevOps documentation has more information about this.
Most of the tasks are simply wrappers around console tools that add a layer of authorization or make the tools visually accessible. It may also be useful to enable the System.Debug flag on the Microsoft-hosted agent to see more detail about what a particular task does, and thus better understand what is happening behind the scenes.
For instance, if you use variables in your script like $(someVariable), then with System.Debug set you will see the final script in the log with the values substituted.
Be aware also that secret variables are masked, so you may find *** in the logs instead of the real value.
However, there is no easy way to simply extract and wrap what a task does so you can repeat it on your machine without involving an Azure DevOps agent.
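That said, for plain script steps a small local wrapper can get you part of the way when debugging. This is only a sketch under the assumption that your script consumes values the agent would normally inject; every name below is an example:
# Simulate the variables the agent would provide as environment variables.
$env:BUILD_SOURCESDIRECTORY = "C:\src\myrepo"
$env:SYSTEM_DEBUG = "true"

# If the script uses $(macro) syntax, substitute values up front, since that
# replacement normally happens on the agent before the script runs.
$body = Get-Content .\my-build-step.ps1 -Raw
$body = $body -replace [regex]::Escape('$(someVariable)'), 'local-test-value'
Invoke-Expression $body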

How to validate PowerShell Desired State Configuration Template before Executing?

When we provision resources in Azure, the management portal validates the generated template; however, when we do it using PowerShell, we only find out about issues when the template is actually executed.
There must be some parameter or switch that would just validate the template and not actually execute it. Does anybody know?
I assume you are talking about deploying ARM templates and that you are using the AzureRM PowerShell module. In that case you can use the Test-AzureRmResourceGroupDeployment command, which 'Validates a resource group deployment' (from the command's help).
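For example (the resource group and file names are placeholders), you can validate the same files you would later pass to New-AzureRmResourceGroupDeployment:
# Validates the template and parameter files against the target resource
# group without deploying anything; errors are reported up front.
Test-AzureRmResourceGroupDeployment `
    -ResourceGroupName "my-rg" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json" `
    -Verbose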

Run PowerShell script every hour on Azure

I have found a great script which backs up a SQL Azure database to blob storage.
I want to run many different variations of this script - e.g. DB1 goes to Customer1Blob, DB2 goes to Customer2Blob.
I have looked at Scheduler Job Collections. However, I can only see options (action settings) for HTTP(S), Storage Queue, and Service Bus.
Is it possible to run a specific .ps1 script (with commands) on a schedule?
You can definitely run a PowerShell script as a WebJob. If you want to run the script on a schedule, you can add a settings.job file containing a CRON expression alongside your WebJob; the WebJobs documentation describes the format.
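For example, dropping this settings.job next to the WebJob's run.ps1 makes it fire at the top of every hour (the six fields are seconds, minutes, hours, day, month, day of week; the path is a placeholder):
# Writes a settings.job that schedules the WebJob to run hourly.
Set-Content -Path .\MyWebJob\settings.job -Value '{ "schedule": "0 0 * * * *" }'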
For this type of automation task, I prefer to use the Azure Automation service. You can create runbooks using PowerShell and then schedule them. You can have them run "on Azure" so you do not need to provide compute power yourself (you pay by the minute the job runs), or you can configure them to run on a hybrid worker.
For more information, please see the Azure Automation documentation.
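A minimal sketch with the AzureRM Automation cmdlets (the resource group, account, and runbook names are placeholders):
# Creates an hourly schedule and links it to an existing runbook.
$common = @{
    ResourceGroupName     = "my-rg"
    AutomationAccountName = "my-automation"
}
New-AzureRmAutomationSchedule @common -Name "Hourly" `
    -StartTime (Get-Date).AddMinutes(10) -HourInterval 1
Register-AzureRmAutomationScheduledRunbook @common `
    -RunbookName "Backup-SqlToBlob" -ScheduleName "Hourly"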
When exporting from SQL DB or from SQL Server, make sure you are exporting from a quiescent database. Exporting from a database with active transactions can result in data integrity issues - data being added to various tables while they are also being exported.

Get a list of all Resources in my Azure Subscription (Powershell Preferably)

I have an Azure subscription and I'm trying to write a PowerShell script to automatically get a list of all the resources (VMs, storage accounts, databases, etc.) that I currently have in my subscription. Is there a way to do this using the Azure management REST API or the Azure cmdlets?
If you are using the new Resource Manager model (introduced in 2014) you can use the following PowerShell script.
Login-AzureRmAccount
Get-AzureRmResource | Export-Csv "c:\Azure Resources.csv"
To use the Resource Manager PowerShell commands you will need the AzureRM PowerShell module (https://learn.microsoft.com/en-us/powershell/azure/install-azurerm-ps).
Install-Module AzureRM
For more information on the difference between Resource Manager and Classic models see, https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-deployment-model.
For users with multiple subscriptions:
If you want to output the contents of multiple subscriptions then you will need to call Select-AzureRmSubscription to switch to another subscription before calling Get-AzureRmResource.
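For example (the output path is a placeholder; on older AzureRM versions the subscription id property may be SubscriptionId rather than Id):
# Loops over every subscription the signed-in account can see and exports
# all of their resources into a single CSV.
Login-AzureRmAccount
Get-AzureRmSubscription | ForEach-Object {
    Select-AzureRmSubscription -SubscriptionId $_.Id | Out-Null
    Get-AzureRmResource
} | Export-Csv "C:\Azure Resources.csv" -NoTypeInformation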
I don't think there's a single function (or PS cmdlet) to fetch all this information. However, each of these can be fetched through both the Windows Azure Service Management REST API and the Windows Azure PowerShell cmdlets.
Windows Azure Service Management REST API: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx. For example, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/windowsazure/ee460787.aspx
Windows Azure PowerShell Cmdlets: http://msdn.microsoft.com/en-us/library/jj554330.aspx. Again, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/dn205168.aspx.
Well, you may update your Azure PowerShell version and execute this command:
Get-AzureResource
In the output, you may check the "ResourceType" field; it has information about the type of each resource created on Azure.
Since you said PowerShell "preferably", I'm going to assume other options might still be useful. You can go to http://portal.azure.com, click the menu icon (three horizontal lines), then All Resources. Then at the top of the page you can click Export to CSV and open that in Excel.
You have to take 30 seconds to do a little cleanup in Excel, but for what I'm trying to do right now, this was definitely the best & fastest solution. I hope it's useful to you (or someone else) too.
Adding to @Gaurav's answer (and related to your comment about SQL database enumeration): you can enumerate all of your databases, on a per-server basis, in a few easy steps.
First, enumerate all of the SQL Database servers in your subscription. Then, for each server, create a connection context and enumerate the databases. Note that with the Get-Credential cmdlet I was prompted to enter a username and password via a popup, which I don't show here. For demonstration purposes, I created a brand-new server, with only a master database, to show what the output looks like.
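A sketch of those two steps with the classic (pre-ARM) Azure cmdlets; the server names and credentials come from your own subscription:
# Step 1: list every SQL Database server in the subscription.
$servers = Get-AzureSqlDatabaseServer

# Step 2: for each server, build a connection context and list its databases.
foreach ($server in $servers) {
    $cred = Get-Credential   # prompts for that server's SQL login
    $ctx  = New-AzureSqlDatabaseServerContext -ServerName $server.ServerName -Credential $cred
    Get-AzureSqlDatabase -ConnectionContext $ctx
}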
This sample demonstrates how to automatically get a list of all the resources (VMs, storage accounts, databases, App Services) and their status via PowerShell using certificate authentication.
https://gallery.technet.microsoft.com/Access-Azure-resource-data-ca9cc9f7
I know it's already been answered; however, I have found the Get-AzResource command easy to use, and it fetches all the resources from a particular subscription. Try using it with ft (Format-Table) for clean output:
Get-AzResource | ft