Get a list of all Resources in my Azure Subscription (Powershell Preferably) - powershell

I have an Azure subscription and I'm trying to write a PowerShell script to automatically get a list of all the resources (VMs, storage accounts, databases, etc.) that I currently have in my subscription. Is there a way to do this using the Azure Management REST API or the Azure cmdlets?

If you are using the new Resource Manager model (introduced in 2014) you can use the following PowerShell script.
Login-AzureRmAccount
Get-AzureRmResource | Export-Csv "c:\Azure Resources.csv"
To use the Resource Manager PowerShell commands you will need the AzureRM PowerShell module (https://learn.microsoft.com/en-us/powershell/azure/install-azurerm-ps).
Install-Module AzureRM
For more information on the difference between Resource Manager and Classic models see, https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-deployment-model.
For users with multiple subscriptions:
If you want to output the contents of multiple subscriptions then you will need to call Select-AzureRmSubscription to switch to another subscription before calling Get-AzureRmResource.
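A minimal sketch of that loop, assuming the AzureRM module (the subscription property names such as Id and Name can vary slightly between module versions):
Login-AzureRmAccount
Get-AzureRmSubscription | ForEach-Object {
    $subName = $_.Name
    # Switch the session context to this subscription...
    Select-AzureRmSubscription -SubscriptionId $_.Id | Out-Null
    # ...and collect its resources, tagging each row with the subscription name.
    Get-AzureRmResource | Select-Object *, @{ n = 'Subscription'; e = { $subName } }
} | Export-Csv "c:\Azure Resources.csv" -NoTypeInformation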

I don't think there's just one function (or PowerShell cmdlet) to fetch all this information. However, each of these can be fetched through both the Windows Azure Service Management REST API and the Windows Azure PowerShell Cmdlets.
Windows Azure Service Management REST API: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx. For example, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/windowsazure/ee460787.aspx
Windows Azure PowerShell Cmdlets: http://msdn.microsoft.com/en-us/library/jj554330.aspx. Again, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/dn205168.aspx.
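As a small sketch with the classic (Service Management) cmdlets, listing the storage accounts in the current subscription looks roughly like this (the subscription name is a placeholder):
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "My Subscription"
# List classic storage accounts with a couple of useful columns.
Get-AzureStorageAccount | Select-Object StorageAccountName, Location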

You may update your Azure PowerShell module to the latest version and execute this command:
Get-AzureResource
In the output, check the "ResourceType" property; it tells you the type of each resource created on Azure.
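For instance, a quick sketch that groups the output by resource type so you can see what kinds of resources exist (use Get-AzureRmResource instead if that is what your module version provides):
Get-AzureResource | Group-Object ResourceType | Sort-Object Count -Descending | Format-Table Count, Name -AutoSize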

Since you said PowerShell "preferably", I'm going to assume other options might still be useful. You can go to http://portal.azure.com, click on the Menu icon (three horizontal lines), then All Resources. Then at the top of the page you can click Export to CSV and open that file in Excel.
You have to take 30 seconds to do a little cleanup in Excel, but for what I'm trying to do right now, this was definitely the best & fastest solution. I hope it's useful to you (or someone else) too.

Adding to @Gaurav's answer (and related to your comment about SQL database enumeration): you can enumerate all of your databases, on a per-server basis, in a few easy steps.
First, enumerate all of the SQL Database servers in your subscription:
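With the classic Azure SQL cmdlets this can be a one-liner (a sketch; your server names will differ):
Get-AzureSqlDatabaseServer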
Then, for each server, create a connection context and enumerate the databases. Note that, with the Get-Credential cmdlet, I was prompted to enter a username and password via a popup, which I don't show here. For demonstration purposes, I created a brand new server, with only a master database, to show what the output looks like:
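A sketch of those two steps with the classic cmdlets ("abcd1234" is a placeholder server name; Get-Credential prompts for the server's SQL login):
$cred = Get-Credential
# Build a connection context for one server...
$ctx = New-AzureSqlDatabaseServerContext -ServerName "abcd1234" -Credential $cred
# ...and list its databases (a fresh server shows only master).
Get-AzureSqlDatabase -ConnectionContext $ctx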

This sample demonstrates how to automatically get a list of all the resources (VMs, Storage Accounts, Databases, App Services) and their status via PowerShell using certificate authentication.
https://gallery.technet.microsoft.com/Access-Azure-resource-data-ca9cc9f7

I know it's already been answered; however, I have found the Get-AzResource command easy to use, and it fetches all the resources from a particular subscription. Try piping it to ft (Format-Table) for cleaner output:
Get-AzResource | ft

Related

Get Azure VM AD Domain

Is there a way to query the AD domain of a VM on Azure using the REST API? The only way I found was to use a Run Command and use PowerShell on the VM to get the domain name; this, however, has a significant delay and I would like to find a faster method.
Run Command documentation:
https://learn.microsoft.com/en-us/azure/virtual-machines/windows/run-command
You can check the role assignments of the VM using
GET https://management.azure.com/subscriptions/subId/resourcegroups/rgname/providers/resourceProviderNamespace/parentResourcePath/resourceType/resourceName/providers/Microsoft.Authorization/roleAssignments?api-version=2015-07-01
See: https://learn.microsoft.com/en-us/rest/api/authorization/roleassignments/listforresource#roleassignment
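A hedged sketch of that call from PowerShell, assuming you already have an ARM access token and substituting the VM-specific path segments ($subId, $rgName, $vmName and $accessToken are placeholders):
$uri = "https://management.azure.com/subscriptions/$subId/resourceGroups/$rgName/providers/Microsoft.Compute/virtualMachines/$vmName/providers/Microsoft.Authorization/roleAssignments?api-version=2015-07-01"
# List the role assignments scoped to the VM.
Invoke-RestMethod -Uri $uri -Method Get -Headers @{ Authorization = "Bearer $accessToken" }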

Copy Items from one resource group to another in Azure data lake store using powershell

All I want is to copy the data from a development environment to a production environment in Azure Data Lake Store. There is no QA environment.
These are .CSV files; the environments are nothing but different resource groups.
I tried copying the data within the same resource group using the command:
Move-AzureRmDataLakeStoreItem -AccountName "xyz" -Path "/Product_Sales_Data.csv" -Destination "/mynewdirectory"
Which worked fine, however, I want the data movement to take place between two different resource groups.
A possible solution that I have come across is to use the Export command, which downloads the files to the local machine, and then use the Import command to upload them to a different resource group.
Import-AzureRmDataLakeStoreItem
Export-AzureRmDataLakeStoreItem
The reason for using PowerShell is to automate the process of importing/copying the files across different environments, which is nothing but automating the entire deployment process.
The solution mentioned above might help me in taking care of the process but I am looking for a better solution where the local machine or a VM is not required.
You do have a number of options, all of the below will be able to accomplish what you are looking to achieve. Keep in mind that you need to check the limitations of each and weigh the costs. For example, Azure functions have a limited time they can execute (default maximum of 5 minutes) and local storage limitations.
Azure Logic Apps (drag-and-drop configuration)
Azure Data Factory (using the Data Lake linked service)
Azure Functions (using the Data Lake REST API)
You could use Azure Automation and PowerShell to automate your current approach.
Use ADLCopy to copy between lakes (and other stores)
Choosing between them can be opinionated and subjective; a sketch of the Azure Automation/PowerShell approach follows below.
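A hedged sketch of such a runbook body; note that it still stages the file on the worker's temporary disk, which is one of the limitations to weigh (account names and paths are placeholders):
$sourceAccount = "dev-adls"     # assumption: development Data Lake Store account
$targetAccount = "prod-adls"    # assumption: production Data Lake Store account
$tempPath = Join-Path $env:TEMP "Product_Sales_Data.csv"
# Download from the development lake to the worker's temp storage...
Export-AzureRmDataLakeStoreItem -AccountName $sourceAccount -Path "/Product_Sales_Data.csv" -Destination $tempPath
# ...then upload it into the production lake.
Import-AzureRmDataLakeStoreItem -AccountName $targetAccount -Path $tempPath -Destination "/mynewdirectory/Product_Sales_Data.csv"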

Run Powershell script every hour on Azure

I have found this great script which backs up a SQL Azure database to blob storage.
I want to run many different variations of this script - e.g. DB1 goes to Customer1Blob, DB2 goes to Customer2Blob.
I have looked at Scheduler Job Collections. However I can only see options (Action settings) for HTTP(S)/ Storage Queue / Service Bus.
Is it possible to run a specific .ps1 script (with commands) scheduled?
You can definitely run a PowerShell script as a WebJob. If you want to run a script on a schedule, you can add a settings.job file containing a cron expression with your WebJob. The docs for doing so are here.
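A minimal sketch of the settings.job content, written here from PowerShell (Kudu WebJobs use a six-field CRON expression: second minute hour day month day-of-week; this one fires at the top of every hour):
Set-Content -Path ".\settings.job" -Value '{ "schedule": "0 0 * * * *" }'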
For this type of automation task, I prefer to use the Azure Automation service. You can create runbooks using PowerShell and then schedule them with Azure Automation's scheduler. You can have them run "on Azure" so you do not need to use compute power that you pay for (rather, you pay by the minute the job runs), or you can configure them to run with a hybrid worker.
For more information, please see the documentation
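A hedged sketch with the AzureRM.Automation cmdlets, assuming the runbook already exists (resource group, automation account and runbook names are placeholders):
$rg = "my-rg"
$acct = "my-automation-account"
# Create an hourly schedule (the start time must be a few minutes in the future)...
New-AzureRmAutomationSchedule -ResourceGroupName $rg -AutomationAccountName $acct -Name "Hourly" -StartTime (Get-Date).AddMinutes(10) -HourInterval 1
# ...and attach it to the backup runbook.
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName $rg -AutomationAccountName $acct -RunbookName "Backup-SqlAzureToBlob" -ScheduleName "Hourly"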
When exporting from SQL DB or from SQL Server, make sure you are exporting from a quiescent database. Exporting from a database with active transactions can result in data integrity issues - data being added to various tables while they are also being exported.

New SQL Azure databases are not visible in the portal nor via the PowerShell cmdlets

Last week I created 8 databases on a V12 SQL Azure server via PowerShell and ARM templates, and it worked fine. We started to use these databases in SQL Management Studio and have set up users and tables etc. There is some data in them and we can select and update as expected. In short, they work!
But today I wanted to apply some resource locks to the databases using the Azure PowerShell cmdlet New-AzureRmResourceLock, but I'm finding that the command Get-AzureRmResource | Where-Object {$_.ResourceType -eq "Microsoft.Sql/servers/databases"} does not return the databases I'm looking for!
Also, I now look in the portal https://portal.azure.com and I see the SQL Servers listed, and when I enter the blade for my SQL server I see the databases. But if I click on a DB I'm led to a "not found" resource. Also, when using the SQL Databases blade I don't see any of the databases listed.
As an aside if I log on to the classic portal https://manage.windowsazure.com I can see the sql server and see all the databases, and click on them and configure them.
I don't really want to have to recreate all these databases as we have started to set them up with schemas, users and data but do need to be able to use the cmdlets to change them especially to add resource locks to them.
Has anyone seen this before, and what could I try to bring them back so I can use PowerShell to configure them again?
I was in touch with Microsoft support last week and they had a look. This is the resolution.
From: Microsoft support Email
I suspect that our case issue derives from stale subscription cache.
In summary, subscription cache can become stale when changes made
within a subscription occur over time. In an effort to mitigate our
case issue, I have refreshed the subscription cache from the backend.
After they had a look it was sorted out that day; both the portal and, more importantly, the command line are fixed.
Thanks All
Please provide your subscription id, server name and missing database names and I will have this investigated. Apologies for the inconvenience. You can send details to me at bill dot gibson at microsoft . com.

How can I manage Azure Table storage with PowerShell?

I have already searched for this but I can't get it to work.
I already have access to the table storage and can list all tables.
How can I now update a row in a specific table in Azure using PowerShell?
Here is a pure PowerShell + Azure Storage API solution:
https://github.com/chriseyre2000/Powershell/tree/master/Azure2
Please take a look at the Cerebrata Azure Management Cmdlets (http://www.cerebrata.com/Products/AzureManagementCmdlets), which include cmdlets to manage Windows Azure Storage. Another alternative is to consume the Storage Client Library in PowerShell. Please look at this thread for an example: "How do I change the timeout value for Add-Blob Azure cmdlet?" Although it is about uploading blobs with a timeout value, it should give you an idea about consuming the Storage Client Library in PowerShell.
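As a hedged sketch of the Storage Client Library route (assuming the Microsoft.WindowsAzure.Storage.dll assembly is available locally; the account name, key, table name, entity keys and property are placeholders), a merge update of a single entity looks roughly like this:
Add-Type -Path ".\Microsoft.WindowsAzure.Storage.dll"
$account = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse("DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<key>")
$table = $account.CreateCloudTableClient().GetTableReference("MyTable")
# Fetch the existing entity by PartitionKey / RowKey...
$result = $table.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Retrieve("pk1", "rk1"))
$entity = $result.Result
# ...change a property and write it back as a merge operation.
$entity.Properties["Status"] = [Microsoft.WindowsAzure.Storage.Table.EntityProperty]::GeneratePropertyForString("Done")
$table.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Merge($entity)) | Out-Null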
Hope this helps.
There is also the Pipeworks library http://powershellpipeworks.com/
Follow the "connecting the clouds" examples.