Deciphering error after running New-AzureRmDataLakeAnalyticsCatalogCredential - powershell

I am working on reading Azure SQL data into Data Lake from a U-SQL script, which requires creating an external data source in U-SQL. Part of this effort involves creating a "credential" using PowerShell. I am following this guidance:
https://learn.microsoft.com/en-us/powershell/module/azurerm.datalakeanalytics/new-azurermdatalakeanalyticscatalogcredential
However, I am stuck on the error shown below, and this part of it has me especially stumped: "The resource '' does not exist.". Because the command includes "-Credential (Get-Credential)", I am prompted for a login name and password; after entering them, I am presented with the error.
Please help me decipher this situation.
Thank you!
Eric
C:\WINDOWS\system32> New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "" `
-DatabaseName "<MYDBNAME>" `
-CredentialName "<MYCREDENTIALNAME>" `
-Credential (Get-Credential) `
-Uri "http://<MYSERVERNAME>.database.windows.net:1433"
cmdlet Get-Credential at command pipeline position 1
Supply values for the following parameters:
Credential
WARNING: The output type defined for this cmdlet is incorrect and will be updated to reflect what is actually returned
(and defined in the help) in a future release.
New-AzureRmDataLakeAnalyticsCatalogCredential : The resource '' does not exist. Trace:
c3e04b2a-2690-4c5e-b61c-58a5ded93c6b Time: 2017-05-10T09:09:07.8971058-07:00
At line:1 char:1
+ New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "bladlalog ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [New-AzureRmData...talogCredential], CloudException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.DataLakeAnalytics.NewAzureDataLakeAnalyticsCatalogCredential

Solved by using the name of a new ADL catalog database that I created, rather than the source Azure SQL database. With that change, "az dla catalog credential create" ran cleanly. The documentation I initially found did not make it clear that the expected database name is an ADL catalog database, not the source Azure SQL database.
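For reference, a minimal sketch of the corrected call. All names below are placeholders, and the key assumption is that -DatabaseName refers to a U-SQL catalog database that already exists in the Data Lake Analytics account (created, for example, with a CREATE DATABASE U-SQL statement), not the source Azure SQL database:
# Placeholder names; -DatabaseName must be a U-SQL catalog database in the
# Data Lake Analytics account, not the source Azure SQL database.
New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "myadlaaccount" `
    -DatabaseName "MyUsqlCatalogDb" `
    -CredentialName "MySqlDbCredential" `
    -Credential (Get-Credential) `
    -Uri "http://myserver.database.windows.net:1433"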

Related

How to fix "Object reference not set to an instance of an object" error when running Get-AzDataLakeStoreChildItem cmdlet?

I'm getting an error while running an Azure cmdlet in PowerShell. How do I resolve this?
I'm trying to get details of the folders and files present in Azure Data Lake through PowerShell. I'm able to access the Data Lake through the portal and see all the files.
I've tested the connection with "Test-AzDataLakeStoreAccount -Name $Server" and it works fine too. However, when I execute the command below, it throws a null reference exception. How do I resolve that?
Get-AzDataLakeStoreChildItem -Account "****.azuredatalakestore.net" -Path "/"
Get-AzDataLakeStoreChildItem : Object reference not set to an instance of an object.
At line:1 char:1
+ Get-AzDataLakeStoreChildItem -Account "entadls8cc9b872.azuredatalakes ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Get-AzDataLakeStoreChildItem], NullReferenceException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.DataLakeStore.GetAzureDataLakeStoreChildItem
I can reproduce your issue in both Windows PowerShell and Azure Cloud Shell. It looks like a bug in the Az.DataLakeStore PowerShell module.
I tried Get-AzDataLakeStoreChildItem -AccountName "AccountName" -Path "/test/", which matches the sample in the doc, and got the same error. I also tried Get-AzDataLakeStoreItem -AccountName "AccountName" -Path "/test/123.txt" and Test-AzDataLakeStoreItem -AccountName "AccountName" -Path "/test/123.txt"; both produced a similar error.
I found a GitHub issue related to this error: https://github.com/Azure/azure-powershell/issues/8352. I think the format of the commands I tried is correct. One comment in that issue says 'To use datalake az module you have to use it in netcore powershell (not windows powershell)', but as far as I know the Az module is cross-platform, so according to the docs that should not be the cause. Another comment says 'we have fixed this issue. Apologize for the inconvenience. It will be released as part of next release.'
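If the fix has shipped by the time you read this, updating the module may be enough. A minimal sketch, assuming Az.DataLakeStore was installed from the PowerShell Gallery:
# Check which version of Az.DataLakeStore is currently installed.
Get-Module -ListAvailable Az.DataLakeStore

# Pull the latest release from the PowerShell Gallery, then retry in a fresh session.
Update-Module -Name Az.DataLakeStore
Get-AzDataLakeStoreChildItem -Account "****.azuredatalakestore.net" -Path "/"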

Add-AzureRmServiceFabricNodeType -> 'accountName' cannot be null

I'm trying to use the 'Add-AzureRmServiceFabricNodeType' command to add a new nodeType to an existing service fabric cluster. This is my command:
Add-AzureRmServiceFabricNodeType -ResourceGroupName "$ResourceGroupName$" -Name "$ClusterName$" -NodeType "$TypeName$" -VmSku "Standard_H8" -Capacity 3 -VmUserName "$UserName$" -VmPassword $pwd
I have already logged in and set the subscription using 'Login-AzureRmAccount' and 'Set-AzureRmContext'.
The call runs for ~1hr and then returns the following error:
WARNING: Rolling back the changes to the cluster
Add-AzureRmServiceFabricNodeType : 'accountName' cannot be null.
At line:1 char:1
+ Add-AzureRmServiceFabricNodeType -ResourceGroupName "%ResourceGroupName% ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Add-AzureRmServiceFabricNodeType], ValidationException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ServiceFabric.Commands.AddAzureRmServiceFabricNodeType
I have successfully added a node type to this cluster in the past, but at that time I didn't set the -VmSku option. Besides that difference, the subscription has since been upgraded from a pay-as-you-go subscription to an Enterprise Agreement. Based on the error received, I suspect it might have something to do with that, but I can't pin down what exactly.
Any ideas?
I will assume you provided the password as a secure string, e.g.:
$password = ConvertTo-SecureString -String 'Password$123456' -AsPlainText -Force
I also suggest adding -Tier to your command, because the provisioning process requires the SKU, tier, and capacity. If you don't provide one, the default is used, and your SKU might not be compatible with the default tier or with what's available to your account.
You can also check in the Azure portal whether the VM scale set is created once you run the command.
If you want to investigate further, I would recommend reading the source code for the operation that adds node types.
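Putting those suggestions together, a sketch of what the full call might look like; the -Tier value and the resource names here are placeholders, not values taken from the question:
# Build the VM admin password as a secure string.
$pwd = ConvertTo-SecureString -String 'Password$123456' -AsPlainText -Force

# Add the node type, specifying the SKU and the tier explicitly so the defaults
# don't clash with what the subscription can actually provision.
Add-AzureRmServiceFabricNodeType -ResourceGroupName "MyResourceGroup" `
    -Name "MyCluster" `
    -NodeType "nt2" `
    -VmSku "Standard_H8" `
    -Tier "Standard" `
    -Capacity 3 `
    -VmUserName "clusteradmin" `
    -VmPassword $pwd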

Exception exporting Azure ResourceGroup template

I'm trying to export an Azure resource group to a template file using the following cmdlet (Windows 10 x64, module version 2.0), but it throws an exception:
PS> Export-AzureRmResourceGroup -ResourceGroupName 'service-env-rg' -Path .\resourcegroup.json -ErrorAction SilentlyContinue
Export-AzureRmResourceGroup : InternalServerError : Encountered internal server error. Diagnostic information: timestamp '20160809T072241Z', subscription id 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', tracking id 'eed95646-d845-4852-8971-a353bab65db2', request correlation id 'eed95646-d845-4852-8971-a353bab65db2'.
At line:1 char:1
+ Export-AzureRmResourceGroup -ResourceGroupName 'service-e ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Export-AzureRmResourceGroup], ErrorResponseMessageException
+ FullyQualifiedErrorId : InternalServerError,Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.ExportAzureResourceGroupCmdlet
If I run it without -ErrorAction, I get a secondary exception as well about the pipeline being stopped. Has anyone else seen this? I got the same error with the previous version of the cmdlet.
I wonder if this is related to the fact that the 'template export' functionality in the Azure portal doesn't appear to be working properly either; see this very recent forum thread:
https://social.msdn.microsoft.com/Forums/en-US/3200b7a1-768c-4714-a474-ac02df22e729/problem-with-exporting-current-state-arm-template-for-a-resource-group?forum=windowsazuremanagement
After renaming my Microsoft account it now works, both in PowerShell and in the portal. That is, it seems an ARM update has introduced an authentication change so that a login doesn't properly differentiate between accounts even when I select the account type.

Azure PowerShell error: Start-AzureSqlDatabaseCopy throws an exception with Error Code: NotFound

I was trying to make a copy of a database on Azure using PowerShell. I used "Start-AzureSqlDatabaseCopy" as described at https://msdn.microsoft.com/en-us/library/ff951631.aspx, but it fails and the database copy is never created. I even tried deleting an existing database using "Remove-AzureSqlDatabase" and saw the same issue.
I have connected to the subscription successfully using Import-AzurePublishSettingsFile, and verified the connection by providing an invalid server name, which threw the expected exception.
I tried to execute it as below:
Start-AzureSqlDatabaseCopy -ServerName $SourceServerName -DatabaseName $SourceDatabaseName -PartnerServer $TargetServerName -PartnerDatabase $TargetDatabaseName
It throws the exception below:
Start-AzureSqlDatabaseCopy : https://management.core.windows.net/Id/services/sqlservers/servers/server/databases/database/databasecopies does not exist. Error Code: NotFound
At s\Scripts\CreateCIDatabase.ps1:36 char:6
+ Start-AzureSqlDatabaseCopy -ServerName $SourceServerName -DatabaseName $Sou ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Start-AzureSqlDatabaseCopy], CommunicationException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseCopy
I would appreciate it if someone could help me with this issue.
A similar issue has also been posted here: https://stackoverflow.com/questions/29004974/azure-powershell-to-create-database-backup
Similar to this thread, it looks like you have not set your current subscription to the one your database is located in. Try the following.
1) Set the subscription to the subscription your database is on.
Set-AzureSubscription -SubscriptionName <Your Subscription Name>
2) Check to make sure your current subscription is the one you want to use.
Get-AzureSubscription -Current
3) Use the DBCopy cmdlet to start your copy.
Start-AzureSqlDatabaseCopy -ServerName <SourceServer> -DatabaseName <SourceDatabaseName> -PartnerServer <TargetServerName> -PartnerDatabase <TargetDatabaseName>
Hope this helps!

Service Bus 1.0 Beta New-SBFarm : The specified directory service attribute or value does not exist

I am trying to register a new SB Farm following the procedure from http://msdn.microsoft.com/en-us/library/windowsazure/jj193021(v=azure.10).aspx
However when I try to execute the first part
$mycert = ConvertTo-SecureString -AsPlainText -Force -String 'password1'
New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated Security=True;" -CertAutoGenerationKey $mycert
I receive the following error:
New-SBFarm : The specified directory service attribute or value does not exist.
At line:1 char:1
+ New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-SBFarm], COMException
+ FullyQualifiedErrorId : System.Runtime.InteropServices.COMException,Microsoft.ServiceBus.Commands.NewSBFarm
I have ensured my server meets the application requirements; however, I am still having no luck.
Any help please?
That snippet contains two separate cmdlet calls: one to create the secure string, and a second to create the new farm. Are you executing each call separately? From the pasted code, it looks like you may be running them as one command, which would probably fail.
Also, New-SBFarm has a switch to show verbose tracing. Could you try running the command with -Verbose at the end and share the output?
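For what it's worth, a sketch of running the two calls as separate statements with verbose tracing turned on (the connection string is taken from the question; -Verbose is the standard common parameter):
# First statement: build the certificate auto-generation key as a secure string.
$mycert = ConvertTo-SecureString -AsPlainText -Force -String 'password1'

# Second statement: create the farm on its own line, with -Verbose so the trace
# shows where the directory service lookup fails.
New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated Security=True;" `
    -CertAutoGenerationKey $mycert -Verbose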