Add-AzureRmServiceFabricNodeType -> 'accountName' cannot be null - azure-service-fabric

I'm trying to use the 'Add-AzureRmServiceFabricNodeType' command to add a new nodeType to an existing service fabric cluster. This is my command:
Add-AzureRmServiceFabricNodeType -ResourceGroupName "$ResourceGroupName$" -Name "$ClusterName$" -NodeType "$TypeName$" -VmSku "Standard_H8" -Capacity 3 -VmUserName "$UserName$" -VmPassword $pwd
I had already logged in and set the subscription using 'Login-AzureRmAccount' and 'Set-AzureRmContext'.
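Roughly, that setup looked like this (the subscription name below is a placeholder, not my real one):
# Sign in and select the target subscription before managing the cluster
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionName "MySubscription"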
The call runs for ~1hr and then returns the following error:
WARNING: Rolling back the changes to the cluster
Add-AzureRmServiceFabricNodeType : 'accountName' cannot be null.
At line:1 char:1
+ Add-AzureRmServiceFabricNodeType -ResourceGroupName "%ResourceGroupName% ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Add-AzureRmServiceFabricNodeType], ValidationException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ServiceFabric.Commands.AddAzureRmServiceFabricNodeType
I have successfully added a node type to this cluster in the past, but back then I didn't set the -VmSku option. Besides that difference, the subscription has since been upgraded from a pay-as-you-go subscription to an Enterprise Agreement. Based on the error received, I guess it might have something to do with that, but I can't seem to find out what exactly.
Any ideas?

I will assume you provided the password as a secure string, e.g.:
$password = ConvertTo-SecureString -String 'Password$123456' -AsPlainText -Force
I also suggest you add -Tier to your command, because the provisioning process requires the SKU, tier, and capacity. If you don't provide one, it will use the default, and the SKU might not be compatible with the default tier or with what is available to your account.
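A minimal sketch of the command with an explicit tier, reusing the placeholders from your command (I'm assuming the Standard tier here; adjust it to whatever matches your SKU):
# Same command, now with -Tier stated explicitly
Add-AzureRmServiceFabricNodeType -ResourceGroupName "$ResourceGroupName$" -Name "$ClusterName$" -NodeType "$TypeName$" -VmSku "Standard_H8" -Tier "Standard" -Capacity 3 -VmUserName "$UserName$" -VmPassword $pwd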
You can also check in the Azure portal whether the VM scale set (VMSS) for the new node type gets created once you run the command.
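Or check from PowerShell; a quick sketch, assuming the AzureRM.Compute module and the same resource group placeholder as above:
# List the VM scale sets in the cluster's resource group; each node type is backed by one
Get-AzureRmVmss -ResourceGroupName "$ResourceGroupName$"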
If you want to investigate further, I would recommend reading the source code for the operation the command executes when adding node types.

Move users from SFB on-prem to Teams\SFBO

I updated SFB on-prem to CU9, but for some reason I'm unable to move a test user to Teams or SFBO. Is anyone else facing this issue?
I tried using PS commands and the SFB admin portal, but got the same results:
$cred=Get-Credential
$url="https://admxxxx.online.lync.com/HostedMigration/hostedmigrationService.svc"
Move-CsUser -Identity teamstestuser02@xxx.com -Target sipfed.online.lync.com -Credential $cred -HostedMigrationOverrideUrl $url
Move-CsUser : Unable to connect to some of the servers in pool
"XXX.com" due to a Distributed Component Object Model (DCOM) error.
Verify that Front End service is running on servers in this pool. If
the pool is set up for load balancing, verify that load balancer is
configured correctly. At line:2 char:1
+ Move-CsUser -Identity teamstestuser02#XXXX.com -Target "sip ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (CN=TeamsTestUse...p,xx.xxcom:OCSADUser) [Move-CsUser],
MoveUserException
+ FullyQualifiedErrorId : MoveError,Microsoft.Rtc.Management.AD.Cmdlets.MoveOcsUserCmdlet
Have you run this on the Front End server itself?
I have had hits and misses with Move-CsUser when it targets Skype Online.
Potential mitigations I found:
-Run it on the FE
-Specify the undocumented switch -DomainController to point it at a writable DC
https://learn.microsoft.com/en-us/skypeforbusiness/hybrid/move-users-between-on-premises-and-cloud should have all the juice you need. Make sure that the user you are running this command as (i.e. logged into the FE) has CsServerAdministrator, and that the $Cred has Global Admin, or User Admin + Skype for Business Admin.
Hope that helps :)
I used the -UseOAuth switch on the Front End server and the issue was resolved. Some users returned rollback errors, and nothing worked for those users until I used the -Force switch, after which they lost their contacts and meeting info.
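For reference, a sketch of those calls, reusing the variables from the question (I'm assuming the same identity and URL; -Force discards the user's scheduled meetings and contacts, so treat it as a last resort):
# Retry the move with OAuth authentication, run from the Front End server
Move-CsUser -Identity teamstestuser02@xxx.com -Target sipfed.online.lync.com -Credential $cred -HostedMigrationOverrideUrl $url -UseOAuth
# Last resort for users stuck on rollback errors (loses contacts and meeting info)
Move-CsUser -Identity teamstestuser02@xxx.com -Target sipfed.online.lync.com -Credential $cred -HostedMigrationOverrideUrl $url -UseOAuth -Force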

Set-AzureRmSqlDatabase command is failing while lowering the Azure SQL DB pricing tier due to key vault soft delete

I'm using an automated PowerShell script to downgrade the pricing tier of a database backup copy. When I run the command below, the tier downgrade fails. The error, if I understand it correctly, refers to a key vault key named the same as my server; since no such key exists (and hence soft delete cannot be enabled for it), the command fails.
The command was set up at a time when my application was not set up with Key Vault, and it seems that is why it is failing now.
Command used:
Set-AzureRmSqlDatabase -DatabaseName <*Back-up DB name*> -ServerName <*SQL server name*> -ResourceGroupName <*Resource Group name*> -Edition Standard -RequestedServiceObjectiveName S0
Error:
Set-AzureRmSqlDatabase : 45377: The provided Key Vault uri
'https://****.vault.azure.net/keys/<SERVERNAME>/<Subscription/some
ID> is not valid. Please ensure the key vault has been configured
with soft-delete. (https://aka.ms/sqltdebyoksoftdelete) At line:1
char:2
+ Set-AzureRmSqlDatabase -DatabaseName <Back-up DB name> -ServerName <SQL server name>...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Set-AzureRmSqlDatabase], CloudException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.Sql.Database.Cmdlet.SetAzureSqlDatabase
Questions:
1. Why is the Set-AzureRmSqlDatabase command referring to a key vault URI when one is not mentioned explicitly?
2. Is there an option we need to set at the server/DB level to allow this command to read the server/DB name directly rather than searching for a key with the server name?
3. Is this in any way related to Transparent Data Encryption?
4. Are we supposed to make any changes to this command if the key vault is added after the application/DB set-up?
This issue was purely related to TDE (Transparent Data Encryption), as I suspected. Since the Azure SQL databases were secured with TDE, the key vault was also expected to have soft delete enabled so that any deleted keys could be recovered.
While trying to enable soft delete, I found that the Azure PowerShell installed on my machine didn't support the soft delete property.
I was able to resolve this issue with the following steps:
Upgrade PowerShell:
Installation Package
Log in to your Azure subscription and run this command to check whether soft delete is already enabled:
$vault = Get-AzureRmKeyVault -VaultName myvault; $vault.EnableSoftDelete
If the above doesn't work, run the commands below. They find the resource ID of the key vault and then enable soft delete:
($resource = Get-AzureRmResource -ResourceId (Get-AzureRmKeyVault -VaultName "YourKeyVaultNameHere").ResourceId).Properties | Add-Member -MemberType "NoteProperty" -Name "enableSoftDelete" -Value "true"
Set-AzureRmResource -ResourceId $resource.ResourceId -Properties $resource.Properties
Verify that key vault soft delete is enabled with the command below:
Get-AzureRmKeyVault -VaultName "YourKeyVaultNameHere"
Hope this is helpful for someone facing a similar issue.
Here are some personal observations for your reference.
First, per my test, the command works fine on my side.
Note: My test environment is a plain SQL server and database without anything else set up, such as Transparent Data Encryption.
Set-AzureRmSqlDatabase -DatabaseName joydatabase -ServerName joydb -ResourceGroupName joywebapp -Edition Standard -RequestedServiceObjectiveName S0
Why is the Set-AzureRmSqlDatabase command referring to a key vault URI when one is not mentioned explicitly?
On my side, I captured the request via Fiddler, and it does not refer to a key vault URL.
Is there an option we need to set at the server/DB level to allow this command to read the server/DB name directly rather than searching for a key with the server name?
On my side, I don't think we need to do so.
Is this in any way related to Transparent Data Encryption?
I think there is a good chance it is related to it. You could create a new SQL server and database to have a try; see the sketch below. There is an article about Transparent Data Encryption for Azure SQL Server that you could refer to.
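If you want to confirm whether TDE is in play on your side, a quick sketch (the cmdlet is from the AzureRM.Sql module; the resource names below are placeholders):
# Show whether Transparent Data Encryption is enabled on the database
Get-AzureRmSqlDatabaseTransparentDataEncryption -ResourceGroupName "MyResourceGroup" -ServerName "myserver" -DatabaseName "MyBackupDb"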
Are we supposed to make any changes to this command if the key vault is added after the application/DB set-up?
I don't think any change to this command is needed.

Deciphering error after running New-AzureRmDataLakeAnalyticsCatalogCredential

I am working on reading Azure SQL data from a U-SQL script in Data Lake, which means creating an external data source in U-SQL. Part of this effort involves creating a "credential" using PowerShell. I am following this guidance:
https://learn.microsoft.com/en-us/powershell/module/azurerm.datalakeanalytics/new-azurermdatalakeanalyticscatalogcredential
But I am stuck on the error shown below. This part of it has me especially stumped: "The resource '' does not exist.". Because of the "-Credential (Get-Credential)" part of the command, I get prompted for a login name and password; after that, I am presented with the error.
Please help me decipher this situation.
Thank you!
Eric
C:\WINDOWS\system32> New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "" `
-DatabaseName "<MYDBNAME>" `
-CredentialName "<MYCREDENTIALNAME>" `
-Credential (Get-Credential) `
-Uri "http://<MYSERVERNAME>.database.windows.net:1433"
cmdlet Get-Credential at command pipeline position 1
Supply values for the following parameters:
Credential
WARNING: The output type defined for this cmdlet is incorrect and will be updated to reflect what is actually returned
(and defined in the help) in a future release.
New-AzureRmDataLakeAnalyticsCatalogCredential : The resource '' does not exist. Trace:
c3e04b2a-2690-4c5e-b61c-58a5ded93c6b Time: 2017-05-10T09:09:07.8971058-07:00
At line:1 char:1
+ New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "bladlalog ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [New-AzureRmData...talogCredential], CloudException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.DataLakeAnalytics.NewAzureDataLakeAnalyticsCatalogCredential
Solved by using a new ADL database name, one that I created, and not the source Azure SQL database. With that, I got "az dla catalog credential create" to run cleanly. The online documentation I initially found was not clear in indicating that the expected database name is an ADL catalog database, not the source Azure SQL database.
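For reference, a sketch of the PowerShell call under that interpretation; the account, database, credential, and server names below are placeholders, and -DatabaseName must point at a U-SQL catalog database in the Data Lake Analytics account, not at the Azure SQL database:
# Create the credential inside an ADL catalog database that already exists
New-AzureRmDataLakeAnalyticsCatalogCredential -AccountName "myadlaaccount" `
-DatabaseName "MyAdlCatalogDb" `
-CredentialName "MyCredential" `
-Credential (Get-Credential) `
-Uri "http://myserver.database.windows.net:1433"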

How to deploy to Azure with PowerShell?

I want to deploy my application to Azure with PowerShell. So far I have created a certificate in the LocalMachine store (I'm not going to run the deploy script as myself) and uploaded the certificate to Azure. The next step is to get access to the service on Azure in PowerShell, but there it fails. The script I have so far is:
$cert = Get-Item Cert:\LocalMachine\deploy\xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Set-AzureSubscription -SubscriptionName $subscriptionName -SubscriptionId $subscriptionId -Certificate $cert
Select-AzureSubscription $subscriptionName
$service = Get-AzureService $azureId
It fails on the last row with the following message:
Get-AzureService : Communication could not be established. This could be due to an invalid subscription ID. Note that subscription IDs are case sensitive.
At F:\DeployTest\deploy.ps1:9 char:12
+ $service = Get-AzureService $azureId
+ ~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Get-AzureService], Exception
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Management.ServiceManagement.HostedServices.GetAzureServiceCommand
Get-AzureService : HTTP Status Code: AuthenticationFailed - HTTP Error Message: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.
Operation ID:
At F:\DeployTest\deploy.ps1:9 char:12
+ $service = Get-AzureService $azureId
+ ~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Get-AzureService], CommunicationException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Management.ServiceManagement.HostedServices.GetAzureServiceCommand
I really don't know what the problem is; the certificate I'm trying to use is uploaded, so it feels like there is something fundamental I've missed.
Update: I did get it to work after downloading the .publishsettings file and importing that (sketched below) instead of trying to use Set-AzureSubscription. I'm still a little bit confused, though; shouldn't it be possible to use the method I tried above?
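For reference, a sketch of the .publishsettings flow that worked; the file path is a placeholder, and the cmdlets are from the classic Azure Service Management module:
# Download the .publishsettings file for the subscription (opens a browser)
Get-AzurePublishSettingsFile
# Import the downloaded file, then select the subscription as before
Import-AzurePublishSettingsFile "C:\secrets\MySubscription.publishsettings"
Select-AzureSubscription $subscriptionName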
I finally found the problem, and of course it was a user problem. When I was in the Azure portal I didn't find where to upload the certificate, so I uploaded it to the first place I found that mentioned certificates. It turned out that this area was the wrong one: I had uploaded the certificate to the certificates area under the cloud service I wanted to administer, which is the wrong place.
The correct place to upload the certificate is under Settings (management certificates) in the Azure admin portal. So the code above works if the certificate is uploaded to the correct location.

Service Bus 1.0 Beta New-SBFarm : The specified directory service attribute or value does not exist

I am trying to register a new SB Farm following the procedure from http://msdn.microsoft.com/en-us/library/windowsazure/jj193021(v=azure.10).aspx
However, when I try to execute the first part:
$mycert = ConvertTo-SecureString -AsPlainText -Force -String 'password1'
New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated Security=True;" -CertAutoGenerationKey $mycert
I receive the following error:
New-SBFarm : The specified directory service attribute or value does not exist.
At line:1 char:1
+ New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-SBFarm], COMException
+ FullyQualifiedErrorId : System.Runtime.InteropServices.COMException,Microsoft.ServiceBus.Commands.NewSBFarm
I have ensured my server meets the application requirements, but I'm still having no luck.
Any help, please?
That line contains two separate cmdlet calls: one to create the secure string, and a second to create the new farm. Are you executing each call separately? From the pasted code, it seems like you're running them as one command, which would probably fail.
Also, New-SBFarm has a switch to show verbose tracing. Could you try running the command with -Verbose at the end, as sketched below, and share the output?
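A minimal sketch of what I mean, with the two statements separated and verbose tracing enabled (same connection string and password as in your post):
# Statement 1: build the secure string for the certificate auto-generation key
$mycert = ConvertTo-SecureString -AsPlainText -Force -String 'password1'
# Statement 2: create the farm, with verbose tracing
New-SBFarm -FarmMgmtDBConnectionString "data source=.\SQLEXPRESS;Integrated Security=True;" -CertAutoGenerationKey $mycert -Verbose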