I have this PowerShell script with which I want to create a new table in an Azure storage account:
Param(
    [string]$rgName,
    [string]$tableName
)
$storcontext = New-AzureStorageContext -ConnectionString '$(MyConnectionString)'
if(!(Get-AzureStorageTable -Name $tableName -Context $storcontext))
{
    New-AzureStorageTable -Name $tableName -Context $storcontext
}
The New-AzureStorageTable command works perfectly. However, I tried adding a check to see whether the table already exists, but on the Get command PowerShell throws an error saying the table does not exist.
What I want to do is check whether the table exists and, if not, create it.
Is there another way of doing this?
The cmdlet throws an error if the table doesn't exist, so you can set the ErrorAction to SilentlyContinue and specify a variable for the error, which you can then check:
Get-AzureStorageTable -Name $tableName -Context $storcontext -ErrorVariable ev -ErrorAction SilentlyContinue
if ($ev) {
New-AzureStorageTable -Name $tableName -Context $storcontext
}
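Alternatively, you can suppress the error and test the return value for $null instead. This is just a minimal sketch using the same cmdlets and variables as above:
# Suppress the not-found error; the cmdlet then returns $null when the table is missing
$existing = Get-AzureStorageTable -Name $tableName -Context $storcontext -ErrorAction SilentlyContinue
if ($null -eq $existing)
{
    New-AzureStorageTable -Name $tableName -Context $storcontext
}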
I need to copy tables from table storage into a different storage account. When attempting to execute AzCopy, I'm getting the following exception:
The term 'AzCopy' is not recognized as a name of a cmdlet, function,
script file, or executable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
I'm connected to the terminal from the portal and have a PowerShell prompt.
The issue seems to be with this line:
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
How do we run AzCopy command in the terminal in the Azure portal?
Here's the full PowerShell script that I'm attempting to execute:
# This simple PowerShell script copies one or more Azure storage tables from one storage account into another
#
# Dependencies :
# https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy
# https://learn.microsoft.com/en-us/powershell/azure/overview?view=azps-1.6.0
#
# Usage :
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable All
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable Table1,Table2,Table3
function Copy-AzureStorageTable
{
param
(
[parameter(Mandatory=$true)]
[String]
$SrcStorageName,
[parameter(Mandatory=$true)]
[String]
$SrcAccessKey,
[parameter(Mandatory=$true)]
[String]
$DstStorageName,
[parameter(Mandatory=$true)]
[String]
$DstAccessKey,
[parameter(Mandatory=$true)]
[String[]]
$IncludeTable
)
# Check if logged in
Azure-Login
# Source Account Storage Parameters
$SrcContext = New-AzureStorageContext -StorageAccountName $SrcStorageName -StorageAccountKey $SrcAccessKey
$SrcBaseUrl = "https://" + $SrcStorageName + ".table.core.windows.net/"
# Destination Account Storage Parameters
$DstContext = New-AzureStorageContext -StorageAccountName $DstStorageName -StorageAccountKey $DstAccessKey
$DstTempContainer = "temptable"
$DstBlobUrl = "https://" + $DstStorageName + ".blob.core.windows.net/$DstTempContainer"
$DstTableUrl = "https://" + $DstStorageName + ".table.core.windows.net"
# Create container in destination blob
Write-Host "$DstTempContainer is not existing in $DstStorageName..."
Write-Host "Creating container $DstTempContainer in $DstStorageName..."
New-AzureStorageContainer -Name $DstTempContainer -Permission Off -Context $DstContext
# Get all tables from source
$SrcTables = Get-AzureStorageTable -Name "*" -Context $SrcContext
foreach($table in $SrcTables)
{
$TableName = $table.Name
Write-Host "Table $TableName"
# Skip this table unless "All" was specified or the table name is in our include list
if(!$IncludeTable.Contains("All") -and !$IncludeTable.Contains($TableName))
{
    Write-Host "Skipping table $TableName"
    continue # 'return' here would exit the whole function instead of skipping this table
}
Write-Host "Migrating Table $TableName"
$SrcTableUrl = $SrcBaseUrl + $TableName
# Copy table from source to destination blob. As far as I know there is no way to copy a table to a table directly.
# Instead, we copy the table temporarily into the destination blob container.
# Note: use the actual path to AzCopy.exe here.
Write-Host "Start exporting table $TableName..."
Write-Host "From : $SrcTableUrl"
Write-Host "To : $DstBlobUrl/$TableName"
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
# Get the newly created blob
Write-Host "Get all blobs in $DstTempContainer..."
$CurrentBlob = Get-AzureStorageBlob -Container $DstTempContainer -Prefix $TableName -Context $DstContext
# Loop and check manifest, then import blob to table
foreach($blob in $CurrentBlob)
{
# Only manifest blobs drive the import; skip everything else
if(!$blob.Name.contains('.manifest'))
{
    continue # 'return' here would exit the whole function
}
$manifest = $($blob.Name).split('/')[1]
Write-Host "Start importing $TableName..."
Write-Host "Source blob url : $DstBlobUrl/$TableName"
Write-Host "Dest table url : $DstTableUrl/$TableName"
Write-Host "Manifest name : $manifest"
# Import blob to table. Insert entity if missing and update entity if exists
AzCopy /Source:$DstBlobUrl/$TableName `
/Dest:$DstTableUrl/$TableName `
/SourceKey:$DstAccessKey `
/DestKey:$DstAccessKey `
/Manifest:$manifest `
/EntityOperation:"InsertOrReplace"
}
}
# Delete temp table storage after export and import process
Write-Host "Removing $DstTempContainer from destination blob storage..."
Remove-AzureStorageContainer -Name $DstTempContainer -Context $DstContext -Force
}
# Login
function Azure-Login
{
$needLogin = $true
Try
{
$content = Get-AzureRmContext
if ($content)
{
$needLogin = ([string]::IsNullOrEmpty($content.Account))
}
}
Catch
{
if ($_ -like "*Login-AzureRmAccount to login*")
{
$needLogin = $true
}
else
{
throw
}
}
if ($needLogin)
{
Login-AzureRmAccount
}
}
My solution was just running the following command in PowerShell:
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
After that, azcopy is recognized and the script works. Hope it helps.
As of 2021-11-10, the Azure Portal Cloud Shell uses AzCopy version 10.6.1. The ability to copy between tables was removed after version 7.3.
You need to run the script from a machine where you can download the older version of AzCopy.
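For example, after installing the older, table-capable AzCopy (v7.x/8.x) locally, you can call the executable by its full path rather than relying on PATH. The path below is the usual Windows install location, but treat it as an assumption and adjust it to wherever AzCopy.exe landed on your machine:
# Assumed default install path for the older AzCopy
$azCopy = "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
& $azCopy /Source:$SrcTableUrl `
    /Dest:$DstBlobUrl/$TableName `
    /SourceKey:$SrcAccessKey `
    /DestKey:$DstAccessKey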
A seemingly simple task: I just want to rename a blob file. I know I have to copy it to rename it or something, then delete the original, but this is proving tricky. I have created the storage context (New-AzureStorageContext), got the blob (Get-AzureStorageBlob), and found Start-AzureStorageBlobCopy, but how do I actually rename it?
I'd like to do this within the same container if possible. Ideally I'd run it in an Azure Runbook and call it using a webhook in Azure Data Factory v2. I did try to rename the file using 'Add Dynamic Content' in the copy job sink in ADFv2, but I don't think you can. By the way, I just want to append the date to the existing file name. Thank you.
You can use my Rename-AzureStorageBlob convenience function:
function Rename-AzureStorageBlob
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
[Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob]$Blob,
[Parameter(Mandatory=$true, Position=1)]
[string]$NewName
)
Process {
    # Copy the blob to the new name within the same container
    $blobCopyAction = Start-AzureStorageBlobCopy `
        -ICloudBlob $Blob.ICloudBlob `
        -DestBlob $NewName `
        -Context $Blob.Context `
        -DestContainer $Blob.ICloudBlob.Container.Name
    # Poll until the copy has completed
    $status = $blobCopyAction | Get-AzureStorageBlobCopyState
    while ($status.Status -ne 'Success')
    {
        $status = $blobCopyAction | Get-AzureStorageBlobCopyState
        Start-Sleep -Milliseconds 50
    }
    # Remove the original blob to complete the rename
    $Blob | Remove-AzureStorageBlob -Force
}
}
It accepts the blob as pipeline input so you can pipe the result of the Get-AzureStorageBlob to it and just provide a new name:
$connectionString= 'DefaultEndpointsProtocol=https;AccountName....'
$storageContext = New-AzureStorageContext -ConnectionString $connectionString
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt'|
Rename-AzureStorageBlob -NewName 'MyNewBlob.txt'
To append the date to the existing file name you can use something like:
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' | ForEach-Object {
$_ | Rename-AzureStorageBlob -NewName "$($_.Name)$(Get-Date -f "FileDateTime")" }
Further reading: Rename Azure Storage Blob using PowerShell
I am attempting to loop through an Invoke-Sqlcmd for multiple Azure SQL databases via Azure Automation. The first item in the loop executes, but all the rest fail with:
Invoke-Sqlcmd : A network-related or instance-specific error occurred
while establishing a connection to SQL Server. The server was not
found or was not accessible. Verify that the instance name is correct
and that SQL Server is configured to allow remote connections.
(provider: Named Pipes Provider, error: 40 - Could not open a
connection to SQL Server)
I am guessing that I need to close the connection from the first Invoke-Sqlcmd before executing the next, but have not found a direct method to accomplish that with Invoke-Sqlcmd. Here is my loop:
param(
# Parameters to Pass to PowerShell Scripts
[parameter(Mandatory=$true)][String] $azureSQLServerName = "myazuresql",
[parameter(Mandatory=$true)][String] $azureSQLCred = "myazureautosqlcred"
)
# DB Name Array
$dbnamearray = @("database1","database2","database3")
$dbnamearray
# Datatable Name
$tabName = "RunbookTable"
#Create Table object
$table = New-Object system.Data.DataTable "$tabName"
#Define Columns
$col1 = New-Object system.Data.DataColumn dbname,([string])
#Add the Columns
$table.columns.add($col1)
# Add Row and Values for dname Column
ForEach ($db in $dbnamearray)
{
$row = $table.NewRow()
$row.dbname = $db
#Add the row to the table
$table.Rows.Add($row)
}
#Display the table
$table | format-table -AutoSize
# Loop through the datatable using the values per column
$table | ForEach-Object {
# Set loop variables as these are easier to pass than $_.
$azureSQLDatabaseName = $_.dbname
# Execute SQL Query Against Azure SQL
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
$SQLOutput = $(Invoke-Sqlcmd -ServerInstance $azureSQLServerName -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database $azureSQLDatabaseName -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES " -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4>&1
Write-Output $SQLOutput
}
You can try making each connection a PowerShell job. This solved a very similar issue I had some time ago (see "Send-MailMessage closes every 2nd connection when using attachments" if you want an explanation). Basically, if you're unable to use a .Close() method, you can force connections to close by terminating the entire session for each run. In an ideal world the cmdlet would handle all of this for you, but not everything was created perfectly.
# Loop through the datatable using the values per column
$table | ForEach-Object {
# Set loop variables as these are easier to pass than $_.
$azureSQLDatabaseName = $_.dbname
# Execute SQL Query Against Azure SQL
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
# Pass the needed parameters via -ArgumentList and start the job.
# Note the database name must be passed in as well: $args[0] is the server, $args[1] the credential, $args[2] the database.
Start-Job -ScriptBlock {
    Write-Output $(Invoke-Sqlcmd -ServerInstance $args[0] -Username $args[1].UserName -Password $args[1].GetNetworkCredential().Password -Database $args[2] -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES" -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4>&1
} -ArgumentList $azureSQLServerName, $Cred, $azureSQLDatabaseName | Wait-Job | Receive-Job
}
This is untested since I do not have a server to connect to, but perhaps with a bit of work you can make something out of it.
I faced the same issue previously while doing something with an Azure SQL database. You can try this:
1. Create Automation Account
New-AzureRmAutomationAccount -ResourceGroupName $resourceGroupName -Name $automationAccountName -Location $location
2. Set the Automation account to work with
Set-AzureRmAutomationAccount -Name $automationAccountName -ResourceGroupName $resourceGroupName
3. Create / Import a Runbook
Here we already have a runbook ready, so we import it. Here's the runbook code:
workflow runbookNameValue
{
inlinescript
{
$MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
$MasterDatabaseConnection.ConnectionString = "ConnectionStringValue"
# Open connection to Master DB
$MasterDatabaseConnection.Open()
# Create command
$MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
$MasterDatabaseCommand.Connection = $MasterDatabaseConnection
$MasterDatabaseCommand.CommandText = "Exec stored procedure"
# Execute the query
$MasterDatabaseCommand.ExecuteNonQuery()
# Close connection to Master DB
$MasterDatabaseConnection.Close()
}
}
4. Importing
Import-AzureRMAutomationRunbook -Name $runBookName -Path $scriptPath -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName -Type PowerShell
I hope this helps. Instead of using Invoke-Sqlcmd, use $MasterDatabaseCommand.ExecuteNonQuery() as shown in the runbook; it will work.
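Applied to the original loop (which runs a SELECT rather than a stored procedure), that approach looks roughly like this. This is only a sketch, assuming the same $azureSQLServerName, $azureSQLCred, and $dbnamearray as in the question; since the query returns rows, a data adapter is used instead of ExecuteNonQuery():
$server = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
foreach ($db in $dbnamearray)
{
    # One explicitly opened and closed connection per database
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=tcp:$server,1433;Database=$db;User ID=$($Cred.UserName);Password=$($Cred.GetNetworkCredential().Password);Encrypt=True;"
    $conn.Open()
    $cmd = New-Object System.Data.SqlClient.SqlCommand
    $cmd.Connection = $conn
    $cmd.CommandText = "SELECT * FROM INFORMATION_SCHEMA.TABLES"
    # Capture the result set of the SELECT into a DataTable
    $result = New-Object System.Data.DataTable
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter($cmd)
    [void]$adapter.Fill($result)
    $result | Format-Table -AutoSize
    # Close explicitly so the next iteration gets a fresh connection
    $conn.Close()
}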
It seems that you append .database.windows.net to the server name inside the loop, so each iteration appends the suffix again; the second iteration then targets myazuresql.database.windows.net.database.windows.net, which cannot be resolved. I guess that's why it works for the first iteration only.
Just move this line:
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
before this line:
$table | ForEach-Object {
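The corrected loop then looks like this (the credential lookup can move out of the loop too, since it doesn't change per database):
# Append the DNS suffix once, outside the loop
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
$table | ForEach-Object {
    $azureSQLDatabaseName = $_.dbname
    $SQLOutput = $(Invoke-Sqlcmd -ServerInstance $azureSQLServerName -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database $azureSQLDatabaseName -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES" -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4>&1
    Write-Output $SQLOutput
}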
I have some existing data in Azure Table storage. When I deploy a CSV file, the latest changes are deployed, but the data that already exists in the table is not overwritten or deleted. For example: the table already contains 3 rows; when I deploy a CSV file with 5 rows, those 5 rows are deployed, but the old 3 rows are not deleted. The data should be overwritten, but that's not happening. Please help me. Subscription details:
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US"
$tableName = "TestTable"
# Get the storage key for the storage account
$storageAccountKey = "12345678990"
# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
#If the table exists, start deleting its entities.
if ($table -ne $null)
{
Get-AzureStorageTableRowAll -table $table | Remove-AzureStorageTableRow -table $table -Context $ctx
}
The Get-AzureStorageTableRowAll command is in the AzureRmStorageTable module, so install that module before calling the command.
Add these commands to your script to install it:
Install-PackageProvider -Name NuGet -Force -Scope CurrentUser
Install-Module -Name AzureRmStorageTable -Force -Verbose -Scope CurrentUser
I want to create a container through Windows PowerShell and therefore try to obtain the account key via:
$storageAccountKey = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName | %{ $_.Key1 }
I'm getting the error:
New-AzureStorageContext : Cannot validate argument on parameter 'StorageAccountKey'. The argument is null or empty. Provide an argument that is not null or empty, and then try the
command again.
This should work:
$storageAccountKey = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName | ? { $_.KeyName -eq 'Key1' } | % { $_.Value }
The problem is that the result is not a hashtable but a .NET generic list, so it is not possible to access the value directly via the key name.
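To illustrate the difference (a sketch; the key names returned by AzureRM are typically 'key1' and 'key2'):
$keys = Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
$keys[0].KeyName   # e.g. 'key1'
$keys[0].Value     # the actual key material
# $keys.Key1 yields nothing, since no element has a 'Key1' property,
# which is why New-AzureStorageContext complained about a null StorageAccountKey.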