I am running the PowerShell script below to deploy CSV file data into Azure Table storage. The parameters below differ from environment to environment in Azure: the script can be deployed to any environment, but these parameter values vary per environment. I therefore want to pass them into the script when it runs from the PowerShell task in VSTS. How can I accomplish this? Please help me out.
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US, South Central US"
$StorageAccountKey = "12345678"
PowerShell Script:
function Add-Entity()
{
[CmdletBinding()]
param
(
$table,
[string] $partitionKey,
[string] $RowKey,
[string] $Label_Usage,
[string] $Label_Value,
[string] $Usage_Location,
[string] $subscriptionName,
[string] $resourceGroupName,
[string] $storageAccountName,
[string] $location,
[string] $StorageAccountKey
)
$entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey
$entity.Properties.Add("Label_Value",$Label_Value)
$entity.Properties.Add("Label_Usage",$Label_Usage)
$entity.Properties.Add("Usage_Location",$Usage_Location)
$result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::InsertOrReplace($entity))
}
$tableName = "sampletable"
# Get a storage context
$ctx = New-AzureStorageContext $StorageAccountName $StorageAccountKey
# Get a reference to the table
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
$csv = Import-CSV "d:\a\1\s\DeploymentScripts\sampletable.csv"
ForEach ($line in $csv)
{
Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Usage $line.Label_Usage -Label_Value $line.Label_Value -Usage_Location $line.Usage_Location
}
You need to use the arguments text box to pass your parameters into the script (either inline or script file).
Your script would need to look like this:
param (
    [string] $subscriptionName,
    [string] $resourceGroupName,
    [string] $storageAccountName,
    [string] $location,
    [string] $StorageAccountKey
)
function Add-Entity()
{
    [CmdletBinding()]
    param
    (
        $table,
        [string] $partitionKey,
        [string] $RowKey,
        [string] $Label_Usage,
        [string] $Label_Value,
        [string] $Usage_Location
    )
    $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey
    $entity.Properties.Add("Label_Value",$Label_Value)
    $entity.Properties.Add("Label_Usage",$Label_Usage)
    $entity.Properties.Add("Usage_Location",$Usage_Location)
    $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::InsertOrReplace($entity))
}
$tableName = "sampletable"
# Get a storage context
$ctx = New-AzureStorageContext $storageAccountName $StorageAccountKey
# Get a reference to the table
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
$csv = Import-CSV "d:\a\1\s\DeploymentScripts\sampletable.csv"
ForEach ($line in $csv)
{
    Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Usage $line.Label_Usage -Label_Value $line.Label_Value -Usage_Location $line.Usage_Location
}
Each of your variables will either need a default value or need to be passed in as an argument. In your example, the Arguments text box would look something like the following:
-subscriptionName "Tech Enabled Solutions" -$resourceGroupName "abc" -storageAccountName "defghi" -location "North Central US, South Central US" -StorageAccountKey "12345678
The box expects the arguments exactly as you would supply them if you were calling the PowerShell script from the command line.
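For example, if the script above were saved as deploy-tabledata.ps1 (a hypothetical file name), the equivalent command-line call would be:
.\deploy-tabledata.ps1 -subscriptionName "Tech Enabled Solutions" -resourceGroupName "abc" -storageAccountName "defghi" -location "North Central US, South Central US" -StorageAccountKey "12345678"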
Some parameters, such as $subscriptionName and $resourceGroupName, are not used in your script; check whether they are actually needed.
Refer to this code to add parameters:
param(
[string] $subscriptionName,
[string] $resourceGroupName,
[string] $storageAccountName,
[string] $location,
[string] $StorageAccountKey
)
function Add-Entity()
{
[CmdletBinding()]
param
(
$table,
[string] $partitionKey,
[string] $RowKey,
[string] $Label_Usage,
[string] $Label_Value,
[string] $Usage_Location
)
$entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $rowKey
$entity.Properties.Add("Label_Value",$Label_Value)
$entity.Properties.Add("Label_Usage",$Label_Usage)
$entity.Properties.Add("Usage_Location",$Usage_Location)
$result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::InsertOrReplace($entity))
}
$tableName = "sampletable"
# Get a storage context
$ctx = New-AzureStorageContext $storageAccountName $StorageAccountKey
# Get a reference to the table
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
$csv = Import-CSV "d:\a\1\s\DeploymentScripts\sampletable.csv"
ForEach ($line in $csv)
{
Add-Entity -Table $table -partitionKey $line.partitionkey -rowKey $line.RowKey -Label_Usage $line.Label_Usage -Label_Value $line.Label_Value -Usage_Location $line.Usage_Location
}
Specify the parameters' values in the PowerShell task (Arguments input box):
-subscriptionName "Tech Enabled Solutions" -resourceGroupName "abc" -storageAccountName "defghi" -location "North Central US, South Central US" -StorageAccountKey "12345678"
Your script doesn't take any parameters; only the function inside your script does. A param block at the top of your script, outside of any functions, is what makes the script itself take parameters.
Ex:
param($A)
function Foo {
    param($B)
    Write-Output $B
}
Foo -B $A
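If that example were saved as example.ps1 (a hypothetical name), it would be invoked like any other script, and the value flows from the script parameter into the function:
.\example.ps1 -A "hello"   # Foo receives "hello" via -B and writes it to output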
I have an Azure Data Factory CI/CD pipeline. My ADF has a few global parameters, so I am following the Microsoft documentation for their CI/CD. That documentation page includes the 'update global parameters' PowerShell script below. The issue is that whenever this script runs, it resets my ADF network access from 'Private endpoint' to 'Public endpoint'.
param
(
[parameter(Mandatory = $true)] [String] $globalParametersFilePath,
[parameter(Mandatory = $true)] [String] $resourceGroupName,
[parameter(Mandatory = $true)] [String] $dataFactoryName
)
Import-Module Az.DataFactory
$newGlobalParameters = New-Object 'system.collections.generic.dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'
Write-Host "Getting global parameters JSON from: " $globalParametersFilePath
$globalParametersJson = Get-Content $globalParametersFilePath
Write-Host "Parsing JSON..."
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)
# ($gp in $factoryFileObject.properties.globalParameters.GetEnumerator())
# may be used instead if you keep global parameters in a non-standard location; it is not recommended.
foreach ($gp in $globalParametersObject.GetEnumerator()) {
Write-Host "Adding global parameter:" $gp.Key
$globalParameterValue = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])
$newGlobalParameters.Add($gp.Key, $globalParameterValue)
}
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$dataFactory.GlobalParameters = $newGlobalParameters
Write-Host "Updating" $newGlobalParameters.Count "global parameters."
Set-AzDataFactoryV2 -InputObject $dataFactory -Force
I want network access to always be via 'Private endpoint'. Has anyone faced this issue?
Just change the last line of your global parameters script as follows:
Set-AzDataFactoryV2 -InputObject $dataFactory -PublicNetworkAccess "Disabled" -Force
Now your ADF network access won't reset to public access.
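To verify the setting afterwards, something like the following should work (a hedged sketch; it assumes the factory object returned by Get-AzDataFactoryV2 exposes a PublicNetworkAccess property, as recent Az.DataFactory versions do):
$adf = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$adf.PublicNetworkAccess   # expected: Disabled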
I have written a PowerShell script which takes multiple web apps (comma-separated) as input.
I split the input with PowerShell's Split function and configure the web apps by traversing them in a for-each loop.
Everything works fine in the PowerShell editor, but when I configure the same script in a VSTS release pipeline, the split doesn't work and the release fails.
Input : devopstestwebapp1,devopstestwebapp2
Code : $WebAppName = $WebAppName.Split(',')
Output (After Split) : devopstestwebapp1 devopstestwebapp2
Error : The Resource 'Microsoft.Web/sites/devopstestwebapp1
devopstestwebapp2' under resource group 'DevOpsResourseGroup' was not found.
Following is my powershell script
# Parameters
param (
[Parameter(Position=0,mandatory=$true)]
[string] $AADAppID,
[Parameter(Position=1,mandatory=$true)]
[string] $AADKey,
[Parameter(Position=2,mandatory=$true)]
[string] $TenantId,
[Parameter(Position=3,mandatory=$true)]
[string] $ResourceGroupName,
[Parameter(Position=4,mandatory=$true)]
[string] $ServerName,
[Parameter(Position=5,mandatory=$true)]
[string] $RGLocation,
[Parameter(Position=6,mandatory=$true)]
[string] $WebAppName,
[Parameter(Position=7,mandatory=$true)]
[string] $SubscriptionName
)
# Connect to Azure
$ssAADKey = ConvertTo-SecureString $AADKey -AsPlainText -Force
$psCredential = New-Object System.Management.Automation.PSCredential($AADAppID, $ssAADKey)
Connect-AzureRmAccount -ServicePrincipal -Credential $psCredential -Subscription $SubscriptionName -TenantId $TenantId
write-host $WebAppName
$WebAppName = $WebAppName.Split(',')
write-host $WebAppName
Foreach ($servicename in $WebAppName)
{
write-host $servicename
}
The following works perfectly with the VSTS PowerShell task. Store the app names in a variable:
$WebAppName = '$(WebAppName)'
write-host $WebAppName
foreach($servicename in $WebAppName.Split(','))
{
write-host $servicename
}
Output :
2019-04-22T11:02:02.7680996Z devopstestwebapp1,devopstestwebapp2,devopstestwebapp3
2019-04-22T11:02:02.7737101Z devopstestwebapp1
2019-04-22T11:02:02.7750490Z devopstestwebapp2
2019-04-22T11:02:02.7765756Z devopstestwebapp3
The problematic line is this one:
$WebAppName = $WebAppName.Split(',')
You are reassigning the result of Split to the same variable, $WebAppName, which is declared as [string] in the parameter list, so the array returned by Split is cast back to a single string and is no longer an array.
The solution is to assign the result of split to a new variable:
$WebAppNameSplit = $WebAppName.Split(',')
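A minimal sketch of the behaviour (a hypothetical standalone script; names and values are illustrative): because the parameter is type-constrained to [string], that constraint persists for the lifetime of the variable, so any array assigned back to it is immediately flattened into a single space-joined string.
param([string] $WebAppName = "devopstestwebapp1,devopstestwebapp2")
# Reassigning to the [string]-constrained parameter flattens the array again:
$WebAppName = $WebAppName.Split(',')
Write-Host $WebAppName.GetType().Name        # String -> "devopstestwebapp1 devopstestwebapp2"
# Assigning to a new, unconstrained variable keeps the array:
$WebAppNameSplit = "devopstestwebapp1,devopstestwebapp2".Split(',')
Write-Host $WebAppNameSplit.GetType().Name   # String[]
foreach ($servicename in $WebAppNameSplit) { Write-Host $servicename }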
I've been reading a ton of articles that say to use Get-SPWeb, but I've never been able to get those functions working due to authentication errors. I have built my own little functions to do what I need, but I'm struggling to figure out what I'm doing wrong in my update function. Below are the functions I've built, and all of them work:
If (!$cred) {$cred = get-credential -UserName "$ENV:Username#$ENV:UserDomain.com" -Message "Enter your office 365 login"}
function Get-AuthenticationCookie($context)
{
$sharePointUri = New-Object System.Uri($context.site.Url)
$authCookie = $context.Credentials.GetAuthenticationCookie($sharePointUri)
if ($? -eq $false) #https://ss64.com/ps/syntax-automatic-variables.html
{
return $null
}
$fedAuthString = $authCookie.TrimStart("SPOIDCRL=".ToCharArray())
$cookieContainer = New-Object System.Net.CookieContainer
$cookieContainer.Add($sharePointUri, (New-Object System.Net.Cookie("SPOIDCRL", $fedAuthString)))
return $cookieContainer
}
function Get-SharepointContext
{
Param(
[Parameter(Mandatory = $true)]
$siteUrl,
[Parameter(Mandatory = $false)]
$cred)
If (!$cred) {$cred = get-credential -UserName "$ENV:Username@$env:USERDNSDOMAIN" -Message "Login"}
[string]$username = $cred.UserName
$securePassword = $cred.Password
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.ClientContext")
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.RequestTimeOut = 1000 * 60 * 10;
$ctx.AuthenticationMode = [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Default
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)
$ctx.Credentials = $credentials
Return $ctx
}
function Add-SharepointListEntry
{
#example
#Add-SharepointListEntry -cred $cred -clientName $DestinationPages
Param(
[Parameter(Mandatory = $true)]
$cred,
[Parameter(Mandatory = $true)]
$sitename,
$siteUrl = "https://$env:Userdomain.sharepoint.com/$sitename",
[Parameter(Mandatory = $true)]
$ListName,
$SharepointData
)
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
# Bind to site collection
$Context = Get-SharepointContext -siteUrl $siteUrl -cred $cred
# Get List
$List = $Context.Web.Lists.GetByTitle($ListName)
$Context.Load($List)
$Context.ExecuteQuery()
# Create Single List Item
$ListItemCreationInformation = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
$NewListItem = $List.AddItem($ListItemCreationInformation)
#construct the entry to insert
$NewListItem["Title"] = $SharepointData.Title #Client Name
$NewListItem["Description"] = $SharepointData.Title
#These objects should pass right through
$NewListItem["Client"] = $SharepointData.Client
$NewListItem["Author"] = $SharepointData.Author
$NewListItem["Initials"] = $SharepointData.Author
$NewListItem["Created"] = $SharepointData.Created
$NewListItem.Update()
$Context.ExecuteQuery()
}
Function Get-SharepointListData
{
#example
#Get-SharepointListData -cred $cred
Param(
[Parameter(Mandatory = $true)]
$cred,
[Parameter(Mandatory = $true)]
$sitename,
$siteUrl = "https://$env:Userdomain.sharepoint.com/$sitename",
[Parameter(Mandatory = $true)]
$ListName
)
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
# Bind to site collection
$Context = Get-SharepointContext -siteUrl $siteUrl -cred $cred
#Retrive the List
$List = $Context.web.Lists.GetByTitle($ListName)
#Get All List Items
#reference https://gallery.technet.microsoft.com/office/How-to-do-a-CAML-Query-6f5260cf
$Query = New-Object Microsoft.SharePoint.Client.CamlQuery
$ListItems = $List.GetItems($Query)
$context.Load($ListItems)
$context.ExecuteQuery()
# Turn item into a catch array
$ListItemCollection = @()
ForEach ($item in $ListItems)
{
$propertiesValues = New-Object PSObject
$currentItem = $item
$item.FieldValues.Keys | Where {$_ -ne "MetaInfo"} | ForEach {Add-Member -InputObject $propertiesValues -MemberType NoteProperty -Name $_ -Value $currentItem[$_]}
$ListItemCollection += $propertiesValues
Remove-Variable propertiesValues
}
Return $ListItemCollection
}
Now I'm building a new function, trying to use one list (which queries a SharePoint folder) to update a SharePoint list. I query the directory with Get-SharepointListData, then loop through the results and add new entries if something is missing. That whole process works without issue. I'm now trying to add a step that updates existing entries when something changes, but the function keeps failing on $List.Items.GetById($index), throwing the error "You cannot call a method on a null-valued expression.":
Function Set-SharepointListData
{
Param(
[Parameter(Mandatory = $true)]
$cred,
[Parameter(Mandatory = $true)]
$sitename,
$siteUrl = "https://$env:userdomain.sharepoint.com/$sitename",
[Parameter(Mandatory = $true)]
$ListName,
[Parameter(Mandatory = $true)]
[int]$Index,
[Parameter(Mandatory = $true)]
$Time
)
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
# Bind to site collection
$Context = Get-SharepointContext -siteUrl $siteUrl -cred $cred
# Get List
$List = $Context.Web.Lists.GetByTitle($ListName)
$Context.Load($List)
$Context.ExecuteQuery()
# Select Single List Item
$ListItem = $List.Items.GetById($index)
$ListItem["Created"] = $time
$ListItem.Update();
$Context.ExecuteQuery();
}
I'm certain I'm overlooking something obvious here... anyone have any ideas?
$Context.Web.Lists.GetByTitle($ListName) doesn't return the items of the list; you have to load the items first, which is normally done via a CAML query. See here - although the sample is in C#, it should get you started.
Actually, I would rather suggest you use PnP PowerShell; there are plenty of cmdlets for working with SharePoint.
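Building on the first suggestion, a minimal sketch of a Set-SharepointListData body that loads the single item before updating it (it assumes the CSOM List.GetItemById method and reuses the Get-SharepointContext helper and the parameters from the question):
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
# Bind to the site collection and get the list, as in the original function
$Context = Get-SharepointContext -siteUrl $siteUrl -cred $cred
$List = $Context.Web.Lists.GetByTitle($ListName)
$Context.Load($List)
$Context.ExecuteQuery()
# Fetch the single item by its list item ID and load it before touching its fields
$ListItem = $List.GetItemById($Index)
$Context.Load($ListItem)
$Context.ExecuteQuery()
$ListItem["Created"] = $Time
$ListItem.Update()
$Context.ExecuteQuery()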
I am running the script below and passing the $fileObj script parameters through the Arguments section of the VSTS PowerShell task. I am trying to deploy table data into Azure Table storage: the table data is in .csv files, and I am trying to deploy those table entities with a PowerShell script. The script below does not deploy the table entities and fails with an error. Could anyone please help me out?
I have attached the error log at this OneDrive location: https://onedrive.live.com/?authkey=%21AEh2aAOnbmuzq9U&cid=5599285D52BD31F3&id=5599285D52BD31F3%21900&parId=root&action=locate
foreach($fo in $fileObj){
Write-Host $fo.filepath
$csv = Import-CSV $fo.filepath
$cArray=$fo.Cols.split(",")
foreach($line in $csv)
{
Write-Host "$($line.partitionkey), $($line.rowKey)"
$entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
foreach($c in $cArray){
Write-Host "$c,$($line.$c)"
$entity.Properties.Add($c,$line.$c)
}
$result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
}
}
$subscriptionName = ""
$resourceGroupName = ""
$storageAccountName = ""
$location = ""
# Get the storage key for the storage account
$StorageAccountKey = ""
# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
According to the log you mentioned, it seems that your CSV column names do not correspond to your code, and your CSV file format, with its two columns named Partitionkey and Rowkey, is not correct. Please try the following demo code and CSV file format; it works correctly on my side.
$resourceGroup ="resourceGroup name"
$storageAccount = "storage account Name"
$tableName = "table name"
$storageAccountKey = "storage key"
$ctx = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageAccountKey
######### Add removing table and create table code #######
try
{
Write-Host "Start to remove table $tableName, please wait a moment..."
Remove-AzureStorageTable -Name $tableName -Context $ctx -Force # Remove the Azure table
Start-Sleep -Seconds 60 # waiting for removing table, you could change it according to your table
Write-Host "$tableName table has been removed"
}
catch
{
Write-Host "$tableName is not existing"
}
Write-Host "Start to create $tableName table"
New-AzureStorageTable -Name $tableName -Context $ctx # Create new azure storage table
##########Add removing table and create table code ############
$table = Get-AzureStorageTable -Name $tableName -Context $ctx
$csvPath ='csv file path'
$cols = "Label_Usage,Label_Value,Usage_Location" #should be corrensponding to your csv column exclude Partitionkey and RowKey
$csv = Import-Csv -Path $csvPath
$number = 0
[Microsoft.WindowsAzure.Storage.Table.TableBatchOperation]$batchOperation = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
foreach($line in $csv)
{
$number++
$entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
$colArray = $cols.split(",")
Write-Host "$($line.partitionkey), $($line.rowKey)" #output partitionkey and rowKey value
foreach($colName in $colArray)
{
Write-Host "$colName,$($line.$colName)" #output column name and value
$entity.Properties.Add($colName,$line.$colName)
}
if($number -le 100)
{
    $batchOperation.InsertOrReplace($entity) # Changed code
}
else
{
    # Batch is full (100 entities max per batch): execute it, then start a new batch with the current entity
    $result = $table.CloudTable.ExecuteBatch($batchOperation)
    [Microsoft.WindowsAzure.Storage.Table.TableBatchOperation]$batchOperation = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
    $batchOperation.InsertOrReplace($entity)
    $number = 1
}
}
if($batchOperation.Count -ne 0)
{
$result = $table.CloudTable.ExecuteBatch($batchOperation)
}
Note: a batch operation requires all records in the batch to have the same partition key value, so this demo assumes every row in the CSV file shares one partition key.
CSV file example format:
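The original answer showed the format as a screenshot. A plausible reconstruction, based on the column names used in the code above (values are purely illustrative, and all rows share one partition key as the note requires):
PartitionKey,RowKey,Label_Usage,Label_Value,Usage_Location
pk1,1,TestUsage1,TestValue1,North Central US
pk1,2,TestUsage2,TestValue2,South Central US
pk1,3,TestUsage3,TestValue3,North Central US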
Is it possible to create a new Azure table storage table with PowerShell using only a connection string?
Param (
[string]$StorageAccountName,
[string]$StorageAccountKey,
[string]$name
)
Import-Module Azure
$tableName = $name
$accountCredentials = New-Object "Microsoft.WindowsAzure.Storage.Auth.StorageCredentials" $StorageAccountName, $StorageAccountKey
$storageAccount = New-Object "Microsoft.WindowsAzure.Storage.CloudStorageAccount" $accountCredentials, $true
$tableClient = $storageAccount.CreateCloudTableClient()
$table = $tableClient.GetTableReference($tableName)
$table.CreateIfNotExists()
I'd prefer not to do it this way.
If you are using the Azure PowerShell cmdlets, there is a New-AzureStorageTable cmdlet that you can use to create a new table.
Sample Code:
$storageContext = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "accountkey"
New-AzureStorageTable -Name "TableName" -Context $storageContext
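If you really do only have a connection string, New-AzureStorageContext also accepts one directly, so the same approach works without splitting out the account name and key. A minimal sketch (the connection string value is a placeholder):
$connectionString = "DefaultEndpointsProtocol=https;AccountName=accountname;AccountKey=accountkey;EndpointSuffix=core.windows.net"
$storageContext = New-AzureStorageContext -ConnectionString $connectionString
New-AzureStorageTable -Name "TableName" -Context $storageContext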