I'm struggling with a REST method to DELETE an entry from an Azure Storage Account table via PowerShell. I'm authenticating with a Shared Access Signature (SAS) that has rights for read, write and delete; creating entries works, but I can't get DELETE to work. Has anyone written a PowerShell script that deletes from Azure Storage Account tables and could share a code snippet showing how to do this?
I'm not using the PowerShell module; I'm using the Invoke-WebRequest cmdlet. I'm new to REST APIs, so maybe I just don't have the right idea? For entry creation I pass the SAS token as authentication in the URI of the Invoke-WebRequest call, but simply changing "-Method POST" to "-Method DELETE" did not work.
Thanks for your help
To delete a table entity using the REST API, the sample below may help.
Instead of using Invoke-WebRequest, use Invoke-RestMethod like below:
function DeleteTableEntity($TableName, $PartitionKey, $RowKey) {
    # Assumes $storageAccount (account name) and $accesskey (account access key) are set in the calling scope
    $resource = "$TableName(PartitionKey='$PartitionKey',RowKey='$RowKey')"
    $table_url = "https://$storageAccount.table.core.windows.net/$resource"
    $GMTTime = (Get-Date).ToUniversalTime().ToString('R')
    # SharedKeyLite string-to-sign for the Table service: date + canonicalized resource
    $stringToSign = "$GMTTime`n/$storageAccount/$resource"
    $hmacsha = New-Object System.Security.Cryptography.HMACSHA256
    $hmacsha.Key = [Convert]::FromBase64String($accesskey)
    $signature = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign))
    $signature = [Convert]::ToBase64String($signature)
    $headers = @{
        'x-ms-date'   = $GMTTime
        Authorization = "SharedKeyLite " + $storageAccount + ":" + $signature
        Accept        = "application/json;odata=minimalmetadata"
        'If-Match'    = "*"   # required to delete the entity unconditionally
    }
    $item = Invoke-RestMethod -Method Delete -Uri $table_url -Headers $headers -ContentType 'application/http'
}
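Note that your original SAS approach can also work. The piece that usually breaks a switch from POST to DELETE is the missing If-Match header, which the Table service requires when deleting an entity. Below is a minimal sketch, assuming $sasToken holds your SAS query string (including the leading '?') and $storageAccount/$resource are built as in the function above:
# Hedged sketch: delete an entity with a SAS token instead of the account key
$uri = "https://$storageAccount.table.core.windows.net/$resource$sasToken"
$headers = @{
    'If-Match' = '*'   # required for an unconditional entity delete
    Accept     = 'application/json;odata=minimalmetadata'
}
Invoke-RestMethod -Method Delete -Uri $uri -Headers $headers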
For more detail, please refer to this link:
Use Azure Table Storage via PowerShell and the Rest API - GCITS
Otherwise, you can install the Az PowerShell module and use a script like the one below (note this deletes entire tables, not individual entities):
$resourceGroup = 'ResourceGroupName'
$storageAccountName = 'StorageAccountName'
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountName
$ctx = $storageAccount.Context
$tables = (Get-AzStorageTable -Context $Ctx).name
ForEach ($table in $tables) {
    Remove-AzStorageTable -Name $table -Context $ctx -Force
}
For more detail, please refer to this link:
Delete all tables in Azure storage using powershell | Mike Says Meh.
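If you need to delete individual entities rather than whole tables with the module approach, the separate AzTable module provides Remove-AzTableRow. A sketch, assuming AzTable is installed and with placeholder table and key names:
# Requires the AzTable module: Install-Module AzTable
$cloudTable = (Get-AzStorageTable -Name 'MyTable' -Context $ctx).CloudTable
Remove-AzTableRow -Table $cloudTable -PartitionKey 'pk1' -RowKey 'rk1'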
I have an Azure DevOps process which uses a Service Principal to deploy a Power BI report from a .pbix file and updates the data source to point at the production Azure SQL server.
The outcome is that the report and dataset are deployed and the connection is updated; however, the credentials (username and password) are blank, so to make the report usable someone has to log on to the Power BI service, open the dataset and update the credentials, which leaves a manual step in our CI/CD process.
I need help updating the credentials for the new data source via code so that this manual step is not needed.
Any suggestion would be of great help. Thank you.
Here is a sample PowerShell script, using the Microsoft Power BI Cmdlets, which patches the credentials of the dataset; commented out at the end are a few lines which will refresh the dataset afterwards (uncomment them if needed). Just replace the x-es at the top with the actual values (workspace and dataset name, application ID, etc.).
<#
Patch the credentials of a published report/dataset, so it can be refreshed.
#>
# Fill these ###################################################
$tenantId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" # Get from Azure AD -> Properties (https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Properties)
$applicationId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" # Get Application (client) ID from Azure AD -> App registrations (https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/RegisteredApps)
$applicationSecret = "xxxxxxxxxxxxxxxx" # Create it from application's "Certificates & secrets" section
$workspaceName = "xxxxxxxx"
$reportName = "xxxxxxxx" # Actually it is dataset name
$sqlUserName = "xxxxxxxx"
$sqlUserPassword = "xxxxxxxxxx"
################################################################
Import-Module MicrosoftPowerBIMgmt
$SecuredApplicationSecret = ConvertTo-SecureString -String $applicationSecret -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($applicationId, $SecuredApplicationSecret)
$sp = Connect-PowerBIServiceAccount -ServicePrincipal -Tenant $tenantId -Credential $credential
$workspace = Get-PowerBIWorkspace -Name $workspaceName
$dataset = Get-PowerBIDataset -WorkspaceId $workspace.Id -Name $reportName
$workspaceId = $workspace.Id
$datasetId = $dataset.Id
$datasources = Get-PowerBIDatasource -WorkspaceId $workspaceId -DatasetId $datasetId
foreach($datasource in $datasources) {
$gatewayId = $datasource.gatewayId
$datasourceId = $datasource.datasourceId
$datasourcePatchUrl = "gateways/$gatewayId/datasources/$datasourceId"
Write-Host "Patching credentials for $datasourceId"
# HTTP request body to patch datasource credentials
$userNameJson = "{""name"":""username"",""value"":""$sqlUserName""}"
$passwordJson = "{""name"":""password"",""value"":""$sqlUserPassword""}"
$patchBody = @{
"credentialDetails" = @{
"credentials" = "{""credentialData"":[ $userNameJson, $passwordJson ]}"
"credentialType" = "Basic"
"encryptedConnection" = "NotEncrypted"
"encryptionAlgorithm" = "None"
"privacyLevel" = "Organizational"
}
}
# Convert body contents to JSON
$patchBodyJson = ConvertTo-Json -InputObject $patchBody -Depth 6 -Compress
# Execute PATCH operation to set datasource credentials
Invoke-PowerBIRestMethod -Method Patch -Url $datasourcePatchUrl -Body $patchBodyJson
}
#$datasetRefreshUrl = "groups/$workspaceId/datasets/$datasetId/refreshes"
#Write-Host "Refreshing..."
#Invoke-PowerBIRestMethod -Method Post -Url $datasetRefreshUrl
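If you do uncomment the refresh, you can also check its outcome through the same API. A small sketch against the refresh history endpoint, reusing $workspaceId and $datasetId from above (property names follow the documented response):
# Hedged sketch: read the status of the most recent refresh
$historyUrl = "groups/$workspaceId/datasets/$datasetId/refreshes?`$top=1"
$history = Invoke-PowerBIRestMethod -Method Get -Url $historyUrl | ConvertFrom-Json
$history.value[0].status   # 'Unknown' while running, then 'Completed' or 'Failed'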
I'm using the PowerShell code below to read the endpoint and write the result to blob storage. However, if the endpoint returns more than one document, how can I loop through each document and write each result as a separate file (file1, file2, ...) in blob storage? If separate files are not possible, I'd like to store all documents in one JSON file (FILE.JSON).
For example:
endpoint-------------blob storage
doc1-----------------file1.json
doc2-----------------file2.json
and so on
Or, if the above approach is not possible:
doc1, doc2...ALL DOCS========> goes to file.json
#$Credential = Get-Credential
$Params = @{
"URI" = 'https://e832702c-faad-46e5-a585-885b8613c2ce-bluemix.cloudant.com/testdb/_all_docs?include_docs=true'
#"Authentication" = 'Basic'
#"Credential" = $Credential
}
$Result = (Invoke-RestMethod @Params).rows.doc | ConvertTo-Json -Depth 100
Write-Host "The result is:" $Result
$context=New-AzStorageContext -StorageAccountName "genstorage999" -StorageAccountKey "<your key>"
$container=Get-AzStorageContainer -Name "blobcontainer" -Context $context
$content = [system.Text.Encoding]::UTF8.GetBytes($Result)
$container.CloudBlobContainer.GetBlockBlobReference("testapp2.json").UploadFromByteArray($content,0,$content.Length)
Code Reference: Filter JSON document using PowerShell
Based on your previous question and your description, the result data will contain some rows, and each row will have a doc. You want to write each doc's content to a separate .json file in storage when the response contains multiple rows. I assume your result data comes from this link (a mock API just for demo): https://stantest1016.blob.core.windows.net/static/result.json, and its result contains 2 rows.
You can just try the code below to meet your requirement:
$storageAccountKey= '<your key>'
$context = New-AzStorageContext -StorageAccountName '<account name>' -StorageAccountKey $storageAccountKey
$container = Get-AzStorageContainer -Name 'testc' -Context $context
$result = Invoke-RestMethod -Uri 'https://stantest1016.blob.core.windows.net/static/result.json'
foreach ($row in $result.rows) {
    $destBlobName = "doc_" + $row.doc._id + ".json"
    $contentText = $row.doc | ConvertTo-Json -Compress
    # Wait for the upload so the script doesn't exit before it completes
    $container.CloudBlobContainer.GetBlockBlobReference($destBlobName).UploadTextAsync($contentText).Wait()
}
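If you prefer the single-file fallback you mentioned, the same pattern works without the loop; serialize all docs at once and upload a single blob (the file name is your choice):
# Alternative: write all docs into one file.json
$allDocsJson = $result.rows.doc | ConvertTo-Json -Depth 100
$container.CloudBlobContainer.GetBlockBlobReference("file.json").UploadTextAsync($allDocsJson).Wait()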
Let me know if you have any other questions.
I'm attempting to provision a OneDrive in a .NET Core app using PowerShell Core. I've been able to successfully provision a OneDrive from the PowerShell command line following the directions provided below:
https://learn.microsoft.com/en-us/onedrive/pre-provision-accounts
Running it programmatically in .NET Core, however, it looks like it uses a separate PowerShell that's bundled into .NET Core 2.1.
I believe the unsuccessful in-app runs are due to the PowerShell bundled with Core not being set up correctly, namely the first 3 steps in the link above:
1. Download the latest SharePoint Online Management Shell.
2. Download and install the SharePoint Online Client Components SDK.
3. Connect to SharePoint Online as a global admin or SharePoint admin in Office 365. To learn how, see Getting started with SharePoint Online Management Shell.
How do I set up the powershell that gets run by my application to mirror those steps above?
My code looks like this:
using System.IO;
using System.Management.Automation;

namespace PowerShellApp
{
    class Program
    {
        public static int Main(string[] args)
        {
            // Create a PowerShell instance, run the script file, then dispose it
            using (PowerShell ps = PowerShell.Create())
            {
                ps.AddScript(File.ReadAllText(<scriptLocation>))
                  .Invoke();
            }
            return 0;
        }
    }
}
How do I achieve these steps when executing within a .NET Core application?
The PowerShell script I'm running is below and is also within the link above:
<#
.SYNOPSIS
This script adds an entry for each user specified in the input file
into the OneDrive provisioning queue
.DESCRIPTION
This script reads a text file with a line for each user.
Provide the User Principal Name of each user on a new line.
An entry will be made in the OneDrive provisioning queue for each
user up to 200 users.
.EXAMPLE
.\BulkEnqueueOneDriveSite.ps1 -SPOAdminUrl https://contoso-admin.sharepoint.com -InputfilePath C:\users.txt
.PARAMETER SPOAdminUrl
The URL for the SharePoint Admin center
https://contoso-admin.sharepoint.com
.PARAMETER InputFilePath
The path to the input file.
The file must contain 1 to 200 users
C:\users.txt
.NOTES
This script needs to be run by a global or SharePoint administrator in Office 365
This script will prompt for the username and password of the administrator
#>
param
(
#Must be SharePoint Administrator URL
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string] $SPOAdminUrl,
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string] $InputFilePath
)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.UserProfiles") | Out-Null
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SPOAdminUrl)
$Users = Get-Content -Path $InputFilePath
if ($Users.Count -eq 0 -or $Users.Count -gt 200)
{
Write-Host $("Unexpected user count: [{0}]" -f $Users.Count) -ForegroundColor Red
return
}
$web = $ctx.Web
Write-Host "Enter an admin username" -ForegroundColor Green
$username = Read-Host
Write-Host "Enter your password" -ForegroundColor Green
$password = Read-Host -AsSecureString
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username,$password)
$ctx.Load($web)
$ctx.ExecuteQuery()
$loader = [Microsoft.SharePoint.Client.UserProfiles.ProfileLoader]::GetProfileLoader($ctx)
$ctx.ExecuteQuery()
$loader.CreatePersonalSiteEnqueueBulk($Users)
$loader.Context.ExecuteQuery()
Write-Host "Script Completed"
I'm afraid the SharePoint Online Management Shell depends on the .NET Framework and will not work with Core (check this).
On the other hand, that module appears to be a wrapper on top of their REST API. So if you want to integrate with a Core app, you may try replacing it with HTTP requests. Check this documentation.
Also, below is a basic PowerShell script to work with those REST API endpoints. I tested this one on my site:
$baseUrl = "http://sharepoint.com/sites/yourSite/_api"
$cred = Get-Credential
# retrieve digest
$r = Invoke-WebRequest -Uri "$baseUrl/contextinfo" -Method Post -Credential $cred -SessionVariable sp
$digest = ([xml]$r.content).GetContextWebInformation.FormDigestvalue
# calling endpoint
$endpoint = "sp.userprofiles.profileloader.getprofileloader/getuserprofile"
$head = @{
"Accept" = "application/json;odata=verbose"
"X-RequestDigest" = $digest
}
$re = Invoke-WebRequest -Uri "$baseUrl/$endpoint" -Headers $head -Method Post -WebSession $sp
Write-Host $re.Content
This is a snippet for createpersonalsiteenqueuebulk, but I can't test it since I'm not a domain admin. Hope it works for you:
#--- sample 2 (didn't test it since I'm not domain admin). Might need separate session/digest
$endpoint2 = "https://<domain>-admin.sharepoint.com/_api/sp.userprofiles.profileloader.getprofileloader/createpersonalsiteenqueuebulk"
$head = @{
"Accept" = "application/json;odata=verbose"
"X-RequestDigest" = $digest
}
$body = "{ 'emailIDs': ['usera#domain.onmicrosoft.com', 'userb#domain.onmicrosoft.com'] }"
$re2 = Invoke-WebRequest -Uri "$endpoint2" -Headers $head -body $body -Method Post -WebSession $sp
Write-Host $re2.Content
I want to extract data from VSTS using WIQL and do some reporting with that data.
1) Could someone please let me know if it is possible to use WIQL in PowerShell? If so, where can I find some samples or demos?
2) Also, are there any other client tools which support WIQL for custom querying of VSTS? If so, where can I find some demos or samples?
Yes, it is possible to use WIQL in PowerShell; the easy way is to call the Work Item Query REST API from PowerShell.
For example:
$vstsAccount = "XX"
$projectName = "XX"
$user = ""
$token = "personal access token"
Function QueryWorkItem{
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
# Construct the REST URL for the WIQL query
$uri = "https://$($vstsAccount).visualstudio.com/$($projectName)/_apis/wit/wiql?api-version=1.0"
$body = "{
'query':'Select [System.Id],[System.Title] From WorkItems Where [System.Id] = 123'
}"
# Invoke the REST call and capture the results (note this uses the POST method)
$result = Invoke-RestMethod -Uri $uri -Method Post -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Body $body
Write-Output ("work item URL: '$($result.WorkItems[0].URL)'")
}
QueryWorkItem
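Note that the WIQL response only contains work item references (id and url), not field values. To read fields such as the title, make a second call to the work items endpoint with the returned IDs. A sketch reusing the account/token variables from above:
# Hedged sketch: expand the WIQL references into full work items
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$ids = ($result.workItems | ForEach-Object { $_.id }) -join ','
$detailUri = "https://$($vstsAccount).visualstudio.com/_apis/wit/workitems?ids=$ids&api-version=1.0"
$details = Invoke-RestMethod -Uri $detailUri -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
$details.value | ForEach-Object { $_.fields.'System.Title' }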
For more information, you can refer to this article.
For the second question, you can build an application yourself, such as a C# console application:
Install the Microsoft.TeamFoundationServer.ExtendedClient package
Simple C# code:
var u = new Uri("https://XXX.visualstudio.com");
VssCredentials c = new VssCredentials(new Microsoft.VisualStudio.Services.Common.WindowsCredential(new NetworkCredential("user name", "password")));
var connection = new VssConnection(u, c);
var workitemClient = connection.GetClient<WorkItemTrackingHttpClient>();
var result = workitemClient.QueryByWiqlAsync(new Microsoft.TeamFoundation.WorkItemTracking.WebApi.Models.Wiql() { Query= "Select [System.Id],[System.Title] From WorkItems Where [System.Id] = 123" },"Scrum2015").Result;
Console.WriteLine(result.WorkItems.First().Url);
Update 1:
The TFS/VSTS SDK or extended SDK can also be used in PowerShell. For example:
if((Get-PSSnapIn -Name Microsoft.TeamFoundation.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
}
$Tfs2015AssembliesPath="[vs Installation path]\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.Client.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.Common.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.Build.Client.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.Build.Common.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.Git.Client.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.SourceControl.WebApi.dll"
#Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.TestManagement.Client.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.VersionControl.Client.dll"
Add-Type -Path "$Tfs2015AssembliesPath\Microsoft.TeamFoundation.WorkItemTracking.Client.dll"
Function GetWorkItems{
param([string]$teamProjectName,[string]$address)
$credentials = New-Object System.Net.NetworkCredential("[user name]", "[password]")
$tfsCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection((New-Object System.URI($address)))
$wis = $tfsCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
$wiqlQuery = "SELECT [System.ID], [System.WorkItemType], [System.Title], [System.AssignedTo], [System.State] FROM WorkItems WHERE [System.TeamProject] = 'Scrum2015' AND [State] = 'New' AND [System.WorkItemType] in ('Bug','User Story') ORDER BY [System.ID]";
$witCollection = $wis.Query($wiqlQuery);
# add logic here to export to Excel
Foreach($witItem in $witCollection)
{
Write-Host $witItem.Title
}
}
GetWorkItems "[project name]" "[tfs/vsts address]"
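For the "export to Excel" placeholder in the comment above, a simple option is to write the query results to CSV, which Excel opens directly (the file name below is a placeholder):
# Sketch: export the queried work items to CSV
$witCollection | Select-Object Id, Title, State |
    Export-Csv -Path 'workitems.csv' -NoTypeInformation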
I want to add a tag to a TFS project using the REST API in PowerShell.
I am trying to make this request based on the documentation for
Visual Studio Integration
I am calling this:
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.WorkItemTracking.Client')
if ( (Get-PSSnapin -Name "Microsoft.TeamFoundation.Powershell" -ErrorAction SilentlyContinue) -eq $null )
{
Add-PSSnapin "Microsoft.TeamFoundation.Powershell"
}
$SrcCollectionUrl = 'http://tfs.myCompany.com:8080/tfs/MyCollection'
$SrcProjectName = 'myProject'
[psobject] $tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($SrcCollectionUrl)
[Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStoreFlags]$WorkItemBypass = [Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStoreFlags]::BypassRules
$tfstp = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($SrcCollectionUrl)
$WorkItemStore = New-Object -TypeName 'Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore' -ArgumentList $tfs.TfsTeamProjectCollection, $WorkItemBypass
$SrcProject = $WorkItemStore.Projects[$SrcProjectName]
$ProjectGuid = Split-Path $SrcProject.Uri -Leaf
$AddTagsUrl = '{0}/_apis/tagging/scopes/{1}/tags?api-version=1.0' -f $SrcCollectionUrl,$ProjectGuid
$newTagParams = @{name="PWCreateTag2"}
$outjson = $newTagParams | ConvertTo-Json
$nresp = Invoke-RestMethod -Method POST -Uri $AddTagsUrl -UseDefaultCredentials -Body $outjson -ContentType 'application/json'
Everything works the first time. However, the documentation states: "If a tag by that name already exists, no tag is created. Instead, the response body includes the existing tag with that name."
The 2nd time I call the line I get: "The remote server returned an error: (400) Bad Request."
Does anyone have any idea why this fails the 2nd time?
FYI: the TFS server is 2015 and PowerShell is V4.
I created a PowerShell module for this - tfs.
To add tags:
'tag1','tag2' | % { Add-TFSBuildTag -Id 520 -Tag $_ }
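As a workaround for the 400 on a duplicate create, you can list the existing tags first and only POST when the tag is missing. A hedged sketch reusing the variables from the question; it assumes the GET variant of the same tagging endpoint returns the existing tags:
# Assumption: GET on the tagging scope lists current tags in .value
$getTagsUrl = '{0}/_apis/tagging/scopes/{1}/tags?api-version=1.0' -f $SrcCollectionUrl,$ProjectGuid
$existing = (Invoke-RestMethod -Method Get -Uri $getTagsUrl -UseDefaultCredentials).value.name
if ($existing -notcontains 'PWCreateTag2') {
    Invoke-RestMethod -Method Post -Uri $AddTagsUrl -UseDefaultCredentials -Body $outjson -ContentType 'application/json'
}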