Where to define the parameters in a PowerShell script?

I have downloaded this PowerShell script, which copies Power BI report contents into a new file; however, I don't understand where I should input the parameter values for SourceReportId, SourceWorkspaceId, TargetReportId, TargetWorkspaceId, etc.
Script is available here also: https://github.com/JamesDBartlett3/PowerBits/blob/main/PowerShell/Copy-PowerBIReportContentToBlankPBIXFile.ps1
<#
.SYNOPSIS
Function: Copy-PowerBIReportContentToBlankPBIXFile
Author: @JamesDBartlett3 (James D. Bartlett III)
.DESCRIPTION
- This script will copy the contents of a published Power BI
report into a new report published from a blank PBIX
- This solves the problem where a Power BI report originally
created in the web browser cannot be downloaded from the
Power BI service as a PBIX file.
.PARAMETER SourceReportId
The ID of the report to copy from
.PARAMETER SourceWorkspaceId
The ID of the workspace to copy from
.PARAMETER TargetReportId
The ID of the report to copy to
.PARAMETER TargetWorkspaceId
The ID of the workspace to copy to
.PARAMETER BlankPbix
Local path (or URL) to a blank PBIX file to upload and copy the source report's contents into
.PARAMETER OutFile
Local path to save the new PBIX file to
.EXAMPLE
Copy-PowerBIReportContentToBlankPBIXFile -SourceReportId "12345678-1234-1234-1234-123456789012" -SourceWorkspaceId "12345678-1234-1234-1234-123456789012" -TargetReportId "12345678-1234-1234-1234-123456789012" -TargetWorkspaceId "12345678-1234-1234-1234-123456789012"
.NOTES
This function does NOT require Azure AD app registration,
service principal creation, or any other special setup.
The only requirements are:
- The user must be able to run PowerShell (and install the
MicrosoftPowerBIMgmt module, if it's not already installed).
- The user must be allowed to download report PBIX files
(see: "Download reports" setting in the Power BI Admin Portal).
- The user must have "Contributor" or higher permissions
on the source and target workspace(s).
TODO
- Testing
- Add usage, help, and examples.
- Rename the function to something more accurate to its current capabilities.
ACKNOWLEDGEMENTS
This PS function was inspired by a blog article written by
one of the top minds in the Power BI space, Mathias Thierbach.
And if you're not already using his pbi-tools for Power BI
version control, you should check it out: https://pbi.tools
#>
Function Copy-PowerBIReportContentToBlankPBIXFile {
#Requires -PSEdition Core
#Requires -Modules MicrosoftPowerBIMgmt
[CmdletBinding()]
Param(
[parameter(Mandatory = $true)][string]$SourceReportId,
[parameter(Mandatory = $true)][string]$SourceWorkspaceId,
[parameter(Mandatory = $false)][string]$TargetReportId,
[parameter(Mandatory = $false)][string]$TargetWorkspaceId = $SourceWorkspaceId,
[Parameter(Mandatory = $false)][string]$BlankPbix,
[Parameter(Mandatory = $false)][string]$OutFile
)
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
[string]$blankPbixTempFile = Join-Path -LiteralPath $env:TEMP -ChildPath "blank.pbix"
[array]$validPbixContents = @("Layout", "Metadata")
[bool]$blankPbixIsUrl = $BlankPbix.StartsWith("http")
[bool]$localFileExists = Test-Path $BlankPbix
[bool]$remoteFileIsValid = $false
[bool]$localFileIsValid = $false
[bool]$defaultFileIsValid = $false
Function FileIsBlankPbix($file) {
$zip = [System.IO.Compression.ZipFile]::OpenRead($file)
$fileIsPbix = @($validPbixContents | Where-Object {$zip.Entries.Name -Contains $_}).Count -gt 0
$fileIsBlank = (Get-Item $file).length / 1KB -lt 20
$zip.Dispose()
if($fileIsPbix -and $fileIsBlank) {
Write-Debug "$file is a valid blank pbix file."
return $true
}
else {
Write-Error "$file is NOT a valid blank pbix file."
return $false
}
}
# If user did not specify a target report ID, use a blank PBIX file
if(!$TargetReportId) {
# If user specified a URL to a file, download and validate it as a blank PBIX file
if ($blankPbixIsUrl){
Write-Debug "Downloading file: $BlankPbix..."
Invoke-WebRequest -Uri $BlankPbix -OutFile $blankPbixTempFile
Write-Debug "Validating downloaded file..."
$remoteFileIsValid = FileIsBlankPbix($blankPbixTempFile)
}
# If user specified a local path to a file, validate it as a blank PBIX file
elseif ($localFileExists) {
Write-Debug "Validating user-supplied file: $BlankPbix..."
$localFileIsValid = FileIsBlankPbix($BlankPbix)
}
# If user didn't specify a blank PBIX file, check for a valid blank PBIX in the temp location
elseif (Test-Path $blankPbixTempFile) {
Write-Debug "Validating pbix file found in temp location: $blankPbixTempFile..."
$defaultFileIsValid = FileIsBlankPbix($blankPbixTempFile)
}
# If user did not specify a blank PBIX file, and a valid blank PBIX is not in the temp location,
# download one from GitHub and check if it's valid and blank
else {
Write-Debug "Downloading a blank pbix file from GitHub to $blankPbixTempFile..."
$BlankPbixUri = "https://github.com/JamesDBartlett3/PowerBits/raw/main/Misc/blank.pbix"
Invoke-WebRequest -Uri $BlankPbixUri -OutFile $blankPbixTempFile
$defaultFileIsValid = FileIsBlankPbix($blankPbixTempFile)
}
# If we downloaded a valid blank PBIX file, use it.
if ($remoteFileIsValid -or $defaultFileIsValid) {
$BlankPbix = $blankPbixTempFile
}
# If a valid blank PBIX file could not be obtained by any of the above methods, throw an error.
if (!$TargetReportId -and !$localFileIsValid -and !$remoteFileIsValid -and !$defaultFileIsValid) {
Write-Error "No targetReportId specified & no valid blank PBIX file found. Please specify one or the other."
return
}
[bool]$pbixIsValid = ($localFileIsValid -or $remoteFileIsValid -or $defaultFileIsValid)
}
try {
$headers = Get-PowerBIAccessToken
}
catch {
Write-Output "Power BI Access Token required. Launching authentication dialog..."
Start-Sleep -s 1
Connect-PowerBIServiceAccount -WarningAction SilentlyContinue | Out-Null
$headers = Get-PowerBIAccessToken
}
finally {
Write-Debug "Target Report ID is null: $(!$TargetReportId)"
$pbiApiBaseUri = "https://api.powerbi.com/v1.0/myorg"
# If a valid blank PBIX was found, publish it to the target workspace
if ($pbixIsValid) {
Write-Debug "Publishing $BlankPbix to target workspace..."
$publishResponse = New-PowerBIReport -Path $BlankPbix -WorkspaceId $TargetWorkspaceId -ConflictAction CreateOrOverwrite
Write-Debug "Response: $publishResponse"
$TargetReportId = $publishResponse.Id
}
# Assemble the UpdateReportContent API URI and request body
$updateReportContentEndpoint = "$pbiApiBaseUri/groups/$TargetWorkspaceId/reports/$TargetReportId/UpdateReportContent"
$body = #"
{
"sourceReport": {
"sourceReportId": "$SourceReportId",
"sourceWorkspaceId": "$SourceWorkspaceId"
},
"sourceType": "ExistingReport"
}
"#
# Update the target report with the source report's content
$headers.Add("Content-Type", "application/json")
$response = Invoke-RestMethod -Uri $updateReportContentEndpoint -Method POST -Headers $headers -Body $body
# If user did not specify an output file, use the source report's name
$sourceReportName = (Get-PowerBIReport -Id $SourceReportId -WorkspaceId $SourceWorkspaceId).Name
$OutFile = $OutFile ?? "$($sourceReportName)_Clone.pbix"
# Export the target report to a PBIX file
Export-PowerBIReport -WorkspaceId $TargetWorkspaceId -Id $response.id -OutFile $OutFile
# Assemble the Datasets API URI
$datasetsEndpoint = "$pbiApiBaseUri/groups/$TargetWorkspaceId/datasets"
}
}
Can anyone help please?

Assuming you have saved the script in a file named "Copy-PowerBIReportContentToBlankPBIXFile.ps1", dot-source the file to make its function available (if you called the script without dot-sourcing it, the function would be scoped to the script and not visible to the outside):
# Assumes the script is located in the current directory
. .\Copy-PowerBIReportContentToBlankPBIXFile.ps1
Now you can call the function from the script like this to get a list of parameters:
Copy-PowerBIReportContentToBlankPBIXFile -?
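You can also display the full comment-based help (synopsis, parameter descriptions, and example) with Get-Help:
Get-Help Copy-PowerBIReportContentToBlankPBIXFile -Full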
Don't be confused: the function just coincidentally has the same name as the script; in principle, it could be named differently.
The help section in the comments gives a full example of how to call the function:
Copy-PowerBIReportContentToBlankPBIXFile -SourceReportId "12345678-1234-1234-1234-123456789012" -SourceWorkspaceId "12345678-1234-1234-1234-123456789012" -TargetReportId "12345678-1234-1234-1234-123456789012" -TargetWorkspaceId "12345678-1234-1234-1234-123456789012"
If you look at the parameter definitions, only -SourceReportId and -SourceWorkspaceId are mandatory (Mandatory = $true), so the other parameters could be omitted if this makes sense for your use case.
Copy-PowerBIReportContentToBlankPBIXFile -SourceReportId "12345678-1234-1234-1234-123456789012" -SourceWorkspaceId "12345678-1234-1234-1234-123456789012"
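If you also want to control where the cloned PBIX is saved, pass the optional -OutFile parameter as well (the local path below is just an illustration):
Copy-PowerBIReportContentToBlankPBIXFile -SourceReportId "12345678-1234-1234-1234-123456789012" -SourceWorkspaceId "12345678-1234-1234-1234-123456789012" -OutFile "C:\Temp\MyReport_Clone.pbix"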

Related

Output Azure Function powershell value to azure storage account

I have a powershell script running in an azure function app that grabs a json list of IPs, parses it and just returns a list of IPs with 1 per line. I need to take this output and save it to a specific azure storage account. I'm attempting to use the static website in the storage account to create an HTTP endpoint for others to retrieve the list of IPs.
Here's the script
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
#-------------------------------------------------------------------------------------------
$url = "https://ip-ranges.atlassian.com/"
Invoke-RestMethod $url
$iplist = Invoke-RestMethod $url
$iplist.items | select-object -ExpandProperty cidr
out-file $outputBlob
I've tested the function in Azure and it runs there just fine. I can't seem to get the integration outputs section of the function app to work. The settings for the outputs are:
Binding type: Azure Blob Storage
Blob parameter name: outputBlob
Path: test/list.txt
Storage account connection: searched for and selected the storage account
I am not finding much documentation on how to make my powershell script output to this storage account. The out-file clearly doesn't work.
----------- updated code ----------
Here is the code that now successfully saves the file to a container, but I still can't save to the $web container for the static website. The $ is not something I can use in the output binding.
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
#------------------------
$url = "https://ip-ranges.atlassian.com/"
Invoke-RestMethod $url
$call = Invoke-RestMethod $url
$iplist = $call.items | select-object -ExpandProperty cidr
$output = out-string -InputObject $iplist
Push-OutputBinding -Name outputBlob -Value $output
The outputBlob binding is configured under Integration > Outputs, and the Path is in the format container/file. I cannot specify $web/file.txt, but if I use web/file.txt, it will create a web container and put the output as file.txt within it. I need to do this, but it must go in the $web container.
This is something that I've been meaning to try for a little while but hadn't actually got around to. I decided to give it a shot today when I saw your question.
So it is possible to push content using the blob output binding but the functionality is limited.
run.ps1
using namespace System.Net
# Input bindings are passed in via param block.
param (
$Request,
$TriggerMetadata
)
# Call the atlassian API to get the address ranges
$atlassianUri = "https://ip-ranges.atlassian.com/"
$iplist = Invoke-RestMethod $atlassianUri -ErrorAction Stop
$cidrList = $iplist.items | select-object -ExpandProperty cidr
# Push the contents to blob storage using the outputBinding
Push-OutputBinding -Name myOutputBlob -Value $cidrList
# Return a simple response so I know it worked
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = 'Successfully Updated Blob Storage'
})
function.json
You would have to include a timer input binding in your function (see the sketch after the function.json below), but I used HTTP so that I could trigger it on-demand to test that it would work.
I have provided a static path to the blob output binding. The Path property cannot be dynamically assigned from within the function yet, according to this open GitHub issue.
{
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "Request",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "Response"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "functioncopy/ipRanges.txt",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
]
}
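If you want to keep the timer trigger from your original function instead of my HTTP trigger, the httpTrigger/http bindings above would be swapped for a timer input binding along these lines (the schedule expression, every five minutes here, is an assumption; adjust to your needs):
{
"name": "Timer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 */5 * * * *"
}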
The above code works; however, when sending the data to the blob file in the storage account, Push-OutputBinding serialises the content to a JSON array, e.g.:
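Something like this, where the CIDR values are placeholder documentation ranges rather than real Atlassian output:
["192.0.2.0/24","198.51.100.0/24","203.0.113.0/24"]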
This may or may not work for you in its current guise, but I don't think there is a way using the output binding to just have a raw list.
You could however use the Az.Storage module within your function, create the file within your function execution and upload it that way instead
run.ps1
# Variables required - Fill these out
$storageAccountName = '<Insert Storage Account Name Here>'
$containerName = '<Insert StorageContainer Name Here>'
$resourceGroupName = '<Insert resourceGroup Name Here>'
$subscriptionId = '<Insert subscriptionId Here>'
# Call the atlassian API to get the address ranges
$atlassianUri = "https://ip-ranges.atlassian.com/"
$iplist = Invoke-RestMethod $atlassianUri -ErrorAction Stop
$cidrList = $iplist.items | select-object -ExpandProperty cidr
# New-TemporaryFile uses [System.IO.Path]::GetTempPath() location
$tempFile = New-TemporaryFile
# Set the context to the subscription you want to use
# If your functionApp has access to more than one subscription it will load the first subscription by default.
# Possibly a good habit to be explicit about context.
Set-AzContext -Subscription $subscriptionId
# Get the Storage Account Key to authenticate
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName
$primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
# Write the CIDR list to the temp file created earlier
$cidrList | Out-File $tempFile
# Create a Storage Context which will be used in the subsequent commands
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
# Upload the temp file to blob storage
$setAzStorageBlobContentSplat = @{
Container = $containerName
File = $tempFile.FullName
Blob = 'ipranges.txt'
Context = $storageContext
Properties = @{
ContentType = 'text/plain'
}
}
Set-AzStorageBlobContent @setAzStorageBlobContentSplat
# Return a simple response so I know it worked
Push-OutputBinding -Name Response -Value (
[HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = 'Successfully Updated Blob Storage'
}
)
You can see the documentation on Set-AzStorageBlobContent for examples on that here:
https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-6.2.1#examples
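As for targeting the $web container specifically: unlike the output binding path, the Az.Storage cmdlets take the container name as a plain string, so you can pass $web as long as you single-quote it to stop PowerShell expanding it as a variable. A minimal sketch, reusing $storageContext and $tempFile from the script above:
# Single quotes prevent PowerShell from interpreting $web as a variable
$setAzStorageBlobContentSplat = @{
Container = '$web'
File = $tempFile.FullName
Blob = 'file.txt'
Context = $storageContext
Properties = @{
ContentType = 'text/plain'
}
}
Set-AzStorageBlobContent @setAzStorageBlobContentSplat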

Configuring powershell core from .net core to provision a one drive

I'm attempting to provision a OneDrive in a .NET Core app using PowerShell Core. Running PowerShell, I've been able to successfully provision a OneDrive from the PowerShell command line following the directions provided below:
https://learn.microsoft.com/en-us/onedrive/pre-provision-accounts
Running it programmatically in .NET Core, however, it looks like it uses a separate PowerShell that's bundled into .NET Core 2.1.
I believe the unsuccessful in-app runs are due to the PowerShell bundled with Core not being set up correctly, namely the first 3 steps in the link above:
1. Download the latest SharePoint Online Management Shell.
2. Download and install the SharePoint Online Client Components SDK.
3. Connect to SharePoint Online as a global admin or SharePoint admin in Office 365. To learn how, see Getting started with SharePoint Online Management Shell.
How do I set up the powershell that gets run by my application to mirror those steps above?
My code looks like this:
using System.IO;
using System.Management.Automation;
namespace PowerShellApp
{
class Program
{
public static int Main(string[] args)
{
using (PowerShell ps = PowerShell.Create())
{
ps.AddScript(File.ReadAllText(<scriptLocation>))
.Invoke();
}
return 0;
}
}
}
How do I achieve these steps when executing within a .NET Core application?
The PowerShell script I'm running is below and also within the link above:
<#
.SYNOPSIS
This script adds an entry for each user specified in the input file
into the OneDrive provisioning queue
.DESCRIPTION
This script reads a text file with a line for each user.
Provide the User Principal Name of each user on a new line.
An entry will be made in the OneDrive provisioning queue for each
user up to 200 users.
.EXAMPLE
.\BulkEnqueueOneDriveSite.ps1 -SPOAdminUrl https://contoso-admin.sharepoint.com -InputfilePath C:\users.txt
.PARAMETER SPOAdminUrl
The URL for the SharePoint Admin center
https://contoso-admin.sharepoint.com
.PARAMETER InputFilePath
The path to the input file.
The file must contain 1 to 200 users
C:\users.txt
.NOTES
This script needs to be run by a global or SharePoint administrator in Office 365
This script will prompt for the username and password of the administrator
#>
param
(
#Must be SharePoint Administrator URL
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string] $SPOAdminUrl,
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string] $InputFilePath
)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.R untime") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.U serProfiles") | Out-Null
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SPOAdminUrl)
$Users = Get-Content -Path $InputFilePath
if ($Users.Count -eq 0 -or $Users.Count -gt 200)
{
Write-Host $("Unexpected user count: [{0}]" -f $Users.Count) - ForegroundColor Red
return
}
$web = $ctx.Web
Write-Host "Enter an admin username" -ForegroundColor Green
$username = Read-Host
Write-Host "Enter your password" -ForegroundColor Green
$password = Read-Host -AsSecureString
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username,$password)
$ctx.Load($web)
$ctx.ExecuteQuery()
$loader = [Microsoft.SharePoint.Client.UserProfiles.ProfileLoader]::GetProfileLoader($ctx)
$ctx.ExecuteQuery()
$loader.CreatePersonalSiteEnqueueBulk($Users)
$loader.Context.ExecuteQuery()
Write-Host "Script Completed"
I'm afraid the SharePoint Online Management Shell has dependencies on .NET Framework and will not work with Core (check this).
On the other hand, that module seems to be a wrapper on top of the REST API. So if you want to integrate it with a Core app, you may try to replace it with HTTP requests. Check this documentation.
Also, below is a base powershell script to work with those REST API endpoints. I tested this one on my site:
$baseUrl = "http://sharepoint.com/sites/yourSite/_api"
$cred = Get-Credential
# retrieve digest
$r = Invoke-WebRequest -Uri "$baseUrl/contextinfo" -Method Post -Credential $cred -SessionVariable sp
$digest = ([xml]$r.content).GetContextWebInformation.FormDigestvalue
# calling endpoint
$endpoint = "sp.userprofiles.profileloader.getprofileloader/getuserprofile"
$head = @{
"Accept" = "application/json;odata=verbose"
"X-RequestDigest" = $digest
}
$re = Invoke-WebRequest -Uri "$baseUrl/$endpoint" -Headers $head -Method Post -WebSession $sp
Write-Host $re.Content
This is a snippet for createpersonalsiteenqueuebulk, but I can't test it since I'm not domain admin. Hope it will work for you
#--- sample 2 (didn't test it since I'm not domain admin). Might need separate session/digest
$endpoint2 = "https://<domain>-admin.sharepoint.com/_api/sp.userprofiles.profileloader.getprofileloader/createpersonalsiteenqueuebulk"
$head = @{
"Accept" = "application/json;odata=verbose"
"X-RequestDigest" = $digest
}
$body = "{ 'emailIDs': ['usera@domain.onmicrosoft.com', 'userb@domain.onmicrosoft.com'] }"
$re2 = Invoke-WebRequest -Uri "$endpoint2" -Headers $head -body $body -Method Post -WebSession $sp
Write-Host $re2.Content

How to use PowerShell to download files from SharePoint?

I've used the following sites to help me get this far and to troubleshoot.
Download file from SharePoint
How to download newest file from SharePoint using PowerShell
Mike Smith's Tech Training Notes SharePoint, PowerShell and .Net!
Upload file to a SharePoint doc library via PowerShell
Download latest file from SharePoint Document Library
How to iterate each folders in each of the SharePoint websites using PowerShell
PowerShell's Get-ChildItem on SharePoint Library
I am trying to download random files from a SharePoint folder, and I have it working for when I actually know the file name and extension.
Working code with name and extension:
$SharePoint = "https://Share.MyCompany.com/MyCustomer/WorkLoad.docx"
$Path = "$ScriptPath\$($CustomerName)_WorkLoad.docx"
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Download Files
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential($UserName, $Password)
$WebClient.DownloadFile($SharePoint, $Path)
However, I don't seem to be able to figure out how to do it with multiple files of unknown names or extensions.
I have tried mapping a drive, only to end up with "drive mapping failed" & "The network path was not found." errors:
$SharePoint = Read-Host 'Enter the full path to Delivery Site'
$LocalDrive = 'P:'
$Credentials = Get-Credential
if (!(Test-Path $LocalDrive -PathType Container)) {
$retrycount = 0; $completed = $false
while (-not $completed) {
Try {
if (!(Test-Path $LocalDrive -PathType Container)) {
(New-Object -ComObject WScript.Network).MapNetworkDrive($LocalDrive,$SharePoint,$false,$($Credentials.username),$($Credentials.GetNetworkCredential().password))
}
$Completed = $true
}
Catch {
if ($retrycount -ge '5') {
Write-Verbose "Mapping SharePoint drive failed the maximum number of times"
throw "SharePoint drive mapping failed for '$($SharePoint)': $($Global:Error[0].Exception.Message)"
} else {
Write-Verbose "Mapping SharePoint drive failed, retrying in 5 seconds."
Start-Sleep '5'
$retrycount++
}
}
}
}
I've also used the following code with similar results or no results at all.
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Gathering the location of the Card Formats and Destination folder
$Customer = "$SharePoint\MyCustomer"
$Products = "$Path\$($CustomerName)\Products\"
#Get Documents from SharePoint
$credential = New-Object System.Management.Automation.PSCredential($UserName, $Password)
New-PSDrive -Credential $credential -Name "A" -PSProvider "FileSystem" -Root "$SharePoint"
net use $spPath $password /USER:$user@corporate
#Get PMDeliverables file objects recursively
Get-ChildItem -Path "$Customer" | Where-Object { $_.name -like 'MS*' } | Copy-Item -Destination $Products -Force -Verbose
Without defined "input parameters", it's not exactly clear what full solution you need, so I'll provide a few snippets of PowerShell that should be of use based on what you've described.
I'll spare you the basics of the various OOTB functions (i.e. Get-SPWeb, etc) though can provide those details as well if needed. I've also been overly explicit in the scripting, though know some of these lines could be chained, piped, etc to be made shorter & more efficient.
This example will iterate over the contents of a SharePoint Library and download them to your local machine:
$Destination = "C:\YourDestinationFolder\ForFilesFromSP"
$Web = Get-SPWeb "https://YourServerRoot/sites/YourSiteCollection/YourSPWebURL"
$DocLib = $Web.Lists["Your Doc Library Name"]
$DocLibItems = $DocLib.Items
foreach ($DocLibItem in $DocLibItems) {
if($DocLibItem.Url -Like "*.docx") {
$File = $Web.GetFile($DocLibItem.Url)
$Binary = $File.OpenBinary()
$Stream = New-Object System.IO.FileStream(($Destination + "\" + $File.Name), [System.IO.FileMode]::Create)
$Writer = New-Object System.IO.BinaryWriter($Stream)
$Writer.write($Binary)
$Writer.Close()
}
}
This is pretty basic; the variables up top are where on your local machine you wish to store the download files ($Destination), the URL of your SharePoint Site/Web ($Web) and the name of the Document Library (Your Doc Library Name).
The script then iterates through the items in the Library (foreach ($DocLibItem in $DocLibItems) {}), optionally filters for say items with a .docx file extension and downloads each to your local machine.
You could customize this further by targeting a specific sub-folder within the Doc Library, filtering by metadata or properties of the docs, or even iterating over multiple Sites, Webs, and/or Libraries in one script, optionally filtering those based on similar properties.
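For instance, a minimal sketch of the sub-folder filtering mentioned above, applied inside the same foreach loop ("Reports" is a hypothetical sub-folder name; adjust it to your library structure):
# Only process .docx items that live under the "Reports" sub-folder of the library
if ($DocLibItem.Url -like "*/Reports/*" -and $DocLibItem.Url -like "*.docx") {
# ...same download logic as shown above...
}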

Getting error output from a powershell 2.0 script running as a task

TL:DR actual question is at the bottom
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW I doing via calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time of day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalization function; handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for next DAT if the DAT's age is older than today.
# Otherwise, exit the program if DATs age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1@domain.com"
$msg.ReplyTo = "email2@domain.com"
$msg.To.Add("email2@domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error in starting PowerShell, either because the execution policy is different on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2&>1
This way if there would be an error on starting Powershell, it will be logged in the c:\windows\temp\test.log file. If the issue is in execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running a task under the account you plan to use for your main task will first get the policies in effect (so that if setting machine-level policy doesn't help, you'll know what scope to alter) and then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly discouraged; there are encoder scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If that's not policy, there might be some errors in properly writing the parameters for the task. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
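Another option, once PowerShell itself is starting correctly, is to capture terminating errors from inside the script by wrapping the main body in try/catch and appending the error record to a log file (the log path below is just an example):
$logFile = "d:\datscript\tasklog.txt"
try {
# ... main script body goes here ...
}
catch {
# Append the full error record so you can see what failed and where
Add-Content -Path $logFile -Value ("$(Get-Date) ERROR: " + ($_ | Out-String))
}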

EWS Powershell Exchange 2013 FindFolders returns 0 results

I am currently trying to make this script run on Exchange 2013 to convert folder types from IPF.IMAP to IPF.NOTE, as the folders are not showing on mobile devices after being imported from IMAP. This script returns 0 results after running, along with multiple "Folder Doesn't Exist" messages. If I output the folder names, they are coming through, so I am not sure why FindFolders is not returning any results.
I tried turning on impersonation (commented out here) but get an error saying I do not have permission to impersonate, even though I am logged in as administrator and running PowerShell as admin. I am not sure if this is even necessary, as the script works fine and returns the folder names for both $mbxfolder.Name and $SfSearchFilter, but only until it hits the FindFolders line; then the TotalCount is always 0.
Import-Module -Name "C:\Program Files\Microsoft\Exchange\Web Services\1.2\Microsoft.Exchange.WebServices.dll"
$exchService = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService
$exchService.UseDefaultCredentials = $true
$exchService.AutodiscoverUrl('email@domain.com', {$true})
$MBXID = "email@domain.com" #Define mailboxID
foreach ($MailboxIdentity in $MBXID) {
Write-Host "Searching for $MailboxIdentity"
$mailbox = (Get-Mailbox -Identity $MailboxIdentity)
$MailboxName = (Get-Mailbox -Identity $MailboxIdentity).PrimarySmtpAddress.ToString()
$MailboxRootid = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Root,$MailboxName) #MsgFolderRoot selection and creation of new root
$MailboxRoot = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($exchService,$MailboxRootid)
#$exchService.ImpersonatedUserId = New-Object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId -ArgumentList ([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress),$MailboxName #Define impersonation
$folderid = $MailboxRootid
$f1 = $MailboxRoot
$fold = get-mailboxfolderstatistics $MailboxIdentity #Getting complete list of selected mailbox
foreach ($mbxfolder in $fold){
#Define Folder View Really only want to return one object
$fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(100) #page size for displayed folders
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep; #Search traversal selection Deep = recursively
#Define a Search folder that is going to do a search based on the DisplayName of the folder
$SfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::Displayname,$MBXFolder.name) #for each folder in mailbox define search
$findFolderResults = $MailboxRoot.FindFolders($SfSearchFilter,$fvFolderView) #for each folder in mailbox define folder view (this is online task for store.exe) and perform search
if ($findFolderResults.TotalCount -eq 0){ "Folder Doesn't Exist" } #Info if folder still exist
else {"Folder Exist"
ForEach ($Folder in $findFolderResults.Folders) { #for each folder in folder results perform check of folder class
$folder.folderclass #Info about folder class
if ($Folder.folderclass -eq "IPF.Imap"){ #If folder class is target type, do change and update
$Folder.folderclass = "IPF.Note" #Folder class change in variable
Write-Host "Updating folder $folder.name to correct type IPF.Note. Folder will start to be visible in OWA"
$Folder.update() #Folder class update in mailbox via EWS
}
}
}
}
}
It doesn't really make much sense to enumerate the folders using Get-MailboxFolderStatistics and then search for each folder in EWS. This is going to be really slow and unnecessary (you have the folderId anyway from Get-MailboxFolderStatistics, so you could just convert that and bind to it). However, I would get rid of Get-MailboxFolderStatistics altogether and just use straight EWS to enumerate the folders in the mailbox and do your fixes there, as this will be much quicker, e.g.:
## Get the Mailbox to Access from the 1st commandline argument
$MailboxName = $args[0]
## Load Managed API dll
###CHECK FOR EWS MANAGED API, IF PRESENT IMPORT THE HIGHEST VERSION EWS DLL, ELSE EXIT
$EWSDLL = (($(Get-ItemProperty -ErrorAction SilentlyContinue -Path Registry::$(Get-ChildItem -ErrorAction SilentlyContinue -Path 'Registry::HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Exchange\Web Services'|Sort-Object Name -Descending| Select-Object -First 1 -ExpandProperty Name)).'Install Directory') + "Microsoft.Exchange.WebServices.dll")
if (Test-Path $EWSDLL)
{
Import-Module $EWSDLL
}
else
{
"$(get-date -format yyyyMMddHHmmss):"
"This script requires the EWS Managed API 1.2 or later."
"Please download and install the current version of the EWS Managed API from"
"http://go.microsoft.com/fwlink/?LinkId=255472"
""
"Exiting Script."
exit
}
## Set Exchange Version
$ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2007_SP1
## Create Exchange Service Object
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)
## Set Credentials to use; two options are available: Option 1, use explicit credentials, or Option 2, use the Default (logged-on) credentials
#Credentials Option 1 using UPN for the windows Account
$psCred = Get-Credential
$creds = New-Object System.Net.NetworkCredential($psCred.UserName.ToString(),$psCred.GetNetworkCredential().password.ToString())
$service.Credentials = $creds
#$service.TraceEnabled = $true
#Credentials Option 2
#service.UseDefaultCredentials = $true
## Choose to ignore any SSL Warning issues caused by Self Signed Certificates
## Code From http://poshcode.org/624
## Create a compilation environment
$Provider=New-Object Microsoft.CSharp.CSharpCodeProvider
$Compiler=$Provider.CreateCompiler()
$Params=New-Object System.CodeDom.Compiler.CompilerParameters
$Params.GenerateExecutable=$False
$Params.GenerateInMemory=$True
$Params.IncludeDebugInformation=$False
$Params.ReferencedAssemblies.Add("System.DLL") | Out-Null
$TASource=@'
namespace Local.ToolkitExtensions.Net.CertificatePolicy{
public class TrustAll : System.Net.ICertificatePolicy {
public TrustAll() {
}
public bool CheckValidationResult(System.Net.ServicePoint sp,
System.Security.Cryptography.X509Certificates.X509Certificate cert,
System.Net.WebRequest req, int problem) {
return true;
}
}
}
'@
$TAResults=$Provider.CompileAssemblyFromSource($Params,$TASource)
$TAAssembly=$TAResults.CompiledAssembly
## We now create an instance of the TrustAll and attach it to the ServicePointManager
$TrustAll=$TAAssembly.CreateInstance("Local.ToolkitExtensions.Net.CertificatePolicy.TrustAll")
[System.Net.ServicePointManager]::CertificatePolicy=$TrustAll
## end code from http://poshcode.org/624
## Set the URL of the CAS (Client Access Server) to use; two options are available: use Autodiscover to find the CAS URL, or hardcode the CAS to use
#CAS URL Option 1 Autodiscover
$service.AutodiscoverUrl($MailboxName,{$true})
"Using CAS Server : " + $Service.url
#CAS URL Option 2 Hardcoded
#$uri=[system.URI] "https://casservername/ews/exchange.asmx"
#$service.Url = $uri
## Optional section for Exchange Impersonation
#$service.ImpersonatedUserId = new-object Microsoft.Exchange.WebServices.Data.ImpersonatedUserId([Microsoft.Exchange.WebServices.Data.ConnectingIdType]::SmtpAddress, $MailboxName)
#Define Function to convert String to FolderPath
function ConvertToString($ipInputString){
$Val1Text = ""
for ($clInt=0;$clInt -lt $ipInputString.length;$clInt++){
$Val1Text = $Val1Text + [Convert]::ToString([Convert]::ToChar([Convert]::ToInt32($ipInputString.Substring($clInt,2),16)))
$clInt++
}
return $Val1Text
}
#Define Extended properties
$PR_FOLDER_TYPE = new-object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(13825,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer);
$folderidcnt = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$MailboxName)
#Define the FolderView used for Export; should not be any larger than 1000 folders due to throttling
$fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1000)
#Deep Traversal will ensure all folders in the search path are returned
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep;
$psPropertySet = new-object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
$PR_Folder_Path = new-object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(26293, [Microsoft.Exchange.WebServices.Data.MapiPropertyType]::String);
#Add Properties to the Property Set
$psPropertySet.Add($PR_Folder_Path);
$fvFolderView.PropertySet = $psPropertySet;
#The Search filter will exclude any Search Folders
$sfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo($PR_FOLDER_TYPE,"1")
$fiResult = $null
#The Do loop will handle any paging that is required if there are more than 1000 folders in a mailbox
do {
$fiResult = $Service.FindFolders($folderidcnt,$sfSearchFilter,$fvFolderView)
foreach($ffFolder in $fiResult.Folders){
$foldpathval = $null
#Try to get the FolderPath Value and then covert it to a usable String
if ($ffFolder.TryGetProperty($PR_Folder_Path,[ref] $foldpathval))
{
$binarry = [Text.Encoding]::UTF8.GetBytes($foldpathval)
$hexArr = $binarry | ForEach-Object { $_.ToString("X2") }
$hexString = $hexArr -join ''
$hexString = $hexString.Replace("FEFF", "5C00")
$fpath = ConvertToString($hexString)
}
"FolderPath : " + $fpath
"Folder Class : " + $ffFolder.FolderClass
}
$fvFolderView.Offset += $fiResult.Folders.Count
}while($fiResult.MoreAvailable -eq $true)
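To apply your original folder-class fix with this approach, you can act on each folder inside the same foreach loop, reusing the update logic from your own script:
# Inside the foreach($ffFolder in $fiResult.Folders) loop above:
if ($ffFolder.FolderClass -eq "IPF.Imap") {
$ffFolder.FolderClass = "IPF.Note" #Change the folder class on the local object
$ffFolder.Update() #Push the change back to the mailbox via EWS
}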
Cheers
Glen