Intermittent error messages when running an Azure Functions app - PowerShell

I have a PowerShell program which runs on a schedule in an Azure Functions app. It connects to Office 365 to download audit logs, makes some changes and then exports a CSV to an Azure Data Lake Storage account. To avoid hard-coded credentials, an Azure Key Vault stores the secrets. I created a managed identity in the Azure Function, along with the required application settings and URLs pointing to Azure Key Vault. The code references the application settings (APPSETTING) and all seemed to be running well, until I noticed today that the CSV files exported since yesterday afternoon were empty.
So I opened up the Function app, hit Run manually, and saw a CSV file exported with data. When I took a look at the execution log, however, I spotted these error messages which, despite not affecting the execution this time, make me wonder whether they caused the problem with the empty CSV files. The program is now running on its schedule as normal and the error messages appear to be intermittent.
I'm not sure why it's complaining about the username and password, when it is clearly able to access the data source (Office audit logs), export the CSV and transfer it to the file destination (Azure Data Lake Storage) successfully.
Any idea what is going on? Any tips or suggestions welcome! Code provided below. Many thanks!
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
<#
Title: Power BI Audit Logging
Client:
Description: Connects to Azure audit logs using admin credentials (secrets via Azure Key Vault). Opens a session to iterate through the audit log ($currentResults) and aggregate
the logs into a single object ($aggregateResults). A foreach loop then iterates through $aggregateResults and assigns each data piece (datum)
to a PowerShell object to which properties are added to hold the audit data. A CSV file is created and exported, and then transferred to a Data Lake storage account (using a SAS secret via Azure Key Vault).
Last Revision: 06/09/2020 #>
Set-ExecutionPolicy RemoteSigned
Set-Item ENV:\SuppressAzurePowerShellBreakingChangeWarnings "true"
# Better for scheduled jobs
$uSecret = $ENV:APPSETTING_SecretUsername
$pSecret = $ENV:APPSETTING_SecretPassword
$sasSecret = $ENV:APPSETTING_SecretSAS
$securePassword = ConvertTo-SecureString -String $pSecret -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $uSecret, $securePassword
# This will prompt the user for credential (optional)
# $UserCredential = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $session
$startDate=(get-date).AddDays(-10)
$endDate=(get-date)
$scriptStart=(get-date)
$sessionName = (get-date -Format 'u')+'pbiauditlog'
# Reset user audit accumulator
$aggregateResults = @()
$i = 0 # Loop counter
Do {
$currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $enddate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType PowerBIAudit
if ($currentResults.Count -gt 0) {
Write-Host ("Finished {3} search #{1}, {2} records: {0} min" -f [math]::Round((New-TimeSpan -Start $scriptStart).TotalMinutes,4), $i, $currentResults.Count, $user.UserPrincipalName )
# Accumulate the data.
$aggregateResults += $currentResults
# No need to do another query if the # records returned <1000 - should save around 5-10 seconds per user.
if ($currentResults.Count -lt 1000) {
$currentResults = @()
} else {
$i++
}
}
} Until ($currentResults.Count -eq 0) # End of Session Search Loop.
$data = @()
foreach ($auditlogitem in $aggregateResults) {
$datum = New-Object -TypeName PSObject
$d = ConvertFrom-json $auditlogitem.AuditData
$datum | Add-Member -MemberType NoteProperty -Name Id -Value $d.Id
$datum | Add-Member -MemberType NoteProperty -Name CreationTDateTime -Value $d.CreationDate
$datum | Add-Member -MemberType NoteProperty -Name CreationTime -Value $d.CreationTime
$datum | Add-Member -MemberType NoteProperty -Name RecordType -Value $d.RecordType
$datum | Add-Member -MemberType NoteProperty -Name Operation -Value $d.Operation
$datum | Add-Member -MemberType NoteProperty -Name OrganizationId -Value $d.OrganizationId
$datum | Add-Member -MemberType NoteProperty -Name UserType -Value $d.UserType
$datum | Add-Member -MemberType NoteProperty -Name UserKey -Value $d.UserKey
$datum | Add-Member -MemberType NoteProperty -Name Workload -Value $d.Workload
$datum | Add-Member -MemberType NoteProperty -Name UserId -Value $d.UserId
$datum | Add-Member -MemberType NoteProperty -Name ClientIPAddress -Value $d.ClientIPAddress
$datum | Add-Member -MemberType NoteProperty -Name UserAgent -Value $d.UserAgent
$datum | Add-Member -MemberType NoteProperty -Name Activity -Value $d.Activity
$datum | Add-Member -MemberType NoteProperty -Name ItemName -Value $d.ItemName
$datum | Add-Member -MemberType NoteProperty -Name WorkSpaceName -Value $d.WorkSpaceName
$datum | Add-Member -MemberType NoteProperty -Name DashboardName -Value $d.DashboardName
$datum | Add-Member -MemberType NoteProperty -Name DatasetName -Value $d.DatasetName
$datum | Add-Member -MemberType NoteProperty -Name ReportName -Value $d.ReportName
$datum | Add-Member -MemberType NoteProperty -Name WorkspaceId -Value $d.WorkspaceId
$datum | Add-Member -MemberType NoteProperty -Name ObjectId -Value $d.ObjectId
$datum | Add-Member -MemberType NoteProperty -Name DashboardId -Value $d.DashboardId
$datum | Add-Member -MemberType NoteProperty -Name DatasetId -Value $d.DatasetId
$datum | Add-Member -MemberType NoteProperty -Name ReportId -Value $d.ReportId
$datum | Add-Member -MemberType NoteProperty -Name OrgAppPermission -Value $d.OrgAppPermission
# Option to include the below JSON column however for large amounts of data it may be difficult for PBI to parse
$datum | Add-Member -MemberType NoteProperty -Name Datasets -Value (ConvertTo-Json $d.Datasets)
# Below is a simple PowerShell statement to grab one of the entries and place in the DatasetName if any exist
foreach ($dataset in $d.datasets) {
$datum.DatasetName = $dataset.DatasetName
$datum.DatasetId = $dataset.DatasetId
}
$data+=$datum
}
$dateTimestring = $startDate.ToString("yyyyMMdd") + "_" + (Get-Date -Format "yyyyMMdd") + "_" + (Get-Date -Format "HHmm")
$fileName = ($dateTimestring + ".csv")
Write-Host ("Writing to file {0}" -f $fileName)
$filePath = "$Env:temp/" + $fileName
$data | Export-csv -Path $filePath
# File transfer to Azure storage account
Get-AzContext #Connect-AzAccount -Credential $UserCredential
Get-AzVM -ResourceGroupName "Audit" -status
$Context = New-AzStorageContext -StorageAccountName "auditingstorage" -StorageAccountKey $sasSecret
Set-AzStorageBlobContent -Force -Context $Context -Container "auditlogs" -File $filePath -Blob $filename
# Close PowerShell session
Remove-PSSession -Id $Session.Id

Your error states:
ERROR: Connect-AzAccount : Username + Password authentication is not
supported in PowerShell Core. Please use device code authentication
for interactive log in, or Service Principal authentication for script
log in.
The problem comes from using the credential authentication scheme in PowerShell Core:
Connect-AzAccount -Credential $UserCredential
Instead, in your app, enable the System Managed Identity and grant it the permissions to access what you need.
You can do that by going into the Identity pane and turning the status to On in the System assigned tab.
From there, add the required access through the Azure role assignments button.
Once this is done, you don't need to use Connect-AzAccount; your app is connected to the managed identity automatically at runtime. You can use the Object ID from the Identity pane to find it afterward in Azure Active Directory / App Registrations and assign it additional API access if needed.
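For the storage upload step in the question, the managed-identity route might look roughly like this (a sketch, assuming the identity has been granted a data role such as Storage Blob Data Contributor on the account; the account, container and variable names are taken from the question's code):

```powershell
# Sketch: upload the CSV using the Function's system-assigned managed identity.
# Assumes the identity has a blob data role on the "auditingstorage" account.
Connect-AzAccount -Identity

# -UseConnectedAccount makes the storage context use the signed-in identity
# instead of an account key or SAS token.
$Context = New-AzStorageContext -StorageAccountName "auditingstorage" -UseConnectedAccount
Set-AzStorageBlobContent -Force -Context $Context -Container "auditlogs" -File $filePath -Blob $fileName
```

This removes the username/password (and the SAS app setting) from the storage step entirely.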
Additional note
You could always continue to use Connect-AzAccount with a service principal account, but unless you have requirements for that, I'd go the Managed Identity route.
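If you do go the service-principal route instead, a minimal sketch of a script log-in (the three APPSETTING_* names here are hypothetical, not from the question; you would create them as Key Vault references):

```powershell
# Sketch: non-interactive log-in with a service principal.
# APPSETTING_SpClientId / SpClientSecret / TenantId are hypothetical settings.
$spSecret = ConvertTo-SecureString -String $env:APPSETTING_SpClientSecret -AsPlainText -Force
$spCred   = New-Object System.Management.Automation.PSCredential($env:APPSETTING_SpClientId, $spSecret)
Connect-AzAccount -ServicePrincipal -Credential $spCred -Tenant $env:APPSETTING_TenantId
```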
References
How to use managed Identities for App Service and Azure Functions
Create an Azure service principal with Azure Powershell

Related

Get All Web Applications from SharePoint On-Prem using PowerShell CSOM

How to get all Web Applications with Content DB names from SharePoint 2013/2016 Farm using PowerShell CSOM?
A simple one:
#Get all web applications sharepoint using powershell
$WebAppColl = Get-SPWebApplication
#Iterate through each web application
Foreach ($WebApp in $WebAppColl)
{
$Webapp.Url
}
#Or a one liner to loop through web applications
#Get-SPWebApplication | Select URL
#Read more: https://www.sharepointdiary.com/2016/01/get-all-web-applications-in-sharepoint-using-powershell.html#ixzz7JoRyZrx8
with more data:
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
#Configuration Parameters
$ReportOutput= "C:\WebApplications-Report.csv"
$DataCollection = @()
#Get all web applications sharepoint using powershell
$WebAppColl = Get-SPWebApplication
Foreach ($WebApp in $WebAppColl)
{
#Determine the Authentication Type of the web application
if ($WebApp.UseClaimsAuthentication) { $AuthenticationType = "Claims" } else { $AuthenticationType = "Classic" }
#Get All Managed Paths of the web application
$ManagedPaths = (Get-SPManagedPath -WebApplication $WebApp | Select -ExpandProperty Name) -join ","
$WebAppData = New-Object PSObject
$WebAppData | Add-Member -MemberType NoteProperty -Name "Web Application Name" -Value $WebApp.Name
$WebAppData | Add-Member -MemberType NoteProperty -Name "URL" -Value $WebApp.URL
$WebAppData | Add-Member -MemberType NoteProperty -Name "No.of Content Databases" -Value $WebApp.ContentDatabases.Count
$WebAppData | Add-Member -MemberType NoteProperty -Name "Authentication Type" -Value $AuthenticationType
$WebAppData | add-member -membertype NoteProperty -name "Application Pool" -Value $WebApp.ApplicationPool.DisplayName
$WebAppData | add-member -membertype NoteProperty -name "Outgoing E-mail" -Value $WebApp.OutboundMailServiceInstance[0].Server.Address
$WebAppData | add-member -membertype NoteProperty -name "Managed Paths" -Value $ManagedPaths
$DataCollection += $WebAppData
}
#Export Results to a CSV File
$DataCollection | Export-csv $ReportOutput -notypeinformation
Write-Host "Web Application Audit Report has been Generated!" -f Green
#Read more: https://www.sharepointdiary.com/2016/01/get-all-web-applications-in-sharepoint-using-powershell.html#ixzz7JoSCM4YG
more info - https://www.sharepointdiary.com/2016/01/get-all-web-applications-in-sharepoint-using-powershell.html

How to reference application settings (key vault references) in Azure Function using Powershell

I am writing a small program in PowerShell which connects to Office 365 to download audit logs, make some changes and then export a CSV to an Azure Data Lake Storage account. To run this process on a schedule, I have created an Azure Function app (timer template) to run the program. To avoid hard-coded credentials, I created an Azure Key Vault to store the credential secrets. I created a managed identity in the Azure Function, created the secrets in Azure Key Vault with the credentials, and then created three application settings in the Azure Function under "Configuration" with the URLs pointing at the secrets stored in Azure Key Vault.
The three application settings are called "SecretUsername", "SecretPassword" (to point to the Office 365) and "SecretSAS" (to store the CSV in ADLS).
How do I refer to these variables in my PowerShell script? I have tried different variations in my code, but none appear to work. Examples:
$uSecret = $SecretUsername
$uSecret = $ENV:SecretUsername
$uSecret = ENV:SecretUsername
$uSecret = (Get-ChildItem ENV:SecretUsername).SecretValueText
# Input bindings are passed in via param block.
param($Timer)
# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()
# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
Write-Host "PowerShell timer is running late!"
}
# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"
Set-ExecutionPolicy AllSigned
Set-Item ENV:\SuppressAzurePowerShellBreakingChangeWarnings "true"
$uSecret = (Get-ChildItem ENV:SecretUsername).SecretValueText
$pSecret = (Get-ChildItem ENV:SecretPassword).SecretValueText
$sasSecret = (Get-ChildItem ENV:SecretSAS).SecretValueText
$securePassword = ConvertTo-SecureString -String $pSecret -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $uSecret, $securePassword
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $session
$startDate=(get-date).AddDays(-10)
$endDate=(get-date)
$scriptStart=(get-date)
$sessionName = (get-date -Format 'u')+'pbiauditlog'
$aggregateResults = @()
$i = 0 # Loop counter
Do {
$currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $enddate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType PowerBIAudit
if ($currentResults.Count -gt 0) {
Write-Host ("Finished {3} search #{1}, {2} records: {0} min" -f [math]::Round((New-TimeSpan -Start $scriptStart).TotalMinutes,4), $i, $currentResults.Count, $user.UserPrincipalName )
# Accumulate the data.
$aggregateResults += $currentResults
# No need to do another query if the # records returned <1000 - should save around 5-10 seconds per user.
if ($currentResults.Count -lt 1000) {
$currentResults = @()
} else {
$i++
}
}
} Until ($currentResults.Count -eq 0) # End of Session Search Loop.
$data = @()
foreach ($auditlogitem in $aggregateResults) {
$datum = New-Object -TypeName PSObject
$d = ConvertFrom-json $auditlogitem.AuditData
$datum | Add-Member -MemberType NoteProperty -Name Id -Value $d.Id
$datum | Add-Member -MemberType NoteProperty -Name CreationTDateTime -Value $d.CreationDate
$datum | Add-Member -MemberType NoteProperty -Name CreationTime -Value $d.CreationTime
$datum | Add-Member -MemberType NoteProperty -Name RecordType -Value $d.RecordType
$datum | Add-Member -MemberType NoteProperty -Name Operation -Value $d.Operation
$datum | Add-Member -MemberType NoteProperty -Name OrganizationId -Value $d.OrganizationId
$datum | Add-Member -MemberType NoteProperty -Name UserType -Value $d.UserType
$datum | Add-Member -MemberType NoteProperty -Name UserKey -Value $d.UserKey
$datum | Add-Member -MemberType NoteProperty -Name Workload -Value $d.Workload
$datum | Add-Member -MemberType NoteProperty -Name UserId -Value $d.UserId
$datum | Add-Member -MemberType NoteProperty -Name ClientIPAddress -Value $d.ClientIPAddress
$datum | Add-Member -MemberType NoteProperty -Name UserAgent -Value $d.UserAgent
$datum | Add-Member -MemberType NoteProperty -Name Activity -Value $d.Activity
$datum | Add-Member -MemberType NoteProperty -Name ItemName -Value $d.ItemName
$datum | Add-Member -MemberType NoteProperty -Name WorkSpaceName -Value $d.WorkSpaceName
$datum | Add-Member -MemberType NoteProperty -Name DashboardName -Value $d.DashboardName
$datum | Add-Member -MemberType NoteProperty -Name DatasetName -Value $d.DatasetName
$datum | Add-Member -MemberType NoteProperty -Name ReportName -Value $d.ReportName
$datum | Add-Member -MemberType NoteProperty -Name WorkspaceId -Value $d.WorkspaceId
$datum | Add-Member -MemberType NoteProperty -Name ObjectId -Value $d.ObjectId
$datum | Add-Member -MemberType NoteProperty -Name DashboardId -Value $d.DashboardId
$datum | Add-Member -MemberType NoteProperty -Name DatasetId -Value $d.DatasetId
$datum | Add-Member -MemberType NoteProperty -Name ReportId -Value $d.ReportId
$datum | Add-Member -MemberType NoteProperty -Name OrgAppPermission -Value $d.OrgAppPermission
# Option to include the below JSON column however for large amounts of data it may be difficult for PBI to parse
$datum | Add-Member -MemberType NoteProperty -Name Datasets -Value (ConvertTo-Json $d.Datasets)
# Below is a simple PowerShell statement to grab one of the entries and place in the DatasetName if any exist
foreach ($dataset in $d.datasets) {
$datum.DatasetName = $dataset.DatasetName
$datum.DatasetId = $dataset.DatasetId
}
$data+=$datum
}
$dateTimestring = $startDate.ToString("yyyyMMdd") + "_" + (Get-Date -Format "yyyyMMdd") + "_" + (Get-Date -Format "HHmm")
$fileName = ($dateTimestring + ".csv")
Write-Host ("Writing to file {0}" -f $fileName)
$filePath = "$Env:temp/" + $fileName
$data | Export-csv -Path $filePath
Connect-AzAccount -Credential $UserCredential
Get-AzVM -ResourceGroupName "Audit" -status
$Context = New-AzStorageContext -StorageAccountName "auditingstorage" -StorageAccountKey $sasSecret
Set-AzStorageBlobContent -Force -Context $Context -Container "auditlogs" -File $filePath -Blob $filename
Remove-PSSession -Id $Session.Id
How do I reference the application settings in Azure Function so that I can use the stored secrets in my program?
Please assist! Many thanks!
To access the app settings, Key Vault-referenced or not, you must retrieve them through: $env:APPSETTING_YourSettingName
Thus, for your keyvault referenced secret, you would access it through the following variables.
$env:APPSETTING_SecretUserName
$env:APPSETTING_SecretPassword
$env:APPSETTING_SecretSAS
And if you ever need to produce a list of them:
Get-ChildItem env:APPSETTING_*
Note: the values returned will be plain-text, unencrypted strings.
Therefore, in your code, this:
$uSecret = (Get-ChildItem ENV:SecretUsername).SecretValueText
becomes this:
$uSecret = $env:APPSETTING_SecretUserName
Additional note
Since it was pointed out in the comments, I'll mention it.
I am not advocating the use of clear-text secrets in app settings at all.
App settings should be a Key Vault reference for any sensitive data.
I am simply stating that they can be retrieved within the function at runtime as clear text through the $env:APPSETTING_YourSettingName variable.
Example:
AppSetting name : MySecretUser
AppSetting value: @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/ec96f02080254f109c51a1f14cdb1931)
Actual secret value (In the keyvault) : I_AM_Secret
At runtime, getting the value of $env:APPSETTING_MySecretUser will return a string with the value I_AM_Secret.
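Putting it together for the question's code: because the value arrives as plain text, the password still needs to be converted back into a SecureString before building the credential (a short sketch using the question's setting names):

```powershell
# The Key Vault-referenced settings surface as plain-text environment variables.
$uSecret = $env:APPSETTING_SecretUsername
$pSecret = $env:APPSETTING_SecretPassword

# Rebuild the SecureString / PSCredential pair from the plain-text values.
$securePassword = ConvertTo-SecureString -String $pSecret -AsPlainText -Force
$UserCredential = New-Object System.Management.Automation.PSCredential($uSecret, $securePassword)
```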

Publish from PowerShell directly to Data Lake Storage or output as a sink for ADF

I have a PowerShell script which downloads audit logs from Azure. The Export-CSV function outputs the file to my local computer. My plan however is to run this script every night using Azure Data Factory and then output the log file directly to Data Lake Storage, not locally.
ADF > PowerShell Script > Data Lake Storage
I need to amend this script so that it either outputs the CSV file directly to Data Lake Storage OR it outputs it so that ADF can channel it to a sink (Data Lake Storage).
Set-ExecutionPolicy RemoteSigned
#This is better for scheduled jobs
$User = "admin@M365XXXXXX.onmicrosoft.com"
$PWord = ConvertTo-SecureString -String "XXXXXXXX" -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
#This will prompt the user for credential
#$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
$startDate=(get-date).AddDays(-5)
$endDate=(get-date)
$scriptStart=(get-date)
$sessionName = (get-date -Format 'u')+'pbiauditlog'
# Reset user audit accumulator
$aggregateResults = @()
$i = 0 # Loop counter
Do {
$currentResults = Search-UnifiedAuditLog -StartDate $startDate -EndDate $enddate -SessionId $sessionName -SessionCommand ReturnLargeSet -ResultSize 1000 -RecordType PowerBIAudit
if ($currentResults.Count -gt 0) {
Write-Host (" Finished {3} search #{1}, {2} records: {0} min" -f [math]::Round((New-TimeSpan -Start $scriptStart).TotalMinutes,4), $i, $currentResults.Count, $user.UserPrincipalName )
# Accumulate the data
$aggregateResults += $currentResults
# No need to do another query if the # recs returned <1k - should save around 5-10 sec per user
if ($currentResults.Count -lt 1000) {
$currentResults = @()
} else {
$i++
}
}
} Until ($currentResults.Count -eq 0) # --- End of Session Search Loop --- #
$data = @()
foreach ($auditlogitem in $aggregateResults) {
$datum = New-Object -TypeName PSObject
$d = ConvertFrom-Json $auditlogitem.AuditData
$datum | Add-Member -MemberType NoteProperty -Name Id -Value $d.Id
$datum | Add-Member -MemberType NoteProperty -Name CreationDateTime -Value $auditlogitem.CreationDate
$datum | Add-Member -MemberType NoteProperty -Name CreationTimeUTC -Value $d.CreationTime
$datum | Add-Member -MemberType NoteProperty -Name RecordType -Value $d.RecordType
$datum | Add-Member -MemberType NoteProperty -Name Operation -Value $d.Operation
$datum | Add-Member -MemberType NoteProperty -Name OrganizationId -Value $d.OrganizationId
$datum | Add-Member -MemberType NoteProperty -Name UserType -Value $d.UserType
$datum | Add-Member -MemberType NoteProperty -Name UserKey -Value $d.UserKey
$datum | Add-Member -MemberType NoteProperty -Name Workload -Value $d.Workload
$datum | Add-Member -MemberType NoteProperty -Name UserId -Value $d.UserId
$datum | Add-Member -MemberType NoteProperty -Name ClientIP -Value $d.ClientIP
$datum | Add-Member -MemberType NoteProperty -Name UserAgent -Value $d.UserAgent
$datum | Add-Member -MemberType NoteProperty -Name Activity -Value $d.Activity
$datum | Add-Member -MemberType NoteProperty -Name ItemName -Value $d.ItemName
$datum | Add-Member -MemberType NoteProperty -Name WorkSpaceName -Value $d.WorkSpaceName
$datum | Add-Member -MemberType NoteProperty -Name DashboardName -Value $d.DashboardName
$datum | Add-Member -MemberType NoteProperty -Name DatasetName -Value $d.DatasetName
$datum | Add-Member -MemberType NoteProperty -Name ReportName -Value $d.ReportName
$datum | Add-Member -MemberType NoteProperty -Name WorkspaceId -Value $d.WorkspaceId
$datum | Add-Member -MemberType NoteProperty -Name ObjectId -Value $d.ObjectId
$datum | Add-Member -MemberType NoteProperty -Name DashboardId -Value $d.DashboardId
$datum | Add-Member -MemberType NoteProperty -Name DatasetId -Value $d.DatasetId
$datum | Add-Member -MemberType NoteProperty -Name ReportId -Value $d.ReportId
$datum | Add-Member -MemberType NoteProperty -Name OrgAppPermission -Value $d.OrgAppPermission
#option to include the below JSON column however for large amounts of data it may be difficult for PBI to parse
#$datum | Add-Member -MemberType NoteProperty -Name Datasets -Value (ConvertTo-Json $d.Datasets)
#below is a PowerShell statement to grab one of the entries and place in the DatasetName if any exist
foreach ($dataset in $d.datasets) {
$datum.DatasetName = $dataset.DatasetName
$datum.DatasetId = $dataset.DatasetId
}
$data+=$datum
}
$datestring = $startDate.ToString("yyyyMMdd")
$fileName = ("C:\Users\Client\Audit Logging\Logs\" + $datestring + ".csv")
Write-Host ("Writing to file {0}" -f $fileName)
$data | Export-csv -Path $fileName
Remove-PSSession -Id $Session.Id
I did start writing some code to connect to Data Lake Storage as follows, but I'm not sure how to integrate it with the above Export-CSV function. How do I get the CSV file to be published to Data Lake Storage (as it won't be stored locally), or output so that ADF can direct it to a sink store?
# Variable Declaration
$rgName = "Audit"
$subscriptionID = "dabdhnca9-0742-48b2-98d5-af476d62c6bd"
$dataLakeStoreName = "pbiauditingstorage12"
$myDataRootFolder = "/auditlogs"
#$sourceFilesPath = "C:\Users\Downloads\datasets\"
# Log in to your Azure account
Login-AzureRmAccount
# List all the subscriptions associated to your account
Get-AzureRmSubscription
# Select a subscription
Set-AzureRmContext -SubscriptionId $subscriptionID
# See if folder exists.
# If a folder or item does not exist, then you will see
# Get-AzureRmDataLakeStoreChildItem : Operation returned an invalid status code 'NotFound'
Get-AzureRmDataLakeStoreChildItem -AccountName $dataLakeStoreName -Path $myDataRootFolder
# Create new folder
New-AzureRmDataLakeStoreItem -Folder -AccountName $dataLakeStoreName -Path $myDataRootFolder/population
# Upload folder and its contents recursively and force overwrite existing
Import-AzureRmDataLakeStoreItem -AccountName $dataLakeStoreName `
-Path $sourceFilesPath\ `
-Destination $myDataRootFolder `
-Recurse `
-Force
Please advise, many thanks!
Managed to make it work after passing the exported file's path ($filePath) as the -File source parameter to the Set-AzStorageBlobContent function:
$User = "sdcadmin@M36dcdcdc.onmicrosoft.com"
$PWord = ConvertTo-SecureString -String "eVadcdcdcR" -AsPlainText -Force
$UserCredential = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList $User, $PWord
$dateTimestring = $startDate.ToString("yyyyMMdd") + "_" + (Get-Date -Format "yyyyMMdd") + "_" + (Get-Date -Format "HHmm")
$fileName = ($dateTimestring + ".csv")
Write-Host ("Writing to file {0}" -f $fileName)
$filePath = "$Env:temp/" + $fileName
$data | Export-csv -Path $filePath
# File transfer to Azure storage account
Connect-AzAccount -Credential $UserCredential
Get-AzVM -ResourceGroupName "Audit" -status
$Context = New-AzStorageContext -StorageAccountName "storageaccountname" -StorageAccountKey "sdfvsdvdsvsfvIdb6JgnnazfLIPDU8kOozDDn15262591efq5sdfvsdfv3M5ew=="
Set-AzStorageBlobContent -Force -Context $Context -Container "auditlogs" -File $filePath -Blob $fileName

SharePoint Online Subsites listing 0 in WebsCount parameter

I am using the following to connect to my SPO tenant and pull a report; however, I am finding that the WebsCount property is not populated with any subsites. Has anyone encountered similar behavior, and how do you fix this bulk query?
Import-Module Microsoft.Online.SharePoint.Powershell -DisableNameChecking
#Config Parameters
$AdminSiteURL="https://SPTenant-admin.sharepoint.com"
$ReportOutput="C:\Temp\SPOStorage.csv"
#Get Credentials to connect to SharePoint Admin Center
$Cred = Get-Credential
#Connect to SharePoint Online Admin Center
Connect-SPOService -Url $AdminSiteURL -Credential $Cred
#Get all Site collections details and Export to CSV
Get-SPOSite -Limit ALL -Detailed | Export-Csv -Path $ReportOutput -NoTypeInformation
The following PowerShell is for your reference.
Import-Module Microsoft.Online.SharePoint.Powershell -DisableNameChecking
#Config Parameters
$AdminSiteURL="https://tenant-admin.sharepoint.com"
$ReportOutput="C:\Temp\SPOStorage.csv"
#Get Credentials to connect to SharePoint Admin Center
$Cred = Get-Credential
#Connect to SharePoint Online Admin Center
Connect-SPOService -Url $AdminSiteURL -Credential $Cred
#Get All site collections
$SiteCollections = Get-SPOSite -Limit All
Write-Host "Total Number of Site collections Found:"$SiteCollections.count -f Yellow
 
#Array to store Result
$ResultSet = @()
 
#Loop through each site collection and retrieve details
Foreach ($Site in $SiteCollections)
{
    Write-Host "Processing Site Collection :"$Site.URL -f Yellow
  $Site=Get-SPOSite -identity $Site.Url -Limit ALL -Detailed
    #Get site collection details   
    $Result = new-object PSObject
    $Result | add-member -membertype NoteProperty -name "Title" -Value $Site.Title
    $Result | add-member -membertype NoteProperty -name "Url" -Value $Site.Url
$Result | add-member -membertype NoteProperty -name "WebsCount" -Value $Site.WebsCount
    $Result | add-member -membertype NoteProperty -name "LastContentModifiedDate" -Value $Site.LastContentModifiedDate
    $Result | add-member -membertype NoteProperty -name "Status" -Value $Site.Status
    $Result | add-member -membertype NoteProperty -name "LocaleId" -Value $Site.LocaleId
    $Result | add-member -membertype NoteProperty -name "LockState" -Value $Site.LockState
    $Result | add-member -membertype NoteProperty -name "StorageQuota" -Value $Site.StorageQuota
    $Result | add-member -membertype NoteProperty -name "StorageQuotaWarningLevel" -Value $Site.StorageQuotaWarningLevel
    $Result | add-member -membertype NoteProperty -name "Used" -Value $Site.StorageUsageCurrent
    $Result | add-member -membertype NoteProperty -name "CompatibilityLevel" -Value $Site.CompatibilityLevel
    $Result | add-member -membertype NoteProperty -name "Template" -Value $Site.Template
    $Result | add-member -membertype NoteProperty -name "SharingCapability" -Value $Site.SharingCapability     
    $ResultSet += $Result
} 
  
#Export Result to csv file
$ResultSet |  Export-Csv $ReportOutput -notypeinformation
  
Write-Host "Site Quota Report Generated Successfully!" -f Green
Check the similar thread here: SiteProperties.WebsCount property is returning zero (CSOM)?

Trying to Use Powershell Credential Object to Run As Admin?

Let me start off by saying that I have scoured the online resources and cannot figure out how to properly pass admin credentials (user AND password) through my PowerShell script. I have a script, let's call it script.ps1, which looks like this (I have PowerShell v1.0):
$UserName = "domain\user"
$Password = ConvertTo-SecureString -String "PASSWORD" -AsPlainText -Force
$Credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $UserName, $Password
$events = Get-WinEvent -Credential $Credentials -ComputerName thecomputername -LogName Microsoft-Windows-PrintService/Operational
$resultArry = @()
foreach($ev in $events)
{
Try
{
$evXML = [xml]$ev.ToXml()
if( $ev.Id -eq 307 )
{
$obj = New-Object PSObject
Add-Member -InputObject $obj -MemberType NoteProperty -Name "User" -Value $evXML.Event.UserData.DocumentPrinted.Param3
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Document" -Value $evXML.Event.UserData.DocumentPrinted.Param2
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Computer" -Value $evXML.Event.System.Computer
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Event Timestamp" -Value $evXML.Event.System.TimeCreated.GetAttribute("SystemTime")
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Printer" -Value $evXML.Event.UserData.DocumentPrinted.Param5
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Port" -Value $evXML.Event.UserData.DocumentPrinted.Param6
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Pages" -Value $evXML.Event.UserData.DocumentPrinted.Param8
Add-Member -InputObject $obj -MemberType NoteProperty -Name "Size in Bytes" -Value $evXML.Event.UserData.DocumentPrinted.Param7
$script:resultArry += $obj
}
}
Catch
{
}
}
$resultArry | Export-CSV C:\mycsv.csv -NoTypeInformation
pause
When I double-click on this script to run it, I am still getting a permission-denied error. I want the admin account to be used automatically so the script will run as that account. If I get rid of the code at the top, right-click my file, run it as an admin and enter my credentials manually, it works successfully. I am obviously not using/passing the credentials correctly, so I was hoping someone could see what I am doing wrong?
I don't want to set the shortcut icon to run as an admin automatically when clicked, because an automated process runs this script and that workaround will not work in this case. I need the actual code to pass the credentials.
Thanks for any input, it is greatly appreciated