So I'm trying to make a backup script that will download a CSV from my MSSQL server, then zip the file, then upload the backup to Amazon S3.
The issue I'm having is that the table averages 20 million rows when I run the script daily, and the script looks like it just hangs until it completes about 20 minutes later. I was wondering if there is a way to show a progress bar for the Invoke-Sqlcmd call specifically. I've done some research, but all the examples I could find show a progress bar for a loop, not for a single command's progress.
Here is my code:
ECHO "Starting Download"
Import-Module sqlps
#$SQLquery="SELECT * FROM dbo.$PREFIX$i"
$SQLquery="SELECT * FROM dbo.events"
ECHO "Executing query = $SQLquery"
$hostname = "."
$pass = "test"
$usern = "test"
$database = "theDB"
$result=invoke-sqlcmd -ServerInstance $hostname -query $SQLquery -HostName $hostname -Password $pass -Username $usern -Database $database -verbose
#echo $result
pause
$result |export-csv -path $CSVPATH -notypeinformation
pause
ECHO "Starting Zip:"
Compress-Archive -LiteralPath $CSVPATH -CompressionLevel Optimal -DestinationPath $ZIPPATH
ECHO "Starting Delete: $CSVPATH "
del "$CSVPATH"
echo "Removed $CSVNAME"
aws s3 cp $ZIPPATH s3://test_$ZIPNAME
pause
This script works, but as I said, I would like to add a progress bar to the Invoke-Sqlcmd call so that it doesn't look frozen while it downloads the huge file.
This is what I could find so far, but it only works for a loop's progression:
$VerbosePreference = "Continue"
Write-Verbose "Test Message"
for ($a=1; $a -lt 100; $a++) {
Write-Progress -Activity "Working..." -PercentComplete $a -CurrentOperation "$a% complete" -Status "Please wait."
Start-Sleep -Milliseconds 100
}
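One idea I have not tried yet is to run the query as a background job and just write an indeterminate "still working" message while polling it; a rough, untested sketch reusing the placeholder variables from my script:
# Untested sketch: run Invoke-Sqlcmd in a background job and poll it,
# so the console at least shows that something is happening.
$job = Start-Job -ScriptBlock {
    param($query, $server, $user, $pass, $db)
    Import-Module sqlps
    Invoke-Sqlcmd -ServerInstance $server -Query $query -Username $user -Password $pass -Database $db
} -ArgumentList $SQLquery, $hostname, $usern, $pass, $database
$elapsed = 0
while ($job.State -eq 'Running') {
    Write-Progress -Activity "Downloading table" -Status "Still running... ($elapsed seconds elapsed)"
    Start-Sleep -Seconds 5
    $elapsed += 5
}
$result = Receive-Job -Job $job
Remove-Job -Job $job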
Considering your huge ~20 million record data set, it's probably a good idea to use some of the .NET classes in the System.Data.Common namespace. I'm not sure how Export-Csv is implemented, but System.IO.StreamWriter is very efficient for writing large files.
A simple tested/working example with inline comments:
# replace $tableName with yours
$sqlCount = "SELECT COUNT(*) FROM dbo.$($tableName)";
$sqlSelect = "SELECT * FROM dbo.$($tableName)";
$provider = [System.Data.Common.DbProviderFactories]::GetFactory('System.Data.SqlClient');
$connection = $provider.CreateConnection();
# replace $connectionString with yours, e.g.:
# "Data Source=$($INSTANCE-NAME);Initial Catalog=$($DATABASE-NAME);Integrated Security=True;";
$connection.ConnectionString = $connectionString;
$command = $connection.CreateCommand();
# get total record count for Write-Progress
$command.CommandText = $sqlCount;
$connection.Open();
$reader = $command.ExecuteReader();
$totalRecords = 0;
while ($reader.Read()) {
$totalRecords = $reader[0];
}
$reader.Dispose();
# select CSV data
$command.CommandText = $sqlSelect;
$reader = $command.ExecuteReader();
# get CSV field names
$columnNames = @();
for ($i = 0; $i -lt $reader.FieldCount; $i++) {
$columnNames += $reader.GetName($i);
}
# read and populate data one row at a time
$values = New-Object object[] $columnNames.Length;
$currentCount = 0;
# replace $CSVPATH with yours
$writer = New-Object System.IO.StreamWriter($CSVPATH);
$writer.WriteLine(($columnNames -join ','));
while ($reader.Read()) {
$null = $reader.GetValues($values);
$writer.WriteLine(($values -join ','));
if (++$currentCount % 1000 -eq 0) {
Write-Progress -Activity 'Reading data' `
-Status "Finished reading $currentCount out of $totalRecords records." `
-PercentComplete ($currentCount / $totalRecords * 100);
}
}
$command.Dispose();
$reader.Dispose();
$connection.Dispose();
$writer.Dispose();
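One caveat with the simple -join above: it does not quote fields, so any value that itself contains a comma, a double quote, or a newline will break the CSV. If your data can contain those, one possible variation of the write inside the while loop is to quote every field RFC 4180-style (untested sketch, same $values and $writer as above):
# quote every field and double any embedded quotes before joining
$escaped = foreach ($value in $values) {
    '"' + ([string]$value).Replace('"', '""') + '"'
}
$writer.WriteLine(($escaped -join ','));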
Related
I have a PowerShell script which loops through a list of servers; however, the error handling is not behaving the way I would expect it to. Invoke-Sqlcmd has been set to continue on error. This is deliberate because, when looping through a list of servers, I don't want it to stop every time it encounters an error. On the other hand, I would like to know if there has been an error.
What I opted to do was set $ErrorActionPreference to Continue within the script, but set the Invoke-Sqlcmd command to Stop. This works well outside of PowerShell workflows, but within a workflow it has undesired effects: if I run the script from a console, I cannot see any errors when the error action is set to Continue, but if it's set to Stop then I can see the errors.
In the example below, I took out the try/catch as it was masking errors.
Workflow RunDeployment
{
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"localhost\DEV2",3,1),(4,"localhost\DEV2",6,2),(3,"localhost\DEV2",4,3)
$k = 'serverid','servername','locationid','appid'
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "C:\Temp\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach -parallel ($server_id in $all_server_ids)
{
$ErrorActionPreference = 'Continue'
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
foreach ($file in $file_list)
{
$release_file = "$folder$file"
write-output "The file is $release_file "
# try {
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Stop -Errorvariable errorvalue
# write-output "-ServerInstance $servername -inputfile $folder$file -Database $database_name -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Stop -Errorvariable errorvalue"
# if ($errorvalue){
# write-output "Error encountered see $errorvalue" }
# }
# Catch{
# $error_message = $_.Exception.Message
# write-output $error_message
# write-output $error
# }
}
}
}
RunDeployment
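One variation I am considering (untested) is wrapping the call in an InlineScript block, since InlineScript runs as ordinary PowerShell and error handling there should behave the way it does outside a workflow; $Using: is needed to pass the workflow variables in:
foreach ($file in $file_list)
{
    $release_file = "$folder$file"
    InlineScript {
        try {
            # ordinary PowerShell inside InlineScript, so -ErrorAction Stop and try/catch apply as usual
            invoke-sqlcmd -ServerInstance $Using:servername -inputfile $Using:release_file -Database $Using:database_name -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Stop
        }
        catch {
            # keep going, but report which server and file failed
            write-output "Error on $($Using:servername) / $($Using:release_file): $($_.Exception.Message)"
        }
    }
}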
I am in the process of rewriting the script below to run in parallel. As can be seen in the code, an array of servers is passed to the script and loaded into a hash table, and then each server is processed in turn to do the deployment; for each server there are files to execute in a particular order (see the array of files). Looking at the structure, I feel a workflow is the way to go here, but I could be wrong.
In my opinion, the performance gain would come from executing against multiple servers at the same time rather than waiting for each server to complete before moving on to the next one (foreach -parallel).
I ran a test calling a function declared outside a workflow, and it worked. Is it good practice to call a function declared outside a workflow? I ask because I would like to reuse some functions outside the workflow. Or is it generally better to put all the code inside the workflow, even code that is not intended for parallel workloads, i.e. one-off calls?
Below is the code I am testing with.
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = "Ping SQL instance for $sql_server"
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"Server1",3,1),(4,"Server2",6,2),(3,"Server3",4,3)
$k = 'serverid','servername','locationid','appid' # key names correspond to data positions in each array in $x
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "F:\Files\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
$message = 'ServerID {0} has a servername of {1} and a location id of {2}' -f $server_id, $h["servername"][$all_server_ids.indexof($server_id)],$h["locationid"][$all_server_ids.indexof($server_id)]
Write-Output $message
Write-Output "This $severid and this $servername and this $locationid"
foreach ($file in $file_list)
{
$is_instance_ok = Check-Instance-Connection $servername $database_name
if ($is_instance_ok.status -eq $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue -Errorvariable generated_error | Out-Null
}
}
}
Thanks, I did a lot more research and looked at a lot of examples on how workflows work. This is what I have come up with.
Workflow RunExecution
{
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = "Ping SQL instance for $sql_server"
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"server1\DEV3",3,1),(4,"serer1\DEV2",6,2),(3,"serer2\DEV1",4,3)
$k = 'serverid','servername','locationid','appid'
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "C:\Temp\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach -parallel ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
foreach ($file in $file_list)
{
# $check_fine = $is_instance_ok.check_outcome
# if ($check_fine = $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue
write-output "invoke-sqlcmd -ServerInstance $servername -inputfile $folder$file -Database $database_name -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue "
# }
}
}
}
RunExecution
I have a PowerShell script on Windows 2012 R2 that is used to export data from a database into a CSV file. I have a check in there to escape double quotes and text qualify necessary fields. I am looking to increase the performance of the script because it runs a little slower than I would like (exporting 20GB/20 million rows) and it only utilizes about 10% of the CPU. Does anyone have any suggestions for improvement?
$ConnectionString = "Data Source=server1; Database=Development; Trusted_Connection=True;";
$streamWriter = New-Object System.IO.StreamWriter ".\output.csv"
$sqlConn = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
$sqlCmd = New-Object System.Data.SqlClient.SqlCommand
$sqlCmd.Connection = $sqlConn
$sqlCmd.CommandText = "SELECT * FROM Development.dbo.All_Opportunities WITH(NOLOCK)"
$sqlConn.Open();
$reader = $sqlCmd.ExecuteReader();
# Initialize the array to hold the values
$array = @()
for ( $i = 0 ; $i -lt $reader.FieldCount; $i++ )
{ $array += @($i) }
# Write Header
$streamWriter.Write($reader.GetName(0))
for ( $i = 1; $i -lt $reader.FieldCount; $i ++)
{ $streamWriter.Write($("," + $reader.GetName($i))) }
$streamWriter.WriteLine("") # Close the header line
while ($reader.Read())
{
# get the values;
$fieldCount = $reader.GetValues($array);
# add quotes if the values have a comma or double quote
for ($i = 0; $i -lt $array.Length; $i++)
{
if ($array[$i] -match "`"|,")
{
$array[$i] = '"' + $array[$i].Replace("`"", "`"`"").ToString() + '"';
}
}
$newRow = [string]::Join(",", $array);
$streamWriter.WriteLine($newRow)
}
$reader.Close();
$sqlConn.Close();
$streamWriter.Close();
You can split it into jobs and start them in the background.
Try:
https://learn.microsoft.com/en-us/powershell/module/Microsoft.PowerShell.Core/Start-Job?view=powershell-5.1
Hope it helps.
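A rough, untested sketch of that idea, assuming the table has a numeric key ("Id" here is just a placeholder) that lets you split it into ranges, with each job writing its own CSV slice:
# placeholder ranges and file paths; adjust to your key distribution
$ranges = @(
    @{ From = 1;        To = 10000000; File = 'C:\Temp\output_1.csv' },
    @{ From = 10000001; To = 20000000; File = 'C:\Temp\output_2.csv' }
)
$jobs = foreach ($r in $ranges) {
    Start-Job -ScriptBlock {
        param($from, $to, $file)
        $conn = New-Object System.Data.SqlClient.SqlConnection "Data Source=server1; Database=Development; Trusted_Connection=True;"
        $cmd  = $conn.CreateCommand()
        $cmd.CommandText = "SELECT * FROM Development.dbo.All_Opportunities WITH(NOLOCK) WHERE Id BETWEEN $from AND $to"
        $conn.Open()
        $reader = $cmd.ExecuteReader()
        $writer = New-Object System.IO.StreamWriter $file
        $values = New-Object object[] $reader.FieldCount
        while ($reader.Read()) {
            [void]$reader.GetValues($values)
            $writer.WriteLine(($values -join ','))
        }
        $writer.Close(); $reader.Close(); $conn.Close()
    } -ArgumentList $r.From, $r.To, $r.File
}
$jobs | Wait-Job | Receive-Job
# the slices still need to be concatenated (and headers/quoting handled) afterwards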
So, I had a similar issue about a year ago, albeit with a slightly smaller table (~1 GB). Initially I just used:
Import-Module -Name SqlServer -Cmdlet Read-SqlTableData;
Read-SqlTableData -ServerInstance $SqlServer -DatabaseName $Database -SchemaName $Schema -TableName $Table |
Export-Csv -Path $OutputFilePath -NoTypeInformation
It worked, but it used a ton of memory (5+ GB out of 16 GB) and took about 7-9 minutes to run. All of these tests were with a spinning metal disk in a laptop, so bear that in mind with what follows as well.
I wondered if I could get it to go faster. I initially wrote it like this, which took about half the time, and about 100 MB of RAM:
$SqlServer = '...';
$SqlDatabase = '...';
$OutputFilePath = '...';
$SqlQuery = '...';
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$Utf8NoBOM = New-Object -TypeName System.Text.UTF8Encoding -ArgumentList $false;
$StreamWriter = New-Object -TypeName System.IO.StreamWriter -ArgumentList $OutputFilePath, $Utf8NoBOM;
$CsvDelimiter = '"';
$CsvDelimiterEscape = '""';
$CsvSeparator = ',';
$SQLConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand = $SQLConnection.CreateCommand();
$SqlCommand.CommandText = $SqlQuery;
$SQLConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
for ($Field = 0; $Field -lt $SqlDataReader.FieldCount; $Field++) {
if ($Field -gt 0) { $StreamWriter.Write($CsvSeparator); }
$StreamWriter.Write($CsvDelimiter);
$StreamWriter.Write($SqlDataReader.GetName($Field).Replace($CsvDelimiter, $CsvDelimiterEscape));
$StreamWriter.Write($CsvDelimiter);
}
$StreamWriter.WriteLine();
while ($SqlDataReader.Read()) {
for ($Field = 0; $Field -lt $SqlDataReader.FieldCount; $Field++) {
if ($Field -gt 0) { $StreamWriter.Write($CsvSeparator); }
$StreamWriter.Write($CsvDelimiter);
$StreamWriter.Write($SqlDataReader.GetValue($Field).ToString().Replace($CsvDelimiter, $CsvDelimiterEscape));
$StreamWriter.Write($CsvDelimiter);
}
$StreamWriter.WriteLine();
}
$SqlDataReader.Close();
$SqlDataReader.Dispose();
$SQLConnection.Close();
$SQLConnection.Dispose();
$StreamWriter.Close();
$StreamWriter.Dispose();
As you can see, it's basically the same pattern as yours.
I wondered if I could improve it even more, so I tried adding a StringBuilder since I'd had success doing that with other projects. I still have the code, but I found that it didn't work any faster, and took about 200 MB of RAM:
$SqlServer = '...'
$SqlDatabase = '...'
$OutputFilePath = '...'
$SqlQuery = '...';
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$StringBuilderBufferSize = 50MB;
$StringBuilder = New-Object -TypeName System.Text.StringBuilder -ArgumentList ($StringBuilderBufferSize + 1MB);
$Utf8NoBOM = New-Object -TypeName System.Text.UTF8Encoding -ArgumentList $false;
$StreamWriter = New-Object -TypeName System.IO.StreamWriter -ArgumentList $OutputFilePath, $Utf8NoBOM;
$CsvDelimiter = '"';
$CsvDelimiterEscape = '""';
$CsvSeparator = ',';
$SQLConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand = $SQLConnection.CreateCommand();
$SqlCommand.CommandText = $SqlQuery;
$SQLConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
for ($Field = 0; $Field -lt $SqlDataReader.FieldCount; $Field++) {
if ($Field -gt 0) { [void]$StringBuilder.Append($CsvSeparator); }
[void]$StringBuilder.Append($CsvDelimiter);
[void]$StringBuilder.Append($SqlDataReader.GetName($Field).Replace($CsvDelimiter, $CsvDelimiterEscape));
[void]$StringBuilder.Append($CsvDelimiter);
}
[void]$StringBuilder.AppendLine();
while ($SqlDataReader.Read()) {
for ($Field = 0; $Field -lt $SqlDataReader.FieldCount; $Field++) {
if ($Field -gt 0) { [void]$StringBuilder.Append($CsvSeparator); }
[void]$StringBuilder.Append($CsvDelimiter);
[void]$StringBuilder.Append($SqlDataReader.GetValue($Field).ToString().Replace($CsvDelimiter, $CsvDelimiterEscape));
[void]$StringBuilder.Append($CsvDelimiter);
}
[void]$StringBuilder.AppendLine();
if ($StringBuilder.Length -ge $StringBuilderBufferSize) {
$StreamWriter.Write($StringBuilder.ToString());
[void]$StringBuilder.Clear();
}
}
$SqlDataReader.Close();
$SqlDataReader.Dispose();
$SQLConnection.Close();
$SQLConnection.Dispose();
$StreamWriter.Write($StringBuilder.ToString());
$StreamWriter.Close();
$StreamWriter.Dispose();
No matter what I tried, I couldn't seem to get it under ~4:30 for about 1 GB of data.
I never considered parallelism because you'd have to break your query up into 4 equal pieces such that you could be certain that you'd get the complete data set, or otherwise do some pretty difficult process management with Runspace Pools. Even then you'd have to write to four different files, and eventually combine the files back together. Maybe it would work out, but it wasn't an interesting problem to me anymore at that point.
Eventually I just created a package with the Import Export Wizard, saved it as a package, and ran it with DTExec.exe. This takes about 45-60 seconds or so for 1 GB of data. The only drawbacks are that you need to specify the table when you build the package, it doesn't dynamically determine the columns, and it's an unreasonable pain to get the output file to be UTF8.
I did find that bcp.exe and sqlcmd.exe were faster. BCP was extremely fast, and took 20-30 seconds. However, the output formats are extremely limited, and BCP in particular is needlessly difficult to use.
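For anyone curious, typical invocations look roughly like this (server, database, table, package and file names are placeholders; DTExec requires the SSIS runtime to be installed):
# bcp: character mode (-c), comma field terminator, trusted connection (-T)
bcp "SELECT * FROM Development.dbo.All_Opportunities WITH(NOLOCK)" queryout "C:\Temp\output.csv" -c '-t,' -S "server1" -T
# DTExec: run a saved SSIS package
DTExec.exe /File "C:\Temp\Export.dtsx"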
My client had an issue: they accidentally copied 13 million objects (files) to an S3 bucket with the wrong permissions. They have asked my team to fix it. We have to update each of the 13 million files in the S3 bucket with the correct ACLs. We are using the PowerShell script below to fix it. However, when the script runs on a folder with more than 20-30k objects, it fails to set the ACLs. (It iterates through the loop, but it won't set the permissions past 20-30k objects, and there is no exception either.)
I suspect that the requests might be getting throttled. Has anyone come across such an issue? Please help me figure out how to proceed.
I am looking for answers to the questions below:
1. If the API calls are getting throttled at 20-30k objects, how can I modify my script to overcome it?
2. What is the best practice, in terms of scripting, for modifying AWS resources (like setting ACL permissions on S3 objects) for millions of objects?
(I am not looking for the "BucketPolicy" approach, as we have to do it with a script and apply the ACLs to every S3 object.)
Param (
[Parameter(Position=0,Mandatory=$true)]
[string]$profile,
[Parameter(Position=1,Mandatory=$true)]
[string]$switchToAccount,
[Parameter(Position=2,Mandatory=$true)]
[string]$roleName,
[Parameter(Position=3,Mandatory=$true)]
[string]$keyPrefix
)
#Set base AWS credentials
Set-AWSCredentials -ProfileName $profile
Set-DefaultAWSRegion -Region $region
#Get and set MFA device ARN
$userName = (Get-IAMUser).UserName
$mfaArn = "arn:aws:iam::xxxxxxxxx:mfa/" + "$userName"
#Configure CAA roles
$roleArn = "arn:aws:iam::" + "$switchToAccount" + ":role/" + "$roleName"
$roleSessionName = "xxxxxxxxxxxx"
#Prompt for MFA token and perform CAA request
$tokenCode = Read-Host -Prompt "Enter MFA token for $accountNumber"
$switchRole = Use-STSRole -RoleSessionName $roleSessionName -RoleArn $roleArn -TokenCode $tokenCode -SerialNumber $mfaArn
#Set new role for CAA
Set-AWSCredentials -Credential $switchRole.Credentials
#Declare access level for S3 Object ACL grantees
$FULL_CONTROL = [Amazon.S3.S3Permission]::FULL_CONTROL
$grants = @();
#Grant FULL_CONTROL access to xxxxxxxxxxxxxxxxxxxxx
$grantee1 = New-Object -TypeName Amazon.S3.Model.S3Grantee
$grantee1.EmailAddress = "xxxxxxxxxxxxxxxxxxx"
#Grant FULL_CONTROL access to xxxxxxxxxxxxxxxxx
$grantee2 = New-Object -TypeName Amazon.S3.Model.S3Grantee
$grantee2.EmailAddress = "xxxxxxxxxxxxxxxxxxx"
#Grant FULL_CONTROL access to xxxxxxxxxxxxxxxxxxxx
$grantee3 = New-Object -TypeName Amazon.S3.Model.S3Grantee
$grantee3.EmailAddress = "xxxxxxxxxxxxxxxxxxxxx"
#Create grant and add to grant list
$grant1 = New-Object -TypeName Amazon.S3.Model.S3Grant
$grant1.Grantee = $grantee1
$grant1.Permission = $FULL_CONTROL
$grants += $grant1
#Create grant and add to grant list
$grant2 = New-Object -TypeName Amazon.S3.Model.S3Grant
$grant2.Grantee = $grantee2
$grant2.Permission = $FULL_CONTROL
$grants += $grant2
#Create grant and add to grant list
$grant3 = New-Object -TypeName Amazon.S3.Model.S3Grant
$grant3.Grantee = $grantee3
$grant3.Permission = $FULL_CONTROL
$grants += $grant3
#Set bucket name for S3 objects
$bucketName = "xxxxxxxxxxxxxxxxxxxxxxxxx"
#Get all S3 Objects in specified bucket
$s3Objects = Get-S3Object -BucketName $bucketName -KeyPrefix $keyPrefix
#Count for progress bar
$totalObjects = $s3Objects.length
$i = 1
$fail_count = 0
$current_count = 0
$file_path = "C:\Users\Administrator\Desktop\Failed_Objects_new\" + $keyPrefix.Replace("/","_") + ".txt"
$file_path_retry = "C:\Users\Administrator\Desktop\Failed_Objects_new_retry\" + $keyPrefix.Replace("/","_") + ".txt"
new-item $file_path -ItemType file
new-item $file_path_retry -ItemType file
"Total Object Count:" + $totalObjects + "`n" | Out-File $file_path -Append
foreach($s3Object in $s3Objects){
$owner = $s3Object.owner.id
$s3Object.name | Write-Output
$current_count++
#Extracts Key for each S3 object in bucket
$key = $s3Object.Key
#Logging
Write-Host "Setting $bucketName | $key | $grants"
# Pick objects that were modified on or before July 15th
try {
if (($s3Object.LastModified.month -lt 7)) {
Set-S3ACL -BucketName $bucketName -Key $key -Grant $grants -OwnerId $owner
$owner | Write-Host
}
elseif(($s3Object.LastModified.month -eq 7) -and ($s3Object.LastModified.day -le 15)) {
Set-S3ACL -BucketName $bucketName -Key $key -Grant $grants -OwnerId $owner
$owner | Write-Host
}
}catch{
"Failed $bucketName | $key | $grants" | out-file $file_path -Append
$key | Out-File $file_path_retry -Append
$fail_count++
}
Write-Host "progress: " $current_count "/" $totalObjects
#Update progress bar
$percentComplete = ($i / $totalObjects) * 100
Write-Progress -Activity "Setting S3 Object ACL's" -Status "$i of $totalObjects complete" -PercentComplete $percentComplete
$i++
}
"`n`n Total Fail Count:" + $fail_count | Out-File $file_path -Append
Steps to debug the problem:
Make sure it really is a throttling issue: in the foreach loop, break after 10k objects and see if everything works fine.
Also, put print statements inside the try block, in both branches, to make sure they are being reached and to see when it starts failing.
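If it does turn out to be throttling, one simple mitigation is to retry the Set-S3ACL call with a short exponential backoff before counting the object as failed. An untested sketch, using the same variables as the script above:
# retry Set-S3ACL a few times with increasing delay before giving up on the object
$maxAttempts = 5
for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
    try {
        Set-S3ACL -BucketName $bucketName -Key $key -Grant $grants -OwnerId $owner
        break   # success, stop retrying
    }
    catch {
        if ($attempt -eq $maxAttempts) {
            "Failed after $maxAttempts attempts: $bucketName | $key" | Out-File $file_path -Append
            $fail_count++
        }
        else {
            Start-Sleep -Seconds ([math]::Pow(2, $attempt))   # 2, 4, 8, 16 seconds
        }
    }
}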
I'm new to PowerShell. I read a bit on www.powershell.com. Now I need your help to solve a problem. I want to read the UUID from clients in the network, so I created a file "pcs.txt" where all the PCs are listed.
$pc = Get-Content pcs.txt #Read content of file
$cred = Get-Credential "domain\user"
for ($i=0; $i -lt $pc.length; $i++) {
$Result=test-connection -ComputerName $pc[$i] -Count 1 -Quiet
If ($Result)
{
$uuid = (Get-WmiObject Win32_ComputerSystemProduct -ComputerName $pc[$i] -Credential $cred).UUID
$Ausgabe=$pc[$i] + ';'+$uuid
$Ausgabe
}
else
{
$Ausgabe=$pc[$i] + '; UUID nicht erhalten'
$Ausgabe
}
}
First I test whether the ping works. When the ping works, I try to get the UUID.
Sometimes I don't get the UUID even though the ping worked. So I would like to code a timeout that says: move on to the next PC if you don't have the UUID after 2 seconds.
Can you help me please?
Alas, there is no timeout parameter for the Get-WmiObject cmdlet. There is a feature request on MS Connect, but it is from 2011 and still open.
A workaround, which I haven't tested, is available using System.Management. I'll copy and paste it here in case the link goes dead. (And I hate SO answers that only contain links to resources that may or may not exist...)
Function Get-WmiCustom([string]$computername,[string]$namespace,[string]$class,[int]$timeout=15){
$ConnectionOptions = new-object System.Management.ConnectionOptions
$EnumerationOptions = new-object System.Management.EnumerationOptions
$timeoutseconds = new-timespan -seconds $timeout
$EnumerationOptions.set_timeout($timeoutseconds)
$assembledpath = "\\" + $computername + "\" + $namespace
#write-host $assembledpath -foregroundcolor yellow
$Scope = new-object System.Management.ManagementScope $assembledpath, $ConnectionOptions
$Scope.Connect()
$querystring = "SELECT * FROM " + $class
#write-host $querystring
$query = new-object System.Management.ObjectQuery $querystring
$searcher = new-object System.Management.ManagementObjectSearcher
$searcher.set_options($EnumerationOptions)
$searcher.Query = $querystring
$searcher.Scope = $Scope
trap { $_ } $result = $searcher.get()
return $result
}
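Usage would presumably look something like this (untested; note the function as written does not accept alternate credentials):
$uuid = (Get-WmiCustom -computername $pc[$i] -namespace "root\cimv2" -class "Win32_ComputerSystemProduct" -timeout 2).UUID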
I found a good workaround!
http://theolddogscriptingblog.wordpress.com/2012/05/11/wmi-hangs-and-how-to-avoid-them/
Here is my working code:
$pc = Get-Content pcs.txt #FILE FROM THE HARDDISK
$cred = Get-Credential "DOMAIN\USER"
for ($i=0; $i -lt $pc.length; $i++)
{
$Result=test-connection -ComputerName $pc[$i] -Count 1 -Quiet
If ($Result)
{
$WMIJob = Get-WmiObject Win32_ComputerSystemProduct -ComputerName $pc[$i] -Credential $cred -AsJob
$Timeout=Wait-Job -ID $WMIJob.ID -Timeout 1 # the job times out after 1 second
$uuid = Receive-Job $WMIJob.ID
if ($uuid -ne $null)
{
$Wert =$uuid.UUID
$Ausgabe=$pc[$i] + ';'+$Wert
$Ausgabe
}
else
{
<#$b = $error | select Exception
$E = $b -split (:)
$x = $E[1]
$Error.Clear() #>
$Ausgabe=$pc[$i] + '; got no uuid'
$Ausgabe
}
}
else
{
$Ausgabe='PC not reached through ping.'
$Ausgabe
}
}
I hope this helps somebody.