Run hash-checking code as a PowerShell job

I looked into some of the posts here but didn't find an answer to my situation. I have the following script, which parses through multiple servers checking the hash of certain files. I use a foreach loop to do this, and I'd like to run the loop for each server as a job; otherwise it takes an awfully long time.
Here's the code:
$Modulos = Get-Content -Path .\mod.txt | Sort-Object -Unique
$Orig = "H:\Redsys_Client"
$Machines = Get-Content -Path .\servers.txt | Sort-Object -Unique

function CheckFileHash ($file, $file2, $cLogFile) {
    $hashSrc  = Get-FileHash $file  -Algorithm "SHA256"
    $hashDest = Get-FileHash $file2 -Algorithm "SHA256"
    if ((Test-Path -Path $cLogFile) -eq $False) {
        Add-Content -Path $cLogFile -Value "Algor; Hash_Orig; Orig; Algor; Hash_Dest; Dest; Result"
    }
    if ($hashSrc.Hash -ne $hashDest.Hash) {
        Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are NOT EQUAL."
    }
    elseif ($hashSrc.Hash -eq $hashDest.Hash) {
        Add-Content -Path $cLogFile -Value "$hashSrc; $hashDest; the files are EQUAL."
    }
}

foreach ($Server in $Machines) {
    foreach ($name in $Modulos) {
        CheckFileHash "$Orig\$name\$name.exe" "\\$Server\Redsys\$name\$name.exe" ".\test.csv"
    }
}
Does anyone have an idea how to do this kind of job? I'm afraid the jobs would all write to the CSV at once and make a mess of it; I'd like to try storing the results in a variable instead, but I don't know how.
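One approach, sketched here and untested: start one background job per server and have each job emit result objects instead of writing to the shared CSV; the parent session then collects everything with Receive-Job and writes the file once, so nothing fights over test.csv. (This assumes the H: drive behind $Orig is visible to background jobs, which run in separate processes.)

$jobs = foreach ($Server in $Machines) {
    Start-Job -Name $Server -ArgumentList $Server, $Modulos, $Orig -ScriptBlock {
        param($Server, $Modulos, $Orig)
        foreach ($name in $Modulos) {
            # hash the local original and the remote copy, as CheckFileHash does above
            $src  = Get-FileHash "$Orig\$name\$name.exe" -Algorithm SHA256
            $dest = Get-FileHash "\\$Server\Redsys\$name\$name.exe" -Algorithm SHA256
            $result = if ($src.Hash -eq $dest.Hash) { 'EQUAL' } else { 'NOT EQUAL' }
            # emit an object instead of writing to a shared log file
            [pscustomobject]@{
                Server    = $Server
                Module    = $name
                Hash_Orig = $src.Hash
                Hash_Dest = $dest.Hash
                Result    = $result
            }
        }
    }
}

# collect all results once the jobs finish and write the CSV in one place
$jobs | Wait-Job | Receive-Job |
    Select-Object Server, Module, Hash_Orig, Hash_Dest, Result |
    Export-Csv .\test.csv -NoTypeInformation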

Related

How to parse through folders and files using PowerShell?

I am trying to construct a script that moves through specific folders and the log files in them, filters out the error codes, and then passes them into a new file.
I'm not really sure how to do that with for loops, so I'll leave my code below.
If someone could tell me what I'm doing wrong, that would be greatly appreciated.
$file_name = Read-Host -Prompt 'Name of the new file: '
$path = 'C:\Users\user\Power\log_script\logs'

Add-Type -AssemblyName System.IO.Compression.FileSystem
function Unzip
{
    param([string]$zipfile, [string]$outpath)
    [System.IO.Compression.ZipFile]::ExtractToDirectory($zipfile, $outpath)
}

if ([System.IO.File]::Exists($path)) {
    Remove-Item $path
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
} else {
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
}

$folder = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles'
$files = foreach($logfolder in $folder) {
    $content = foreach($line in $files) {
        if ($line -match '([ ][4-5][0-5][0-9][ ])') {
            echo $line
        }
    }
}

$content | Out-File $file_name -Force -Encoding ascii
Inside the LogFiles folder are three more folders, each containing log files.
Thanks
Expanding on a comment above about recursing the folder structure, and then actually retrieving the content of the files, you could try something like this:
$allFiles = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -Recurse -File

# iterate the files (-File keeps directories out of the pipeline)
$allFiles | ForEach-Object {
    # iterate the content of each file, line by line
    Get-Content $_.FullName | ForEach-Object {
        if ($_ -match '([ ][4-5][0-5][0-9][ ])') {
            echo $_
        }
    }
}
It looks like your inner loop iterates a collection ($files) that doesn't exist yet. You assign $files the output of a foreach(...) loop, then try to nest another loop over $files inside it; at that point $files isn't available to be looped over.
Regardless, the core issue is that you never read the content of your log files. Even once you loop through the output of Get-ChildItem, you need to look at each line to perform the match.
Obviously I cannot completely test this, but I see a few issues and have rewritten it as below:
$file_name = Read-Host -Prompt 'Name of the new file'
$path = 'C:\Users\user\Power\log_script\logs'
$Pattern = '([ ][4-5][0-5][0-9][ ])'
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Expand-Archive 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
Select-String -Path 'C:\Users\user\Power\log_script\logs\LogFiles\*' -Pattern $Pattern |
    Select-Object -ExpandProperty Line |
    Out-File $file_name -Force -Encoding ascii
Note: Select-String cannot recurse on its own.
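If you do need recursion, a common pattern (sketched here with the same paths and $Pattern as above) is to let Get-ChildItem do the recursing and hand the files to Select-String:

# recurse with Get-ChildItem, then let Select-String do the matching
Get-ChildItem 'C:\Users\user\Power\log_script\logs\LogFiles' -Recurse -File |
    Select-String -Pattern $Pattern |
    Select-Object -ExpandProperty Line |
    Out-File $file_name -Force -Encoding ascii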
I'm not sure you need to write your own UnZip function. PowerShell has the Expand-Archive cmdlet, which can at least match the functionality so far:
Expand-Archive -Path <SourceZipPath> -DestinationPath <DestinationFolder>
Note: The -Force parameter allows it to overwrite the destination files if they are already present, which may be a substitute for testing whether the file exists and deleting it if it does.
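For instance, with the paths used above:

# -Force overwrites files already present in the destination
Expand-Archive -Path 'C:\Users\user\Power\log_script\logs.zip' -DestinationPath 'C:\Users\user\Power\log_script' -Force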
If you are going to test for the file, that section of code can be simplified as:
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
This is because you were going to run the UnZip command regardless...
Note: You could also use Test-Path for this.
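For example, the same guard written with Test-Path; note that Test-Path also matches directories, whereas [System.IO.File]::Exists returns $false for a folder like $path here:

# Test-Path matches files and directories alike; -Recurse because $path is a folder
if (Test-Path -Path $path) { Remove-Item $path -Recurse -Force }
Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'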
Also, there are innumerable ways to get the matching lines; here are a couple of extra samples:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -Recurse -File |
    ForEach-Object{
        ( Get-Content $_.FullName ) -match $Pattern
        # Using -match in this way will echo the lines that matched from each run of
        # Get-Content. If nothing matched, nothing is output on that iteration.
    } |
    Out-File $file_name -Force -Encoding ascii
This approach reads the entire file into an array before running the match on it. For large files that may pose a memory issue; however, it enables the clever use of -match.
OR:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -Recurse -File |
    Get-Content |
    ForEach-Object{ if( $_ -match $Pattern ) { $_ } } |
    Out-File $file_name -Force -Encoding ascii
Note: You don't need the alias echo or its underlying cmdlet Write-Output; PowerShell emits expression results implicitly.
UPDATE: After fiddling around a bit and trying different things, I finally got it to work.
I'll include the code below just for demonstration purposes.
Thanks everyone
$start = Get-Date
"`n$start`n"

$file_name = Read-Host -Prompt 'Name of the new file'
Out-File $file_name -Force -Encoding ascii

Expand-Archive -Path 'C:\Users\User\Power\log_script\logs.zip' -Force

$i = 1
$folders = Get-ChildItem -Path 'C:\Users\User\Power\log_script\logs\logs\LogFiles' -Name -Recurse -Include *.log

foreach($item in $folders) {
    $file = 'C:\Users\User\Power\log_script\logs\logs\LogFiles\' + $item
    $content = Get-Content $file
    Write-Progress -Activity "Filtering..." -Status "File $i of $($folders.Count)" -PercentComplete (($i / $folders.Count) * 100)
    $i++
    foreach($line in $content) {
        if ($line -match '([ ][4-5][0-5][0-9][ ])') {
            Add-Content -Path $file_name -Value $line
        }
    }
}

$end = Get-Date
$time = [int]($end - $start).TotalSeconds
Write-Output "Runtime: $time Seconds"

How can I separate my content in PowerShell?

This is my code: it runs correctly, but it prints my content all together. I was told to use the Write-Output command and write an empty string, but it's not coming out right. Does anyone have any suggestions?
$file1 = '/Users/raelynsade/Documents/cpt180stuff/pets/dogs/dognames.txt'
$file2 = '/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt'

$fileExist = (Test-Path -Path $file1) -AND (Test-Path -Path $file2)

if ($fileExist -eq $True) {
    $file_content = Get-Content -Path $file1
    Write-Output -InputObject $file_content

    $file_content = Get-Content -Path $file2
    Write-Output -InputObject $file_content

    Add-Content -Path "/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt" -Value "Sammy"
    Add-Content -Path "/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt" -Value "Luna"

    Get-Content -Path "/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt"
} else {
    Write-Output -InputObject "Unable to access one or more files"
}
If you put PowerShell's implicit output to work for you and enclose it all in a subexpression $(...), you can combine all the lines as you desire and output them once. Add the -PassThru parameter of Set-Content and you don't need to read the file again after writing.
$file1 = '/Users/raelynsade/Documents/cpt180stuff/pets/dogs/dognames.txt'
$file2 = '/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt'

$fileExist = (Test-Path -Path $file1) -AND (Test-Path -Path $file2)

if ($fileExist -eq $True){
    $(Get-Content -Path $file1
      ""
      Get-Content -Path $file2
      "Sammy"
      "Luna") | Set-Content -Path "/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt" -PassThru
} else {
    "Unable to access one or more files"
}
something like this?
$file1 = '/Users/raelynsade/Documents/cpt180stuff/pets/dogs/dognames.txt'
$file2 = '/Users/raelynsade/Documents/cpt180stuff/pets/cats/catnames.txt'

$fileExist = (Test-Path -Path $file1) -AND (Test-Path -Path $file2)

if ($fileExist)
{
    # content of the dog file
    Get-Content -Path $file1
    " "
    # content of the cat file with the new values added
    Add-Content -Path $file2 -Value "Sammy", "Luna"
    Get-Content -Path $file2
}
else
{
    "Unable to access one or more files"
}

Preview a hash before writing a file to disk

I wrote code to hash some files, separated by extension, within a folder. At the end, a hash file is generated. Is there a way to preview the hash in "#MP3_DIR_$hash.txt" before the file is written to disk? I ask because it would save disk reads and writes and speed up the script.
$directory = (Get-ChildItem -Recurse -Directory).FullName

foreach ($path in $directory) {
    if (!(Test-Path -Path "$path\#MP3_DIR_*.txt")) {
        $array = @()
        Get-ChildItem -Path "$path\*.mp3" | Foreach {((Get-FileHash "$_" -Algorithm MD5).Hash)} | ForEach-Object { $array += "$_" }
    }
    if (!($array -eq $null)) {
        $array = $array | Where-Object {$_}
        $array = $array | Sort-Object
        $hashfile = (Get-FileHash -Algorithm MD5 -InputStream ([System.IO.MemoryStream]::New([System.Text.Encoding]::ASCII.GetBytes($array)))).Hash
        $array | Set-Content -LiteralPath "$path\#MP3_DIR_$hashfile.txt"
    }
}
The variable "$hashfile" not output the same hash of the file writed as "$path#MP3_DIR_$hashfile.txt". How to preview the hash of this file before write the file to disc?

Memory exception while filtering large CSV files

I'm getting a memory exception while running this code. Is there a way to filter one file at a time, write the output, and append after processing each file? The code below seems to load everything into memory.
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Get-ChildItem $inputFolder -File -Filter '*.csv' |
    ForEach-Object { Import-Csv $_.FullName } |
    Where-Object { $_.machine_type -eq 'workstations' } |
    Export-Csv $outputFile -NoType
Maybe you can import, filter, and export your files one by one, appending the result to your output file like this:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Remove-Item $outputFile -Force -ErrorAction SilentlyContinue
Get-ChildItem $inputFolder -Filter "*.csv" -File | ForEach-Object {
    Import-Csv $_.FullName |
        Where-Object machine_type -eq 'workstations' |
        Export-Csv $outputFile -Append -NoTypeInformation
}
Note: The reason for not using Get-ChildItem ... | Import-Csv ... - i.e., for not directly piping Get-ChildItem to Import-Csv and instead having to call Import-Csv from the script block ({ ... }) of an auxiliary ForEach-Object call - is a bug in Windows PowerShell that has since been fixed in PowerShell Core; see the bottom section for a more concise workaround.
However, even output from ForEach-Object script blocks should stream to the remaining pipeline commands, so you shouldn't run out of memory - after all, a salient feature of the PowerShell pipeline is object-by-object processing, which keeps memory use constant, irrespective of the size of the (streaming) input collection.
You've since confirmed that avoiding the aux. ForEach-Object call does not solve the problem, so we still don't know what causes your out-of-memory exception.
Update:
This GitHub issue contains clues as to the reason for excessive memory use, especially with many properties that contain small amounts of data.
This GitHub feature request proposes using strongly typed output objects to help address the issue.
The following workaround, which uses the switch statement to process the files as text files, may help:
$header = ''
Get-ChildItem $inputFolder -Filter *.csv | ForEach-Object {
    $i = 0
    switch -Wildcard -File $_.FullName {
        '*workstations*' {
            # NOTE: If no other columns contain the word `workstations`, you can
            # simplify and speed up the command by omitting the `ConvertFrom-Csv` call
            # (you can make the wildcard matching more robust with something
            # like '*,workstations,*')
            if ((ConvertFrom-Csv "$header`n$_").machine_type -ne 'workstations') { continue }
            $_ # row whose 'machine_type' column value equals 'workstations'
        }
        default {
            if ($i++ -eq 0) {
                if ($header) { continue }  # header already written
                else { $header = $_; $_ }  # header row of 1st file
            }
        }
    }
} | Set-Content $outputFile
Here's a workaround for the bug of not being able to pipe Get-ChildItem output directly to Import-Csv, by passing it as an argument instead:
Import-Csv -LiteralPath (Get-ChildItem $inputFolder -File -Filter *.csv) |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Note that in PowerShell Core you could more naturally write:
Get-ChildItem $inputFolder -File -Filter *.csv | Import-Csv |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Solution 2:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8 # modify encoding if necessary
$Delimiter=','
#find header for your files => i take first row of first file with data
$Header = Get-ChildItem -Path $inputFolder -Filter *.csv | Where length -gt 0 | select -First 1 | Get-Content -TotalCount 1
#if not header founded then not file with sise >0 => we quit
if(! $Header) {return}
#create array for header
$HeaderArray=$Header -split $Delimiter -replace '"', ''
#open output file
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
#write header founded
$w.WriteLine($Header)
#loop on file csv
Get-ChildItem $inputFolder -File -Filter "*.csv" | %{
#open file for read
$r = New-Object System.IO.StreamReader($_.fullname, $encoding)
$skiprow = $true
while ($line = $r.ReadLine())
{
#exclude header
if ($skiprow)
{
$skiprow = $false
continue
}
#Get objet for current row with header founded
$Object=$line | ConvertFrom-Csv -Header $HeaderArray -Delimiter $Delimiter
#write in output file for your condition asked
if ($Object.machine_type -eq 'workstations') { $w.WriteLine($line) }
}
$r.Close()
$r.Dispose()
}
$w.close()
$w.Dispose()
You have to read and write to the .csv files one row at a time, using StreamReader and StreamWriter:
$filepath = "C:\Change\2019\October"
$outputfile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8
$files = Get-ChildItem -Path $filePath -Filter *.csv |
Where-Object { $_.machine_type -eq 'workstations' }
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
$skiprow = $false
foreach ($file in $files)
{
$r = New-Object System.IO.StreamReader($file.fullname, $encoding)
while (($line = $r.ReadLine()) -ne $null)
{
if (!$skiprow)
{
$w.WriteLine($line)
}
$skiprow = $false
}
$r.Close()
$r.Dispose()
$skiprow = $true
}
$w.close()
$w.Dispose()
Get-Content *.csv | Add-Content combined.csv
Make sure combined.csv doesn't exist when you run this, or it's going to go full Ouroboros: the output file is itself matched by *.csv, so it would be fed back into itself.
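One way to guard against that, as a sketch:

# exclude the output file so it is never appended to itself
Get-ChildItem *.csv | Where-Object Name -ne 'combined.csv' | Get-Content | Add-Content combined.csv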

powershell backup script with error logging per file

I really need help creating a script that backs up files and logs the error along with each file that did not copy.
Here is what I tried:
Creating lists of file paths to pass on to Copy-Item, in hopes of catching errors per file and logging them later.
By using $list2X I would be able to cycle through each file, but Copy-Item loses the directory structure and dumps everything into a single folder.
So for now I am using $list2, and later I run Copy-Item -Recurse to copy the folders:
# create the list to copy
$list = Get-ChildItem -Path $source | Select-Object FullName
$list2 = $list -replace ("}"), ("")
$list2 = $list2 -replace ("@{Fullname="), ("")
Out-File -FilePath g:\backuplog\DirList.txt -InputObject $list2

# create a list to cross-check later
$listX = Get-ChildItem -Path $source -Recurse | Select-Object FullName
$list2X = $listX -replace ("}"), ("")
$list2X = $list2X -replace ("@{Fullname="), ("")
Out-File -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.Clear()
foreach ($item in $list2) {
    Copy-Item -Path $item -Destination $destination -Recurse -Force -ErrorAction Continue
}
Out-File -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
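For example, a sketch of an invocation (the options here are illustrative, not tuned to the asker's setup) that copies a whole tree and logs the outcome of every file:

# /E copies subfolders (including empty ones), /V also lists skipped files,
# /R and /W limit retries and wait time, /TEE writes to both console and log file
robocopy $source $destination /E /V /R:2 /W:5 /TEE /LOG:g:\backuplog\robocopy.log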
Bill
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next, make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName # Requires PS 3.0
# or
$files = Get-ChildItem $path | ForEach-Object { $_.FullName }

$files | Out-File $outpath
Well, it took me a long time considering my response time, but here is my copy function, which logs most errors (network drops, failed copies, etc.) along with each error's TargetObject.
function backUP {
    Param ([string]$destination1, $list1)

    $destination2 = $destination1

    # build the log path from the last component of the destination path
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.Length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1

    foreach ($item in $list1) {
        Copy-Item -Container:$true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?) {
            Write-Output "Copy error: " $error | Format-List | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            foreach ($erritem in $error) {
                Write-Output "Error Data:" $erritem.TargetObject | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}
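A hypothetical invocation, reusing the $list2 and $logdrive names from the snippets above (the destination path is made up for illustration):

$logdrive = 'g:\backuplog'   # assumed global that backUP reads
backUP -destination1 'D:\Backup\Redsys' -list1 $list2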
}