Saving SHA1 algorithm outputs to a text file - powershell

Morning,
I've got this PowerShell script, as follows:
$zips = Get-ChildItem 'C:\Users\X\Desktop\Powershell\Zip\' -Filter *.zip
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
foreach ($file in $zips) {
$return = "" | Select Name, Hash
$return.name = $file.Name
$return.hash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.FullName)))
Write-Output $return
[System.IO.File]::WriteAllText("C:\Users\X\Desktop\Info\" + $_.Name + ".txt", $_.FullName)
}
Read-Host -Prompt "Press Enter to Exit"
So it calculates the SHA1 hash for all the zip files in the 'Zip' folder. I am trying to get it to save the hashes in the location \desktop\info\, with an individual text file for each zip file.
The hash is calculated with no problems and it prints within the PowerShell window, but all the script does at the moment is create a single text file with no data in it and no name.
What am I missing? I've tried using information from another script that does something very similar, but I can't seem to produce the same result.
Any help is appreciated.

Inside a foreach statement $_ is not populated (it only has a value in a pipeline such as ForEach-Object), so both the file name and the content you pass to WriteAllText end up empty. Use the $return object you just built, and write the hash rather than the full path. Try this:
[System.IO.File]::WriteAllText("C:\Users\X\Desktop\Info\"+$return.Name+".txt",$return.Hash)

Instead of
[System.IO.File]::WriteAllText("C:\Users\X\Desktop\Info\" + $_.Name + ".txt", $_.FullName)
try this:
$return.hash | Out-File "C:\Users\X\Desktop\Info\$($file.BaseName).txt"
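For reference, a minimal sketch of the whole corrected loop, keeping the folder layout from the question:
$zips = Get-ChildItem 'C:\Users\X\Desktop\Powershell\Zip\' -Filter *.zip
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
foreach ($file in $zips) {
    # Compute the SHA1 hash of the zip's bytes and format it as a hex string
    $hash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.FullName)))
    # Write one text file per zip, named after the zip, containing its hash
    [System.IO.File]::WriteAllText("C:\Users\X\Desktop\Info\" + $file.BaseName + ".txt", $hash)
}
On PowerShell 4.0 or later, Get-FileHash -Algorithm SHA1 could do the hashing in a single call, but the SHA1CryptoServiceProvider approach above also works on older versions.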

Related

Auto rename files with sequential numbers and basename in powershell

I have some video files that I need to rename.
The name is something like this: [video name] [number].[file-extension]
I have recently switched my media player software, which requires a special naming order.
The order is as follows: [video name] e[increment starting at 01].[file-extension]
Additionally, the media player requires a folder structure like this: C:\Media\[series]\[season (increment starting at 01)]
I can do the folder structure manually, and renaming the files manually would be a possibility too, but I'd like to automate the process to save some time.
The best way to create the filename would probably be to check the path to the file like this:
$path = Get-Location
get-childitem "$path" -recurse | where {$_.extension -eq ".mkv"}
and detect the path before \season[number]. Ideally, the script would then rename the file like this: [video name = path (before season)] and then add e[increment starting at 01], based on a script like this:
$i = 1
Get-ChildItem *.mkv | %{Rename-Item $_ -NewName ('$_.Fullname{0:D4}.mkv' -f $i++)}
as seen here:
Bulk renaming of files with powershell
however the media player will get confused if the series has 12 episodes and the filename is like this: s01e001
If it is not possible to do the part of getting the name based on the path, I'd like to have a script that renames the file to [series name] e[increment start from 01].mkv
Are there any ways to create a script to rename the files?
Here is a script that I use for that:
Write-Output "Press Ctrl+c to Abort"
Write-Output ""
Write-Output "This Script will rename all files of a specified file type with a zero-padded sequential "
Write-Output "number like 01.jpg. An optional basename ending before file type can be added like 01-Jake.jpg"
Write-Output "Enter an integer not including the zero-padded numbers to start the sequential numbering."
[int]$startInt = Read-Host "Example 0 or 1 or 10... etc."
$file_type = Read-Host 'Enter the file type that you would like to target. Example: .jpg .png .m4v .mp4 .pdf'
# $fileNameEnding = Read-Host 'Add a file basename ending before its file type extension.'
$fileNameEnding = Read-Host 'Add an optional basename ending before file type extension. Enter it now or press enter to skip'
Get-ChildItem *$file_type | ForEach-Object{Rename-Item $_ -NewName ("{0:D2}$fileNameEnding$file_type" -f $startInt++)}
Where {0:D2} is used, the 2 is the total number of digits the value is padded to with leading zeros, so {0:D4} would pad to four digits.
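For example (just a quick illustration of the format specifier, not part of the script):
"{0:D2}" -f 7    # 07
"{0:D4}" -f 7    # 0007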
You can run this script to see how the different parts of the file name can be accessed.
$filePath = Read-Host 'Enter the file name and path to see its parts'
Write-Output ""
Write-Output "Extension: $((Get-Item $filePath ).Extension)"
Write-Output "Basename: $((Get-Item $filePath ).Basename)"
Write-Output "Name: $((Get-Item $filePath ).Name)"
Write-Output "DirectoryName: $((Get-Item $filePath ).DirectoryName)"
Write-Output "FullName: $((Get-Item $filePath ).FullName)"
Write-Output ""
Write-Output "Mode: $((Get-Item $filePath ).Mode)"

Powershell not sending the right path for a file as argument

I'm trying to apply a hash function to all the files inside a folder as some kind of version control. The idea is to make a text file that lists the name of each file and the generated checksum. Digging online I found some code that should do the trick (in theory):
$list = Get-ChildItem 'C:\users\public\documents\folder' -Filter *.cab
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
foreach ($file in $list) {
$return = "" | Select Name, Hash
$returnname = $file.Name
$returnhash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.Name)))
$return = "$returnname,$returnhash"
Out-File -FilePath .\mylist.txt -Encoding Default -InputObject ($return) -Append
}
When I run it, however, I get an error because it tries to read the files from c:\users\me\, the folder where I'm running the script, and the file c:\users\me\aa.cab does not exist and hence can't be reached.
I've tried everything that I could think of, but no luck. I'm using Windows 7 with Powershell 2.0, if that helps in any way.
Try with .FullName instead of just .Name. The .Name property is only the file name, so the .NET ReadAllBytes call resolves it against the process's current directory (which PowerShell does not keep in sync with your script's location), whereas .FullName is the absolute path:
$returnhash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.FullName)))
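Put together, the loop from the question with only that change applied (output file name kept as in the original) would look something like this:
$list = Get-ChildItem 'C:\users\public\documents\folder' -Filter *.cab
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
foreach ($file in $list) {
    # FullName is the absolute path, so the file is found no matter where the script runs from
    $hash = [System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file.FullName)))
    Out-File -FilePath .\mylist.txt -Encoding Default -InputObject "$($file.Name),$hash" -Append
}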

Powershell script to extract Cab file contents and move files to new locations

I've been given 14K CAB files each containing 200 files which need to be unzipped into their original locations.
Unfortunately it's not as easy as all of them being extracted to the same location :-(
I've decided to use PowerShell and have generated a list of individual file locations for each file using SQL, and I can extract the CABs; unfortunately they all extract to the current location.
I am trying to move them to their respective locations, but am struggling.
Here's the code I've got so far:
$shell_app=new-object -com shell.application
$CABfilename= Import-CSV "CABFileList.csv" -Header CABfilename | Foreach-object {
$zip_file = $shell_app.namespace((Get-Location).Path + "\$CABfilename")
$destination = $shell_app.namespace((Get-Location).Path)
$destination.Copyhere($zip_file.items())
$dvs = Import-csv "CABFileList.csv" -Header Path, DVSFilename |
Foreach-object{
Move-item $_.DVSFilename* $_.Path
}
This is an old question, but someone might find the answer useful anyway. I have adapted a script I made today that downloads all WSPs from a farm and extracts their contents.
Import-CSV "CABFileList.csv" -Header CABfilename | ForEach-Object {
    # Grab the solution; $SaveLocation is assumed to already hold the folder containing the CABs
    $CABfilename = $_.CABfilename
    $Path = $SaveLocation + $CABfilename
    # Check the path is ok
    $Path
    # Make a copy with extension '.cab' for extraction
    $DotCab = $CABfilename + ".cab"
    $SolutionDir = $DotCab -replace '\.wsp\.cab$'
    mkdir $SolutionDir
    Copy-Item $CABfilename $DotCab
    # Now extract it, assuming you have expand.exe on the filesystem (it should be there on anything after Server 2008 / Vista)
    if (Test-Path C:\Windows\System32\expand.exe) {
        try { cmd.exe /c "C:\Windows\System32\expand.exe -F:* $DotCab $SolutionDir" }
        catch { Write-Host "Nope, don't have that, soz." }
    }
}
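The move step from the question could then be handled in a second pass over the same CSV. A sketch, assuming the first two columns of the CSV are the destination path and the DVS file name, as in the question:
Import-Csv "CABFileList.csv" -Header Path, DVSFilename | ForEach-Object {
    # Move every extracted file whose name starts with the DVS file name to its destination path
    Move-Item -Path "$($_.DVSFilename)*" -Destination $_.Path
}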

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem and I haven't found a helpful resource yet. My biggest concern is that it might not work in PowerShell at all... At the moment I'm trying to build a small PowerShell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. Hardware-ID, Product Key and stuff like that) and generates *.txt files.
My point is, if you have 20, 30 or 80 servers in a project, it takes a huge amount of time to browse all the files, look for just the lines you need and put them together in a *.csv file.
What I have working is more like the basic skeleton of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the keywords themselves, not the values that follow them, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell PowerShell to search for the "Product Type:" line and pick out the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem, which is why I can't just search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell each time. The code looks like this:
Function getStringMatch
{
# Loop through the project directory
Foreach ($file In $files)
{
# Check all keywords
ForEach ($control In $controls)
{
$result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
If ($result -eq $True)
{
$match = $file.FullName
# Write the filename according to the entry
"Found : $control in: $match" | Out-File $output -Append
}
}
}
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -quiet option; this returns a match object, one of whose properties is the matching line. I then split the line on the ':' and trim any spaces. These results are placed into a new PSObject, which in turn is added to an array. The array is then put back on the pipeline at the end.
I also moved the call to get-content to avoid reading each file more than once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
# load the content once
$content = Get-Content $file.FullName
# Check all keywords
ForEach ($control In $controls)
{
# find the line containing the control string
$result = $content | Select-String $control -casesensitive
If ($result)
{
# tidy up the results and add to the array
$line = $result.Line -split ":"
$results += New-Object PSObject -Property @{
FileName = $file.FullName
Control = $line[0].Trim()
Value = $line[1].Trim()
}
}
}
}
# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line from your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
"Operating System",
"Product Type",
"Service Pack"
)
foreach ($line in $config)
{
$keys | %{
$regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
if ($line -match $regex)
{
$value = $matches.value
Write-Host "Key: $_`t`tValue: $value"
}
}
}
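To generalize that to every report in a folder and land the values in a CSV, a sketch along the same lines (the report folder and output path here are placeholders):
$keys = @("Operating System", "Product Type", "Service Pack")
$results = foreach ($file in Get-ChildItem "C:\Reports" -Filter *.txt) {
    foreach ($line in Get-Content $file.FullName) {
        foreach ($key in $keys) {
            # Capture whatever follows "<key>:" on the line
            if ($line -match "^\s*$([regex]::Escape($key))\s*:\s*(?<value>.*?)\s*$") {
                New-Object PSObject -Property @{
                    FileName = $file.FullName
                    Key      = $key
                    Value    = $matches.value
                }
            }
        }
    }
}
$results | Export-Csv -Path "results.csv" -NoTypeInformation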

PowerShell Wildcard Not Returning All Files

I'm new to PowerShell and have been trying to get this script to work.
If ((Get-Date -UFormat %a) -eq "Mon") {$intSubtract = -3}
Else {$intSubtract = -1}
$datDate = (Get-Date).AddDays($intSubtract)
Write-Output "Find expected file --------------"
$strDate = ($datDate).ToString('yyyyMMdd')
Write-Host "strDate: $strDate"
$arrGetFile = Get-ChildItem -Path "\\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt"
$strLocalFileName = $arrGetFile
If ($arrGetFile.count -ne 2)
{
Throw "No file or more than two files with today's date exists!"
}
Else {$strLocalFileName = $arrGetFile[0].Name}
Write-Output "Found file $strLocalFileName --------------"
#Encrypt each file
foreach ($arrGetFile in $strPath)
{
Write-Output "Start Encrypt --------------"
$strPath = "\\Computer\Data\States\NorthDakota\Cities\"
$FileAndPath = Join-Path $strPath $strLocalFileName
$Recipient = "0xA49B4B5D"
Import-Module \\JAMS\C$\PSM_PGP.psm1
Get-Module Encrypt
Encrypt $FileAndPath $Recipient
$strLocalFileNamePGP = $strLocalFileName + ".pgp"
Write-Output "End Encrypt --------------"
}
#Archive files
Write-Output "Archiving --------------"
move-item -path \\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt -destination \\Computer\Data\States\NorthDakota\Cities\Archive
The Cities folder will contain two files. Example 2015_Bismark_20150626_183121.txt and 2015_Bismark_20150626_183121_Control.txt
I am trying to get both files encrypted; however, it is only finding and encrypting the file without _Control. It is archiving both files correctly.
Not sure what I am missing to also find the control file.
Your foreach loop is incorrect. You have foreach ($arrGetFile in $strPath), but $strPath doesn't contain anything at that point.
Your foreach loop should be:
foreach ($LocalFile in $arrGetFile)
And you need to remove the following line:
$strLocalFileName = $arrGetFile
This is making $strLocalFileName an array of file objects, but later in the script you are treating it like a string. You may have more logical errors; you need to walk through the script very carefully, identify each variable, and make sure it contains what you expect it to contain.
In general you seem to be treating arrays of non-string objects as if they are strings. Note that I changed your $strLocalFileName variable to $LocalFile. This is because it is a file object, not a string object.
Following is a sample that just shows that the foreach loop iterates through both files.
If ((Get-Date -UFormat %a) -eq "Mon") {$intSubtract = -3}
Else {$intSubtract = -1}
$datDate = (Get-Date).AddDays($intSubtract)
Write-Output "Find expected file --------------"
$strDate = ($datDate).ToString('yyyyMMdd')
Write-Host "strDate: $strDate"
$arrGetFile = Get-ChildItem -Path "\\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt"
If ($arrGetFile.count -ne 2)
{
Throw "No file or more than two files with today's date exists!"
}
Write-Output "Found files " ($arrGetFile | select Name | fl) "--------------"
#Process each file
foreach ($LocalFile in $arrGetFile)
{
$FileAndPath = Join-Path $LocalFile.DirectoryName $LocalFile
$FileAndPath
}
Start with this and then carefully add your encryption processing back into the loop.
Also, the line that assigns $FileAndPath could be removed; you can just use $LocalFile.FullName wherever you need the full path and file name.
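A rough sketch of what the loop might look like with the encryption added back in (the module path, the Encrypt function and the recipient key are copied from the question and are assumptions here):
Import-Module \\JAMS\C$\PSM_PGP.psm1
$Recipient = "0xA49B4B5D"
foreach ($LocalFile in $arrGetFile)
{
    Write-Output "Start Encrypt --------------"
    # FullName already includes the directory, so no Join-Path is needed
    Encrypt $LocalFile.FullName $Recipient
    $strLocalFileNamePGP = $LocalFile.Name + ".pgp"
    Write-Output "End Encrypt --------------"
}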