I cannot use the built-in Get-FileHash cmdlet to generate a checksum value because my PowerShell version is lower than 4.
Is there an alternative way of getting or validating the integrity of the file?
OK, let's assume you have a file item (from Get-ChildItem, for example):
$stream = new-object system.IO.FileStream($item.fullname, "Open", "Read", "ReadWrite")
You open the file with FileStream to get a stream object.
Then you can use one of the Crypto classes to compute its hash:
if ($stream)
{
$sha = new-object -type System.Security.Cryptography.SHA256Managed
$bytes = $sha.ComputeHash($stream)
$stream.Close()
$stream.Dispose()
$sha.Dispose()
$checksum = [System.BitConverter]::ToString($bytes).Replace("-", [String]::Empty).ToLower();
}
Finally, the checksum is in $checksum, and it's a nice string you can use for your comparison:
5989b3cdcff6a594b2b2aef7f6288f7727019c037515c2b10627721e707cf613
You have all sorts of classes to compute hashes under System.Security.Cryptography; you can see what is available here: https://msdn.microsoft.com/en-us/library/system.security.cryptography(v=vs.110).aspx
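The same pattern generalizes to any of those algorithms. Here is a minimal sketch that wraps it into a reusable function (the function name and parameters are illustrative, not part of the original answer):
function Get-FileChecksum {
    param(
        [string]$Path,
        [string]$Algorithm = "SHA256"   # e.g. MD5, SHA1, SHA256, SHA512
    )
    # HashAlgorithm.Create builds the right provider from its name
    $hasher = [System.Security.Cryptography.HashAlgorithm]::Create($Algorithm)
    $stream = New-Object System.IO.FileStream($Path, "Open", "Read", "ReadWrite")
    try {
        $bytes = $hasher.ComputeHash($stream)
        [System.BitConverter]::ToString($bytes).Replace("-", [String]::Empty).ToLower()
    }
    finally {
        $stream.Close()
        $hasher.Clear()   # Clear() releases the provider and also works on older .NET versions
    }
}
# Usage: Get-FileChecksum -Path 'C:\temp\file.zip' -Algorithm MD5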
I'm trying to convert a few thousand home videos to a smaller format. However, encoding the video changed the created and modified timestamps to today's date. I wrote a PowerShell script that successfully (somehow) worked by writing the original file's modified timestamp to the new file.
However, I couldn't find a way in powershell to modify the "Media created" timestamp in the file's details properties. Is there a way to add a routine that would either copy all of the metadata from the original file, or at least set the "media created" field to the modified date?
When I searched for file attributes, it looks like the only options are archive, hidden, etc. Attached is the PowerShell script that I made (please don't laugh too hard, haha). Thank you
$filepath1 = 'E:\ConvertedMedia\Ingest\' # directory with incorrect modified & create date
$filepath2 = "F:\Backup Photos 2020 and DATA\Data\Photos\Photos 2021\2021 Part1\Panasonic 3-2-21\A016\PRIVATE\PANA_GRP\001RAQAM\" # directory with correct date and same file name (except extension)
$destinationCodec = "*.mp4" # Keep * in front of extension
$sourceCodec = ".mov"
Get-ChildItem $filepath1 -File $destinationCodec | Foreach-Object { # change *.mp4 to the extension of the newly encoded files with the wrong date
$fileName = $_.Name # sets fileName variable (with extension)
$fileName # Optional used during testing- sends the file name to the console
$fileNameB = $_.BaseName # sets fileNameB variable to the filename without extension
$filename2 = "$filepath2" + "$fileNameB" + "$sourceCodec" # assembles filepath for source
$correctTime = (Get-Item $filename2).lastwritetime # used for testing - just shows the correct time in the output, can comment out
$correctTime # prints the correct time
$_.lastwritetime = (Get-Item $filename2).lastwritetime # modifies lastwritetime of filepath1 to match filepath2
$_.creationTime = (Get-Item $filename2).lastwritetime # modifies creation times to match lastwritetime (comment out if you need creation time to be the same)
}
Update:
I think I need to use Shell.Application, but I'm getting an error message "duplicate keys ' ' are not allowed in hash literals" and am not sure how to incorporate it into the original script.
I only need the "date modified" attribute to be the same as "lastwritetime." The other fields were added just for testing. I appreciate your help!
$tags = "people; snow; weather"
$cameraModel = "AG-CX10"
$cameraMaker = "Panasonic"
$mediaCreated = "2/16/1999 5:01 PM"
$com = (New-Object -ComObject Shell.Application).NameSpace('C:\Users\philip\Videos') #Not sure how to specify file type
$com.Items() | ForEach-Object {
New-Object -TypeName PSCustomObject -Property @{
Name = $com.GetDetailsOf($_,0) # lists current extended properties
Tags = $com.GetDetailsOf($_,18)
CameraModel = $com.GetDetailsOf($_,30)
CameraMaker = $com.GetDetailsOf($_,32)
MediaCreated = $com.GetDetailsOf($_,208)
$com.GetDetailsOf($_,18) = $tags # sets extended properties
$com.GetDetailsOf($_,30) = $cameraModel
$com.GetDetailsOf($_,32) = $cameraMaker
$com.GetDetailsOf($_,32) = $mediaCreated
}
}
(Screenshots attached: the script example and the file Properties window.)
I think your best option is to drive an external tool/library from PowerShell rather than using the shell (I'm not sure you can actually set values that way, to be honest).
It's definitely possible to use FFmpeg to set the "Media created" metadata of a file like this:
ffmpeg -i input.MOV -metadata creation_time=2000-01-01T00:00:00.0000000+00:00 -codec copy output.MOV
This copies the input.MOV file to a new file, output.MOV, and sets the "Media created" metadata on output.MOV. It is very inefficient, but it does work.
You can script ffmpeg with something like the code below. As written, the script only prints the FFmpeg commands to the screen; the commented-out Start-Process line can be used to actually execute ffmpeg.
gci | where Extension -eq ".mov" | foreach {
$InputFilename = $_.FullName;
$OutputFilename = "$($InputFilename)-fixed.mov";
Write-Host "Reading $($_.Name). Created: $($_.CreationTime). Modifed: $($_.LastWriteTime)";
$timestamp = Get-Date -Date $_.CreationTime -Format O
Write-Host "ffmpeg -i $InputFilename -metadata creation_time=$timestamp -codec copy $OutputFilename"
# Start-Process -Wait -FilePath C:\ffmpeg\bin\ffmpeg.exe -ArgumentList @("-i $InputFilename -metadata creation_time=$timestamp -codec copy $($OutputFilename)")
}
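One thing to keep in mind (not covered in the original answer): because -codec copy writes a brand-new file, the output's filesystem timestamps will again be "today". A rough sketch of copying them back from the source afterwards, assuming the "-fixed.mov" naming used above:
gci | where Extension -eq ".mov" | where Name -notlike "*-fixed.mov" | foreach {
    $fixed = Get-Item "$($_.FullName)-fixed.mov" -ErrorAction SilentlyContinue
    if ($fixed) {
        $fixed.CreationTime  = $_.CreationTime   # restore original filesystem timestamps
        $fixed.LastWriteTime = $_.LastWriteTime
    }
}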
How do I compare the output of Get-FileHash directly with the output of Properties.ContentMD5?
I'm putting together a PowerShell script that takes some local files from my system and copies them to an Azure Blob Storage Container.
The files change daily so I have added in a check to see if the file already exists in the container before uploading it.
I use Get-FileHash to read the local file:
$LocalFileHash = (Get-FileHash "D:\file.zip" -Algorithm MD5).Hash
Which results in $LocalFileHash holding this: 67BF2B6A3E6657054B4B86E137A12382
I use this code to get the checksum of the blob file already transferred to the container:
$BlobFile = "Path\To\file.zip"
$AZContext = New-AZStorageContext -StorageAccountName $StorageAccountName -SASToken "<token here>"
$RemoteBlobFile = Get-AzStorageBlob -Container $ContainerName -Context $AZContext -Blob $BlobFile -ErrorAction Ignore
if ($RemoteBlobFile) {
$cloudblob = [Microsoft.Azure.Storage.Blob.CloudBlockBlob]$RemoteBlobFile.ICloudBlob
$RemoteBlobHash = $cloudblob.Properties.ContentMD5
}
This value of $RemoteBlobHash is set to Z78raj5mVwVLS4bhN6Ejgg==
No problem, I thought, I'll just decode the Base64 string and compare:
$output = [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($RemoteBlobHash))
Which gives me g�+j>fWKK��7�#� so not directly comparable ☹
This question shows someone in a similar pickle but I don't think they were using Get-FileHash given the format of their local MD5 result.
Other things I've tried:
changing UTF8 in the System.Text.Encoding line above to UTF16 and ASCII, which changes the output but not to anything recognisable.
dabbling with GetBytes to see if that helped:
$output = [System.Text.Encoding]::UTF8.GetBytes([System.Text.Encoding]::UTF16.GetString([System.Convert]::FromBase64String($RemoteBlobHash)))
Note: Using md5sum to compare the local file and a downloaded copy of file.zip results in the same MD5 string as Get-FileHash: 67BF2B6A3E6657054B4B86E137A12382
Thank you in advance!
ContentMD5 is a base64 representation of the binary hash value, not the resulting hex string :)
$md5sum = [convert]::FromBase64String('Z78raj5mVwVLS4bhN6Ejgg==')
$hdhash = [BitConverter]::ToString($md5sum).Replace('-','')
Here we convert base64 -> binary -> hexadecimal
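The converted value can then be compared directly against the Get-FileHash result from the question (PowerShell's -eq on strings is case-insensitive, so the casing difference doesn't matter):
$hdhash                     # 67BF2B6A3E6657054B4B86E137A12382
$hdhash -eq $LocalFileHash  # True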
If you need to do it the other way around (ie. for obtaining a local file hash, then using that to search for blobs in Azure), you'll first need to split the hexadecimal string into byte-size chunks, then convert the resulting byte array to base64:
$hdhash = '67BF2B6A3E6657054B4B86E137A12382'
$bytes = [byte[]]::new($hdhash.Length / 2)
for($i = 0; $i -lt $bytes.Length; $i++){
$offset = $i * 2
$bytes[$i] = [convert]::ToByte($hdhash.Substring($offset,2), 16)
}
$md5sum = [convert]::ToBase64String($bytes)
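As a quick sanity check of the round trip, $md5sum ends up as Z78raj5mVwVLS4bhN6Ejgg== again, and it can be compared against the blob property from the question (base64 is case-sensitive, so -ceq is the safer comparison). A sketch reusing the question's variables:
$remoteBase64 = $RemoteBlobFile.ICloudBlob.Properties.ContentMD5
if ($md5sum -ceq $remoteBase64) {
    Write-Host "File already exists in the container - skipping upload"
}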
How can data like strings, byte arrays, and IO streams be hashed using common hashing algorithms like MD4, MD5, SHA1, etc.?
I am writing a script that makes backups of drives, and to prevent unnecessary copies and to detect whether files have become corrupted, it needs to hash files quickly with some hashing algorithm like MD4.
If anyone has an idea how to hash files, IO streams, byte arrays, strings, etc. using any hashing algorithm, please let me know. Also, the Get-FileHash cmdlet doesn't exist on all the Windows installations I've encountered.
Create an instance of [System.Security.Cryptography.MD5], then pass a file stream to its ComputeHash() method:
function Get-MD5Sum
{
param(
[Parameter(Mandatory, ValueFromPipelineByPropertyName)]
[Alias('PSPath')]
[string[]]$Path
)
begin {
$md5 = [System.Security.Cryptography.MD5]::Create()
}
process {
foreach($filePath in $Path){
# Resolve filesystem item
$file = Get-Item -LiteralPath $filePath
# Skip if not a file
if($file -isnot [System.IO.FileInfo]){
continue
}
# Open a stream to read the file
$filestream = $file.OpenRead()
try {
# Calculate + format hash, then output
Write-Output $([pscustomobject]@{
File = $file.FullName
Hash = [BitConverter]::ToString($MD5.ComputeHash($filestream)) -replace '-'
})
}
finally {
# close file stream handle
$filestream.Dispose()
}
}
}
end {
# Dispose of the hash provider
$MD5.Dispose()
}
}
Now you can calculate MD5 file hashes without Get-FileHash:
PS C:\> $fileHashes = Get-ChildItem . | Get-MD5Sum
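For the backup scenario in the question, the output objects can then be compared between the source and the backup copy, for example (a rough sketch; the paths are placeholders):
$source = Get-ChildItem D:\Data -Recurse | Get-MD5Sum
$backup = Get-ChildItem E:\Backup -Recurse | Get-MD5Sum
# Files whose hash appears on only one side are new, changed or corrupted
Compare-Object -ReferenceObject $source -DifferenceObject $backup -Property Hash -PassThru |
    Select-Object File, Hash, SideIndicator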
I am trying to read values from a text file and keep them as variables to use in my script.
This config file contains strings, ints, booleans and an array that can contain strings, ints and booleans.
When I declare the variables outright, I have no problems and my script functions as expected. However, when I read in the config file and try to create variables based on that, the variables are all declared as strings.
This creates my config file in the format I would like.
Function Create-Config() {
If (!(Test-Path config.txt)) {
$currentlocation=Get-Location
$parentfolder=(get-item $currentlocation).parent.FullName
New-Item config.txt -ItemType "file"
Add-Content config.txt "SERVER_NAME=MyServer"
Add-Content config.txt "SERVER_LOCATION=$currentlocation"
Add-Content config.txt "BACKUP_LOCATION=$parentfolder\backup"
Add-Content config.txt "CRAFTBUKKIT=craftbukkit.jar"
Add-Content config.txt "JAVA_FLAGS=-Xmx1G"
Add-Content config.txt "CRAFTBUKKIT_OPTIONS=-o True -p 1337"
Add-Content config.txt "TEST_DEPENDENCIES=True"
Add-Content config.txt "DELETE_LOG=True"
Add-Content config.txt "TAKE_BACKUP=True"
Add-Content config.txt "RESTART_PAUSE=5"
}
}
However, either I need to change how I create my config file, or change how I import those variables. I want the config file to be as simple as possible. I am using this code to import the values:
Function Load-Variables() {
Get-Content config.txt | Foreach-Object {
$var = $_.Split('=')
New-Variable -Name $var[0] -Scope Script -Value $var[1]
}
}
As you can see, I don't explicitly type the variables, since the variables from the config are of different types (booleans, ints, an array, strings). However, PowerShell imports them all as strings. I can import all the variables individually (which I may have to do), but I still feel like I will be stuck on the array.
If I declare the array using this command:
New-Variable -Name CRAFTBUKKIT_OPTIONS -Option Constant -Value ([array]@('-o',$true,'-p',25565))
I get exactly what I want, but I need to import it from the config file instead of declaring the variable in my script. The java program is a bit finicky, so I cannot just import that value as a string, or it will not get passed properly and those options get ignored. I've found the only way it works is to have it as an array (as defined above). I also want to note that there could be many more config file options presented than in my example.
I am not sure what is the better approach - importing the variables to be declared correctly (what I would like to do), or assuming they cannot be imported as anything other than a string and then parsing that string into the proper variable types after.
I have tried declaring the variables beforehand and using the Set-Variable command to set the values, but that doesn't work. It very much seems like my variables are imported by Get-Content as strings from the start instead of as the correct types.
Full script is here:
https://gist.github.com/TnTBass/4692f2a00fade7887ce4
Any help?
$types = @{
SERVER_NAME = {$args[0]}
SERVER_LOCATION = {$args[0]}
BACKUP_LOCATION = {$args[0]}
CRAFTBUKKIT = {$args[0]}
JAVA_FLAGS = {$args[0]}
CRAFTBUKKIT_OPTIONS = { ($args[0].split(' ')[0] -as [string]),
([bool]::Parse($args[0].split(' ')[1])),
($args[0].split(' ')[2] -as [string]),
($args[0].split(' ')[3] -as [int]) }
TEST_DEPENDENCIES = {[bool]::Parse($args[0])}
DELETE_LOG = {[bool]::Parse($args[0])}
TAKE_BACKUP = {[bool]::Parse($args[0])}
RESTART_PAUSE = {$args[0] -as [int]}
}
$ht = [ordered]@{}
gc config.txt |
foreach {
$parts = $_.split('=').trim()
$ht[$parts[0]] = &$types[$parts[0]] $parts[1]
}
New-Object PSObject -Property $ht
SERVER_NAME : MyServer
SERVER_LOCATION : C:\testfiles
BACKUP_LOCATION : C:\\backup
CRAFTBUKKIT : craftbukkit.jar
JAVA_FLAGS : -Xmx1G
CRAFTBUKKIT_OPTIONS : {-o, True, -p, 1337}
TEST_DEPENDENCIES : True
DELETE_LOG : True
TAKE_BACKUP : True
RESTART_PAUSE : 5
The $types hash table uses parameter names from your configuration file for the keys, and script blocks that define the typing and data transformation that needs to be done on the string value for that parameter you're reading from the file. As each line is read in from the file, this part of the script:
$parts = $_.split('=').trim()
$ht[$parts[0]] = &$types[$parts[0]] $parts[1]
Splits it at the '=', then looks up the script block for that parameter and invokes it using the value as its argument. The results are stored in a hash table ($ht), which is then used to create an object. You can omit the object creation and just use the hash table to pass your config values if that's more appropriate for your application.
You might need to add some error trapping to test the input data and/or resulting values for production work, but I think the hash table of script blocks is a pretty clean way to present the typing and transformation, and it should be fairly intuitive to read and easy to maintain in the script if you need to make changes. The first 5 parameters are string parameters and are just returned as-is, but you can explicitly cast them as [string] in the script block for clarity.
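If you would still rather end up with individual variables (as in your Load-Variables function) instead of a hash table or object, one possible sketch is to create them from the typed values after the fact:
# Sketch: turn the typed hash table entries into script-scoped variables
foreach ($key in $ht.Keys) {
    New-Variable -Name $key -Scope Script -Value $ht[$key] -Force
}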
Of course PowerShell handles the variable values as strings; it cannot tell the string "1337" apart from the integer 1337 without some extra help. In order to specify the data type, you need some metadata, and there is a format just for that: XML. You don't need to create the XML file yourself, though - the cmdlets Import-Clixml and Export-Clixml manage PowerShell object serialization.
For example, you could save the configuration settings in a hash table and serialize it like so:
$cfgSettings = @{
"currentlocation" = "my current location";
"parentfolder" = "my backup location";
"SERVER_NAME" = "MyServer";
"SERVER_LOCATION" = $currentlocation;
"BACKUP_LOCATION" = "$parentfolder\backup";
"CRAFTBUKKIT" = "craftbukkit.jar";
"JAVA_FLAGS" = "-Xmx1G";
"CRAFTBUKKIT_OPTIONS" = "-o True -p 1337";
"TEST_DEPENDENCIES" = $true;
"DELETE_LOG" = $true;
"TAKE_BACKUP" = $true;
"RESTART_PAUSE" = 5
}
Export-Clixml -Path myConf.xml -InputObject $cfgSettings
The file contains the serialized hashtable with data types. For example, DELETE_LOG is a boolean, RESTART_PAUSE an int, and so on:
<En>
<S N="Key">DELETE_LOG</S>
<B N="Value">true</B>
</En>
<En>
<S N="Key">RESTART_PAUSE</S>
<I32 N="Value">5</I32>
</En>
<En>
<S N="Key">JAVA_FLAGS</S>
<S N="Value">-Xmx1G</S>
</En>
Repopulating and accessing the settings hashtable is not hard either:
$config = Import-CliXML myConf.xml
$config["DELETE_LOG"] # NB! Case sensitive, "delete_log" is different a key!
True
Edit
As for how to create the array, here is a sample that uses deserialized data.
Split the options and serialize the values:
$config = @{
"CRAFTBUKKIT_OPTION1" = "-o" ;
"CRAFTBUKKIT_OPTION2" = $true ;
"CRAFTBUKKIT_OPTION3" = "-p" ;
"CRAFTBUKKIT_OPTION4" = 1337 }
Export-Clixml -InputObject $config -Path C:\temp\conf.xml
Deserialize the values and create an array out of them:
$config2 = Import-Clixml C:\temp\conf.xml
$array = @(
$config2["CRAFTBUKKIT_OPTION1"],
$config2["CRAFTBUKKIT_OPTION2"],
$config2["CRAFTBUKKIT_OPTION3"],
$config2["CRAFTBUKKIT_OPTION4"])
Print the array contents with type info:
$array | % { $("{0} {1}" -f $_, ($_.gettype().name)) }
# Output
-o String
True Boolean
-p String
1337 Int32
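As a side note, Export-Clixml also preserves nested arrays, so another option (a sketch, not from the original answer) is to keep the options under a single key instead of four separate ones:
$config = @{ "CRAFTBUKKIT_OPTIONS" = @('-o', $true, '-p', 1337) }
Export-Clixml -InputObject $config -Path C:\temp\conf.xml
$options = (Import-Clixml C:\temp\conf.xml)["CRAFTBUKKIT_OPTIONS"]
$options | % { $("{0} {1}" -f $_, ($_.gettype().name)) }
# -o String / True Boolean / -p String / 1337 Int32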
I'm new at writing PowerShell, but this is what I'm trying to accomplish.
I want to compare the dates of the two excel files to determine if one is newer than the other.
I want to convert a file from CSV to XLS on a computer that doesn't have Excel, but only if the statement above is true (the initial XLS file was already copied).
I want to copy the newly converted xls file to another location
If the file is already open it will fail to copy so I want to send out an email alert on success or failure of this operation.
Here is the script that I'm having issues with. The error is "Expressions are only allowed as the first element of a pipeline." I know it has to do with the email operation, but I'm at a loss as to how to write this out manually with all those variables included. There are probably more errors, but I'm not seeing them now. Thanks for any help, I appreciate it!
$CSV = "C:filename.csv"
$LocalXLS = "C:\filename.xls"
$RemoteXLS = "D:\filename.xls"
$LocalDate = (Get-Item $LocalXLS).LASTWRITETIME
$RemoteDate = (Get-Item $RemoteXLS).LASTWRITETIME
$convert = "D:\CSV Converter\csvcnv.exe"
if ($LocalDate -eq $RemoteDate) {break}
else {
& $convert $CSV $LocalXLS
$FromAddress = "email@address.com"
$ToAddress = "email@address.com"
$MessageSubject = "vague subject"
$SendingServer = "mail.mail.com"
$SMTPMessage = New-Object System.Net.Mail.MailMessage $FromAddress, $ToAddress, $MessageSubject, $MessageBody
$SMTPClient = New-Object System.Net.Mail.SMTPClient $SendingServer
$SendEmailSuccess = $MessageBody = "The copy completed successfully!" | New-Object System.Net.Mail.SMTPClient mail.mail.com $SMTPMessage
$RenamedXLS = {$_.BaseName+(Get-Date -f yyyy-MM-dd)+$_.Extension}
Rename-Item -path $RemoteXLS -newname $RenamedXLS -force -erroraction silentlycontinue
If (!$error)
{ $SendEmailSuccess | copy-item $LocalXLS -destination $RemoteXLS -force }
Else
{$MessageBody = "The copy failed, please make sure the file is closed." | $SMTPClient.Send($SMTPMessage)}
}
You get this error when you are trying to execute an independent block of code from within a pipeline chain.
Just as a different example, imagine this code using jQuery:
$("div").not(".main").console.log(this)
Each dot (.) chains the result into the next function. In the example above, this breaks at console because it's not meant to have any values piped in. If we want to break out of our chaining to execute some code (perhaps on the objects in the chain), we can do so with each like this:
$("div").not(".main").each(function() {console.log(this)})
The solution in PowerShell is identical. If you want to run a script against each item in your chain individually, you can use ForEach-Object or its alias (%).
Imagine you have the following pipeline in PowerShell:
$settings | ?{$_.Key -eq 'Environment' } | $_.Value = "Prod"
The last segment cannot be executed because it is an expression, not a command, but we can fix that with ForEach-Object like this:
$settings | ?{$_.Key -eq 'Environment' } | %{ $_.Value = "Prod" }
This error basically happens when you use an expression on the receiving side of the pipeline and it cannot receive the objects from the pipeline.
You would get the error if you do something like this:
$a="test" | $a
or even this:
"test" | $a
I don't know why you are trying to pipe everywhere. I would recommend you learn the basics of PowerShell pipelining; you are approaching it the wrong way. Also, you can refer to the link below to see how to send mail - it should be straightforward without the complications you have added with the pipes: http://www.searchmarked.com/windows/how-to-send-an-email-using-a-windows-powershell-script.php
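For reference, here is a minimal sketch of the mail part with no piping at all, using the same System.Net.Mail classes from your script (the server name and addresses are placeholders):
$SMTPClient  = New-Object System.Net.Mail.SmtpClient "mail.mail.com"
$MessageBody = "The copy completed successfully!"
$SMTPMessage = New-Object System.Net.Mail.MailMessage "from@address.com", "to@address.com", "vague subject", $MessageBody
$SMTPClient.Send($SMTPMessage)   # plain statement, nothing piped into it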