Unzipping an archive with 7zip using a predefined list of passwords - powershell

I have been trying for hours to unpack a password-protected rar archive. The password is unknown and should be determined from the contents of a JSON file.
$content = Get-Content "C:\JDownloader v2.0\cfg\org.jdownloader.extensions.extraction.ExtractionExtension.passwordlist.json"
$passwords = ConvertFrom-Json $content
$7ZipPath = "C:\Program Files\7-Zip\7z.exe"
$zipFile = Get-Clipboard
$output = Split-Path $zipFile
Write-Host (Get-Clipboard)
foreach ($password in $passwords) {
    & $7ZipPath x -o$output -p$password $zipFile
}
If I use the plain-text password instead of the variable $password, everything works as expected.
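A sketch of how the loop could stop at the first working password, assuming 7z.exe's documented exit codes (0 means success) and invoking the executable with the call operator:

$7ZipPath = "C:\Program Files\7-Zip\7z.exe"
foreach ($password in $passwords) {
    # -y suppresses prompts; a wrong password results in a non-zero exit code
    & $7ZipPath x "-o$output" "-p$password" -y $zipFile
    if ($LASTEXITCODE -eq 0) {
        Write-Host "Extraction succeeded with password: $password"
        break
    }
}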

Related

How can I update a changing value in Powershell in a web.config file?

I am trying to update text inside a web.config file with powershell, but I am having a very, very difficult time.
The problem is this line
<add key="Models.Catalog.Repository" value=Server23 />
You see where Server23 is? This value can sometimes be empty, which is represented as "", or it can hold a different name like "\Server25\path".
How do I tell PowerShell to replace/insert the data whether the field is empty or already has text in it?
Here is the full code:
$GP_AP= "Server1"
$NewMD = "\\JohnWick"
Invoke-Command -ComputerName $GP_AP -ScriptBlock {
$date= Get-Date -Format "dd_MM_yyyy_HHMM_ss"
$webfile = "C:\Program Files\TEST\IIS\Web.config"
(Copy-Item -Path $webfile -Destination $webfile_Test_$Date)
#[xml]$webconfigfile = Get-Content "C:\Program Files\TEST\IIS\Web.config"
#[xml]$2 = Get-Content "\\Server1\C$\Program Files\TEST\IIS\Web.config"
$olddata = "*"
#(Get-Content $webfile -raw).Replace("<add key=`"Models.Catalog.Repository`" value=$olddata />","<add key=`"Models.Catalog.Repository`" value=$Using:NewMD />") `
#| Set-Content "C:\Program Files\TEST\IIS\Web.config" -Force -Encoding UTF8 -ErrorAction Stop
(Get-Content $webfile -raw).Replace('<add key="Models.Catalog.Repository" value=Server23 />',"<add key=`"Models.Catalog.Repository`" value=$Using:NewMD />") `
| Set-Content "C:\Program Files\TEST\IIS\Web.config" -Force -Encoding UTF8 -ErrorAction Stop
} # end of script block
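If the goal is simply to match the element regardless of its current value, a regex-based -replace inside the script block is one option (a sketch, assuming the element always sits on a single line in the form shown above):

(Get-Content $webfile -Raw) -replace '<add key="Models.Catalog.Repository" value=[^/>]*/>', "<add key=`"Models.Catalog.Repository`" value=$Using:NewMD />" |
    Set-Content $webfile -Force -Encoding UTF8 -ErrorAction Stop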
The script below lets you set values in a web.config's <appSettings> section based on key names. All of the nodes must already exist; this script will not create them. You'll obviously need to edit the script's default params.
param(
    [string] $pathToWebConfig = "$PSScriptRoot\web.config",
    [System.Collections.Hashtable] $appSettings = @{
        someKey      = 'somevalue';
        someOtherKey = 'someOtherValue'
    }
)
function UpdateXmlConfigAppSettings(
    [Parameter(mandatory=$true)]
    [string] $configFilePath,
    [Parameter(mandatory=$true)]
    [System.Collections.Hashtable] $appSettingsDict
) {
    [xml] $webConfig = Get-Content $configFilePath
    foreach ($key in $appSettingsDict.Keys) {
        $node = $webConfig.SelectSingleNode("//appSettings/add[@key = '$key']")
        if ($null -ne $node) {
            $node.SetAttribute('value', $appSettingsDict[$key])
        }
        else {
            throw "$key is not a valid key in appSettings"
        }
    }
    $webConfig.Save($configFilePath)
}
UpdateXmlConfigAppSettings -configFilePath $pathToWebConfig -appSettingsDict $appSettings
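Saved as a script (the file name Update-WebConfig.ps1 below is just a placeholder), it could be invoked for the question's key like this:

.\Update-WebConfig.ps1 -pathToWebConfig 'C:\Program Files\TEST\IIS\Web.config' -appSettings @{ 'Models.Catalog.Repository' = '\\JohnWick' }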
$path = '\\server\c$\Program Files\TEST\IIS\web.config'
[xml]$config = Get-Content -Path $path -Raw
$stuffIwant = "\\Blah-244"
$config.SelectNodes("//add[@key='Models.Catalog.Repository']") |
    ForEach-Object { $_.SetAttribute("value", $stuffIwant) }
$config.Save($path)

Expand-Archive without Importing and Exporting files

How do I avoid exporting and then re-importing files in and out of PowerShell when working with .zip files (Expand-Archive)?
I am currently using a temporary folder to extract the .zip file.
Is there a variable or something I missed that would work better than the solution below?
$filename = 'foobar'
$Zip_in_Bytes | Set-Content -Encoding Byte -Path "C:\temp\filename.zip"
Expand-Archive -Path "C:\temp\filename.zip" -DestinationPath "C:\temp\" -Force
[xml]$xml = Get-Content -Path "C:\temp\filename.xml"
Remove-Item "C:\temp\filename.zip"
Remove-Item "C:\temp\filename.xml"
Expand-Archive only supports path parameters, not objects.
Is there a better way to handle .zip files?
Using System.IO.Compression you can work with byte arrays and streams rather than temporary files, but it's a bit more work than Expand-Archive.
EDIT: Added Get-ZipEntryContent and Add-ZipEntry sample calls, and tweaked parameters making $ZipFilePath optional.
@( 'System.IO.Compression', 'System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }

function Get-ZipEntryContent( #returns the bytes of the first matching entry
    [string] $ZipFilePath, #optional - specify a ZipStream or path
    [IO.Stream] $ZipStream = (New-Object IO.FileStream($ZipFilePath, [IO.FileMode]::Open)),
    [string] $EntryPath){
    $ZipArchive = New-Object IO.Compression.ZipArchive($ZipStream, [IO.Compression.ZipArchiveMode]::Read)
    $buf = New-Object byte[] (0) #return an empty byte array if not found
    $ZipArchive.GetEntry($EntryPath) | ?{$_} | %{ #GetEntry returns first matching entry or null if there is no match
        $buf = New-Object byte[] ($_.Length)
        Write-Verbose " reading: $($_.Name)"
        $_.Open().Read($buf, 0, $buf.Length)
    }
    $ZipArchive.Dispose()
    $ZipStream.Close()
    $ZipStream.Dispose()
    return $buf
}

function Add-ZipEntry( #Adds an entry to the $ZipStream. Sample call: Add-ZipEntry -ZipFilePath "$PSScriptRoot\temp.zip" -EntryPath Test.xml -Content ([text.encoding]::UTF8.GetBytes("Testing"))
    [string] $ZipFilePath, #optional - specify a ZipStream or path
    [IO.Stream] $ZipStream = (New-Object IO.FileStream($ZipFilePath, [IO.FileMode]::OpenOrCreate)),
    [string] $EntryPath,
    [byte[]] $Content,
    [switch] $OverWrite, #if specified, will not create a second copy of an existing entry
    [switch] $PassThru ){ #return a copy of $ZipStream
    $ZipArchive = New-Object IO.Compression.ZipArchive($ZipStream, [IO.Compression.ZipArchiveMode]::Update, $true)
    $ExistingEntry = $ZipArchive.GetEntry($EntryPath) | ?{$_}
    If($OverWrite -and $ExistingEntry){
        Write-Verbose " deleting existing $($ExistingEntry.FullName)"
        $ExistingEntry.Delete()
    }
    $Entry = $ZipArchive.CreateEntry($EntryPath)
    $WriteStream = New-Object System.IO.StreamWriter($Entry.Open())
    $WriteStream.Write($Content, 0, $Content.Length)
    $WriteStream.Flush()
    $WriteStream.Dispose()
    $ZipArchive.Dispose()
    If($PassThru){
        $OutStream = New-Object System.IO.MemoryStream
        $ZipStream.Seek(0, 'Begin') | Out-Null
        $ZipStream.CopyTo($OutStream)
    }
    $ZipStream.Close()
    $ZipStream.Dispose()
    If($PassThru){ $OutStream }
}
Here's an example of how you would call the Add-ZipEntry and Get-ZipEntryContent functions completely in memory:
$NewZipStream = Add-ZipEntry -ZipStream (New-Object IO.MemoryStream) -EntryPath Test.xml -Content ([text.encoding]::UTF8.GetBytes("<xml><test>1</test>")) -PassThru
$bytes = Get-ZipEntryContent -ZipStream $NewZipStream -EntryPath 'Test.xml'
[text.encoding]::UTF8.GetString($bytes)
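Applied to the original scenario, the temporary files can likely be dropped entirely; a sketch, assuming $Zip_in_Bytes is a byte array and the archive contains filename.xml:

# the leading comma passes the byte array as a single constructor argument
$zipStream = New-Object IO.MemoryStream (,[byte[]]$Zip_in_Bytes)
$bytes = Get-ZipEntryContent -ZipStream $zipStream -EntryPath 'filename.xml'
[xml]$xml = [Text.Encoding]::UTF8.GetString($bytes)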
Although Expand-Archive doesn't accept objects, you can provide it with a string property of the object; e.g. $File.FullName will be a String.
Get-ChildItem C:\temp\ -Filter "*.zip" |
    ForEach-Object {
        Expand-Archive -Path $_.FullName -DestinationPath "C:\Temp\Extracted\$($_.BaseName)\"
    }

Word bypass password protected files

PURPOSE
The script should iterate through each file in a folder, convert it to .txt, and upload the text to an Azure database.
PROBLEM
Everything works fine until it hits a password-protected file; I just want to skip these files. I am running this on hundreds of thousands of documents, and when the script hits a password-protected file it pauses until you either enter the password or click Cancel.
SCRIPT
Write-Output "Processing: $($file)"
Try {
$doc = $word.Documents.OpenNoRepairDialog($file)
}
Catch {
}
if ($doc) {
$fileName = [io.path]::GetFileNameWithoutExtension($file)
$fileName = $filename + ".txt"
$doc.SaveAs("$env:TEMP\$fileName", [ref]$saveFormat)
$doc.Close()
$4ID = $fileName.split('-')[-1].replace(' ', '').replace(".txt", "")
$text = Get-Content -raw "$env:TEMP\$fileName"
$text = $text.replace("'", "")
$query += "
('$text', $4ID),"
Remove-Item -Force "$env:TEMP\$fileName"
}
SOLUTION
For anyone having the same issue, the solution was to pass a non-empty string as the password to the Open call, like so:
$wd.Documents.Open($file, $false, $false, $false, "ttt")
rather than
$wd.Documents.Open($file, $false, $false, $false, "")
Here is a demo script that indicates whether a Word document in the current directory is password-protected. If opening the file doesn't trigger the catch block, continue your logic in the try block.
$wd = New-Object -ComObject Word.Application
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
$files = Get-ChildItem $dir -Include *.doc, *.docx -Recurse
foreach ($file in $files) {
    try {
        $doc = $wd.Documents.Open($file, $null, $null, $null, "")
    } catch {
        Write-Host "$file is password-protected!"
    }
}
You will need to integrate the rest of your logic if you choose this approach, but it shows the general idea of checking for password-protected files.
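A rough way to combine the two ideas in the original loop is to pass a deliberately wrong dummy password so that protected documents throw instead of prompting ($word and $file are assumed to come from the surrounding script):

$doc = $null
try {
    # "ttt" is a dummy password: unprotected files ignore it, protected files raise an error instead of prompting
    $doc = $word.Documents.OpenNoRepairDialog($file, $false, $false, $false, "ttt")
} catch {
    Write-Output "Skipping password-protected file: $file"
}
if ($doc) {
    # ...existing convert-and-upload logic from the question...
    $doc.Close()
}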

file not found in current directory

I have the following script:
function Export-sql-toExcel {
    [CmdletBinding()]
    Param (
        [string]$scriptFile,
        [string]$excelFile,
        [string]$serverInstance,
        [string]$database,
        [string]$userName,
        [string]$password
    )
    $tokens = ( [system.io.file]::ReadAllText( $scriptFile ) -split '(?:\bGO\b)' )
    foreach ($token in $tokens) {
        $token = $token.Trim()
        if ($token -ne "") {
            $lines = $token -split '\n'
            $title = $lines[0]
            if ($title.StartsWith("--")) {
                $title = $title.Substring(2)
                $title
            }
            Invoke-Sqlcmd -ServerInstance $serverInstance -Database $database -Username $userName -Password $password -Query $token |
                Export-Excel -Path $excelFile -WorkSheetname $title -FreezeTopRow -ExcludeProperty RowError,RowState,Table,ItemArray,HasErrors
        }
    }
}
I have installed this function as a module. When I invoke the command from, for example, the Desktop folder, like this:
PS D:\Usuarios\mnieto\Desktop> Export-sql-toExcel -scriptFile .\EXPORT.txt -excelFile Excel.xlsx
I get the following error (the export.txt file is in the desktop folder):
Exception calling "ReadAllText" with "1" argument(s): "Can't find the file 'D:\Usuarios\<MyUserName>\EXPORT.txt'."
EDITED
If I debug and check [system.environment]::CurrentDirectory, it returns
'D:\Usuarios\<MyUserName>'
That is why my script fails: .NET functions and PowerShell don't share the 'current directory'.
Is there any other way to get the content and parse the $scriptFile file?
I got the solution by replacing the .NET call with a PowerShell command at this line:
$content = Get-Content $scriptFile -Raw
$tokens = ( $content -split '(?:\bGO\b)' )
The trick is the -Raw parameter, which makes the file be read as a single string.
In my experience, .NET functions don't handle relative paths well.
I'd use
$scriptfile = (Get-Item $Scriptfile).FullName
to resolve to a full path in the function, just ahead of:
$tokens = ( [system.io.file]::ReadAllText( $scriptFile ) -split '(?:\bGO\b)' )
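The mismatch is easy to see interactively (a quick illustration; the directories are just examples):

Set-Location D:\Usuarios\mnieto\Desktop
$PWD.Path                                  # D:\Usuarios\mnieto\Desktop (PowerShell's location)
[System.Environment]::CurrentDirectory     # often still the directory the session started in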
I had the same situation; it was resolved when I added a dot "." when referencing the CSV:
$path2 = $path + ".\file.csv"
Or check for spaces in the $excelFile variable.

FTPS Upload in Powershell

I'm in the process of learning Powershell, and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share in a sub-directory containing the date in the name. The files themselves will all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to get it to work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')

# Create log file
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -value $logstring
}

# Find directory w/ yesterday's date in name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.contains($YDate)}
If ($YesterdayFolder) {
    # we specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder
    # ftp server
    $ftp = "ftp://ftps.site.com"
    $user = "USERNAME"
    $pass = "PASSWORD"
    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES", "CurrentCultureIgnoreCase")}
    foreach ($item in ($FilesToUpload))
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        $uri = New-Object System.Uri($ftp + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
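For what it's worth, the built-in FtpWebRequest class (unlike WebClient) exposes an EnableSsl property for explicit FTPS. A rough sketch of the upload loop, reusing the variables from the script above:

foreach ($item in $FilesToUpload) {
    $request = [System.Net.FtpWebRequest]::Create("ftp://ftps.site.com/" + $item.Name)
    $request.Method      = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $request.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
    $request.EnableSsl   = $true   # request explicit FTPS (AUTH TLS) on the control connection

    $bytes  = [System.IO.File]::ReadAllBytes($item.FullName)
    $stream = $request.GetRequestStream()
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()

    $response = $request.GetResponse()
    LogWrite "Uploaded $($item.Name): $($response.StatusDescription)"
    $response.Close()
}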
Using psftp didn't work for me; I couldn't get it to connect to the FTP server over SSL. I ended up (reluctantly) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS@ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
I'm not sure if you would consider this as "3rd party software" or not, but you can run PSFTP from within Powershell. Here is an example of how you could do that (source):
$outfile = "$YesterdayFolder\Report\" + $item.Name
"rm $outfile`nput $outfile`nbye" | out-file batch.psftp -force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
&.\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be