PowerShell - Download files with WebDAV

I want to use PowerShell with WebDAV (HTTPS) to download multiple files from a folder. The names of the files to download are not known in advance, so my plan is to download every file in the folder and set up a cleanup job on the server.
At the moment I'm searching for a good PowerShell-with-WebDAV example. Does anybody know of one?

I haven't tried it, but have you tested the WebClient class?
PS > $source = "http://www.unsite.fr/untruc.zip"
PS > $destination = "c:\temp\untruc.zip"
PS >
PS > $wc = New-Object System.Net.WebClient
PS > $wc.DownloadFile($source, $destination)
----Edited------
Have you tried digging into the ADODB.Stream COM object, as shown in "PowerShell : Upload file to WebDav Server"?

Take a look at: http://amarchuk.blogspot.com/2011/10/heres-c-webdav-client-that-works-with.html
You can store it as a .cs file, then call Add-Type to load it into PowerShell; from there, use client.GetListItems to get the file names and then download all the files recursively.
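If you prefer to stay in plain PowerShell, here is a rough, untested sketch of the same idea: a PROPFIND request lists the folder, then WebClient fetches each file. The URL, credentials and local folder are placeholders, and the XML property names assume a typical WebDAV multistatus response.
# Rough sketch, not tested against any particular server.
# $webdavUrl, $localPath and the credentials are placeholders.
$webdavUrl = "https://server.example.com/webdav/folder/"
$localPath = "C:\temp\webdav"
$cred      = Get-Credential

# Ask the server for the folder listing; Depth: 1 = immediate children only
$request = [System.Net.WebRequest]::Create($webdavUrl)
$request.Method      = "PROPFIND"
$request.Credentials = $cred.GetNetworkCredential()
$request.Headers.Add("Depth", "1")

$response = $request.GetResponse()
$reader   = New-Object System.IO.StreamReader($response.GetResponseStream())
[xml]$listing = $reader.ReadToEnd()
$reader.Close()
$response.Close()

# Each href in the multistatus response is one resource; skip the folder itself
$base = New-Object System.Uri($webdavUrl)
$wc   = New-Object System.Net.WebClient
$wc.Credentials = $cred.GetNetworkCredential()

foreach ($href in $listing.multistatus.response.href) {
    if ($href -notmatch '/$') {
        $fileUri = New-Object System.Uri($base, $href)
        $name    = [System.IO.Path]::GetFileName($href)
        $wc.DownloadFile($fileUri, (Join-Path $localPath $name))
    }
}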

Related

Version of jar file by reading its manifest using powershell

I am running a PowerShell script to identify the versions. For DLLs and EXEs I use the following calculated property to get the version. I have a few other files with the extension .jar. Is there a way I can use PowerShell to open the jar and get the version from its manifest?
Please let me know.
#{n='Version';e={$_.versioninfo.Fileversion}}
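For context, that calculated property is presumably used in a pipeline along these lines (the folder path is a placeholder):
# Placeholder path; lists DLL/EXE file versions as described in the question
Get-ChildItem -Path C:\MyApp -Recurse -Include *.dll, *.exe |
    Select-Object Name, @{n='Version'; e={$_.VersionInfo.FileVersion}}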
Looks like you have to extract the manifest from the jar file first. I downloaded Java and tested this myself using a jar file I also downloaded:
& "C:\Program Files\Java\jdk1.8.0_191\bin\jar.exe" xvf junit-4.10.jar META-INF/MANIFEST.MF
get-content .\META-INF\MANIFEST.MF
RESULTS
Manifest-Version: 1.0
Ant-Version: Apache Ant 1.8.2
Created-By: 1.6.0_26-b03-384-10M3425 (Apple Inc.)
That being said, be aware that the version is often recorded as Implementation-Version rather than Manifest-Version, so make sure you know which attribute holds the version you need and whether you can depend on it.
This works as a PowerShell script on Server 2016/PS 5.1 and Win 10/PS 7.1 without installing anything extra. It reads the MANIFEST.MF file in the zip archive and writes to standard output. This is option 4 found here: https://stackoverflow.com/a/37561878/101151
Be sure to pass the absolute path to [io.compression.zipfile]::OpenRead. It seems to bind to the first directory it was run in and re-use that for relative paths.
# Read the MANIFEST.MF from a Java .JAR (really a .zip) file and output to standard output
param(
[Parameter(Mandatory=$true)][string]$jarname
)
# The following code is based on an answer at https://stackoverflow.com/a/37561878/101151
Add-Type -assembly "system.io.compression.filesystem"
$zip = [io.compression.zipfile]::OpenRead((Get-ChildItem $jarname).FullName)
$file = $zip.Entries | where-object { $_.Name -eq "MANIFEST.MF"}
$stream = $file.Open()
$reader = New-Object IO.StreamReader($stream)
$text = $reader.ReadToEnd()
$text
$reader.Close()
$stream.Close()
$zip.Dispose()
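Assuming the script above is saved as, say, Get-JarManifest.ps1 (the file name is just an example), it can be invoked like this:
# Hypothetical script name; junit-4.10.jar is the jar used in the earlier answer
.\Get-JarManifest.ps1 -jarname .\junit-4.10.jar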

IE save-as pop-up automation using PowerShell

I am automating a process where I download a report from a link and save it in a particular folder. I can use SendKeys to automate this, but it fails when running headless. Is there any way in PowerShell to automate this?
So you need to download a file via PowerShell. You'll need to know a direct link to it; then you can automate it with this script:
$url = "http://your-site/report.xlsx"
$output = "$PSScriptRoot\report.xlsx"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($url, $output)
There are other ways to download a file with PowerShell, but this is a good trade-off for your case.
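If you are on PowerShell 3.0 or later, the built-in Invoke-WebRequest cmdlet is an alternative; the URL and output path below are the same placeholders as above.
# Same download using the built-in cmdlet (PowerShell 3.0+)
Invoke-WebRequest -Uri "http://your-site/report.xlsx" -OutFile "$PSScriptRoot\report.xlsx"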

(PowerShell) How to give a ZIP permissions to allow user to edit directly after downloading from Azure Blob

Good morning friends,
I've been writing a script in PowerShell to replace our current manual process to deploy our application to Azure Blob Storage in a ZIP folder during the Build Process in VS. I'm about done, but I've run into this issue:
When the ZIP that I upload to Azure is downloaded by anyone, it cannot be manipulated without extracting the files first. This is something the current process is able to accomplish, and I don't know how (the current process was written in C# and is done through a GUI). It needs to be editable as a ZIP because the current Updater is set to manipulate the ZIP without extracting it first.
So the initial question is: How do I set permissions on a ZIP archive that will follow it to Azure Blob Storage and then, when it's downloaded on a client's machine, allow its contents to be manipulated without extraction? (The error itself at this time is that it cannot delete a file in a child folder.)
Currently, to ZIP my folder up, I use this process:
$src = "$TEMPFOL\$testBuildDrop"
$dst = "$TEMPFOL\LobbyGuard.zip"
[Reflection.Assembly]::LoadWithPartialName( "System.IO.Compression.FileSystem" )
[System.IO.Compression.ZipFile]::CreateFromDirectory($src, $dst)
and then push it to blob with:
set-azurestorageblobcontent -Container test -blob "LobbyGuard.zip" -file "$TEMPFOL\LobbyGuard.zip" -context $storageCreds -force
I've tried to set permissions on the folder prior to upload using
$getTEMPFOLACL = Get-ACL $TEMPFOL
$accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule("Everyone", "FullControl", "Allow")
$getTEMPFOLACL.SetAccessRule($accessRule)
Which works on the current local file, but once downloaded, the permissions on the file are set as
Owner: BUILTIN\Administrators Access: NT AUTHORITY\SYSTEM Allow FullControl
Which is exactly the same permissions as the file that's downloaded from the current process. I'm not understanding what I'm missing here to make this work.
If necessary I can provide the DL link to our blob to show the current manual processes folder that can be manipulated IN the ZIP vs. My Scripts ZIP that cannot.
Try unblocking the file after downloading it, i.e.
Unblock-File C:\path\yourDownloaded.zip
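Unblock-File removes the Zone.Identifier alternate data stream that Windows attaches to files downloaded from the internet; as a rough check (the path is a placeholder), you can look for that stream before and after unblocking:
# Shows the Zone.Identifier stream that marks a downloaded file as blocked (PowerShell 3.0+)
Get-Item C:\path\yourDownloaded.zip -Stream Zone.Identifier -ErrorAction SilentlyContinue
# Removing it is the same as clicking "Unblock" in the file's Properties dialog
Unblock-File C:\path\yourDownloaded.zip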

PowerShell FTP session download logs

I know there is a lot of material on FTP with PowerShell, but I am struggling to find the correct information. Please can you guys help...
$today = (get-date).Date
$dateStr = '{0:yyyyMMdd}' -f $today
$source = "ftp://username:password:servername"
$target = "\\Path\filename.zip"
$WebClient = New-Object System.Net.WebClient
$WebClient.DownloadFile($source, $target)
"Downloading Log $File..."
$webclient.DownloadFile($source, $target)
I am trying to download the log files from an MSA P2000 controller. In a normal cmd session the commands would be: ftp "controllerName", then username and password... connection established, then get logs filename.zip.
How do I run the get logs command within the script to automate the process?
I gave up trying to do this via PowerShell and run it as a normal batch file; I don't know why I didn't do this in the first place!
@ftp -i -s:"%~f0"&GOTO:EOF
open serverName.domain.com
username
password
cd \\path for file
bin
hash
get logs test.zip
disconnect
bye
Seems like you've found a solution, but if you're still itching for an answer... your $source variable may be wrong. You need to use the entire path, including the directory and file name, relative to the account's FTP root.
$source = "ftp://username:password@servername/path/to/file.log"
You can check out MSDN for a code example since this is part of .NET.
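Since it is part of .NET, a rough FtpWebRequest sketch would look something like this; the server name, credentials, remote file name ("logs") and target path are placeholders, not values from the question.
# Rough sketch using FtpWebRequest; all names and paths below are placeholders
$uri    = "ftp://servername.domain.com/logs"
$target = "C:\temp\filename.zip"

$request = [System.Net.FtpWebRequest]::Create($uri)
$request.Method      = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$request.UseBinary   = $true   # the equivalent of 'bin' in the interactive session

$response = $request.GetResponse()
$stream   = $response.GetResponseStream()
$file     = [System.IO.File]::Create($target)
$stream.CopyTo($file)
$file.Close()
$stream.Close()
$response.Close()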

Powershell running under a service hangs on *.zip CopyHere

I'm running a Windows Service (Hudson) which in turn spawns a PowerShell process to run my custom PowerShell commands. Part of my script is to unzip a file using CopyHere. When I run this script locally, I see a progress dialog pop up as the files are extracted and copied. However, when this runs under the service, it hangs at the point where a dialog would otherwise appear.
Here's the unzip portion of my script.
# Extract the contents of a zip file to a folder
function Extract-Zip {
    param([string]$zipFilePath, [string]$destination)

    if (Test-Path $zipFilePath) {
        $shellApplication = New-Object -ComObject shell.application
        $zipFile = Get-Item $zipFilePath
        $zipFolder = $shellApplication.NameSpace($zipFile.FullName)
        $destinationFile = Get-Item $destination
        $destinationFolder = $shellApplication.NameSpace($destinationFile.FullName)
        $destinationFolder.CopyHere($zipFolder.Items())
    }
}
I suspect that because it's running under a service process, which is headless (no interaction with the desktop), it's somehow stuck trying to display a dialog.
Is there a way around this?
If this is still relevant, I managed to fix it by passing 1564 as the CopyHere flags.
So in my case the extract-zip function looks like this:
function Expand-ZIPFile {
    param(
        $file, $destination
    )
    $shell = New-Object -ComObject shell.application
    $zip = $shell.NameSpace($file)
    foreach ($item in $zip.items()) {
        $shell.NameSpace($destination).copyhere($item, 1564)
        "$($item.path) extracted"
    }
}
A description of 1564 (the sum of the flags below: 4 + 8 + 16 + 512 + 1024) can be found at http://msdn.microsoft.com/en-us/library/windows/desktop/bb787866(v=vs.85).aspx:
(4) Do not display a progress dialog box.
(8) Give the file being operated on a new name in a move, copy, or rename operation if a file with the target name already exists.
(16) Respond with "Yes to All" for any dialog box that is displayed.
(512) Do not confirm the creation of a new directory if the operation requires one to be created.
(1024) Do not display a user interface if an error occurs.
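A hypothetical call, assuming the paths below exist (the shell.application NameSpace() calls generally want absolute paths):
# Placeholder paths for illustration only
Expand-ZIPFile -file "C:\build\drop.zip" -destination "C:\build\extracted"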
If this is running on Vista or Windows 7, popping up UI from a service isn't going to be seen by the end user as you suspected. See this paper on Session 0 Isolation. However, does the progress dialog require user input? If not, I wouldn't think that would cause the service to hang. I would look for an option to disable the progress display. If you can't find that, then try switching to another ZIP extractor. PSCX 1.2 comes with an Expand-Archive cmdlet. I'm sure there are also others available.
Looking at the documentation for PowerShell, it looks like the -NonInteractive option may help here.
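For example, the service could launch the script roughly like this (the script path is a placeholder):
# -NonInteractive suppresses prompts for user input; the script path is hypothetical
powershell.exe -NonInteractive -File "C:\hudson\scripts\deploy.ps1"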