Using a delimiter to split file names in PowerShell

For simplicity, I use a small function to download files and return the path to me when updating computers.
I was stuck on why it was not working, then realized that the proxy is prepending a random number to the filename, so instead of being 12345.zip it is actually 8493830_12345.zip.
I have tried to find the file using "_" as a split delimiter, but while there are no errors, the file is not being returned, even though I have checked manually that it is there.
function FileCheck {
    $fileName.Split("_")[1]
    $fileName = "{0}.zip" -f 12345
    Download -ZipFileName $($fileName) -OutputDirectory $env:temp
    $SleepTime = 300
    $sleepElapsed = 0
    $sleepInterval = 20
    Start-Sleep $sleepInterval
    $file = Get-ChildItem -Path $env:temp -Filter "$fileName*"
    if ($file -ne $null) {
        return $file[0].FullName
    }
    Start-Sleep($sleepInterval)
    $sleepElapsed += $sleepInterval
    if (($SleepTime) -le $sleepElapsed) {
        # Check for file with given prefix
        $file = Get-ChildItem -Path $env:temp -Filter "$fileName*"
        if ($file -eq $null) {
            Write-Error 'file not found'
            return $null
        }
        return $file[0].FullName
    }
}
I am guessing the split is not working, but Googling and moving the $fileName.Split() call around has not worked for me. Any help is appreciated.

Well, your split is doing nothing at all. You haven't defined $fileName at that point, but if you had, and it had an underscore, then $fileName.Split('_') would return two or more strings, depending on how many underscores were in the original string; either way, you never capture the result.
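If capturing it is what you wanted, a minimal sketch using the proxy-renamed name from your question would be:
$fullName = '8493830_12345.zip'
$originalName = $fullName.Split('_')[1] # capture the result this time
$originalName # prints: 12345.zip
That said, I think the real problem here is the filter you are applying to Get-ChildItem later in your function.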
$file = Get-ChildItem -Path $env:temp -Filter "$fileName*"
That will look for files beginning with $fileName, which you define near the top of the function to be "12345.zip". That is exactly the opposite of what you want to be looking for. You need to move the asterisk to before $fileName, so it looks like this:
$file = Get-ChildItem -Path $env:temp -Filter "*$fileName"
That will return all files that end with "12345.zip", which would include things like:
myfuzzyhippo12345.zip
learn-to-count-12345.zip
8493830_12345.zip
Basically, anything that ends in 12345.zip. Also, be aware that return is not the only thing a PowerShell function sends back: everything not explicitly captured or redirected is passed to the caller as well. So, reading through your function, callers are likely to get the output of your $fileName.Split('_') line first, and then $null or $file[0].FullName.
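A quick way to see that leaking behavior (a minimal sketch):
function Test-Leak {
    'leaked' # uncaptured output still goes to the caller
    return 'intended'
}
Test-Leak # emits two strings: leaked, then intended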
Lastly, it appears that you're trying to look for the file, wait a bit if you don't find it, and try again until $sleepElapsed exceeds $SleepTime. What you want here is a while or a do/while loop. Here's what I'd do...
function FileCheck {
    Param(
        $fileName = '12345.zip',
        $SleepTime = 300,
        $sleepElapsed = 0,
        $sleepInterval = 20
    )
    Download -ZipFileName $($fileName) -OutputDirectory $env:temp
    Do {
        Start-Sleep $sleepInterval
        $sleepElapsed = $sleepElapsed + $sleepInterval
        $file = Get-ChildItem -Path $env:temp -Filter "*$fileName" | Select -First 1
    } While (!$file -and $sleepElapsed -le $SleepTime)
    $file.FullName
}
That lets you define things like sleep settings at runtime if you want, or just let it default to what you were using, same with the file name. Then it downloads the file, and looks for it, pausing between attempts, until either it finds the file, or it runs out of time. Then it returns $file.FullName which is either the path to the file if it found one, or nothing if it didn't find a file.
Personally I'd have it return the file object, and just utilize the .FullName property if that's all I wanted later. Usually (not always, but usually) more info returned from a function is better than less info. Like what if the download fails and it's a zero byte file? Just returning only the path doesn't tell you that.
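If you went that route, the function would just emit $file itself, and the caller could inspect whatever it needs (a sketch):
# inside FileCheck, replace the last line ($file.FullName) with:
$file

# at the call site:
$download = FileCheck
if ($download -and $download.Length -gt 0) {
    $download.FullName # a real, non-empty download
}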

Related

Compress File per file, same name

I hope you are all safe in this time of COVID-19.
I'm trying to generate a script that goes to the directory and compresses each file to .zip with the same name as the file, for example:
sample.txt -> sample.zip
sample2.txt -> sample2.zip
but I'm having difficulties. I'm not that used to PowerShell; I'm learning and improving this script. In the end it will be a script that deletes files older than X days, compresses each file, and uploads them via FTP. The part that deletes files older than X days I've already managed; now I'm stuck on this one.
My latest attempt so far:
param
(
    # Future: accept input
    [string] $InputFolder,
    [string] $OutputFolder
)
# test folder
$InputFolder = "C:\Temp\teste"
$OutputFolder = "C:\Temp\teste"
$Name2 = Get-ChildItem $InputFolder -Filter '*.csv' | select Name
Set-Variable SET_SIZE -Option Constant -Value 1
$i = 0
$zipSet = 0
Get-ChildItem $InputFolder | ForEach-Object {
    $zipSetName = ($Name2[1]) + ".zip "
    Compress-Archive -Path $_.FullName -DestinationPath "$OutputFolder\$zipSetName"
    $i++
    $Name2++
    if ($i -eq $SET_SIZE) {
        $i = 0
        $zipSet++
    }
}
You can simplify things a bit, and it looks like most of the issues are because, in your script example, $Name2 will contain a different set of items than the Get-ChildItem $InputFolder call returns in the loop (i.e. the loop may see objects other than .csv files).
The best way to deal with this is to use variables holding the full file objects (i.e. you don't need |select Name). So I get all the CSV file objects right away and store them in the variable $CsvFiles.
We can additionally use the special variable $_ inside ForEach-Object, which represents the current object. We can also use $_.BaseName to get the name without the extension (assuming that's what you want; otherwise use $_.Name to get a zip named like xyz.csv.zip).
So a simplified version of the code can be:
$InputFolder = "C:\Temp\teste"
$OutputFolder = "C:\Temp\teste"
# Get files to process
$CsvFiles = Get-ChildItem $InputFolder -Filter '*.csv'
# Loop through all files to zip
$CsvFiles | ForEach-Object {
    $zipSetName = $_.BaseName + ".zip"
    Compress-Archive -Path $_.FullName -DestinationPath "$OutputFolder\$zipSetName"
}
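If the Name/BaseName distinction is unfamiliar, a quick check in the shell shows it (assuming a file C:\Temp\teste\sample.txt exists):
$f = Get-Item 'C:\Temp\teste\sample.txt'
$f.Name     # sample.txt
$f.BaseName # sample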

Parse directory listing and pass to another script?

I am trying to write a PowerShell script that will loop through a directory on the C:\ drive and pass the filenames, with their extensions, to another script.
Basically, the output of the directory listing should be passed to another script one file at a time. That script is a compiling script which expects an argument (parameter) in order to compile the specific module (filename).
Code:
Clear-Host $Path = "C:\SandBox\"
Get-ChildItem $Path -recurse -force | ForEach { If ($_.extension -eq ".cob")
{
Write-Host $_.fullname
}
}
If ($_.extension -eq ".pco")
{
Write-Host $_.fullname }
}
You don't need to parse the output as text; that approach is obsolete in PowerShell, which passes objects down the pipeline.
Here's something that might work for you:
# getmyfiles.ps1
Param( [string]$Path = (Get-Location) )
dir $Path -Recurse -Force | where {
    $_.Extension -in @('.cob', '.pco')
}

# this is another script that calls the above
.\getmyfiles.ps1 -Path C:\SandBox | ForEach-Object {
    # $_ is a file object. I'm just printing its full path, but you can do other stuff with it
    Write-Host $_.FullName
}
Clear-Host
$Path = "C:\Sandbox\"
$Items = Get-ChildItem $Path -recurse -Include "*.cob", "*.pco"
From your garbled code I am guessing you want to return a list of files that have .cob and .pco file extensions. You could use the above code to gather those.
$File = $Items.name
$FullName = $items.fullname
Write-Host $Items.name
$File
$FullName
Adding the above lines will allow you to display them in various ways. You can pick the one that suits your needs, then loop through them with a foreach, as sketched below.
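For example (a sketch; compile.ps1 and its -Module parameter stand in for whatever your compiling script actually is):
$Items = Get-ChildItem $Path -Recurse -Include "*.cob", "*.pco"
foreach ($item in $Items) {
    # hand each file's full path to the compiling script (hypothetical name and parameter)
    & .\compile.ps1 -Module $item.FullName
}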
As a rule this is not a place for code to be written for you, but since you have tried to add some to the question, I've taken a look. Sometimes you just want a nudge in the right direction.

How to use Powershell to list duplicate files in a folder structure that exist in one of the folders

I have a source tree, say c:\s, with many sub-folders. One of the sub-folders is called "c:\s\Includes" which can contain one or more .cs files recursively.
I want to make sure that none of the .cs files in the c:\s\Includes... path exist in any other folder under c:\s, recursively.
I wrote the following PowerShell script which works, but I'm not sure if there's an easier way to do it. I've had less than 24 hours experience with PowerShell so I have a feeling there's a better way.
I can assume at least PowerShell 3 being used.
I will accept any answer that improves my script, but I'll wait a few days before accepting the answer. When I say "improve", I mean it makes it shorter, more elegant or with better performance.
Any help from anyone would be greatly appreciated.
The current code:
$excludeFolder = "Includes"
$h = @{}
foreach ($i in ls $pwd.path *.cs -r -file | ? DirectoryName -notlike ("*\" + $excludeFolder + "\*")) { $h[$i.Name]=$i.DirectoryName }
ls ($pwd.path + "\" + $excludeFolder) *.cs -r -file | ? { $h.Contains($_.Name) } | Select #{Name="Duplicate";Expression={$h[$_.Name] + " has file with same name as " + $_.Fullname}}
1
I stared at this for a while, determined to write it without studying the existing answers, but I'd already glanced at the first sentence of Matt's answer mentioning Group-Object. After some different approaches, I get basically the same answer, except his is long-form and robust with regex character escaping and setup variables, mine is terse because you asked for shorter answers and because that's more fun.
$inc = '^c:\\s\\includes'
$cs = (gci -R 'c:\s' -File -I *.cs) | group name
$nopes = $cs |?{($_.Group.FullName -notmatch $inc)-and($_.Group.FullName -match $inc)}
$nopes | % {$_.Name; $_.Group.FullName}
Example output:
someFile.cs
c:\s\includes\wherever\someFile.cs
c:\s\lib\factories\alt\someFile.cs
c:\s\contrib\users\aa\testing\someFile.cs
The concept is:
Get all the .cs files in the whole source tree
Split them into groups of {filename: {files which share this filename}}
For each group, keep only those where the set of files contains any file with a path that matches the include folder and contains any file with a path that does not match the includes folder. This step covers
duplicates (if a file only exists once it cannot pass both tests)
duplicates across the {includes/not-includes} divide, instead of being duplicated within one branch
handles triplicates, n-tuplicates, as well.
Edit: I added the ^ to $inc to say it has to match at the start of the string, so the regex engine can fail faster for paths that don't match. Maybe this counts as premature optimization.
2
After that pretty dense attempt, the shape of a cleaner answer is much much easier:
Get all the files, split them into include, not-include arrays.
Nested for-loop testing every file against every other file.
Longer, but enormously quicker to write (it runs slower, though) and I imagine easier to read for someone who doesn't know what it does.
$sourceTree = 'c:\\s'
$allFiles = Get-ChildItem $sourceTree -Include '*.cs' -File -Recurse
$includeFiles = $allFiles | where FullName -imatch "$($sourceTree)\\includes"
$otherFiles = $allFiles | where FullName -inotmatch "$($sourceTree)\\includes"
foreach ($incFile in $includeFiles) {
foreach ($oFile in $otherFiles) {
if ($incFile.Name -ieq $oFile.Name) {
write "$($incFile.Name) clash"
write "* $($incFile.FullName)"
write "* $($oFile.FullName)"
write "`n"
}
}
}
3
Because code-golf is fun. If the hashtables are faster, what about this even less tested one-liner...
$h=@{};gci c:\s -R -file -Filt *.cs|%{$h[$_.Name]+=@($_.FullName)};$h.Values|?{$_.Count -gt 1 -and $_ -like 'c:\s\includes*'}
Edit: explanation of this version: It's doing much the same solution approach as version 1, but the grouping operation happens explicitly in the hashtable. The shape of the hashtable becomes:
$h = @{
    'fileA.cs' = @('c:\cs\wherever\fileA.cs', 'c:\cs\includes\fileA.cs')
    'file2.cs' = @('c:\cs\somewhere\file2.cs')
    'file3.cs' = @('c:\cs\includes\file3.cs', 'c:\cs\x\file3.cs', 'c:\cs\z\file3.cs')
}
It hits the disk once for all the .cs files, iterates the whole list to build the hashtable. I don't think it can do less work than this for that bit.
It uses +=, so it can add files to the existing array for that filename, otherwise it would overwrite each of the hashtable lists and they would be one item long for only the most recently seen file.
It uses @() because, when it hits a filename for the first time, $h[$_.Name] won't return anything, and the script needs to put an array into the hashtable at that point, not a string. If it were +=$_.FullName, the first file would go into the hashtable as a plain string, and the += next time around would do string concatenation, which is no use to me. Wrapping every file in a one-item array forces the first entry for each filename to start an array. The least-code way to get this result is +=@(..), but that churn of creating a throwaway array for every single file is needless work. Maybe changing it to longer code which does less array creation would help?
Changing the section
%{$h[$_.Name]+=@($_.FullName)}
to something like
%{if (!$h.ContainsKey($_.Name)){$h[$_.Name]=@()};$h[$_.Name]+=$_.FullName}
(I'm guessing; I don't have much intuition for what's most likely to be slow PowerShell code, and I haven't tested it.)
After that, using $h.Values isn't going over every file a second time; it's going over every array in the hashtable, one per unique filename. That has to happen to check the array size and prune the non-duplicates, but the -and operation short-circuits: when Count -gt 1 fails, the bit on the right checking the path name doesn't run.
If the array has two or more files in it, the -and $_ -like ... executes and pattern-matches to see if at least one of the duplicates is in the includes path. (Bug: if all the duplicates are in c:\s\includes and none anywhere else, it will still show them.)
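If that edge case matters, one possible tweak (untested, same golf style) is to also require at least one path outside the includes folder; -like and -notlike both act as filters when applied to an array, so the group only passes when both sides return something:
$h.Values|?{$_.Count -gt 1 -and ($_ -like 'c:\s\includes*') -and ($_ -notlike 'c:\s\includes*')}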
--
4
This is edited version 3 with the hashtable initialization tweak, and now it keeps track of seen files in $s, and then only considers those it's seen more than once.
$h=@{};$s=@{};gci 'c:\s' -R -file -Filt *.cs|%{if($h.ContainsKey($_.Name)){$s[$_.Name]=1}else{$h[$_.Name]=@()};$h[$_.Name]+=$_.FullName};$s.Keys|%{if ($h[$_] -like 'c:\s\includes*'){$h[$_]}}
Assuming it works, that's what it does, anyway.
--
Edit, branching off topic: I keep thinking there ought to be a way to do this with the things in the System.Data namespace. Does anyone know if you can connect System.Data.DataTable().ReadXml() to gci | ConvertTo-Xml without reams of boilerplate?
I'd do more or less the same, except I'd build the hashtable from the contents of the includes folder and then run over everything else to check for duplicates:
$root = 'C:\s'
$includes = "$root\includes"
$includeList = @{}
Get-ChildItem -Path $includes -Filter '*.cs' -Recurse -File |
% { $includeList[$_.Name] = $_.DirectoryName }
Get-ChildItem -Path $root -Filter '*.cs' -Recurse -File |
? { $_.FullName -notlike "$includes\*" -and $includeList.Contains($_.Name) } |
% { "Duplicate of '{0}': {1}" -f $includeList[$_.Name], $_.FullName }
I'm not as impressed with this as I would like to be, but I thought Group-Object might have a place in this question, so I present the following:
$base = 'C:\s'
$unique = "$base\includes"
$extension = "*.cs"
Get-ChildItem -Path $base -Filter $extension -Recurse |
    Group-Object Name |
    Where-Object { ($_.Count -gt 1) -and (($_.Group).FullName -match [regex]::Escape($unique)) } |
    ForEach-Object {
        $filename = $_.Name
        ($_.Group).FullName -notmatch [regex]::Escape($unique) | ForEach-Object {
            "'{0}' has file with same name as '{1}'" -f (Split-Path $_), $filename
        }
    }
Collect all the files with the extension filter $extension. Group the files by name. Then, of those groups, find every group where there is more than one file with that particular name and at least one of the group members is in the directory $unique. Take those groups and print out all the files that are not from the unique directory.
From Comment
For what it's worth, this is what I used for testing, to create a bunch of files. (I know folder 9 ends up empty.)
$base = "E:\Temp\dev\cs"
Remove-Item "$base\*" -Recurse -Force
0..9 | %{[void](New-Item -ItemType directory "$base\$_")}
1..1000 | %{
$number = Get-Random -Minimum 1 -Maximum 100
$folder = Get-Random -Minimum 0 -Maximum 9
[void](New-Item -Path $base\$folder -ItemType File -Name "$number.txt" -Force)
}
After looking at all the others, I thought I would try a different approach.
$includes = "C:\s\includes"
$root = "C:\s"
# First script
Measure-Command {
[string[]]$filter = ls $includes -Filter *.cs -Recurse | % name
ls $root -include $filter -Recurse -Filter *.cs |
Where-object{$_.FullName -notlike "$includes*"}
}
# Second Script
Measure-Command {
$filter2 = ls $includes -Filter *.cs -Recurse
ls $root -Recurse -Filter *.cs |
Where-object{$filter2.name -eq $_.name -and $_.FullName -notlike "$includes*"}
}
In my first script, I collect all the include-folder file names into a string array. Then I use that string array as the -Include parameter on Get-ChildItem. At the end, I filter the includes folder out of the results.
In my second script, I enumerate everything and then filter after the pipe.
Remove the Measure-Command to see the results. I was using it to check the speed. With my dataset, the first one was 40% faster.
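If you only want the timings side by side, you can capture what Measure-Command returns (a small sketch; the script blocks are the two versions above):
$t1 = Measure-Command { <# first script here #> }
$t2 = Measure-Command { <# second script here #> }
"{0:n0} ms vs {1:n0} ms" -f $t1.TotalMilliseconds, $t2.TotalMilliseconds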
$FilesToFind = Get-ChildItem -Recurse 'c:\s\includes' -File -Include *.cs | Select -ExpandProperty Name
Get-ChildItem -Recurse C:\S -File -Include *.cs | ? { $_.Name -in $FilesToFind -and $_.Directory -notmatch '^c:\\s\\includes' } | Select Name, Directory
Create a list of file names to look for.
Find all files that are in the list but not part of the directory the list was generated from
Print their name and directory

(Powershell) Loop to delete files from an FTP Location

Good morning!
I have made it to the last (and rather pivotal) stage in my script, which is looping to delete files from a directory. I'm not going to pretend I'm knowledgeable at Powershell (far from it), so I'm sort-of chopping up blocks of code I find on the net, improvising and hoping it works.
I'm hoping someone can decipher what I'm trying to do here and see what I'm doing wrong!
# Clear FTP Directory
$DelLoop=1
$server = "www.newsbase.com"
$dir = "/usr/local/tomcat/webapps/newsbasearchive/monitors/asiaelec/"
"open $server
user Canttell Youthis
binary
cd $dir
" +(
For ($DelLoop=1; $DelLoop -le 5; 5)
{
$FileList[$DelLoop] | %{ "delete ""$_""`n" }
$DelLoop++
})| ftp -i -in
I know that the 'Open Connection' portion works, it's just the loop. It just keeps complaining about misplaced operators, and when I fix those, it doesn't throw up any errors - but it doesn't do anything either.
I spent the best part of 4 hours researching this yesterday, and I'm hoping one of you guys can help me.
Thanks in advance!
ADDENDUM:
Here is more of the code, as requested:
# Clear existing .htm file to avoid duplication
Get-ChildItem -path ".\" -recurse -include index.jsp | ForEach-Object {
Clear-Content "index.jsp"
}
# Set first part of .JSP Body
$HTMLPart1="</br><tr><td colspan=9 align=center><p style=""font-family:Arial"">Here are the links to the last 3 AsiaElec PDFs:</br><ul>"
# Recurse through directory, looking for 3 most recent .PDF files 3 times
$Directory="C:\PDFs"
$HTMLLinePrefix="<li><a style=""font-family:Arial""href="""
$HTMLLineSuffix="</a></li>"
$HTMLLine=#(1,2,3,4)
$Loop=1
$PDF=#(1,2,3,4)
Get-ChildItem -path $Directory -recurse -include *.pdf | sort-object -Property LastWriteTime -Descending | select-object -First 3 | ForEach-Object {
$PDF[$Loop]=$_.name
$HTMLLine[$Loop]=$HTMLLinePrefix + $_.name + """>" + $_.name + $HTMLLineSuffix
$Loop++
}
# Final .JSP File Assembly
Get-Content "header.html" >> "index.jsp"
$HTMLPart1 >> "index.jsp"
$LineParse=""
$Loop2=1
For ($Loop2=1; $Loop2 -le 3; 3)
{
$HTMLLine[$Loop2] >> "index.jsp"
$Loop2++
}
Get-Content "tail.html" >> "index.jsp"
# Prepare File List
$FileList=#(1,2,3,4,5)
$FileList[2]=$PDF[2]
$FileList[3]=$PDF[3]
$FileList[4]="index.jsp"
# Clear FTP Directory
$DelLoop=1
$server = "www.newsbase.com"
$dir = "/usr/local/tomcat/webapps/newsbasearchive/monitors/asiaelec/"
"open $server
user derek bland1ne
binary
cd $dir
" +(
For ($DelLoop=1; $DelLoop -le 5; 5)
{
$FileList[$DelLoop] | %{ "delete ""$_""`n" }
$DelLoop++
})| ftp -i -in
This isn't all of it, but I believe it contains all the relevant info.
Your $dir path looks like you're on a unix system so this may be a little different, but all you need to do is change your final loop a little bit:
For ($DelLoop=1; $DelLoop -le 5; $DelLoop++)
{
    $FileList[$DelLoop] | % { rm $_ }
}
This is assuming that $FileList contains the files you want to delete and not only (what I'm guessing are dummy) numbers. I also suggest that you download the module that @Graimer mentions, put it in WindowsPowerShell > Modules > %ModuleFolder% > %Module.psm1%, and import it from your profile.
You can then just use PS> Remove-FTPItem -Path "/myFolder" -Recurse to remove your FTP stuff. Making your life easier.
Tweaking the solution to this post may also help Upload files with FTP using PowerShell
e.g:
Using $ftp.Method = [System.Net.WebRequestMethods+Ftp]::DeleteFile to delete the file,
and $response = $ftp.GetResponse() to find out if things went smoothly.
EDIT
Wrote this function after doing a little research, starting from http://social.msdn.microsoft.com/forums/en-US/netfxnetcom/thread/17a3abbc-6144-433b-aadd-1f776c042bd5, and adapting the code from the accepted answer in that link as well as the module @Graimer talked about.
function deleteFTPSide
{
    Param(
        [String] $ftpUserName = "myUserName",
        [String] $ftpDomain = "ftp.place.com", # Normal domains begin with "ftp" here
        [String] $ftpPassword = "myPassword",
        [String] $ftpPort = 21, # Leave as the default FTP port
        [String] $fileToDelete = "folder.domain.com/subfolder/file.txt"
    )
    # Create the direct path to the file you want to delete
    [String] $ftpPath = "ftp://" + $ftpUserName + ":" + "$ftpPassword@$ftpDomain" + ":" + "$ftpPort/$fileToDelete"
    # Create the FtpWebRequest and configure it
    $ftp = [System.Net.FtpWebRequest]::Create($ftpPath)
    $ftp.Method = [System.Net.WebRequestMethods+Ftp]::DeleteFile
    $ftp.Credentials = New-Object System.Net.NetworkCredential($ftpUserName, $ftpPassword)
    $ftp.UseBinary = $true
    $ftp.UsePassive = $true
    $response = [System.Net.FtpWebResponse]$ftp.GetResponse()
    $response.Close()
}
While, admittedly, not one of the most elegant solutions written, I've tested it and it works at deleting a specified file off an FTP server.
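Applied to the $FileList from the question, the calls might look something like this (a sketch; the credentials are placeholders and $FileList is assumed to hold real file names rather than the dummy numbers):
foreach ($name in $FileList) {
    deleteFTPSide -ftpUserName 'myUser' -ftpPassword 'myPass' `
        -ftpDomain 'www.newsbase.com' `
        -fileToDelete "usr/local/tomcat/webapps/newsbasearchive/monitors/asiaelec/$name"
}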

Get-ChildItem refresh/update

Pretty new to PowerShell, and hoping someone can point me in the right direction. I'm trying to figure out if there is a cleaner way to accomplish what I have below. Is there a way to refresh the contents of Get-ChildItem once I have made some changes to the files it returned during the first call (stored in the $items variable)?
During the first foreach statement I am creating a log file for each of the files that are returned. Once that is done, I need to get a listing again (because the items in the path have changed); the second Get-ChildItem call will include both the files found during the first call and all the log files generated when the first foreach statement called the generate-LogFile function. So my question: is there a way to update the listing without having to call Get-ChildItem twice, as well as use two foreach statements?
Thanks for all the help!
-------------- This is what I changed the code to, based on the recommendation --------------
$dataStorePath = "C:\Process"

function print-All($file)
{
    Write-Host "PrintALL filename:" $file.FullName # Only prints when print-All($item) is called
}

function generate-LogFile($file)
{
    $logName = $file.FullName + ".log"
    $logFilehandle = New-Object System.IO.StreamWriter $logName
    $logFilehandle.WriteLine($logName)
    $logFilehandle.Close()
    return $logName
}

$items = Get-ChildItem -Path $dataStorePath
foreach ($item in $items)
{
    $log = generate-LogFile($item) # Contains full path C:\Process\$fileName.log
    print-All($item)
    print-All($log) # When this goes to the function, nothing prints when using $file.FullName in print-All
}
---------Output--------------
For testing I have two files in C:\Process:
fileA.txt & fileB.txt
The script will create two additional files:
fileA.txt.log & fileB.txt.log
Eventually I need to do something with all four files, so I created a print-All function where I would process all four. Below is the current output. As can be seen, I only get output for the two original files found, not the two additionally created ones (I get blank lines when calling print-All($log)). I need to be able to use the full path provided by Get-ChildItem, thus FullName.
PrintALL filename: fileA.txt
PrintALL filename:
PrintALL filename: fileB.txt
PrintALL filename:
I'm not entirely clear on what you are asking, but you can have generate-LogFile return the created log file, then just call generateRequestData on both your file and the log file. Something like this:
$items = Get-ChildItem -Path $dataStorePath
foreach ($file in $items)
{
    $logFile = generate-LogFile $file
    generateRequestData $file
    generateRequestData $logFile
}
Edit:
In your added sample, you are returning a string from generate-LogFile. .NET strings don't have a FullName property, so nothing gets printed in print-All. To get the FileInfo object that you want, use the Get-Item cmdlet:
return Get-Item $logName
Also, in this example, you don't need a StreamWriter to write to the file; you can use the native PowerShell Out-File cmdlet:
function generate-LogFile($file)
{
    $logName = $file.FullName + ".log"
    $logName | Out-File $logName
    return Get-Item $logName
}
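With those two changes, the original loop should print all four full paths (a quick check against the sample in the question):
$items = Get-ChildItem -Path $dataStorePath
foreach ($item in $items)
{
    $log = generate-LogFile $item # now a FileInfo object, so .FullName works in print-All
    print-All $item
    print-All $log # prints the .log file's full path as well
}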