Visual Studio Code debugger says to supply values and gives prompt - PowerShell

I'm trying to write code that gets the files in a directory and selects the one(s) edited within the last 4 hours. For some reason, when I debug this in Visual Studio Code, the debugger says
Supply values for the following parameters:
Process[0]:
$out_path = "C:\Data\Monitor\PropertiesReport\"
#find latest scan file (within 4 hours from now)
$output_sprdsheet_blob_path = Join-Path -Path $out_path -ChildPath "\OutputSprdsht\" #location of scan output file...looks good for path
Get-ChildItem $output_sprdsheet_blob_path -Filter *.xlsx | Foreach-Object
{
$lastupdatetime=$_.LastWriteTime
$nowtime = get-date
if (($nowtime - $lastupdatetime).totalhours -le 4)
{
Write-Host $_.Name
$excel_File_from = $_.Name
#Select-String -Path $_.Name -Pattern "'Execute Time of Send Thread = 60.'"
}
}
#use file found above next
I'm not sure why PowerShell prompts me to supply values for ForEach-Object when the path given to Get-ChildItem is valid. I've used similar code before, and it worked, but that was in the PowerShell ISE, and the code started with the following instead of Get-ChildItem:
powershell "Set-Location -Path $log_path ; Get-Item *.* | Foreach {...}
I was having the same issue with that code: the Visual Studio Code debugger gave the Process[0] prompt and wanted me to supply values at the foreach. That code had been tested and used before as well.
I am trying Get-ChildItem because the example below uses it, and it looks like it should work. Any idea why the Visual Studio Code debugger gives the prompt and how to fix it?
I used Write-Host to print the directory being used, pasted the printed path into Windows File Explorer, and confirmed that the path was valid and that there was a file there.
My PowerShell version is 5.1.
example get-childitem
Update:
This prints the filename. I'm not sure why it doesn't give the prompt.
$out_pth = "C:\Data\Monitor\PropertiesReport\"
Set-Location -Path $out_pth
Get-Item *.* | foreach-object {write-host $_.name}
Update 2:
This prints the filename too:
Get-ChildItem $out_pth | Foreach-Object {write-host $_.name}

It looks like the newline before the opening brace made the difference. This works, with the brace on the same line as Foreach-Object:
Get-ChildItem $out_pth | Foreach-Object {
    $lastupdatetime = $_.LastWriteTime
    $nowtime = Get-Date
    if (($nowtime - $lastupdatetime).TotalHours -le 40) {
        $excel_File_from = $_.Name
        Write-Host $_.Name
    }
}
write-host "here"
write-host $excel_File_from
prints:
filename.xlsx
here
filename.xlsx
I changed the time from 4 to 40 hours above because I realized the file was last edited yesterday. The code also found the file when I dropped the time check on the file properties entirely.
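For anyone hitting the same prompt: this matches how PowerShell parses statements. A pipeline ends at a newline once it is syntactically complete, so a ForEach-Object left on its own line is invoked with no script block, and PowerShell prompts for the mandatory -Process parameter (the Process[0]: prompt above). A minimal sketch of the failing and working forms (the directory is arbitrary):
# Prompts "Supply values for the following parameters: Process[0]:"
# because the statement ends at the newline, so ForEach-Object runs
# without its -Process script block.
Get-ChildItem C:\temp | ForEach-Object
{ Write-Host $_.Name }
# Works: the opening brace is on the same line, so the script block
# binds to -Process.
Get-ChildItem C:\temp | ForEach-Object { Write-Host $_.Name }
# Also works: a backtick continues the statement onto the next line.
Get-ChildItem C:\temp | ForEach-Object `
{ Write-Host $_.Name }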

Related

PowerShell Get-ChildItem returns nothing for folder

A friend of mine asked me to write a short script for him. The script should check a specific folder, find all files and subfolders older than X days, and remove them. Simple so far; I wrote the script, successfully tested it on my own system, and sent it to him. Here's the thing - it doesn't work on his system. To be more specific, the Get-ChildItem cmdlet does not return anything for the provided path, but it gets even weirder, more on that later. I'm using the following code to first find the files and folders (and log them before deleting them later on):
$Folder = "D:\Data\Drive_B$\General\ExchangeFolder"
$CurrentDate = Get-Date
$TimeSpan = "-1"
$DatetoDelete = $CurrentDate.AddDays($TimeSpan)
$FilesInFolder = (Get-ChildItem -Path $Folder -Recurse -ErrorAction SilentlyContinue | Where-Object {$_.LastWriteTime -lt $DatetoDelete})
All variables are filled, and we both know that the folder is filled to the brim with files and subfolders older than one day, which is the timespan we chose for the test. Now, the interesting part is that not only does Get-ChildItem not return anything - going to the folder itself and typing "dir" does not return anything either. I've never seen behaviour like this. I've checked everything I could think of: is it DFS? Typos, folder permissions, share permissions, hidden files, ExecutionPolicy. Everything is as it should be to allow this script to work properly, as it did on my own system when I initially tested it. The script does not return any errors whatsoever.
So for some reason, the contents of the folder cannot be found by PowerShell. Does anyone know of a reason why this could be happening? I'm at a loss here :-/
Thanks for your time & help,
Fred
.AddDays() takes a double; I would pass that instead of a string.
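For example (a minimal illustration; both forms run, since PowerShell coerces the string, but the first avoids the implicit conversion):
$DatetoDelete = (Get-Date).AddDays(-1)   # pass a double directly
# rather than coercing a string, as in the question:
# $TimeSpan = "-1"
# $DatetoDelete = $CurrentDate.AddDays($TimeSpan)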
Filter first, then act.
This code will work for you.
Add-Type -AssemblyName System.Windows.Forms   # needed for the MessageBox calls below

$folder = Read-Host -Prompt 'File path'
$datetodel = (Get-Date).AddDays(-1)
$results = try { gci -Path $folder -Recurse | select FullName, LastWriteTime | ? { $_.LastWriteTime -lt $datetodel } } catch { $Error[-1] }
$info = "{0} files older than: {1} deleting ...." -f $results.Count, $datetodel
if ($results | ogv -PassThru) {
    [System.Windows.Forms.MessageBox]::Show($info)
    # Put your code here for the removal of the files
    # $results | % { Remove-Item $_.FullName -Force }
} else {
    [System.Windows.Forms.MessageBox]::Show("Exiting")
}

PowerShell error: Copy-Item cannot bind argument to parameter 'Path' because it is null

I am rather new to PowerShell and have a question regarding the error I'm receiving. Browsing through Stack Overflow, I mostly found cases where users had made spelling errors and the like, so I haven't found a suitable answer to my problem.
I have one script that runs a backup of some data, compresses it, and stores it as:
yyyyMMddsometext.7z
I have another script to get the latest backup (if it was created) and copy it to another location.
I am receiving an error
copy-item cannot bind argument to parameter 'path' because it is null
Does this mean that the file is non-existent or is it an error in any of the below?
$c = $textBox.Text
$a = (Get-Date).AddDays(-1).ToString($c)
$b = Get-ChildItem "C:\BackupsRP1" -Filter *.7z | Where-Object BaseName -like "*$a*"
Copy-Item $b -Destination "C:\Users\user\Desktop"
Above this code is a simple GUI where the user inputs the date in the format yyyyMMdd; the script then locates the file dated one day earlier than the user's input and copies it to the location.
Thank you,
J
$b might contain multiple files or even none at all, depending on what your filter finds.
The correct way to do it:
# This will copy each of the files that match the filter
Get-ChildItem "C:\BackupsRP1" -Filter *.7z | where BaseName -like "*$a*" | Copy-Item -Destination "C:\Users\user\Desktop" -PassThru
This will copy all items that match the filter and output the copied files to the console afterwards.
Also, make sure that $a really contains what you want. (I cannot know since I don't know what is in your textbox.)
You have to make sure the values in the variables are as expected. You can add logging to debug this:
$c = $textBox.Text
$c > c:\temp\Debug.log
$a = (Get-Date).AddDays(-1).ToString($c)
$a >> c:\temp\Debug.log
$b = Get-ChildItem "C:\BackupsRP1" -Filter *.7z | Where-Object BaseName -like "*$a*"
$b >> c:\temp\Debug.log
Copy-Item $b.FullName -Destination "C:\Users\user\Desktop"
$b will contain a FileInfo object; you have to select the FullName property (the full path of the file) from the object.
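As a sketch of guarding against the null case (the warning text and the choice to skip the copy are assumptions for illustration, not part of the original answers):
$b = Get-ChildItem "C:\BackupsRP1" -Filter *.7z | Where-Object BaseName -like "*$a*"
if ($null -eq $b) {
    # Nothing matched, so $b is null; calling Copy-Item here would
    # raise exactly the "cannot bind argument to parameter 'Path'" error.
    Write-Warning "No backup matching *$a* was found; nothing to copy."
} else {
    # .FullName is one path for a single file and an array for several;
    # Copy-Item accepts both.
    Copy-Item $b.FullName -Destination "C:\Users\user\Desktop"
}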

PowerShell: Recursively search a drive or directory for a file type in a specific time frame of creation

I am trying to incorporate PowerShell into my everyday workflow so I can move up from Desktop Support to Systems Admin. One question I encountered when helping a coworker was how to search for a lost or forgotten file saved in an unknown directory. The pipeline I came up with was:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Out-File pdfs.txt
This code performed exactly how I wanted, but now I want to extend this command and make it more efficient, especially since my company has clients with very messy file management.
What I want to do with this pipeline:
Recursively search for a specific file type that was created in a specified time frame. Let's say the oldest file allowed in this search is a file from two days ago.
Save the results to a text file with columns containing the Name and FullName (path), sorted by creation time in descending order.
What I have so far:
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force | Select-Object Name, FullName | Out-File pdfs.txt
I really need help with how to create a filter for the time the file was created. I think I need to use the Where-Object cmdlet right after the dir pipe and before the Select-Object pipe, but I don't know how to set that up. This is what I wrote so far: Where-Object {$_.CreationTime <
You're on the right track. To get the files from a specific file creation date range, you can pipe the dir command results to:
Where-Object {$_.CreationTime -ge "06/20/2017" -and $_.CreationTime -le "06/22/2017"}
If you want something more repeatable, where you don't have to hard-code the dates every time and just want to search for files from up to 2 days ago, you can set variables:
$today = (Get-Date)
$daysago = (Get-Date).AddDays(-2)
then plug in the variables:
Where-Object {$_.CreationTime -ge $daysago -and $_.CreationTime -le $today}
I'm not near my Windows PC to test this but I think it should work!
See if this helps
dir c:\ -Recurse -Filter *.ps1 -ErrorAction SilentlyContinue -Force | select LastWriteTime,Name | Where-Object {$_.LastWriteTime -ge [DateTime]::Now.AddDays(-2) } | Out-File Temp.txt
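Putting both answers together with the original requirements, a sketch of the full pipeline (the *.pdf filter, two-day creation window, and output file name follow the question; treat it as a starting point rather than a tested command):
$daysago = (Get-Date).AddDays(-2)
dir C:\ -Recurse -Filter *.pdf -ErrorAction SilentlyContinue -Force |
    Where-Object { $_.CreationTime -ge $daysago } |
    Sort-Object CreationTime -Descending |
    Select-Object Name, FullName, CreationTime |
    Out-File pdfs.txt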

PowerShell Workflow Chugging at Memory and Crashing

I'm dabbling with workflows in PowerShell and I'm noticing some odd behavior. The script below will work when the directory doesn't contain a lot of files. At some point it will hold on line 6 (when run in the ISE you'll see the workflow status bar), munch up memory, then eventually crash (after at least half an hour). This crash happens when the directory holds at least 1.25GB of files, but not when $Path has only 50MB of files. Here's an easy test:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) }
    }
    $Files
}
Now the odd thing is that when Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object {$_.LastWriteTime -lt ((get-date).AddDays(-$using:Days))} is run outside of the workflow (in a regular function, or just as a line in the shell), it completes in less than a minute, even with 1.25GB of files.
What is the workflow doing that causes it to eat memory, take a long time, and crash? It's obviously doing something unexpected. Again, it works if there are only a few files in the directory.
Also, a solution/workaround would be great.
Research:
Activity to invoke the Microsoft.PowerShell.Management\Get-ChildItem command in a workflow
Running Windows PowerShell Commands in a Workflow
The problem here appears to be the retention of object data. Workflow activities serialize the objects they pass between runspaces, so returning thousands of rich FileInfo objects is expensive; adding a Select-Object cuts the size of the returned object data so much that searching 100GB+ did not cause a crash. The solution is as follows:
Workflow Test-Me {
    Param
    (
        $Path = "c:\temp",
        $Days = 0
    )
    $Files = InlineScript {
        Get-ChildItem -Path $using:Path -File -Recurse -Force | Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-$using:Days)) } | Select-Object FullName
    }
    $Files
}
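A usage sketch, assuming the workflow above has been defined in the current session (the path and day count are arbitrary):
# Returns lightweight objects carrying only FullName.
$old = Test-Me -Path 'C:\temp' -Days 7
$old.FullName | Out-File old-files.txt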

Why is my PowerShell script writing blank lines to console?

I have a bit of an odd problem. Or maybe not so odd. I had to implement a "custom clean" for a PowerShell script developed for building some unique configurations for my current project (the whys are not particularly important). Basically it copies a bunch of files from the release directories into some temporary directories with this code:
$Paths = Get-ChildItem $ProjectPath -Recurse |
    Where-Object { ($_.PSIsContainer -eq $true) -and
                   (Test-Path ($_.Fullname + '\bin\Release')) } |
    Select-Object Fullname
ForEach ($Path in $Paths)
{
    $CopyPath = $Path.Fullname + '\bin\Temp'
    $DeletePath = $Path.Fullname + '\bin\Release'
    New-Item -ItemType directory -Path $CopyPath
    Copy-Item $DeletePath $CopyPath -Recurse
    Remove-Item $DeletePath -Recurse
}
And after the build copies it back with:
ForEach ($Path in $Paths)
{
    $CopiedPath = $Path.Fullname + '\bin\Temp\'
    $DeletedPath = $Path.Fullname + '\bin\Release\'
    $Files = Get-ChildItem $CopiedPath -Recurse |
        Where-Object { -not $_.PSIsContainer }
    ForEach ($File in $Files)
    {
        if (-not (Test-Path ($DeletedPath + $File.Name)))
        {
            Copy-Item $File.Fullname ($DeletedPath + $File.Name)
        }
    }
    Remove-Item $CopiedPath -Recurse -Force
}
This is pretty clunky and noobish (sorry, I'm a PowerShell noob drinking from a fire hose), but it works for the purpose and I will clean it up later. However, when it executes the initial copy to the temp directories, it writes a lot of blank lines to the screen. That isn't ideal: I display a message while this process is executing so our CM doesn't freak out and think it broke, and that message is blown away by the blank lines. Do you know what might be causing this and how I might solve it? I'm using PowerShell 2.0 out of the box, and due to the nature of this project I can't upgrade or get any outside libraries. Thanks guys.
If the only thing you're looking to do is clean up the console output, then all you need to do is suppress the pipeline output. You can cast the command to [void], which discards everything the command writes to the pipeline. You can also pipe the whole thing into the Out-Null cmdlet, which swallows all of the output.
The New-Item cmdlet returns output to the console by default on my version of Windows PowerShell (4.0). This may not be true on previous versions, but I think it is... Remove-Item, on the other hand, usually doesn't return any output. If I were to take a stab, I'd kill the output on the lines that use the "Item" noun, using one of the methods mentioned above.
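For illustration, here is a sketch of those suppression options applied to the first loop from the question (only the New-Item line changes; all three forms should behave the same here):
ForEach ($Path in $Paths)
{
    $CopyPath = $Path.Fullname + '\bin\Temp'
    $DeletePath = $Path.Fullname + '\bin\Release'
    # New-Item emits the created DirectoryInfo object; suppress it.
    New-Item -ItemType directory -Path $CopyPath | Out-Null
    # Equivalent alternatives:
    #   [void](New-Item -ItemType directory -Path $CopyPath)
    #   $null = New-Item -ItemType directory -Path $CopyPath
    Copy-Item $DeletePath $CopyPath -Recurse
    Remove-Item $DeletePath -Recurse
}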