Function not processing every step - powershell

The snippet below jumps to the correct function, ORD_LOG_PROCESS, and it CDs to the path, but it will not store any of the variables after that: $ordfiles and every variable after it stay empty. There is one file in the $ordlogpath directory, and if I run (gci $ordlogpath |% {$_.name}) at the shell it works, but via the script, for some reason, nothing is stored.
$ordlogpath = "C:\test_environment\ORD_REPO\ORD_LOGS\"
$ordlogexist = gci "C:\test_environment\ORD_REPO\ORD_LOGS\*.log"

FUNCTION ORD_LOG_PROCESS
{
    cd $ordlogpath
    $ordfiles = (gci $ordlogpath |% {$_.name})
    FOREACH ($ordfile in $ordfiles)
    {
        $ordlogimport = Import-Csv $ordfile
        $ordloggrep = $ordfile
        exit
    }
}

FUNCTION NO_FILES
{
    write-host "NO FILES TO PROCESS"
    EXIT
}

IF (!$ordlogexist)
{
    NO_FILES
}
else
{
    ORD_LOG_PROCESS
}

If you declare variables inside a function, they will be local to that function. That means that the variables do not exist outside the function.
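A minimal sketch of that scoping behavior (the `$script:` scope modifier shown here is one way to deliberately persist a value outside a function, if you really do want functions):

```powershell
function Set-Values {
    $localOnly     = 'only visible inside the function'
    $script:shared = 'visible to the rest of the script'
}

Set-Values
# $localOnly was function-local, so it is empty here;
# $shared was written to script scope, so it survives.
"localOnly is: [$localOnly]"
"shared is:    [$shared]"
```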
But then.. why use functions like this at all?
Can't you simply do something like below?
$ordlogpath = "C:\test_environment\ORD_REPO\ORD_LOGS\*.log"
if (!(Test-Path -Path $ordlogpath)) {
    Write-Host "NO FILES TO PROCESS"
}
else {
    Get-ChildItem -Path $ordlogpath -File | ForEach-Object {
        $ordlogimport = Import-Csv $_.FullName
        # now do something with the $ordlogimport object,
        # otherwise you are simply overwriting it with the next file that gets imported..
        # perhaps store it in an array?
        # it is totally unclear to me what you intend to do with this variable..
        $ordloggrep = $_.Name
    }
    Write-Host "The log name is: $ordloggrep"
    Write-Host
    Write-Host 'The imported variable ordlogimport contains:'
    Write-Host
    $ordlogimport | Format-Table -AutoSize
}
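Following up on the "perhaps store it in an array?" comment: one sketch of collecting every imported row into a single collection (directory path assumed from the question):

```powershell
$ordlogdir = 'C:\test_environment\ORD_REPO\ORD_LOGS'

# a foreach statement used as an expression collects all output into $allRows
$allRows = foreach ($file in Get-ChildItem -Path $ordlogdir -Filter *.log -File) {
    Import-Csv -Path $file.FullName
}

"Imported $(@($allRows).Count) row(s) in total"
```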

Related

Trouble linking commands together

So I'm pretty new to PowerShell and I'm trying to list all contents of a directory (on my VM) while stating whether each is a regular file or a directory, along with its path/size.
the code I have is:
#!/bin/bash
cd c:\
foreach ($item in get-childitem -Path c:\) {
    Write-Host $item
}
########
if (Test-Path $item) {
    Write-Host "Regular File" $item
}
else {
    Write-Host "Directory" $item
}
I can get all of the contents to print, but when I try to state whether each is a file or directory, only one .txt file says "Regular File" next to it. I've been at it for hours on end and can't figure it out. Also, my output doesn't state "Directory" next to directories...
Here is an example of how you can enumerate the files and folders on your C: drive one level deep with their current size (if it's a folder, look for the files inside and sum their Length). Regarding trying to "state whether file / directory", you don't need to apply any logic to it: FileInfo and DirectoryInfo have an Attributes property which gives you this information already.
Get-ChildItem -Path C:\ | & {
    process {
        $object = [ordered]@{
            Attributes = $_.Attributes
            Path       = $_.Name # change to $_.FullName for the Path
            Length     = $_.Length / 1mb
        }
        if ($_ -is [IO.DirectoryInfo]) {
            foreach ($file in $_.EnumerateFiles()) {
                $object['Length'] += $file.Length / 1mb
            }
        }
        $object['Length'] = [math]::Round($object['Length'], 2).ToString() + ' Mb'
        [pscustomobject] $object
    }
}
If you want something more complex, i.e. seeing the hierarchy of a directory, like tree does, with the corresponding sizes you can check out this module.

Powershell Test-Path in loop, variable path

I started learning Powershell yesterday.
I want to write a script that runs batch files in multiple folders, but only if the folder contains a work_folder subfolder.
I have the following folder structure
*folder1
--work_folder
--script.bat
*folder2
--script.bat
*folder3
--script.bat
*test.ps1
cd C:\a\test\
Get-ChildItem -Path $PWD\*\script.bat | ForEach-Object {
    if ( -not ( Test-Path $PWD\*\work_folder -PathType Container ) )
    {
        Write-Host "Not exist"
        return
    } else {
        Write-Host "Exist"
    }
    & $_
}
If I use
$myPath = "$PWD\*\"
it will not work as I want.
Please give me a hint or example.
There are a couple of problems with your code. Because you use the return keyword when the work_folder directory doesn't exist, your script stops processing at the first folder that is missing it. I think you meant to use the continue keyword, which would skip to the next iteration of the loop.
Also, you were iterating on every instance of script.bat, when you should have been iterating on every directory containing script.bat.
You can write your code a lot cleaner if you do it this way:
(Get-ChildItem -Path $PWD -Directory).FullName | ForEach-Object {
    if (Test-Path "$_/work_folder" -PathType Container)
    {
        Write-Host "Exist"
        & "$_/script.bat"
    } else {
        Write-Host "Not exist"
    }
}

How to decorate an existing function or passing function objects in Powershell?

Let's say I want to save the output of a PowerShell command to a file. I would do this like ls | Out-File "path.txt". I make this call a few times per day and am worried that the function call (ls in this case) produces bad data, ruining my file. I feel like I need a backup!
Next step for me would be decorating the out-file call so that it automatically backs up the data in a separate file. One backup per day would be sufficient. This could be achieved by a custom out-bak function as per below. Suddenly I get automated backups with ls | Out-Bak "path.txt".
function Out-Bak {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline)]
        [string]$inputObject,
        [parameter(Mandatory=$false)]
        [string]$myPath
    )
    Begin {
        $backupPath = [System.IO.Path]::ChangeExtension($myPath,".bak_$([DateTime]::Now.ToShortDateString())")
        Remove-Item $myPath -ErrorAction SilentlyContinue
        Remove-Item $backupPath -ErrorAction SilentlyContinue
    }
    Process {
        Out-File -InputObject $inputObject -FilePath $myPath -Append
        Out-File -InputObject $inputObject -FilePath $backupPath -Append
    }
}
This solves my problem fine, but I would like to be able to use exactly the same pattern for Export-Csv and similar file-writing functions. Is there a way to pass the Out-File command as a parameter to Out-Bak so that I can use the function as a somewhat generic decorator for output commands?
Let the backup function do only what its name suggests: backup the file.
ls | Out-File $path | Backup
........ | Out-File foo.txt | Backup
........ | Out-File -FilePath "$path\$name" | Backup
........ | Export-Csv -NoTypeInformation bar.csv | Backup
The backup cmdlet will simply copy the file once the pipeline finishes.
To find the file path from a previous pipeline command we'll have to use arcane stuff like AST parser:
function Backup {
    end {
        $bakCmdText = (Get-PSCallStack)[1].Position.text
        $bakCmd = [ScriptBlock]::Create($bakCmdText).
            Ast.EndBlock.Statements[0].PipelineElements[-2].CommandElements
        $bakParamInfo = if (!$bakCmd) { @{} }
                        else { @{} + (Get-Command ($bakCmd[0].value)).Parameters }
        $bakSource = ''; $bakLiteral = $false; $bakPos = 0
        while (!$bakSource -and ++$bakPos -lt $bakCmd.count) {
            $bakToken = $bakCmd[$bakPos]
            if ($bakToken.ParameterName) {
                if ($bakToken.ParameterName -match '^(File|Literal)?Path$') {
                    $bakLiteral = $bakToken.ParameterName -eq 'LiteralPath'
                } elseif (!$bakParamInfo[$bakToken.ParameterName].SwitchParameter) {
                    $bakPos++
                }
                continue
            }
            $bakSource = if ($bakToken.StringConstantType -in 'SingleQuoted', 'BareWord') {
                $bakToken.value
            } else {
                [ScriptBlock]::Create($bakToken.extent.text).
                    InvokeWithContext(@{}, (Get-Variable bak* -scope 1))
            }
        }
        if (!$bakSource) {
            Write-Warning "Could not find file path in pipeline emitter: $bakCmdText"
            return
        }
        $backupTarget = "$bakSource" + '.' + [DateTime]::Now.ToShortDateString() + '.bak'
        $bakParams = @{ $(if ($bakLiteral) {'LiteralPath'} else {'Path'}) = "$bakSource" }
        copy @bakParams -destination $backupTarget -Force
    }
}
Warning: it fails with $() like ... | out-file "$($path)" | backup because Get-PSCallStack for some reason returns the expression contents as the callee, and right now I don't know other methods of getting the parent invocation context.
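If the AST approach feels too fragile, a simpler alternative is to trade the decorator syntax for an explicit path argument; this is only a sketch, and the Backup-File name is invented here rather than taken from the question:

```powershell
function Backup-File {
    param([Parameter(Mandatory)][string]$Path)
    # stamp the copy with the current date so you get one backup per day
    $stamp = [DateTime]::Now.ToString('yyyy-MM-dd')
    Copy-Item -LiteralPath $Path -Destination "$Path.$stamp.bak" -Force
}

# usage: write first, then back up the same path explicitly
Get-ChildItem | Out-File report.txt
Backup-File report.txt
```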

Parse directory listing and pass to another script?

I am trying to write a PowerShell script that will loop through a directory on the C:\ drive and pass the filenames (with their file extensions) to another script to use.
Basically, the output of the directory listing should be passed to another script one file at a time. That script is a compiling script which expects an argument (parameter) in order to compile the specific module (filename).
Code:
Clear-Host
$Path = "C:\SandBox\"
Get-ChildItem $Path -recurse -force | ForEach {
    If ($_.extension -eq ".cob")
    {
        Write-Host $_.fullname
    }
    If ($_.extension -eq ".pco")
    {
        Write-Host $_.fullname
    }
}
You don't need to parse the output as text, that's deprecated.
Here's something that might work for you:
# getmyfiles.ps1
Param( [string]$Path = (Get-Location) )
dir $Path -Recurse -Force | where {
    $_.Extension -in @('.cob', '.pco')
}

# this is another script that calls the above
.\getmyfiles.ps1 -Path C:\SandBox | ForEach-Object {
    # $_ is a file object. I'm just printing its full path but you can do other stuff with it
    Write-Host $_.FullName
}
Clear-Host
$Path = "C:\Sandbox\"
$Items = Get-ChildItem $Path -recurse -Include "*.cob", "*.pco"
From your garbled code I'm guessing you want to return a list of files that have .cob and .pco file extensions. You could use the above code to gather those.
$File = $Items.name
$FullName = $items.fullname
Write-Host $Items.name
$File
$FullName
Adding the above lines will allow you to display them in various ways. You can pick the one that suits your needs, then loop through them in a foreach.
As a rule this isn't the place for code to be written for you, but you have tried to add some to the question, so I've taken a look. Sometimes you just want a nudge in the right direction.
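To actually hand each file to the compiling script, one possible sketch (compile.ps1, its location, and its -Module parameter are assumed names, not from the question):

```powershell
$Path = "C:\SandBox\"

Get-ChildItem -Path $Path -Recurse -Include "*.cob", "*.pco" |
    ForEach-Object {
        # call the compiling script once per file, passing the filename as an argument
        & "C:\Scripts\compile.ps1" -Module $_.FullName
    }
```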

Get-Item fails with closed pipeline error

If I have an example function ...
function foo()
{
    # get a list of files matched pattern and timestamp
    $fs = Get-Item -Path "C:\Temp\*.txt" |
          Where-Object {$_.lastwritetime -gt "11/01/2009"}
    if ( $fs -ne $null ) # $fs may be empty, check it first
    {
        foreach ($o in $fs)
        {
            # new bak file
            $fBack = "C:\Temp\test\" + $o.Name + ".bak"
            # Exception here at Get-Item! See following msg.
            # Exception thrown only when Get-Item cannot find any files this time.
            # If there is any matched file there, it is OK
            $fs1 = Get-Item -Path $fBack
            ....
        }
    }
}
The exception message is ... The WriteObject and WriteError methods cannot be called after the pipeline has been closed. Please contact Microsoft Support Services.
Basically, I cannot use Get-Item again within the function or loop to get a list of files in a different folder.
Any explanation and what is the correct way to fix it?
By the way I am using PS 1.0.
This is just a minor variation of what has already been suggested, but it uses some techniques that make the code a bit simpler ...
function foo()
{
    # Get a list of files matched pattern and timestamp
    $fs = @(Get-Item C:\Temp\*.txt | Where {$_.lastwritetime -gt "11/01/2009"})
    foreach ($o in $fs) {
        # new bak file
        $fBack = "C:\Temp\test\$($o.Name).bak"
        if (!(Test-Path $fBack))
        {
            Copy-Item $o.FullName $fBack
        }
        $fs1 = Get-Item -Path $fBack
        ....
    }
}
For more info on the issue with foreach and scalar null values check out this blog post.
I modified the above code slightly to create the backup file, but I am able to use the Get-Item within the loop successfully, with no exceptions being thrown. My code is:
function foo()
{
    # get a list of files matched pattern and timestamp
    $files = Get-Item -Path "C:\Temp\*.*" | Where-Object {$_.lastwritetime -gt "11/01/2009"}
    foreach ($file in $files)
    {
        $fileBackup = [string]::Format("{0}{1}{2}", "C:\Temp\Test\", $file.Name , ".bak")
        Copy-Item $file.FullName -destination $fileBackup
        # Test that backup file exists
        if (!(Test-Path $fileBackup))
        {
            Write-Host "$fileBackup does not exist!"
        }
        else
        {
            $fs1 = Get-Item -Path $fileBackup
            ...
        }
    }
}
I am also using PowerShell 1.0.