I have a text file listing 56 folders, one full path per line. I need to rename the folder at the end of each path.
Example:
Original: \this folder\needs to\move
New: \this folder\needs to\move.moved
I am totally new to PowerShell and trying to learn. I thought this might be a good way to start. Any help would be greatly appreciated.
# Get the content of the list
# in this case, a text file with no heading, and one path per line
$listContent = Get-Content $ENV:USERPROFILE\Desktop\list.txt

# Loop over each folder path and rename it
foreach ($line in $listContent)
{
    # check if the current path is valid
    if (Test-Path -Path $line)
    {
        Write-Output "`nOld path: $line"
        $newName = $line + ".moved"
        Write-Output "New name: $newName"

        try
        {
            # -ErrorAction Stop turns any rename error into a terminating error
            # so that the catch block below actually runs
            Rename-Item -Path $line -NewName $newName -Force -ErrorAction Stop

            # split the path and show only the part after the last \ for readability
            Write-Output "`nSuccessfully changed directory name $($line.Split('\')[-1]) to $($newName.Split('\')[-1])"
        }
        catch
        {
            # on error, rethrow the current error record
            throw $_
        }
    }
    else
    {
        Write-Output "$line is not a valid path"
    }
}

Write-Output "`nEnd of script!"
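If you want to check what the script is going to do before it touches anything, Rename-Item supports the -WhatIf switch, which prints the rename that would happen without performing it. A minimal dry-run sketch, assuming the same list.txt on the desktop:

# Dry run: show each rename without actually performing it
Get-Content "$ENV:USERPROFILE\Desktop\list.txt" |
    Where-Object { Test-Path -Path $_ } |
    ForEach-Object { Rename-Item -Path $_ -NewName ($_ + ".moved") -WhatIf }

Remove -WhatIf once the output looks right.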
I want to read all the .log files in a folder with today's file timestamp (i.e. today's date, yyyymmdd) and check whether they contain the string "scp error". If that condition is satisfied, write the name of each matching .log file, without its extension, to a new .txt file, "redrop_files.txt".
This is what I have tried so far:
Get-Item File*.log | ForEach-Object {
    $fil = $_.Name;
    foreach ($line in Get-Content $fil) {
        if ($line -eq "scp error") {
            $stat = "FAILED"
        }
    }
    if ($stat -eq "FAILED") {
        $errorfile = Get-ChildItem *.log | Rename-Item -NewName { $fil -replace '.log','' }
        Add-Content -Path .\redrop_files.txt "`n" $errorfile
    }
}
You have one main problem: you don't store the results of grep anywhere, so the variable f is undefined.
It is not clear why you use -n with grep if you only want the filename; -l seems to make more sense.
So:
grep --include=\*.log -rlw '/path/' -e "scp error" |\
while read -r f; do
    echo "${f%%.*}"
done > redrop_files.txt
Try the following:
path=.
while read file; do
    echo ${file%%:*} >> redrop_files.txt
done < <(grep --include=\*.log -rnw $path -e "scp error")
Your code has some minor mistakes and a few places where it could be improved, for example by using a switch statement to read the files. The biggest problem is that you're actually renaming the files: Rename-Item really does rename them on disk. What you want to do instead is append the .BaseName property (the file name without its extension) to your redrop_files.txt file.
Get-Item File*.log | ForEach-Object {
    # if this file is not from Today
    if ($_.CreationTime -lt [datetime]::Today) {
        # skip it
        return
    }

    $append = switch -Wildcard -File $_.FullName {
        # if the line contains "scp error"
        '*scp error*' {
            # output true
            $true
            # and break this loop, no need to keep checking
            break
        }
    }

    # if the result of the switch loop was true
    if ($append) {
        # append to redrop_files.txt the .BaseName of this file
        Add-Content -Value $_.BaseName -Path .\redrop_files.txt
    }
}
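As a side note, a similar result can be had with Select-String, which scans each file for a pattern and, with -List, stops at the first match per file. A rough sketch under the same assumptions (File*.log in the current folder, only files created today):

# Find today's logs containing "scp error" and append their base names to redrop_files.txt
Get-Item File*.log |
    Where-Object { $_.CreationTime -ge [datetime]::Today } |
    Select-String -Pattern 'scp error' -List |
    ForEach-Object { Add-Content -Value ([System.IO.Path]::GetFileNameWithoutExtension($_.Path)) -Path .\redrop_files.txt }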
So I'm pretty new to PowerShell and I'm trying to list all the contents of a directory (on my VM) while stating whether each item is a regular file or a directory, along with its path/size.
The code I have is:
#!/bin/bash
cd c:\
foreach ($item in get-childitem -Path c:\) {
    Write-Host $item
}
########
if (Test-Path $item) {
    Write-Host "Regular File" $item
}
else {
    Write-Host "Directory" $item
}
I can get all of the contents to print, but when I try to state whether each item is a file or directory, only one .txt file says "Regular File" next to it. I've been at it for hours on end and can't figure it out. Also, my output doesn't state "Directory" next to directories...
Here is an example of how you can enumerate the files and folders on your C drive one level deep with their current size (if an item is a folder, look at the files inside it and sum their Length). Regarding trying to "state whether file / directory", you don't need to apply any logic to it: FileInfo and DirectoryInfo have an Attributes property which gives you this information already.
Get-ChildItem -Path C:\ | & {
    process {
        $object = [ordered]@{
            Attributes = $_.Attributes
            Path       = $_.Name # change to $_.FullName for the Path
            Length     = $_.Length / 1mb
        }

        if ($_ -is [IO.DirectoryInfo]) {
            foreach ($file in $_.EnumerateFiles()) {
                $object['Length'] += $file.Length / 1mb
            }
        }

        $object['Length'] = [math]::Round($object['Length'], 2).ToString() + ' Mb'
        [pscustomobject] $object
    }
}
If you want something more complex, i.e. seeing the hierarchy of a directory, like tree does, with the corresponding sizes, you can check out this module.
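If all you need is the plain "Regular File" / "Directory" label from the original question, without summing folder sizes, a minimal sketch using the PSIsContainer property could look like this:

Get-ChildItem -Path C:\ | ForEach-Object {
    if ($_.PSIsContainer) {
        Write-Host "Directory    $($_.FullName)"
    }
    else {
        Write-Host "Regular File $($_.FullName) ($($_.Length) bytes)"
    }
}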
Could you help me with a PowerShell script?
I want to check if multiple files exist and, if they do, delete them.
Then provide information on whether each file has been deleted, or a message if the file does not exist.
I have found the script below, but it only works with one file, and it doesn't give a message if the file doesn't exist.
Can you help me adjust this? I would like to delete the files c:\temp\1.txt, c:\temp\2.txt, and c:\temp\3.txt if they exist.
If they do not exist, I want a message saying so. PowerShell should not throw an error or stop if a file doesn't exist.
$FileName = "C:\Test\1.txt"
if (Test-Path $FileName) {
    Remove-Item $FileName -verbose
}
Thanks for the help!
Tom
You can create a list of the paths you want to delete and loop through that list like this:
$paths = "c:\temp\1.txt", "c:\temp\2.txt", "c:\temp\3.txt"

foreach ($filePath in $paths)
{
    if (Test-Path $filePath) {
        Remove-Item $filePath -Verbose
    }
    else {
        Write-Host "Path '$filePath' doesn't exist"
    }
}
Step 1: You want multiple files. You can do that two ways:
$files = "C:\file1.txt","C:\file2.txt","C:\file3.txt"
That would work, but it's cumbersome. Easier? Have all the files in one .csv list and import that. Remember, the first row is not read as data, since it's considered the header:
$files = Import-Csv "C:\yourcsv.csv"
Alright, Step 2: now that you've got your files, we want to cycle through them:
Foreach ($file in $files) {
    If (Test-Path $file) {
        # Remove-Item emits nothing to the pipeline, so log the deletion explicitly
        Remove-Item $file -Verbose
        Add-Content C:\mylog.txt "Deleted $file"
    }
    else { Write-Host "$file not found" }
}
Foreach loops take each individual "entry" into one variable so you can do whatever you want with it.
That should do what you want.
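One caveat on the .csv route: Import-Csv returns objects rather than plain strings, so for the loop above to receive path strings you need to expand a named column. A short sketch, assuming the .csv has a single column headed "Path" (the header name is just an example); if your list is simply one path per line with no header, Get-Content is the simpler fit:

# CSV with a "Path" header column (assumed name)
$files = (Import-Csv "C:\yourcsv.csv").Path

# Or: plain text list, one path per line, no header (C:\yourlist.txt is a placeholder name)
$files = Get-Content "C:\yourlist.txt"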
I have a script that I use to search for hundreds of files across multiple drives from a list. It works fine, as it catches all the matches. The only issue is that I need to see the file it matched, along with the extension.
A bit of the back story...
We have programs that share the same name as a Copybook. Not too uncommon in the mainframe world. When searching for a file, I have to wildcard the search in order to catch all the same-name files (minus the extension). I then have to manually search through the hits to determine whether they are copybooks or programs.
When I try to add any logic to the script below, it displays the entire array of file names and not just the actual match.
Would anyone be able to assist in capturing and displaying just the matched file along with its extension? Maybe its location also?
Regards,
-Ron
#List containing file names must be wildcarded FILE.*
#Parent folder (Where to begin search)
$folder = 'C:\Workspace\src'
#Missing Artifacts Folder (Where Text file resides)
$Dir2 = 'C:\Workspace\Temp'
#Text File Name
$files=Get-Content $Dir2\FilesToSearchFor.txt
cd \
cd $folder
Write-Host "Folder: $folder"
# Get only files and only their names
$folderFiles = (Get-ChildItem -Recurse $folder -File).Name
foreach ($f in $files) {
#if ($folderFiles -contains $f) {
if ($folderFiles -like $f) {
Write-Host "File $f was found." -foregroundcolor green
} else {
Write-Host "File $f was not found!" -foregroundcolor red
}
}
Instead of testing whether the entire list of file names contains the target file name ($folderFiles -like $f), load all files into a hashtable and then test if the target file name exists as a key with ContainsKey():
$fileTable = @{}

Get-ChildItem -Recurse $folder -File | ForEach-Object {
    # Create a new key-value entry for the given file name (minus the extension) if it doesn't already exist
    if (-not $fileTable.ContainsKey($_.BaseName)) {
        $fileTable[$_.BaseName] = @()
    }
    # Add the file info object to the hashtable entry
    $fileTable[$_.BaseName] += $_
}

foreach ($f in $files) {
    if ($fileTable.ContainsKey($f)) {
        Write-Host "$($fileTable[$f].Count) file(s) matching '$f' were found." -ForegroundColor Green
        foreach ($file in $fileTable[$f]) {
            Write-Host "File with extension '$($file.Extension)' found at: '$($file.FullName)'"
        }
    }
    else {
        Write-Host "No files found matching '$f'"
    }
}
Since $fileTable contains not just the names but also references to the original file info objects returned by Get-ChildItem, you can now easily access relevant metadata (like the Extension property).
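As a side note, the same lookup table can be built in a single step with Group-Object; a minimal sketch, assuming the same $folder variable:

# Keys are the base names, values are arrays of the matching FileInfo objects
$fileTable = Get-ChildItem -Recurse $folder -File | Group-Object -Property BaseName -AsHashTable -AsString

The lookup loop with ContainsKey() then works unchanged.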
I'm new to PowerShell and have been trying to get this script to work.
If ((Get-Date -UFormat %a) -eq "Mon") {$intSubtract = -3}
Else {$intSubtract = -1}
$datDate = (Get-Date).AddDays($intSubtract)
Write-Output "Find expected file --------------"
$strDate = ($datDate).ToString('yyyyMMdd')
Write-Host "strDate: $strDate"
$arrGetFile = Get-ChildItem -Path "\\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt"
$strLocalFileName = $arrGetFile
If ($arrGetFile.count -ne 2)
{
    Throw "No file or more than two files with today's date exists!"
}
Else {$strLocalFileName = $arrGetFile[0].Name}
Write-Output "Found file $strLocalFileName --------------"
#Encrypt each file
foreach ($arrGetFile in $strPath)
{
    Write-Output "Start Encrypt --------------"
    $strPath = "\\Computer\Data\States\NorthDakota\Cities\"
    $FileAndPath = Join-Path $strPath $strLocalFileName
    $Recipient = "0xA49B4B5D"
    Import-Module \\JAMS\C$\PSM_PGP.psm1
    Get-Module Encrypt
    Encrypt $FileAndPath $Recipient
    $strLocalFileNamePGP = $strLocalFileName + ".pgp"
    Write-Output "End Encrypt --------------"
}
#Archive files
Write-Output "Archiving --------------"
move-item -path \\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt -destination \\Computer\Data\States\NorthDakota\Cities\Archive
The Cities folder will contain two files. Example 2015_Bismark_20150626_183121.txt and 2015_Bismark_20150626_183121_Control.txt
I am trying to get both files encrypted; however, it is only finding and encrypting the file without _Control. It is archiving both files correctly.
Not sure what I am missing to also find the control file.
Your foreach loop is incorrect. You have foreach ($arrGetFile in $strPath), but $strPath doesn't contain anything at that point.
Your foreach loop should be:
foreach ($LocalFile in $arrGetFile)
And you need to remove the following line:
$strLocalFileName = $arrGetFile
This is making $strLocalFileName an array of file objects, but later in the script you are treating it like a string. You may have more logical errors; you need to walk through the script very carefully, identify each variable, and make sure it contains what you expect it to contain.
In general you seem to be treating arrays of non-string objects as if they are strings. Note that I changed your $strLocalFileName variable to $LocalFile. This is because it is a file object, not a string object.
Following is a sample that just shows that the foreach loop iterates through both files.
If ((Get-Date -UFormat %a) -eq "Mon") {$intSubtract = -3}
Else {$intSubtract = -1}
$datDate = (Get-Date).AddDays($intSubtract)
Write-Output "Find expected file --------------"
$strDate = ($datDate).ToString('yyyyMMdd')
Write-Host "strDate: $strDate"
$arrGetFile = Get-ChildItem -Path "\\Computer\Data\States\NorthDakota\Cities\*_Bismark_$strDate*.txt"
If ($arrGetFile.count -ne 2)
{
    Throw "No file or more than two files with today's date exists!"
}
Write-Output "Found files " ($arrGetFile | select Name | fl) "--------------"
#Process each file
foreach ($LocalFile in $arrGetFile)
{
    $FileAndPath = Join-Path $LocalFile.DirectoryName $LocalFile
    $FileAndPath
}
Start with this and then carefully add your encryption processing back into the loop.
Also, the line that assigns $FileAndPath could be removed; you can just use $LocalFile.FullName wherever you need the full path and filename.
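Once the loop iterates the file objects correctly, the encryption step can be folded back in along these lines; a rough sketch only, reusing the Encrypt function from your PSM_PGP.psm1 module and the same recipient key:

$Recipient = "0xA49B4B5D"
Import-Module \\JAMS\C$\PSM_PGP.psm1

foreach ($LocalFile in $arrGetFile)
{
    Write-Output "Start Encrypt $($LocalFile.Name) --------------"
    # Encrypt is the function exported by PSM_PGP.psm1; pass the full path of the current file
    Encrypt $LocalFile.FullName $Recipient
    Write-Output "End Encrypt --------------"
}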