I am using this command to get all of the child folders, their child folders, and so on:
Get-ChildItem -dir S:\WGroups -Recurse
But what I would like to do is write the output to either a CSV or a TXT file, and I was wondering what the best way to do that is.
Get-ChildItem -Directory S:\WGroups -Recurse | Export-Csv -Path $env:USERPROFILE\Documents\S_wgroups.csv
There are a number of fields you may not want, so filtering it to just what you need may be important if you're attempting to process in an automated fashion.
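For example, a minimal sketch of trimming the output to a few properties before exporting (FullName, Name and LastWriteTime are just common choices; pick whatever your downstream processing needs):
# Keep only the properties you actually care about before exporting.
Get-ChildItem -Directory S:\WGroups -Recurse |
    Select-Object FullName, Name, LastWriteTime |
    Export-Csv -Path $env:USERPROFILE\Documents\S_wgroups.csv -NoTypeInformation
-NoTypeInformation simply suppresses the type header line that Windows PowerShell otherwise writes at the top of the CSV.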
I have tried to do my research, but I can't figure this one out.
I can combine multiple .txt files in a folder, no problem:
dir C:\Users\XXXX\AC1\22JUN *.txt | Get-Content | Out-File C:\Users\XXXX\22JUN\AC1_22JUN.txt
However, I have 14 directories, each with subdirectories for the months of the year, and this will only ever grow. How can I write it so that it goes into each directory (AC1 - AC14), then looks into each month folder (JAN - DEC), and in each subdirectory creates a combined file: AC1_22JUN, AC2_22JUN, AC1_22JUL, AC2_22JUL, and so on?
Is there also a way to name the output file with data such as the number of .txt files that have been combined, e.g. AC1_22JUN_314.txt?
Many thanks in advance.
What you need to do is iterate over all your directories and their subdirectories and run a particular command in each of them. This is easy enough to achieve using the cmdlet Get-ChildItem and a pair of nested foreach loops.
In addition, you need to count the number of text files you've processed so that you can name your aggregate file appropriately. To do this you can break your pipeline using the temporary variable $files. You can later begin a new pipeline with this variable and use its count property to name the aggregate file.
The code for this is as follows:
$dirs = Get-ChildItem -Directory
foreach ($dir in $dirs)
{
    $subdirs = Get-ChildItem $dir -Directory
    foreach ($subdir in $subdirs)
    {
        $files = Get-ChildItem -Path $subdir.FullName -Filter *.txt
        $name  = "$($dir.Name)_$($subdir.Name)_$($files.Count).txt"
        $files | Get-Content | Out-File "$($subdir.FullName)\$name"
    }
}
A few things to note:
The script needs to be run from the containing folder - in your case the parent folder for AC1-AC14. To run it from elsewhere you will have to change the first statement into something like $dirs = Get-ChildItem C:\path\to\container -Directory
dir is an alias for Get-ChildItem, so the two commands are the same. Neither is the same as the variable $dir which I've used in the script.
Running the script multiple times will include any and all of your old aggregate files! This is because the output file is also a .txt file which is caught in your wildcard search. Consider refining the search criteria for Get-ChildItem, or save the output elsewhere.
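One way to refine the search is to skip files that already match the aggregate naming pattern. A minimal sketch, assuming your source .txt files never end in _<digits>.txt the way the aggregates above do:
# Replace the $files assignment inside the inner loop with this,
# so previously generated aggregates are not swept up again.
$files = Get-ChildItem -Path $subdir.FullName -Filter *.txt |
    Where-Object { $_.Name -notmatch '_\d+\.txt$' }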
This is my first post, so sorry about any mistakes.
I'm currently trying to use PowerShell to combine folders of CSV files based on date. I'm trying to go a week back, compile them, and export to another folder. I've only been using PowerShell a few days and have an OK knowledge of coding in general.
I'm trying to use this function:
(Get-ChildItem C:\Folder | Group-Object -AsHashTable {$_.CreationTime -ge (Get-Date).AddDays(-2)})
That outputs a name (true or false) and a value (folder title). What I want to do is use another function to then export those CSV files, spread over several folders, all to one folder.
Is this possible? Am I going in the right direction? I have very little experience with this.
Thanks.
Luke
You're looking to filter rather than group, hence you would be using Where-Object instead of Group-Object. To copy files you can use the built-in cmdlet Copy-Item.
Do note, the path\to\destinationfolder in the example below must be an existing folder; you should create it before running the code.
# NOTE: If you want to filter only for files with .CSV Extension,
# `-Filter *.csv` should be included
Get-ChildItem C:\Folder -Recurse |
Where-Object { $_.CreationTime -ge (Get-Date).AddDays(-2) } |
Copy-Item -Destination path\to\destinationfolder
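If the goal is the full week back mentioned in the question, and only .csv files, a sketch along the same lines (the destination is still a placeholder):
# Copy only .csv files created within the last 7 days.
Get-ChildItem C:\Folder -Recurse -Filter *.csv |
    Where-Object { $_.CreationTime -ge (Get-Date).AddDays(-7) } |
    Copy-Item -Destination path\to\destinationfolder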
In a particular folder I have files created with random names, for example:
file1.xml
file2.xml
When these files are successfully created, a matching .ack file is created.
So I will have
file1.xml
file1.ack
file2.xml
file2.ack
What I have to do:
Move a .xml file only if the corresponding .ack is created.
The difficult part: file names are random and I have no control over them.
Is there a way to create a .bat or a PowerShell script that checks and moves files with these requirements, run at scheduled times?
Many thanks for your help.
A PowerShell task would be ideal here, indeed. You will want to leverage Windows' Task Scheduler in order to run it at a scheduled time.
In a nutshell, what you'll have to do with PowerShell is:
List the .xml files in your folder with Get-ChildItem -Filter "*.xml"
Pipe them to a Where-Object statement to make sure each .xml has a .ack counterpart, leveraging Test-Path
For each item that passes, move the .xml file with Move-Item -Path $source -Destination $target
Optionally, you can also clean up the .ack files with Remove-Item.
Putting those steps together gives a sketch like the one below.
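A minimal sketch, assuming the folders below are placeholders you replace with your own:
# Placeholder paths - adjust to your environment.
$source = 'C:\path\to\incoming'
$target = 'C:\path\to\processed'

Get-ChildItem -Path $source -Filter *.xml |
    Where-Object {
        # Keep only .xml files whose .ack counterpart exists.
        Test-Path (Join-Path $source ($_.BaseName + '.ack'))
    } |
    ForEach-Object {
        Move-Item -Path $_.FullName -Destination $target
        # Optional: remove the matching .ack once the .xml has been moved.
        Remove-Item (Join-Path $source ($_.BaseName + '.ack'))
    }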
Find every filename that appears with two extensions, and move them both.
gci | group basename |? Count -eq 2 | select -expand group | move -Dest c:\temp
Because it's fun not to use any loops. If you want loops, maybe the following, to move the .xml and delete the .ack:
gci *.ack |% { move ($_.BaseName +'.xml') "c:\temp" ; rm $_ }
I am new to PowerShell scripting and hence need some help. The script I am looking for should search for the folders listed in a .csv file; please note that it should search for the folders on the drive without knowing the path, and move them to the destination.
I did some research and created the script below, which takes data from a .txt file and moves it to the destination; however it does not work if I just write C:\ in place of the source folder.
Request you to please help me :)
Get-Content C:\abc.txt |
    ForEach-Object {
        Move-Item -Path "C:\0123\$_" -Destination "C:\To Archive\$_"
    }
With what you've given, I'd do something like the following:
$File = Import-Csv C:\share\test\files.txt
foreach ($fileName in $File.FileName)
{
    Move-Item -Path "C:\share\test\OldLocation\$fileName" -Destination "C:\share\test\NewLocation\$fileName"
}
I did this with a .csv file that had one column whose title was FileName. Notable differences from your code include using the Import-Csv cmdlet and specifying the .csv header title in the foreach loop.
If you wanted to do this with a single command:
Import-Csv C:\share\test\files.txt | ForEach-Object {
    Move-Item -Path "C:\share\test\OldLocation\$($_.csvHeader)" -Destination "C:\share\test\NewLocation\$($_.csvHeader)"
}
Where csvHeader is the title of the column in your .csv file.
My .csv file looked like:
FileName
file1.txt
file2.txt
file3.txt
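The question also mentions searching the drive without knowing the path. A minimal sketch of one way to do that, reusing the FileName header from above and assuming the names in the CSV are unique enough to match on (-ErrorAction SilentlyContinue just suppresses access-denied errors while scanning C:\):
Import-Csv C:\share\test\files.txt | ForEach-Object {
    # Search the whole drive for each entry; this can be slow on large drives.
    Get-ChildItem -Path C:\ -Recurse -Filter $_.FileName -ErrorAction SilentlyContinue |
        Move-Item -Destination "C:\share\test\NewLocation"
}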
I have a PowerShell script on a production server to show me the list of database backup files that are older than 30 days.
I need to see only the files that have the extension ".bak". However, my script has no matching syntax or regular expression, so I am also seeing files with an extension like "filename.foo.bak". These may be text files or other configuration files on the server that the program which uses them backs up automatically.
How do I enable a match filter so that I see only "*.bak" and not other files as mentioned above?
As mentioned by mjolinor, I have used this script to do the exclusion.
gci $paths -recurse -filter *.bak -exclude *.*.bak | ?{!$_.psiscontainer}
However, I have learnt that I need to exclude some system folders such as C:\Windows.
How can this be accomplished as well?
Try this:
gci *.bak -exclude *.*.bak
I tried to reply to your comment, but the code doesn't show up right. -exclude takes a string[] argument, so:
gci -recurse -filter *.bak -exclude *.*.bak,windows |? {!$_.psiscontainer}
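Note that -Exclude matches on item names rather than full paths, so depending on your PowerShell version, .bak files inside C:\Windows may still slip through. An alternative sketch is to filter on the full path instead (C:\Windows is just an example; extend the test for other system folders):
# Skip anything whose full path is under C:\Windows.
gci $paths -Recurse -Filter *.bak -Exclude *.*.bak |
    Where-Object { !$_.PSIsContainer -and $_.FullName -notlike 'C:\Windows\*' }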