Add to Array of wildcards in PowerShell

I am new to PowerShell and looking to test something for a POC.
There are multiple files stored in the cloud inbox (File_1.txt, File_2.txt and so on).
I want to add these files to an array using a wildcard and then run some commands (specific to the cloud) for each file.
I cannot specify -Path in the code, as the files are located on the cloud and I do not have the path.
The below code works:
$files = @("File_1.txt","File_2.txt","File_3.txt")
foreach ($file in $files) {
    # Run commands on the cloud, like... delete inbox/$file
}
However, I cannot hard-code the file names. I am looking to add the file names using a wildcard.
$files = @("File*.txt")
foreach ($file in $files) {
    # Run commands on the cloud, like... inbox/$file
}
But this does not work: the log shows that File*.txt is being taken literally as the name of the file.
Thanks in advance

You should use PowerShell's equivalent of dir, Get-ChildItem (in fact, dir and ls are aliases for the cmdlet), to search for files using wildcards:
$files = Get-ChildItem file*.txt
foreach ($file in $files) {
    DoSomeStuffWithFile $file
}
Get-ChildItem returns FileInfo objects. If you want to use the file name string in the loop, refer to the Name property of the object:
DoSomeStuffWithFileName $file.Name
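Putting those pieces together with the loop from the question, a minimal sketch might look like this. Remove-CloudFile is a hypothetical placeholder for your cloud-specific command, and this assumes the files are visible to Get-ChildItem (e.g. via a mounted or synced copy of the inbox):
$files = Get-ChildItem -Path . -Filter File*.txt
foreach ($file in $files) {
    # $file.Name is the plain file name string, e.g. File_1.txt
    Remove-CloudFile "inbox/$($file.Name)"
}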

You can easily update the file number using string interpolation. Here is a basic example of how this might work:
1..10 | ForEach-Object {
    "hello" | Out-File -FilePath "File_$_.txt"
}
Here we generate an array of the numbers from 1 to 10 (using the range operator ..), then pipe it to ForEach-Object (so each number lands in the $_ variable). As each number comes through, the path is constructed with the current value of $_, so you get File_1.txt, File_2.txt, etc.
You can use other looping techniques, but the key is that you can build the filename/path on-the-fly by having PowerShell replace the variables for you.
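Applied to the scenario from the question, the same interpolation trick can build the cloud paths directly, assuming the files really are numbered sequentially (Remove-CloudFile is again just a hypothetical stand-in for the cloud command):
1..10 | ForEach-Object {
    # Builds inbox/File_1.txt through inbox/File_10.txt on the fly
    Remove-CloudFile "inbox/File_$_.txt"
}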

Related

PowerShell: Combine text files in folders across multiple directories

I have tried to do my research, but I can't fathom this one out.
I can combine multiple .txt files in a folder, no problem:
dir C:\Users\XXXX\AC1\22JUN *.txt | Get-Content | Out-File C:\Users\XXXX\22JUN\AC1_22JUN.txt
However, I have 14 directories, each with subdirectories (months of the year), and this will only ever grow. How can I write it so that it goes into each directory AC1-AC14, looks into each folder JAN-DEC, and in each subdirectory creates a combined file: AC1_22JUN, AC2_22JUN, AC1_22JUL, AC2_22JUL, and so on?
Is there also a way to name the output file with data, such as the number of .txt files that have been combined, i.e. AC1_22JUN_314.txt?
Many thanks in advance
What you need to do is iterate over all your directories and their subdirectories and run a particular command in each of them. This is easy enough to achieve using the cmdlet Get-ChildItem and a pair of nested foreach loops.
In addition, you need to count the number of text files you've processed so that you can name your aggregate file appropriately. To do this you can break your pipeline using the temporary variable $files. You can later begin a new pipeline with this variable and use its Count property to name the aggregate file.
The code for this is as follows:
$dirs = Get-ChildItem -Directory
foreach ($dir in $dirs)
{
    $subdirs = Get-ChildItem $dir -Directory
    foreach ($subdir in $subdirs)
    {
        $files = Get-ChildItem -Path $subdir.FullName -Filter *.txt
        $name = "$($dir.Name)_$($subdir.Name)_$($files.Count).txt"
        $files | Get-Content | Out-File "$($subdir.FullName)/$name"
    }
}
A few things to note:
The script needs to be run from the containing folder - in your case the parent folder for AC1-AC14. To run it from elsewhere you will have to change the first statement into something like $dirs = Get-ChildItem C:\path\to\container -Directory
Get-ChildItem is the same as the command dir (dir is an alias for Get-ChildItem). This is NOT the same as the variable $dir, which I've used in the script.
Running the script multiple times will include any and all of your old aggregate files! This is because the output file is also a .txt file which is caught in your wildcard search. Consider refining the search criteria for Get-ChildItem, or save the output elsewhere.
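For example, one way to refine the search so that re-runs don't pick up earlier aggregates - assuming the aggregate files are the only ones whose names start with AC<number>_ - would be:
$files = Get-ChildItem -Path $subdir.FullName -Filter *.txt |
    Where-Object { $_.Name -notmatch '^AC\d+_' }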

Nested pipeline variable in PowerShell

I have the following directory tree, composed of a root directory containing 10 subdirectories, with 1 file in each subdirectory.
root/
    dir1/
        file
    dir2/
        file
    ...
    dir10/
        file
I would like to edit the content of the files recursively, replacing the string "str1" with "str2". I issued the following command in PowerShell:
Get-ChildItem -Directory | foreach {(get-content $_/file) -replace "str1", "str2" | set-content $_/file}
And it worked like a charm, but I still do not understand how. I use a pipeline in the foreach loop, but the following call to $_ still refers to the pipeline outside the foreach loop. Why is that?
The problem is, I don't think your command did work.
The -Directory switch of Get-ChildItem makes it only return directories, not files. If you want to return files, use the -File switch.
Next up, if you have a list of items from Get-ChildItem, those give you a System.IO.FileSystemInfo object. We can provide those directly to the Get-Content command to read the file into a string.
From a string, you can call any of the general operators PowerShell offers, including the string replace operator, -Replace. The output of this can be piped over to Set-Content and used to update or append content to an existing file.
Get-ChildItem -File -Recurse | ForEach-Object {
    (Get-Content $_.FullName) -replace "str1", "str2" | Set-Content $_.FullName
}
Note the only real changes here are that I swapped -Directory for -File -Recurse on Get-ChildItem (so we get the files inside each subdirectory), and then fixed the syntax on the $PSItem (the official name for the current-item variable in a ForEach-Object loop, often written as $_).
The reason you can use the syntax I showed is that ForEach-Object gives you that special $_ or $PSItem variable to reference the current item in a collection.
The special variable $_ is not unique for the foreach loop.
It is a placeholder for the object being passed through the pipe line.
This article, PowerShell Power Lesson: This Is No Ordinary Variable, goes into more detail about the $_ variable.
Both $_ and the | are inside the foreach loop in the curly braces { } or script block.
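To illustrate why this matters, here is a minimal sketch of a genuinely nested pipeline: the inner ForEach-Object gets its own $_, so the outer value has to be captured in a named variable first:
Get-ChildItem -Directory | ForEach-Object {
    $dir = $_    # capture the outer $_ (a directory) before nesting
    Get-ChildItem -File -Path $dir.FullName | ForEach-Object {
        # here $_ is the inner pipeline's current item (a file), not the directory
        "$($dir.Name)\$($_.Name)"
    }
}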

Using a filename for a variable and moving similar files with PowerShell

I am trying to write a PowerShell script that will look at a directory (C:\Temp) and look for files with extensions of .fin.
If it finds a file with the .fin extension, I need it to move all files that have the same first seven characters as that .fin file to another directory (C:\Temp\fin).
There could be multiple .fin files at a time in that directory, with files that need to be moved.
I have tried a few different things, but I am new to anything like this. I have only used very basic PowerShell scripts or commands. While this might be basic (not sure), I am lost as to where to start.
You'll need to call Get-ChildItem twice. The first time to get the pattern you want to match the file names to and the second to get the files. You can then pipe the results of the second command to Move-Item. Here's an example of how to do that:
[CmdletBinding()]
param(
    $DirectoryToScan,
    $ExtensionToMatch = ".fin",
    $TargetDirectory
)

$Files = Get-ChildItem -Path $DirectoryToScan -File
foreach ($File in $Files) {
    if ($File.Extension -eq $ExtensionToMatch) {
        # Build a wildcard from the first seven characters of the .fin file's name
        $PatternToMatch = "$($File.BaseName.Substring(0, 7))*$ExtensionToMatch"
        Write-Verbose "PatternToMatch: $PatternToMatch"
        $MatchedFiles = Get-ChildItem -Path $DirectoryToScan -Filter $PatternToMatch
        # Keep the move inside the if block so it only runs when a .fin file was found
        $MatchedFiles | Move-Item -Destination $TargetDirectory
    }
}
If you save the above into a file called Move-MatchingFiles.ps1 you can pass in your parameters: Move-MatchingFiles.ps1 -DirectoryToScan C:\Temp -TargetDirectory C:\Temp\fin. The ExtensionToMatch parameter is optional and only needed if you wanted to move a different type of file.
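Because the script declares [CmdletBinding()], it also accepts the common parameters, so adding -Verbose will print each pattern as it is built (paths here are the ones assumed from the question):
.\Move-MatchingFiles.ps1 -DirectoryToScan C:\Temp -TargetDirectory C:\Temp\fin -Verbose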

Using PowerShell to move/make folders/subfolders based on filename

I don't really have much experience in PowerShell, but I have files that I need to organize. The files are all PDFs and will have a format similar to "Dept123_Name_year.pdf".
I want to move the documents into a folder based on "Dept123" and a subfolder "Name". If the folder does not yet exist, I would like it to create the folder/subfolder.
To make it easier, I was thinking of creating an "Organize" folder on the desktop and running the program on that. If you think it'd be easier some other way, let me know.
Thanks in advance.
You can use a regular expression to match the different components of the filenames, then generate a directory structure based on that. The -Force parameter of mkdir lets you ignore whether or not the directory already exists:
$list = ls
for ($i = 0; $i -lt $list.Length; $i++) {
    if ($list[$i].Name -match '([A-Za-z0-9]+)_([A-Za-z]+)_.*\.pdf') {
        $path = Join-Path $matches[1] $matches[2]
        mkdir -Force -Path $path
        cp $list[$i] "$path\."
    }
}
The regular expression part is in the quotes; you might need to modify it to suit your specific needs. Note that the sections in round brackets correspond to the different parts of the name being extracted; these parts are loaded, in order, into the $matches variable, which is generated automatically. E.g. '([A-Za-z0-9]+)\.txt' will match any text file with only letters or numbers in the name, and will stick the actual name - minus the extension - into $matches[1].
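For example, testing that pattern against one of the sample names from the question shows how $matches is populated:
if ('Dept123_Name_year.pdf' -match '([A-Za-z0-9]+)_([A-Za-z]+)_.*\.pdf') {
    $matches[1]   # Dept123
    $matches[2]   # Name
}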
Using regex and full-form PowerShell:
# Using ?<name> within a () block in regex causes PowerShell to 'name' the property
# with the given name on the automatic variable, $matches.
$Pattern = "(?<Dept>.*)_(?<Name>.*)_(?<Year>.*)\.pdf"

# Get a list of all items in the current location. The location should be set using
# Set-Location, or specified to the command by adding -Path $location
$list = Get-ChildItem

# foreach loop over the list of files
foreach ($file in $list) {
    # discard the $true/$false result of the -match operation
    $null = $file.Name -match $Pattern
    # build the destination path from the results of the regex match above
    $Destination = Join-Path $matches.Dept $matches.Name
    # create the destination if it does not exist
    if (!(Test-Path $Destination)) {
        New-Item -ItemType Directory -Path $Destination
    }
    # copy the file, keeping only the year part of the name
    Copy-Item $file "$Destination\$($matches.Year).pdf"
}
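One small defensive tweak worth considering (not part of the original answer): guard on the match result, so that a file which doesn't fit the naming scheme can't reuse stale $matches values from a previous iteration:
foreach ($file in $list) {
    if ($file.Name -match $Pattern) {
        $Destination = Join-Path $matches.Dept $matches.Name
        if (!(Test-Path $Destination)) {
            New-Item -ItemType Directory -Path $Destination | Out-Null
        }
        Copy-Item $file "$Destination\$($matches.Year).pdf"
    }
}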

Copying files while preserving directory structure

One of our sites was recently hit by a CryptoLocker infection. Fortunately we caught it early and only have about 8k files encrypted out of the 200k or so on the file system. Our backups were good too, so I can restore the files.
I have a list of all ~8k of the encrypted files in roughly the following format, one file per new line:
\\nas001\DATA\IQC\file.xls
\\nas001\DATA\IQC\file2.xls
\\nas001\DATA\IQC\folder1\file1.xls
\\nas001\DATA\IQC\folder3\file1.xls
I did an NDMP copy of a snapshot of good data, so the backup I am restoring from is another volume on the NAS with the same folder structure after a certain point:
\\nas001\IQC_restore\file.xls
\\nas001\IQC_restore\file2.xls
\\nas001\IQC_restore\folder1\file1.xls
\\nas001\IQC_restore\folder3\file1.xls
Is there an easy way with PowerShell (or really, with batch scripting or robocopy) to parse the list of encrypted files and copy only those files from our backup to the original location, overwriting the encrypted files? In another solution I found the following script:
$a = Get-Content "C:\script\hname.txt"
foreach ($i in $a)
{
    $files = Get-Content "C:\script\src.txt"
    foreach ($file in $files)
    {
        Copy-Item $file -Destination \\$i\C$\temp -Force
    }
}
Which is almost what I need - the foreach ($i in $a) statement is redundant because I only have one $i, and it copies all the files listed in the file to a single folder rather than copying them in a way that preserves folder structure.
What's the best way to do this? Can I pass it two separate files and tell it to link the two line for line, so that for each line in file A it copies the file to the path in file B? Or is it easier to pass it one set of files (file A), perform a string replacement, and copy to that location?
Thank you!
I am perhaps making too broad an assumption here - but really the paths between encrypted and restored files are identical, as long as you make the appropriate substitution between \\nas001\DATA\IQC and \\nas001\IQC_restore, correct? If so...
You can simply take each file from the "known bad files" file (hname.txt?) and substitute the correct path for the destination using the String.Replace method:
$files = Get-Content C:\script\hname.txt
foreach ($file in $files) {
    $src = $file.Replace("\\nas001\DATA\IQC", "\\nas001\IQC_restore")
    Copy-Item $src -Destination $file -Force
}
Or, to be even more brief:
$enc = '\\nas001\DATA\IQC' # encrypted files share/folder
$good = '\\nas001\IQC_restore' # clean share/folder
gc c:\scripts\hname.txt | % { copy $_.replace($enc,$good) -dest $_ -force }
The secret decoder ring:
gc is an alias for the Get-Content cmdlet
% is an alias for the ForEach-Object cmdlet
$_ is a special variable inside a foreach scriptblock that represents the "current item" that is being passed in from the pipeline - in this case each line from the input file hname.txt
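Before running the restore for real, you can append -WhatIf so Copy-Item only reports what it would copy, which makes it easy to verify the substituted source paths:
gc c:\scripts\hname.txt | % { copy $_.replace($enc,$good) -dest $_ -force -WhatIf }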