I'm trying to find and log any local PST files on multiple machines, across all local drives on each machine. I have the code below so far, but I can't get it to run in the context of the root of the current drive in the foreach loop; it just runs in the context of where the script was run from.
If (-not (Test-Path -Path "\\BETH-FS01\F$\PSTBackups")) {
    # check if PC is connected to domain, some laptops aren't always on the VPN
    Exit 1
}
Else {
    # defines the path where to store the log CSV
    $LogPath = "\\BETH-FS01\F$\PSTBackups"
    $Log = @() # object array to store "PST objects"
    # defining string variable to combine with another string variable later on
    $Ext = ".pst"
    # creates array of local drives on PC
    $Drives = Get-PSDrive -PSProvider 'FileSystem'
    foreach ($Drive in $Drives) {
        # searches drive for PST files, creates an array of those files,
        # then passes each through to create PST objects
        $Log = ForEach ($PST in ($PSTS = Get-ChildItem -LiteralPath $Drive.Name -Include *.pst -Recurse -Force -ErrorAction SilentlyContinue)) {
            New-Object PSObject -Property @{
                ComputerName = $env:COMPUTERNAME
                Path         = $PST.DirectoryName
                FileName     = $PST.BaseName + $Ext
                Size         = "{0:N2} MB" -f ($PST.Length / 1mb)
                Date         = $PST.LastWriteTime.ToString("yyyy-MM-dd HH:mm")
            } # creates PST object
        }
    }
}
$Name = $env:COMPUTERNAME # define string to use in log path below
$Log | Export-Csv "$LogPath\$Name.csv" -NoTypeInformation # exports the Log object array to a CSV
To clarify: if the foreach loop is currently processing the C: drive, I want it to use the "C:" path as the -Path for Get-ChildItem, i.e.:
$PSTS = Get-ChildItem -path "*somehow reference C: drive path*" -Include *.pst -Recurse -Force -erroraction silentlycontinue
Sorry if the code is sloppy, I'm not the best at keeping clean code...
In this line...
$Log = ForEach ($PST in ($PSTS = Get-ChildItem -LiteralPath $Drive.Name -Include *.pst -Recurse -Force -erroraction silentlycontinue)){
... you are passing just the drive letter for -LiteralPath, which is not a path.
You need to pass the root path, e.g. "C:\", not just "C" or "C:". The latter only means the current directory on drive C.
This should do the trick:
$Log = ForEach ($PST in ($PSTS = Get-ChildItem -LiteralPath $Drive.Root -Include *.pst -Recurse -Force -erroraction silentlycontinue)){
I would like to unzip some files each into their own folder with the same name as the zip file. I have been doing clunky things like this, but as this is PowerShell, there is usually a much smarter way to achieve things.
Are there some kind of one-or-two-liner ways that I can operate on each zip file in a folder and extract it into a subfolder of the same name as the zip (but without the extension)?
foreach ($i in $zipfiles) {
    $src = $i.FullName
    $name = $i.Name
    $ext = $i.Extension
    $name_noext = ($name -split $ext)[0]
    $out = Split-Path $src
    $dst = Join-Path $out $name_noext
    $info += "`n`n$name`n==========`n"
    if (!(Test-Path $dst)) {
        New-Item -Type Directory $dst -EA SilentlyContinue | Out-Null
        Expand-Archive -LiteralPath $src -DestinationPath $dst -EA SilentlyContinue | Out-Null
    }
}
You could do with fewer variables. Since the $zipfiles collection appears to contain FileInfo objects, most of those variables can be replaced by the properties the objects already have.
Also, try to avoid appending to a variable with += because that is both time- and memory-consuming.
Just capture whatever you output in the loop in a variable.
Something like this:
# capture the stuff you want here as array
$info = foreach ($zip in $zipfiles) {
    # output whatever you need to be collected in $info
    $zip.Name
    # construct the folder path for the unzipped files
    $dst = Join-Path -Path $zip.DirectoryName -ChildPath $zip.BaseName
    if (!(Test-Path $dst -PathType Container)) {
        $null = New-Item -ItemType Directory $dst -ErrorAction SilentlyContinue
        $null = Expand-Archive -LiteralPath $zip.FullName -DestinationPath $dst -ErrorAction SilentlyContinue
    }
}
# now you can create a multiline string from the $info array
$result = $info -join "`r`n==========`r`n"
I am currently working on getting a list, with paths, of all the .xlsx and .xls files that are password-protected on a network drive that contains tons and tons of folders and subfolders. I put together the script below, which works fine, but it only returns .xlsx files; none of the password-protected .xls files were returned. Does anyone know how to get the password-protected .xls files, or have another script that would get this job done? Appreciate all your help!
Script
$path = "C:\Users\DC\Desktop"
$dest = "C:\Users\DC\Desktop\ExcelWithPassword.txt"
$Full = Get-ChildItem $path -Include *.xlsx*, *.xls* -Recurse -ErrorAction SilentlyContinue
$List = Select-String -Pattern "<encryption" $Full
foreach ($file in $List) {
    $file.Path | Out-File $dest -Append -Force
}
The output is basically a list of paths where those password protected files are located.
Unless you have other files in the target directory tree with an '.xl*' extension, why are you doing this ...
Get-ChildItem $path -Include *.xlsx*, *.xls* -Recurse -ErrorAction SilentlyContinue
... you only need this...
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue
If you are after just the full path, ask for it, using this ...
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
Select-Object -Property Fullname
# Results
<#
FullName
--------
D:\Temp\NewFolder\Test.xlsx
D:\Temp\Test.xls
D:\Temp\Test.xlsx
#>
... or this.
(Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue).FullName
# Results
<#
D:\Temp\NewFolder\Test.xlsx
D:\Temp\Test.xls
D:\Temp\Test.xlsx
#>
As for the loop, you can also shorten your code to something like this.
Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
    Select-String -Pattern '<encryption' |
    Select-Object -ExpandProperty Path |
    Out-File $dest -Append -Force
Or
Select-String -Pattern '<encryption' -Path (Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue).FullName |
    Select-Object -ExpandProperty Path |
    Out-File $dest -Append -Force
You are not saying how the files were encrypted. Excel allows for protecting the sheet, the workbook, and so on. You can't check a password-protected file by searching for a string without opening the file, and to open the file you must use the application's COM interface. For Excel it's:
### Automate Excel
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$workbook1 = $excel.Workbooks.Add()
$Sheet = $Excel.WorkSheets.Item(1)
$Sheet.Cells.Item(1,1) = "Hello from Powershell "
$Sheet.Cells.Item(1,2) = "Using VBA from Excel Button object"
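The snippet above only shows general Excel automation. As a rough sketch of the actual check (my assumption, not a method from this thread): try opening each file with a deliberately wrong password, and treat a failure to open as a sign the file is protected. $path and $dest are assumed to be defined as in the question's script.

```powershell
# Sketch only: an unprotected file opens fine even when a wrong password
# is supplied; a protected file throws, so we log its path.
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false

$protected = foreach ($file in Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue) {
    try {
        # Workbooks.Open arguments: FileName, UpdateLinks, ReadOnly, Format, Password
        $wb = $excel.Workbooks.Open($file.FullName, 0, $true, 5, 'not-the-password')
        $wb.Close($false)   # opened without the real password => not protected
    }
    catch {
        $file.FullName      # failed to open => likely password-protected
    }
}

$excel.Quit()
[void][Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
$protected | Out-File $dest -Append -Force
```

This is slow, since every file is opened by Excel, which is part of why the planning mentioned below matters.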
Based on what you are after, there are a few more things to consider. Scanning thousands of files across the whole network requires planning, and likely parallel processing.
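For example, a minimal parallel-scan sketch (assumes PowerShell 7+ for ForEach-Object -Parallel; $path and $dest as in the question):

```powershell
# Sketch: scan the share with several concurrent workers instead of one
# sequential pass; -ThrottleLimit caps the number of simultaneous scans.
$hits = Get-ChildItem $path -Include *.xl* -Recurse -ErrorAction SilentlyContinue |
    ForEach-Object -Parallel {
        if (Select-String -Path $_.FullName -Pattern '<encryption' -Quiet) {
            $_.FullName
        }
    } -ThrottleLimit 8

$hits | Out-File $dest -Append -Force
```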
I have a root folder, a mapped network drive Z:. In this folder I have a folder named Archive, and I would like to move some folders on Z: into that Archive folder.
The titles of the folders to move are in a CSV file.
I've created a PowerShell script, but somehow it does not really work: it moves one folder, but then nothing else happens; the PowerShell window just sits there empty until I have to close it.
So if I have ten folders to move, only the first is moved and that is it.
Here is the code:
$currentPath = Split-Path -Parent $PSCommandPath;
$areaCsvPath = $currentPath + "\CSVFile.csv";
write-host $areaCsvPath;
$csv = Import-Csv $areaCsvPath;
$count =0;
$Creds = Get-Credential
foreach ($row in $csv)
{
    Get-ChildItem -Path "Z:\" -Recurse |
        Where-Object {$_.Name -eq $row.Title} |
        Move-Item -Destination "Z:\_Archive" -Credential $Creds
    $count++;
    write-host $count;
}
CSV is as follows
Title
12345
22223
75687
...
I don't see why only one folder gets moved, but you could try the following script, which should be much faster because the Get-ChildItem cmdlet is only called once:
$currentPath = Split-Path -Parent $PSCommandPath;
$areaCsvPath = $currentPath + "\CSVFile.csv";
write-host $areaCsvPath;
$csv = Import-Csv $areaCsvPath;
$Creds = Get-Credential
Get-ChildItem -Path "Z:\" -Recurse |
    Where-Object Name -in ($csv | Select-Object -ExpandProperty Title) |
    Move-Item -Destination "Z:\_Archive" -Credential $Creds
If the folders are at the top level of Z:, you should omit the -Recurse parameter. Also, if you only want to move folders, you can add the -Directory switch to the Get-ChildItem call to further improve performance.
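Put together, a sketch of that top-level, folders-only variant might look like this (untested, same variables as above):

```powershell
# Top-level directories only: no -Recurse, and -Directory skips plain files
Get-ChildItem -Path "Z:\" -Directory |
    Where-Object Name -in ($csv | Select-Object -ExpandProperty Title) |
    Move-Item -Destination "Z:\_Archive" -Credential $Creds
```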
I need to copy the Documents, Favorites, and Desktop folders for each user that has logged in to the PC in the last 30 days.
I'm just starting to learn PowerShell and I've got a decent start, I think, but I'm wasting too much time. This is a project for work, and I've found myself digging for a solution to one problem only to run into another at the next turn. I've spent about a month trying to get this sorted out so far, and thrown away a lot of code.
What I have as of right now is this:
Get-ChildItem -path c:\users |Where-Object { $_.lastwritetime -gt (get-date).AddDays(-30)}
I know that this line returns the user folders I need. Next, I need code that will go into each of those child items and pull out the Documents, Favorites, and Desktop folders.
Now the tricky part: I need the code to create a folder on C:\ with the username it is pulling those folders from.
So the solution should:
for each user logged in in last 30 days;
copy Documents, Favorites, Desktop folder from their user drive
create a folder on c:\ for that user name
paste Documents, Favorites, Desktop to that folder
To better cover the scope:
I have to reimage PCs a lot in my department. The process of "inventorying" a PC is copying those folders and restoring them on the new PC I image for the user, so that their desktop etc. looks and functions the same when they get their new machine. This code will be part of a larger script that ultimately "inventories" the entire PC for me. Ultimately, I want to run my script for two seconds and then pull X folders and X documents off the C: drive of that PC, as opposed to clicking a hundred times for the 9 users that have used the PC in the last 30 days.
Any ideas?
2dubs
$usersFoldr = Get-ChildItem -path c:\users | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($f in $usersFoldr) {
    $toFld = "c:\usrTest\" + $f.Name + "\Desktop\"
    New-Item $toFld -Type Directory -Force
    Get-ChildItem ($f.FullName + "\Desktop") | Copy-Item -Destination $toFld -Recurse -Force
}
Thanks to @bagger for his contribution. He was close.
After some experimentation, I found that this is the actual solution:
$usersFoldr = Get-ChildItem -path c:\users | Where-Object {
    $_.LastWriteTime -gt (Get-Date).AddDays(-30) }

foreach ($f in $usersFoldr) {
    $doc = "c:\users\$f\documents"
    $toFldDoc = "c:\$f\documents"
    New-Item $toFldDoc -Type Directory -Force
    Copy-Item $doc $toFldDoc -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $desk = "c:\users\$f\desktop"
    $toFldDesk = "c:\$f\desktop"
    New-Item $toFldDesk -Type Directory -Force
    Copy-Item $desk $toFldDesk -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $fav = "c:\users\$f\favorites"
    $toFldFav = "c:\$f\favorites"
    New-Item $toFldFav -Type Directory -Force
    Copy-Item $fav $toFldFav -Recurse -Force
}
Then save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
Then run that shortcut as an administrator. Works like gold.
Thanks for your help, guys! :)
For anyone interested in the whole script:
Inventory script: copies pertinent files for all users who logged in within the last 30 days, and gathers printer hostname/driver/IP, serial number, and make/model.
$usersFoldr = Get-ChildItem -path c:\users | Where-Object {
    $_.LastWriteTime -gt (Get-Date).AddDays(-30) }

foreach ($f in $usersFoldr) {
    $doc = "c:\users\$f\documents"
    $toFldDoc = "c:\inventory\$f\documents"
    New-Item $toFldDoc -Type Directory -Force
    Copy-Item $doc $toFldDoc -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $desk = "c:\users\$f\desktop"
    $toFldDesk = "c:\inventory\$f\desktop"
    New-Item $toFldDesk -Type Directory -Force
    Copy-Item $desk $toFldDesk -Recurse -Force
}
foreach ($f in $usersFoldr) {
    $fav = "c:\users\$f\favorites"
    $toFldFav = "c:\inventory\$f\favorites"
    New-Item $toFldFav -Type Directory -Force
    Copy-Item $fav $toFldFav -Recurse -Force
}
Get-WmiObject -Class Win32_Printer | Select-Object Name, DriverName, PortName |
    Export-Csv -Path 'C:\Inventory\printers.csv'
Get-WmiObject Win32_Bios | ForEach-Object { $_.SerialNumber } |
    Out-File 'c:\Inventory\SerialNumber.txt'
Get-WmiObject Win32_ComputerSystem | Select-Object Model, Manufacturer |
    Out-File 'c:\Inventory\MakeModel.txt'
Again, save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
You can also retrieve a list of installed software by adding this line to the script:
Get-WmiObject Win32_Product | Select-Object Name | Export-Csv -Path 'c:\inventory\software.csv'
My Copy-Item doesn't work when included in a foreach loop.
Much like Powershell: Copy-Item not working when in ForEach loop.
Only my destination folder is not the same as the originating folder, which seemed to be the problem there.
This is a very basic function. My objective is to grab the latest log files from a directory containing log files for lots of stuff. I'm only interested in a few, defined in $servers. A line in Servers.txt looks like this: \\clientname\d$\logdirectory\processlog\
When I Set-Location to a path in Servers.txt and run Get-ChildItem, it works as expected.
I also need to generate a new folder for each object in \Logs\, but one thing at a time.
$servers = @()
$servers = Get-Content c:\Test\Servers.txt
$destServer = @()
$destServer = ( "clientname")
$destinationFolder = "\\" + $destServer + "\d$\Logs\"
foreach ($serverpath in $servers) {
    Write-Host " Copying from $serverpath "
    Set-Location -literalpath $serverpath |
        Get-ChildItem |
        Sort-Object -Descending LastWriteTime |
        Select -First 2 |
        Copy-Item -Destination $destinationFolder -Recurse -Force
}