Exporting objects and strings to CSV using PowerShell

The purpose of this code is to transfer files from one location to another and to log whether the transfer was a success or a failure.
Everything works except I am having issues with the log. I want the log to be in CSV format and there to be 3 columns: success/failure, from location, and to location. This is outputting the results all into rows with one column.
I've tried the Export-Csv option, but that looks for objects/properties, so it only displays the Length (I have strings too). Add-Content works, but there is only one column. Any suggestions?
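(A minimal illustration of the gotcha being described: a bare string's only public property is Length, so piping strings straight to Export-Csv produces a single Length column. The paths and values below are just examples.)
# Strings expose only Length, so the resulting CSV has one "Length" column
'SUCCESS', 'C:\some\file.txt' | Export-Csv C:\temp\demo.csv -NoTypeInformation
# "Length"
# "7"
# "16"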
#LOCATION OF CSV
$csv = Import-Csv C:\test2.csv
#SPECIFY DATE (EXAMPLE-DELETE FILES > 7 YEARS. 7 YEARS=2555 DAYS SO YOU WOULD ENTER "-2555" BELOW)
$Daysback = "-1"
#FILE DESTINATION
$storagedestination = "C:\Users\mark\Documents\Test2"
#LOG LOCATION
$loglocation = "C:\Users\mark\Documents\filetransferlog.csv"
$s = "SUCCESS"
$f = "FAIL"
$CurrentDate = Get-Date
foreach ($line in $csv) {
    $Path = $line | Select-Object -ExpandProperty FullName
    $DatetoDelete = $CurrentDate.AddDays($DaysBack)
    $objects = Get-ChildItem $Path -Recurse | Select-Object FullName, CreationTime, LastWriteTime, LastAccessTime | Where-Object { $_.LastWriteTime -lt $DatetoDelete }
    foreach ($object in $objects) {
        try
        {
            $sourceRoot = $object | Select-Object -ExpandProperty FullName
            Copy-Item -Path $sourceRoot -Recurse -Destination $storagedestination
            Remove-Item -Path $sourceRoot -Force -Recurse
            $temp = $s, $sourceRoot, $storagedestination
            $temp | Add-Content $loglocation
        }
        catch
        {
            $temp2 = $f, $sourceRoot, $storagedestination
            $temp2 | Add-Content $loglocation
        }
    }
}

All your | Select-Object -ExpandProperty calls are superfluous; simply attach the property name to the variable, e.g. $Path = $line.FullName.
Why calculate $DatetoDelete inside the foreach every time? Once, before the loop, is enough.
Output each success/fail result as a [PSCustomObject] and gather them in a variable assigned directly to the foreach.
Untested:
$csv = Import-Csv C:\test2.csv
$Daysback = "-1"
$destination = "C:\Users\mark\Documents\Test2"
$loglocation = "C:\Users\mark\Documents\filetransferlog.csv"
$s = "SUCCESS"
$f = "FAIL"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.Date.AddDays($DaysBack)
$Log = foreach ($line in $csv) {
    $objects = Get-ChildItem $line.FullName -Recurse |
        Where-Object LastWriteTime -lt $DatetoDelete
    foreach ($object in $objects) {
        $Result = $s
        $sourceRoot = $object.FullName
        try {
            Copy-Item -Path $sourceRoot -Recurse -Destination $destination
            Remove-Item -Path $sourceRoot -Recurse -Force
        } catch {
            $Result = $f
        }
        [PSCustomObject]@{
            'Success/Fail' = $Result
            Source         = $sourceRoot
            Destination    = $destination
        }
    }
}
$Log | Export-Csv $loglocation -NoTypeInformation
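If the sketch works as intended, the log file should contain the three requested columns, along the lines of (illustrative values):
"Success/Fail","Source","Destination"
"SUCCESS","C:\old\report.txt","C:\Users\mark\Documents\Test2"
"FAIL","C:\old\locked.txt","C:\Users\mark\Documents\Test2"
One caveat: Copy-Item and Remove-Item raise non-terminating errors by default, so adding -ErrorAction Stop to both would make the catch branch trigger reliably.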

Related

How can I get the time and date my PowerShell script deletes a file

I am using the following script to read a list of file names, which are then deleted. Is there a way I can get an output of the date and time each file is deleted?
$targetFolder = "D:\"
$fileList = "C:\DeleteList.txt"
Get-ChildItem -Path "$targetFolder\*" -Recurse -Include @(Get-Content $fileList) | Remove-Item -Verbose
Thanks for any help.
You could keep track of the files that are deleted and the time of deletion by outputting an object with each file's FullName and the current date.
This output can then be saved as a structured CSV file:
$targetFolder = "D:\"
$fileList = Get-Content -Path "C:\DeleteList.txt"
$deleted = Get-ChildItem -Path $targetFolder -Recurse -Include $fileList | ForEach-Object {
    # output an object with the current date and the file FullName
    $_ | Select-Object @{Name = 'DeletedOn'; Expression = {(Get-Date)}}, FullName
    $_ | Remove-Item -WhatIf
}
# output on screen
$deleted | Format-Table -AutoSize
# output to csv file
$deleted | Export-Csv -Path 'C:\RemovedFiles.csv' -NoTypeInformation
Remove the -WhatIf safety-switch if you are satisfied with the results shown on screen.
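For reference, while -WhatIf is in place, Remove-Item only reports what it would delete and removes nothing, producing output like:
What if: Performing the operation "Remove File" on target "D:\somefile.txt".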
Would this work?
$targetFolder = "D:"
$fileList = "C:\DeleteList.txt"
$Files = Get-ChildItem -Path "$targetFolder" -Recurse -Include @(Get-Content $fileList)
# Once you have the desired files stored in the $Files variable, run a Foreach loop.
$Obj = @() # create an array called $Obj
Foreach ($File in $Files)
{
    # store info in a hash table
    $hash = @{
        DateTime = (get-date)
        fileName = $File.name
        fullpath = $File.fullname
    }
    Write-Host "deleting file $($file.name)" -for cyan
    Remove-Item $File.fullname # *** BE VERY CAREFUL!!! ***
    # record information in an array called $Obj
    $Obj += New-Object psobject -Property $hash
}
$Obj | select fileName, DateTime | Export-Csv C:\...

Folders with more than 40,000 files

I have this script I received to check folders and subfolders on a network drive. I wonder how it could be modified to check only folders and subfolders, and to write to the CSV only the folders holding more than 40,000 files, together with the number of files. The image shows a sample output from the script as it is now; I do not need it to list any files, as it currently does.
$dir = "D:\test"
$results = @()
gci $dir -Recurse -Depth 1 | % {
    $temp = [ordered]@{
        NAME = $_
        SIZE = "{0:N2} MB" -f ((gci $_.Fullname -Recurse | measure -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1MB)
        FILE_COUNT = (gci -File $_.FullName -Recurse | measure | select -ExpandProperty Count)
        FOLDER_COUNT = (gci -Directory $_.FullName -Recurse | measure | select -ExpandProperty Count)
        DIRECTORY_PATH = $_.Fullname
    }
    $results += New-Object PSObject -Property $temp
}
$results | export-csv -Path "C:\temp\output.csv" -NoTypeInformation
Instead of executing so many Get-ChildItem cmdlets, here's an approach that uses robocopy to do the heavy lifting of counting the number of files, folders and total sizes:
# set the rootfolder to search
$dir = 'D:\test'
# switches for robocopy
$roboSwitches = '/L','/E','/NJH','/BYTES','/NC','/NP','/NFL','/XJ','/R:0','/W:0','/MT:16'
# regex to parse the output from robocopy
$regEx = '\s*Total\s*Copied\s*Skipped\s*Mismatch\s*FAILED\s*Extras' +
'\s*Dirs\s*:\s*(?<DirCount>\d+)(?:\s+\d+){3}\s+(?<DirFailed>\d+)\s+\d+' +
'\s*Files\s*:\s*(?<FileCount>\d+)(?:\s+\d+){3}\s+(?<FileFailed>\d+)\s+\d+' +
'\s*Bytes\s*:\s*(?<ByteCount>\d+)(?:\s+\d+){3}\s+(?<BytesFailed>\d+)\s+\d+'
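# For reference, the regex above is matched against the summary block robocopy
# prints at the end of its output, which with /BYTES looks roughly like this
# (values are illustrative):
#
#                Total    Copied   Skipped  Mismatch    FAILED    Extras
#     Dirs :        13        13         0         0         0         0
#    Files :       821       821         0         0         0         0
#    Bytes :   123456789   123456789         0         0         0         0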
# loop through the directories directly under $dir
$result = Get-ChildItem -Path $dir -Directory | ForEach-Object {
    $path = $_.FullName # or if you like $_.Name
    $summary = (robocopy.exe $_.FullName NULL $roboSwitches | Select-Object -Last 8) -join [Environment]::NewLine
    if ($summary -match $regEx) {
        $numFiles = [int64] $Matches['FileCount']
        if ($numFiles -gt 40000) {
            [PsCustomObject]@{
                PATH         = $path
                SIZE         = [int64] $Matches['ByteCount']
                FILE_COUNT   = [int64] $Matches['FileCount']
                FOLDER_COUNT = [int64] $Matches['DirCount']
            }
        }
    }
    else {
        Write-Warning -Message "Path '$path' output from robocopy was in an unexpected format."
    }
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path "C:\temp\output.csv" -NoTypeInformation

Keyword Search Across All Servers - PowerShell

Afternoon All,
I need to run a search across all of our servers.
I have the list of servers in a text document and a list of keywords in another
$Servers = get-content -path 'C:\support\Server Search\Server Test.txt'
$Keywords = get-content -path "C:\Support\Server Search\Keyword Test.txt"
Foreach ($Server in $Servers){
    Foreach ($Keyword in $Keywords){
        Get-ChildItem "$Server" -Recurse | Where-Object {$_.Name -like "$Keyword"}
        $i++
        Write-Host "$found: $i - Current $ $_"
        New-Object -TypeName PSCustomObject -Property @{
            Directory = $_.Directory
            Name = $_.Name
            Length = $_.Length /1024
            CreationTime = $_.CreationTime
            LastWriteTime = $_.LastWriteTime
            LastAccessTime = $_.LastAccessTime} |
        select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
        Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
    }
}
$i = 0
Is there a way to indicate when a keyword has been located, and the total number of keywords found? I feel like I need to change this line, but I cannot fathom what I would actually put. I've tried $Keywords, but that just changes the keyword every time the directory changes:
$i++
Write-Host "$found: $i - Current $ $_"
I'm assuming your $Server is set up something like "\\servername\c$\".
To indicate when a keyword has been located, and to count the total number of keyword files found:
$Servers = get-content -path 'C:\support\Server Search\Server Test.txt'
$Keywords = get-content -path "C:\Support\Server Search\Keyword Test.txt"
$num = 0 # total keyword files found
Foreach ($Server in $Servers){
    Foreach ($Keyword in $Keywords){
        # keyword found check
        $Found = Get-ChildItem -Path "$Server" -Recurse -Include "$Keyword"
        if($Found){
            Foreach($File in $Found){
                $num++ # increment number of keyword files found by 1
                Write-Host "found: $num - $File"
                New-Object -TypeName PSCustomObject -Property @{
                    Directory = $File.Directory
                    Name = $File.Name
                    Length = $File.Length /1024
                    CreationTime = $File.CreationTime
                    LastWriteTime = $File.LastWriteTime
                    LastAccessTime = $File.LastAccessTime} |
                select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
                Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
            }
        }
    }
}
Please let me know if this helps you progress. I can assist further if requested.

Check files on remote computers for time stamp older than X hours and export results to CSV

We are trying to run a script against a pile of remote computers to check the date stamps of files in a fixed folder that are older than, say, 12 hours, and return the results to a CSV. The date range needs to be flexible, as it is a set time of 6pm yesterday, which will move as time moves on.
$computers = Get-Content -Path computers.txt
$filePath = "c:\temp\profile"
$numdays = 0
$numhours = 12
$nummins = 5
function ShowOldFiles($filepath, $days, $hours, $mins)
{
    $files = $computers @(get-childitem $filepath -include *.* -recurse | where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.psIsContainer -eq $false)})
    if ($files -ne $NULL)
    {
        for ($idx = 0; $idx -lt $files.Length; $idx++)
        {
            $file = $files[$idx]
            write-host ("Old: " + $file.Name) -Fore Red
        }
    }
}
Write-output $computers, $numdays, $numhours, $nummins >> computerlist.txt
You could run the following script against all of your remote machines:
$computers = Get-Content -Path computers.txt
$logFile = "\\ServerName\C$\Logfile.txt"
$date = "12/03/2002 12:00"
$limit = Get-Date $date
$computers | %{
    $filePath = "\\$_\C$\temp\profile"
    $files = $null
    $files = Get-ChildItem -Path $filePath -Recurse -Force | `
        Where-Object {$_.CreationTime -lt $limit }
    If($files -ne $null){
        "-------------------------[$($_)]------------------------" >> $logFile
        $files | Foreach {$_.FullName >> $logFile}
    }
}
This will check the given folder ($filePath) for files that are older than the given limit. Files older than the limit will have their full file path logged to the given network location, $logFile.
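Since the question mentions a moving cutoff of 6pm yesterday, the hard-coded $date above could also be replaced by a calculated limit; a minimal sketch:
# cutoff at 18:00 on the previous day, recalculated at each run
$limit = (Get-Date).Date.AddDays(-1).AddHours(18)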
With a small alteration to @chard's earlier code, I managed to get a workable solution.
The output log file only returns the files that are older than the date in the code.
This can be manipulated in Excel with other outputs for our needs.
I will try the updated code above in a bit.
$computers = Get-Content -Path "C:\temp\computers.txt"
$logFile = "\\SERVER\logs\output.txt"
$numdays = 3
$numhours = 10
$nummins = 5
$limit = (Get-Date).AddDays(-$numdays).AddHours(-$numhours).AddMinutes(-$nummins)
$computers | %{
    $filePath = "\\$_\C$\temp\profile\runtime.log"
    Get-ChildItem -Path $filePath -Recurse -Force | `
        Where-Object {$_.LastWriteTime -lt $limit } |
        foreach {"$($_)" >> $logFile}
}

PowerShell - adding a folder path to another path

$path = "c:\folder a"
$destFolder = "C:\"
$subFolder = "\folder c\folder d\"
$file = "file.txt"
$dir = Get-ChildItem $path | select -first 10 | Sort-Object -Property CreationTime
[array]::Reverse($dir)
$dir | format-table FullName
$fullPath = @()
ForEach ($i in $dir) {
    $fullPath += $i + $subFolder + $file
}
$i = 0
while ($i -lt $fullPath.Count) {
    $exists = Test-Path $fullPath[$i]
    if ($exists){
        Copy-Item -Path $fullPath[$i] -Destination $destFolder
        break
    }
    $i++
}
Having trouble getting $fullPath to work.
edit:
$fullPath displays all the folders in the directory, then adds the subfolder + file at the end.
I want it to take one file path at a time from $dir and add the subfolder + file.
Sorry if I haven't explained it very well.
I'm a total beginner at this kind of stuff.
I think what you need is this:
$path = "c:\temp"
$destFolder = "C:\"
$subFolder = "\folder c\folder d\"
$file = "file.txt"
$childItems = Get-ChildItem $path | select -first 10 | Sort-Object -Property CreationTime -Descending
forEach ($item in $childItems)
{
    $fullPath = Join-Path -Path $item.FullName -ChildPath "$subFolder$file"
    if (Test-Path -Path $fullPath)
    {
        Copy-Item -Path $fullPath -Destination $destFolder
    }
}
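As a side note, Join-Path takes care of the separator at the join point, so the leading backslash in $subFolder is harmless. An illustrative run:
# Join-Path normalizes the join; no doubled backslash in the result
Join-Path -Path 'C:\folder a\folder b' -ChildPath '\folder c\folder d\file.txt'
# C:\folder a\folder b\folder c\folder d\file.txt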