I've deployed a PowerShell script that was writing information about user certificates to a file on a shared folder. I've deleted the deployment, but the script is still running on the computers and writing to that file. Does anyone know how I can diagnose why the script is still running, or how I can stop it? The script is attached below.
Daniel
$report = @()
$certs = Get-ChildItem Cert:\CurrentUser\my -Recurse |
    Select-Object @{n="Name"; e={$_.GetName()}}
$report += $env:computername+";"+$env:username+";"+$certs[0]+";"+$certs[1]+";"+$certs[2]
$report > \\server\certs.csv
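A deleted deployment doesn't always clean up after itself: if the script was pushed as a GPO scheduled task or logon script, the task can survive on the clients after the GPO is removed. As a starting point, here is a hedged diagnostic sketch to run on one affected machine; the filters are assumptions, since the original task name isn't in the post:
# List scheduled tasks whose action launches PowerShell (the leftover task name is unknown)
Get-ScheduledTask |
    Where-Object { $_.Actions.Execute -match 'powershell' } |
    Select-Object TaskName, TaskPath, State
# See whether the script is running right now, and with what command line
Get-CimInstance Win32_Process -Filter "Name = 'powershell.exe'" |
    Select-Object ProcessId, CommandLine
# Report which GPOs still apply to this machine
gpresult /h C:\Temp\gpreport.html
If a leftover task turns up, Unregister-ScheduledTask -TaskName <name> removes it.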
Good morning,
Hopefully this will be a quick and easy one to answer.
I am trying to run a PS script and have it export to CSV based on a list of IP addresses from a text file. At the moment, it will run, but it only produces one CSV.
Code Revision 1
$computers = get-content "pathway.txt"
$source = "\\$computer\c$"
foreach ($computer in $computers) {
Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
Export-CSV -Path "C:\path\$computer.csv" -NoTypeInformation
}
Edit
The script is now creating the individual server files as needed, and I changed the source .txt file to list the servers by hostname rather than IP. The issue now is that no data is populating the .csv files: it creates them, but nothing populates. I have tried different source file paths to see if maybe it's due to folder permissions or the folders just being empty, but nothing seems to populate the files.
The $computers file lists a number of server IP addresses, so the script should run against each IP and then write out a separate csv file with the results, naming the csv file after the individual IP address.
Does anyone see any errors in the script I provided that would prevent it from writing out a separate csv on each run? I feel like it has something to do with the foreach loop, but I cannot seem to isolate where I am going wrong.
Also, I cannot use any third-party software, as this is a closed network with very strict FW rules, so I am left with PowerShell (which is okay). And yes, this will be a very long run for each of the servers, but I am okay with that.
Edit
I forgot to mention that when I run the script, I get an error indicating that the Export-Csv path is too long, which doesn't make any sense unless it is trying to write all of the IP addresses into a single file name.
"Export-CSV : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:1
TIA
Running the script against the C: drive of each computer is strongly inadvisable, especially with the -Recurse option. But for your understanding, this is how you should pass the values to the variables. I haven't tested this code.
$computer = get-content "pathway.txt"
foreach ($Source in $computer) {
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$Source.csv" -NoTypeInformation
}
$computer will hold the whole content, and foreach will loop over that content, so $Source gets one IP at a time. I also suggest using hostnames instead of IPs, so that your output is a servername.csv file for each server.
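If the input file has to stay as IP addresses, a reverse DNS lookup can still produce hostname-based file names. A minimal sketch, assuming the same pathway.txt input (the lookup and fallback are my addition, not part of the answer above):
foreach ($ip in Get-Content "pathway.txt") {
    # Resolve the IP to a hostname; fall back to the IP if the lookup fails
    try   { $name = [System.Net.Dns]::GetHostEntry($ip).HostName }
    catch { $name = $ip }
    Get-ChildItem -Path "\\$ip\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$name.csv" -NoTypeInformation
}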
In hopes that this helps someone else: I finally got the script to run and create the individual .csv files for each server hostname.
$servers = Get-Content "path"
Foreach ($server in $servers)
{
Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
Export-CSV -Path "path\$server.csv" -NoTypeInformation
}
Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup script that consumes a lot of storage over time. I am trying to run this very simple script. My question is: do I need to modify this script in order to store it on my admin VM but have it run on my SQL VM? Or can I leave the path as is and just set it up in AD task scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to be working either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups' or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups' or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist){
# Verifies the folder exists
Write-Output -InputObject "This folder exists"
# returns all the files in the folder.
Get-ChildItem -Path $backups_file
# Deletes all files in the folder that are older that 7 days.
Get-ChildItem -Path $backups_file -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))} | Remove-Item
}
else
{
Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file attempts look wrong to me.
If you want to access a directory on a remote system, it has to be at least a file share or an administrative share, like \\computer\e$\folder\folder\.
But why use file shares or anything like that when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server # add -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder.
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days.
            Get-ChildItem -Path $directory -Recurse |
                Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
                Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
Good Luck!
I want to improve my script to be able to accomplish the following:
Scan servers based on get-adcomputer on specific OUs.
Scan each server based on whatever drive letter it has.
Scan each server for log4j.
Export all results to a CSV that identifies the folder path, name of file, and the server that the file was found on.
I have been using the following code to start with:
$Servers = Get-ADComputer -Filter * -SearchBase "OU=..." | Select -ExpandProperty Name
foreach ($server in $Servers){
    Invoke-Command -ComputerName $Server -ScriptBlock {
        $Drives = (Get-PSDrive -PSProvider FileSystem).Root
        foreach ($drive in $Drives){
            Get-ChildItem -Path $drive -Force -Filter *log4j* -ErrorAction SilentlyContinue |
            foreach{
                $Item = $_
                $Type = $_.Extension
                $Path = $_.FullName
                $Folder = $_.PSIsContainer
                $Age = $_.CreationTime
                $Path | Select-Object `
                    @{n="Name";e={$Item}}, `
                    @{n="Created";e={$Age}},`
                    @{n="FilePath";e={$Path}},`
                    @{n="Extension";e={if($Folder){"Folder"}else{$Type}}}
            } | Export-Csv C:\Results.csv -NoType
        }
    }
}
I am having the following issues and would like to address them to learn.
How would I be able to get the CSV to appear the way I want, but have it collect the information and store it on my machine instead of having it on each local server?
I have noticed extreme performance issues on the remote hosts when running this. WinRM takes 100% of the processor while it is running. I have tried -Include first, then -Filter, but to no avail. How can this be improved so that at worst, it's solely my workstation that's eating the performance hit?
What exactly do the ` marks do?
I agree with @SantiagoSquarzon - that's going to be a performance hit.
Consider writing a function that runs Get-ChildItem recursively with the -Depth parameter, including a Start-Sleep command to pause occasionally; see the sketch after this answer. Also, you may want to note this link.
You'd also want to Export-CSV to a shared network drive to collect all the machines' results.
The backticks indicate a continuation of the line, like \ in bash.
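For example (a minimal illustration, not from the thread):
# These two commands are identical; the backtick continues the call onto the next line
Get-ChildItem -Path C:\Temp -Filter *.log
Get-ChildItem -Path C:\Temp `
    -Filter *.log
Note that a line ending in a pipe continues automatically, so a backtick after a | is redundant.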
Finally, consider using a Scheduled Task, or starting a PowerShell sub-process with a lowered process priority; maybe that will help. A sketch of both ideas follows.
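Here is a hedged sketch of both ideas; the function name, depth limit, sleep interval, and script path are illustrative assumptions:
# Walk the tree one directory at a time, pausing between directories,
# so the remote host never runs one giant Get-ChildItem -Recurse.
function Find-FileThrottled {
    param(
        [string]$Path,
        [string]$Pattern = '*log4j*',
        [int]$MaxDepth = 10
    )
    if ($MaxDepth -lt 0) { return }
    # Matches in the current directory only
    Get-ChildItem -Path $Path -Filter $Pattern -Force -ErrorAction SilentlyContinue
    # Then recurse into each subdirectory with a short pause
    foreach ($dir in Get-ChildItem -Path $Path -Directory -Force -ErrorAction SilentlyContinue) {
        Start-Sleep -Milliseconds 50
        Find-FileThrottled -Path $dir.FullName -Pattern $Pattern -MaxDepth ($MaxDepth - 1)
    }
}
# Or run the whole scan in a child process at lowered priority
$proc = Start-Process powershell.exe -ArgumentList '-File C:\Scripts\scan.ps1' -PassThru
$proc.PriorityClass = [System.Diagnostics.ProcessPriorityClass]::BelowNormal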
I am trying to run this script through GPO deployed scheduled task:
$registryPath = 'HKLM:\Software\CC\PST_Discovery_Script\Already_Run'
if (!(Test-Path -Path $registryPath)) {
$dir = "c:\Users"
$ext = "pst"
Get-ChildItem "$dir" *$ext -r | Select-Object FullName,LastAccessTime,LastWriteTime,CreationTime,#{N='Owner';E={$_.GetAccessControl().Owner}}, #{Name="MegaBytes"; Expression={"{0:F2}" -f ($_.Length / 1MB)}}, #{N='Hostname';E={$env:computername}} | export-csv "c:\PST_Discovery.csv" -Append -NoTypeInformation
New-Item -Path HKLM:\Software\CC\PST_Discovery_Script -Name Already_Run –Force
}
It works fine if I run the script manually through the PowerShell console/ISE, but not through a scheduled task.
If I run it through a scheduled task, I know the script is running because it reads the registry key and, if the key doesn't exist, writes it, but it doesn't actually run the Get-ChildItem line or export a CSV.
The scheduled task shows up on the client, and it's running using a Domain Admin credentials (me)
EDIT: Sorry, my formatting for the code went all wrong; I think it should be fixed up now.
kaspermoerch: Yes, it's a domain admin, and thus has full permissions over the file system
boxdog: I actually had it writing to a UNC share, but changed it to local computer because it wasn't working. I'll try some other combinations of output location and user.
TheIncorrigible: Originally it was system, but it wasn't working so I edited the pushed out scheduled task and am using my domain admin account.
Adam:
- Yes, scheduled task is created
- Yes, the task runs the script using the following command: Powershell.exe -ExecutionPolicy Bypass \\server1\share1\PST_Discovery_Script.ps1
- Yes, it runs using my DA creds
- Yes, the file isn't created, though it still writes the registry value
I've checked the scheduled task; see my screenshot. I'm elevating Task Scheduler and manually running the task.
Just to make sure I understand correctly.
The scheduled task is created.
The scheduled task runs a script containing the following code.
The scheduled task executes the script using your credentials.
Your account is in the Domain Admin group.
By "not working", the PST_Discovery.csv file isn't created.
Correct me if I misunderstood anything.
First: I'd verify the scheduled task is running in an elevated context (run as administrator). Even if the account executing the task is an administrator, the job needs to run in an elevated context.
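For illustration, this is how the task could be registered to run elevated with the ScheduledTasks module (the task name and account are assumptions; the script path is the one from the thread):
$action    = New-ScheduledTaskAction -Execute 'powershell.exe' `
                 -Argument '-ExecutionPolicy Bypass -File \\server1\share1\PST_Discovery_Script.ps1'
$trigger   = New-ScheduledTaskTrigger -AtLogOn
$principal = New-ScheduledTaskPrincipal -UserId 'DOMAIN\admin' -RunLevel Highest
Register-ScheduledTask -TaskName 'PST_Discovery' -Action $action -Trigger $trigger -Principal $principal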
Second: I'm curious how this works for you but not for me. I've never seen a call to Select-Object quite like you've got. If I try to pipe gci | Select-Object -Property @(...what you have...) | Export-Csv, I get an exception complaining about the FullName.
$registryPath = 'HKLM:\Software\CC\PST_Discovery_Script\Already_Run'
if (!(Test-Path -Path $registryPath)) {
    $dir = 'c:\Users'
    $ext = 'pst'
    Get-ChildItem -Path $dir -Filter *$ext -Recurse |
        Select-Object -Property @(
            FullName
            LastAccessTime
            LastWriteTime
            CreationTime
            @{N = 'Owner'; E = {$_.GetAccessControl().Owner}}
            @{Name = "MegaBytes"; Expression = {"{0:F2}" -f ($_.Length / 1MB)}}
            @{N = 'Hostname'; E = {$env:computername}}
        ) |
        Export-Csv -Path c:\PST_Discovery.csv -Append -NoTypeInformation
    New-Item -Path HKLM:\Software\CC\PST_Discovery_Script -Name Already_Run -Force
}
You're 100% sure that works? I had to change that snippet to the following:
Select-Object -Property @(
    , 'FullName'
    , 'LastAccessTime'
    , 'LastWriteTime'
    , 'CreationTime'
    @{N = 'Owner'; E = {$_.GetAccessControl().Owner}}
    @{Name = "MegaBytes"; Expression = {"{0:F2}" -f ($_.Length / 1MB)}}
    @{N = 'Hostname'; E = {$env:computername}}
I'm running Powershell 5 on Windows 10. I'm executing your example from the CLI.
I copied the code format you laid out, and it worked fine for me.
A quick and easy method to capture anything going wrong would be to change the error action preference to Stop at the start of your script, so the process exits at the first error and any errors flow back to the scheduled task's Last Run Result. From there you should be able to see the exception.
$ErrorActionPreference = 'Stop'
That should at least provide a starting point if you haven't already solved it. If the Last Run Result is anything other than 0, then you know to build in some error handling with try/catch, and perhaps an output log, to help you get to the bottom of it.
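A minimal sketch of that error handling, with an assumed log path:
$ErrorActionPreference = 'Stop'
try {
    # the Get-ChildItem / Export-Csv work from the script goes here
    Get-ChildItem -Path C:\Users -Filter *.pst -Recurse |
        Export-Csv -Path C:\PST_Discovery.csv -NoTypeInformation
}
catch {
    # Write the failure somewhere the scheduled task can't hide it
    $_ | Out-File -FilePath C:\PST_Discovery_error.log -Append
    exit 1   # non-zero Last Run Result flags the failure
}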
I've searched numerous MSDN/TechNet and Stack Overflow articles regarding this, but I can't find a solution to my problem.
SO references below.
I am trying to run a script on my server that simply counts the files in a folder on a network location.
I can get it working if it's a local folder, and I can get it working when I map the network drive. However, I can't use a mapped network drive because I'll be running this script from a web interface that doesn't have a user account (local drives work fine).
My script is:
$Files = Get-ChildItem \\storage\folder -File
$Files.count
I get the error:
Get-ChildItem : Cannot find path '\\storage\folder' because it does not exist.
[0]open folder from Network with Powershell
[1]File counting with Powershell commands
[2]Count items in a folder with PowerShell
[3]Powershell - remote folder availability while counting files
Two things that I can think of. One would be to add -Path to your Get-ChildItem call. I tested this in my PowerShell and it works fine.
$files = get-childitem -path C:\temp
$files.count
This returns the number of files in that path.
However, I am testing this on a local folder. If you are sure it is the remote access part giving you trouble, I would suggest trying to set credentials. Besides the Get-Credential option, you could also try setting them yourself.
# The PSCredential constructor needs a SecureString, not a plain string
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential("Username", $password)
Then perhaps you can map the drive and still be able to access your files. Hope that helps.
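For instance, a hedged sketch that maps the share with those credentials and counts the files (the drive name is arbitrary; the UNC path is the one from the question):
New-PSDrive -Name Storage -PSProvider FileSystem -Root '\\storage\folder' -Credential $Credentials
(Get-ChildItem -Path Storage:\ -File).Count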
Try this:
Set-Location \\storage\folder\
dir -Recurse | Where-Object { $_.PSIsContainer } | ForEach { Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
This will count the number of files in each sub-folder (recurse) and display the full path and count in the output.