Powershell to find machine that created a file

I have a script that monitors the filesystem using a FileSystemWatcher (System.IO) in PowerShell.
Currently it finds the user that made the file with:
$owner = (Get-Acl $path).Owner
And it finds the computer that the file was made on with:
$Computer = get-content env:computername
But I'd also like to obtain what machine the file was created from. For instance, if a user is logged into a terminal server, I can see the file is made on the terminal server. But I want to know the host name of the local machine that made the file on the terminal server.
Is this possible? I've been searching the MSDN PSEventArgs class page without much success.

That information is not going to be stored in the file or its metadata, so no, there's no straightforward way to get at it.
By the way, you can just use $env:computername directly as a variable; there's no need to use Get-Content.
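For reference, the watcher pattern from the question with that suggestion applied might look roughly like this (the watched path is illustrative):
# Minimal sketch: watch a folder and record the file owner and the machine
# running the watcher. The path below is illustrative.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Watched'
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    $path     = $Event.SourceEventArgs.FullPath
    $owner    = (Get-Acl $path).Owner   # user that owns the new file
    $computer = $env:COMPUTERNAME       # machine running this watcher
    Write-Host "$path created by $owner on $computer"
}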

Can you use a powershell script to create a powershell script?

So this may be an odd request and maybe I'm going about this all wrong, but I also have a unique situation. I have servers that are sometimes cloned, and I need to run a script that I created on the cloned servers. Due to the nature of the clones, they cannot be connected to a network.
Currently I am manually putting the generic script on each server before cloning and then running the script on the cloned server.
What I would like to do is have a script that runs, gathers all the information (say installed programs, as an example), and generates a custom version of my current script on the servers before they are cloned.
I have both the PowerShell script that gets the server information and the generic one that makes the changes to the clone, but I have not found a way to merge the two, or any documentation, so I don't know if I am hitting a limitation here.
Edit for more explanation and examples. I'm doing this from my phone at the moment, so I don't have an example I can post.
Currently I have a script that has a set number of applications to uninstall, registry keys to remove, services to stop, etc. In another application I have a list of all the software that we have for each server, and I can pull that data for each server. What I need to do is pull the data for each server and have a script placed on each server that will uninstall just the programs for that server.
Currently the script has to run through every potential piece of software, try to uninstall it, and then check the other application to see if there are any additional programs that need to be uninstalled.
Hope this extra info helps.
Thanks.
Stop thinking of it as code.
Use script 1 to export blocks of text into a new file. For example, you might have a configuration that says all Dell servers must have this line of code run:
Set-DELL -attribute1 unmanaged
whereas on HP, the line would be:
Set-HP -attribute1 unmanaged
on web servers, you want:
set-web -active yes
whereas if it's not a web server, you want nothing. So your parent script code would look like:
$Dell = "Set-DELL -attribute1 unmanaged"
$HP = "Set-HP -attribute1 unmanaged"
$web = "set-web -active yes"
if (Get-servermake -eq "Dell")
{
$dell | out-file Child.ps1 -append
}
if (Get-servermake -eq "HP")
{
$HP | out-file Child.ps1 -append
}
if (Get-webserver -eq $true)
{
$web | out-file Child.ps1 -append
}
The result is a customized script for the specific server, child.ps1.
Now, you can take this and run with it. You could, say, add functionality to the child script like "Is it an AD controller?", etc.
However, you might be better off having all of this in a single script, and just block off sections that don't apply with an if statement, for example.
I'm still not totally sure I understand what you're asking. If I've missed the mark, tell me how, and I'll tell you how to tweak this better. (And hopefully it's obvious that the Get-* cmdlets are sample code; I don't expect that to be what you're using to determine a computer's make/model/etc.)
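Applied to the uninstall scenario from the edit, a rough sketch might look like the following; Get-SoftwareListForServer and the uninstall strings below are placeholders for whatever your inventory application and uninstall commands actually are:
# Hypothetical sketch: emit only the uninstall lines that apply to this
# server into Child.ps1. Get-SoftwareListForServer stands in for however you
# pull the per-server software list from your other application.
$childScript = 'C:\Scripts\Child.ps1'

# Software name -> text block to write into the child script
$uninstallBlocks = @{
    'AppA' = 'Start-Process msiexec.exe -ArgumentList "/x {PRODUCT-GUID-FOR-APPA} /qn" -Wait'
    'AppB' = 'Start-Process "C:\Program Files\AppB\uninstall.exe" -ArgumentList "/S" -Wait'
}

$installedHere = Get-SoftwareListForServer -ComputerName $env:COMPUTERNAME

foreach ($app in $installedHere) {
    if ($uninstallBlocks.ContainsKey($app)) {
        $uninstallBlocks[$app] | Out-File $childScript -Append
    }
}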

How to get an environment variable in a Powershell script when it is deployed by SCCM?

I've made a script to automatically change and/or create the default Outlook signature of all the employees in my company.
Technically, it gets the username environment variable on the machine where the script is deployed, accesses the staff database to get some information about this user, then creates the 3 different files for the signature by replacing values inside linked docx templates. Quite easy and logical.
After various tests, it works correctly when you launch the script directly on a computer, whether from PowerShell ISE, directly from CMD, or in Visual Studio. But when we tried to deploy it the way it actually will be, using SCCM, it can't get any environment variables.
Do any of you have an idea about how to get environment variables in a script when it is deployed by SCCM?
Here is what I've already tried:
$Name = [Environment]::UserName
$EnvVarUserName = Get-Item Env:\USERNAME
Even stuff like this:
$proc = gwmi win32_process -Filter "Name = 'explorer.exe'"
$report = @()
ForEach ($p in $proc)
{
    $temp = "" | Select User
    $temp.user = ($p.GetOwner()).User
    $report += $temp
}
Thanks in advance and have a nice day y'all!
[EDIT]:
I've found a way of doing this, not the best one, but it works. I get the name of the machine, check the DB that records the user ID and the machine whenever a laptop connects to our network, then get the info from the staff DB.
I will still look into Matt's idea, which is pretty interesting and, in a way, more accurate.
Thank you all!
How are you calling the environment variable? $Env:computername has worked for me in scripts pushed out via SCCM before.
Why don't you enumerate the "%SystemDrive%\Users" folder, exclude certain built-in accounts, and handle them all in one batch?
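A rough sketch of that approach (the exclusion list below is illustrative and would need to match your environment):
# Sketch: enumerate profile folders and skip well-known built-in accounts.
$exclude = 'Public', 'Default', 'Default User', 'All Users'
$profiles = Get-ChildItem "$env:SystemDrive\Users" |
    Where-Object { $_.PSIsContainer -and $exclude -notcontains $_.Name }

foreach ($userProfile in $profiles) {
    # e.g. build the signature files under this profile
    Write-Host "Would process profile: $($userProfile.FullName)"
}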
To use the UserName environment variable the script would have to run as the logged-in user, which also implies that all of your users have at least read access to your staff database, which, at least in our environment, would be a big no-no.

How to list folder permissions located on a different server

I'm fairly new to PowerShell and am running into a problem.
I want to do the following:
Get list of permissions/users on a single folder on a different server than where I am running my PowerShell window from.
Current command failing:
Get-acl -path "\\servername\folder"
Error Message:
Get-acl : Cannot find path '\\servername\folder' because it does not exist
Does this command only work on the local machine?
It turns out the way permissions/authentication is set up in my environment prevented my code from working.
Here are the steps I took to verify if I could connect to the server:
Test-Path \\server\folder
This returned "False", which is why my code was breaking.
The workaround I used was this:
#Step 1: remotely connect to server
Enter-PSSession -ComputerName servernamegoeshere
#Step 2: get list of permissions on folder and save to csv
get-acl E:\foldernamehere |
select -expand access |
export-csv C:\Users\usernamegoeshere\Documents\listofperms.csv
#Step 3: close remote connection
Exit-PSSession
I still had to remote into the server and copy the csv to the location I wanted because again, any copy command to another server/share in PowerShell would not work due to permission/authentication issues.
This article explains authentication/permissions a bit better than I can:
http://blogs.technet.com/b/heyscriptingguy/archive/2012/11/14/enable-powershell-quot-second-hop-quot-functionality-with-credssp.aspx
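If you do go the CredSSP route, the setup described there boils down to roughly the following (a sketch only; server and path names are illustrative, and enabling CredSSP has security implications worth reading up on first):
# On the machine you run the script from (client side); the server name is illustrative
Enable-WSManCredSSP -Role Client -DelegateComputer 'servername' -Force

# On the remote server
Enable-WSManCredSSP -Role Server -Force

# Then pass explicit credentials and request CredSSP authentication,
# so the remote session can reach a third machine (the "second hop")
Invoke-Command -ComputerName servername -Authentication CredSSP -Credential (Get-Credential) -ScriptBlock {
    Copy-Item C:\Users\usernamegoeshere\Documents\listofperms.csv \\otherserver\share
}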
A second way to do this, with less code and without having to create a remote session, thanks to user Ansgar Wiechers:
Invoke-Command -ComputerName server -ScriptBlock { Get-Acl E:\folder |
    Select-Object -ExpandProperty Access } |
    Export-Csv \\server\folder\accesslist.csv
With PowerShell, there are many ways to do one thing... I think this way is the simplest! Thanks!
The command works on UNC paths as well, but UNC paths are slightly different from local paths. You need an access point to enter the file system of a remote host. For SMB/CIFS access (via UNC paths) that access point is a shared folder, so you need a path \\server\share or \\server\share\path\to\subfolder.
With an admin account you could use the administrative shares (e.g. \\server\C$\Users\Administrator), otherwise you need to create a share first.
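For example (server and folder names purely illustrative):
# With admin rights on the remote host, an administrative share works as the access point:
Get-Acl '\\server\C$\folder' | Select-Object -ExpandProperty Access

# Otherwise, point at a share that exists on the remote host:
Get-Acl '\\server\share\subfolder' | Select-Object -ExpandProperty Access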

Starting an exe file with parameters on a remote PC

We have a program running on about 400 PCs (All W7). This program is called Wisa.
We receive regular updates for this program, named something like wisa_update1.0.exe, wisa_update1.1.exe, wisa_update2.0.exe, etc. The users cannot do the update themselves due to account restrictions.
We do the update once and distribute it to all PCs with Copy-Item. Then with Enter-PSSession I can go to each PC and update the program with the following command:
wisa_update3.0 /verysilent
(with the argument /verysilent no questions are asked)
This is already a major gain in time, but I want to do the update more automatically.
I have a file "pc.txt" with all 400 PCs in it. I use this file already for the Copy-Item via Get-Content. Now I want to use this file to do the updates with the above command, but I can't find a good way to use a remote executable with a parameter in PowerShell.
What you want to do is load the list with Get-Content -Path $PClist and then run your script actions in a foreach loop. You'll want to adapt this example to your own script:
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # code actions to perform against $Computer go here
}
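As a sketch of the "code actions" part for this particular update, assuming the installer has already been copied to a known local path on each PC (the path below is illustrative):
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist

foreach ($Computer in $aComputers)
{
    # Assumes the update exe was already copied to this (illustrative) path on each PC
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        Start-Process 'C:\Temp\wisa_update3.0.exe' -ArgumentList '/verysilent' -Wait
    }
}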
Also, you can use multithreading and get it done in a fraction of the time (provided you have a good machine). The link below explains how to do it well.
http://www.get-blog.com/?p=22
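Note that for remoting scenarios Invoke-Command can also fan out to many computers at once by itself, which may be enough parallelism without extra tooling (same illustrative path as above):
$aComputers = Get-Content -Path 'c:\pc.txt'

# Invoke-Command runs against multiple computers in parallel on its own;
# -ThrottleLimit caps how many run at the same time.
Invoke-Command -ComputerName $aComputers -ThrottleLimit 32 -ScriptBlock {
    Start-Process 'C:\Temp\wisa_update3.0.exe' -ArgumentList '/verysilent' -Wait
}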

Determine where script is being executed from

I have a script that will send items to the recycle bin (if selected) or delete items permanently. If the script is run locally, the recycle piece works properly.
However, if it's run from a different computer - in this case, my local machine runs the script against a shared folder on a server - the delete is permanent, and doesn't get sent to the recycle bin. The script (in a prior run) makes a decision about WHAT to delete by first setting the Archive bit to TRUE and then (after seeing how many backups it is to retain) un-setting the Archive bit for items to be deleted on the next execution of that same script.
My thought was to alter the main script to mark the files for deletion, but only do the physical deletion when the script is run locally, or to put the Recycle script by itself as a scheduled task on the server, running at a set interval, that deletes the items and sends them to the Recycle Bin.
My questions:
1. In Powershell (using 2.0), how do you determine the source computer vs the target computer? In this case, the script is being run from MyPC, and its target is Server1.
2. The script will run whether the target is a mapped drive (Drive Y:) or is targeted by the server name (\\Server1). How can you distinguish the above in both of these cases?
You can get the local computer name with $env:COMPUTERNAME. Use it to compare the value against the target server name.
For each file, you'd first have to check whether the drive is a mapped drive; if it is, get the server name from the WMI instance and compare it to $env:COMPUTERNAME.
You can get a file's Drive qualifier with the Split-Path cmdlet:
PS> $drive = Split-Path Q:\test.txt -Qualifier
PS> $drive
Q:
And then get the server name with WMI:
PS> (gwmi win32_logicaldisk -filter "drivetype=4 and deviceid='$drive'").ProviderName.Split('\')[2]
Server1
The OP wrote:
@Shay - Thanks for your help. I've learned a great deal from many posts by you on various Powershell sites.
I was able to use almost everything you suggested, and only had to add an extra line of code to make it work. I checked the property ([System.Uri]$markedFile).IsUnc to determine if the filename I've read is a UNC name.
It returns False if the drive is mapped, and True if it is UNC. From that, I'm able to get the servername & make a comparison to the environment. Code follows.
$markedFile = "\\Server1\foldername1\Error.log"
#$markedFile = "Y:\foldername1\Error.log"
$TargetComputer = $null
$thisComputer = Get-Content env:computername
if (Test-Path $markedFile) { # if file exists
if (([System.Uri]$markedFile).IsUnc) { # if it's a UNC name & not a mapped drive name
$TargetComputer = ([System.Uri]$markedFile).Host
}
else { #file is not a UNC name, it must be a mapped drive
$drive = Split-Path $markedFile -Qualifier
$TargetComputer = (gwmi win32_logicaldisk -Filter "drivetype=4 and deviceid = '$drive'").Providername.split('\')[2]
}
}
The above code works either way. Thank you again for your help!