Changing the IIS log directory with a script - PowerShell

I've tried to write a script that changes the IIS log directory from the default to E:\LogFiles. Additionally, I need to manage some permission groups there, which should be stored in XML or just added via ACL.
Could someone please advise me on how to handle that?
My short script is below; it should check whether the folder exists and go through all servers on the farm.
Import-Module WebAdministration
$LogPath = "E:\LogFiles"
foreach ($srv in (Get-SPServer | ? {($_.role -like "WebFrontEnd*") -or ($_.role -like "Application")}))
{
If(!(test-path $Logpath))
{
New-Item -ItemType Directory -Force -Path $LogPath
}
foreach($site in (dir iis:\sites\*))
{
New-Item $LogPath\$($site.Name) -type directory
Set-ItemProperty IIS:\Sites\$($site.Name) -name logFile.directory -value "$LogPath\$($site.Name)"
}
}
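The permissions part of the question isn't covered by the script above. As a minimal sketch (assuming a hypothetical group name 'DOMAIN\IIS-LogReaders'; substitute your own), one way to grant a group access to the new log folder via the ACL:

```powershell
# Sketch: add an inheritable read/execute rule for a group to the log folder's ACL.
# 'DOMAIN\IIS-LogReaders' is a placeholder group name, not from the original post.
$LogPath = 'E:\LogFiles'
$acl = Get-Acl -Path $LogPath
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    'DOMAIN\IIS-LogReaders',          # identity to grant
    'ReadAndExecute',                 # FileSystemRights
    'ContainerInherit,ObjectInherit', # inherit to subfolders and files
    'None',                           # PropagationFlags
    'Allow')                          # allow (vs. deny)
$acl.AddAccessRule($rule)
Set-Acl -Path $LogPath -AclObject $acl
```

The group names could equally be kept in an XML file, read back with Import-Clixml, and looped over with the same rule-adding code.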

Related

How to copy files based on last modified date to network drive?

Our Git repo blew up and we ended up losing it, so now all of our users' code is only on local workstations. For temporary storage we are going to have all of them put their local repos on a network share. I am currently trying to write a PowerShell script to let users select their repos with Out-GridView and then copy them to the network share. This will cause a lot of overlap, so when there are duplicate files I only want the file with the latest modified date (commit) to overwrite.
For example,
User 1 has repo\file.txt last modified 8/10 and uploads it to the network share.
User 2 also has repo\file.txt, last modified 8/12. When User 2 copies to the share, it should overwrite User 1's file because it is newer.
I am new to PowerShell so I am not sure which direction to take.
As of right now I have figured out how to copy over all the files, but I can't figure out the last-modified piece. Any help would be greatly appreciated.
$destination = '\\remote\IT\server'
$filesToMove = get-childitem -Recurse | Out-GridView -OutputMode Multiple
$filesToMove | % { copy-item $_.FullName $destination -Recurse }
If your users have permission to write/delete files in the remote destination path, this should do it:
$destination = '\\remote\IT\server\folder'
# create the destination folder if it does not already exist
if (!(Test-Path -Path $destination -PathType Container)) {
Write-Verbose "Creating folder '$destination'"
New-Item -Path $destination -ItemType Directory | Out-Null
}
Get-ChildItem -Path 'D:\test' -File -Recurse |
Out-GridView -OutputMode Multiple -Title 'Select one or more files to copy' | ForEach-Object {
# since we're piping the results of the Get-ChildItem into the GridView,
# every '$_' is a FileInfo object you can pipe through to the Copy-Item cmdlet.
$skipFile = $false
# create the filename for a possible duplicate in the destination
$dupeFile = Join-Path -Path $destination -ChildPath $_.Name
if (Test-Path -Path $dupeFile) {
# if a file already exists AND is newer than the selected file, do not copy
if ((Get-Item -Path $dupeFile).LastWriteTime -gt $_.LastWriteTime ) {
Write-Host "Destination file '$dupeFile' is newer. Skipping."
$skipFile = $true
}
}
if (!$skipFile) {
$_ | Copy-Item -Destination $destination -Force
}
}
This is my first post here, so please be forgiving. I'm browsing Reddit/Stack Overflow looking for cases to practice my PowerShell skills on. I tried creating a script like you asked for on my local home PC; let me know if this helps you:
$selectedFiles = get-childitem -Path "C:\Users\steven\Desktop" -Recurse | Out-GridView -OutputMode Multiple
$destPath = "D:\"
foreach ($selectedFile in $selectedFiles) {
$destFileCheck = Join-Path -Path $destPath -ChildPath $selectedFile.Name
if (Test-Path -Path $destFileCheck) {
$destFile = Get-Item -Path $destFileCheck
if ($selectedFile.LastWriteTime -gt $destFile.LastWriteTime) {
Copy-Item -Path $selectedFile.FullName -Destination $destFile.FullName
}
else {
Write-Host "Source file is older than destination file, skipping copy."
}
}
else {
# destination file does not exist yet, so copy it
Copy-Item -Path $selectedFile.FullName -Destination $destFileCheck
}
}

powershell - create folder in another user's profile with administrative privileges

I have a user with standard rights, and I need to run a PowerShell script with admin rights to do some work and, finally, create a folder and copy a single file into the currently logged-on user's profile.
How can I do this?
example:
C:\Users\USERNAME\FOO\FOO.TXT
I tried this but, obviously, it creates the folder in my admin profile:
# DO SOMETHINGS BEFORE
$directory = Join-Path $env:USERPROFILE 'FOO'
if(!(Test-Path -Path $directory)){
New-Item -Path $env:USERPROFILE -Name "FOO" -ItemType "directory"
}
Copy-Item "testo.txt" -Destination $directory
# Copy-Item "arDigiCore.ini" -Destination $arDigiSign
Thanks in advance
EDIT:
1 - I run my PowerShell script while logged on as a standard user (e.g. user1), but elevated as an admin (e.g. admin1).
2 - The script installs a program and, before it ends, checks for and if necessary creates a folder at the path C:\Users\user1\foo.
NB: I do not know in advance the name of the user who is logged in when the program runs.
You can use query.exe to pull the current users, then filter for the active user that isn't you.
$user = (((& query user) | ? {$_ -like "*active*" -and $_ -notlike "*AdminUserName*"}).trim() -Replace '\s+',' ' -Split '\s')[0]
#Credit to Jaap Brasser https://gallery.technet.microsoft.com/scriptcenter/Get-LoggedOnUser-Gathers-7cbe93ea
Then convert that to a SID and match it to the SID returned by Win32_UserProfile:
$NTUser = New-Object System.Security.Principal.NTAccount($user)
$SID = ($NTUser.Translate([System.Security.Principal.SecurityIdentifier])).value
$directory = (gcim Win32_UserProfile | ? {$_.sid -eq $SID}).localpath
if(!(Test-Path -Path (Join-Path $directory FOO))){
New-Item -Path $Directory -Name "FOO" -ItemType "directory"
}
Copy-Item "testo.txt" -Destination (Join-Path $directory FOO)
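As a possible simpler alternative (an assumption on my part, not from this thread): if the target user owns the console session, Win32_ComputerSystem exposes the interactively logged-on user directly, which avoids parsing query.exe output:

```powershell
# Sketch: resolve the console user's profile path via CIM.
# Only covers the interactive console session; UserName is $null for RDP-only logons.
$loggedOn = (Get-CimInstance -ClassName Win32_ComputerSystem).UserName  # e.g. 'DOMAIN\user1'
$NTUser = New-Object System.Security.Principal.NTAccount($loggedOn)
$SID = $NTUser.Translate([System.Security.Principal.SecurityIdentifier]).Value
$directory = (Get-CimInstance -ClassName Win32_UserProfile | ? {$_.SID -eq $SID}).LocalPath
```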

Create folders for AD users using a CSV file produces wrong folder names

I have created a CSV file listing certain Active Directory users. Now I want to use this CSV to create a set of folders for those users. I started with
$UserList = Import-Csv .\users.csv
and continued with
ForEach ($UserName in $UserList) {
$UserName
New-Item -Name $Username -ItemType directory -Path .\Download\$UserName
New-Item -Name $Username -ItemType directory -Path .\Home\$UserName
New-Item -Name $Username -ItemType directory -Path .\Publishing\$UserName
}
What I expect is that each folder Download, Home and Publishing contains a subfolder having the username, e.g. testuser.
When I run this script, the result for the folder names is #{name=testuser} which is not the expected result.
Any ideas how to solve this naming problem?
You need to dereference the property.
If you run this:
$UserList = Import-Csv .\users.csv
$UserList[0]
You should see something like this:
name
----
testuser
The name heading there tells you it's a property of the object $UserList.
For the solution, you could do this:
ForEach ($User in $UserList) {
$UserName = $User.Name
$UserName
New-Item -ItemType Directory -Path .\Download\$UserName
New-Item -ItemType Directory -Path .\Home\$UserName
New-Item -ItemType Directory -Path .\Publishing\$UserName
}
Or this:
ForEach ($User in $UserList) {
$User.Name
New-Item -ItemType Directory -Path .\Download\$($User.Name)
New-Item -ItemType Directory -Path .\Home\$($User.Name)
New-Item -ItemType Directory -Path .\Publishing\$($User.Name)
}
Or, alternately, you could get just the names on import like this:
$UserNameList = Import-Csv .\users.csv | Select-Object -ExpandProperty Name
ForEach ($UserName in $UserNameList) {
$UserName
New-Item -ItemType Directory -Path .\Download\$UserName
New-Item -ItemType Directory -Path .\Home\$UserName
New-Item -ItemType Directory -Path .\Publishing\$UserName
}
However, if there are other values in your CSV file that you want to use, this last option isn't a good solution because you're only importing the name.

powershell - copy specific user folders based on last modified

I need to copy the Documents, Favorites, and Desktop folders for each user that has logged in to the PC in the last 30 days.
I'm just starting to learn PowerShell and I think I've got a decent start, but I'm wasting too much time. This is a project for work, and I've found myself digging for a solution to one problem only to run into another at the next turn. I've spent about a month trying to get this sorted out so far, and have thrown away a lot of code.
What I have as of right now is this:
Get-ChildItem -path c:\users |Where-Object { $_.lastwritetime -gt (get-date).AddDays(-30)}
I know that this line will return the user folders that I need. At this point, I need code that will go into each child item from above and pull out the Documents, Favorites, and Desktop folders.
Now the tricky part: I need the code to create a folder on C:\ named after the user it is pulling those folders from.
So the solution should:
for each user logged in in last 30 days;
copy Documents, Favorites, Desktop folder from their user drive
create a folder on c:\ for that user name
paste Documents, Favorites, Desktop to that folder
To better cover the scope:
I have to reimage PCs a lot in my department. The process of "inventorying" a PC is copying those folders and restoring them on the new PC I image for the user. That way their desktop etc. looks and functions the same when they get their new PC. This code will be part of a larger script that ultimately "inventories" the entire PC for me... Ultimately, I want to be able to run my script for 2 seconds and then pull those folders and documents off the C: drive of that PC, as opposed to clicking a hundred times for the 9 users that have used the PC in the last 30 days.
Any ideas?
2dubs
$usersFoldr = Get-ChildItem -path c:\users | Where-Object { $_.lastwritetime -gt (get-date).AddDays(-30)}
foreach ($f in $usersFoldr){
$toFld = "c:\usrTest\" + $f.Name +"\Desktop\"
New-Item $toFld -type directory -force
Get-ChildItem ($f.FullName + "\Desktop") | Copy-Item -destination $toFld -Recurse -Force
}
Thanks to @bagger for his contribution. He was close.
After some experimentation, I found that this is the actual solution:
$usersFoldr = Get-ChildItem -path c:\users | Where-Object {
$_.lastwritetime -gt (get-date).AddDays(-30)}
foreach ($f in $usersFoldr)
{
$doc = "c:\users\$f\documents"
$toFldDoc = "c:\$f\documents"
New-Item $doc -type directory -force
Copy-Item $doc $toFldDoc -recurse -Force
}
foreach ($f in $usersFoldr){
$desk = "c:\users\$f\desktop"
$toFldDesk = "c:\$f\desktop"
New-Item $desk -type directory -force
Copy-Item $desk $toFldDesk -recurse -Force
}
foreach ($f in $usersFoldr){
$fav = "c:\users\$f\favorites"
$toFldFav = "c:\$f\favorites"
New-Item $fav -type directory -force
Copy-Item $fav $toFldFav -recurse -Force
}
Then save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
Then run that shortcut as an administrator. Works like gold.
Thanks for your help, guys! :)
For anyone interested in the whole script:
Inventory script to copy pertinent files for all users in last 30 days, gather printer hostname/driver/IP, gather serialnumber, gather make/model.
$usersFoldr = Get-ChildItem -path c:\users | Where-Object {
$_.lastwritetime -gt (get-date).AddDays(-30)}
foreach ($f in $usersFoldr){
$doc = "c:\users\$f\documents"
$toFldDoc = "c:\inventory\$f\documents"
New-Item $doc -type directory -force
Copy-Item $doc $toFldDoc -recurse -Force
}
foreach ($f in $usersFoldr){
$desk = "c:\users\$f\desktop"
$toFldDesk = "c:\inventory\$f\desktop"
New-Item $desk -type directory -force
Copy-Item $desk $toFldDesk -recurse -Force
}
foreach ($f in $usersFoldr){
$fav = "c:\users\$f\favorites"
$toFldFav = "c:\inventory\$f\favorites"
New-Item $fav -type directory -force
Copy-Item $fav $toFldFav -recurse -Force
}
Get-WMIObject -class Win32_Printer | Select Name,DriverName,PortName | Export-CSV -path 'C:\Inventory\printers.csv'
Get-WmiObject win32_bios | foreach-object {$_.serialnumber} | out-file 'c:\Inventory\SerialNumber.txt'
Get-WmiObject Win32_ComputerSystem | Select Model,Manufacturer | out-file 'c:\Inventory\MakeModel.txt'
Again, save this file, send a shortcut of it to the desktop, then change the target of the shortcut to this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -f "C:\YOURDIRECTORY\YOURSCRIPTNAME.ps1"
You can also retrieve a list of installed software by adding this line to the script:
get-wmiobject win32_product | select Name | export-csv -path 'c:\inventory\software.csv'

Powershell filter contents to find file or directory

I'm reading in contents of a text file that includes files and folders that I'm applying audit settings to. The folders have a few different settings on the SACL so I wanted to filter to find all the files and do an action and then do a different action to only the directories. I'm having trouble finding a way to filter/differentiate between the two. I'm using PowerShell v2.0, upgrading may not be a possibility.
Here's the code I have, I know it doesn't work, but gives an idea of my thinking:
import-module NTFSSecurity
$Targetfiles = Get-Content c:\temp\test.txt
if($Targetfiles -match ".exe or .dll or .msc"){
$files = $Targetfiles -match ".exe or .dll or .msc"
foreach ($File in $files){
Get-NTFSAudit -Path $files | Remove-NTFSAudit -PassThru
Add-NTFSAudit -Path $files -Account 'NT Authority\Everyone' -AccessRights FullControl -Type Failure -AppliesTo ThisFolderOnly
Add-NTFSAudit -Path $files -Account 'NT Authority\Everyone' -AccessRights ExecuteFile, AppendData, Delete, ChangePermissions, TakeOwnership -Type Success -AppliesTo ThisFolderOnly
}
}
else{
$directories = $Targetfiles -notmatch ".exe or .dll or .msc"
foreach ($Directory in $directories){
Get-NTFSAudit -Path $directories | Remove-NTFSAudit -PassThru
Add-NTFSAudit -Path $directories -Account 'NT Authority\Everyone' -AccessRights FullControl -Type Failure -AppliesTo ThisFolderOnly
Add-NTFSAudit -Path $directories -Account 'NT Authority\Everyone' -AccessRights ExecuteFile, AppendData, ReadData, CreateFiles, Delete, ChangePermissions, TakeOwnership -Type Success -AppliesTo ThisFolderOnly
}
}
Obviously the -match/-notmatch isn't working. I want the script to check for all the items with an extension, put them into $files and then do the work and anything that doesn't have an extension, go into $directories and do that work. I'm still learning, so my logic may not work. I've tried "." with -match, but it doesn't seem to do anything.
The module can be found here: https://gallery.technet.microsoft.com/scriptcenter/1abd77a5-9c0b-4a2b-acef-90dbb2b84e85#content
Thank you for any help!
I've not worked with that particular module so I can't debug anything there, but for your logic this is what I'd use:
Import-Module NTFSSecurity;
Get-Content C:\Temp\test.txt | ForEach-Object {
If (Test-Path $_ -PathType Leaf) {
If (('.exe','.dll','.msc') -contains (Get-Item $_).Extension) { # '-contains' works on PowerShell v2; '-in' requires v3+
#Apply executable file rule
}
} ElseIf (Test-Path $_ -PathType Container) {
#Apply folder rule
} Else {
Write-Output "Path not found: $_";
}
}
Import the module.
Get the content of the file
For each line of the content...
If it's a leaf object (aka, a file), check if it has the .exe, .dll, or .msc extension. If so, apply the NTFSSecurity commands for files.
If it's a container (aka, a directory), apply the NTFSSecurity commands for directory.
If it wasn't a leaf or a container, it's because it wasn't a valid path at all.
My syntax is a bit pedantic and could be improved, but this should get you started. I'm not sure exactly what you're trying to accomplish, so this is the most straightforward version.
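If you do want the two buckets described in the question ($files and $directories) instead of per-item handling, the same Test-Path split can be sketched as (assuming the input file from the question):

```powershell
# Sketch: partition the paths from the input file into files and directories,
# then run the file-specific and folder-specific NTFSSecurity commands on each bucket.
$Targetfiles = Get-Content c:\temp\test.txt
$files = $Targetfiles | Where-Object { Test-Path $_ -PathType Leaf }
$directories = $Targetfiles | Where-Object { Test-Path $_ -PathType Container }
```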