Issue with permissions when creating and copying directories - PowerShell

I have created a PowerShell script for copying files to a directory. The script first creates a folder, forcing creation even if it already exists, then copies a directory from another location. After copying the files, I need to copy the correct web.config depending on a value given by the user executing the script. The issue I am having is that I can copy the files, but they are all set to read-only, so when I try to copy the correct web.config over the top, the script fails with access denied.
This is a cut-down version of the script for simplicity:
$WebApp_Root = 'C:\Documents and Settings\user\Desktop\Dummy.Website'
$Preview_WebApp_Root = 'c:\applications\Preview\'
$Choice = read-host("enter 'preview' to deploy to preview, enter Dummy to deploy to Dummy, or enter test to deploy to the test environment")
if (($Choice -eq 'Preview') -or ($Choice -eq 'preview'))
{
$Choice = 'Preview'
$Final_WebApp_Root = $Preview_WebApp_Root
}
write-host("Releasing Build to " + $Choice +'...')
write-host("Emptying web folders or creating them if they don't exist... ")
New-Item $Final_WebApp_Root -type directory -force
write-host("Copying Files... ")
Copy-Item $WebApp_Root $Final_WebApp_Root -recurse
write-host("Copy the correct config file over the top of the dev web config...")
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config
write-host("Copying correct nhibernate config over")
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config
write-host("Deployed full application to environment")

Try using the -Force parameter to replace read-only files. From the documentation:
PS> help Copy-Item -Par force
-Force [<SwitchParameter>]
Allows the cmdlet to copy items that cannot otherwise be changed,
such as copying over a read-only file or alias.
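
Applied to the script above, the copies would look like this (same paths and variables as in the question):
# -Force lets Copy-Item overwrite the read-only files left by the first copy
Copy-Item $WebApp_Root $Final_WebApp_Root -Recurse -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config -Force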

Related

Copy-Item cmdlet only working correctly when the destination folder exists

I want to copy folders with their contents to a remote computer using a PSSession and Copy-Item. When running the script for the first time, it has to create the destination folder; it does so correctly, and is then supposed to dump the folders with their contents into the destination folder. What it instead does is dump two of the folders correctly and then dump the contents of the third folder, not the folder itself. When I run it a second time without deleting the destination folder, everything runs fine.
I have tried using various different parameters, including -Container, but it doesn't seem to help at all. Here is where I use the function in my code. I use a lot of environment variables, and variables in general, because this needs to be a script that can be put anywhere and work.
if (Test-Path -Path "$env:TEMP\VMlogs") {
    Write-Host "I'M GONNA SEND IT!"; Pause
    Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$Destination`_$Source" -Force -Recurse
    Write-Host "Logs copied successfully!"
    Remove-Item "$env:TEMP\VMlogs" -Recurse
} else {
    Write-Host "There was an issue copying logs!"
    Pause
    Exit
}
What I expect is that the folders are put into the destination folder with their structure intact, but instead this only happens on the second run of the script, after the destination folder has already been created.
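
One likely fix, given that the second run works once the folder exists, is to pre-create the remote destination before copying. A minimal sketch under that assumption, reusing the question's $Targetsession, $Destination and $Source variables:
# Pre-create the remote destination so every run behaves like the "second run";
# $Targetsession, $Destination and $Source come from the surrounding script.
Invoke-Command -Session $Targetsession -ScriptBlock {
    param($Path)
    if (-not (Test-Path $Path)) {
        New-Item -Path $Path -ItemType Directory | Out-Null
    }
} -ArgumentList "$Destination`_$Source"
Copy-Item -Path "$env:TEMP\VMlogs\*" -ToSession $Targetsession -Destination "$Destination`_$Source" -Force -Recurse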

How do I copy multiple files from multiple hosts in PowerShell?

I am trying to make a PowerShell script (5.1) that will copy several files and folders from several hosts. These hosts change frequently, so it would be ideal if I could use a list that I can append to when required.
I have this all working using xcopy, so I know the locations exist. I want to ensure that if a change is made when I am not in work, someone can just add or remove a host in the text file and the backup will continue to work.
The code I have is supposed to go through each host in my list of hosts and copy all the files from the list of file paths before moving on to the next host.
But there are 2 errors showing up:
The term '\REMOTEHOST\c$\Users\Public\desktop\back-up\$Computers' is not recognized as the name of a cmdlet, function, script
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:8 char:17
and:
copy-item : Cannot find path '\HOST\C$\LK\Appdata\Cmmcfg C$\LKAppData\Errc C$\LK\Appdata\TCOMP C$\LK\Probes C$\LK\Appdata\CAMIO C$\LK\Appdata\LaunchPad C$\LK\Appdata\Wincmes
C$\barlen.dta C$\Caliprogs C$\Cali' because it does not exist.
This does not seem to be reading through the list as I intended; I have also noticed that the HOST it is reading from is 6th in the list and not first.
# This file contains the list of hosts you want to copy files from
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
# This is the file/folder(s) you want to copy from the hosts in the $computers variable
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'
# The destination location you want the file/folder(s) to be copied to
$destination = \\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers
foreach ($item in $computers) {
}
foreach ($item in $source) {
}
copy-item \\$computer\$source -Destination $destination -Verbose
Your destination variable needs to be enclosed in quotes. To have it evaluate other variables inside of it, enclose it in double quotes. Otherwise PowerShell thinks it's a command you are trying to run.
$destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers"
Cracked it, thank you for your help. I was messing up the foreach command! I had both variables set to $item, so I was confusing things!
foreach ($itemhost in $computers) {
    $destination = "\\Remotehost\c$\Users\xoliver.jeffries\desktop\back-up\$itemhost"
    foreach ($item in $source) {
        Copy-Item "\\$itemhost\$item*" -Destination $destination -Verbose -Recurse
    }
}
It's not the neatest output, but that's just a snag! The code now lets me use a list of hosts and a list of files and copy them to a remote server!
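
For reference, a slightly tidier sketch of the same loop; the ***FILEPATH*** and **REMOTEHOST** placeholders are kept from the question, and New-Item pre-creates the per-host destination if it is missing:
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'
foreach ($computer in $computers) {
    # one backup folder per host, created on first use
    $destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$computer"
    if (-not (Test-Path $destination)) {
        New-Item -Path $destination -ItemType Directory | Out-Null
    }
    foreach ($path in $source) {
        Copy-Item -Path "\\$computer\$path" -Destination $destination -Recurse -Verbose
    }
}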

PowerShell Active Directory Login Script Auto-Build

I've created an Active Directory account creation script using PowerShell 4.
My boss has stated there's a new policy where we have to build a login script per user. Is there a way to do this where it'll build the .bat file and map the drives that we specify within the script?
I know there's a way to build .txt files, but I'm not sure about .bat.
What I need:
- Select the drives that the user needs access to.
- Build a .bat file mapping the drives previously specified.
- Move it to the login script folder on the DC, mapped to S:.
For future reference for anybody who wants to do this: I've managed to resolve it myself after some playing around.
$NewName = $SAMAccountName
$extension = ".bat"
$FileName = "$SAMAccountName$extension"
$ScriptDrive = "\\IPREMOVED\scripts"
Write-Output "
BAT CONTENTS" `n`n|FT -AutoSize >>LoginScript.txt
Get-ChildItem LoginScript.txt | Rename-Item -NewName $FileName
Move-Item -Path ".\$FileName" -Destination $ScriptDrive
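
To fill in the "BAT CONTENTS" placeholder, something like the following could generate the actual net use lines. The $DriveMap hashtable and its share paths are hypothetical examples, not from the original script:
# Hypothetical drive-to-share map; adjust the shares you assign per user
$DriveMap = @{
    'P:' = '\\IPREMOVED\public'
    'H:' = "\\IPREMOVED\home\$SAMAccountName"
}
$BatLines = foreach ($drive in $DriveMap.Keys) {
    "net use $drive $($DriveMap[$drive]) /persistent:no"
}
Set-Content -Path ".\$FileName" -Value $BatLines
Move-Item -Path ".\$FileName" -Destination $ScriptDrive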

How can I use PowerShell to keep two directories updated with the latest files

I have two directories:
C:\G\admin\less
C:\G\user\less
Inside of those directories I have multiple less and css files. I know all the names of the files, so I would like to hardcode a list of them into the script. Perhaps this could be in an array or something, but my knowledge of PowerShell is not even enough to know whether the scripting language has arrays.
C:\G\admin\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
C:\G\user\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
Is there a way I can use PowerShell to check each of these files (which I want to hardcode in my program) one by one and copy the last-modified file from its directory to the other directory, so that both directories contain the same latest versions of these files?
Note that I would need to evaluate the predefined list of files one by one, comparing the modified date in one directory with the other and copying over as needed.
Again, assume that this isn't the best solution or approach.
This solution assumes the following:
- When the LastWriteTime in one folder is newer than in the other, the file is copied to the other folder.
- I'm not doing path validation out of laziness, but if you want path validation, just ask.
- I'm assuming that all the files in those folders must be tracked; otherwise read the comment in the code.
- I suggest you back up your folders before you run the script.
#if there are files you don't want to track in those folders (for example txt files)
#just write $userFiles = dir C:\G\user\less\ -Exclude "*.txt"
#if you don't want to track txt files but one of them should still be tracked among the other formats:
#$userFiles = dir C:\G\user\less\ -Exclude "*.txt" -Include "C:\G\user\less\Txtadditionaltotrack.txt"
$userFiles = dir C:\G\user\less\
$adminFiles = dir C:\G\admin\less\
foreach ($userFile in $userFiles)
{
    $exactAdminFile = $adminFiles | ? { $_.Name -eq $userFile.Name } | Select -First 1
    #my suggestion is to validate that it actually found the file.
    #Out of laziness I will not call Test-Path to validate it here;
    #I'm assuming both directories are exact copies of each other, so it will find the file.
    if ($exactAdminFile.LastWriteTime -gt $userFile.LastWriteTime)
    {
        Write-Verbose "Copying $($exactAdminFile.FullName) to $($userFile.FullName)"
        Copy-Item -Path $exactAdminFile.FullName -Destination $userFile.FullName -Force
    }
    else
    {
        Write-Verbose "Copying $($userFile.FullName) to $($exactAdminFile.FullName)"
        Copy-Item -Path $userFile.FullName -Destination $exactAdminFile.FullName -Force
    }
}
You can improve it: as written, the code always copies each file from one directory to the other. Inside the else you can add a check so that when the LastWriteTime is equal on both sides it doesn't copy at all; a minimal sketch of that guard follows. You can improve it in many ways; I hope you get the idea.
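
A minimal sketch of that equality guard, reusing the variables from the loop above:
if ($exactAdminFile.LastWriteTime -gt $userFile.LastWriteTime)
{
    Copy-Item -Path $exactAdminFile.FullName -Destination $userFile.FullName -Force
}
elseif ($exactAdminFile.LastWriteTime -lt $userFile.LastWriteTime)
{
    Copy-Item -Path $userFile.FullName -Destination $exactAdminFile.FullName -Force
}
# equal timestamps: the files are already in sync, so do nothing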
Below, find the code modified to achieve the hardcoded-list requirement.
PLEASE READ THE COMMENTS IN THE CODE.
NOTE THAT I'M NOT FOLLOWING BEST PRACTICE (avoiding unexpected errors, naming all variables correctly, ...).
#to make it more dynamic you can keep the file names
#(including extension) in a file, one per line,
#for example at C:\FilesToWatch\watcher.txt, and load it with
#$filestowatch = Get-Content C:\FilesToWatch\watcher.txt
$filestowatch = "felicio.txt", "marcos.txt"
$userFiles = dir C:\G\user\less\
$adminFiles = dir C:\G\admin\less\
#Optionally, instead of the if check below you can filter up front:
#$userFiles = dir C:\G\user\less\ | ? { $filestowatch -contains $_.Name }
#$adminFiles = dir C:\G\admin\less\ | ? { $filestowatch -contains $_.Name }
#Loaded that way, the first if statement in the loop below can be removed, because
#we make sure that $userFiles and $adminFiles only contain the files to monitor.
foreach ($userFile in $userFiles)
{
    if ($filestowatch -contains $userFile.Name)
    {
        $exactAdminFile = $adminFiles | ? { $_.Name -eq $userFile.Name } | Select -First 1
        #as above, I'm skipping Test-Path validation and assuming
        #both directories contain the same set of files.
        if ($exactAdminFile.LastWriteTime -gt $userFile.LastWriteTime)
        {
            Write-Verbose "Copying $($exactAdminFile.FullName) to $($userFile.FullName)"
            Copy-Item -Path $exactAdminFile.FullName -Destination $userFile.FullName -Force
        }
        else
        {
            Write-Verbose "Copying $($userFile.FullName) to $($exactAdminFile.FullName)"
            Copy-Item -Path $userFile.FullName -Destination $exactAdminFile.FullName -Force
        }
    }
}
I think what you need is symbolic links, aka symlinks.
With symlinks, you can define files and folders that will always be in sync: the link and the original resolve to the same data, so a change to one is immediately visible through the other.
To create a symbolic link, enter the following in the console/command prompt:
mklink /[command] [link path] [file or folder path]
Mklink can create several types of links, according to these commands:
/D – creates a directory symbolic link (a soft link, similar to a standard folder shortcut in Windows). Note that with no switch at all, mklink creates a file symbolic link.
/H – creates a hard link to a file.
/J – creates a directory junction, an older kind of folder link that behaves much like a directory symbolic link.
The syntax is simple. Choose your option, define the path you want for the symlink, and finally the path of the original file/folder.
For example, imagine I'm developing a new project and I want to share it with my client via a Dropbox shared folder. I don't want to move my whole workspace to Dropbox; I just want to share that specific folder with them:
mklink /J C:\Dropbox\clients_shared_folders\project_x C:\my_dev_rootfolder\project_x
Note that the first path is the symbolic folder I want to create, while the second path is the existing directory.
In your case, I'll assume you're working in the admin folder and want to generate a synced copy in the user folder:
mklink /J C:\G\user\less C:\G\admin\less
Here's a nice article for more info:
http://www.howtogeek.com/howto/16226/complete-guide-to-symbolic-links-symlinks-on-windows-or-linux/
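
If you'd rather not shell out to cmd.exe, PowerShell 5+ can create the same junction natively; a sketch, noting that the link path must not already exist when you create it:
# PowerShell-native equivalent of the mklink /J example above
New-Item -ItemType Junction -Path C:\G\user\less -Target C:\G\admin\less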

Need a PowerShell script that copies files from the directory the script is run from to another location

I was asked to write a PowerShell script that they can package in with their build updates. They will complete a build that gets dropped to a folder (say \\server\build\release1.1.2). We need a script that takes all the files/folders from that folder and copies them to the appropriately named locations.
I also need the script to read the number of the current build from the folder title and create that same build folder when it copies. Easy enough to do that; however, I need the references to all be dynamic, so that when Release1.1.3 comes out we can drop the same script in there and it will copy all the files to the appropriate directories (and create them if they don't exist).
This script should get you started. Run it to see an example of the values it produces.
# variable name chosen based on the automatic variable available to PowerShell Modules
$PSScriptRoot = ($MyInvocation.MyCommand.Path | Split-Path | Resolve-Path).ProviderPath
$BuildName = $PSScriptRoot | Split-Path -Leaf
#"
This script file is located at:
$($MyInvocation.MyCommand.Path)
The folder this script file is in is:
$PSScriptRoot
The name of the folder this script file is in is:
$BuildName
To copy files you might do this:
Copy-Item -Path `$PSScriptRoot\* -Destination C:\Install\`$BuildName -Recurse
"#