Copy file from a Network Drive to a local Drive with a Jenkins Agent - powershell

So here is the situation: I am trying to automate copying some files from a network drive into a local folder on one of my servers. The task seems simple, and when I try the code with PowerShell or with xcopy on the command line, both work fine.
I've installed a Jenkins agent on this Windows Server 2016 machine and run the agent as a service. When I try to run the same code from the Jenkins agent, it never works.
I tried starting the agent service as Local System and as the Windows network administrator who has all the rights.
I tried these lines with PowerShell:
Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
and
Copy-Item -Path "z:\*" -Destination "D:\Directory\" -Verbose
Both return no error but did not copy the files. When I tried the same operation with xcopy, I just got "file not found" and nothing was copied:
xcopy "\\server IP\directory\*" "D:\Directory\" /f /s /h /y
xcopy "z:\*" "D:\Directory\" /f /s /h /y
With PowerShell, I also tried putting the Copy-Item command into a script and having the Jenkins agent call only the script; that didn't work either.
I am now running in circles and wonder: how are we supposed to work with network drives from a Jenkins agent? Or what am I doing wrong?
Note that other PowerShell code runs fine locally.

I tried starting the agent service as Local System and as the Windows network administrator who has all the rights.
Local System doesn't have any network permissions by default. It is the machine account, so you would have to give the machine account access to "\\server\share". That is not advisable, though, because permissions should be granted at a finer granularity. Also, Local System has far more local rights than Jenkins needs.
I don't know what you mean by "Windows Network Administrator", but it sounds like that account would also have too many rights.
Usually, one creates a dedicated Jenkins (domain) user. This user is granted access to whatever network paths it needs. Then you have two options:
1. Always run the Jenkins service as this user (the easiest way).
2. Run Jenkins under another local user and connect to network drives as the domain user only on demand: net use \\server\share /user:YourDomain\Jenkins <Password>. This adds some security, as you don't need to give the domain user any local permissions. A sketch of this approach follows below.
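A minimal sketch of option 2, mapping the share only for the duration of the copy. The share path, account, and destination are the placeholders used in this thread, and in a real setup the password would come from a credential store:
# Map the share as the dedicated domain user, copy, then always clean up
$password = 'PlaceholderPassword'   # assumption: fetched securely in practice
net use '\\server\share' $password /user:YourDomain\Jenkins
try {
    Copy-Item -Path '\\server\share\*' -Destination 'D:\Directory\' -Verbose
}
finally {
    net use '\\server\share' /delete
}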
Both return no error but did not copy the files
To improve error reporting, I suggest you set $ErrorActionPreference = 'Stop' at the beginning of your PowerShell scripts. This way the script will stop execution and show an error as soon as the first error happens. I usually wrap my PS scripts in a try/catch block to also show a script stack trace, which makes it much easier to pinpoint error locations:
$ErrorActionPreference = 'Stop' # Make the script stop at the first error
try {
    Copy-Item -Path "\\server IP\directory\*" -Destination "D:\Directory\" -Verbose
    # Possibly more commands...

    # Indicate success to Jenkins
    exit 0
}
catch {
    Write-Host "SCRIPT ERROR: $($_.Exception)"
    if( $_.ScriptStackTrace ) {
        Write-Host ( "`t$($_.ScriptStackTrace)" -replace '\r?\n', "`n`t" )
    }
    # Indicate failure to Jenkins
    exit 1
}
In this case the stack trace is not very helpful, but it will come in handy when you call functions from other scripts, which in turn may call other scripts too.

Related

Run powershell script within powershell script (Intune related)

This is Intune related but could probably apply outside the scope of Intune as well.
I wrote a PowerShell script that downloads a folder from an Azure blob storage and extracts the content. Within the extracted folder is another PowerShell script that I want to run.
The PowerShell script is deployed in Intune and runs successfully all the way up to the point where the second PowerShell script runs. From the log, it's running into a permission issue.
Scripts deployed through Intune are run as administrator/SYSTEM and don't require any local policy change to allow the execution of PowerShell scripts on the device. However, the user account on the device is only a standard user, so they don't have permission to execute PowerShell scripts. In the first script, I've already included a "Set-ExecutionPolicy Bypass" command.
I need to be able to deploy the script from Intune to the local device and essentially run another script as the local user (non-administrator account). I thought maybe the local user needed to be in the local Administrators group to be able to run the second script, but that did not work either.
I also read somewhere that PowerShell can run PowerShell scripts from other PowerShell scripts directly. The only time you need Start-Process for that is when you want to run the called script with elevated privileges (which isn't necessary here, since the parent script is already running elevated).
^^^ Is this my issue? My script does use Start-Process to run the next PowerShell script.
Script below for reference.
New-Item -Path "C:\IT Drivers" -ItemType Directory ;
Invoke-WebRequest -Uri 'https://xyz.zip' -OutFile "C:\xyz.zip" ;
Expand-Archive -Path "C:\xyz.zip" -DestinationPath "C:\" ;
Set-ExecutionPolicy Bypass ;
Start-Sleep -Seconds 30 ;
Start-Process powershell "C:\xyz.ps1"
Any guidance would be appreciated, thank you!
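Per the quoted advice, the Start-Process line can likely be replaced with a direct invocation, so the second script runs in the same elevated context and its errors surface in the Intune log. A minimal sketch, reusing the hypothetical C:\xyz.ps1 path from the question:
& "C:\xyz.ps1"
# or, to force a fresh process with an explicit execution policy:
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\xyz.ps1"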

Group Policy Issue

I'm hoping you can help. For a bit of background, one of our Domain Controllers is Server 2008 R2 and the other is 2012 R2; the 2012 R2 one contains the more recent ADMX files. The primary DC is the 2008 version, and we plan on upgrading it soon.
I have a Default Domain Policy that contains a basic logon script (.bat) that maps drives and adds documents to users' desktops. This policy is Enforced and Link Enabled and pushes to all OUs.
I have an additional policy set up within an OU that is linked to our domain; this includes a PowerShell script that runs at logon. It was created on the secondary DC (2012), as it uses the "Start Screen Layout" feature within Start Menu and Taskbar - the primary DC (2008) doesn't have this feature available.
For some reason the Default Domain Policy isn't running; the second policy, however, runs successfully. Is there any reason this might be happening? I've played around with the enforcement options for the second policy, but I can't fathom why it's not running both.
It's definitely something to do with the PowerShell commands I have running in the second policy; if I remove these, the default policy logon scripts run fine.
Some More Info
To explain, there are three scripts in total (all of which sit in the policy location within SYSVOL). The first .ps1 file copies a shortcut for Devices and Printers from a network location that all users have access to and pastes it into %APPDATA% for each user. Part of this script uses the startlayout.xml to reference this location, which then adds the shortcut to the user's Start menu.
Copy-Item -Path "\\Server\Share\*.lnk" -Destination "$env:APPDATA\Microsoft\Windows\Start Menu\Programs"
The next command removes a load of the bloatware that Windows 10 installs to a user's profile. I won't post all of the apps in the script, but here are a few so you get the gist:
$AppList = @(
    "*Microsoft.3dbuilder*"
    "*AdobeSystemsIncorporated.AdobePhotoshopExpress*"
    "*Microsoft.WindowsAlarms*"
    "*Microsoft.Asphalt8Airborne*"
)
foreach ($App in $AppList) {
    Get-AppxPackage -Name $App | Remove-AppxPackage -ErrorAction SilentlyContinue
}
The script that is added as the actual logon script references the location the two scripts above sit in and runs them:
Get-ChildItem \\domain\SysVol\domain\Policies\'{Policy Number}'\User\Scripts\Logon | ForEach-Object {
    & $_.FullName
}
Hope this helps. I'm not sure whether the lack of an exit command matters, as I'm not too savvy with PowerShell scripting.
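For what it's worth on that exit-code question: if the wrapper needs to report failures to whatever launches it, it could propagate one explicitly. A minimal sketch under the same hypothetical SYSVOL path (the skip-self filter is an assumption, since the wrapper sits in the same folder as the scripts it runs):
# Run every other .ps1 in the policy folder and remember any failure
$self = $MyInvocation.MyCommand.Path
$failed = $false
Get-ChildItem '\\domain\SysVol\domain\Policies\{Policy Number}\User\Scripts\Logon' -Filter *.ps1 |
    Where-Object { $_.FullName -ne $self } |
    ForEach-Object {
        & $_.FullName
        if (-not $?) { $failed = $true }
    }
if ($failed) { exit 1 } else { exit 0 }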
Further Update
Hi all, another update. I've now made all my drive mappings and document copies through Group Policy, which eliminates the first policy; there is only one logon script left to run, the one that removes the bloatware. I've decided to try calling it from a .bat file using the following command:
PowerShell.exe -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -WindowStyle Hidden -File \"\\domain\SysVol\domain\Policies\{Policy Number}\User\Scripts\Logon\BloatwareRemoval.ps1\"' -Wait}"
Again, this will only run after a user logs on for the second time. Is there something glaringly obvious I'm missing here? Thanks in advance.

Script location of a remotely executed script?

How can I get a remotely executed script to know its own location? I'm using Invoke-Command to run a script on a remote server. The script needs to create files in the directory where it lives. Relative addressing (i.e. .\output.log) doesn't work; the output files generally end up in my user profile on the remote server. I tried all the methods outlined in this question, but none of them seem to work when the script is remote.
Update: Provided script invocation code per request
$server = 'ad1hfdahp802'
$remotepath = '\\ad1hfdahp802\d$\AHP\pi_exceed_presentation\pi_exceed_presentation_deploy.ps1'
$SDFEnvironment = 'INT'
Invoke-Command -ComputerName $server -FilePath $remotepath -ArgumentList $SDFEnvironment, $remotepath
The remote script takes the $remotepath and turns it into a file system path.
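That conversion isn't shown in the question; a minimal sketch of one way to do it, assuming the administrative-share layout (\\host\d$\...) used in the example:
# Turn '\\ad1hfdahp802\d$\AHP\...\deploy.ps1' into 'd:\AHP\...\deploy.ps1'
$localPath = $remotepath -replace '^\\\\[^\\]+\\(\w)\$', '$1:'
$scriptDir = Split-Path -Parent $localPath   # the directory the script lives in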
Using -FilePath with Invoke-Command means that the script is read locally and its content is sent as the scriptblock to the remote computer. $PSScriptRoot only works when the script is executed directly on the target. You could try using:
Invoke-Command -ComputerName "computer1" -ScriptBlock { & '\\server\path\to\script.ps1' } -Authentication Credssp
Be aware that you need CredSSP to make this work, since the remote computer can't use your credentials to access network resources without it. As an alternative, you could use psexec (or start a process remotely), e.g.:
psexec \\computer1 powershell -noprofile -file \\server\path\to\script.ps1
After trying some of the proposed changes, I've come to understand that Invoke-Command isn't actually running the remote script at its original location; rather, it loads the script from that location and runs it in the context of PowerShell as the user running the local script. The "script directory" is actually a directory in that user's workspace, regardless of where the script originally lived.
This clarifies things for me somewhat. While there may be ways to divine where the script originally came from, or to actually start a session on the remote server and then run the script as a "local" script there, the need for the remote script to access further servers (creating multiple hops in authentication) means I have to add CredSSP to the mix.
It seems my original plan, passing the path I use to locate the script into the script itself so it can place output files in the original directory, is probably the best approach, given that I have to add CredSSP anyway.
I'm open to refutation, but I don't think any of the proposed solutions actually improve the functionality of the remote script, so I'm going to stick with what I started with for now. Thanks to everyone for their contributions.
Enter a session on the remote server, and call the script from there.
local PS> Enter-PSSession -ComputerName $server ...
remote PS> powershell d:\AHP\...\script.ps1
remote PS> exit
local PS>
Then you can use $PSScriptRoot inside the script to get the local path of the script's directory on the remote server.
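For instance, inside the remote script the output location could be derived like this (output.log is the file name mentioned in the question):
# Write output next to the script itself, wherever it lives on the remote server
$logFile = Join-Path $PSScriptRoot 'output.log'
"deployment started $(Get-Date)" | Out-File -FilePath $logFile -Append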
EDIT:
To locate the script on the remote server, you can use your knowledge of the network path of the script file and parse the output of net share to map the network path to a local path on the remote server.
remote PS> net share | where { $_.StartsWith('D$ ') } | foreach { [regex]::Split($_, " +")[1]}
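Building on that, a hedged sketch that resolves the local root of the D$ share and rebuilds the script's local directory (the share and path names are the ones from this thread, not verified):
# Resolve the local root of the D$ share, then localize the UNC script path
$shareRoot = net share |
    Where-Object { $_.StartsWith('D$ ') } |
    ForEach-Object { [regex]::Split($_, ' +')[1] }   # e.g. 'D:\'
$localScript = $remotepath -replace '^\\\\[^\\]+\\d\$\\', $shareRoot
$scriptDir = Split-Path -Parent $localScript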

How do I issue a remote copy command?

The script I'm working on right now performs a bunch of work on the email server for processing litigation hold requests. One component of this script copies all the data from an identified folder on a file server to a secure backup location that can't be altered, which happens to reside on the same remote server. I have the script working fine, but it copies all the files across the network from the file server to the email server where the PowerShell script runs, and then pushes them right back to the same file server they came from, into the "hold" folder. This is creating severe delays, as some of these folders contain several thousand files ranging from a few bytes to several megabytes each.
I'm looking for a way for my PowerShell script running on the email server to issue a remote command to the file server to copy a folder from one directory to another (all on the same remote server). Any suggestions?
Currently, I'm using the robocopy command in my script but I'm open to other ways of performing this.
robocopy "Source Folder" "Destination Folder" /e
I'm looking for something like:
PSEXEC \\FileServer -s -d copy "Source folder" "Destination folder"
I just can't seem to find any material on the old google train that satisfies the situation.
psexec \\fileserver -s robocopy "source" "destination" /e
should work just fine. With PSRemoting enabled on the file server, you could also do:
Invoke-Command -ComputerName fileserver -ScriptBlock {
    & robocopy "source" "destination" /e
}
Note that this assumes local paths on the remote server. If you're working with network shares on the remote host, you'll run into the second hop problem.
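If the source or destination really were a network share, one hedged workaround for the second hop is CredSSP, assuming it has been enabled on both ends (fileserver is the example host from this thread; \\otherserver\source and D:\hold are hypothetical paths):
# One-time, elevated setup:
#   on the caller:   Enable-WSManCredSSP -Role Client -DelegateComputer fileserver -Force
#   on fileserver:   Enable-WSManCredSSP -Role Server -Force
$cred = Get-Credential
Invoke-Command -ComputerName fileserver -Authentication Credssp -Credential $cred -ScriptBlock {
    & robocopy '\\otherserver\source' 'D:\hold' /e
}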
For some strange reason, the first attempt to use PsExec prior to posting resulted in an error stating it didn't know how to handle the command. But I plugged the PsExec command in again and it worked perfectly today.
Thank you.

Execute remote quiet MSI installs from Powershell

I am trying to use the Invoke-Command PowerShell cmdlet to run an MSI installer. From within PowerShell on the local machine, and from the proper directory, the following works:
./setup /quiet
The following does not seem to work:
$script =
{
    param($path)
    cd "$path"
    & ./setup /quiet
    return pwd
}
return Invoke-Command -ComputerName $product.IPs -ScriptBlock $script -Args $sourcePath
For test purposes I am working on the local machine, passing in "." for the -ComputerName argument. The paths have been verified correct before being passed to Invoke-Command, and errors generated by different versions of this code indicate the paths are correct. I have also tried with and without the "&" on the remote call to setup. Other Invoke-Command calls are working, so I doubt it is a permissions issue. I have verified that the return from the pwd call is the expected directory.
How do I get the install to work?
What error (if any) are you receiving? Unfortunately, you must run the shell as admin on your local machine to be able to connect to your local machine with Invoke-Command or any WinRM-based command that requires administrative privilege (this is not a requirement when connecting remotely).
When connecting to loopback, I believe it is unable (for some security reason) to enumerate groups and determine whether you are in an admin-enabled AD or local group, which is how it auto-elevates when invoking on a remote machine. The only solution may be to have a conditional that checks for localhost and, if so, doesn't use the -ComputerName parameter.
This GitHub issue covers it.
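A minimal sketch of that conditional, reusing $script and $sourcePath from the question (the set of aliases treated as "local" is an assumption):
# Invoke the block directly when targeting this machine; use remoting otherwise
if ($product.IPs -in '.', 'localhost', $env:COMPUTERNAME) {
    & $script $sourcePath
}
else {
    Invoke-Command -ComputerName $product.IPs -ScriptBlock $script -ArgumentList $sourcePath
}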
You might try using Start-Process in your script block:
cd $path
Start-Process setup.exe -ArgumentList "/quiet"
Not sure if you will want or need to wait. Look at help for Start-Process.
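If waiting matters (it usually does for installers, so failures become detectable), a hedged variant that blocks until setup finishes and checks the exit code; setup.exe and the /quiet switch are taken from the question:
cd $path
$proc = Start-Process -FilePath .\setup.exe -ArgumentList '/quiet' -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    throw "setup failed with exit code $($proc.ExitCode)"
}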
I have had weird issues when trying to remotely execute a script on the local machine; in other words, remote PowerShell to the local machine. It comes back with an error that seems to say that PowerShell remoting is not enabled on the machine, even though it was. I can run the script remotely from another machine against the target, but when remoting to the same box, the issue crops up.
Verify that the WinRM service is running.
Verify powershell remoting has been enabled as in Enable-PSRemoting -force.
Verify your PowerShell execution policy is loose enough, as in Set-ExecutionPolicy Unrestricted, for example. If the policy was set to RemoteSigned, this might be the problem.
You might also want to verify that the user you are running the script as (locally, but using remoting) has the "log on as a service" or "log on as a batch job" privilege. Just guessing there, if the above list doesn't solve anything.
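A minimal sketch of those checks, run from an elevated PowerShell on the target machine (all standard cmdlets, nothing specific to this question):
Get-Service WinRM                    # should report Running
Test-WSMan -ComputerName localhost   # confirms the WinRM listener answers
Get-ExecutionPolicy                  # RemoteSigned or looser, per the advice above
Enable-PSRemoting -Force             # (re)enables remoting if the checks fail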