I have added a PowerShell script as a Group Policy computer startup script. The script runs and completes all of its tasks without issue. However, at the end of the script, it is supposed to copy a log file to a file share, which it is not doing. The file share shows that "SYSTEM" has full control, so I'm not sure what the issue is. I'm able to run the script as admin while on the same machine, and it will copy the log to the server without a problem. It does not do this via the computer startup script (under the SYSTEM account), though. Any ideas?
You will need to give the computer account write permissions on the network share. When the SYSTEM account is used to access a network resource it will do so as the domain account of the computer (DOMAIN\COMPUTER$).
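A minimal sketch of granting that, assuming a share called "Logs" backed by D:\Logs on the file server (the share name, path, and account names are examples); both the share permissions and the NTFS ACL need to allow the write:

# Share-level permission for the machine account (run on the file server)
Grant-SmbShareAccess -Name "Logs" -AccountName "DOMAIN\COMPUTER$" -AccessRight Change -Force

# NTFS permission on the underlying folder; (OI)(CI)M = modify, inherited by files and subfolders
icacls "D:\Logs" /grant "DOMAIN\COMPUTER$:(OI)(CI)M"

If every machine in the domain runs the startup script, granting "DOMAIN\Domain Computers" instead avoids per-machine entries.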
As a final step in our AD account creation process, which is being moved to a PowerShell script, a few folders need to be created on the filer for users, and I am coming unstuck with permissions.
I am just using the basic New-Item command to create the folders, but the locations need unix permissions (775) set before anything can be created. I can't go there, right-click in Windows Explorer, and click New..., and the PowerShell script is also being bounced due to permissions.
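For reference, this is roughly the call that gets bounced (the filer path is made up):

# Fails with access denied, since the target location expects unix 775 to be set first
New-Item -Path "\\filer\home\newuser" -ItemType Directory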
The reasoning from one of the tech guys here is that I am trying to create a subfolder via an SMB mount from Windows using NTFS permissions; there is no correlation to the unix permissions, so none of our Linux users will be able to access or use the location created for them.
Sorry if that is a clumsy way of explaining it; I am not a systems engineer, just the guy trying to translate a whole heap of Perl scripts into a new PowerShell process.
Thank you
S.
I've got a PowerShell script to archive log files. The script is intended to run daily from a scheduled task as a specified user, 'LogArchiver'.
The script uses the AWS CLI to copy the files to S3 and needs sufficient credentials to access the bucket, which are stored in the user directory C:\Users\LogArchiver\.aws as recommended in the AWS docs.
When I run the script from a PowerShell terminal running as that user, it recognises the credentials and successfully copies files to S3. But when it is run from the scheduled task, it doesn't recognise the AWS credentials, and writing the transcript to file shows the message:
Unable to locate credentials. You can configure credentials by running "aws configure".
Anyone know why this is, and any fixes for it? I've read in another post about scheduled tasks doing funny things to environment variables, but I'm not sure if that would cause the problems I'm having.
Turns out it was a bug in Server 2012, which is fixed by this patch:
https://support.microsoft.com/en-gb/kb/3133689
The 'fix' for me was to set the USERPROFILE environment variable at the top of the script:
$env:USERPROFILE = "C:\Users\LogArchiver"
Not elegant, but it works.
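In context, a minimal sketch of the workaround (the bucket name and paths are examples):

# On unpatched Server 2012 the scheduled task can start with USERPROFILE pointing
# at the wrong profile, so the AWS CLI never finds C:\Users\LogArchiver\.aws\credentials.
$env:USERPROFILE = "C:\Users\LogArchiver"

Start-Transcript -Path "C:\Logs\archive-transcript.log" -Append

# Example bucket and key; replace with your own
aws s3 cp "C:\Logs\app.log" "s3://my-log-archive/app.log"

Stop-Transcript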
I am using PuTTY to transfer files from my Windows machine to a Linux machine.
I am able to transfer when I run the script manually, and also if I run the same script from a scheduled task under my own credentials.
If I schedule the task to run using the system account (SYSTEM) or another user account, the file transfer does not happen.
Do I need to save any session values?
PuTTY saves session information in the registry for the current user only, so this information will simply not be available to the other accounts you mentioned. You either need to provide it, by exporting your sessions and importing them into the other users' accounts, or simply pass everything needed on the command line invoked to copy your files. The latter sounds much easier to me, in combination with a little script that gets invoked by the Task Scheduler.
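For example, a single pscp call that carries everything it needs on the command line, so nothing is read from a saved session (the paths, key file, and host-key fingerprint are placeholders):

# -batch: never prompt interactively (essential under SYSTEM)
# -i: private key for authentication
# -hostkey: pin the server's host key, since the SYSTEM account has no cached one
& "C:\Program Files\PuTTY\pscp.exe" -batch -i "C:\keys\transfer.ppk" -hostkey "SHA256:placeholder-fingerprint" "C:\data\report.txt" "user@linuxhost:/home/user/"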
I have a .exe file at C:\foo\myhost.exe which creates MSMQ queues on the system, and I need to run it as a specific user (a network account). I am looking for a script where a business user can log in to the server and click the exe, but have it run as the network account. I can Shift+right-click the exe, but I am looking to automate it.
Any help is appreciated
Use runas with the /savecred option:
http://anyadangtech.blogspot.ca/2009/06/runas-without-password.html
But you need to run it a first time to input the password; afterwards it will be remembered.
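For example (the account name is a placeholder; the first run prompts for the password, which /savecred then caches for later runs):

# First run prompts for the password; subsequent runs use the cached credential
runas /savecred /user:DOMAIN\NetworkAccount "C:\foo\myhost.exe"

You can put that line in a shortcut or a small batch file on the desktop so the business user only has to double-click it.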
I'm creating and testing some PowerShell scripts to do some basic file copying. I've set my execution policy to RemoteSigned. According to the help, this should allow me to run scripts that were not downloaded from the internet. However, my observations seem to indicate that it will only run scripts created on the local machine.
For instance, if I create a script on my development machine and copy it to my server (in the same domain), the script will not run. However, if I open the PowerShell ISE on the server, open my script, copy the code, paste it into a new file window, and save it to the server, the script then runs. Further, if I sign the script with a self-signed certificate, it will not run on other computers (per the help).
So, this all seems a bit cumbersome: I have to develop my scripts on the machine they are to run on, or go through the copy/paste routine mentioned above to get them to run on my server. I just want to know whether I've understood all of this correctly and there is no other way to create a script within the same domain and run it under the RemoteSigned execution policy without paying the fee for a certificate.
This post here provides a method for executing scripts from a shared folder. Hope this could help you :-)
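Alternatively, if the copied script is blocked because it picked up the mark-of-the-web (a Zone.Identifier alternate data stream) in transit, clearing that marker lets RemoteSigned run it as a local script; a quick sketch (the path is an example):

# Show the zone marker, if one is present
Get-Item "C:\Scripts\copyfiles.ps1" -Stream Zone.Identifier -ErrorAction SilentlyContinue

# Remove the marker so RemoteSigned treats the script as local
Unblock-File -Path "C:\Scripts\copyfiles.ps1"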