I'm trying to run a PowerShell backup script as a scheduled task, and as part of this I need to map a network drive (before copying the backup to the remote host).
Now, this was working when I had it like this:
NET USE $RemoteDir $BackupPass /USER:$RemoteUser
But due to security concerns, I can't keep the password in plain text, or anywhere on the server. So, my hope was that, since it's run as a scheduled task, 'NET' would be able to "inherit" the credentials from the user running the scheduled task, so that it could be run like this:
NET USE $RemoteDir
But it doesn't seem to work, when I log the error I get:
The password is invalid for <RemoteDir>.
Enter the user name for '<RemoteDir>':
So, my question is: is there any way to get this to work, i.e. to have NET USE pick up the credentials that the scheduled task is running with? Or do I need to find another approach?
Edit: Forgot to mention, the user running the scheduled task is a domain user, and it has access to the remote share.
Thanks!
Create a domain account. Assign the account permissions to the share and NTFS directory. Set the task to execute under the account you just created.
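For illustration, a minimal sketch of registering the task under such an account; the task name, script path, and DOMAIN\svc_backup are placeholders, and Get-Credential prompts once so the password never sits in a script:

# Prompt once at registration time; nothing is stored in plain text.
$cred = Get-Credential 'DOMAIN\svc_backup'
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\backup.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
# Register the task to run under the dedicated domain account.
Register-ScheduledTask -TaskName 'NightlyBackup' -Action $action -Trigger $trigger -User $cred.UserName -Password $cred.GetNetworkCredential().Password

Since the task then runs with that domain identity, a plain NET USE $RemoteDir with no stored password should authenticate with the account's own token.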
I don't have the solution, but a workaround.
You can store a password in a file locked down to, say, "domain admins", as in the suggestion above, and follow the directions at this link:
http://geekswithblogs.net/Lance/archive/2007/02/16/106518.aspx
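If I remember the linked post correctly, it boils down to encrypting the password with DPAPI so that only the same user on the same machine can decrypt it. A rough sketch, with the file path and account name as placeholders:

# One-time setup, run interactively as the task's account:
Read-Host 'Backup password' -AsSecureString | ConvertFrom-SecureString | Set-Content 'C:\Scripts\backup.cred'
# In the scheduled script, decrypt and rebuild a credential object:
$secure = Get-Content 'C:\Scripts\backup.cred' | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential('DOMAIN\backupuser', $secure)

Lock the .cred file down with NTFS permissions as suggested, since anyone who can run code as that account can still decrypt it.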
That seems odd; I manage workstation users from scheduled tasks without any problem, so it is possible to use the task's credentials. Maybe net use is just misbehaving. Have you tried New-PSDrive instead? A sketch follows.
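A minimal sketch, assuming \\backuphost\backups is the share (a placeholder) and the task's account already has access:

# Map the share with the identity the task is running under;
# no -Credential parameter, so the task account's token is used.
New-PSDrive -Name Z -PSProvider FileSystem -Root '\\backuphost\backups'
Copy-Item 'C:\Backups\*.bak' -Destination 'Z:\'
Remove-PSDrive -Name Z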
Greetings everyone,
Hoping someone has a quick insight: I am getting "access denied" for a service account when using the PowerShell cmdlet Remove-CMDevice.
The process is as outlined here: https://technet.microsoft.com/en-us/library/jj821759(v=sc.20).aspx
The account has permission to remove devices from SCCM, and this works fine through the GUI but not from the command line. I have been unable to find documentation on what permissions the account needs to do this via the command line.
If anyone can shed light on this it would be wonderful; I want to keep this service account's permissions as minimal as possible.
Many thanks,
Edit: added an image, as follows.
After a lot of testing, I'm back with an answer on the permissions required when using the PowerShell console to remove a CM system object. The symptom is the same: the account can delete the device from the admin console, but the same operation fails from PowerShell with a permission error.
The account running the Remove-CMDevice cmdlet must have the proper RBA (role-based administration) permissions on the object. Assuming the default security scope, the account connecting to the Configuration Manager console needs the RBA permissions below, which in my testing are close to minimal.
In the screenshot, the Collection permissions are easy to understand: Read, Delete Resource, and so on.
The Computer Association part may be confusing; why is it needed?
Here is how I troubleshot it:
I opened a PowerShell console connected to Configuration Manager with my test account and ran the command below to see what would happen:
Remove-CMDevice 'Rsuraceccc' -Verbose
I got an error: the cmdlet was trying to query SMS_StateMigration. So I tried running that query directly with a simple command:
Get-WMIObject -NameSpace root\sms\site_clt -Query 'Select * from SMS_StateMigration'
Once again I got an error, so I concluded that the account needs permission on SMS_StateMigration. I added the 'Recover User State' permission (under Computer Association) to the role and tried again; this time every command ran successfully.
I don't know why the cmdlet queries SMS_StateMigration, but that is the case here.
I am working on a project using PowerShell, and the challenge I have now is how to run PowerShell itself.
I have access to a domain credential that can log in to the server I am running it from, and I plan to use WQL queries as triggers to run the script at different times.
Is there a way to do this without leaving the credential information in plaintext? I have, and use, stored domain credentials within the script, but I cannot find a way to use those credentials to run the script itself.
Any idea how to do this, or creative ways to get around the issue? I cannot use Task Scheduler for this project.
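One hedged possibility, reusing the DPAPI-stored-credential idea from above (the file path, account, and script path are placeholders), is to have whatever process your WQL trigger fires relaunch the real script with Start-Process -Credential:

# Rebuild the credential stored earlier by this same account.
$secure = Get-Content 'C:\Scripts\runner.cred' | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential('DOMAIN\svc_runner', $secure)
# Launch the worker script under that identity.
Start-Process powershell.exe -Credential $cred -ArgumentList '-NoProfile','-File','C:\Scripts\job.ps1'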
I am using PuTTY to transfer files from my Windows machine to a Linux machine.
The transfer works when I run the script manually, and also when I run the same script as a scheduled task under my own credentials.
But if I schedule the task to run under the SYSTEM account or another user account, the file transfer does not happen.
Do I need to save any session values?
PuTTY saves session information in the registry for the current user only, so that information is simply not available to the other accounts you mentioned. You either need to provide it, by exporting your sessions and importing them into the other accounts, or supply everything needed on the command line that copies your files. The latter sounds much easier to me, combined with a small script invoked by the task scheduler.
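For example, a hedged pscp sketch with everything passed explicitly so no per-user saved session is needed (host, key path, and file paths are placeholders):

# -batch: never prompt interactively (important under SYSTEM);
# -i: authenticate with a key file instead of a saved session.
pscp -batch -i C:\keys\backup.ppk C:\data\report.csv user@linuxhost:/home/user/

Note that the host key cache also lives in the per-user registry, so the target's host key must be accepted once for the task's account (or pinned with pscp's -hostkey option).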
I am required to utilize an old version of ClearQuest 7, and the only APIs that are enabled in our installation are for VBA (Excel) and RatlPERL. (The REST API isn't an option for us - although it suffers the same cleartext credential problem.)
I've written a ratlperl script that executes queries into the defect database, and produces csv output. Note that ratlperl requires cleartext user credentials for authentication.
ratlperl query.cqpl -u %userid% -p %password% -q "%query%" -c %outfile%
That script is called from a Windows Batch file. When run from the Windows command line with no parameters, the batch file requests user credentials, but they can also be provided as parameters.
query.bat %userid% %password%
I trigger daily queries, with the user credentials passed as parameters to the batch file.
This all works well, but I'd rather not store the cleartext password in this way. The registry would be one possibility, but anyone with access to the machine would have access to those credentials.
How can I store these credentials in a somewhat secure way?
There are two things to watch out for. One is having your process list show the auth credentials.
Particularly on Unix - if you run ps it'll show you the arguments, which might include a username and password. The way of handling this is mostly 'read from a file, not the arg list'. On Unix, you can also amend $0 to change how you show in ps (but that doesn't help command history, and it's also not perfect as there'll be a short period before it's applied).
The other is - storing the data at rest.
This is a bit more difficult. Pretty fundamentally, there aren't many solutions that let your script access the credentials without also letting a malicious user do so.
After all, by the simple expedient of inserting a print $password into your script... they bypass pretty much any control you could put on it. Especially if they have admin access on your box, at which point... there's really nothing you can do.
Solutions I'd offer though:
Create a file with (plaintext) username and password. Set minimum permissions on it. Run the script as a user that has privileges, but don't let anyone else access that user account.
That way other people can 'see' your script (and may need to to run it) but can't copy it/hack it/run it themselves.
I would suggest sudo for this on Unix. For Windows, I'm not sure how much granularity you have over RunAs - that's worth a look, or alternatively have a scheduled task that runs as your service account, and picks up 'request files' for processing that can be generated by anyone.
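As a sketch of the locked-down-file idea in PowerShell (the file path and account name are placeholders, and $query and $outfile are assumed to be set elsewhere, as in the original batch file):

# Read the DPAPI-protected password; only the account that created
# C:\secure\cq.cred on this machine can decrypt it.
$secure = Get-Content 'C:\secure\cq.cred' | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential('cq_user', $secure)
$plain = $cred.GetNetworkCredential().Password
# ratlperl still sees the password in its argument list, so keep this
# wrapper and the .cred file readable only by the service account.
ratlperl query.cqpl -u $cred.UserName -p $plain -q $query -c $outfile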
Since the level of security doesn't need to be that high, perhaps consider creating a simple exe? The password could possibly be read out of memory somehow, but I guess this creates a big enough barrier.
Or something like this could be helpful?
http://www.battoexeconverter.com/
HTH
I have a nAnt script that works perfectly to build and copy a website to another domain machine. However, when I try to copy the website to a machine not on the domain I get security errors.
I know it's because the user that I have set to run nAnt doesn't have permissions on the remote computer.
Is it possible to specify a remote user to authenticate against when trying to copy files to a non-domain computer? There doesn't seem to be any options for this in the official nAnt documentation.
What other options are available?
We got round this by having an account with the same username and password on all the servers involved in the copy. However, we do it the other way round: we copy from a machine in a workgroup to a machine in a domain, and it works fine.
e.g. useraccount on workgroup computer:
.\CruiseControl password1
useraccount on domain:
domain\CruiseControl password1
I've stumbled upon the same problem. Ended up using PsExec to call XCOPY. Works fine.
I answered a question that was similar to this.
One might be able to use the exec task to launch the runas command to copy (or xcopy) the files over to a computer with a different username/password. If this is a non-domain account, you might have to use the local administrator account to authenticate. I'm not 100% on that one.
This should allow you to stay within NAnt. Let me know if this is not sufficient and we can try and figure something else out.
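Alternatively, since runas cannot take a password non-interactively, here is a hedged sketch of pre-authenticating the connection with net use before an ordinary copy; NAnt could drive these lines through exec tasks, and the host, share, and account names are placeholders:

# Prompt for (or otherwise obtain) the target machine's local account.
$cred = Get-Credential 'deploytarget\deployuser'
$plain = $cred.GetNetworkCredential().Password
# Authenticate the connection first; the copy then reuses it.
net use \\deploytarget\wwwroot $plain /user:$($cred.UserName)
Copy-Item 'C:\build\site\*' '\\deploytarget\wwwroot' -Recurse -Force
net use \\deploytarget\wwwroot /delete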