Logon failure: account currently disabled using Copy-Item - powershell

I am getting the following error when attempting to use Copy-Item in PowerShell:
Logon failure: account currently disabled
The PowerShell code:
Copy-Item -Path $_ -Destination $remotePath -Recurse -Container -Force
Up until yesterday, this was working fine. Today it is not. I can access the remote machine using the UNC path that the file is being copied to, I can log in to the machine directly using RDP, and I am in the local Admins group.
I can see nothing in the standard logs. A reboot of the target machine does nothing.
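Not part of the original question, but one way to narrow down where the logon failure comes from is to authenticate to the UNC path explicitly before the copy. A minimal sketch, with a hypothetical share path and an interactive credential prompt:
# Hypothetical troubleshooting sketch: map the destination with explicit credentials so the
# copy does not rely on whichever (possibly disabled) account is used implicitly.
$remotePath = '\\targetserver\share\folder'   # assumed destination, not the real path from the question
$cred = Get-Credential                        # an account known to be enabled

New-PSDrive -Name Dest -PSProvider FileSystem -Root $remotePath -Credential $cred | Out-Null
Copy-Item -Path $_ -Destination 'Dest:\' -Recurse -Container -Force   # $_ as in the question (inside a ForEach-Object pipeline)
Remove-PSDrive -Name Dest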

Related

Using PowerShell to discover all valid IPs over a VPN

I am attempting to learn PowerShell. I am currently working on how to find the IP addresses of remote machines, and I am having trouble getting all of the active remote IPs.
Set-Location "C:\Users"
Remove-Item ".\*\Appdata\Local\Temp\*" -Recurse -Force
Set-Location "C:\Users\$env:UserName\Appdata\Local\Temp"
Remove-Item ".\*\Appdata\Local\Temp\*" -Recurse -Force
As you can see, I am also attempting to delete all of the temp files on those computers.
Please remember I am new at this and I am asking for help.
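Not from the original question, but as one illustration of finding the active addresses and then cleaning their temp folders, a minimal sketch using Test-Connection and Invoke-Command (the 10.0.0.x range and the use of PowerShell remoting are assumptions):
# Hypothetical sketch: ping a /24 range and keep the addresses that respond.
$alive = 1..254 | ForEach-Object {
    $ip = "10.0.0.$_"    # assumed VPN subnet
    if (Test-Connection -ComputerName $ip -Count 1 -Quiet) { $ip }
}

# Clean each responding machine's per-user temp folders over PowerShell remoting.
Invoke-Command -ComputerName $alive -ScriptBlock {
    Remove-Item "C:\Users\*\AppData\Local\Temp\*" -Recurse -Force -ErrorAction SilentlyContinue
}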

Copy Folder Remote to Remote (No sharing path)

We are writing a PowerShell script that will make a backup on a remote server. I am looking for a code reference that can copy a folder from one remote machine to another remote machine.
I am trying the sample code below, but "-ToSession" is not working with PowerShell 4.0.
We are writing a Windows PowerShell script that will run on a separate server and create a backup on the remote machine(s).
Copy-Item -Path $sourcePath -Filter *.* -Recurse -Destination $targetPath -ToSession $session
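For context (not part of the original question): -ToSession requires PowerShell 5.0 or later and a PSSession to the target machine. A minimal sketch of the surrounding setup, with hypothetical server names and paths:
# Hypothetical sketch: read the folder from the source server over a UNC path,
# then push it into a session on the target server with Copy-Item -ToSession (PS 5.0+).
$cred       = Get-Credential
$sourcePath = '\\sourceserver\d$\Data\MyFolder'     # assumed source folder (UNC)
$targetPath = 'D:\Backups\MyFolder'                 # assumed destination on the target

$session = New-PSSession -ComputerName 'targetserver' -Credential $cred
Copy-Item -Path $sourcePath -Recurse -Destination $targetPath -ToSession $session
Remove-PSSession $session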

Why does Set-AzStorageBlobContent task status always stay Active and never completes?

I am retrieving files from a directory and uploading them to an Azure Blob Storage account with the following PowerShell command:
Get-ChildItem $ArtifactStagingDirectory -Recurse -File | Set-AzStorageBlobContent -Container $StorageContainerName -Context $StorageAccount.Context -Force
I am running this in the Windows PowerShell ISE. There are two files to upload and the ISE shows both being uploaded, but the overall task status stays as:
Total: 2. Successful: 0. Failed: 0. Active 2.
Both of the files are actually uploaded to the Azure storage account. However, this PowerShell command never completes and never moves on to the next command.
I am looking for reasons why this might happen, or for ways to troubleshoot the issue. I am new to using the Windows PowerShell ISE.
I have tried including the -ClientTimeoutPerRequest parameter, which did not change the results.
I found that if I wrap the statement in a runspace, it completes as it should. As this is what I ultimately wanted to do anyway, I'm considering this question answered.
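As an illustration of the runspace wrapper mentioned above (this is not the original poster's code; it is a minimal sketch assuming $ArtifactStagingDirectory, $StorageContainerName and $StorageAccount are already defined as in the question):
# Run the upload pipeline in a separate PowerShell instance (its own runspace) and
# block until it has genuinely finished.
$ps = [powershell]::Create()
[void]$ps.AddScript({
    param($path, $container, $context)
    Get-ChildItem $path -Recurse -File |
        Set-AzStorageBlobContent -Container $container -Context $context -Force
})
[void]$ps.AddArgument($ArtifactStagingDirectory)
[void]$ps.AddArgument($StorageContainerName)
[void]$ps.AddArgument($StorageAccount.Context)

$handle = $ps.BeginInvoke()
$ps.EndInvoke($handle)     # returns only when the uploads are complete
$ps.Dispose()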

PowerShell: Setting SASS_BINARY_PATH

I need to set the SASS_BINARY_PATH environment variable to the local file I've downloaded so that I can install node-sass behind a corporate firewall. On the Windows cmd prompt, I just do:
SET SASS_BINARY_PATH=C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node
And the installation works fine, since it successfully sets the variable. But when I try doing it via PowerShell, it doesn't work:
$env:SASS_BINARY_PATH="C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node"
I've also tried another way in PowerShell:
[Environment]::SetEnvironmentVariable("SASS_BINARY_PATH", "C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node", "Machine")
Upon checking in the Control Panel, it successfully added a SASS_BINARY_PATH system variable. But upon trying to reinstall node-sass, it fails again.
One of my observations: when I do it the Windows cmd way and then check with the set command, the variable shows up along with the others. But when I use either of the PowerShell methods, it does not show up. Any ideas on this?
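Not part of the original question, but the behaviour described above is consistent with environment-variable scoping: $env: changes only the current PowerShell process, and a "Machine"-scope variable is only visible to processes started after it was set. A minimal sketch of checking both scopes from one PowerShell session:
# Set the variable for the current process only (this is all $env: does).
$env:SASS_BINARY_PATH = 'C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node'

# What this process sees vs. what is stored at machine scope.
Get-ChildItem Env:SASS_BINARY_PATH
[Environment]::GetEnvironmentVariable('SASS_BINARY_PATH', 'Machine')

# Note: running set in a cmd window opened before the machine-scope value was written
# will not show it, so run npm install node-sass from the same session that set the
# process-scope variable.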
The error encountered when trying to npm install node-sass behind a corporate firewall is:
Downloading binary from https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node
Cannot download "https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node":
HTTP error 401 Unauthorized
Download win32-x64-48_binding.node manually.
Put it in the C:\Users\<user>\AppData\Roaming\npm-cache\node-sass\4.7.2 folder.
Then try to run npm install node-sass.
Here is the PowerShell script #jengfad used based on the above solution, as noted in the comments of the discussion:
$cacheSassPath = $env:APPDATA + '\npm-cache\node-sass'
if ( -Not (Test-Path -Path $cacheSassPath) )
{
    Write-Host "cacheSassPath does not exist"
    New-Item -ItemType Directory -Path $cacheSassPath
    Write-Host "cacheSassPath CREATED"
}
<# Ensure the cache folder has no content #>
Get-ChildItem -Path $cacheSassPath -Recurse | ForEach-Object { Remove-Item -Recurse -Path $_.FullName }
<# Copy the local sass binary folder (~Srt.Web\sass-binary\4.7.2) to the cache folder #>
$sassBinaryPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$sassBinaryPath = $sassBinaryPath + "\sass-binary\4.7.2"
Copy-Item -Path $sassBinaryPath -Recurse -Destination $cacheSassPath -Container
Write-Host "node-sass binary file successfully copied!"

Copying from remote servers to local server

I am trying to do an unattended backup of our websites from two web servers to our backup server.
$FolderName = $(Get-Date -Format D)
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName\ColoWebP1
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName\ColoWebD1
Copy-Item \\colowebp1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebP1 -recurse
Copy-Item \\colowebp1.wa.local\e$\backup D:\backups\webservers\$FolderName\ColoWebP1 -recurse
Copy-Item \\colowebd1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebD1 -recurse
Copy-Item \\colowebd1.wa.local\e$\backup D:\backups\webservers\$FolderName\ColoWebD1 -recurse
Now, I still have not gotten this to run unattended. It creates the folders but does not copy the files. And now a new wrinkle has occurred: when I run it manually I receive this error:
Copy-Item : Access to the path 'D:\backups\webservers\Tuesday, February 25, 2014\ColoWebD1\websites\Agent_eVantage_Beta\Master_wSlider.master' is denied.
At C:\scripts\Webserverbackup.ps1:12 char:10
+ Copy-Item <<<< \\colowebd1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebD1 -recurse
    + CategoryInfo          : PermissionDenied: (Master_wSlider.master:FileInfo) [Copy-Item], UnauthorizedAccessException
    + FullyQualifiedErrorId : CopyFileInfoItemUnauthorizedAccessError,Microsoft.PowerShell.Commands.CopyItemCommand
But all the files appear to be there. (I haven't attempted a restore of this yet).
So my questions are:
Am I reading this error right? Is it having trouble authenticating to the server this is running from?
And how do I get this to run unattended?
The problem is with the dollar sign in your Copy-Item path (i.e. the e$ admin share in \\colowebp1.wa.local\e$\...).
PowerShell is interpreting the $ sign as the start of a variable. I would use a shared folder instead of the administrative drive-letter share.
Copy-Item '\\colowebp1.wa.local\Share\websites' "D:\backups\webservers\$FolderName\ColoWebP1" -recurse
You have to set proper permissions to access the admin share. What happens when you access the target path above with Explorer? If everything is set up correctly, you should be able to get into the share without authentication (e.g. with default network credentials). Your solution itself is fine, however, and it will work once authentication is not required. There are workarounds to this with PowerShell, but you would have to provide some details on the network and UAC setup; one possible shape of such a workaround is sketched below. I will happily attempt to resolve this once you provide the details.
At work I use such paths to admin shares and they work perfectly; PowerShell doesn't treat the share as a variable.
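(This sketch is not part of the original answer; it only illustrates the kind of workaround mentioned above, with the server name taken from the question and the credential handling assumed.)
# Hypothetical sketch: authenticate to the admin share explicitly, so the unattended
# run does not depend on whichever account the script happens to execute under.
$cred = Get-Credential   # in an unattended job this would come from a stored credential instead

New-PSDrive -Name WebP1 -PSProvider FileSystem -Root '\\colowebp1.wa.local\e$' -Credential $cred | Out-Null
Copy-Item 'WebP1:\websites' "D:\backups\webservers\$FolderName\ColoWebP1" -Recurse
Remove-PSDrive -Name WebP1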
Thanks,
Alex