PowerShell: copy all folders and files with a certain extension

I have one package on my Windows machine and another package on a remote server.
The first one is -> C:\Users\One. It contains the following files:
adapter.jsx
result.js
system.jsx
moment.js
readme.txt
It also has a package called info that contains two files -> logger.jsx and date.js.
Another one is a remote target directory -> /mnt/media/Two. It is currently empty. The user and host for it are: $userAndHost = "user@foo.bar"
I want to copy all the packages and files with extensions .jsx and .js from package One to package Two. It's required to use scp here since this is a copy between two different platforms.
What I tried:
Get all the items within the package:
Get-ChildItem -Path "C:\Users\One" -Recurse
Filter the items by extension, in my case .jsx and .js:
Get-ChildItem -Path "C:\Users\One" -Recurse | Where-Object {$_.Extension -in ".js",".jsx"}
Do the secure copy (scp) - I didn't come up with the script for this part.
Please, help me finish the script.

Hi, I think you need something like this.
I wrote some code for you; it is tested and working.
#Set execution policy to bypass
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process -Force
#install Posh-SSH from powershell gallery
#https://www.powershellgallery.com/packages/Posh-SSH/2.0.2
Install-Module -Name Posh-SSH -RequiredVersion 2.0.2
#import module
Import-Module Posh-SSH
#get all the items within the package in the path:
$path = 'C:\Users\One'
$items = (Get-ChildItem -Path $path -Name -File -Include ( '*.jsx', '*.js') -Recurse)
#Need destination credential
$credential = Get-Credential
#copy selected items to destination via scp
$items | ForEach-Object {
    Set-SCPFile -ComputerName 'SCP-SERVER-HOST-HERE' -Credential $credential -RemotePath '/mnt/media/Two' -LocalFile "$path\$_"
}
Hope this helps you
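
If you would rather not install a module: recent Windows 10 / Server 2019 builds ship the OpenSSH client, so scp.exe can be called directly from PowerShell. A minimal sketch, assuming scp.exe is on PATH, the remote login is user@foo.bar as in the question, and authentication (keys or a password prompt) is already set up:
# Source folder and remote target (values taken from the question)
$path        = 'C:\Users\One'
$userAndHost = 'user@foo.bar'
$remotePath  = '/mnt/media/Two'
# Collect only .js and .jsx files, including those in sub-folders such as 'info'
$files = Get-ChildItem -Path $path -Recurse -File -Include '*.js', '*.jsx'
foreach ($file in $files) {
    # scp.exe prompts for a password unless key authentication is configured
    scp.exe $file.FullName "${userAndHost}:$remotePath"
}
Note this drops everything into the root of /mnt/media/Two; if the info sub-folder needs to be preserved on the remote side, you would have to recreate the relative paths there first (for example with ssh.exe and mkdir -p).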

Related

PowerShell Only get files within folders > move to root > delete folders only > upload to ShareFile

I have the below PowerShell script, which was developed to copy files from a local UNC path to Citrix ShareFile. While all is good on that aspect, one issue we are facing is that we cannot support folders: Citrix ShareFile does not accept the copy as a new upload, but instead creates the item as a New Folder and does not trigger the workflow correctly.
One thing that I think will make our life easier is simply not supporting folders, which will work for our environment.
What I am thinking of is a script that pulls all files and moves them to the root directory, deletes all folders, then uploads them to ShareFile.
The below script will copy the folder and all its contents.
I have had a look around and am struggling to get it to do as I wish.
## Add ShareFile PowerShell Snap-in
Add-PSSnapin ShareFile
## Create new authentication file
#New-SfClient -Name "C:\Sharefile\SVCACC.sfps" -Account aws
## Variables ##
$OutputAppReqFID = "fo4a3b58-bdd6-44c8-ba11-763e211c183f"
$Project = 'A001'
$LocalPath = "\\file.server.au\$project\DATA\DATA CUSTODIAN\OUTPUT\"
$sfClient = Get-SfClient -Name C:\sharefile\SVCACC.sfps
$OutputAppReqFID_URL = (Send-SfRequest $sfClient -Entity Items -id $OutputAppReqFID).Url
## Create PS Drive ##
New-PSDrive -Name "sfDrive-$($project)" -PSProvider ShareFile -Client $sfClient -Root "\" -RootUri $OutputAppReqFID_URL
## Copy all files from specified path to ShareFile, followed by moving files to another folder ##
foreach ($object in Get-ChildItem -Path $LocalPath) {
    Copy-SfItem -Path $object.FullName -Destination "sfDrive-$($project):"
    Remove-Item $object.FullName -Recurse
}
## Remove PS Drive ##
Remove-PSDrive "sfdrive-$($project)"
Answered!
I managed to apply a simple Where-Object that excludes directories (Mode d-----) from the upload:
Get-ChildItem -Path $LocalPath -Recurse | Where-Object {$_.Mode -ne "d-----"} | Select-Object -ExpandProperty FullName
Seems to have worked a treat!
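For completeness, a sketch of how that filter could slot back into the upload loop; this uses -File (which skips directories without relying on the Mode string) and otherwise keeps the variables from the script above:
## Upload files only (no folders), then remove them locally
foreach ($object in Get-ChildItem -Path $LocalPath -Recurse -File) {
    Copy-SfItem -Path $object.FullName -Destination "sfDrive-$($project):"
    Remove-Item -Path $object.FullName
}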

How to run multiple scripts using text files full of references

I have a script that I feel is close to being ready to run, but I need some help fine-tuning things.
My primary objective is this:
From each text file (named after the computer it was generated from), run each script using the data that exists within the .txt file. Each file is generated from the C:\Users folder on that computer and lists each user profile that exists on that machine. I need to be able to run the script so that it deletes the specified folders/files for each user profile on that machine.
# Name: CacheCleanup
# Description: Deletes cache files per user on each computer
# Syntax: .\CacheCleanup.ps1
# Author: Nicholas Nedrow
# Created: 06/15/2021
#Text file contains list of all machines that have recently pinged and are online
$Computers = Get-Content "C:\Temp\CacheCleanUp\ComputerUp.txt"
#Users are listed in individual text files assigned with the name of their PC.
$Users = Get-Content "C:\Temp\CacheCleanUp\Computer Users\*.txt"
#Base path for deletion paths
$Path = "\\$PC\c$\users\$user\appdata\local"
#Delete User\Temp files
Remove-Item -Path "$Path\temp\*" -Recurse -Force -EA SilentlyContinue -Verbose
#Delete Teams files
Remove-Item -Path "$Path\Microsoft\Teams" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Microsoft\TeamsMeetingAddin" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Microsoft\TeamsPresenceAddin" -Recurse -Force -EA SilentlyContinue -Verbose
#Delete Chrome Cache
Remove-Item -Path "$Path\Google\Chrome\User Data\Default\Cache\*" -Recurse -Force -EA SilentlyContinue -Verbose
#Delete IE Cache
Remove-Item -Path "$Path\Microsoft\Windows\INetCache\*" -Recurse -Force -EA SilentlyContinue -Verbose
#Delete Firefox cache
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\*.*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\cache2\entries\*.*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\thumbnails\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\cookies.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\webappstore.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -Path "$Path\Mozilla\Firefox\Profiles\*.default\cache\chromeapstore.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
#How to Run each script for each user on each machine
#How to generate detailed log with results of deletion for each section
I will state right away that I am still learning scripting and am unfamiliar with functions, even though I am pretty sure that is what I need to develop here. This is a domain network, so the appropriate path for the computer name has been taken into consideration. Each script does run independently, with the computer name specified, but I run into issues when it comes to iterating over each user profile on that computer.
If possible, it would be nice to have some sort of generated report with the outcome for each user profile and what ran successfully. I don't necessarily need to know every file that was deleted, but maybe a list of the files that could not be deleted due to conflicts with running programs or permission issues.
You need to use loops. Consider the following code:
$configFiles = "C:\Temp\CacheCleanUp";
Get-Content "$configFiles\TESTComputers.txt" | % {
    $PC = $_;
    Write-Host "Attempting to clean cache on computer: $PC";
    Get-Content "$configFiles\TESTusers.txt" | % {
        $user = $_;
        $Path = "\\$PC\c$\users\$user\appdata\local"
        Write-Host "`tCleaning $Path"
        <# Your code goes here #>
    }
}
TESTusers.txt contains:
dave
bob
amy
TESTComputers.txt contains:
10.0.0.1
10.0.0.2
10.0.0.3
10.0.0.4
10.0.0.5
This is the output of the above code and computer/user files:
Attempting to clean cache on computer: 10.0.0.1
Cleaning \\10.0.0.1\c$\users\dave\appdata\local
Cleaning \\10.0.0.1\c$\users\bob\appdata\local
Cleaning \\10.0.0.1\c$\users\amy\appdata\local
Attempting to clean cache on computer: 10.0.0.2
Cleaning \\10.0.0.2\c$\users\dave\appdata\local
Cleaning \\10.0.0.2\c$\users\bob\appdata\local
Cleaning \\10.0.0.2\c$\users\amy\appdata\local
Attempting to clean cache on computer: 10.0.0.3
Cleaning \\10.0.0.3\c$\users\dave\appdata\local
Cleaning \\10.0.0.3\c$\users\bob\appdata\local
Cleaning \\10.0.0.3\c$\users\amy\appdata\local
Attempting to clean cache on computer: 10.0.0.4
Cleaning \\10.0.0.4\c$\users\dave\appdata\local
Cleaning \\10.0.0.4\c$\users\bob\appdata\local
Cleaning \\10.0.0.4\c$\users\amy\appdata\local
Attempting to clean cache on computer: 10.0.0.5
Cleaning \\10.0.0.5\c$\users\dave\appdata\local
Cleaning \\10.0.0.5\c$\users\bob\appdata\local
Cleaning \\10.0.0.5\c$\users\amy\appdata\local
Few things to note about the code:
Get-Content "filename" | % - this is going to loop through the contents of the file one line at a time. % is a shortcut for ForEach-Object.
$_ when inside a foreach loop is an automatic variable created by PowerShell that contains the current item in the loop.
If you have a loop inside a loop and you need to access both $_ values from the inner and outer loop, you can create a new variable (eg $PC = $_;) in the outer loop that can be used within the inner loop (eg $Path = "\\$PC\c$\users\$user\appdata\local").
You should definitely learn to use functions, and then in the future you can combine functions into modules. This is a big help in organising your code, and you can avoid duplication by sharing functions between different scripts - but your current script doesn't need functions (though they're a good idea).
Depending on your network, you might be able to use PowerShell remoting instead of the administrative shares to achieve the same effect. This is a more advanced topic; there is some configuration required on the machines you want to connect to, but the advantage is that your computer sends the script to each target, and the target computer runs the script and reports its results.
Another possible change I would suggest is only using a list of computers - then on each computer use Get-ChildItem -Path C:\Users to actually get the list of profiles currently on that target computer, as in the sketch below.
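If you go the remoting route, a rough sketch might look like the following. It assumes WinRM is enabled on the targets and an account with admin rights on them; it discovers the profiles with Get-ChildItem C:\Users on each machine and returns a small per-computer report of paths it could not delete (only two of the cleanup paths are shown here):
$computers = Get-Content "C:\Temp\CacheCleanUp\ComputerUp.txt"
$results = Invoke-Command -ComputerName $computers -ScriptBlock {
    # This block runs locally on each target machine
    foreach ($userProfile in Get-ChildItem -Path 'C:\Users' -Directory) {
        $local = Join-Path $userProfile.FullName 'AppData\Local'
        # -ErrorVariable +failed collects anything that could not be deleted (locked files, permissions)
        Remove-Item -Path "$local\Temp\*" -Recurse -Force -ErrorAction SilentlyContinue -ErrorVariable +failed
        Remove-Item -Path "$local\Microsoft\Teams" -Recurse -Force -ErrorAction SilentlyContinue -ErrorVariable +failed
    }
    # Return a small report object for this computer
    [pscustomobject]@{
        Computer = $env:COMPUTERNAME
        Failed   = ($failed | ForEach-Object { $_.TargetObject }) -join '; '
    }
}
$results | Export-Csv -Path "C:\Temp\CacheCleanUp\CleanupReport.csv" -NoTypeInformation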

Windows PowerShell is in NonInteractive mode. Read and Prompt functionality is not available

I am new to PowerShell scripting. I am trying to delete all files except one folder and one file. When I run this script from Jenkins it shows the error "Windows PowerShell is in NonInteractive mode. Read and Prompt functionality is not available." When I try to run the script in a PowerShell window instead, it asks for confirmation [Y/N]. I need to run this script from Jenkins - please help me.
$Path = "C:\TeamCity\buildAgent2\work"
$exclude = @("*.old", "*directory.map")
Get-ChildItem $Path -Exclude $exclude | Remove-Item -Force -Confirm:$false -ErrorAction Stop | echo Y
You need to add the -Recurse parameter to the Remove-Item command. Without it, Remove-Item prompts for confirmation before deleting a directory that still has contents, and that prompt is exactly what fails in NonInteractive mode.
Like that:
Remove-Item -Recurse -Force
guiwhatsthat is correct; your code should look like this:
$Path = "C:\TeamCity\buildAgent2\work"
$exclude = @("*.old", "*directory.map")
Get-ChildItem $Path -Exclude $exclude | Remove-Item -Recurse -Force -Confirm:$false -ErrorAction Stop
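If you want to see what would be removed before letting Jenkins run it non-interactively, -WhatIf gives a dry run (a sketch, using the same variables as above):
# Preview only - nothing is actually deleted
Get-ChildItem $Path -Exclude $exclude | Remove-Item -Recurse -Force -WhatIf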

How to get the current script to read off computer names off a txt file located on c:\

I currently have this script working, but I am only able to get it to run locally. I would like it to read a text file stored at C:\List_of_PCs.txt containing the computer names it should also run the same script on. That way I can update the text file instead of modifying the code.
Set-ExecutionPolicy RemoteSigned
# Get all users
$users = Get-ChildItem -Path "C:\Users"
# Loop through users and delete the Teams file
$users | ForEach-Object {
    Remove-Item -Path "C:\Users\$($_.Name)\AppData\Roaming\Microsoft\Teams\Cache\f*" -Force
    Remove-Item -Path "C:\Users\$($_.Name)\AppData\Roaming\Microsoft\Teams\Application Cache\Cache\f*" -Force
}
Any help on this would be appreciated. I've tried multiple things every which way; I'm sure this is something simple, but I'm still very new to PowerShell.
Try something like this...
This requires PowerShell remoting to be enabled, and you must use an account that is an admin on the remote computers (see the note after the script if remoting is not yet set up).
$ComputerList = Get-Content -Path 'C:\List_of_PCs.txt'
$ComputerList | % {
    Invoke-Command -ComputerName $_ -ScriptBlock {
        # Set-ExecutionPolicy RemoteSigned # this is something that should be set via GPO for all systems, not your script, so that it is centrally controlled and monitored.
        # Get all users
        $users = Get-ChildItem -Path "C:\Users"
        # Loop through users and delete the Teams file
        $users | ForEach-Object {
            Remove-Item -Path "C:\Users\$($_.Name)\AppData\Roaming\Microsoft\Teams\Cache\f*" -Force
            Remove-Item -Path "C:\Users\$($_.Name)\AppData\Roaming\Microsoft\Teams\Application Cache\Cache\f*" -Force
        }
    }
}
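If remoting is not already enabled in your environment, a quick way to check a target, and to enable it, is sketched below; 'PC-NAME-HERE' is a placeholder, and Enable-PSRemoting must run elevated on the target itself (or be pushed via Group Policy):
# On your admin workstation: does the target answer WinRM requests?
Test-WSMan -ComputerName 'PC-NAME-HERE'
# On each target machine (elevated), or centrally via Group Policy:
Enable-PSRemoting -Force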

Powershell: Setting SASS_BINARY_PATH

I need to set the SASS_BINARY_PATH environment variable to point at the local file I've downloaded, to be able to install node-sass behind a corporate firewall. So on Windows cmd, I just do:
SET SASS_BINARY_PATH=C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node
And the installation works fine since it successfully sets the variable. But when I try doing it via Powershell, it doesn't work:
$env:SASS_BINARY_PATH="C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node"
I've also tried another way on Powershell:
[Environment]::SetEnvironmentVariable("SASS_BINARY_PATH", "C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node", "Machine")
Upon checking it on the control panel, it successfully added a "SASS_BINARY_PATH" system variable. But upon trying to reinstall node-sass, it fails again.
One of my observations: when I set it the Windows cmd way and then check with the command-line set, the variable shows up along with the others. But when I use either of the PowerShell methods, it does not show up. Any ideas on this?
The error encountered when trying to npm-install node-sass over a corporate firewall is:
Downloading binary from https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node
Cannot download "https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node":
HTTP error 401 Unauthorized
Download win32-x64-48_binding.node manually
Put it in C:\Users\<user>\AppData\Roaming\npm-cache\node-sass\4.7.2 folder.
Then try to run npm install node-sass
Here is the PowerShell script @jengfad used based on the above solution, as noted in the comments of the discussion:
$cacheSassPath = $env:APPDATA + '\npm-cache\node-sass'
if ( -Not (Test-Path -Path $cacheSassPath) )
{
    Write-Host "cacheSassPath not exists"
    New-Item -ItemType directory -Path $cacheSassPath
    Write-Host "cacheSassPath CREATED"
}
<# Ensure has no content #>
Get-ChildItem -Path $cacheSassPath -Recurse| Foreach-object {Remove-item -Recurse -path $_.FullName }
<# Copy local sass binary (~Srt.Web\sass-binary\4.7.2) file to cache folder #>
$sassBinaryPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$sassBinaryPath = $sassBinaryPath + "\sass-binary\4.7.2"
Copy-Item -Path $sassBinaryPath -Recurse -Destination $cacheSassPath -Container
Write-Host "node-sass binary file successfully copied!"
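
As a footnote on the original question: $env:SASS_BINARY_PATH = ... only sets the variable for the current PowerShell process and its child processes, while [Environment]::SetEnvironmentVariable(..., "Machine") writes it permanently but is not visible to sessions that were already open, which is why set in an existing cmd window does not list it. If you prefer the SASS_BINARY_PATH route over pre-seeding the npm cache, setting it and running the install in the same PowerShell session should be enough; a sketch using the path from the question:
# Visible to this session and to child processes (npm/node-gyp) started from it
$env:SASS_BINARY_PATH = 'C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node'
# Run the install from the same session so it inherits the variable
npm install node-sass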