Powershell To Read From Network Location

I am attempting to access a DLNA server, using the directory structure that appears when clicking through it from My Computer, like the one below.
Now I want to read the file names in the folder and write them out. This is the syntax I have tried:
Get-ChildItem "This PC\Serviio (AMDDesktop)\Video\Folders\TV Shows\Battlebots\Season 01"
Foreach-Object {
$content = Get-Content $_.FullName
Write-Host $content
}
But it produces an error that says the path does not exist.
What would be the proper way to iterate over these files? Or better yet, maybe the proper way to word the question is: how do I get an address for these files that I can iterate over?
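For background, "This PC\..." is a shell namespace path, not a filesystem path, which is why Get-ChildItem reports that it does not exist. Below is a hedged sketch (not from the original thread) of walking the same tree through the Shell.Application COM object; whether the Serviio device is exposed under This PC this way depends on the system, and the folder names are taken from the question as assumptions:
# Walk the Explorer shell namespace; 0x11 is the ssfDRIVES ("This PC") folder
$shell  = New-Object -ComObject Shell.Application
$folder = $shell.NameSpace(0x11)
foreach ($name in 'Serviio (AMDDesktop)', 'Video', 'Folders', 'TV Shows', 'Battlebots', 'Season 01') {
    $item = $folder.Items() | Where-Object { $_.Name -eq $name }
    if (-not $item) { throw "Node '$name' not found in the shell namespace" }
    $folder = $item.GetFolder
}
# Write out the file names the way Explorer displays them
$folder.Items() | ForEach-Object { Write-Host $_.Name }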

Related

null-valued expression error in simple powershell script

I have a small PowerShell script that looks in a path for all files that contain a string and then replaces that string.
I'm sure this worked last week, so I'm very confused why it's now not working.
$filePath = "C:\my\file\path*"
# Get the files from the folder and iterate using Foreach
Get-ChildItem $filePath -Recurse | ForEach-Object {
# Read the file and use replace()
(Get-Content $_).Replace('oldString','NewString') | Set-Content $_
}
I'm getting two errors, I think. The first is:
Get-Content : Access to the path 'C:\my\file\path\YYY' is denied.
YYY is a folder in my path, and I'm running the script as administrator; I was previously running it as my own user, who I confirmed has full access to this path.
The second is:
You cannot call a method on a null-valued expression.
I'm guessing it's the $_, but I'm really not sure. I've tried replacing it with a different name, but no luck.
The "You cannot call a method on a null-valued expression" error comes from calling the Replace method when the result of Get-Content $_ is $null.
This can happen for a few reasons:
First, inside your ForEach-Object script block, $_ will essentially return the same value as $_.Name, which does not contain the full path to the folder or file being processed. Depending on the value of the working directory, $PWD, when you execute this code, this could result in a bad file or folder path.
To address issues with the working directory, try using $_.FullName. For debugging, you can also write that value to the terminal to confirm where it is looking.
The issue could also be that the file has no contents. If the file is empty, the result of Get-Content will be $null.
Also, attempting to get the content of a folder will result in an "access denied" error, after which the result of (Get-Content $_) will also be $null, producing the null-valued expression error you received.
To summarize:
Double check your working directory.
Consider using $_.FullName.
Consider adding code to avoid calling Get-Content on folders.
Here is an example which I tested with several nested folders, some of which contained empty files and others which contained files with contents:
Get-ChildItem C:\testFolder -File -Recurse | ForEach-Object {
    # This Write-Host is just for debugging, to see what the file path is
    Write-Host $_.FullName
    $fileContents = Get-Content $_.FullName
    if ($null -ne $fileContents) {
        $fileContents.Replace('oldString','newString') | Set-Content $_.FullName
    }
}

Powershell: copy file without locking

I created a simple Nagios plugin, check_log.ps1, to check a log file on a Windows machine. It works by making a copy of the log's content and, on the next run, looking for a specified string in the difference between the copy and the original log file.
The problem is that at random moments check_log.ps1 locks the log file, which causes the application that writes the log to stop.
Generally, the plugin uses the original log file in two places:
# compare content of $Logfile and $Oldlog, save diff to $tempdiff
Compare-Object -ReferenceObject (Get-Content -Path $Logfile) -DifferenceObject (Get-Content -Path $Oldlog) | Select-Object -Property InputObject > $tempdiff
# override file $Oldlog using content of $Logfile
Copy-Item $Logfile $Oldlog
I made a test: in one PS session I ran while($true) { [string]"test" >> C:\test\test.log }, and in a second session I ran the plugin: C:\test\check_log.ps1 C:\test\test.log C:\test\Old_log.log test
I'm not fully sure my test is correct, but I think the Copy-Item command causes the problem. When I comment out this line in the script, I don't see any errors in either terminal. I tested some custom file-copy functions that I found on the internet, but I didn't find a solution to my problem.
Do you have an idea how to make it work reliably?
If you think Copy-Item is locking the file, try reading the content and then saving it to another location. Something like this:
Get-Content $Logfile | Set-Content $Oldlog
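If Get-Content still occasionally contends with the writer, a hedged alternative (my own sketch, not part of the original answer) is to open the log through a .NET FileStream that requests shared read/write access, so the logging application is never blocked:
# Read $Logfile without requesting an exclusive lock; FileShare 'ReadWrite'
# lets the process writing the log keep its own access.
$stream = [System.IO.File]::Open($Logfile, 'Open', 'Read', 'ReadWrite')
try {
    $reader  = New-Object System.IO.StreamReader($stream)
    $content = $reader.ReadToEnd()
    Set-Content -Path $Oldlog -Value $content
}
finally {
    if ($reader) { $reader.Dispose() }
    $stream.Dispose()
}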

Is it possible to make a search and replace in file-content on multiple network locations?

I need to search for a string and then replace it with another in multiple files. Sounds easy, but the hard part is that it's multiple files in multiple network locations. I've tried connecting to all of the locations at once with VS Code and then using the built-in search and replace function. This almost works, except that on big searches it seems to hang.
I'm now looking for another, more stable way to do this. Anybody got any ideas? I thought PowerShell could be a good candidate, but unfortunately I'm not that used to working with PowerShell.
I found this guide and it's a bit like what I want, except I need to do it on multiple files at multiple locations at once.
https://mcpmag.com/articles/2018/08/08/replace-text-with-powershell.aspx
I would settle for running one script per location, since there are fewer than 20 locations to scan. But it needs to include subfolders.
Any tips are appreciated, thanks! :)
Edit 1:
The folder structure differs from location to location, so it's hard to say how it looks. But I can say that no location has a folder structure deeper than 15 levels. The text I'm replacing consists of certificate thumbprints stored in .config files. The files are between 100 and 1000 characters long, and the thumbprints I'm replacing look something like this: d2e8c58e5b34021671f2121483572f03f54ab9ae
This assumes that the different network locations are in trusted domains, or at least part of the WinRM TrustedHosts list. PowerShell remoting will also need to be enabled on all computers involved; run Enable-PSRemoting -Force (in an elevated PowerShell) to enable it.
$command = {
    Get-ChildItem -Path C:\Test\ -Include *.config -File -Recurse | ForEach-Object {
        $configContent = Get-Content -Path $_.FullName -Raw
        $configContent.Replace("Old Value", "New Value") | Out-File -FilePath $_.FullName -Force
    }
}
Invoke-Command -ComputerName "TestServer1", "TestServer2", "etc..." -ScriptBlock $command
If you are not part of the domain but have a domain/server login, you will need to use the -Credential parameter of Invoke-Command. This will basically find all files with the .config extension in any subfolder of the path, get the current content of each .config file, replace your value, and finally overwrite the existing config file. WATCH OUT THOUGH: this will process EVERY .config file in that path. If a file doesn't contain the string, it will simply be rewritten with the same content.
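For the non-domain case, a quick hedged usage example (the server name is the same placeholder used above):
$cred = Get-Credential   # prompts once for the domain/server login
Invoke-Command -ComputerName "TestServer1" -Credential $cred -ScriptBlock $command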
Without seeing an example of the folder structures and files, this is quite hard to answer thoroughly. However, I would probably build a series of ForEach segments. For example:
ForEach ($Server in $Servers)
{
    ForEach ($File in $Files)
    {
        Select-String -Path $File -Pattern "$ExampleString"
    }
}
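Combining the two answers above into one end-to-end sketch (hedged: the UNC roots and the new thumbprint are hypothetical placeholders, and the old thumbprint is the example value from the question):
# Hypothetical list of UNC roots to scan; replace with your < 20 locations
$locations = '\\server1\share\app', '\\server2\share\app'
$old = 'd2e8c58e5b34021671f2121483572f03f54ab9ae'   # thumbprint from the question
$new = '0000000000000000000000000000000000000000'   # hypothetical new thumbprint

foreach ($root in $locations) {
    Get-ChildItem -Path $root -Filter *.config -File -Recurse | ForEach-Object {
        $text = Get-Content -Path $_.FullName -Raw
        # Only rewrite files that actually contain the old thumbprint
        if ($text -and $text.Contains($old)) {
            $text.Replace($old, $new) | Set-Content -Path $_.FullName -NoNewline
        }
    }
}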

How do I copy multiple files from multiple hosts in powershell?

I am trying to make a PowerShell script (5.1) that will copy several files and folders from several hosts. These hosts change frequently, so it would be ideal if I could use a list that I can append to when required.
I have this all working using xcopy, so I know the locations exist. I want to ensure that if a change is made while I am not in work, someone can just add or remove a host in the text file and the backup will continue to work.
The code I have is supposed to go through each host in my list of hosts and copy all the files from the list of file paths before moving on to the next host.
But there are 2 errors showing up:
The term '\\REMOTEHOST\c$\Users\Public\desktop\back-up\$Computers' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:8 char:17
and:
copy-item : Cannot find path '\\HOST\C$\LK\Appdata\Cmmcfg C$\LKAppData\Errc C$\LK\Appdata\TCOMP C$\LK\Probes C$\LK\Appdata\CAMIO C$\LK\Appdata\LaunchPad C$\LK\Appdata\Wincmes C$\barlen.dta C$\Caliprogs C$\Cali' because it does not exist.
This does not seem to be reading through the list as I intended; I have also noticed that the HOST it is reading from is sixth in the list, not first.
REM*This file contains the list of hosts you want to copy files from*
$computers = Get-Content 'Y:\***FILEPATH***\HOSTFILE.txt'
REM*This is the file/folder(s) you want to copy from the hosts in the $computer variable*
$source = Get-Content 'Y:\***FILEPATH***\FilePaths.txt'
REM*The destination location you want the file/folder(s) to be copied to*
$destination = \\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers
foreach ($item in $computers) {
}
foreach ($item in $source) {
}
copy-item \\$computer\$source -Destination $destination -Verbose
Your destination variable needs to be enclosed in quotes. To have it evaluate other variables inside of it, enclose it in double quotes. Otherwise PowerShell thinks it's a command you are trying to run.
$destination = "\\**REMOTEHOST**\c$\Users\Public\desktop\back-up\$Computers"
Cracked it, thank you for your help. I was messing up the foreach command! I had both variables set to $item, so I was confusing things!
foreach ($itemhost in $computers) {
    $destination = "\\Remotehost\c$\Users\xoliver.jeffries\desktop\back-up\$itemhost"
    foreach ($item in $source) {
        Copy-Item "\\$itemhost\$item*" -Destination $destination -Verbose -Recurse
    }
}
It's not the neatest output, but that's just a snag! The code now enables me to use a list of hosts and a list of files and copy them to a remote server!

powershell ftp script for web site

I have the following script (adapted from here) for uploading files via ftp for a website.
$files = @(dir -Path $path)
foreach ($file in $files) {
    if ($file.GetType().FullName -eq 'System.IO.FileInfo') {
        "uploading $file"
        $uri = New-Object System.Uri($ftp + $file.Name)
        $webclient.UploadFile($uri, $file.FullName)
    }
    elseif ($file.GetType().FullName -eq 'System.IO.DirectoryInfo') {
        Recurse $file.FullName
    }
}
This works fine if all the files go to the root of the directory. The problem I am having is that there are subdirectories for the site under the root. This places (as expected) all files at the root regardless of where they exist in the actual directory structure.
Is there a simple way to transfer all of the files while maintaining the directory structure of the source? I'm sure I could put something together using Split-Path, but I just wanted to make sure that I wasn't overlooking something before I went any further.
Thanks.
Per request converted from the comments:
geekswithblogs.net has a solution for recursive FTP copy.
Perhaps the Microsoft documentation can help here:
The URI may be relative or absolute. If the URI is of the form "ftp://contoso.com/%2fpath" (%2f is an escaped '/'), then the URI is absolute, and the current directory is /path. If, however, the URI is of the form "ftp://contoso.com/path", first the .NET Framework logs into the FTP server (using the user name and password set by the Credentials property), then the current directory is set to /path.
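Building on that, here is a hedged sketch of one way to preserve the source structure: compute each item's path relative to the source root, create the matching directories on the server with FtpWebRequest's MakeDirectory method, and upload files to the mirrored URI. $path, $ftp (assumed to end in '/'), and $webclient are the variables the question's script already uses; the credentials are placeholders:
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential('user', 'pass')  # placeholders

Get-ChildItem -Path $path -Recurse | ForEach-Object {
    # Relative path below the source root, converted to FTP-style forward slashes
    $relative = $_.FullName.Substring($path.Length).TrimStart('\') -replace '\\', '/'
    if ($_.PSIsContainer) {
        # Create the matching remote directory before its files are uploaded
        $request = [System.Net.WebRequest]::Create($ftp + $relative)
        $request.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
        $request.Credentials = $webclient.Credentials
        try { $request.GetResponse().Close() } catch { }  # ignore "already exists" errors
    }
    else {
        "uploading $($_.FullName)"
        $webclient.UploadFile($ftp + $relative, $_.FullName)
    }
}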