I have the following script (adapted from here) for uploading files via ftp for a website.
$files = @(dir -Path $path)
foreach ($file in $files) {
    if ($file.GetType().FullName -eq 'System.IO.FileInfo') {
        "uploading $file"
        $uri = New-Object System.Uri($ftp + $file.Name)
        $webclient.UploadFile($uri, $file.FullName)
    } elseif ($file.GetType().FullName -eq 'System.IO.DirectoryInfo') {
        Recurse $file.FullName
    }
}
This works fine if all files go to the root of the directory. The problem I am having is that there are subdirectories for the site under the root. This places (as expected) all files at the root regardless of where they exist in the actual directory structure.
Is there a simple way to transfer all of the files while maintaining the directory structure of the source? I'm sure I could put something together using Split-Path, but I just wanted to make sure that I wasn't overlooking something before I went any further.
Thanks.
Per request converted from the comments:
geekswithblogs.net has a solution for recursive FTP copy.
Perhaps the Microsoft documentation can help here:
The URI may be relative or absolute. If the URI is of the form "ftp://contoso.com/%2fpath" (%2f is an escaped '/'), then the URI is absolute, and the current directory is /path. If, however, the URI is of the form "ftp://contoso.com/path", first the .NET Framework logs into the FTP server (using the user name and password set by the Credentials property), then the current directory is set to /path.
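If it helps, here is a minimal sketch of one way to preserve the directory structure: recurse through the local tree, create each remote folder first, then upload its files. It assumes $ftp (ending in "/") and $webclient with credentials are already set up as in the script above; the Upload-Directory function and the FtpWebRequest calls are my additions, not part of the original script.
function Upload-Directory($localPath, $remoteUri) {
    foreach ($item in Get-ChildItem -Path $localPath) {
        if ($item.PSIsContainer) {
            $newRemote = $remoteUri + $item.Name + "/"
            # create the remote directory before descending into it
            $request = [System.Net.FtpWebRequest]::Create($newRemote)
            $request.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
            $request.Credentials = $webclient.Credentials
            try { $request.GetResponse().Close() } catch { }  # ignore "directory already exists" errors
            Upload-Directory $item.FullName $newRemote
        } else {
            "uploading $($item.FullName)"
            $webclient.UploadFile($remoteUri + $item.Name, $item.FullName)
        }
    }
}
Upload-Directory $path $ftp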
I need to search for a string and then replace it with another in multiple files. Sounds easy, but the hard part is that it's multiple files in multiple network locations. I've tried connecting to all of the locations at once with VS Code and then using the built-in search and replace function. This almost works, except that on big searches it seems to hang.
I'm now looking for another, more stable, way to do this. Anybody got any ideas? I thought PowerShell could be a good candidate, but unfortunately I'm not that used to working with it.
I found this guide and it's a bit like what I want, except I need to do it on multiple files at multiple locations at once.
https://mcpmag.com/articles/2018/08/08/replace-text-with-powershell.aspx
I would settle for running one script per location since there are fewer than 20 locations to scan. But it needs to include subfolders.
Any tips are appreciated, thanks! :)
Edit 1:
The folder structure differs from location to location so it's hard to say how it looks. But I can say that no location has a folder structure deeper than 15 levels. The text I'm replacing consists of certificate thumbprints stored in .config files. The files are between 100 and 1000 characters long, and the thumbprints I'm replacing look something like this: d2e8c58e5b34021671f2121483572f03f54ab9ae
This assumes that the different network locations are in trusted domains, or are at least part of the WinRM TrustedHosts list. PowerShell remoting will also need to be enabled on all computers involved; run Enable-PSRemoting -Force (in an elevated PowerShell) to enable it.
$command = {
    Get-ChildItem -Path C:\Test\ -Include *.config -File -Recurse | ForEach-Object {
        $configContent = Get-Content -Path $_.FullName -Raw
        $configContent.Replace("Old Value", "New Value") | Out-File -FilePath $_.FullName -Force
    }
}
Invoke-Command -ComputerName "TestServer1", "TestServer2", "etc..." -ScriptBlock $command
If you are not part of the domain but have a domain/server login, you will need to use the -Credential parameter of Invoke-Command. This will basically find all files that have the .config extension in any subfolder of the path, get the current content of each .config file, replace your value, and finally overwrite the existing config file. WATCH OUT THOUGH: this will touch EVERY .config file in that path. If you have more than one it will also grab it, but if it doesn't contain the string it will just rewrite the same file unchanged.
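For example (a sketch; the server names are the placeholders used above, and Get-Credential will prompt for the domain/server login):
$cred = Get-Credential
Invoke-Command -ComputerName "TestServer1", "TestServer2" -Credential $cred -ScriptBlock $command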
Without seeing an example of the folder structures and files, it is quite hard to give a thorough answer. However, I would probably build a series of ForEach loops. For example:
ForEach ($Server in $Servers)
{
    ForEach ($File in $Files)
    {
        Select-String -Path $File -Pattern "$ExampleString"
    }
}
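One way to flesh that skeleton out into an actual replacement, as a sketch only (the server list, the UNC path, and the new thumbprint value are placeholders; the old thumbprint is the example value from the question):
$Servers       = "TestServer1", "TestServer2"                # placeholder server names
$OldThumbprint = "d2e8c58e5b34021671f2121483572f03f54ab9ae"  # example value from the question
$NewThumbprint = "0000000000000000000000000000000000000000"  # placeholder replacement
foreach ($Server in $Servers)
{
    # placeholder UNC path to each location's config folder
    $Files = Get-ChildItem -Path "\\$Server\c$\Test" -Include *.config -File -Recurse
    foreach ($File in $Files)
    {
        $content = Get-Content -Path $File.FullName -Raw
        if ($content -match $OldThumbprint)
        {
            $content.Replace($OldThumbprint, $NewThumbprint) | Set-Content -Path $File.FullName
        }
    }
}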
I am attempting to access a DLNA server, navigating the directory structure that is created by clicking through it from My Computer, like the path used below.
Now I want to read the file names in the folder and write them out. This is the syntax I have tried:
Get-ChildItem "This PC\Serviio (AMDDesktop)\Video\Folders\TV Shows\Battlebots\Season 01"
Foreach-Object {
$content = Get-Content $_.FullName
Write-Host $content
}
But it produces an error that says the path does not exist.
What would be the proper way to iterate these files? Or better yet, maybe the proper way to word the question is: how do I get the address of these files so I can iterate over them?
I have two directories:
C:\G\admin\less
C:\G\user\less
Inside of those directories I have multiple less and css files. I know all the names of the files, so I would like to hardcode a list of them into the script. Perhaps this could be in an array or something, but my knowledge of PowerShell is not even enough to know whether there are arrays in the scripting language.
C:\G\admin\less
html-light.less
html-light.css
html-dark.less
html-dark.css
do-not-track-me.less
C:\G\user\less
html-light.less
html-light.css
html-dark.less
html-dairk.css
do-not-track-me.less
Is there a way I can use PowerShell to check each of these files (that I want to hardcode in my program) one by one and copy the last modified file from its directory to the other directory so that both directories will contain the same latest versions of these files?
Note that I would need to evaluate the predefined list of files one by one. Comparing modified date in one directory with the other and copying over as needed.
Again, I don't claim this is the best solution or approach.
This solution assumes the following:
- When the LastWriteTime of a file in one folder is newer than its counterpart in the other, it copies that file to the other folder.
- I'm not doing path validation out of laziness, but if you want path validation, just ask.
- I'm assuming that all the files in those folders must be tracked; otherwise, read the comments in the code.
- I suggest you back up your folders before you run the script.
#If there is a file you don't want to track in those folders (for example, you don't want to track txt files),
#just write: $userFiles = dir C:\G\user\less\ -Exclude "*.txt"
#If you don't want to track txt files but one of them should still be tracked along with the other file formats:
#$userFiles = dir C:\G\user\less\ -Exclude "*.txt" -Include "C:\G\user\less\Txtadditionaltotrack.txt"
$userFiles  = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\
foreach ($userfile in $userFiles)
{
    $exactadminfile = $adminfiles | ? { $_.Name -eq $userfile.Name } | Select -First 1
    #My suggestion is to validate that it actually found the file.
    #Out of laziness I am not calling Test-Path here;
    #I'm assuming both directories are exact copies of each other, so it will find the file.
    if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
    {
        Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
        Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
    }
    else
    {
        Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
        Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
    }
}
You can improve it: the way this code is written, it always copies a file from one directory to the other. Inside the else you can add a check so that when the LastWriteTime is equal on both sides it doesn't copy anything.
You can improve it in many ways; I hope you get the idea.
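For example, a minimal tweak (keeping the same variable names as above) that skips the copy when both timestamps are equal:
if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
{
    Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
}
elseif ($userfile.LastWriteTime -gt $exactadminfile.LastWriteTime)
{
    Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
}
# when both LastWriteTime values are equal, nothing is copied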
Here is the code modified so that it can achieve this requirement.
PLEASE READ THE COMMENTS IN THE CODE.
NOTE THAT I'M NOT FOLLOWING BEST PRACTICES (avoiding unexpected errors, naming all variables correctly, ...).
#To make it more dynamic you can keep all the file names (including extensions)
#in a file, one name per line, for example at C:\FilesToWatch\watcher.txt,
#and load it with:
#$filestowatch = get-content C:\FilesToWatch\watcher.txt
$filestowatch = "felicio.txt", "marcos.txt"
$userFiles  = dir C:\G\user\less\
$adminfiles = dir C:\G\admin\less\
#Optionally, instead of the if statement below you can filter up front:
#$userFiles  = dir C:\G\user\less\  | ? { $filestowatch -contains $_.Name }
#$adminfiles = dir C:\G\admin\less\ | ? { $filestowatch -contains $_.Name }
#Loaded that way, the first if statement in the loop below can be removed, because
#we make sure $userFiles and $adminfiles only contain the files to monitor.
foreach ($userfile in $userFiles)
{
    if ($filestowatch -contains $userfile.Name)
    {
        $exactadminfile = $adminfiles | ? { $_.Name -eq $userfile.Name } | Select -First 1
        #My suggestion is to validate that it actually found the file.
        #Out of laziness I am not calling Test-Path here;
        #I'm assuming both directories are exact copies of each other, so it will find the file.
        if ($exactadminfile.LastWriteTime -gt $userfile.LastWriteTime)
        {
            Write-Verbose "Copying $($exactadminfile.FullName) to $($userfile.FullName)"
            Copy-Item -Path $exactadminfile.FullName -Destination $userfile.FullName -Force
        }
        else
        {
            Write-Verbose "Copying $($userfile.FullName) to $($exactadminfile.FullName)"
            Copy-Item -Path $userfile.FullName -Destination $exactadminfile.FullName -Force
        }
    }
}
I think what you need is symbolic links, aka symlinks.
With symlinks, you can define files and folders that will always stay in sync: the linked path reflects changes to the original automatically.
To create a symbolic link, enter the following in the console/command prompt:
mklink /[command] [link path] [file or folder path]
Mklink can create several types of links, according to these commands:
/D – creates a directory symbolic link (a soft link to a folder, similar to a standard shortcut in Windows). Without any switch, mklink creates a file symbolic link.
/H – creates a hard link to a file.
/J – creates a directory junction, which behaves like a hard link for folders.
The syntax is simple. Choose your option, define the path you want for the symlink, and finally the path of the original file/folder.
For example, imagine I'm developing a new project and I want to share it with my client via a Dropbox shared folder. I don't want to move my whole workspace to Dropbox; I just want to share that specific folder with them:
mklink /J C:\Dropbox\clients_shared_folders\project_x C:\my_dev_rootfolder\project_x
Note that the first path is the symbolic folder I want to create, while the second path is the existing directory.
In your case, I'll assume you're working in the admin folder and want to generate a synced copy in the user folder:
mklink /J C:\G\user\less C:\G\admin\less
Here's a nice article for more info:
http://www.howtogeek.com/howto/16226/complete-guide-to-symbolic-links-symlinks-on-windows-or-linux/
I've searched numerous MSDN/Technet and StackOverflow articles regarding this but I can't find a solution to my problem.
SO references below.
I am trying to run a script on my server that simply counts the files in a folder on a network location.
I can get it working if it's a local folder, and I can get it working when I map the network drive. However I can't use a network drive because I'll be running this script from a web interface that doesn't have a user account (local drives work fine).
My script is:
$Files = Get-ChildItem \\storage\folder -File
$Files.count
I get the error:
Get-ChildItem : Cannot find path '\\storage\folder' because it does not exist.
[0] open folder from Network with Powershell
[1] File counting with Powershell commands
[2] Count items in a folder with PowerShell
[3] Powershell - remote folder availability while counting files
Two things that I can think of:
One would be to add -Path to your Get-ChildItem call. I tested this in my PowerShell and it works fine.
$files = get-childitem -path C:\temp
$files.count
This returns the number of files in that path.
However, I am testing this on a local file. If you are sure it is the remote access part giving you trouble, I would suggest trying to set credentials. Besides the Get-Credential option, you could also try setting them yourself (note that the PSCredential constructor needs the password as a SecureString):
$Password = ConvertTo-SecureString "password" -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential("Username", $Password)
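From there, a rough sketch of mapping the share with those credentials and counting the files (the drive name "Storage" is just my placeholder; the UNC path is the one from the question):
# map the share as a temporary PowerShell drive using the credentials built above
New-PSDrive -Name Storage -PSProvider FileSystem -Root \\storage\folder -Credential $Credentials | Out-Null
$Files = Get-ChildItem Storage:\ -File
$Files.Count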
That way you can set the drive and still be able to access your files. Hope that helps.
Try this:
Set-Location \\storage\folder\
dir -Recurse | Where-Object { $_.PSIsContainer } | ForEach { Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
This will count the number of items in each sub-folder (recursively) and display the full path and count in the output.
I have created a PowerShell script for copying files to a directory. The script first creates a folder, or forces a new folder even if it exists, then copies a directory from another location. After copying the files, I then need to copy the correct web config depending on a value given by the user executing the script. The issue I am having is that I can copy the files, but all the files are set to read-only, meaning that when I try to copy the correct web.config, the script fails as access is denied.
This is a cut down version of script for simplicity.
$WebApp_Root = 'C:\Documents and Settings\user\Desktop\Dummy.Website'
$Preview_WebApp_Root = 'c:\applications\Preview\'
$Choice = read-host("enter 'preview' to deploy to preview, enter Dummy to deploy to Dummy, or enter test to deploy to the test environment")
if (($Choice -eq 'Preview') -or ($Choice -eq 'preview'))
{
$Choice = 'Preview'
$Final_WebApp_Root = $Preview_WebApp_Root
}
write-host("Releasing Build to " + $Choice +'...')
write-host("Emptying web folders or creating them if they don't exist... ")
New-Item $Final_WebApp_Root -type directory -force
write-host("Copying Files... ")
Copy-Item $WebApp_Root $Final_WebApp_Root -recurse
write-host("Copy the correct config file over the top of the dev web config...")
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config
write-host("Copying correct nhibernate config over")
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config
write-host("Deployed full application to environment")
Try using the -Force parameter to replace read-only files. From the documentation:
PS> help Copy-Item -Par force
-Force [<SwitchParameter>]
Allows the cmdlet to copy items that cannot otherwise be changed,
such as copying over a read-only file or alias.
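Applied to the script above, that would mean adding -Force to the copy lines (a sketch reusing the original variable names):
Copy-Item $WebApp_Root $Final_WebApp_Root -Recurse -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config -Force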