I have a text file with a list of user names separated by semicolons; the user names in the text file look like: user1; user2; user3, etc. Each user has a network folder located at \\testserver\users\user1, \\testserver\users\user2, and so on.
I am trying to have a PowerShell script read the text file and copy each user's folder, with all the data in it, from that location to another location called \\testserver\newusers\users. However, when I launch the script I have written so far, it just creates a folder with a user name from the text file. Below is what I have so far:
$File = Get-Content .\MyFile.txt
$File | ForEach-Object {
    $_.Split(';') | ForEach-Object {
        Copy-Item -Path "$_" -Destination '\\testserver\newusers\users'
    }
}
I am launching my PowerShell .ps1 file from a location that has the myfile.txt file in it.
How do I get this to work properly?
Call Copy-Item with the -Recurse parameter if you want to copy each folder's content as well; otherwise only the folder itself is copied (without its content). You also need to provide the full path to the source folders unless you run the script from \\testserver\users.
Something like this should work:
$server = 'testserver'
$src = "\\$server\users"
$dst = "\\$server\newusers"
(Get-Content .\MyFile.txt) -split ';' | % {
Copy-Item -Path "$src\$_" -Destination "$dst\" -Recurse
}
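One caveat: the sample list in the question has spaces after the semicolons (user1; user2; user3), so the split entries can carry leading spaces and the resulting source paths won't resolve. A variant that trims each entry and skips blanks (a sketch, reusing the $src and $dst variables defined above):
(Get-Content .\MyFile.txt) -split ';' | ForEach-Object { $_.Trim() } | Where-Object { $_ } | ForEach-Object {
    # Copy each user's folder, including its contents, into the destination share
    Copy-Item -Path "$src\$_" -Destination "$dst\" -Recurse
}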
I'm trying to get the names of the files within a directory and the subdirectories under that "parent" directory.
I managed to solve my problem in part, but for the last directory the script is multiplying the files (as if it were checking that folder twice).
If I have 2 items there, it exports 4 rows for that last directory.
I managed to get the files from the subdirectories, but if there are any files in the root (parent) directory, they are not listed.
Parent dir "\XML"
File1.xlsx
File2.txt
Folder2
Dir "Folder2"
File1.xml
File2.xml
So the exported CSV file has 4 rows, with the files from the "Folder2" folder duplicated, and it doesn't include the files in the "XML" folder, which is the parent folder.
PowerShell script:
# To execute the script without agreeing with the execution policy
Set-ExecutionPolicy Bypass -Scope Process
# Defines the parent directory; all files and folders inside it will be listed
$DirPai = 'D:\Users\F02579\Desktop\XML'
# Variable to store all directories within the parent directory
$DirPastas = (Get-ChildItem -Directory -Recurse $DirPai).FullName
# Variable that will keep the final result
$results = @()
foreach ($Dir in $DirPastas)
{
# Write the directory name
Write-Host $Dir
# Get the file details
$Arquivos = Get-ChildItem -Path $Dir -Recurse | Select-Object Directory, Name, Lenght, LastWriteTime, @{Name="Extension";Expression={$_.Extension}} #| Where-Object "Extension" -ne ''
# Store the result for each path
$results += $Arquivos
}
# Defines the directory to which the final file will be exported
$DiretorioExportacao = 'D:\Users\F02579\Desktop\XML\I_PI_LISTFILES.csv'
# Export the result to CSV in the previously informed directory
$results | Export-Csv -Path $DiretorioExportacao -NoTypeInformation -Encoding UTF8
You could reduce your code to the following.
Keep only the first Get-ChildItem. Remove the -Directory switch (i.e. get files and folders).
Pipe directly to Select-Object. Don't bother with an explicit loop.
This will give a result with five items: four files and one folder.
You also had a typo with 'length'.
Cheers.
# To execute the script without agreeing with the execution policy
Set-ExecutionPolicy Bypass -Scope Process
# Defines the parent directory; all files and folders inside it will be listed
$DirPai = 'D:\Users\F02579\Desktop\XML'
$results = (Get-ChildItem -Recurse $DirPai) | Select-Object Directory, Name, Length, LastWriteTime, @{Name="Extension";Expression={$_.Extension}} #| Where-Object "Extension" -ne ''
# Defines the directory to which the final file will be exported
$DiretorioExportacao = 'D:\Users\F02579\Desktop\XML\I_PI_LISTFILES.csv'
# Export the result to CSV in the previously informed directory
$results | Export-Csv -Path $DiretorioExportacao -NoTypeInformation -Encoding UTF8
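If only file rows are wanted in the CSV (the four files, without the folder row), the commented-out filter hints at that; a simpler variant is to let Get-ChildItem do the filtering with -File, so the $results line could instead be (a sketch):
# Files only; the Folder2 row is no longer included in the export
$results = Get-ChildItem -Recurse $DirPai -File | Select-Object Directory, Name, Length, LastWriteTime, @{Name="Extension";Expression={$_.Extension}}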
You already have all the directories captured, so in your loop you only need to specify the -File parameter, and there is no need for the -Recurse parameter.
$Arquivos = Get-ChildItem -Path $Dir -File |
    Select-Object @{Name="Directory";Expression={$Dir}},
        Name, Length, LastWriteTime,
        @{Name="Extension";Expression={$_.Extension}}
HTH
I am new to PowerShell scripting and need some help with a script. The script I am looking for should search for folders matching the entries in a .csv file; please note that it should find the folders on the drive without knowing the path and move them to the destination.
I did some research and created the script below, which takes data from a .txt file and moves it to the destination; however, it does not work if I just write C:\ in place of the source folder.
Please help me :)
Get-Content C:\abc.txt |
    Foreach-Object {
        move-item -path "C:\0123\$_" -destination "C:\To Archive\$_"
    }
With what you've given, I'd do something like the following:
$File = Import-Csv C:\share\test\files.txt
Foreach($fileName in $File.FileName)
{
Move-Item -path "C:\share\test\OldLocation\$fileName" -Destination "C:\share\test\NewLocation\$fileName"
}
I did this with a .csv file that had one column whose title was FileName. Notable differences from your code include using the Import-Csv cmdlet and specifying the .csv header title in the foreach loop.
If you wanted to do this with a single command:
Import-Csv C:\share\test\files.txt | ForEach-Object {
Move-Item -path "C:\share\test\OldLocation\$($_.[csvHeader])" -Destination "C:\share\test\NewLocation\$($_.[csvHeader])"
}
Where csvHeader is the title of the column in your .csv file.
My .csv file looked like:
FileName
file1.txt
file2.txt
file3.txt
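The question also asks about finding the folders anywhere on the drive without knowing their exact path. A possible sketch (it assumes a .csv column called FolderName, searches all of C:\, and takes the first match found; the destination folder is the one from the question's own script):
# Locate each folder named in the CSV anywhere under C:\ and move it
$destination = 'C:\To Archive'
Import-Csv C:\share\test\files.txt | ForEach-Object {
    $name = $_.FolderName
    # Searching a whole drive can be slow and may hit access-denied errors, hence -ErrorAction SilentlyContinue
    $found = Get-ChildItem -Path C:\ -Directory -Recurse -Filter $name -ErrorAction SilentlyContinue | Select-Object -First 1
    if ($found) {
        Move-Item -Path $found.FullName -Destination (Join-Path $destination $name)
    }
    else {
        Write-Warning "Folder '$name' not found on C:\"
    }
}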
I have a list of share paths in a text file. I am trying to read the files and folders in each path and export them to a CSV file using a PowerShell script, but I get some CSV files of 0 KB.
So I tried to test the existence of those network paths using the Test-Path command. A few paths show as existing, but when I try to list the directories of an existing path using Dir with the share path name, I get an error like "The specified network name is no longer available". Why?
Sharing the code below:
foreach ($dir in (Get-Content $infile)) {
    $outfilecsv = 'jerin-Download' + '.csv'
    Get-ChildItem -Path $dir -Filter *.* -Recurse |
        Select-Object Name,
            @{Name="Owner";Expression={(Get-ACL $_.FullName).Owner}},
            CreationTime,
            @{Name="FileModifiedDate";Expression={$_.LastWriteTime}},
            @{Name="FileAccessedDate";Expression={$_.LastAccessTime}},
            @{Name="Attributes";Expression={$_.Attributes}},
            @{l='ParentPath';e={Split-Path $_.FullName}},
            @{Name="DormantFor(days)";Expression={[int]((Get-Date)-$_.LastWriteTime).TotalDays}},
            @{N="FileCategory";E={Get-FileSizeCategory($_)}},
            @{Name="Size";Expression={if($_.PSIsContainer -eq $True){(New-Object -com Scripting.FileSystemObject).GetFolder($_.FullName).Size} else {$_.Length}}} |
        Export-Csv -Path $outfilecsv -Encoding ascii -NoTypeInformation
}
Can anyone suggest what is going wrong?
Thanks
Jerin
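One way to narrow this down is to verify each share right before enumerating it and to record the failures, so unreachable paths don't just produce empty CSV output. A minimal sketch, not a verified fix (the reduced column list is only for brevity, and -Append requires PowerShell 3.0 or later):
foreach ($dir in (Get-Content $infile)) {
    # Re-check the share just before enumerating; a share can pass Test-Path earlier
    # and still drop the connection by the time Get-ChildItem walks it.
    if (-not (Test-Path -Path $dir)) {
        Write-Warning "Cannot reach '$dir' - skipping"
        continue
    }
    try {
        Get-ChildItem -Path $dir -Recurse -ErrorAction Stop |
            Select-Object Name, FullName, LastWriteTime |
            Export-Csv -Path 'jerin-Download.csv' -Append -NoTypeInformation -Encoding ascii
    }
    catch {
        # A network drop in mid-enumeration surfaces here instead of leaving a 0 KB file
        Write-Warning "Failed while listing '$dir': $($_.Exception.Message)"
    }
}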
First I will give a brief overview of what I'm trying to achieve. I want to go through a series of HTML files, replace code, and then re-save those HTML files. This all works; however, the PowerShell command only does this for HTML files on the default PowerShell path (for me this is the H drive).
I want to have a separate folder that contains my PowerShell script and the HTML files, and convert them in that folder, NOT on the H drive.
The code I have is as follows:
PowerShell script:
$HTMLfiles=get-childitem . *.html -rec
foreach ($files in $HTMLfiles)
{
    (Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
This successfully replaces 'this text' with 'TEST' in all HTML files on the H drive. How can I make it change the HTML files in the folder where the PowerShell script is located, NOT on the H drive?
I appreciate any help.
Thanks
Use the built-in variable called $PSScriptRoot to retrieve the files from the same folder where the PowerShell script resides.
Get-ChildItem -Path $PSScriptRoot\* -Include *.html;
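Wiring that into the loop from the question (a sketch; the search text and replacement are the ones from the question, and -Recurse is kept so subfolders next to the script are processed too):
# Process the HTML files under the folder where the script itself resides
$HTMLfiles = Get-ChildItem -Path $PSScriptRoot -Include *.html -Recurse
foreach ($file in $HTMLfiles)
{
    (Get-Content $file.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $file.PSPath
}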
In your script, you ask the Get-ChildItem cmdlet to look for items in the current directory. To make the script look for files in another directory, you just have to pass that directory to Get-ChildItem:
$HTMLpath="C:\path\to\your\html\files"
$HTMLfiles=get-childitem $HTMLpath *.html -rec
foreach ($files in $HTMLfiles)
{
    (Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
Edit:
If you want the path to be passed as an argument to your script, just do the following:
param($HTMLpath)
$HTMLfiles=get-childitem $HTMLpath *.html -rec
foreach ($files in $HTMLfiles)
{
    (Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
Then you can call your script in the console (assuming you are in the directory where your script is): ./myscript "C:\path\to\your\files"
Calling Get-ChildItem . *.html -Rec will get all files under the current working directory. If you happen to be in the same folder as your script when you call it, I'd expect it to work as you want. If you call the script from another path, e.g. by setting up a scheduled task to run powershell.exe <path_to_script>, then it may not pick up the files you want. Maybe H: is the root of your Windows user profile?
As per other answers, using $PSScriptRoot or passing the path under which the .html files reside as a parameter would be good. To combine both, you can add a parameter to your script AND set the default value for that parameter to be $PSScriptRoot:
param($HTMLpath = $PSScriptRoot)
This will (1) allow you to specify a remote path if necessary and (2) otherwise default to the path where the script is saved.
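For example (using a hypothetical script name, replace.ps1, just for illustration):
# Run with no arguments to process the HTML files next to the script itself
.\replace.ps1
# Or pass a path to process the HTML files under that folder instead
.\replace.ps1 "C:\path\to\your\html\files"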
I am trying to write a PowerShell script that will read a text file on my desktop that is filled with user names, then go out to a specified folder on our network share, let's say u:\data, and copy the contents of that folder to another network share, let's say y:\information, for each user in the text file.
How would this be written?
I have tried several approaches to reading the text file and several commands to copy the data, but each of them failed.
UPDATE:
Below is what I have done so far:
$user = Get-Content "test.txt"
$path = "\\abnas2\abusers\users"
$path2 = "\\abnas2\abdept\dept\testcopy"
$Copy = Copy-Item -path $path\$user\* -Destination $path2\$user
I had one user name in the test.txt file, user1, and it pulled the name and copied perfectly.
Now if I add more than one name to the test.txt file and run the above, it errors out. The error it returned made it look like the 3 user names in the list were being treated as one user name.
What I need this to do is run the command for each name in the list. I was thinking I could use the foreach command but I'm not sure how to do it.
UPDATE - 04/09/2014:
I have tried the following and am getting an error back:
$user = Get-Content "test.txt"
$path = "\abnas2\abusers\users"
$path2 = "\abnas2\abdept\dept\testcopy"
$Copy = Copy-Item -path $path\$user* -Destination $path2\$user
foreach($username in $user) {
    Copy-Item -path $path\$username* -Destination $path2\$username\
}
When I run it I am getting the following error:
Copy-Item : An object at the specified path \\abnas2\abusers\users\user1 user2 user3 does not exist.
Those are the names in my test.txt file. Is there a way to get it to read one line at a time, execute the copy, and when done move on to the next name in the list and do the same? I'm not sure how to get it to do that.
You can use foreach.
In this case:
foreach($username in $user) {
    Copy-Item -path $path\$username\* -Destination $path2\$username\
}
would copy the contents of each named folder in $user under $path to its corresponding folder in $path2.
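Two caveats worth hedging (both assumptions about the environment, since the question doesn't say whether the destination folders already exist): Copy-Item may fail if $path2\$username does not exist yet, and without -Recurse the contents of any subfolders inside a user's folder are not copied. A slightly more defensive sketch:
$user  = Get-Content "test.txt"
$path  = "\\abnas2\abusers\users"
$path2 = "\\abnas2\abdept\dept\testcopy"
foreach ($username in $user) {
    # Make sure the destination folder exists before copying into it
    New-Item -Path "$path2\$username" -ItemType Directory -Force | Out-Null
    # -Recurse also brings along any subfolders inside the user's folder
    Copy-Item -Path "$path\$username\*" -Destination "$path2\$username\" -Recurse
}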