How do I output the files being copied in console? - powershell

I'm writing a PowerShell code to copy files in a folder to another folder. I want the console to display the files that are being copied, as well as the file size if possible.
Currently, I have tried using -Verbose, but its output is not very readable.
I would like the console to display each file as it is copied, along with its size.

You can use the -PassThru parameter of Copy-Item, but it will not show you the file size.
I would recommend using robocopy.exe for copy jobs in PowerShell. It is more reliable, and in your case it will show you the file size.
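For example, here is a minimal sketch that copies files one at a time and prints each name with its size; $source and $destination are placeholder paths for illustration:

Get-ChildItem -Path $source -File | ForEach-Object {
    # Copy the file, then report its name and size in KB
    Copy-Item -Path $_.FullName -Destination $destination
    '{0} ({1:N1} KB)' -f $_.Name, ($_.Length / 1KB)
}

If you go the robocopy route, its default output already lists each file with its size, e.g. robocopy C:\Source C:\Destination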

Related

My script can read a text file when run manually through ISE, but it looks in a different directory when run through Task Scheduler

Powershell noob here.
I have a script for copying PDF documents and CSV files. The script gets the CSV data from a URL defined in a .txt file in the same directory as the script. In the script, the file is determined like this:
$publishedCSV = Get-Content .\DriveURL.txt -Raw
When I run this script in ISE, it works fine and retrieves all the CSV data. However, when I run it through Task Scheduler, it tries to find the DriveURL file in System32 rather than in the path that is specified (I used a transcript to find out what was happening).
I figured that out and defined the FULL path of DriveURL rather than just using the .\ notation. It works, but I don't know why it works.
What I did:
Specified the proper path of DriveURL, and now my script works. I don't understand why it worked with .\DriveURL.txt rather than the full path when run in ISE, but not when run through Task Scheduler. It's the same script.
If you use relative paths, then you must also either set the working directory or change to the appropriate directory in the script before referencing those relative paths. Task Scheduler starts PowerShell with C:\Windows\System32 as the default working directory, which is why .\DriveURL.txt resolved there. Alternatively, you can use full paths, as you have already discovered.
A simple use of cd or pushd with the automatic $PSScriptRoot variable will change your working directory to wherever the script is saved:
pushd $PSScriptRoot
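Alternatively, a minimal sketch that avoids relying on the working directory at all by resolving the file against the script's own folder ($PSScriptRoot is populated automatically in scripts on PowerShell v3 and later):

$publishedCSV = Get-Content -Path (Join-Path $PSScriptRoot 'DriveURL.txt') -Raw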

Powershell file compression to multiple zip files

In a Powershell script, I've created zip file archives using functions like
[io.compression.zipfile]::CreateFromDirectory.
Now these archives are getting large, and I need to break them down into files that are under 5 GB. So I've been looking through some of Microsoft's API documentation on file compression for some type of disk-spanning feature, or a way to spread an archive over multiple files.
Does anyone know of a .NET API or PowerShell cmdlet that can do this?
Thanks!
I guess you've already read about the file size limitation:
powershell compress-archive File size issue
zip file size in powershell
Compress-Archive
Because Compress-Archive relies upon the Microsoft .NET Framework API System.IO.Compression.ZipArchive to compress files, the maximum file size that you can compress by using Compress-Archive is currently 2 GB. This is a limitation of the underlying API.
Maybe you could use 7-Zip's volume option, e.g. -v4g to cap each volume at 4 GB.
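A minimal sketch of that, assuming 7z.exe is on your PATH and using placeholder paths; the -v switch splits the archive into volumes (archive.zip.001, archive.zip.002, ...):

# Create a zip split into volumes of at most 4 GB each
& 7z a -tzip 'D:\Backup\archive.zip' 'C:\Data\*' -v4g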

Powershell - Verifying contents of zip files

I wrote a simple PowerShell script that takes a set of files and compresses them into a zip file using the ZipFile .NET Framework class. What I'd like to do is to verify that the files compressed without issues.
I can create a hash value for the zip file itself, but I'm unsure how to do this with each individual file in the archive, or how to compare each uncompressed file to the compressed version. Here's the compression piece of my script.
# ZipFile lives in System.IO.Compression.FileSystem, which PowerShell v3 does not load by default
Add-Type -AssemblyName System.IO.Compression.FileSystem

$FileList | ForEach-Object -Begin {
    # Create the archive once before the files are processed
    $Zip = [System.IO.Compression.ZipFile]::Open("Destination.Zip", "Create")
} -Process {
    # Add each file as an entry using Optimal compression
    [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($Zip, $_.FullName, $_.Name, "Optimal")
} -End {
    # Dispose to flush and release the archive file handle
    $Zip.Dispose()
}
I know the compression piece works, however the eventual goal is to verify each file and redo that file if the verification fails. Afterwards delete the uncompressed files.
The system this will run on only has PowerShell v3, and no third-party compression tools are installed. I'd like to stick with that if possible.
I guess your most direct shot would be to use 7-Zip.
You can use native commands in PowerShell.
You can find an example use of the -t switch here.
Get the CLI, add the 7-Zip folder to your path, and run, for example:
7z t archive.zip
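A minimal sketch of wiring that into a script, branching on 7-Zip's exit code (0 means the test found no errors; archive.zip is a placeholder name):

7z t archive.zip
if ($LASTEXITCODE -ne 0) {
    Write-Warning "Verification failed for archive.zip (exit code $LASTEXITCODE)."
}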

Powershell: How to copy a file from TFS to a different destination

I'm writing a powershell script for deployment. I need to copy changed files from TFS to our Test Server. I have been able to retrieve the change sets, and I have been able to drill down to the Item. I have access to the path of the source file.
Does anyone know an efficient way of doing this? Do I need to use the DownloadFile method, or can I just use the Copy-Item cmdlet?
The path of the source file is $file.ServerItem, which resolves to, for example, $/Project/PromonetBaseline/Main/Source/ItemHierarchy.vb
The destination is a path like \\104Server\WebApps\PromonetBaseline\Main\Source\ItemHierarchy.vb
Is there a neat way to do this programmatically?
Any input is appreciated.
Thanks,
Akin
For something like this, I would set up a local working folder mapping for the source files, get those files, and then use Copy-Item to copy them to the destination folder. You can use the -Force parameter on Copy-Item to overwrite an existing file.
Another option is to use tf view itemspec /i > tempfilename to get the files from the server without creating a local working folder mapping.
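A minimal sketch of the second option, assuming tf.exe is on your PATH and that tf view /console writes the file's content to stdout (fine for text files like the .vb source here); the paths are the ones from the question:

# Single quotes keep PowerShell from expanding '$/Project...' as a variable
$serverItem  = '$/Project/PromonetBaseline/Main/Source/ItemHierarchy.vb'
$destination = '\\104Server\WebApps\PromonetBaseline\Main\Source\ItemHierarchy.vb'

# Write the server version to a temp file, then copy it into place
$temp = [System.IO.Path]::GetTempFileName()
tf view $serverItem /console | Set-Content -Path $temp
Copy-Item -Path $temp -Destination $destination -Force
Remove-Item $temp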

Copy-Item cmdlet should only copy difference

I'm writing a backup script using PowerShell. I know that I could just use robocopy or rsync, but I would like to do this in PowerShell. The problem I have has to do with the Copy-Item cmdlet. What my script does:
Reads variables and fills them from a CSV
Pings the destination host
Checks if Outlook is open on the source host and asks if it should close it
Then it should copy some folders onto the destination
My problem is that it always does a full copy of all files. I would like it to only copy the files that were changed or do not exist on the destination.
The second problem I have is that in Win7 there are hidden system folders in "Users/Documents" that link to "My Pictures" and "My Videos". I don't need to copy these, but I didn't manage to exclude them using the -Exclude argument.
Just some quick and general suggestions...
I would like it to only copy the files that were changed
I would copy a file only after comparing the source and destination last-modified dates.
The second problem I have is that in Win7 there are hidden system folders in "Users/Documents" that link to "My Pictures" and "My Videos". I don't need to copy these, but I didn't manage to exclude them using the -Exclude argument:
Do not use Copy-Item directly; instead, feed it the output of Get-ChildItem without the -Force parameter. This will prevent copying hidden or system files.
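A minimal sketch covering both points, with placeholder paths; without -Force, Get-ChildItem skips hidden/system items such as the "My Pictures" and "My Videos" junctions, and the timestamp check copies only new or changed files:

$source = 'C:\Users\Me\Documents'
$dest   = 'D:\Backup\Documents'

Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
    # Rebuild the file's path relative to $source under $dest
    $relative = $_.FullName.Substring($source.Length).TrimStart('\')
    $target   = Join-Path $dest $relative
    $existing = Get-Item -Path $target -ErrorAction SilentlyContinue
    # Copy only if the file is missing or newer than the destination copy
    if (-not $existing -or $_.LastWriteTime -gt $existing.LastWriteTime) {
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item -Path $_.FullName -Destination $target -Force
    }
}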