I'm trying to convert a few thousand home videos to a smaller format. However, encoding the videos changed the created and modified timestamps to today's date. I wrote a PowerShell script that successfully (somehow) worked by writing the original file's modified timestamp to the new file.
However, I couldn't find a way in PowerShell to modify the "Media created" timestamp in the file's details properties. Is there a way to add a routine that would either copy all of the metadata from the original file, or at least set the "Media created" field to the modified date?
When I searched for file attributes, it looked like the only options are archive, hidden, etc. Below is the PowerShell script that I made (please don't laugh too hard, haha). Thank you!
$filepath1 = 'E:\ConvertedMedia\Ingest\' # directory with incorrect modified & create date
$filepath2 = "F:\Backup Photos 2020 and DATA\Data\Photos\Photos 2021\2021 Part1\Panasonic 3-2-21\A016\PRIVATE\PANA_GRP\001RAQAM\" # directory with correct date and same file name (except extension)
$destinationCodec = "*.mp4" # Keep * in front of extension
$sourceCodec = ".mov"
Get-ChildItem $filepath1 -File $destinationCodec | Foreach-Object { # change *.mp4 to the extension of the newly encoded files with the wrong date
$fileName = $_.Name # sets fileName variable (with extension)
$fileName # Optional - used during testing; sends the file name to the console
$fileNameB = $_.BaseName # sets fileNameB variable to the filename without extension
$filename2 = "$filepath2" + "$fileNameB" + "$sourceCodec" # assembles filepath for source
$correctTime = (Get-Item $filename2).lastwritetime # used for testing - just shows the correct time in the output, can comment out
$correctTime # prints the correct time
$_.lastwritetime = (Get-Item $filename2).lastwritetime # modifies lastwritetime of filepath1 to match filepath2
$_.creationTime = (Get-Item $filename2).lastwritetime # modifies creation times to match lastwritetime (comment out if you need creation time to be the same)
}
Update:
I think I need to use Shell.Application, but I'm getting an error message "duplicate keys ' ' are not allowed in hash literals" and am not sure how to incorporate it into the original script.
I only need the "date modified" attribute to be the same as "lastwritetime." The other fields were added just for testing. I appreciate your help!
$tags = "people; snow; weather"
$cameraModel = "AG-CX10"
$cameraMaker = "Panasonic"
$mediaCreated = "2/16/1999 5:01 PM"
$com = (New-Object -ComObject Shell.Application).NameSpace('C:\Users\philip\Videos') #Not sure how to specify file type
$com.Items() | ForEach-Object {
New-Object -TypeName PSCustomObject -Property @{
Name = $com.GetDetailsOf($_,0) # lists current extended properties
Tags = $com.GetDetailsOf($_,18)
CameraModel = $com.GetDetailsOf($_,30)
CameraMaker = $com.GetDetailsOf($_,32)
MediaCreated = $com.GetDetailsOf($_,208)
$com.GetDetailsOf($_,18) = $tags # sets extended properties
$com.GetDetailsOf($_,30) = $cameraModel
$com.GetDetailsOf($_,32) = $cameraMaker
$com.GetDetailsOf($_,32) = $mediaCreated
}
}
(Screenshots in the original post: the script example and the file Properties window.)
I think your best option is to drive an external tool/library from PowerShell rather than using the shell (not sure you can actually set values this way, tbh).
It's definitely possible to use FFmpeg to set the "Media created" metadata of a file like this:
ffmpeg -i input.MOV -metadata creation_time=2000-01-01T00:00:00.0000000+00:00 -codec copy output.MOV
This would copy the input.MOV file to a new file, output.MOV, and set the "Media created" metadata on the new output.MOV. Duplicating every file is inefficient - but it does work.
You can script FFmpeg with something like the below. The script currently outputs the FFmpeg commands to the screen; the commented-out Start-Process line can be used to actually execute ffmpeg.
gci | where Extension -eq ".mov" | foreach {
$InputFilename = $_.FullName;
$OutputFilename = "$($InputFilename)-fixed.mov";
Write-Host "Reading $($_.Name). Created: $($_.CreationTime). Modifed: $($_.LastWriteTime)";
$timestamp = Get-Date -Date $_.CreationTime -Format O
Write-Host "ffmpeg -i $InputFilename -metadata creation_time=$timestamp -codec copy $OutputFilename"
# Start-Process -Wait -FilePath C:\ffmpeg\bin\ffmpeg.exe -ArgumentList @("-i $InputFilename -metadata creation_time=$timestamp -codec copy $($OutputFilename)")
}
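One caveat: the new output file will get today's filesystem timestamps again, which is the very problem from the question. A minimal sketch (assuming the Start-Process line has been uncommented and run, so the output file exists) that copies the filesystem timestamps across afterwards:
# Copy the filesystem timestamps from the source clip to the new file.
# Assumes $InputFilename and $OutputFilename from the loop above.
$source = Get-Item $InputFilename
$target = Get-Item $OutputFilename
$target.CreationTime  = $source.CreationTime
$target.LastWriteTime = $source.LastWriteTime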
Related
I am looking for a way to extract the GPS Latitude and Longitude values from a lot of .jpg files.
I know that Exiftool can sort of do it, but as the values I'm after are visible in Windows 10 Explorer for that file (Properties > Details > GPS heading > Latitude)... Can I use PS to directly grab them? I'm assuming it would be quicker that way.
I know how to extract $img.FullName etc., but can't get to Latitude this way.
To access the metadata (different from filesystem metadata) you see in Windows Explorer without external tools, you have to use the Windows Image Acquisition (WIA) Automation Layer. You can do it like this:
# Create an ImageFile object and load an image file
$image = New-Object -ComObject Wia.ImageFile
$image.LoadFile("C:\Absolute\path\to\my.jpg")
# Read your desired metadata
$image.Properties.Item('GpsLatitude').Value
$image.Properties.Item('GpsLongitude').Value
Be aware that WIA has only limited parsing capabilities in comparison to external tools like ExifTool or exiv2. But it will be enough to get the data you need in your case.
You can read more about ImageFile objects and what they are capable of here.
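If you're not sure which property names WIA exposes for a particular file, you can enumerate them; a throwaway sketch, assuming $image has been loaded as above:
# List every metadata property WIA parsed out of the file
foreach ($prop in $image.Properties) {
    "{0} (ID {1})" -f $prop.Name, $prop.PropertyID
}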
Thanks to everyone, particularly @stackprotector. I've learned a lot and managed to work out how to extract my exact values, including the important N,S,E,W.
To help others, here's what I discovered:
# Based on answer from @stackprotector (above)
# Create an ImageFile object and load an image file
$image = New-Object -ComObject Wia.ImageFile
$image.LoadFile("D:\temp\toNAS\temp\PXL_20210830_004210948.jpg")
$image.Properties.Item('GpsLatitudeRef').Value
# S
$image.Properties.Item('GpsLatitude').Value
<#
Value Numerator Denominator
----- --------- -----------
37 37 1
51 51 1
43.92 4392 100
#>
# Can rinse and repeat for GpsLongitude
Now to extract what I want. This is a subset; easy to copy in GpsLongitude for the other one :-)
$image.Properties.Item('GpsLatitude').Value[1].Value
# 37
$image.Properties.Item('GpsLatitude').Value[2].Value
# 51
$image.Properties.Item('GpsLatitude').Value[3].Value
# 43.92
$image.Properties.Item('GpsLatitudeRef').Value
# S
$image.Properties.Item('DateTime').Value
# 2021:08:30 10:42:10
# This is local not UTC
# So final latitude is 37 deg 51 min 43.92 sec S
# Taken on 2021:08:30 10:42:10 local time
The in-camera value says -37.8622, taken 30 Aug 2021 at 10:42:10 local, so I'm happy with that, including the '-' for South.
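For reference, turning those degrees/minutes/seconds into the signed decimal the camera reports is just deg + min/60 + sec/3600, negated for S (or W); a sketch reusing the values read above:
# DMS -> signed decimal degrees
$dms = $image.Properties.Item('GpsLatitude').Value
$decimal = $dms[1].Value + $dms[2].Value / 60 + $dms[3].Value / 3600
if ($image.Properties.Item('GpsLatitudeRef').Value -eq 'S') { $decimal = -$decimal }
$decimal
# -37.8622 (37 + 51/60 + 43.92/3600, negated for South)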
I will try Exiftool @daniel to see if it's easier. I have used it before and it's excellent.
Update (2022-02-13).
I DID end up going with Exiftool. As I became more confident with JSON, that toolset became more obvious to me. Here are the key snippets of the code (below). One little thing: this exiftool option set gives no "feedback" while running; only at the end does it give a summary:
# $photoYear is the root folder with the images
$data = (exiftool -if '$gpsdatetime' -s -s -s -json -ext jpg -filename -FileTypeExtension -Directory -CreateDate -GPSDateTime -GPSLatitude -GPSLongitude -n -r $photoYear ) | ConvertFrom-Json
# ...
[int]$n = 0
foreach ($photo in $data){
Write-Host $n, " " $photo.CreateDate, $photo.GPSLatitude, $photo.GPSLongitude
$n++
}
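If you want the register as a file rather than console output, the same objects can go straight back out to CSV; a sketch (the output file name is made up):
# Write the extracted records out as a CSV register
$data |
    Select-Object FileName, CreateDate, GPSLatitude, GPSLongitude |
    Export-Csv -Path 'photo-gps-register.csv' -NoTypeInformation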
Using @dwids' and @stackprotector's answers, I came up with the following to run through all files in a folder. It's not the tidiest, but it's self-explanatory:
cls
$FolderPath = "c:\MyFolder"
Get-ChildItem $FolderPath -Filter *.* | where { ! $_.PSIsContainer } |
Foreach-Object {
# Get the file name and path, write it out to screen
$FileName = $_.FullName
Write-Host "$FileName"
# Create an ImageFile object and load an image file
$image = New-Object -ComObject Wia.ImageFile
$image.LoadFile($FileName)
# Read your desired metadata; if it doesn't contain any, say NONE
try
{
#Clear variables for Lat and Lon
Clear-Variable Lat*
Clear-Variable Lon*
$LatDEG = $image.Properties.Item('GpsLatitude').Value[1].Value
$LatMIN = $image.Properties.Item('GpsLatitude').Value[2].Value
$LatSEC = $image.Properties.Item('GpsLatitude').Value[3].Value
$LatREF = $image.Properties.Item('GpsLatitudeRef').Value
$LonDEG = $image.Properties.Item('GpsLongitude').Value[1].Value
$LonMIN = $image.Properties.Item('GpsLongitude').Value[2].Value
$LonSEC = $image.Properties.Item('GpsLongitude').Value[3].Value
$LonREF = $image.Properties.Item('GpsLongitudeRef').Value
# Convert them to Degrees Minutes Seconds Ref
$LatSTR = "$LatDEG$([char]176) $LatMIN$([char]39) $LatSEC$([char]34) $LatREF"
$LonSTR = "$LonDEG$([char]176) $LonMIN$([char]39) $LonSEC$([char]34) $LonREF"
# Write the full coordinates out
Write-Host "$LatSTR $LonSTR"
}
catch
{
Write-Host "NONE"
}
}
Obviously you can remove the Write-Host "$FileName" (file path and name) if you just want a list of coordinates. Or you can change it to Write-Host "$_" if you only want the file name.
You can add -Recurse to Get-ChildItem $FolderPath -Filter *.* | so it would be Get-ChildItem $FolderPath -Filter *.* -Recurse | to look at all files in all sub-folders.
@dwids - to show progress, you can use the 'progress' feature of exiftool as documented in the exiftool 'pod': https://exiftool.org/exiftool_pod.html#Advanced-formatting-feature
-progress[:[TITLE]]
Show the progress when processing files. Without a colon, the -progress option adds a progress count in brackets after the name of each processed file, giving the current file number and the total number of files to be processed. Implies the -v0 option, causing the names of processed files to also be printed when writing. When combined with the -if option, the total count includes all files before the condition is applied, but files that fail the condition will not have their names printed.
If followed by a colon (ie. -progress:), the console window title is set according to the specified TITLE string. If no TITLE is given, a default TITLE string of "ExifTool %p%%" is assumed. In the string, %f represents the file name, %p is the progress as a percent, %r is the progress as a ratio, %##b is a progress bar of width "##" (20 characters if "##" is omitted), and %% is a % character. May be combined with the normal -progress option to also show the progress count in console messages. (Note: For this feature to function correctly on Mac/Linux, stderr must go to the console.)
Rather than rework your example, the code below is an extract from working code:
$sourcedir = 'fewphotos'
$exifargs = 'exiftool -json -d %Y%m%dT%H%M%S%z -Model -DateTimeOriginal -ext jpg -progress:%50b -r ' + $sourcedir
$exifdata = Invoke-Expression $exifargs | ConvertFrom-Json
The progress bar will appear in the frame of the window in which exiftool is running. %50b will create a new marker for every 2% of files processed by exiftool.
The example also shows how the exiftool command can include variables, e.g. for the source directory. The example works under both Linux and Windows.
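As an aside, Invoke-Expression works here but can trip over source directories containing spaces or quote characters; a sketch of the same call passing the arguments as an array instead:
# Same exiftool call without Invoke-Expression: splat an argument array
$exifargs = @('-json', '-d', '%Y%m%dT%H%M%S%z', '-Model', '-DateTimeOriginal',
              '-ext', 'jpg', '-progress:%50b', '-r', $sourcedir)
$exifdata = & exiftool @exifargs | ConvertFrom-Json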
Hope this helps.
Should be a comment but I don't have enough rep.
I have 400+ .vcf files in which I would like to replace the "FN:" line (line 4) with the file name. I've looked at multiple solutions, and I can't seem to find something that will achieve what I'm looking for, even though I know there's a way to do this.
This is what I have currently
File Name: LastNamefirstName
BEGIN:VCARD
VERSION:3.0
N:lastName;firstName;;;
FN:firstName lastName
ADR:;;111 Main Rd;Columbia;MO;65202;
TEL;TYPE=mobile:(111) 222-3333
EMAIL;TYPE=work:email@gmail.com
BDAY:20000101
END:VCARD
This is what I would like to achieve
Keep "FN:" and replace the text after it with the file name text.
BEGIN:VCARD
VERSION:3.0
N:lastName;firstName;;;
FN:LastNamefirstName
ADR:;;111 Main Rd;Columbia;MO;65202;
TEL;TYPE=mobile:(111) 222-3333
EMAIL;TYPE=work:email@gmail.com
BDAY:20000101
END:VCARD
This PowerShell script does half of what I want, but I would really like to take the file name and insert it into the replacement line text.
# Set by user to their needs.
$filesToCheck = "C:\path\*.vcf"
$lineToChange = 4
$replacementLineText = "New Text"
# Gather list of files based on the path (and mask) provided by user.
$files = gci $filesToCheck
# Iterate over each file.
foreach ($file in $files) {
# Load the contents of the current file.
$contents = Get-Content $file
# Iterate over each line in the current file.
for ($i = 0; $i -le ($contents.Length - 1); $i++) {
# Are we on the line that the user wants to replace?
if ($i -eq ($lineToChange - 1)) {
# Replace the line with the Replacement Line Text.
$contents[$i] = $replacementLineText
# Save changed content back to file.
Set-Content $file $contents
}
}
}
Any input or guidance would be greatly appreciated!
I would really like to take the file name and input it in the replacementLineText.
To accept the paths of all target files, all you need to do is declare a parameter:
param(
[Parameter(Mandatory = $true)]
[string[]]$Path
)
$lineToChange = 4
# Gather list of files based on the path (and mask) provided by user.
$files = gci -Path $Path
# ... rest of original script
I made a slight modification to the variable names - Path is the idiomatic parameter name for strings describing expandable paths, and parameter names are generally expected to be PascalCase.
The Mandatory flag in the [Parameter()] attribute associated with $Path means that the caller MUST supply a value - otherwise PowerShell will prompt for it:
PS C:\> .\script.ps1
cmdlet script.ps1 at command pipeline position 1
Supply values for the following parameters:
Path:
PS C:\> .\script.ps1 -Path "C:\path\*.vcf" # now it won't prompt
For more information on parameters, see the about_Functions and about_Functions_Advanced_Parameters help topics - although the documentation is about functions, the rules for parameters and their declaration are the same for script files (you can think of a script file as a function that happens to sit on the filesystem instead of in memory).
The gci (or Get-ChildItem) cmdlet returns [FileInfo] objects with all the file's metadata, so to use the file name as the replacement value inside the loop, you simply use $file.Name:
$contents[$i] = "FN:$($file.Name)"
# or using the -f format operator:
$contents[$i] = "FN:{0}" -f $file.Name
Since you already know which index (line number minus 1) you want to modify, you can skip the inner loop and instead do:
param(
[Parameter(Mandatory = $true)]
[string[]]$Path
)
$lineToChange = 4
# Gather list of files based on the path (and mask) provided by user.
$files = Get-ChildItem -Path $Path
# Iterate over each file.
foreach ($file in $files) {
# Load the contents of the current file.
$contents = Get-Content $file
if($contents.Count -ge $lineToChange){
# Replace the line with the Replacement Line Text.
$contents[$lineToChange - 1] = "FN:$($file.Name)"
# Save changed content back to file.
Set-Content $file $contents
}
}
I am new to scripting, and I would like to ask your help with the following:
This script should run as a scheduled task; it works with Veritas NetBackup and creates a backup register in CSV format.
I am generating two source files (.csv comma delimited):
One file contains: JobID, FinishDate, Policy, etc...
The second file contains: JobID, TapeID
It is possible that the second file contains the same JobID multiple times, with different TapeIDs.
I would like the script, for each line in source file 1, to check all of source file 2 for a JobID match; if there is one, it should produce the following output:
JobID,FinishDate,Policy,etc...,TapeID,TapeID....
I have tried the following logic, but sometimes I get no TapeID, or the same TapeID twice:
Contents of sourcefile 1 is in $BackupStatus
Contents of sourcefile 2 is in $TapesUsed
$FinalReport =
foreach ($FinalPart1 in $BackupStatus) {
write-output $FinalPart1
$MediaID =
foreach ($line in $TapesUsed){
write-output $line.split(",")[1] | where-object{$line.split(",")[0] -like $FinalPart1.split(",")[0]}
}
write-output $MediaID
}
If the CSV files are not huge, it is easier to use Import-Csv instead of splitting the files by hand:
$BackupStatus = Import-Csv "Sourcefile1.csv"
$TapesUsed = Import-Csv "Sourcefile2.csv"
This will generate a list of objects for each file. You can then compare these lists quite easily:
Foreach ($Entry in $BackupStatus) {
    $Match = $TapesUsed | Where {$_.JobID -eq $Entry.JobID}
    if ($Match) {
        $Output = New-Object -TypeName PSCustomObject -Property @{"JobID" = $Entry.JobID ; [...] ; "TapeID" = $Match.TapeID} # replace [...] with the properties you want to use
        Export-Csv -InputObject $Output -Path <OUTPUTFILE.CSV> -Append -NoTypeInformation
    }
}
This is a relatively verbose variant, but I prefer it like this.
I check, for each entry in the first file, whether there is a matching entry in the second. If there is one, I combine the required fields from the entry of the first list with the ones from the entry in the second list into one object that I can then export very comfortably using Export-Csv.
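One caveat for the multiple-TapeID case from the question: when several rows of the second file match, $Match.TapeID is an array, and Export-Csv will not flatten it into TapeID,TapeID,... by itself. A sketch of one way to handle that, joining the matches into a single field (TapeIDs is a made-up column name, and the output path stays a placeholder as above):
Foreach ($Entry in $BackupStatus) {
    $Match = @($TapesUsed | Where {$_.JobID -eq $Entry.JobID})
    if ($Match) {
        $Output = New-Object -TypeName PSCustomObject -Property @{
            "JobID"   = $Entry.JobID
            "TapeIDs" = ($Match.TapeID -join ",") # all tapes for this job in one field
        }
        Export-Csv -InputObject $Output -Path <OUTPUTFILE.CSV> -Append -NoTypeInformation
    }
}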
I'm trying to modify the script created by Boe Prox that combines multiple CSV files into one Excel workbook so that it runs on a network share.
When I run it locally, the script executes great and combines multiple .csv files into one Excel workbook.
Clear-Host
$OutputFile = "ePortalMonthlyReport.xlsx"
$ChildDir = "C:\MonthlyReport\*.csv"
cd "C:\MonthlyReport\"
echo "Combining .csv files into Excel workbook"
. C:\PowerShell\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
But when I modify it to run from a network share with the following changes:
Clear-Host
# Variables
$OutputFile = "ePortalMonthlyReport.xlsx"
$NetworkDir = "\\sqltest2\dev_ePortal\Monthly_Report"
$ChildDir = "\\sqltest2\dev_ePortal\Monthly_Report\*.csv"
cd "\\sqltest2\dev_ePortal\Monthly_Report"
echo "Combining .csv files into Excel workbook"
. $NetworkDir\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
I am getting an error where it looks like it is using the network path twice, and I am not sure why:
Combining .csv files into Excel workbook
Converting \\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv
naming worksheet 001_StatsByCounty
--done
opening csv Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv) in excel in temp workbook
Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv. Is it possible it was moved, renamed or deleted?
Anyone have any thoughts on resolving this issue?
Thanks,
Because the script uses the following regex:
[regex]$regex = "^\w\:\\"
which matches a path beginning with a drive letter, e.g. c:\data\file.csv will match and data\file.csv will not. It uses this because (apparently) Excel needs a complete path, so if the file path does not match, the script will add the current directory to the front of it:
#Open the CSV file in Excel, must be converted into complete path if not already done
If ($regex.ismatch($input)) {
$tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
$tempcsv = $excel.Workbooks.Open("$($pwd)\$input")
}
Your file paths will be \\server\share\data\file.csv and it doesn't see a drive letter, so it hits the last option and jams $pwd - an automatic variable of the current working directory - onto the beginning of the file path.
You might get away if you edit his script and change the regex to:
[regex]$regex = "^\w\:\\|^\\\\"
which will match a path beginning with \\ as OK to use without changing it, as well.
Or maybe edit the last option (~ line 111) to say ...Open("$($input.fullname)") as well, like the second option does.
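A quick way to sanity-check the widened pattern against both path styles (a throwaway sketch; the paths are made up):
# Drive-letter and UNC paths should both match the widened regex
[regex]$regex = "^\w\:\\|^\\\\"
$regex.IsMatch('C:\data\file.csv')          # True
$regex.IsMatch('\\sqltest2\share\file.csv') # True
$regex.IsMatch('data\file.csv')             # False - relative path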
Most of the issues are caused by the script calling $pwd rather than $PSScriptRoot. Replace all instances with a quick find and replace.
$pwd looks like:
PS Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$PSScriptRoot looks like:
\\foo\bar
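If you genuinely need the working directory rather than the script's location, $pwd.ProviderPath is the form without the provider qualifier; a quick comparison sketch:
# $pwd is a PathInfo object; .Path keeps the provider prefix on UNC
# shares, while .ProviderPath gives the bare filesystem path
$pwd.Path          # Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$pwd.ProviderPath  # \\foo\bar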
The second part I fixed for myself is what @TessellatingHeckler pointed out. I took a longer approach.
It's not the most efficient way...but to me it is clear.
[regex]$regex = "^\w\:\\"
[regex]$regex2 = "^\\\\"
$test = 0
If ($regex.ismatch($input) -and $test -eq 0 ) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($regex2.ismatch($input) -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex2.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($PSScriptRoot)\$input")
$test = 0 }
I've been working on this Powershell script for a good week now, and it almost works as expected.
Essentially, the script reaches into the specified directory (which another script drops .CSV files into), grabs the .CSV file(s), and pushes the information found into a SharePoint list - well, that's the intention, anyway. I've gotten the script to work perfectly if I manually specify the file; the issue I'm having is actually getting all the .CSV files into a group and then looping through each .CSV to pull the information out and push it into a SharePoint list. Once done, it renames the file from .CSV to .ARCHIVED for another script to come in and relocate after we're done with it.
I think I have, through selective (creative) troubleshooting, figured out what I am doing wrong; I just don't know how to proceed now that I've identified the issue.
I declare the string $Filecsv like so:
$Filecsv = get-childitem "Z:\" -recurse | where {$_.extension -eq ".csv"}
So, this reaches into my 'Z:\' directory, pulls all the files with a .CSV extension, and combines them into a table...
ForEach ($items in $Filecsv) {
And this says for each item, perform logic...
foreach($row in $Filecsv)
The only problem is, when I call $Filecsv, it returns a listing of each .CSV file in the directory, and as such, when I execute the bit of code that says 'put the information into my list', only the file name is added to my SharePoint list.
Now, I can see what's going on here: it's pulling the 'Name' from the $Filecsv table and pushing that up to SharePoint. However, I am not sure how to reconstruct my logic so that it operates as expected, because as it exists now it should (to me, anyway) work as I think it does, but I am still new to SharePoint and am certainly missing something here.
Below, is the full code, if it helps:
# Add SharePoint PowerShell Snapin which adds SharePoint specific cmdlets
Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue
#start the counter at 1 to track times script has looped
$iterations = 1
# set the location where the .CSV files will be pulled from and define the
# file extension we are concerned with
$filecsv = get-childitem "Z:\" -recurse | where {$_.extension -eq ".csv"}
# for each file found in the directory
ForEach ($items in $Filecsv) {
# check to see if files exist, if not exit cleanly
if ($Filecsv) {"File exists" + $Filecsv} else {exit}
# count the times we've looped through
"Iterations : $iterations"
# specify variables needed. The webURL should be the site URL, not including the list
# the listName should be the list name
$WebURL = "http://SHAREPOINTURL/"
$listName = "test"
# Get the SPWeb object and save it to a variable
$web = Get-SPWeb -identity $WebURL
# Get the SPList object to retrieve the list
$list = $web.Lists[$listName]
# START deletes all items. code shows the number of items in a list, then deletes all items
# If you don't want your script to delete items, then remove this
$site = new-object Microsoft.SharePoint.SPSite ( $WebURL )
$web = $site.OpenWeb()
"Web is : " + $web.Title
# Enter name of the List below instead of
$oList = $web.Lists["test"];
"List is :" + $oList.Title
"List Item Count: " + $oList.ItemCount
#delete existing contents and replace with new stuff
$collListItems = $oList.Items;
$count = $collListItems.Count - 1
for($intIndex = $count; $intIndex -gt -1; $intIndex--) {
"Deleting record: " + $intIndex
$collListItems.Delete($intIndex);
}
# END Deletes all items
# goes through the CSV file and performs action for each row
foreach($row in $Filecsv)
{
$newItem = $list.items.Add()
$item = $list.items.add()
# Check if cell value is not null in excel
if ($row."Name" -ne $null)
# Add item to sharepoint list. for this one, I had to use the internal column name.
#You don't always have to, but I had trouble with one SharePoint column, so I did
{$newItem["Name"] = $row."Name"}
else{$newItem["Name"] = $row."Not Provided"}
if ($row."Description" -ne $null)
{$newItem["Description"] = $row."Description"}
else{$newItem["Description"] = $row."No Description"}
if ($row."NetworkID" -ne $null)
{$newItem["Network ID"] = $row."NetworkID"}
else{$newItem["Network ID"] = $row."No NetworkID"}
if ($row."Nested" -ne $null)
{$newItem["Nested"] = $row."Nested"}
else{$newItem["Nested"] = $row."Not Nested"}
# Commit the update, then loop again until end of file
$newItem.Update()
}
# get the date and time from the system
$datetime = get-date -f MMddyy-hhmmtt
# rename the file
$NewName = $items.fullname -replace ".csv$","$datetime.csv.archived"
$Items.MoveTo($NewName)
# +1 the counter to count the number of files we've looped through
$iterations ++
}
A very cursory look would suggest that you need to use $items, not $filecsv, in your main loop.
Essentially you are looping over the contents of the $filecsv collection, so you need to look at $items.
Your ForEach loops look redundant, since they are both looping through a list of FileInfo objects. I think you want to find all the files, and for each file, load it into memory and process its contents. We'll go that route.
I have moved your SharePoint object creation out of the loop, since I don't see any point in creating the object over and over for each file processed; it never references anything based on the file or its contents. It simply makes the same object over, and over, and over.
# Add SharePoint PowerShell Snapin which adds SharePoint specific cmdlets
Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue
#start the counter at 1 to track times script has looped
$iterations = 1
# specify variables needed. The webURL should be the site URL, not including the list
# the listName should be the list name
#Setup SP object
$WebURL = "http://SHAREPOINTURL/"
$listName = "test"
# Get the SPWeb object and save it to a variable
$web = Get-SPWeb -identity $WebURL
# Get the SPList object to retrieve the list
$list = $web.Lists[$listName]
# START deletes all items. code shows the number of items in a list, then deletes all items
# If you don't want your script to delete items, then remove this
$site = new-object Microsoft.SharePoint.SPSite ( $WebURL )
$web = $site.OpenWeb()
"Web is : " + $web.Title
# Enter name of the List below instead of
$oList = $web.Lists["test"];
"List is : " + $oList.Title
"List Item Count: " + $oList.ItemCount
#delete existing contents and replace with new stuff
$collListItems = $oList.Items;
$count = $collListItems.Count - 1
for($intIndex = $count; $intIndex -gt -1; $intIndex--) {
"Deleting record: " + $intIndex
$collListItems.Delete($intIndex);
}
# END Deletes all items
Find all the CSV files, and start looping through the list of them. I removed the check to see if the file exists; you just pulled a directory listing to find these files, so they really should exist.
# set the location where the .CSV files will be pulled from and define the
# file extension we are concerned with
$CSVList = get-childitem "Z:\" -recurse | where {$_.extension -eq ".csv"}
ForEach ($CSVFile in $CSVList) {
# count the times we've looped through
"Iterations : $iterations"
Now, this is different. It loads the CSV file and processes each row in it as $row. I'm pretty sure this is what you intended to do from the start. I also changed the checks from if (Something -ne $null) to check for either null or empty, since either can actually exist and the latter can cause you some issues. It's just a safer method in general.
foreach($row in (Import-CSV $CSVFile.FullName))
{
$newItem = $list.items.Add()
$item = $list.items.add()
# Check if cell value is not null in excel
if (![string]::IsNullOrEmpty($row."Name"))
# Add item to sharepoint list. for this one, I had to use the internal column name.
#You don't always have to, but I had trouble with one SharePoint column, so I did
{$newItem["Name"] = $row."Name"}
else{$newItem["Name"] = $row."Not Provided"}
if (![string]::IsNullOrEmpty($row."Description"))
{$newItem["Description"] = $row."Description"}
else{$newItem["Description"] = $row."No Description"}
if (![string]::IsNullOrEmpty($row."NetworkID"))
{$newItem["Network ID"] = $row."NetworkID"}
else{$newItem["Network ID"] = $row."No NetworkID"}
if (![string]::IsNullOrEmpty($row."Nested"))
{$newItem["Nested"] = $row."Nested"}
else{$newItem["Nested"] = $row."Not Nested"}
# Commit the update, then loop again until end of file
$newItem.Update()
}
I don't really understand why you are adding a new item twice, but if it works then more power to you. Then your bit to rename files when you're done with them (hey, this looks familiar):
# get the date and time from the system
$datetime = get-date -f MMddyy-hhmmtt
# rename the file
$NewName = $CSVFile.fullname -replace ".csv$","$datetime.csv.archived"
$CSVFile.MoveTo($NewName)
# +1 the counter to count the number of files we've looped through
$iterations ++
}
I did rename a few things to make them more indicative of what they represent ($Items to $CSVFile and what not). See if this works for you. If you have questions or concerns let me know.
Edit: OK, to fix the loop trying to pull each item from the current folder, we reference its FullName property. One line changed:
foreach($row in (Import-CSV $CSVFile.FullName))