Writing new tags to an XMP group with a configured ExifTool doesn't work - exiftool

I need to add new tags to many images. The tags are:
- AboveGroundAltitude
- BandName
- CentralWaveLength
- ColorTransform
- PerspectiveDistortion
- PerspectiveFocalLength
- PrincipalPoint
- WavelengthFWHM
I created this configuration file:
%Image::ExifTool::UserDefined = (
    'Image::ExifTool::XMP::xmp' => {
        NewXMPxmpTag => { Groups => { 1 => 'AboveGroundAltitude' } },
        NewXMPxmpTag => { Groups => { 1 => 'BandName' } },
        NewXMPxmpTag => { Groups => { 1 => 'CentralWaveLength' } },
        NewXMPxmpTag => { Groups => { 1 => 'ColorTransform' } },
        NewXMPxmpTag => { Groups => { 1 => 'PerspectiveDistortion' } },
        NewXMPxmpTag => { Groups => { 1 => 'PerspectiveFocalLength' } },
        NewXMPxmpTag => { Groups => { 1 => 'PrincipalPoint' } },
        NewXMPxmpTag => { Groups => { 1 => 'WavelengthFWHM' } },
    },
);
Variations: I tried group 0 the first time, then read somewhere that XMP tags belong to group 1 and edited accordingly.
And I'm running the command like this:
exiftool -config config.txt -ext jpg \
  -AboveGroundAltitude='55.8224668413325' \
  -BandName='Red, Garbage, NIR' \
  -CentralWaveLength='625, 0, 850' \
  -ColorTransform='1.000, 0.000, -0.996, 0.000, 0.000, 0.000, -0.286, 0.000, 4.350' \
  -PerspectiveDistortion='-0.093, 0.122, 0.000, 0.000, 0.000' \
  -PerspectiveFocalLength='5.4' \
  -PrincipalPoint='3.100, 2.325' \
  -WavelengthFWHM='100, 0, 40' test.jpg
Variations tried:
- -xmp:AboveGroundAltitude='55.8224668413325'
- -XMP-AboveGroundAltitude='55.8224668413325'
- -XMP-xmp:AboveGroundAltitude='55.8224668413325'
- all three of the above with `+=` between the tag and the value
Also note the backslashes were added here for clarity; my original command is a one-liner with no newlines or backslashes.
The errors I'm getting are below. (I'm using a mix of the options tried, to illustrate the different error messages; in each actual attempt the option style was kept consistent.) I also used -v4 for more verbose logging:
exiftool -config config.txt -v4 -ext jpg -XMP-AboveGroundAltitude='55.8224668413325' -xmp:BandName='Red, Garbage, NIR' -XMP-xmp:CentralWaveLength='625, 0, 850' -xmp:ColorTransform='1.000, 0.000, -0.996, 0.000, 0.000, 0.000, -0.286, 0.000, 4.350' -PerspectiveDistortion='-0.093, 0.122, 0.000, 0.000, 0.000' -xmp:PerspectiveFocalLength='5.4' -xmp:PrincipalPoint='3.100, 2.325' -xmp:WavelengthFWHM='100, 0, 40' test.jpg
Tag 'XMP-AboveGroundAltitude' is not defined or has a bad language code
Warning: Tag 'XMP-AboveGroundAltitude' is not defined or has a bad language code
Tag 'xmp:BandName' is not defined
Warning: Tag 'xmp:BandName' is not defined
Tag 'XMP-xmp:CentralWaveLength' is not defined
Warning: Tag 'XMP-xmp:CentralWaveLength' is not defined
Sorry, xmp:ColorTransform doesn't exist or isn't writable
Warning: Sorry, xmp:ColorTransform doesn't exist or isn't writable
Tag 'PerspectiveDistortion' is not defined
Warning: Tag 'PerspectiveDistortion' is not defined
Tag 'xmp:PerspectiveFocalLength' is not defined
Warning: Tag 'xmp:PerspectiveFocalLength' is not defined
Tag 'xmp:PrincipalPoint' is not defined
Warning: Tag 'xmp:PrincipalPoint' is not defined
Tag 'xmp:WavelengthFWHM' is not defined
Warning: Tag 'xmp:WavelengthFWHM' is not defined
Nothing to do.
Notice how the message for ColorTransform is different.
Note: I've already seen other related questions here and in the ExifTool forum.

I found a configuration file made by the camera maker, which I'll paste below:
#------------------------------------------------------------------------------
# File: xmp_camera_tags.config
#
# Description: Adds capability to modify all XMP camera tags
#
#------------------------------------------------------------------------------
%Image::ExifTool::UserDefined = (
    'Image::ExifTool::XMP::Main' => {
        Camera => {
            SubDirectory => {
                TagTable => 'Image::ExifTool::UserDefined::Camera',
            },
        },
    },
);
%Image::ExifTool::UserDefined::Camera = (
    GROUPS    => { 0 => 'XMP', 1 => 'XMP-Camera', 2 => 'Other' },
    NAMESPACE => { 'Camera' => 'http://pix4d.com/Camera/1.0/' },
    WRITABLE  => 'string',
    GPSXYAccuracy => {},
    GPSZAccuracy  => {},
    Pitch => {},
    Roll  => {},
    Yaw   => {},
    BandName          => { List => 'Seq' },
    CentralWavelength => { List => 'Seq' },
    WavelengthFWHM    => { List => 'Seq' },
    BandSensitivity   => { List => 'Seq' },
    ColorTransform    => { List => 'Seq' },
);
%Image::ExifTool::UserDefined::camera = (
    GROUPS    => { 0 => 'XMP', 1 => 'XMP-Camera', 2 => 'Other' },
    NAMESPACE => { 'Camera' => 'http://pix4d.com/Camera/1.0/' },
);
#------------------------------------------------------------------------------
and used that as a template to add the missing tags I needed.
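For reference, a sketch of what the extended tag table might look like, with my missing tags added to the maker's table under the same Camera namespace. The extra tag definitions are my own additions (whether each one should be a plain string or a `List => 'Seq'` is an assumption on my part), so treat this as an illustration rather than the maker's official config:

```perl
%Image::ExifTool::UserDefined = (
    'Image::ExifTool::XMP::Main' => {
        Camera => {
            SubDirectory => {
                TagTable => 'Image::ExifTool::UserDefined::Camera',
            },
        },
    },
);
%Image::ExifTool::UserDefined::Camera = (
    GROUPS    => { 0 => 'XMP', 1 => 'XMP-Camera', 2 => 'Other' },
    NAMESPACE => { 'Camera' => 'http://pix4d.com/Camera/1.0/' },
    WRITABLE  => 'string',
    # Tags already present in the maker's config:
    BandName          => { List => 'Seq' },
    CentralWavelength => { List => 'Seq' },
    WavelengthFWHM    => { List => 'Seq' },
    ColorTransform    => { List => 'Seq' },
    # My additions for the tags missing from the maker's config
    # (List vs. plain string is assumed, not confirmed by the maker):
    AboveGroundAltitude    => {},
    PerspectiveDistortion  => { List => 'Seq' },
    PerspectiveFocalLength => {},
    PrincipalPoint         => {},
);
1;  # end
```

With a config like this, the tags are written into the XMP-Camera group, e.g. `exiftool -config camera.config -XMP-Camera:AboveGroundAltitude='55.8224668413325' test.jpg`.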
The relevant model info for the Sentera camera is:
LensModel : 5.4mm-0001_0015
Model : 21021-03_12MP-ERS-0001
The original problem was that my employee took a first batch (nearly 8,000) of aerial photos with an old firmware. Later he updated the firmware and did the rest of the work, so when we tried to process the images, the ones taken first produced errors and the processing software refused to work.
Thanks anyway @StartGeek for your comments :)
Clearly my attempt was far from what was needed, but the urgency didn't leave me time to investigate the proper way to configure ExifTool.

Related

Elasticsearch searching with perl client

I'm attempting to do something that should be simple, but I cannot get it to work. I've looked and searched all over to find detailed documentation for the Perl Search::Elasticsearch module, but I can only find the CPAN docs, and as far as searching is concerned it is barely mentioned. I've searched here and cannot find a duplicate question.
I have elasticsearch and filebeat. Filebeat is sending syslog to elasticsearch. I just want to search for messages with matching text and date range. I can find the messages but when I try to add date range the query fails. Here is the query from kibana dev tools.
GET _search
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "message": "metrics" }},
        { "range": { "timestamp": { "gte": "now-15m" }}}
      ]
    }
  }
}
I don't get exactly what I'm looking for but there isn't an error.
Here is my attempt with perl
my $results = $e->search(
    body => {
        query => {
            bool => {
                filter => {
                    term  => { message => 'metrics' },
                    range => { timestamp => { 'gte' => 'now-15m' } }
                }
            }
        }
    }
);
This is the error.
[Request] ** [http://x.x.x.x:9200]-[400]
[parsing_exception]
[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME],
with: {"col":69,"line":1}, called from sub Search::Elasticsearch::Role::Client::Direct::__ANON__
at ./elasticsearchTest.pl line 15.
With vars: {'body' => {'status' => 400,'error' => {
'root_cause' => [{'col' => 69,'reason' => '[range]
malformed query, expected [END_OBJECT] but found [FIELD_NAME]',
'type' => 'parsing_exception','line' => 1}],'col' => 69,
'reason' => '[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME]',
'type' => 'parsing_exception','line' => 1}},'request' => {'serialize' => 'std',
'path' => '/_search','ignore' => [],'mime_type' => 'application/json',
'body' => {
'query' => {
'bool' =>
{'filter' => {'range' => {'timestamp' => {'gte' => 'now-15m'}},
'term' => {'message' => 'metrics'}}}}},
'qs' => {},'method' => 'GET'},'status_code' => 400}
Can someone help me figure out how to search with the search::elasticsearch perl module?
Multiple filter clauses must be passed as separate JSON objects within an array (as in your initial JSON query), not as multiple keys in the same JSON object. This maps directly to the Perl data structure you must build:
filter => [
{term => { message => 'metrics' }},
{range => { timestamp => { 'gte' => 'now-15m' }}}
]
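Putting it together, a minimal sketch of the corrected request body (the surrounding `$e->search(...)` call is from the question; the JSON round-trip below is only a local sanity check that the filter serializes as an array):

```perl
use strict;
use warnings;
use JSON::PP;  # core module, used here only to inspect the serialization

# Build the request body with filter as an ARRAY reference,
# mirroring the working Kibana query.
my $body = {
    query => {
        bool => {
            filter => [
                { term  => { message   => 'metrics' } },
                { range => { timestamp => { gte => 'now-15m' } } },
            ],
        },
    },
};

# The filter now serializes as a JSON array of clause objects.
print JSON::PP->new->canonical->encode($body), "\n";
```

You would then pass this structure unchanged: `my $results = $e->search(body => $body);`.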

Puppet : get mountpoint using facter filter

I am trying to get the root partition's (mountpoint => "/") device name using Puppet's facter. When I run "facter mountpoints", it shows multiple partitions. I would like to get the value "/dev/md3" from the result.
{
/ => {
available => "893.71 GiB",
available_bytes => 959608590336,
capacity => "1.86%",
device => "/dev/md3",
filesystem => "ext4",
options => [
"rw",
"errors=remount-ro"
],
size => "910.69 GiB",
size_bytes => 977843884032,
used => "16.98 GiB",
used_bytes => 18235293696
},
/run => {
available => "794.91 MiB",
available_bytes => 833527808,
capacity => "0.07%",
device => "tmpfs",
filesystem => "tmpfs",
options => [
"rw",
"noexec",
"nosuid",
"size=10%",
"mode=0755"
],
size => "795.48 MiB",
size_bytes => 834125824,
used => "584.00 KiB",
used_bytes => 598016
},
/tmp => {
available => "1.78 GiB",
available_bytes => 1909157888,
capacity => "1.21%",
device => "/dev/md1",
filesystem => "ext4",
options => [
"rw"
],
size => "1.80 GiB",
size_bytes => 1932533760,
used => "22.29 MiB",
used_bytes => 23375872
}
}
I tried to use filter, but I was not able to filter for the "/" device:
$root_mount = $facts['mountpoints'].filter |$mountpoint| { $mountpoint == '/' }
Do you have any ideas?
You can access this fact directly via hash notation. Since your question heavily implies you are using Facter 3/Puppet 4, I will work with that syntax.
You just directly traverse the keys in the Facter hash to arrive at the /dev/md3 value. If we minimize the hash to the relevant portion from facter mountpoints:
{
/ => {
device => "/dev/md3"
}
}
then we see that the keys are mountpoints (you accessed that key directly when you did facter mountpoints from the CLI), /, and device. Therefore, using standard hash notation in Puppet with the $facts hash, we can access that value with:
$facts['mountpoints']['/']['device'] # /dev/md3
Check here for more info: https://docs.puppet.com/puppet/4.9/lang_facts_and_builtin_vars.html#the-factsfactname-hash

Reading specific values in hash of hashes - Perl

I have a hash of hashes, as below. Please note that the hash is very large and contains PluginsStatus values of Success or Error. When PluginsStatus for a key is Success I need not process anything (I have handled this scenario), but if it's Error I need to display, in order: PluginsStatus, PluginspatchLogName, PluginsLogFileName_0, PluginsLogFileLink_0, PluginsLogFileErrors_0, and so on.
Please note, I do not know exactly how many keys (in the hash of hashes), i.e. PluginsLogFileName, PluginsLogFileLink, PluginsLogFileErrors, exist; it is dynamic.
$VAR1 = { 'Applying Template Changes' => {
'PluginsLogFileErrors_2' => 'No Errors',
'PluginsStatus' => 'Error',
'PluginsLogFileName_1' => 'Applying_Template_Changes_2015-05-12_02-57-40AM.log',
'PluginsLogFileName_2' => 'ApplyingTemplates.log',
'PluginsLogFileErrors_1' => 'ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.',
'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
'PluginsLogFileName_0' => '2015-05-11_08-14-28PM.log',
'PluginsLogFileErrors_0' => 'No Errors',
'PluginsLogFileLink_2' => 'configlogs/ApplyingTemplates.log',
'PluginsLogFileLink_1' => 'configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log'
},
'Configuring Keystore Service' => {
'PluginsStatus' => 'Error',
'PluginsLogFileName_1' => 'Configuring_Keystore_Service_2015-05-11_11-11-37PM.log',
'PluginsLogFileErrors_1' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.',
'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
'PluginsLogFileName_0' => '2015-05-11_08-14-28PM.log',
'PluginsLogFileErrors_0' => 'No Errors',
'PluginsLogFileLink_1' => 'configlogs/Configuring_Keystore_Service_2015-05-11_11-11-37PM.log'
},
'Applying Main Configuration' => {
'PluginsStatus' => 'Error',
'PluginspatchLogName' => '2015-05-11_08-14-28PM.log',
'PluginsLogFileName_0' => 'Applying_Main_Configuration_2015-05-12_01-11-21AM.log',
'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
'PluginsLogFileErrors_0' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more'
}
};
Below is an output snippet I am looking for:
Plugin name is = Applying Template Changes
PluginsStatus = Error
PluginspatchLogName = 2015-05-11_08-14-28PM.log
PluginsLogFileName_0 = 2015-05-11_08-14-28PM.log
PluginsLogFileLink_0 = /tmp/xpath/2015-05-11_08-14-28PM.log
PluginsLogFileErrors_0 = No Errors
PluginsLogFileName_1 = Applying_Template_Changes_2015-05-12_02-57-40AM.log
PluginsLogFileLink_1 = configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log
PluginsLogFileErrors_1 = ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.
PluginsLogFileName_2 = ApplyingTemplates.log
PluginsLogFileLink_2 = configlogs/ApplyingTemplates.log
PluginsLogFileErrors_2 = No Errors
Could someone help me here, please?
You have built a hash that is less than ideal for your purposes. You should create a LogFile hash element that has an array as its value; after that, the processing is trivial:
{
"Applying Main Configuration" => {
LogFile => [
{
Errors => "ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more",
Link => "/tmp/xpath/2015-05-11_08-14-28PM.log",
Name => "Applying_Main_Configuration_2015-05-12_01-11-21AM.log",
},
],
patchLogName => "2015-05-11_08-14-28PM.log",
Status => "Error",
},
"Applying Template Changes" => {
LogFile => [
{
Errors => "No Errors",
Link => "/tmp/xpath/2015-05-11_08-14-28PM.log",
Name => "2015-05-11_08-14-28PM.log",
},
{
Errors => "ERROR: FAPSDKEX-00024 : Error in undeploying template.Cause : Unknown.Action : refer to log file for more details.",
Link => "configlogs/Applying_Template_Changes_2015-05-12_02-57-40AM.log",
Name => "Applying_Template_Changes_2015-05-12_02-57-40AM.log",
},
{
Errors => "No Errors",
Link => "configlogs/ApplyingTemplates.log",
Name => "ApplyingTemplates.log",
},
],
patchLogName => "2015-05-11_08-14-28PM.log",
Status => "Error",
},
"Configuring Keystore Service" => {
LogFile => [
{
Errors => "No Errors",
Link => "/tmp/xpath/2015-05-11_08-14-28PM.log",
Name => "2015-05-11_08-14-28PM.log",
},
{
Errors => "ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.",
Link => "configlogs/Configuring_Keystore_Service_2015-05-11_11-11-37PM.log",
Name => "Configuring_Keystore_Service_2015-05-11_11-11-37PM.log",
},
],
patchLogName => "2015-05-11_08-14-28PM.log",
Status => "Error",
},
}
Just iterate over the keys of the hash. Use the $hash{key}{inner_key} syntax to get into the nested hash.
#!/usr/bin/perl
use warnings;
use strict;
use feature qw{ say };
my %error = ( 'Applying Template Changes' => {
'PluginsLogFileErrors_2' => 'No Errors',
'PluginsStatus' => 'Error',
'PluginsLogFileName_1' => 'Applying_Template_Changes_2015-05-12_02-57-40AM.log',
# ...
'PluginsLogFileLink_0' => '/tmp/xpath/2015-05-11_08-14-28PM.log',
'PluginsLogFileErrors_0' => 'ERROR: Failed to query taxonomy attribute AllProductFamilyAndDomains.apps.ad.common.exception.ADException: Failed to query taxonomy attribute AllProductFamilyAndDomains.... 104 lines more',
},
);
for my $step (keys %error) {
print "Plugin name is = $step\n";
for my $detail (sort keys %{ $error{$step} }) {
print "$detail = $error{$step}{$detail}\n";
}
}

How can I embed metadata into a custom XMP field with exiftool?

Can someone please explain how to embed metadata into a custom metadata field in an MP4 file with exiftool? I've searched all the docs, and it seems to be related to the config file that needs to be created. Here is what I'm working with. (I know this isn't even close, as it's not using XMP fields, but I haven't found a single working example with XMP fields yet.)
%Image::ExifTool::UserDefined = (
    'Image::ExifTool::Exif::Main' => {
        0xd001 => {
            Name => 'Show',
            Writable => 'string',
            WriteGroup => 'IFD0',
        },
    },
);
1; # end
The command I'm trying to run is:
exiftool -config exifToolConfig -show="Lightning" /reachengine/media/mezzanines/2015/02/13/13/CanyonFlight.mp4
Running this in a linux environment.
What is the proper way to set XMP metadata in custom metadata fields via ExifTool on Linux for MP4 files?
The sample exiftool config file contains a number of working examples of custom XMP tags.
Basically, it is done like this:
%Image::ExifTool::UserDefined = (
    'Image::ExifTool::XMP::Main' => {
        xxx => {
            SubDirectory => {
                TagTable => 'Image::ExifTool::UserDefined::xxx',
            },
        },
    },
);
%Image::ExifTool::UserDefined::xxx = (
    GROUPS    => { 0 => 'XMP', 1 => 'XMP-xxx', 2 => 'Other' },
    NAMESPACE  => { 'xxx' => 'http://ns.myname.com/xxx/1.0/' },
    WRITABLE  => 'string',
    MyNewXMPTag => { },
);
Then the command is
exiftool -config myconfig -mynewxmptag="some value" myfile.mp4

Reading in a CSV File in Perl

I have read files in Perl before, but not when the CSV file has the values I require on different lines. I assume I have to create an array mixed with hash keys but I'm out of my league here.
Basically, my CSV file has the following columns: branch, job, timePeriod, periodType, day1Value, day2Value, day3Value, day4Value, day5Value, day6Value, and day7Value.
The day* values represent the value of a periodType for each day of the week respectively.
For Example -
East,Banker,9AM-12PM,Overtime,4.25,0,0,1.25,1.5,1.5,0,0
West,Electrician,12PM-5PM,Regular,4.25,0,0,-1.25,-1.5,-1.5,0,0
North,Janitor,5PM-12AM,Variance,-4.25,0,0,-1.25,-1.5,-1.5,0,0
South,Manager,12A-9AM,Overtime,77.75,14.75,10,10,10,10,10,
Etc.
I need to output a file that takes this data and keys off of branch, job, timePeriod, and day. My output would list each periodType value for one particular day rather than one periodType value for all seven.
For example -
South,Manager,12A-9AM,77.75,14.75,16
In the line above, the last three values represent the day1Values of the three periodTypes (Overtime, Regular, and Variance).
As you can see, my problem is I don't know how to load into memory the data in a manner which allows me to pull the data from different lines and output it successfully. I've only parsed off of singular lines before.
Unless you like pain, use Text::CSV and its relatives Text::CSV_XS and Text::CSV_PP.
However, that may be the easier part of this problem. Once you've read and validated that the line is complete, you need to add the relevant information to the correctly keyed hashes. You're probably going to have to get rather intimately familiar with references, too.
You might create a hash %BranchData keyed by the branch. Each element of that hash would be a reference to a hash keyed by job; each element in that would be a reference to a hash keyed by timePeriod, and each element in that would be a reference to an array keyed by day number (using indexes 1..7; it over-allocates space slightly, but the chances of getting it right are vastly greater; do not mess with $[ though!). And each element of the array would be a reference to a hash keyed by the three period types. Ouch!
If everything is working well, a prototypical assignment might be something like:
$BranchData{$row{branch}}->{$row{job}}->{$row{period}}->[1]->{$row{p_type}} +=
$row{day1};
You would be iterating over elements 1..7 and 'day1' .. 'day7'; there's a bit of clean-up of the design to do there.
You have to worry about initializing stuff correctly (or maybe you don't - Perl will do it for you). I'm assuming that the row is returned as a direct hash (rather than a hash reference), with keys for branch, job, period, period type (p_type), and each day ('day1', .. 'day7').
If you know which day you need in advance, you can avoid accumulating all days, but it may make more generalized reporting simpler to read and accumulate all the data all the time, and then simply have the printing deal with whatever subset of the entire data needs to be processed.
It was an intriguing enough problem that I've hacked together this code. I doubt it is optimal, but it does work.
#!/usr/bin/env perl
#
# SO 8570488
use strict;
use warnings;
use Text::CSV;
use Data::Dumper;
use constant debug => 0;

my $file = "input.csv";
my $csv = Text::CSV->new({ binary => 1, eol => $/ })
    or die "Cannot use CSV: " . Text::CSV->error_diag();
my @headings = qw( branch job period p_type day1 day2 day3 day4 day5 day6 day7 );
my @days     = qw( day0 day1 day2 day3 day4 day5 day6 day7 );
my %BranchData;

open my $in, '<', $file or die "Unable to open $file for reading ($!)";
$csv->column_names(@headings);
while (my $row = $csv->getline_hr($in))
{
    print Dumper($row) if debug;
    my %r = %$row;    # Not for efficiency; for notational compactness
    $BranchData{$r{branch}} = { } if !defined $BranchData{$r{branch}};
    my $branch = $BranchData{$r{branch}};
    $branch->{$r{job}} = { } if !defined $branch->{$r{job}};
    my $job = $branch->{$r{job}};
    $job->{$r{period}} = [ ] if !defined $job->{$r{period}};
    my $period = $job->{$r{period}};
    for my $day (1..7)
    {
        # Assume that Overtime, Regular and Variance are the only types.
        # Otherwise, you need yet another level of checking whether elements exist...
        $period->[$day] = { Overtime => 0, Regular => 0, Variance => 0 } if !defined $period->[$day];
        $period->[$day]->{$r{p_type}} += $r{$days[$day]};
    }
}
print Dumper(\%BranchData);
print Dumper(\%BranchData);
Given your sample data, the output from this is:
$VAR1 = {
'West' => {
'Electrician' => {
'12PM-5PM' => [
undef,
{
'Regular' => '4.25',
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => '-1.25',
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => '-1.5',
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => '-1.5',
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
}
]
}
},
'South' => {
'Manager' => {
'12A-9AM' => [
undef,
{
'Regular' => 0,
'Overtime' => '77.75',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => '14.75',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 10,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 10,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 10,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 10,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 10,
'Variance' => 0
}
]
}
},
'North' => {
'Janitor' => {
'5PM-12AM' => [
undef,
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => '-4.25'
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => '-1.25'
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => '-1.5'
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => '-1.5'
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
}
]
}
},
'East' => {
'Banker' => {
'9AM-12PM' => [
undef,
{
'Regular' => 0,
'Overtime' => '4.25',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => '1.25',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => '1.5',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => '1.5',
'Variance' => 0
},
{
'Regular' => 0,
'Overtime' => 0,
'Variance' => 0
}
]
}
}
};
Have fun taking it from here!
I don't have firsthand experience with it, but you can use DBD::CSV and then pass the relatively simple SQL query needed to compute the aggregation that you want.
If you insist on doing it the hard way, though, you can loop through and gather your data in the following hash of hash references:
(
    "branch1,job1,timeperiod1" => {
        "overtime" => "overtimeday1value1",
        "regular"  => "regulartimeday1value1",
        "variance" => "variancetimeday1value1",
    },
    "branch2,job2,timeperiod2" => {
        "overtime" => "overtimeday1value2",
        "regular"  => "regulartimeday1value2",
        "variance" => "variancetimeday1value2",
    },
    # etc
);
and then just loop through the keys accordingly. This approach does, however, rely on consistent formatting of the keys (e.g. "East,Banker,9AM-12PM" is not the same as "East, Banker, 9AM-12PM"), so you'd have to check for consistent formatting (and enforce it) while building the hash above.
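The final loop can be sketched as follows. This is a minimal illustration with the hash pre-populated by hand (the key format and the three periodType keys follow the sketch above; the sample numbers are from the question's East and South rows):

```perl
use strict;
use warnings;

# Day-1 values accumulated per "branch,job,timePeriod" key,
# one entry per periodType, as in the hash sketch above.
my %day1 = (
    "East,Banker,9AM-12PM" => {
        overtime => 4.25, regular => 0, variance => 0,
    },
    "South,Manager,12A-9AM" => {
        overtime => 77.75, regular => 0, variance => 0,
    },
);

# Emit one CSV line per key, with the three periodType values
# appended in a fixed order.
for my $key (sort keys %day1) {
    my $v = $day1{$key};
    print join(',', $key, @{$v}{qw(overtime regular variance)}), "\n";
    # e.g. East,Banker,9AM-12PM,4.25,0,0
}
```

The hash slice `@{$v}{qw(...)}` keeps the periodType order fixed regardless of hash-key ordering, which matters for reproducible output.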