IncludeWholeFilesOnly property with a value of 1 (one) in the Patch Creation Properties (PCP) file delivers the wrong timestamp for DLLs - deployment

I am creating a patch for a product. I don't want the patch to access the details of the original files during patch installation, so in the Patch Creation Properties file I changed the value of IncludeWholeFilesOnly to 1.
However, with IncludeWholeFilesOnly set to 1 in the PCP file, the unversioned DLLs delivered in that patch get the wrong timestamp.
The problem is that instead of the DLL's modification timestamp, they show the patch creation timestamp.
If I change the value of IncludeWholeFilesOnly to 0, everything is correct.
How can I fix this issue? Is there another property I can modify so that the timestamp stays the same?

Timestamps are irrelevant to Windows Installer (even if they matter to you) because they are not used anywhere to decide whether a file is the latest, so the system doesn't preserve them. When the installer installs a non-versioned file it sets the create date and modify date to be identical; any later modification then changes the modify date, Windows assumes the file has been updated by the user, and a patch won't overwrite it.
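The create-date/modify-date trick described above can be sketched as a small check. This is an illustration of the rule, not a real Windows Installer API; note that `st_ctime` is the creation time only on Windows (on other platforms it is the inode-change time, so the check is meaningful only there):

```python
# Sketch of the unversioned-file rule described above: if the create
# date equals the modify date, the file is assumed untouched since
# installation and is safe for a patch to overwrite.
# NOTE: st_ctime means "creation time" only on Windows.

import os

def unmodified_since_install(path):
    st = os.stat(path)
    # Compare at whole-second granularity, as timestamps on disk may
    # carry different sub-second precision.
    return int(st.st_ctime) == int(st.st_mtime)
```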
Versioned binaries are replaced or not depending on the version rules; data files are replaced or not depending on file hashes:
http://msdn.microsoft.com/en-us/library/aa368599(v=vs.85).aspx
and, for example, for your non-versioned files:
http://msdn.microsoft.com/en-us/library/aa370531(v=vs.85).aspx
So this is the way it works: dates aren't used to decide which file is the latest. The best way to manage binary file versions is to use file versions.
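The decision logic from the linked pages can be sketched roughly as follows. This is a simplified illustration of the rules, not the installer's actual implementation (the function name and version-tuple format are made up for the example; the real MsiFileHash mechanism is MD5-based, which is mirrored here):

```python
# Simplified sketch of Windows Installer's default file-replacement
# logic: versioned binaries compare versions, unversioned data files
# compare hashes. Dates play no role in either branch.

import hashlib

def should_replace(installed_version, new_version,
                   installed_path=None, new_hash=None):
    """Decide whether a file would be replaced during patching."""
    if installed_version and new_version:
        # Versioned binaries: compare version tuples, not dates.
        return new_version > installed_version
    if installed_path and new_hash:
        # Unversioned data files: compare file hashes, not dates.
        with open(installed_path, "rb") as f:
            installed_hash = hashlib.md5(f.read()).hexdigest()
        return installed_hash != new_hash
    return False

# Versioned: 1.2.0.0 replaces 1.1.0.0 regardless of any timestamp.
print(should_replace((1, 1, 0, 0), (1, 2, 0, 0)))
```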
Is there an actual problem, or do you just not like the way the dates change?

Related

In which file is the _AppInfo data stored in Beckhoff TwinCAT 3 PLC

I'm looking for the 'AppTimeStamp' information so it can be used to verify that the code has not been updated/changed by service personnel.
Detect code changes on Beckhoff PLC using C#
In that question I already found part of the information I need, but I was not able to add a comment due to the 'new user' limitations.
You can find the AppTimestamp in the _AppInfo instance.
So just call _AppInfo.AppTimestamp in your program to know the time of the last application start.
Make sure you also check the number of online changes since last download with the OnlineChangeCnt counter which you will also find in the _AppInfo instance.
There are several places where this value might be saved. TwinCAT saves data to the C:\TwinCAT\3.1\Boot folder; the different files are explained here.
The ProjectName can be found, for example, in the configuration data (CurrentConfig.xml), at the end of the file (TcBootProject/ProjectInfo/ProjectName). The same file contains one date (<TcBootProject CreateTime="2019-06-10T13:14:17">), but it seems to be the build time of the boot project.
I couldn't find the date of AppTimestamp in any file, but perhaps TwinCAT uses the creation time of the files in those folders? Or perhaps it's hidden somewhere in the binary.
When you update the software without updating the boot project, the file Port_851_act.tizip is updated. So you can check its timestamp. When you update the boot project too, Port_851_boot.tizip and other files are also updated.
So basically, to check whether the code has been updated by someone, check the modified dates of the files under the Boot directory. Only the .bootdata files should update on their own, as they contain saved persistent data. Of course, the dates can easily be changed with a third-party program, so a more robust solution is to compare the contents of the Port_851.crc file, since it contains the CRC check value of the code: it will always change when the boot project is updated.
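The CRC-plus-timestamp check above can be sketched like this. The default path and the Port_851 file names follow the answer; adjust them for your target, and treat this as an illustration rather than a supported Beckhoff API:

```python
# Sketch: detect boot-project changes by comparing Port_851.crc against
# a recorded baseline, and by listing modified times under the Boot
# folder (excluding .bootdata persistent-data files, which update on
# their own). Paths assume the default C:\TwinCAT\3.1\Boot layout.

import os
from pathlib import Path

BOOT_DIR = Path(r"C:\TwinCAT\3.1\Boot")

def read_crc(boot_dir=BOOT_DIR):
    """Return the raw CRC bytes of the boot project, or None if absent."""
    crc_file = boot_dir / "Port_851.crc"
    return crc_file.read_bytes() if crc_file.exists() else None

def code_changed(baseline_crc, boot_dir=BOOT_DIR):
    """True if the boot project CRC no longer matches the baseline."""
    return read_crc(boot_dir) != baseline_crc

def list_mtimes(boot_dir=BOOT_DIR):
    """Modified times of boot files, excluding persistent-data files."""
    return {p.name: os.path.getmtime(p)
            for p in boot_dir.glob("*")
            if p.is_file() and not p.name.endswith(".bootdata")}
```

Record `read_crc()` once after a known-good download, then call `code_changed()` periodically.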

Using each plugin in Nutch separately

I'm using the extractor plugin with Nutch 1.15. The plugin makes use of parsed data.
The plugin works fine when used as part of the whole crawl. The problem arises when changes are made to the custom-extractors.xml file.
The entire crawling process needs to be restarted even for a small change in the custom-extractors.xml file.
Is there a way that a single plugin can be run separately on parsed data?
Since this plugin is a Parser filter, it must be used as part of the Parse step, and is not stand-alone.
However, there are a number of things you can do.
If you are looking to change the configuration on the fly (affecting only newly parsed documents), you can use the extractor.file property to specify any location on HDFS and replace this file as needed; it will be read by each task.
If you want to reapply the changes to previously parsed documents, the answer depends on the specifics of your crawl, but you may be able to run the parse step again using nutch parse on the old segments (you will need to delete the existing parse folders in the segments first).
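The re-parse loop described above could be scripted along these lines. The `bin/nutch parse <segment>` invocation and the `crawl_parse`/`parse_data`/`parse_text` segment subfolders are standard Nutch, but the path layout here is an assumption for local-filesystem segments; HDFS-backed crawls would need `hadoop fs` operations instead:

```python
# Sketch: re-run the parse step on existing segments. For each segment,
# remove the previous parse output (nutch parse refuses to overwrite
# it) and invoke `bin/nutch parse <segment>`.

import shutil
import subprocess
from pathlib import Path

PARSE_DIRS = ("crawl_parse", "parse_data", "parse_text")

def reparse_segment(segment, nutch_bin="bin/nutch", dry_run=False):
    segment = Path(segment)
    for name in PARSE_DIRS:
        d = segment / name
        if d.exists():
            shutil.rmtree(d)  # delete old parse output first
    if not dry_run:
        subprocess.run([nutch_bin, "parse", str(segment)], check=True)

def reparse_all(segments_root, **kw):
    """Re-parse every segment directory under crawl/segments."""
    for seg in sorted(Path(segments_root).iterdir()):
        if seg.is_dir():
            reparse_segment(seg, **kw)
```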

Talend: How to copy the file with modified as of today

I have a job in Talend which connects to an FTP folder and looks for files, e.g. ABCD. This file is created every day and placed in the FTP path, and I need to move these files to another folder. I'm new to Talend and Java. Could you please help me move this file only when its last-modified date matches the job run date?
You can use tFTPFileProperties to obtain the properties of the remote file, then access those properties in a tJavaRow. You can then compare against the current date in the tJavaRow and put the result in a global variable, or put the date itself in a global variable. You then use an IF trigger to join to the tFTPGet component.
The IF trigger will either check the results of your compare, or do the compare. It will only execute the FTP Get if true.
This shows overall job structure, including the fields made available from the file properties:
This shows how to obtain the datetime of the remote file. This is where you will need to stick it in a global variable (code for that is not shown) so you can use it in your IF trigger code.
This shows the datetime of the remote file when the job is run.
This points you in the right direction but you will need to still do some work. You will need to do the compare in your IF trigger and know how to compare dates.
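The comparison the IF trigger needs is just "same calendar day", not an exact datetime match. In Talend this would be written in Java inside the tJavaRow/IF trigger; the sketch below shows the logic in Python only to illustrate it (the function name is made up):

```python
# Sketch: is the remote file's last-modified date (e.g. from
# tFTPFileProperties) the same calendar day as the job run date?
# Compare dates, not full datetimes, or the times will never match.

from datetime import date, datetime

def modified_today(mtime, today=None):
    """True if mtime (a datetime) falls on 'today' (a date)."""
    today = today or date.today()
    return mtime.date() == today

print(modified_today(datetime(2024, 1, 15, 9, 30), date(2024, 1, 15)))
```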

CALDAV sync algorithm

I'm trying to implement a sync using CalDAV and sync reports; however, I'm having conceptual problems about how to sync one calendar (one VEVENT) between multiple clients and the server.
Most RFCs refer to the use of the ETag to determine whether a resource has changed since it was last synced (if the ETag changes, the resource has changed since the last sync). That I get. However, how do I know which change is more recent?
For example, client A has an iCal 'X' that was last edited at 1AM, and they sync at 8AM. Client B also has a version of iCal X, which they edited at 2AM and sync at 7AM. So B's edit is newer than A's, and B synced before A.
When A syncs, it will see B's newer version of X. From the ETag it knows that X has changed, but not when. I'm assuming that A should overwrite its copy with B's, since B's is newer (or at least prompt the user that B's is newer)... Is this assumption correct? Is there a standard way to handle this situation?
The problem in general is figuring out which file is newer between the server and a client. The ETag can only detect 'changed', not 'newer'. The last-modified date seems to reflect the iCal's upload date, not its last edit date on the client. This leads me to believe I'm missing something. Is there some generally accepted algorithm for syncing?
The last edit date is just one piece of the equation here. More meaningful is the actual modification: you might have turned off an alarm from device B (an insignificant change) but changed the start date from device A (a major change). So a well-behaved client should make its best effort to merge the two.
Some clients will just notify you that the event had been edited and will ask you which copy to keep but without a side by side comparison UI, this is really confusing for end users.
Without a merge mechanism, I would just ignore the etag and always overwrite.
Finally, you should also worry about the schedule-tag of the event (see https://www.rfc-editor.org/rfc/rfc6638#section-3.2.10 ).
Also, the iCal file should contain a SEQUENCE number (incremented on each edit), which is more important than the date of the edit. By comparing SEQUENCE values you can at least decide which edit is newer, as long as the two copies don't have the same value.
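The SEQUENCE comparison can be sketched as below. Per RFC 5545 a missing SEQUENCE defaults to 0; the function names here are illustrative, and a real client would use a proper iCalendar parser rather than a regex:

```python
# Sketch: use the SEQUENCE property of two copies of a VEVENT to
# decide which is the later edit. Equal SEQUENCE values mean the
# comparison is inconclusive (merge or prompt instead).

import re

def get_sequence(ics_text):
    """Extract SEQUENCE from raw iCalendar text; default 0 if absent."""
    m = re.search(r"^SEQUENCE:(\d+)", ics_text, re.MULTILINE)
    return int(m.group(1)) if m else 0

def newer_copy(a, b):
    """Return 'a', 'b', or None when SEQUENCE cannot decide."""
    sa, sb = get_sequence(a), get_sequence(b)
    if sa == sb:
        return None
    return "a" if sa > sb else "b"

event_a = "BEGIN:VEVENT\nSEQUENCE:2\nEND:VEVENT"
event_b = "BEGIN:VEVENT\nSEQUENCE:3\nEND:VEVENT"
print(newer_copy(event_a, event_b))  # b
```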

eclipse CVS usage: clean timestamps

During synchronisation with the CVS server, Eclipse compares the contents of the files (internally it uses CVS commands, of course). But files without any content change are also shown as different if they have a different timestamp, because they have been "touched". You always have to check manually, via the file comparison dialog, whether there was really a change or not.
Due to auto-generation I have some files that always get new timestamps, and therefore I always have to check manually whether they really contain any change.
In the Eclipse documentation I read:
Update and Commit Operations
There are several flavours of update and commit operations available
in the Synchronize view. You can perform the standard update and
commit operation on all visible applicable changes or a selected
subset. You can also choose to override and update, thus ignoring any
local changes, or override and commit, thus making the remote resource
match the contents of the local resource. You can also choose to clean
the timestamps for files that have been modified locally (perhaps by
an external build tool) but whose contents match that of the server.
That's exactly what I want to do, but I don't know how! There is no further description/manual ...
Did anybody use this functionality and can help me (maybe even post a screenshot)?
Thanks in advance,
Mayoares
When you perform a CVS Update on a project (using context menu Team->Update), Eclipse implicitly updates the timestamp of local files whose contents match that of the server.
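Outside Eclipse, "cleaning a timestamp" amounts to the following: if the local contents are identical to the reference copy, reset the local modification time so the sync stops flagging a spurious change. This sketch is only an illustration of that idea (the function and file names are made up, and CVS itself tracks timestamps in its own metadata):

```python
# Sketch: reset a touched-but-unchanged file's mtime to match a
# reference copy, so content-identical files no longer look modified.

import os
from pathlib import Path

def clean_timestamp(local, reference):
    """Reset local's mtime to reference's when the contents match.

    Returns True if the timestamp was cleaned, False if the files
    genuinely differ (a real change, which should stay flagged).
    """
    if Path(local).read_bytes() != Path(reference).read_bytes():
        return False
    st = os.stat(reference)
    os.utime(local, (st.st_atime, st.st_mtime))
    return True
```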