Creation Date of Compiled Executable (VC++ 2005)

The creation date of an executable linked in VS2005 is not set to the date the .exe file was actually created. Only a complete rebuild sets the current date; a re-link will not do it. Evidently the file is given some date that is taken from one of the project files.
So: is there a way to force the linker to set the creation date to the actual link date?

Delete the executable as part of a pre-link event.
Edit:
Hah, I forgot about Explorer resetting the creation date if you name a file exactly the same as a file that was recently deleted (Windows file-system "tunneling" preserves the old creation date in that case).
Why are you keying off the creation date anyway?

A complete rebuild will delete that file, forcing the linker to create it, hence the reason it gets a new creation date. You could try disabling incremental linking under project properties (Linker | General). If that doesn't do it, you could add a build event to delete the .exe file and force the linker to create a new one each time. Both of these things could increase your build time.

Deleting the executable doesn't do the job; that's the problem. Also, I could not identify any project file whose timestamp matched that of the later-linked executable, which leads me to conclude that the 'creation date' is taken from within some project file.
The project has 400,000 lines, so a full build is not an option.

What about using something like DirDate (or writing a little utility yourself) to set the creation date, and calling it from the post-build step?
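A minimal sketch of such a post-build step, using PowerShell to stamp the freshly linked binary (the $(TargetPath) macro expands to the output .exe in a VC++ build event; DirDate or a small utility calling the Win32 SetFileTime API would do the same job):

powershell -Command "(Get-Item '$(TargetPath)').CreationTime = Get-Date"

Because this runs after the link, it also sidesteps the recently-deleted-file behavior mentioned above.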

Related

Using PowerShell (5.1) to edit Outlook PST path

I know how to add or remove a store with PowerShell using Microsoft.Office.Interop.Outlook, but I haven't found any information about changing values.
I read https://learn.microsoft.com/en-us/office/vba/api/outlook.namespace#methods but I don't see a method available for setting properties.
Context: User's PST files have been moved from one path to another. I'm trying to avoid disruption wherever possible, so I'm writing a PS script to move the PST files, and then update Outlook with the new path.
Since removing and re-adding the stores will break user-defined stuff like rules, I'm hoping for a way to change existing store filepaths that will require no user action.
Is this possible at all?
As a second option, can I pull the existing rules, and modify them (or recreate them)?
PST store entry id embeds the PST path in it (you can see it in OutlookSpy - I am its author - click IMessage / IMAPIFolder / IMsgStore button, select PR_STORE_ENTRYID, click "..." next to the Value edit box).
If a rule includes a store id (e.g. copy / move message action), you would need to reset / recreate the rule.
If you don't want to remove / re-add a store, you can reset the store location using the ProfMan library (I am also its author) directly in the profile section in the registry. See https://www.dimastr.com/redemption/profman_examples.htm#example2 for an example of how to read PST paths. You can modify the script to set the path instead.
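If you would rather poke the profile section in the registry directly instead of going through ProfMan, a rough sketch in PowerShell might look like the following. The profile key path (Outlook 2016+ shown) and the 001f6700 value name (PR_PST_PATH_W) are assumptions to verify with OutlookSpy, and Outlook must be closed before writing anything back:

$profiles = 'HKCU:\Software\Microsoft\Office\16.0\Outlook\Profiles\Outlook'
Get-ChildItem $profiles | ForEach-Object {
    # Each profile section backing a PST store carries its path as a
    # Unicode byte array in the (assumed) 001f6700 binary value.
    $bytes = (Get-ItemProperty $_.PSPath).'001f6700'
    if ($bytes) {
        $old = [Text.Encoding]::Unicode.GetString($bytes).TrimEnd([char]0)
        "$($_.PSChildName): $old"
        # To repoint the store, write the new path back (null-terminated):
        # $new = [Text.Encoding]::Unicode.GetBytes("D:\Archive\user.pst`0")
        # Set-ItemProperty $_.PSPath -Name '001f6700' -Value ([byte[]]$new)
    }
}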

How to load data from last-modified files within one day from subfolders in Azure Data Flow

I have the following directory structure on an Azure container:
-dwh-prod
  -Main_Folder
    -2021-01
      -file1.parquet
    -2021-02
      -file2.parquet
      -file3.parquet
where the Data is partitioned by year and month to create subfolders. Within these sub-folders, I have my data files. I want to load into my data flow only the latest files that were added within one day from running my data flow pipeline.
I tried using currentUTC() in End Time and subtracting one day -> AddDays(currentUTC(), -1) in Start Time in the 'Filter by last modified' option provided in the source options, but it didn't work.
I also tried using currentTimestamp() instead but to no avail.
How do I go about solving this?
Your expression is correct. Please change the folder path from MainFolder to Main_Folder in your dataset and set Main_Folder/*/*.parquet as the Wildcard path in your Source options. Then it will work.
I think your solution is close, but I'm not sure the folder name is sufficient. I'm also not familiar with "currentUTC". The correct function should be utcNow.
Below is an outline of how I would approach this problem.
Source Dataset
Add a Parameter for the subfolder (year-month) to the dataset, and then set the Folder path to an expression like:
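(a sketch; the parameter name SubFolder is illustrative)

@concat('Main_Folder/', dataset().SubFolder)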
Pipeline
You could either pass in the subfolder or calculate it at runtime. My preference would be to pass it in as a parameter:
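For example, the value handed to that parameter could be a literal like 2021-02, or computed at runtime (a sketch):

@formatDateTime(utcNow(), 'yyyy-MM')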
I would then add variables to calculate the start and end times. Since you are running this daily, I would be sure to force the time to the START of the day(s); this should handle any vagaries based on run time. Also, I would use the built-in getPastTime function:
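(a sketch; the StartTime/EndTime variable names are illustrative)

StartTime: @startOfDay(getPastTime(1, 'Day'))
EndTime: @startOfDay(utcNow())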
Now use these objects in the 'Filter by last modified' settings of your Source configuration:
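(a sketch; if the source sits inside a mapping data flow, pass the values through data flow parameters rather than referencing pipeline variables directly)

Filter by last modified, Start time: @variables('StartTime')
Filter by last modified, End time: @variables('EndTime')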

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine in the first workspace (let's call it Workspace1).
Now I need to move this experiment as-is to another workspace (call it Workspace2) using PowerShell, and be able to run it there.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem:
When I copy the experiment using Copy-AmlExperiment from Workspace1 to Workspace2, the experiment and all its properties get copied except the Azure Table account key.
The copied experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export the copied experiment from Workspace2 as JSON (Export-AmlExperimentGraph), insert the account key into the JSON file, and import it back into Workspace2 (Import-AmlExperiment), the experiment fails to run.
On PowerShell I get an "Internal Server Error : 500".
While running on studio.azureml.net, I get the notification as "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there anyway to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem has something to do with how the experiment handles the account key. When I enter it manually, it's converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the account key, the key remains as-is and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment in the Cortana gallery is one of the most useful options. Only people with the link can see and add the experiment from the gallery. At the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the pwd is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that this is a plain-text password, not an encrypted password that you are setting. If you export the experiment in JSON format, you can easily figure this out.
I think I found why you are unable to get the credentials imported back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will also notice that the credentials are stored as 'Placeholders' instead of 'Literals'; for an injected key to be picked up, they need to be changed to Literals. You can do this by traversing the JSON to find the relevant parameters you need to update.
In brief: locate the account-key parameter in the exported JSON and change its 'Placeholder' marker to 'Literal' before re-importing.

Why would LayoutObjectNames return an empty string in FileMaker 14?

I'm seeing some very strange behavior with FileMaker 14. I'm using LayoutObjectNames for some required functionality. On the development system it's working fine. It returns the list of named objects on the layout.
I close the file, zip it up and send it to the client, and that required functionality isn't working. He sends the file back and I open it and get a data viewer up. The function returns nothing. I go into layout mode and confirm that there are named objects on the layout.
The first time this happened, I tried recovering the file. In the recovered file it worked, so I assumed some corruption had happened on his end. I told him to trash the file I had given him and work with a new version I supplied. The problem came up again.
This morning he sent me the oldest version that the problem manifested in. I confirmed the problem, tried recovering it again, but this time it didn't fix the problem.
I'm at a loss. It works in the version I send him, doesn't on his system. We're both using FileMaker 14, although I'm using Advanced. My next step will be to work from a served file instead of a local one, but I have never seen this type of behavior in FileMaker. Has anyone seen anything similar? Any ideas on a fix? I'm almost ready to just scrap the file and build it again from scratch since we're not too far into the project.
Thanks, Chuck
There is a known issue with the Get (FileName) function when the file name contains dots (other than the one before the extension). I will amend my answer later with more details and a possible solution (I have to look it up).
Here's a quote from 2008:
This is a known issue. It affects not only the ValueListItems() function, but any function that requires the file name. The solution is to include the file extension explicitly in the file name. This works even if you use Get (FileName) to return the file name dynamically:
ValueListItems ( Get ( FileName ) & ".fp7" ; "MyValueList" )
Of course, this is not required if you take care not to use periods when naming your files.
http://fmforums.com/forums/topic/60368-fm-bug-with-valuelistitems-function/?do=findComment&comment=285448
Apparently the issue is still with us - I wonder if the solution is still the same (I cannot test this at the moment).
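Applying the quoted workaround to the function in question would look something like this (a sketch; FileMaker 14 files use the .fmp12 extension, and Get ( LayoutName ) here simply supplies the current layout's name):

LayoutObjectNames ( Get ( FileName ) & ".fmp12" ; Get ( LayoutName ) )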

How to check generated file has been modified in Eclipse plugin development?

Currently the plugin generates a series of files in an IProject, and I need to check whether a generated file has been modified by the user. If it has, I need to handle the regeneration differently.
What I can think of is checking whether Creation Date == Modified Date. Since I delete the old file and create it again whenever the user has not touched it, the creation date would always equal the modified date for untouched files. However, I did not see how to retrieve these two properties from IFile. Can anyone help me with this?
I am quite new to Eclipse plugin development; can anyone suggest another way around this?
*** Locking the generated files is not an option, as they are source code.
The modification stamp of an IFile, or more generally an IResource, can be obtained with getModificationStamp(). The return value is not strictly a timestamp, but it should serve your needs; see the JavaDoc for details.
If, however, you would like to track whether the content of a file was changed I would rather compute a hash of the content, for example with a MessageDigest. You can then compare the two hashes to decide whether the file was changed.
This latter approach would regard a file as unchanged if it was changed, saved, the changes reverted, and saved again. The modification stamp, on the other hand, would declare the file as changed even though its content is the same again.
Whichever approach you choose, you can store the modification stamp (or content hash) at generation time by using IResource#setPersistentProperty() and later compare it with the current modification stamp. Persistent properties are stored on disk with the platform metadata and maintained across platform shutdown and restart.
I found the answer:
private boolean isModified(IFile existingFile) throws CoreException {
    // A file that was generated and never edited by the user has no
    // local-history states; any save made in the IDE records one.
    IFileState[] history = existingFile.getHistory(new NullProgressMonitor());
    return history.length > 0;
}
Local history is maintained by the Eclipse IDE, so it survives restarting Eclipse. If the file was created and never modified, it has zero history states.
You can clear local history by doing:
existingFile.clearHistory(new NullProgressMonitor());