ClickOnce is modifying my file dates - deployment

When users get a new update via ClickOnce, the file modification dates for the executables and DLLs are updated to the date of installation. How can I keep them at the same date as when the project was built?

Related

PHP auto-update script files

How can I create a script version-update function like the one in DataLife Engine CMS? For example, a customer has version 1.0 and I have created version 1.5, and I want the user to be able to update the script directly from the admin panel. How can I do it?
Note: this is my own solution.
Guilherme Soster has provided the basics on how to do it. However, if you do not want to build it yourself, you can check my php-updater solution. In a YAML file you define which files to add (or overwrite) and which to delete, as well as which scripts to run.
Well, basically you'll have to store the version of the user's installation in a config file or database, and have a server where you keep the latest version of your scripts.
Then, when the user requests an update check (or each time the software runs), your update script should hit the server and get the latest version number, compare it with the one stored for the user, and if the user's version is older, download the new scripts from the server (usually to a temp directory). Now all the update script has to do is remove the old scripts and move the new ones to the directories where they belong. Finally, the update script should update the version in the config file/database.
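A minimal sketch of that flow, written in Python for illustration (the same logic ports directly to PHP); the server URL, version-file location, and zip layout are all assumptions:

    import shutil
    import tempfile
    import urllib.request
    from pathlib import Path

    VERSION_FILE = Path("config/version.txt")       # placeholder config location
    UPDATE_SERVER = "https://example.com/updates"   # placeholder update server
    INSTALL_DIR = Path(".")

    def parse_version(v: str):
        return tuple(int(part) for part in v.strip().split("."))

    def check_and_update() -> None:
        local = parse_version(VERSION_FILE.read_text())
        # Ask the server for the latest version number.
        with urllib.request.urlopen(f"{UPDATE_SERVER}/latest.txt") as response:
            latest_str = response.read().decode().strip()
        if parse_version(latest_str) <= local:
            return  # already up to date
        # Download the new scripts to a temp directory first.
        with tempfile.TemporaryDirectory() as tmp:
            archive = Path(tmp) / "update.zip"
            urllib.request.urlretrieve(f"{UPDATE_SERVER}/{latest_str}.zip", archive)
            staging = Path(tmp) / "unpacked"
            shutil.unpack_archive(archive, staging)
            # Overwrite the old scripts with the new ones.
            for src in staging.rglob("*"):
                if src.is_file():
                    dest = INSTALL_DIR / src.relative_to(staging)
                    dest.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dest)
        # Finally, record the new version in the config file.
        VERSION_FILE.write_text(latest_str)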

PowerShell script to copy a zip file from one location to another, creating a new folder in the destination, whenever a TFS build runs

Hi, can anyone help with this? I need a PowerShell script to run from a TFS build in the post-build section. Every time a build happens, it has to copy the generated zip file to a local location, into a folder named with the build number or build name.
If you use TFS 2015, it's suggested to use the new build system, as it already has a Copy Files task that you can use directly.
For detailed information on this task, see: https://github.com/Microsoft/vsts-tasks/tree/master/Tasks/CopyFiles
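If you do want to script it yourself, the logic is only a few lines; sketched here in Python for illustration (a PowerShell version would use Copy-Item the same way). TF_BUILD_BUILDNUMBER is the variable XAML builds expose and BUILD_BUILDNUMBER its vNext equivalent; the paths are placeholders:

    import os
    import shutil
    from pathlib import Path

    # XAML builds expose TF_BUILD_BUILDNUMBER; vNext builds use BUILD_BUILDNUMBER.
    build_number = (os.environ.get("TF_BUILD_BUILDNUMBER")
                    or os.environ.get("BUILD_BUILDNUMBER", "manual"))

    source_zip = Path(r"C:\Builds\Output\package.zip")  # placeholder: the generated zip
    dest_root = Path(r"D:\Drops")                       # placeholder destination

    # Create a folder named after the build and copy the zip into it.
    dest_dir = dest_root / build_number
    dest_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source_zip, dest_dir / source_zip.name)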

How to remove a non-empty folder created by an old version of the application during an upgrade in Advanced Installer

Many thanks in advance. I am using Advanced Installer 12.1. The scenario: we have an installed application in a custom location (extending the default application location), and the installation creates some folders to store temporary files, such as a "Temporary" folder. The application uses this folder every minute, so there are always files in it. The issue: when I run a new build or install the new version of the application and again select another custom location, the installer deletes the folders installed with the older version before finishing the installation, but it does not delete folders that contain files created by the application. For example, the Temporary directory above is deleted if it is empty, but not when it is non-empty.
Solutions I tried:
1) The File Removal tool, but I think it points to the current location rather than the older application path.
2) A custom VBScript, but again the same issue as #1.
3) The Uninstall Cleanup wizard, with the same result as #1.
Please guide me on how I can delete that folder; any help would be appreciated.
Thanks
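Since the built-in removal tools resolve paths against the current install location, one generic workaround (not Advanced Installer specific) is a cleanup custom action that is handed the old install path explicitly. A minimal sketch of that idea in Python; how the old path is recovered (e.g. a registry value written by the previous version) is an assumption left to your setup:

    import shutil
    import sys
    from pathlib import Path

    def remove_leftover_temp(old_install_dir: str) -> None:
        # Delete the application's Temporary folder from the previous
        # install location, even when it still contains files.
        temp_dir = Path(old_install_dir) / "Temporary"
        if temp_dir.is_dir():
            # ignore_errors so a locked file doesn't abort the upgrade
            shutil.rmtree(temp_dir, ignore_errors=True)

    if __name__ == "__main__":
        # The old path must be passed in explicitly (e.g. recorded by the
        # previous version); the upgrade only knows the new custom location.
        remove_leftover_temp(sys.argv[1])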

InstallShield - few files getting removed on major upgrade

I updated an installer file (.ism) for a major upgrade, in which I made the following changes:
updated the product code,
updated the package code,
updated VersionMin and VersionMax in the Upgrade table,
updated the product version,
updated a few strings in which the old version was mentioned.
Now when I upgrade my product using this setup, a few files get removed automatically.
I did not make any changes to those files on the target machine, and the same files (no change in content) are present in my new setup.
I also did not add any entries to the "RemoveFiles" table to remove them.
I also checked the installation log, in which I see just this:
Action 14:14:59: RemoveFiles. Removing files
RemoveFiles: File: CapibilityDemo.htm, Directory: C:\Program Files\Server\Printing\
RemoveFiles: File: HTTP.js, Directory: C:\Program Files\Server\Scripts\OpenLayers\lib\OpenLayers\Protocol\
RemoveFiles: File: Script.js, Directory: C:\Program Files\Server\Scripts\OpenLayers\lib\OpenLayers\Protocol\
Can anyone please help me resolve this issue?
Thanks
Taran
Dynamic components are probably the problem.
This link is from someone who was having a similar problem while patching (which is like a minor upgrade).
Basically, what is happening is that MSI has determined that the 'old' components have been removed (since they are dynamically generated, the GUIDs change every build). So in your upgrade it is removing the components you 'removed'. However, it isn't laying down the new components, likely because it has determined there isn't a need for it to do so. You should examine your MSI file in Orca, look for the files/components that didn't get installed in your upgrade, and then search the install log for that GUID. That should give you a clue as to the next steps.
Also, here are the InstallShield best-practice recommendations for dynamic file linking.
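If the verbose log is long, a few lines of script speed up that GUID hunt. A minimal sketch (the log path and GUID are passed as arguments; MSI log encoding varies by system, hence the lenient decoding):

    import sys

    def find_component(log_path: str, guid: str) -> None:
        # Print every log line mentioning the component GUID,
        # with line numbers for context.
        with open(log_path, errors="replace") as log:
            for number, line in enumerate(log, 1):
                if guid.lower() in line.lower():
                    print(f"{number}: {line.rstrip()}")

    if __name__ == "__main__":
        find_component(sys.argv[1], sys.argv[2])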

How to use version control with JasperReports

We're about to start development of a number of reports using Jasper Server Reports version 3.7.0 CE.
Does anyone have any recommendations as to how best to manage version control with this development, given that the structure of the report units is managed in the database and through either iReport or the web front end?
In fact you can import/export to a directory structure using the js-import/js-export scripts, but then you can't edit these files directly with iReport.
Does anyone have any pointers?
This is problematic. I have established a Subversion repository to allow our standard reports delivery to be versioned, but it is a real pain because Jasper does not make this even a little bit easy.
I created a Maven project with an assembly descriptor so that "src/main/xml/resources/Reports,adhoc,Domains, etc" can be packaged up in a zip that is pushed to our Maven repository.
The biggest problem is that you can't develop ad hoc reports and input controls merely by modifying XML files. The developer has to import what is in source control into a working JasperServer, modify the reports or add new ones (after making sure that his organization and datasources are configured), and once he's satisfied that the reports work, export the resources to a directory or zip file and manually modify all references in the exported files from datasources and organization-specific resource locations back to "generic" before checking in his changes.
When importing into Jasper, the same process has to be done in reverse: the generic paths and organization values have to be converted to the developer's organization so that everything can be easily imported/updated, and he can prove out that the full "round trip" works properly before checking in.
To make the export/Subversion check-in easier, I created an Ant build file which lives in the Maven project's root directory. The build prompts for (or reads from a properties file) the exported zip location and the organization id of the exported tree. It then opens the exported zip file from Jasper, explodes it, performs text replacements on the files, resets the "createdDate" and "updatedDate" elements to a standard value (so that the developer does not end up checking in files that haven't actually changed, since Jasper does not preserve the date values), and then copies the files into the Subversion tree.
For the import process (from the Subversion tree into Jasper), we have a script that takes the organization id as input and then modifies the versioned XML files to the appropriate values so that the entire tree can be easily imported/updated into that organization.
The reason this level of complexity is required is to allow us to create the same standard reports in a multi-tenant environment; plus, Jasper's notion of deploying reports is absolutely bizarre. I'm not sure it would be possible to make this process more difficult if you were trying to.
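The Ant build described above does the post-processing; here is a minimal sketch of the same text-replacement and date-reset step in Python. The export directory, organization id, and fixed date value are placeholders, and the /organizations/... path pattern assumes a multi-tenant repository layout:

    import re
    from pathlib import Path

    EXPORT_DIR = Path("exported")          # exploded export output (placeholder)
    ORG_ID = "acme"                        # developer's organization id (placeholder)
    FIXED_DATE = "2010-01-01T00:00:00"     # standard value so unchanged files stay unchanged

    for xml_file in EXPORT_DIR.rglob("*.xml"):
        text = xml_file.read_text(encoding="utf-8")
        # Genericize organization-specific resource paths before check-in.
        text = text.replace(f"/organizations/{ORG_ID}", "/organizations/GENERIC_ORG")
        # Neutralize the timestamps Jasper rewrites on every export.
        text = re.sub(r"<createdDate>[^<]*</createdDate>",
                      f"<createdDate>{FIXED_DATE}</createdDate>", text)
        text = re.sub(r"<updatedDate>[^<]*</updatedDate>",
                      f"<updatedDate>{FIXED_DATE}</updatedDate>", text)
        xml_file.write_text(text, encoding="utf-8")

The import direction runs the same replacement in reverse, substituting the target organization id back in.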
If I were in your position, I would establish this kind of process:
end of development session: export all reports to a directory structure in a project under version control (see the command sketch after this list)
commit the project
before the next development session: synchronize the project with the SVN repository
import the directory structure into JasperServer
continue development
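For the export and import steps in this process, JasperServer ships the js-export/js-import scripts mentioned in the question. A sketch of driving them; the install path is a placeholder, and the flag names are from the 3.7-era import-export tool, so verify them against your installation:

    import subprocess

    JS_HOME = "/opt/jasperserver/buildomatic"   # placeholder install path

    # End of session: dump the reports subtree into the working copy.
    subprocess.run([f"{JS_HOME}/js-export.sh",
                    "--uris", "/reports",
                    "--output-dir", "versioned/reports"], check=True)

    # Before the next session: load the updated working copy back in.
    subprocess.run([f"{JS_HOME}/js-import.sh",
                    "--input-dir", "versioned/reports",
                    "--update"], check=True)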
Not sure if someone has already posted the solution.
This is what I have done for existing reports (a sketch automating these steps follows below):
export the reports from JasperServer
rename the files from .data to .jrxml
modify the subreport references to add the extension (e.g. A.jrxml should reference its subreport as B.jrxml)
add .jrxml to the dataFile, label, and name values in the report unit XML files
If you are creating a new report on JasperServer, it is simple:
give .jrxml in the name and label while adding the jrxml file. That's it.
Now you can work on the same files locally and import them back to JasperServer.
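A minimal sketch automating the rename and rewrite steps above for an exported tree. The element names (dataFile, label, name) follow the steps listed in this answer and may differ between versions; test on a copy of the export first:

    import re
    from pathlib import Path

    EXPORT_DIR = Path("exported")  # root of the exported tree (placeholder)

    # Step 1: rename the exported JRXML payloads from .data to .jrxml.
    for data_file in EXPORT_DIR.rglob("*.data"):
        data_file.rename(data_file.with_suffix(".jrxml"))

    def add_extension(match: re.Match) -> str:
        tag, value = match.group(1), match.group(2)
        if not value.endswith(".jrxml"):
            value += ".jrxml"
        return f"<{tag}>{value}</{tag}>"

    # Step 2: append .jrxml inside the dataFile/label/name elements of the
    # report unit XML files. This match is deliberately naive; tighten it
    # to your export format before running for real.
    for xml_file in EXPORT_DIR.rglob("*.xml"):
        text = xml_file.read_text(encoding="utf-8")
        new_text = re.sub(r"<(dataFile|label|name)>([^<]+)</\1>", add_extension, text)
        if new_text != text:
            xml_file.write_text(new_text, encoding="utf-8")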