RemoveDiffs production flag option does not remove all occurrences of the Runtime Version added to generated files - codefluent

Using the default production flag options, both the file generation date and the Runtime Version are displayed in generated files, as shown below.
Notice that the Build Version is listed twice:
// CodeFluent Entities generated (http://www.softfluent.com). Date: Thursday, 28 January 2016 13:41.
// Build:1.0.61214.0805
[System.CodeDom.Compiler.GeneratedCodeAttribute("CodeFluent Entities", "1.0.61214.0805")]
When I set defaultProducerProductionFlags to "Default, Overwrite, RemoveDates", it still displays one of the build numbers:
// CodeFluent Entities generated (http://www.softfluent.com). Date: .
[System.CodeDom.Compiler.GeneratedCodeAttribute("CodeFluent Entities", "1.0.01234.05678")]

The CodeFluent Entities build number cannot be removed entirely. If you look closer, though, the remaining build number is a "magic" sequence:
1.0.01234.05678
It is used to avoid merge conflicts in source control and is guaranteed never to change.

Related

How can I control my build number with Azure DevOps?

I have so many frustrations with Azure DevOps. In my build number format I would like to have both:
A number that restarts at 0 when I update my major and minor version.
But I would also like a real build number that is never reset, whatever my build number format is. This build number could also be shared by all the build pipelines of my project. Is it possible?
I'm not using the YAML format. I use the classic interface with the options page to set my build number format. At the moment I have this:
It works, except that each month the $(Rev:r) number restarts at 0. I want it to keep counting.
EDIT
I still haven't decided on my final format; I would like to understand all the possibilities. Now that I have discovered the $(BuildID) property, I have another question: is it possible to have something similar to the $(Rev:r) variable, but one that only checks the left part of my build number?
Example:
4.16.$(SequenceFor[4.16]).$(BuildID)
In fact I would like to set the Major and Minor versions manually, let the system increment the Build number one by one, and use the Revision for the global $(BuildID).
The $(Rev:r) value is reset whenever any character of the build number changes, which is why it restarts whenever the major/minor version or the date changes.
So if you want an incremental unique number, you can't use $(Rev:r), because it will be reset every time the rest of the format changes.
If you want a number that depends on the major and the minor numbers you need to use the counter expression:
Create 2 variables:
major-minor = 4.16
And a variable that depends on its value and is also a counter:
revision = $[ counter(variables['major-minor'],0) ]
The build number will be:
$(major-minor).$(revision).$(Build.BuildId)
Now, if you change major-minor (to 4.17 or 5.16), the revision will start again at 0.
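For illustration (the numbers here are made up): with major-minor = 4.16 and a current Build.BuildId of 1234, successive runs would be numbered 4.16.0.1234, 4.16.1.1235, 4.16.2.1236, and so on. After changing major-minor to 4.17, the counter is keyed on the new value and starts again at its seed of 0, so the next run becomes 4.17.0.1237, while Build.BuildId keeps increasing across all pipelines of the organization and is never reset.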

How to check generated file has been modified in Eclipse plugin development?

Currently the plugin generates a series of files in an IProject, and I need to check whether a generated file has been modified by the user before. If a generated artifact has been modified by the user, I need to handle the regeneration differently.
What I can think of is checking whether Creation Date == Modified Date. I delete the old file and create it again when the user has not touched it, so that the creation date always equals the modification date. However, I did not see how to retrieve these two properties from IFile. Can anyone help me with this?
I am quite new to Eclipse plugin development; can anyone suggest another way around this?
*** Generated files cannot be locked, as they are source code.
The modification stamp of an IFile, or more generally an IResource, can be obtained with getModificationStamp(). The return value is not strictly a time stamp but should serve your needs; see the JavaDoc for details.
If, however, you would like to track whether the content of a file was changed I would rather compute a hash of the content, for example with a MessageDigest. You can then compare the two hashes to decide whether the file was changed.
This latter approach would regard a file as unchanged if it was changed - saved - changes reverted - saved again. The modification stamp on the other hand would declare the file as changed even though its content is the same again.
Whichever approach you choose, you can store the modification stamp (or content hash) at generation time by using IResource#setPersistentProperty() and later compare it with the current modification stamp. Persistent properties are stored on disk with the platform metadata and maintained across platform shutdown and restart.
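For example, here is a minimal sketch of the content-hash approach (the class name GenerationTracker and the qualifier "com.example.generator" are placeholders, not part of any existing API):

import java.io.InputStream;
import java.security.MessageDigest;

import org.eclipse.core.resources.IFile;
import org.eclipse.core.runtime.QualifiedName;

public class GenerationTracker {

    // Persistent property key under which the content hash is stored.
    private static final QualifiedName HASH_KEY =
            new QualifiedName("com.example.generator", "contentHash");

    // Call this right after generating the file.
    public static void rememberHash(IFile file) throws Exception {
        file.setPersistentProperty(HASH_KEY, hash(file));
    }

    // Returns true if the current content differs from the generated content.
    public static boolean isModified(IFile file) throws Exception {
        String stored = file.getPersistentProperty(HASH_KEY);
        return stored == null || !stored.equals(hash(file));
    }

    private static String hash(IFile file) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        try (InputStream in = file.getContents()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                digest.update(buffer, 0, read);
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : digest.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}

Hashing the saved content also sidesteps the changed - saved - reverted - saved again case mentioned above.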
I found the answer:
private boolean isModified(IFile existingFile) throws CoreException {
    // Any local-history entry means the file was saved again after it was generated.
    IFileState[] history = existingFile.getHistory(new NullProgressMonitor());
    return history.length > 0;
}
This feature is maintained by the Eclipse IDE, so it survives restarting Eclipse. If the file was created and never modified, the history is empty.
You can clear the local history by doing:
existingFile.clearHistory(new NullProgressMonitor());

T4 and Edmx conflict - "input file appears to be using a schema version not supported by this template"

I'm getting a warning from the T4 when the input file is an EF5 edmx.
Running transformation: The input file appears to be using a schema
version not supported by this template. This may lead to compile
errors. Please use 'Add New Generated Item' to add an updated
template.
Any idea why this is happening?
I got this issue once when I upgraded an old project to .NET Framework 4.7.
If that is the case here too, then the *.tt file is deprecated now. It is a T4 generator file, which will create C# source required to access your entity objects and needs to be replaced. Do the following to update it (assuming you're using database first approach):
Remove the current (deprecated) *.tt file (exclude it from project and delete it)
Open the *.edmx file in the solution explorer by double-clicking it. The data class diagram opens up.
Right-Click on a free space in the data classes visualization (your EF data model) and select "Update model from database..." in the context menu
Specify and test data connection (to ensure it is successful)
Now what happens in the background is that a new *.tt file will be generated. Once that is completed, rebuild your solution and the error should disappear.
But be aware that you likely have to do more changes afterwards, because there have been a couple of breaking changes in the newer versions of EF, which I have described here.

How do you put a version number into an rdlc file?

I've got a couple of rdlc files which I change every time the business has additional requirements. The problem is that we keep PDFs of the reports we create, and it's impossible right now to see which version of the rdlc file a report was created with.
I've thought about putting a text field with "Version XX.Y" into the footer, but then I have to remember to update this when I make changes. It's not the worst solution in the world, but I'd like to hear how others handle report versioning in reporting services.
Note that I am rendering reports using local reporting, i.e. no server, so I've thought about somehow displaying the assembly version of the application running the report. This I can control more easily with .* notation in assemblies, but I'm not sure how to have a text field show this.
You could add a property containing the assembly version to the report's data source (or pass the value in some other way).
public string AssemblyVersion {
    get {
        // GetName().Version is a System.Version, so convert it to a string here.
        return Assembly.GetAssembly(typeof(WhateverTypeThisIs)).GetName().Version.ToString();
    }
}
Another option might be to use an MSBuild task to replace the version number in the .rdlc for you. XMLPoke, for instance.
If you are using subversion, an option might be to use the SvnInfo task from MSBuild community tasks to get the last changed rev of the report and use that number to update the XML of the rdlc file.

JasperServer: Unable to locate the subreport exception

I have searched for a couple of days to fix this bug, with nothing new.
I had a report which includes multi-level subreports; everything works fine in iReport 3.7.5. I used subreport.jasper as the subreport expression in the first level, subreportA.jasper and subreportB.jasper in the second level, and placed all of them (the main report and the subreports) in the same path.
The problem arose when I tried to deploy it on my JasperServer.
When I uploaded the main report, the iReport wizard offered to attach the first subreport.jrxml in the resource folder and access it with repo:subreport.jrxml or repo:subreport.jasper.
Then I manually uploaded the second-level subreports, did the same thing, and changed the subreport expressions to repo:subreportA.jasper and repo:subreportB.jasper.
I got a compilation error: Unable to locate the subreport with expression: "repo:subreport.jasper". java.lang.Exception: repo:subreport.jrxml not found.
I have tried a dozen solutions and nothing works:
using SUBREPORT_DIR at the beginning,
using the full path: repo:/Circuit_Reports/Connectivity/Connectivity_files/,
switching between .jasper and .jrxml,
using jasperserver_api_engine_impl_0_fix.jar in the lib folder as a fix for this bug,
I also searched the database records to be sure that the reports are in the same folder and have the same parent folder.
Smalltalk before Longtalk ;)
(Of course I don't want to encourage you to read everything in this long, detailed post! The bold markers may already be enough to solve your problems, but I found it worth documenting this tricky stuff in some more detail!)
Since I invested another couple of hours on this (after I had resolved it some weeks ago, had to change it now, but forgot to document it properly, forgot how I did it, and could not retrieve this info again in any form when uploading and configuring to/in JasperServer) ... here is some aggregated functionality mentioned on various sites regarding subreport referencing, how it works and what one can try ...
(I'll update this with my or other people's findings if there hopefully will be some.)
Shortcut details / best practices?!4
Until maybe Jasper functionality provides a similar "wrapping" solution itself ...
To work around all the problems related to running the *.jrxml, *.jasper files either locally in Preview mode or remotely on a JasperServer, I am now using the following approach, which allows working with only a single *.jrxml file that works locally and remotely without modifications, in a multi-developer environment, supporting independent refactoring of dir structures (paths, names) per environment (= as it should ;-) ):
using some jasper-utils-*.jar
put it in your project (Java) class path (Project->Properties->Java Build Path->Libraries->Add)
put it in your ../jasperserver/WEB-INF/lib/ folder
referencing some custom Jasper Java scriptlet jr.utl.EnvScriptlet that does the ugly subreport path/reference magic in your master reports (a rough sketch of such a scriptlet follows after these notes)
define the REPORT_SCRIPTLET by adding an attribute to your master report: report properties -> Report -> Data Set -> Scriptlet Class: jr.utl.EnvScriptlet
using some custom properties file jr.utl.properties or otherwise supplied system properties (any other way to set the Java system properties would work as well; properties that are already set override loaded file properties) to configure the different environments, including your
current environment information via jr.utl.env property (prod, myOsUsrName, test, demo, staging, local, ...)
which determines how the subreport references must be generated / look like
server subreport parent directory property references
take e.g. these property file contents and put one per environment here:
on your servers: ../jasperserver/WEB-INF/classes/jr.utl.properties
jr.utl.env=prod
mycompany.local.jr.gui.rep.subrep1.parentdir=repo:/x/y/z/
mycompany.local.jr.gui.rep.subrep2.parentdir=repo:/x/y/z/
mycompany.local.jr.gui.rep.subrep3.parentdir=repo:/x/y/foobar/
in your local JasperSoft Studio (Eclipse) Java src/build path: e.g. ../myrepproject/src/java/jr.utl.properties
jr.utl.env=dietrian
mycompany.local.jr.gui.rep.subrep1.parentdir=D:/reporting/src/reports/
mycompany.local.jr.gui.rep.subrep2.parentdir=D:/reporting/src/reports/
mycompany.local.jr.gui.rep.subrep3.parentdir=D:/reporting/src/reports.otherdir/
to achieve independence from source modifications across our environments we parameterized those values and generate them once via some workspace-dependent/user-specific local.properties file, based on this idea:
|- build.xml (containing the ANT build magic)
|- build.properties (containing global properties)
|- local.properties (ignored in version control, e.g. .hgignore, user-specific generated from local.template.properties)
|- local.template.properties (source for ANT build task generating the local.properties above)
|- mycomp.local.proj.reporting.dir=D:/reporting
|- src/reports
|- jr.utl.properties (ignored in version control, user-specificly generated based on template below)
|- jr.utl.template.properties (source for ANT build task generating the jr.utl.properties above)
jr.utl.env=${user.name}
mycompany.local.jr.gui.rep.subrep1.parentdir=${mycomp.local.proj.reporting.dir}/src/reports/
mycompany.local.jr.gui.rep.subrep2.parentdir=${mycompany.local.jr.gui.rep.subrep1.parentdir}
mycompany.local.jr.gui.rep.subrep3.parentdir=${mycomp.local.proj.reporting.dir}/src/reports.otherdir/
defining your BASE_DIR master report parameters as e.g.
$P{REPORT_SCRIPTLET}.getProp("mycompany.allsubreports.parentdir") (matching some environment-dependent property in your jr.utl.properties file)
defining the master subreport expressions as e.g. jr.utl.EnvScriptlet.getSubrepPath( $P{BASE_DIR}, "subrep1.jrxml")
to automatically resolve the values from properties you could also use e.g. these variants:
jr.utl.EnvScriptlet.getSubrepPathByPropKey( $P{BASE_DIR}, "mycompany.local.jr.gui.rep.subrep1.name")
jr.utl.EnvScriptlet.getSubrepPathByPropKeys( "mycompany.local.jr.gui.rep.subrep1.parentdir", "mycompany.local.jr.gui.rep.subrep1.name")
$P{REPORT_SCRIPTLET}.getSubrepPath(...) does not work here :-( (I don't know why)
do not forget to restart your server when you put all the files on the server!
(4: Of course I am still seeing some minor improvements here, but it seems much better than all the ugly solutions I found till now. Improvements I would see:
using the REPORT_SCRIPTLET or scriptlet functionality may not be the best way to go, but it will probably work in the vast majority of use cases
although both of these existing Jasper classes suggest it, they do not seem to be able to handle the above properly:
FileResolver
RepositoryUtil
)
(5: the relevant special handling is encoded here: EnvScriptlet.java/getSubrepPath(String,String,boolean,String[]))
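(For illustration only, here is a rough sketch of what such a scriptlet could look like; the real jr.utl.EnvScriptlet is not reproduced here, and the property loading and helper methods below are my assumptions:)

package jr.utl;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import net.sf.jasperreports.engine.JRDefaultScriptlet;

public class EnvScriptlet extends JRDefaultScriptlet {

    private static final Properties PROPS = new Properties();

    static {
        // jr.utl.properties is looked up on the classpath: WEB-INF/classes on the
        // server, the project source/build path in JasperSoft Studio.
        try (InputStream in = EnvScriptlet.class.getResourceAsStream("/jr.utl.properties")) {
            if (in != null) {
                PROPS.load(in);
            }
        } catch (IOException e) {
            // ignore and fall back to system properties only
        }
    }

    // usable as $P{REPORT_SCRIPTLET}.getProp("mycompany.allsubreports.parentdir")
    public String getProp(String key) {
        // system properties override the loaded file properties
        return System.getProperty(key, PROPS.getProperty(key));
    }

    // usable as jr.utl.EnvScriptlet.getSubrepPath($P{BASE_DIR}, "subrep1.jrxml")
    public static String getSubrepPath(String baseDir, String reportName) {
        if (baseDir == null || baseDir.isEmpty()) {
            return reportName;
        }
        // "repo:" and trailing "/" prefixes are concatenated as-is
        return (baseDir.endsWith("/") || baseDir.endsWith(":"))
                ? baseDir + reportName
                : baseDir + "/" + reportName;
    }

    // usable as jr.utl.EnvScriptlet.getSubrepPathByPropKeys("...parentdir", "...name")
    public static String getSubrepPathByPropKeys(String dirKey, String nameKey) {
        return getSubrepPath(System.getProperty(dirKey, PROPS.getProperty(dirKey)),
                System.getProperty(nameKey, PROPS.getProperty(nameKey)));
    }
}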
Intro (Background)
First thing to know is that the handling/setup in JasperStudio is quite different from the handling on Jasper Server (Repository)5 ...
suppose we have the following environments:
our Eclipse install dir: C:\eclipse\
our Eclipse (Report) workspace: C:\workspace\
our report project under: C:\workspace\report-project\
our reports under: C:\workspace\report-project\src/reports
a master report C:\workspace\report-project\src/reports/masterrep.jrxml
some subreport C:\workspace\report-project\src/reports/subrep1.jrxml
another subreport C:\workspace\report-project\src/reports/somesubdir/subrep2.jrxml
the BASE_DIR (explained in next section) in our workspace master report is set to C:\workspace\report-project\src/reports/
our Jasper Report Server GUI repo id-path of our master report will be: /x/y/z/
(which is not to be confused with the visual named path, which could be e.g. Financial Reports/Expenses/Current Year)
In general: Jasper Studio, JasperServer
(and other "Jasper runtime environments" like custom Java Jasper package usage):
it seems a good practice to declare a report parameter "prefix" which can vary depending on your Jasper runtime environment e.g. named BASE_DIR
important here is that it seems best to assume the trailing / may be included1, because there are cases where you may have/want to use it in a way where it should be an empty or "unslashed" path expression
e.g. $P{BASE_DIR} + "subrep1.jrxml" which should resolve to
repo:subrep1.jrxml
see e.g. here for more details (look for SUBREPORT_DIR)
(1: which I personally find a bad practice in general (not looking at Jasper Reports in this respect) when dealing with directory-like structures)
JasperStudio Designer (Eclipse Plugin)
(the official iReport successor, with loads more functionality)
(if you do not use the preview functionality this may be uninteresting to you)
unfortunately I found no practical way to fully support (normal) "team development" with subreports (and likely other relative resources as well), meaning here the (currently unknown to me) nonexistent possibility to separate local paths from the *.jrxml files :-(
e.g. if you have a version control system in place and work in different environments (different local paths to repos and/or different developers), the master report has to contain a local path to your subreport in some way
I tried different approaches that failed:
relative path expressions in BASE_DIR do not work since the working directory is the eclipse dir, e.g. C:\eclipse
Eclipse->Window->Preferences->JasperStudio->Properties->Add e.g. my.base.dir
it is not available in the Preview mode, e.g. via new java.io.File(System.getProperty("my.base.dir")).getCanonicalPath() + "/" for our BASE_DIR expression (these props may be only used by the designer itself, but not set in preview runs)
just in case you stumble upon it (as I did): Eclipse->Window->Preferences->JasperStudio->Report Execution->Virtualizer Temporary Path is something unrelated (not useful here) dealing with the storage of the report result "caching"
of course I could write an ANT task to replace these local patterns based on a regexp-filtered copy on every usage/checkout, but that does not seem a good way to handle this
if you solely want to work with *.jrxml files (as I do3) you have to reference some subrep1.jrxml like this: net.sf.jasperreports.engine.JasperCompileManager.compileReport($P{BASE_DIR} + "subrep1.jrxml")
(3: I don't need the *.jasper files explicitly and do not see why I would want to deal with them. BTW the JasperServer web GUI only seems to support the upload of *.jrxml files)
JasperServer Web GUI
(e.g. provided by some Tomcat application server and storing its data in some postgres database)
Scenario 1: reference attached subreport resource(s)
if you do not want to reuse your report in general, it seems fine to add your subreport to your master report (so it is not visible in the GUI repo tree - see the subitem below for how you could reference it outside of your master anyway)
if you attach your subreport it should in general have its file name as its resource id, e.g. our subrep1.jrxml from above is uploaded with a resource id of subrep1.jrxml (thus making the handling of local design references and server references less complicated)
taking the example reports from above we have to set our BASE_DIR to repo: in the to-be-uploaded master report
thus the subreport expressions $P{BASE_DIR} + "subrep1.jrxml" and $P{BASE_DIR} + "somesubdir/subrep2.jrxml" should work on the server as well
NOT recommended!: you could still reference these reports from other reports with absolute paths like this2: repo:/x/y/z/masterrep.jrxml_files/masterrep.jrxml_
(2: which I would not recommend in this case; it's undocumented and may change; better put your subreports then into the "GUI repo path" as described below)
Scenario 2: reference repo subreport resource(s)
suppose we upload our subreports to the master repo id-path /x/y/z/ (as shown on top)
again we have to differentiate two different use cases
we do NOT want to use the subreport as a standalone report (it will always only be included in other master reports)
in this case we should upload it using Add Resource->File->JRXML and reference it
../subrep1.jrxml or ./subrep1.jrxml do not work, since it seems the underlying logic cannot handle the relative path expression .. (and likely not . either) (which would actually be nice :-( )
so what we have to do here is to supply an absolute canonical path in the BASE_DIR of our masterrep.jrxml, e.g. repo:/x/y/z/
we want to use the subreport as a standalone report as well
in this case we should upload it using Add Resource->JasperReport
this obviously creates a hidden folder repo:/x/y/z/subrep1.jrxml_files containing the report itself and other resources
that's why we not only have to adjust the BASE_DIR (as above), but also the subreport expression to, e.g. $P{BASE_DIR} + "subrep1.jrxml_files/subrep1.jrxml_" (which points to the subreport itself)
and maybe remove the net.sf.jasperreports.engine.JasperCompileManager.compileReport(...) wrapper function, because the server does this automatically for *.jrxml files
I did not fully investigate some other approaches (which I likely used incorrectly) that did not work for me to solve the mentioned problems (maybe somebody else has some outcome/corrections here):
$P{REPORT_FILE_RESOLVER}.resolveFile("subrep1.jrxml") (NullPointerException)
resulting in empty subreport sections in master report:
$P{REPORT_CONTEXT}.getRealPath("subrep1.jrxml")
$P{REPORT_CONTEXT}.getProperty("REPORT_FILE_RESOLVER").resolveFile("subrep1.jrxml")
Additional hints
Since I like to automate the report design and deployment process as much as makes sense, I wrote some ANT tasks that handle the transformations from local *.jrxml files to deployable *.jrxml files regarding the BASE_DIR and the other adjustments.
SQL that is helpful for easily investigating the resource id path structures in a JasperServer Postgres metadata database (connecting to something like jdbc:postgresql://myjasperhost/jasperserver, e.g. with the postgres user):
select
f.id as folder_id,
r.id as res_id,
case when f.hidden = true then 1 else 0 end as hidden,
f.uri||case when f.uri = '/' then '' else '/' end||coalesce(r.name,'') as res_uri,
r.resourcetype,
r.creation_date,
r.update_date,
f.uri,
r.name,
-- less important
r.version,
r.parent_folder,
r.childrenfolder,
f.parent_folder,
f.version,
f.name
-- select *
from jiresourcefolder f
left outer join jiresource r on (r.parent_folder = f.id)
where not f.uri like '/themes%'
order by f.uri||coalesce(r.name,'')
Related Questions
Questions on the Jaspersoft forum related to this one include:
http://community.jaspersoft.com/questions/525466/proper-way-include-subreports
http://community.jaspersoft.com/questions/530526/subreport-could-not-load-object-location
http://community.jaspersoft.com/questions/517832/subreports-ireports
http://community.jaspersoft.com/questions/537611/sub-report-jrxml-jasper
http://community.jaspersoft.com/questions/534861/unable-compile-master-report-pls-advise
http://community.jaspersoft.com/questions/817852/databasetimezone
http://community.jaspersoft.com/questions/819343/comjaspersoftjasperserverapijsexception-error-filling-report-and
http://community.jaspersoft.com/questions/536251/solved-subreport-not-running-jasperserver
http://community.jaspersoft.com/questions/536218/resolved-problem-subreport-reference-after-exporting-ireport-jasperserver#81141
http://community.jaspersoft.com/questions/527109/subreport-problem
http://community.jaspersoft.com/questions/522331/atomatically-compile-subreports
Not sure if this mechanism works in all cases but it certainly works for JasperSoft Studio 5.6.0 and Jasper Reports Server 5.6.0.
Essentially we need a simple way to detect that we are running on the server - I use the presence (or absence) of the $P{REPORT_CONTEXT} parameter which experiments show is present on the server but not present during preview.
<parameter name="OnServer" class="java.lang.Boolean" isForPrompting="false">
<parameterDescription><![CDATA[Are we running on server]]></parameterDescription>
<defaultValueExpression><![CDATA[Boolean.valueOf($P{REPORT_CONTEXT}!=null)]]></defaultValueExpression>
</parameter>
Once you have that you can then define the location of your subreport from a choice of two:
<parameter name="SubReportProducts" class="java.lang.String" isForPrompting="false">
<parameterDescription><![CDATA[The products subreport]]></parameterDescription>
<defaultValueExpression><![CDATA[$P{OnServer}.booleanValue() ? "repo:OrderPicksheetProducts.jrxml" : "OrderPicksheetProducts.jasper"]]></defaultValueExpression>
</parameter>
And then include the sub report:
<subreportExpression><![CDATA[$P{SubReportProducts}]]></subreportExpression>
You can then use Preview in studio and all still works when you deploy to server.
I'm not 100% sure of this answer, but: you have to upload your subreport as a JRXML resource and put "repo://subreport.jrxml" to get it to work.
If you read this one of these days, tell me if it worked or what solutions you found.
Regards
Try removing the extension completely and use "repo:/subreportFolder/subreportName". The main report pulls the jasper file in iReport, but on the jasperserver you upload the jrxml.