Why does Rational Team Concert change the files' last-modified attribute? - version-control

I'm having some issues with the installation of Rational Team Concert on my server.
The thing is that when I upload any kind of change to the server, it changes the file's last-modified attribute, which it shouldn't.
Is there a way to avoid this behavior?
Thank you in advance!

This is something that we have tried to add to RTC SCM (and we still plan to). However, we found that it needs to be an option on load/update.
There are numerous details and discussions available in this work item on jazz.net.

Regarding timestamps: setting aside the fact that relying on them in a version-control tool isn't generally considered a best practice (see "What's the equivalent of use-commit-times for git?"), it is actually a complex issue:
an SCM loader wouldn't use just the timestamp to determine which files have changed (Task 179263)
you can have various requirements for that timestamp (like in Defect 159043, where the file timestamp of the modified file on disk should be that of when it was delivered, not when it was accepted). The variable JAZZ_CCM_SKIP_MOD_TIME=true is mentioned there, so check whether it could improve your specific case (see the sketch after this list).
it is all based on the assumption that the timestamp is correctly set by the local workstation, which isn't always true, as illustrated in Task 77201
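If that variable helps in your case, one hedged way to try it is to set it in the environment before launching the RTC client; whether the client actually reads it from the process environment is an assumption here, so verify against the jazz.net work item:

    # Hypothetical sketch (PowerShell): set the variable, then start the
    # RTC client from this same shell so it inherits the value.
    $env:JAZZ_CCM_SKIP_MOD_TIME = 'true'
    # ...now launch the RTC Eclipse client or the scm CLI from this session.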

Related

In which file is the _AppInfo data stored in Beckhoff TwinCAT 3 PLC?

I'm looking for the 'AppTimeStamp' information so it can be used to verify that the code has not been updated/changed by service personnel.
Detect code changes on Beckhoff PLC using C#
At that location I already found part of the information I need, but I was not able to add a comment due to the 'new user' limitations.
You can find the AppTimestamp in the _AppInfo instance.
So just call _AppInfo.AppTimestamp in your program to know the time of the last application start.
Make sure you also check the number of online changes since last download with the OnlineChangeCnt counter which you will also find in the _AppInfo instance.
There are many places where this value could be saved. TwinCAT saves data to the C:\TwinCAT\3.1\Boot folder; the different files are explained here.
The ProjectName can be found, for example, in the configuration data (CurrentConfig.xml), at the end of the file (TcBootProject/ProjectInfo/ProjectName). The same file contains one date (<TcBootProject CreateTime="2019-06-10T13:14:17">), but it seems to be the build time of the boot project.
I couldn't find the date of AppTimestamp in any file, but perhaps TwinCAT uses the creation time of the files in those folders? Or perhaps it's hidden in the binary somewhere.
When you update the software without updating the boot project, the file Port_851_act.tizip is updated, so you can check its timestamp. When you update the boot project too, Port_851_boot.tizip and other files are also updated.
So basically, to check whether the code has been updated by someone, check the modified dates of the files under the Boot directory. I suppose only the .bootdata files should update, as they contain saved persistent data. Of course, the dates can easily be changed with a third-party program, so one solution is to compare the contents of the Port_851.crc file, since it contains the CRC check value of the code. It will always change when the boot project is updated. A sketch of such a check follows.
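A minimal PowerShell sketch of that idea, assuming the default Boot path and the port 851 file names from above; the baseline location is a hypothetical place where you previously saved a known-good copy of the CRC file:

    # List modified dates of everything under the Boot directory.
    $boot = 'C:\TwinCAT\3.1\Boot'
    Get-ChildItem -Path $boot -Recurse -File |
        Select-Object FullName, LastWriteTime

    # Compare the CRC file against a previously saved baseline copy
    # (C:\Baseline is a hypothetical location for that copy).
    $current  = Get-FileHash -Path (Join-Path $boot 'Port_851.crc')
    $baseline = Get-FileHash -Path 'C:\Baseline\Port_851.crc'
    if ($current.Hash -ne $baseline.Hash) {
        'Boot project has changed since the baseline was taken.'
    }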

Exporting individual Cognos Reports via command line

I'm trying to work out how I can export individual Cognos Reports via the command line, for the purposes of source versioning in Git at a report-by-report level. I presume XML would be the output format.
I read that the Cognos SDK can help, but you would need to build your own solution. That may be possible, but this use case feels like something many others would already want, so I'd expect tooling to exist already.
Of course, importing the individual report would also be needed.
Can anyone help here please?
Thanks.
If your end game is version control (Who changed what, when?), you should look into MotioCI. Last time I looked, there was no free version of MotioCI.
You can use tools like the ones provided by http://www.motio.com. With the free version you can export the XML of the reports, but only one by one.
You can also use a Cognos deployment of the reports, which generates a zip file with the XML of the reports; but all the reports are in the same file, and you would have to extract the XML of the individual reports by hand.
I found the SDK to be cumbersome and, when I got it working, slow.
Yes, report specs are XML.
I have created a process that produces output like what you are asking for. Here's what it involves:
A recursive common table expression (CTE) query to get the report specs along with the folder structure as seen in Cognos.
A PowerShell script to run the query and write the results to the file system.
Another PowerShell script to pull the current content from the remote git repo, run the first PowerShell script, then add, commit, and push the results up to the remote git repo. (A minimal sketch of the middle step follows this list.)
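This is only a hedged sketch of the query-and-write step, not the actual process described above: the server, database, view name, columns, and output path are all hypothetical placeholders, since the real Content Store query is far more involved.

    Import-Module SqlServer

    # Hypothetical query standing in for the recursive CTE against the
    # Content Store; substitute your own read-only query.
    $rows = Invoke-Sqlcmd -ServerInstance 'SQLHOST' -Database 'ContentStore' -Query @'
    SELECT FolderPath, ReportName, ReportSpec FROM dbo.SomeReportView
    '@

    foreach ($row in $rows) {
        # Mirror the Cognos folder structure on disk and write each spec.
        $dir = Join-Path 'C:\ReportExport' $row.FolderPath
        New-Item -ItemType Directory -Path $dir -Force | Out-Null
        Set-Content -Path (Join-Path $dir "$($row.ReportName).xml") -Value $row.ReportSpec
    }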
I also wrote a PowerShell script to perform the operations associated with git push. This involves a program I found called HTML Tidy (http://tidy.sourceforge.net/) that makes the XML human-readable, which helps with diffs in git. I use TFS, so I get a nice side-by-side diff if I have tidied the XML. (Otherwise, it tells me that the only line of XML has changed.)
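For example, one plausible way to invoke it (assuming the tidy executable is on your PATH; check your tidy version's documentation for the exact flags):

    # Re-indent a report spec in place so git diffs are line-by-line.
    tidy -xml -indent -quiet -modify report.xml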
I recently added output for dashboards (exploration) and data sets (dataSet2). Dashboards are stored as JSON, so my routine had to tidy that (simple in PowerShell).
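Pretty-printing JSON really is simple in PowerShell; a sketch with a hypothetical file name:

    (Get-Content dashboard.json -Raw | ConvertFrom-Json) |
        ConvertTo-Json -Depth 100 |
        Set-Content dashboard.json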
I run my routine daily, getting new and modified content from the last 3 days (just in case), and weekly to do an entire dump (to capture the deletes). The weekly process takes about six minutes. The daily process is negligible.
Before you ask: I hesitate to provide actual code because I can't take any responsibility for your system.
Updates:
Hacking away at the Content Store database is not recommended and it is not supported by IBM.
For reference/comparison: I'm running IBM Cognos 11.0.7 on IIS on Windows 2012 R2 with the Content Store database on MS SQL Server 2016. Your system may be different.
Additional Resources
https://www.cognoise.com/index.php/topic,28289.msg113869.html#msg113869
https://www.cognoise.com/index.php/topic,17411.msg50409.html#msg50409
https://learn.microsoft.com/en-us/powershell/scripting/overview?view=powershell-6
https://learn.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-2017
https://git-scm.com/docs
http://tidy.sourceforge.net/

CVS synchronize/update issue

I am using CVS as a version control system and am facing a strange issue. For some files, I am not able to synchronize or update (using Eclipse) because of the following error:
"[Project Name]: cvs [update aborted]: cannot create .#lang_en.properties.1.1.2.3.2.7.2.2.2.3.2.1.2.1.2.3.8.1.2.4.6.12.2.3.4.1.4.3.2.6.2.13.4.4.4.1.2.9.2.2.2.1.8.1.8.1.14.1.8.3.26.1.8.1.4.4.6.17.4.2.6.6.6.3.2.2.2.2.10.2.2.2.2.2.2.9.2.7.2.1.4.10.4.2.2.3.4.4.2.2.2.1.2.1.10.2.8.1.6.1.4.1.4.2.6.1.2.1.2.2.4.5.4.1 for copying: File name too long"
According to my observation, this happens with frequently committed files. Someone on the team commits such a file (which works), but then when someone else tries to sync or update, they simply get the 'file name too long' error. I would like to clarify that in the example above the file name is only 'lang_en.properties'.
I am not sure how to resolve this issue. I have even tried deleting the file from CVS and then recreating it with the same name (that name is required), but the same revision history appears again. Any help would be appreciated.
A file named like .#<filename>.<revision> is created when you do a cvs update and there are changes to your checked-out file. This is effectively a backup of the version you had, in case the update did something you didn't want (e.g. introduced a conflict that you are having trouble resolving). It allows you to roll back an update.
The simplest way to address this is to remove the local file before doing an update; that way there is no need for CVS to create this backup file.
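As a hedged sketch (PowerShell, run from the directory containing the file; the file name is taken from the question, and note this discards any local changes to it):

    Remove-Item 'lang_en.properties'   # discard the local copy so CVS needn't back it up
    cvs update lang_en.properties      # fetch a fresh copy from the repository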
According to my observation, this happens with frequently committed files.
This is not caused by frequent commits. The revision id increases sequentially every time you commit, e.g. 1.1 -> 1.2 -> 1.3 and so on. Extra digits are added when you branch. For example, if you took a branch off the 1.3 version of the above file, then the revision numbers would be 1.3.1.1 -> 1.3.1.2 -> 1.3.1.3 etc.
I don't know how you are working, but your project seems to have introduced an impressive level of branching. Until you address that workflow, you are going to keep hitting this problem almost every time you attempt an update. You have hit the 255-character filename limit that exists on many file systems.

SSRS Error: "One or more parameters required to run the report have not been specified. (rsParametersNotSpecified)"

Okay, there are similar questions to this, but this is NOT a duplicate. This error seems to come up when you have parameters referencing a shared dataset. Deleting the report from the server and redeploying does not fix it in my case.
I am developing in VS 2010 Professional with Business Intelligence Development Studio (BIDS), which is under source control with Team Foundation Server. I am deploying to a 2008R2 server, which I thought might be the issue. The workaround is to change the dataset references to be embedded instead, which stops this error dead in its tracks, but that is pretty poor in my opinion and I would ultimately like to have this work with shared datasets.
Things I have tried:
Ensured the naming of the dataset matches the reference, e.g. "Name is ClientQuery, shared dataset is ClientQuery".
Ensured the naming on the server matches the references in step 1.
Confirmed that this is what is breaking it by removing the reference to the shared dataset; it works right away then.
Ensured that the shared dataset is not enabling some type of caching on the server.
I had a filter on a second shared dataset limiting scope; I removed that and there was still an error.
Removed all parameters and added only a single shared dataset; it gives the error right away.
Added an option to the parameter bindings to "Allow Empty values". Did this with Nulls as well.
Recreated EVERYTHING, a whole brand-new RDL file, and copied and pasted only elements on the body of the report, but explicitly created the parameters and the datasets, and this STILL HAPPENED.
UPDATED: I have tried the old trick of destroying the RDL and then redeploying; I found that suggested a lot online. It does not work in this case. It is almost like this reference in the RDL:
    <DataSet Name="ClientQuery">
      <SharedDataSet>
        <SharedDataSetReference>ClientQuery</SharedDataSetReference>
      </SharedDataSet>
      <Fields>
        <Field Name="CUSTOMER_ID">
          <DataField>CUSTOMER_ID</DataField>
          <rd:TypeName>System.String</rd:TypeName>
        </Field>
        <Field Name="CUSTOMER_NAME">
          <DataField>CUSTOMER_NAME</DataField>
          <rd:TypeName>System.String</rd:TypeName>
        </Field>
      </Fields>
    </DataSet>
It appears that somehow the mention of this reference causes havoc. I examined the bin (environment) directory under my project (I deploy for multiple environments and set up QA, UAT, PROD, etc. under solution configurations). Each time, the RDL was updated as it should be and contained the updates I described. I think 'rebuild' explains a lot of the cases where people see their report files not updating on a server; in my case a rebuild usually gets the updates into the RDL, versus just hitting deploy first.
While all of this is happening, the hard part is that the report works seamlessly in BIDS through every change. So the error deals entirely with what the source server believes the RDL data to represent.
Any help is much appreciated. I would rate myself advanced at SSRS, but this one has me stumped as to what the error is referencing that it is not getting.
I know this is an old question, but I just ran across this and was able to resolve my issue, so I thought an updated option was warranted for others struggling with it. My issue had to do with the parameter settings in the Shared Dataset properties dialog (the original post included a screenshot of that menu).
Specifically, make sure that you check the "Allows null value" option where needed. This instantly resolved my issue, where a report would not work when pointing to a shared dataset but did work when the dataset was embedded.
Okay, so the answer Jeroen and others proposed is half right. My issue was that my source code was under an older SVN source control and deployed to an SSRS 2008 server, and then we migrated the code base to TFS source control. The issue appears to be that the shared datasets were believed to have different identifiers than they actually had. The simple workaround, IN ADDITION to deleting the files, is to redeploy the shared datasets as well. In my case I went into my project settings and deployed them to a different location entirely under the report structure, to keep them in the same area: Reports/Datasets instead of just Datasets. This seems to clear up the issue in my case, so I believe this was just a perfect storm. When in doubt with SSRS, just delete everything and start from the ground up, I guess.

eclipse CVS usage: clean timestamps

During synchronization with the CVS server, Eclipse compares the contents of the files (internally it uses CVS commands, of course). But files without any content change are also shown as different if they have a different timestamp, because they have been "touched". You then always have to check manually, via the file comparison dialog, whether there was really a change or not.
Due to auto-generation, I have some files that always get new timestamps, and therefore I always have to check manually whether they really contain any change.
In the Eclipse documentation I read:
Update and Commit Operations
There are several flavours of update and commit operations available in the Synchronize view. You can perform the standard update and commit operation on all visible applicable changes or a selected subset. You can also choose to override and update, thus ignoring any local changes, or override and commit, thus making the remote resource match the contents of the local resource. You can also choose to clean the timestamps for files that have been modified locally (perhaps by an external build tool) but whose contents match that of the server.
That's exactly what I want to do, but I don't know how! There is no further description or manual.
Has anybody used this functionality who can help me (maybe even post a screenshot)?
Thanks in advance,
Mayoares
When you perform a CVS update on a project (using the context menu Team > Update), Eclipse implicitly updates the timestamps of local files whose contents match those on the server.