Exporting individual Cognos Reports via command line - version-control

I'm trying to work out how I can export individual Cognos Reports via the command line, for the purposes of source versioning in Git at a report-by-report level. I presume XML would be the output format.
I read that the Cognos SDK can help, but you need to build your own solution. That may be possible, but this use case feels common enough that many others would already want it and tooling would already exist.
Of course, importing the individual report would also be needed.
Can anyone help here please?
Thanks.

If your end game is version control (Who changed what, when?), you should look into MotioCI. Last time I looked, there was no free version of MotioCI.

You can use tools like the ones provided by http://www.motio.com. With the free version you can export the XML of reports, but only one at a time.
You can also create a Cognos deployment of the reports, which generates a zip file containing the report XML, but all the reports end up in the same file and you have to extract the XML of individual reports by hand.

I found the SDK to be cumbersome and, when I got it working, slow.
Yes, report specs are XML.
I have created a process that produces output like what you are asking for. Here's what it involves:
1. A recursive common table expression (CTE) query to get the report specs along with the folder structure as seen in Cognos.
2. A PowerShell script to run the query and write the results to the file system.
3. Another PowerShell script to pull the current content from the remote git repo, run the first PowerShell script, then add, commit, and push the results up to the remote git repo.
(A rough sketch of these steps follows.)
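As a minimal sketch only, not my actual routine: the Content Store table and column names below (CMOBJECTS, CMOBJNAMES, CMOBJPROPS7, SPEC) are assumptions that vary by Cognos version, so verify them against your own Content Store before trying anything like this.

$query = @"
WITH tree (CMID, FolderPath) AS (
    -- Anchor: hypothetical root condition; adjust for your Content Store
    SELECT o.CMID, CAST(n.NAME AS NVARCHAR(MAX))
    FROM CMOBJECTS o
    JOIN CMOBJNAMES n ON n.CMID = o.CMID AND n.ISDEFAULT = 1
    WHERE o.PCMID = 0
    UNION ALL
    -- Recurse: build the folder path as seen in Cognos
    SELECT o.CMID, CAST(t.FolderPath + N'/' + n.NAME AS NVARCHAR(MAX))
    FROM CMOBJECTS o
    JOIN CMOBJNAMES n ON n.CMID = o.CMID AND n.ISDEFAULT = 1
    JOIN tree t ON o.PCMID = t.CMID
)
SELECT t.FolderPath, p.SPEC
FROM tree t
JOIN CMOBJPROPS7 p ON p.CMID = t.CMID   -- SPEC assumed to hold the report XML
WHERE p.SPEC IS NOT NULL;
"@

# Step 2: write each spec to the file system, mirroring the folder structure.
# SPEC may come back as a byte array if the column is a BLOB; decode as needed.
Invoke-Sqlcmd -ServerInstance 'SQLHOST' -Database 'ContentStore' -Query $query |
    ForEach-Object {
        $file = Join-Path 'C:\CognosDump' ($_.FolderPath + '.xml')
        New-Item -ItemType Directory -Path (Split-Path $file) -Force | Out-Null
        Set-Content -Path $file -Value $_.SPEC
    }

# Step 3, roughly: sync the dump folder with the remote git repo
git -C C:\CognosDump pull
git -C C:\CognosDump add -A
git -C C:\CognosDump commit -m "Cognos content $(Get-Date -Format s)"
git -C C:\CognosDump push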
I also wrote a PowerShell script to perform the operations associated with git push. This uses a program I found called HTML Tidy (http://tidy.sourceforge.net/) to make the XML human-readable, which helps with diffs in git. I use TFS, so I get a nice side-by-side diff if I have tidied the XML. (Otherwise, it tells me the only line of XML has changed.)
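The tidy pass itself can be a one-liner; a sketch (paths are illustrative, the Tidy options are standard):

Get-ChildItem 'C:\CognosDump' -Recurse -Filter *.xml |
    ForEach-Object { & tidy -xml -i -m -q -wrap 0 $_.FullName }
# -xml: input is well-formed XML; -i: indent; -m: modify in place;
# -q: quiet; -wrap 0: never wrap long lines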
I recently added output for dashboards (exploration) and data sets (dataSet2). Dashboards are stored as JSON, so my routine had to tidy that (simple in PowerShell).
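For the JSON, a round-trip through PowerShell's built-in converters is enough to pretty-print it (file name hypothetical):

$path = 'C:\CognosDump\Dashboards\Sales.json'
(Get-Content $path -Raw | ConvertFrom-Json) |
    ConvertTo-Json -Depth 100 |
    Set-Content $path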
I run my routine daily, getting new and modified content from the last 3 days (just in case), and weekly to do an entire dump (to capture the deletes). The weekly process takes about six minutes. The daily process is negligible.
Before you ask: I hesitate to provide actual code because I can't take any responsibility for your system.
Updates:
Hacking away at the Content Store database is not recommended and it is not supported by IBM.
For reference/comparison: I'm running IBM Cognos 11.0.7 on IIS on Windows 2012 R2 with the Content Store database on MS SQL Server 2016. Your system may be different.
Additional Resources
https://www.cognoise.com/index.php/topic,28289.msg113869.html#msg113869
https://www.cognoise.com/index.php/topic,17411.msg50409.html#msg50409
https://learn.microsoft.com/en-us/powershell/scripting/overview?view=powershell-6
https://learn.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-2017
https://git-scm.com/docs
http://tidy.sourceforge.net/

Related

Change Properties of multiple diagrams in Enterprise Architect

I would like to change the properties of multiple diagrams together rather than clicking on them one by one. Does anyone know how this can be achieved?
You can use the scripting facility of Enterprise Architect to loop over the diagrams you would like to change and update them.
See the scripting section of the manual for help.
There are a bunch of example scripts included with EA, either in the local scripts or in the EAScriptLib MDG.
Another source of examples is my Github repository: https://github.com/GeertBellekens/Enterprise-Architect-VBScript-Library
You could write SQL to manipulate your database directly. t_diagram.PDATA holds a long cryptic string in which one part is ScalePI=0; (the default, meaning no scaling). You can alter that to ScalePI=1; (meaning scale to one page).
String manipulation functions vary from database to database, so you need to write your own statement, which you can execute in a script using
Repository.Execute("UPDATE t_diagram ...")
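As a hedged illustration, the full statement might look like the following, using the ScalePI substitution described above (SQL Server syntax; REPLACE support and string functions vary by DBMS):
Repository.Execute("UPDATE t_diagram SET PDATA = REPLACE(PDATA, 'ScalePI=0;', 'ScalePI=1;')")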
Note that you should test this in a sandbox first, since invalid SQL can easily disrupt your whole repository.

Can Tableau return non-UI results programmatically?

Tableau is an excellent tool for visualizing data. However, it is designed to be the final stop in a data (ETL) pipeline.
My Tableau workbook uses a bunch of Table Calcs to generate a list of "recommended orders". Rather than view these, I want to automate and execute them. This would make Tableau the engine of a quasi-ML process.
In other words, I would like to make Tableau a part of my ETL pipeline and send data to another tier. How can I write a back-end program that executes my Tableau workbook and receives a results dataset?
See the end of this article for example data I want to automate:
http://robm26.blogspot.com/2015/10/keep-your-factory-humming-with-tableau.html
Any ideas?
You're not going to like the answer I'm going to give you: "Don't do this."
Tableau isn't meant to be a task in a larger ETL pipeline, and the reason you're having problems making it behave the way you want is that it isn't designed to be used that way.
Above and beyond the fact that you've figured out how to get the result you want in Tableau ("the work is done"), Tableau isn't offering you any real value in the scenario you're describing. Use a tool (like Alteryx) that is really purpose-built for this sort of work.
The above answer is correct that tabcmd is the way to pull the data out. We use a function in Python to generate the tabcmd requests so that they can be batched.
import subprocess

# run_tabcmd is a configuration flag set elsewhere in our script
# ('yes' enables actual execution; anything else is a dry run)
def runTabCmd(cmd):
    # run a tabcmd command line and display its output
    print(cmd)
    if run_tabcmd == 'yes':
        p = subprocess.Popen(
            cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        for line in p.stdout.readlines():
            print(line.decode().rstrip())
You probably already knew that, but for us it was a way to completely automate pulling the data and loading it into another Python package, like scikit-learn, for a streamlined ML solution.
I'm editing this answer to agree with Russell's answer. Tableau is not an ETL tool and should not be used as such. If you absolutely have to do something, you can use what I provided. Otherwise, the best practice is to use a tool designed for the job.
You can easily use tabcmd to get the results of a view in CSV, which can be used later in your ETL process. If you need to automate it, you can write a script and execute it with a cron job. I, myself, have a few views that are exported to CSV and used later in my ETL stream to feed our CRM.
Just remember to create the view exactly as you want it exported to CSV, usually including the order of the fields. Another tip: I don't let it use the default "Measure Names" and "Measure Values"; to make sure everything comes out right in my CSV, I add the fields manually in the rows/columns sections.
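For reference, a minimal tabcmd session along those lines (server, credentials, and workbook/view names are all hypothetical) can be dropped into a script and run from cron or any shell:

tabcmd login -s "https://tableau.example.com" -u "etl_user" -p "secret"
tabcmd export "RecommendedOrders/Summary" --csv -f "C:\etl\orders.csv"
tabcmd logout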

Merging Quickbooks (QBW) files with generated report

I have a customer who accidentally wrote about 3 MB of data to the wrong QuickBooks file. They had a backup in the same folder for reasons unknown; however, their accountant was still writing to the old file. Now we have about a 3 MB difference between two ~250 MB QBW files, and I need to figure out how to merge these files (which QuickBooks does not support) and generate some sort of report so they can get their accounting info semi-straightened out in some sort of organized fashion. Any help would be appreciated. Thank you for taking the time to read this.
(EDIT) To explain the last few sentences above: they have conflicting invoice numbers, and possibly other conflicts, from the period when both files were in use.
Karl Irvin has a Data Transfer Utility that can be used to transfer transactions and list items between QBW files. www.q2.us - his tools are widely used and very reliable.
He also has a report combiner tool, if all you want to do is see reports that are taken from data in two files.
QQube (www.clearify.com) can also generate reports from multiple QBW files.

SSRS Dynamic Filenames for Email Subscriptions

We are in the process of migrating our reports from Crystal Reports to SSRS. In Crystal Reports we use variables to dynamically generate our filenames so when the report gets sent out via email, the file has the report name and execution date. (e.g. MonthlyReport09-07-2012.xls).
Is this possible in SSRS? I don't see any straightforward approach to using variables in the filename when subscribing to a report. This could prove troublesome when sending multiple reports with the same filename to the same person because it would be difficult to discern which report is which.
Any help is greatly appreciated. Thank you SO.
There is no such feature in SSRS out of the box, but there is a workaround. You have two options:
Option 1:
Instead of emailing the report directly, first dump the file to a file share location, something like \\machine-name\ExportReports\ReportName\, then create a Windows job that renames the file to the format you want and emails it in the next step (a sketch follows).
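A sketch of such a job as a PowerShell step (every name below is hypothetical):

# Rename the latest dump with the execution date, then email it
$share = '\\machine-name\ExportReports\MonthlyReport'
$new   = "MonthlyReport$(Get-Date -Format 'MM-dd-yyyy').xls"
Rename-Item -Path (Join-Path $share 'MonthlyReport.xls') -NewName $new
Send-MailMessage -SmtpServer 'smtp.example.com' -From 'reports@example.com' `
    -To 'team@example.com' -Subject $new -Attachments (Join-Path $share $new)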
Option 2:
Refer to this blog post; what you want starts from the section "Generate a PDF output file programmatically". You can use that code in an assembly, then have some scheduling mechanism pick up the schedule and call the DLL, which generates the report and emails it.
Use @timestamp in the name of the file and it will be translated at run time.
You cannot specify the report filename in a standard subscription in Reporting Services.
If you have Enterprise edition (or SQL 2012 Business Intelligence edition) you can use the Data-Driven Subscriptions features that allows you to specify the report filename (and other properties) based on data retrieved from a table.
If you have Standard edition, then your options are either of the ones suggested by Bhupendra, or you could look at scripting the report generation using the "rs.exe" utility supplied with Reporting Services and use Database Mail and SQL Server Agent to handle the emailing and scheduling.
This post looks pretty old, but better late than never...
There are some tools on the market that can run SSRS reports: CRD, R-Tag, and RemiWare.
These are desktop tools, but I guess you are not looking to replace SSRS, just to extend it.
I am not sure about CRD and RemiWare, but R-Tag supports data-driven reports and dynamic names. It can also be used with a Standard license.
I was able to automate the emailing and change the file name by looping through a table of accounts, invoices, and emails, setting the parameters, and renaming the report name and path name in the Catalog table before execution. After each execution I waited 2 seconds, then went on to the next loop iteration. At the end I set the path and name back to their original values. It performed well.
@timestamp works for Windows File Share delivery, as answered by Chris.
For email deliveries, you could use:
@ReportName - specifies the name of the report.
@ExecutionTime - specifies when the report was executed.
For more details, see the MS Docs.
You can do this by changing the report name directly in the table [ReportServer].[dbo].[Catalog].
Make sure to add a forward slash to the Path value.
For example, I created a SQL job that ran every night before the report and added the date to the filename.
Sample job (replace the [ItemID] value with the one for your report):
DECLARE @PREFIX VARCHAR(25) = 'REPORT_NAME_';            -- VARCHAR avoids CHAR trailing-space padding in CONCAT
DECLARE @SUFFIX CHAR(8) = CONVERT(CHAR(8), GETDATE(), 112); -- yyyymmdd
DECLARE @VARNAME VARCHAR(33) = CONCAT(@PREFIX, @SUFFIX);
DECLARE @VARPATH VARCHAR(34) = CONCAT('/', @PREFIX, @SUFFIX);
UPDATE [ReportServer].[dbo].[Catalog]
SET
    [Name] = @VARNAME,
    [Path] = @VARPATH
WHERE [ItemID] = '63D051EE-3139-4F50-ADBB-1C944F3D5D47';

Given code base hosted on TFS, which command can tell me which file has changed most?

I want to find the files under a given directory that have been updated the most. Is there any command that can display this info? Or is there any way to get the version count for a given file, so I can write a script to collect this for all files and then sort descending?
Do you mean changed the most number of times, or undergone the most code churn?
Either way, looking at the report data might be the easiest option for you. Take a look at the following blog post I wrote explaining how to use Excel to look at TFS data; it uses churn as an example, allowing you to drill down into folders and files, but you should be able to get the data you are looking for.
Getting Started with the TFS Data Warehouse
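If you do want the script route from the question, a rough sketch using tf.exe's history command (assumes a mapped local workspace; the header-line arithmetic for /format:brief output is approximate):

# Count check-ins per file and list the most-changed files first
Get-ChildItem 'C:\ws\src' -Recurse -File | ForEach-Object {
    $history = & tf history $_.FullName /noprompt /format:brief
    [pscustomobject]@{
        File    = $_.FullName
        Changes = [Math]::Max(($history | Measure-Object -Line).Lines - 2, 0)
    }
} | Sort-Object Changes -Descending | Select-Object -First 20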