We are in the process of migrating our reports from Crystal Reports to SSRS. In Crystal Reports we use variables to dynamically generate our filenames so when the report gets sent out via email, the file has the report name and execution date. (e.g. MonthlyReport09-07-2012.xls).
Is this possible in SSRS? I don't see any straightforward approach to using variables in the filename when subscribing to a report. This could prove troublesome when sending multiple reports with the same filename to the same person because it would be difficult to discern which report is which.
Any help is greatly appreciated. Thank you SO.
There is no such feature in SSRS, but there is a workaround. You have two options.
Option 1:
Instead of emailing it directly, first dump the file to a file share location such as \\machine-name\ExportReports\ReportName\, then create a Windows job that renames the file to the format you want and emails it in the next step.
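A minimal sketch of that rename-and-email job step in T-SQL, assuming xp_cmdshell is enabled and Database Mail is configured; the share path, mail profile, and recipient address are placeholders:
-- Rename the fixed-name file the subscription dropped on the share (placeholder paths).
DECLARE @NewFile VARCHAR(260) =
    '\\machine-name\ExportReports\MonthlyReport\MonthlyReport'
    + CONVERT(VARCHAR(10), GETDATE(), 110) + '.xls';
DECLARE @Cmd VARCHAR(500) =
    'ren "\\machine-name\ExportReports\MonthlyReport\MonthlyReport.xls" "MonthlyReport'
    + CONVERT(VARCHAR(10), GETDATE(), 110) + '.xls"';
EXEC master..xp_cmdshell @Cmd;

-- Email the renamed file as an attachment (placeholder profile and recipient).
EXEC msdb.dbo.sp_send_dbmail
    @profile_name     = 'ReportsMailProfile',
    @recipients       = 'someone@example.com',
    @subject          = 'Monthly Report',
    @file_attachments = @NewFile;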
Option 2:
Refer to this blog; what you want starts from the section "Generate a PDF output file programmatically". You can use this in an assembly, then have some scheduling mechanism pick up the schedule, call the DLL that generates the report, and email it.
Use @timestamp in the name of the file and it will be translated at run time.
You cannot specify the report filename in a standard subscription in Reporting Services.
If you have Enterprise edition (or SQL 2012 Business Intelligence edition) you can use the Data-Driven Subscriptions feature, which allows you to specify the report filename (and other properties) based on data retrieved from a table.
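As a rough sketch (the Recipients table and its columns are hypothetical), the subscription query can build the filename on the fly; each column is then mapped to a delivery setting in the subscription wizard:
-- Hypothetical recipients table; [FILENAME] is mapped to the "File name" setting,
-- TO_ADDRESS to the recipient, RENDER_FORMAT to the output format.
SELECT
    r.Email                                                 AS TO_ADDRESS,
    'MonthlyReport' + CONVERT(VARCHAR(10), GETDATE(), 110)  AS [FILENAME],
    'EXCEL'                                                 AS RENDER_FORMAT
FROM dbo.Recipients AS r
WHERE r.IsActive = 1;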
If you have Standard edition, then your options are either of the ones suggested by Bhupendra, or you could look at scripting the report generation using the "rs.exe" utility supplied with Reporting Services and use Database Mail and SQL Server Agent to handle the emailing and scheduling.
This post looks pretty old, but better late than never...
There are some tools on the market which can run SSRS reports: CRD, R-Tag and RemiWare.
These are desktop tools, but I guess you are not looking to replace SSRS, just to extend it.
I am not sure about CRD and RemiWare, but R-Tag supports data-driven reports and dynamic names. It can also be used with a Standard license.
I was able to automate the emailing and change the file name by looping through a table of accounts, invoices, and emails, setting the parameters, and renaming the report name and path in the Catalog table before each execution. After each execution I waited 2 seconds and then moved to the next loop. At the end I set the path and name back to the original. It performed well.
@timestamp works for Windows File Share delivery, as answered by Chris.
For Email deliveries, you could use:
@ReportName - specifies the name of the report.
@ExecutionTime - specifies when the report was executed.
For example, a subject line of "@ReportName was executed at @ExecutionTime" expands to the report's name and its run time. For more details, see the MS Docs.
You can do this by changing the filename directly in the table [ReportServer].[dbo].[Catalog].
Make sure to add a forward slash to the Path
For example, I created a SQL job that runs every night before the report and appends the date to the filename.
Sample job (replace the [ItemID] with the one for your report):
-- VARCHAR avoids the trailing-space padding that CHAR would add to the concatenated name.
DECLARE @PREFIX VARCHAR(25) = 'REPORT_NAME_';
DECLARE @SUFFIX CHAR(8) = CONVERT(CHAR(8), GETDATE(), 112);    -- yyyymmdd
DECLARE @VARNAME VARCHAR(33) = CONCAT(@PREFIX, @SUFFIX);
DECLARE @VARPATH VARCHAR(34) = CONCAT('/', @PREFIX, @SUFFIX);  -- Path needs the leading forward slash
UPDATE [ReportServer].[dbo].[Catalog]
SET
    [Name] = @VARNAME,
    [Path] = @VARPATH
WHERE [ItemID] = '63D051EE-3139-4F50-ADBB-1C944F3D5D47';
I'm trying to work out how I can export individual Cognos Reports via the command line, for the purposes of source versioning in Git at a report-by-report level. I presume XML would be the output format.
I read that the Cognos SDK can help but you need to build your own solution, which may be possible but this use case feels like something many others would already want and there'd be tooling already.
Of course, importing the individual report would also be needed.
Can anyone help here please?
Thanks.
If your end game is version control (Who changed what, when?), you should look into MotioCI. Last time I looked, there was no free version of MotioCI.
You can use tools like the ones provided by companies like http://www.motio.com. With the free version you can export the XML of the reports but only one by one.
You can also use a Cognos deployment of the reports that generates a zip file with the XML of the reports, but all the reports are in the same file and you will have to extract the XML of the individual reports by hand.
I found the SDK to be cumbersome and, when I got it working, slow.
Yes, report specs are XML.
I have created a process that produces output like what you are asking for. Here's what it involves:
A recursive common table expression (CTE) query to get the report specs along with the folder structure as seen in Cognos (a loosely hedged sketch follows this list).
A PowerShell script to run the query and write the results to the file system.
Another PowerShell script to pull the current content from the remote git repo, run the first PowerShell script, then add, commit, and push the results up to the remote git repo.
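The CTE below is only a sketch of that first step: every Content Store object referenced here (CMOBJECTS, CMOBJNAMES, CMOBJPROPS7 and its SPEC column, and the root test on PCMID) is an assumption based on one particular Content Store layout, so verify the names against your own database before relying on it.
-- Assumed Content Store tables/columns; check against your own Content Store version.
WITH folders AS (
    -- anchor: top-level objects (assumed to be rows whose parent id is 0)
    SELECT o.CMID,
           CAST(n.NAME AS NVARCHAR(4000)) AS FullPath
    FROM dbo.CMOBJECTS AS o
    INNER JOIN dbo.CMOBJNAMES AS n
        ON n.CMID = o.CMID AND n.ISDEFAULT = 1
    WHERE o.PCMID = 0
    UNION ALL
    -- recurse: children inherit the parent's path
    SELECT o.CMID,
           CAST(f.FullPath + N'/' + n.NAME AS NVARCHAR(4000))
    FROM dbo.CMOBJECTS AS o
    INNER JOIN dbo.CMOBJNAMES AS n
        ON n.CMID = o.CMID AND n.ISDEFAULT = 1
    INNER JOIN folders AS f
        ON o.PCMID = f.CMID
)
SELECT f.FullPath,
       p.SPEC            -- assumed to hold the report specification XML
FROM folders AS f
INNER JOIN dbo.CMOBJPROPS7 AS p
    ON p.CMID = f.CMID;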
I also wrote a PowerShell script to perform the operations associated with git push. This involves using a program I found called HTML Tidy (http://tidy.sourceforge.net/) that can be used to make the XML human-readable. This helps with diffs in git. I use TFS, so I get a nice, side-by-side diff if I have tidied the XML. (Otherwise, it tells me the only line of XML has changed.)
I recently added output for dashboards (exploration) and data sets (dataSet2). Dashboards are stored as JSON, so my routine had to tidy that (simple in PowerShell).
I run my routine daily, getting new and modified content from the last 3 days (just in case), and weekly to do an entire dump (to capture the deletes). The weekly process takes about six minutes. The daily process is negligible.
Before you ask: I hesitate to provide actual code because I can't take any responsibility for your system.
Updates:
Hacking away at the Content Store database is not recommended and it is not supported by IBM.
For reference/comparison: I'm running IBM Cognos 11.0.7 on IIS on Windows 2012 R2 with the Content Store database on MS SQL Server 2016. Your system may be different.
Additional Resources
https://www.cognoise.com/index.php/topic,28289.msg113869.html#msg113869
https://www.cognoise.com/index.php/topic,17411.msg50409.html#msg50409
https://learn.microsoft.com/en-us/powershell/scripting/overview?view=powershell-6
https://learn.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-2017
https://git-scm.com/docs
http://tidy.sourceforge.net/
Whenever I build a report that uses an embedded dataset and try to use parameters (such as @StartDate and @EndDate), I receive an error stating that I must declare the scalar variable. However, this error only comes up if I set a data source that uses the "credentials stored securely in the report server" option. If I set the data source to use "Windows integrated security," I do not receive the error.
I am at a complete loss. These reports need to be accessed by a large number of people. We have granted them "Browser" privileges through an Active Directory group in SSRS, including on the data sources.
What is the best way to proceed? Is there an easy fix?
I generally deploy with the option already set by going into the data source, choosing the 'Log on to SQL Server' section > 'Use SQL Server Authentication', and setting the user and settings there. Using a Windows user as your main user after you deploy can cause issues.
The other question is whether this works correctly at all times in Business Intelligence Development Studio (BIDS) and just not on the server. It is very interesting that a permission issue alone would cause a scalar error to be returned. Generally, when users have to get to the report they may still get the error, but not storing the credentials merely prompts them for credentials. It would help to know more about the datasets and what they return, or are supposed to return. A start and end are typically defined as 'DateTime' parameters in SSRS and used in a predicate like 'WHERE thing BETWEEN @Start AND @End', with the dates chosen from a calendar by the user. If you are binding them to other datasets and there is the possibility of a user selecting multiple values, that could present an issue.
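For reference, a typical parameterized dataset query of the kind described above looks like the sketch below; the table and column names are placeholders.
-- Placeholder table and columns; @StartDate and @EndDate are the report parameters.
SELECT OrderId, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate BETWEEN @StartDate AND @EndDate;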
I took a look at the data source that had been set up by our DBA. It was set up as an ODBC connection. I changed it to Microsoft SQL Server and it works now. I do not understand why, and would appreciate it if a more seasoned individual could explain.
Hi, I developed a report that takes parameters dynamically with the help of a data-driven subscription.
When the subscription runs, it exports every user's report to Excel and places it in a Windows file share folder.
My issue is that my client doesn't want the report delivered if it is empty for a particular user.
I have to fix this in SSRS itself instead of making changes at the procedure (database) level.
I used the expression below at the SP level:
IF (@@ROWCOUNT = 0)
    RAISERROR('nodata', 16, 1);
Note: the same procedure is used for multiple purposes, and my DB developer will not accept changes at the SP level.
At report level, I am using the expression below (on the Hidden property) to hide the column headers if the report is empty:
=IIF(CountRows() = 0, True, False)
But that expression only produces a blank report, and an empty Excel sheet is still exported for the user.
Thanks in advance
Since you are already using a data-driven subscription, what you need to do is edit the subscription query so that it builds the list of recipient emails only from those recipients who have data. The way I do this is to join the table of recipients to the dataset the report returns.
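A minimal sketch of that join, assuming hypothetical Recipients and Invoice tables that mirror the report's own dataset:
-- The inner join drops recipients with no invoice rows, so they get no email.
SELECT DISTINCT
    r.Email     AS TO_ADDRESS,
    r.AccountId AS AccountParameter   -- mapped to the report parameter
FROM dbo.Recipients AS r
INNER JOIN dbo.Invoice AS i
    ON i.AccountId = r.AccountId
WHERE i.InvoiceDate >= DATEADD(MONTH, -1, GETDATE());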
Is it at all possible using CRM 2011 and SSRS to generate a report on a single record, and only get results for that one record?
EDIT
Additional Info - Must Use:
Custom SSRS report
Custom entity in CRM
Here's a more specific link to your question: link. You're probably looking for pre-filtering (look for "3. Pre-filtering Element" in the link provided) if you want the report to be record specific (context sensitive).
Here's a link describing the 2 types of pre-filters (CRM 4.0 but the theory applies to CRM 2011): link. And here's an example of prefiltering in CRM 2011: link
I have done this successfully in CRM 2011 with a completely custom report made in BIDS, on a custom entity, with full context sensitivity.
Make sure to learn FetchXML, as it's the going-forward technology for these reports. The existing reports use SQL, which makes them bad examples to copy from.
Here's an example on how to extract fetchXML from an advanced find: link It also has more information on pre-filtering.
Take a look at the report Account Overview.rdl. It can be executed for a single account record or for multiple records.
See Reporting for Microsoft Dynamics CRM Using Microsoft SQL Server Reporting Services
Create an embedded connection to the CRM database engine for the environment you want to target.
Create an embedded dataset to query the current record. This is going to feel a bit odd, since experience tells you such a query should return tons of records, but because of the way CRM pre-filters the view it will actually only return the current record. For example, to get the current quote you would use "SELECT quoteid FROM FilteredQuote AS CRMAF_Quote".
Add a parameter to store the reference to the entity you just queried. Keeping with this example, I created @QuoteFilter, which is of type Text, allows multiple values (even though that's not what we're using it for), and gets its default value from the dataset in step 2. You should probably also make it hidden, since GUIDs aren't end-user friendly.
Finally, use the parameter in the WHERE clause of the other datasets. For example, a search on quote products for the current quote would look something like SELECT * FROM FilteredQuoteDetail WHERE (quoteid = @QuoteFilter). The two dataset queries are sketched together below.
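Putting steps 2 and 4 side by side as a sketch (FilteredQuote, FilteredQuoteDetail, and the CRMAF_ alias come straight from the steps above; anything else is illustrative):
-- Dataset from step 2: captures the current record and feeds @QuoteFilter's default value.
SELECT quoteid
FROM FilteredQuote AS CRMAF_Quote;

-- Dataset from step 4: uses the captured id to pull the quote's products.
SELECT *
FROM FilteredQuoteDetail
WHERE quoteid = @QuoteFilter;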
As a final note, keep in mind that CRM loves to remember everything, even when you don't want it to. On one of my reports I messed up my data source and CRM was forever convinced that the report should run against all records. I fixed the data source, but uploading the report did not trigger a refresh and correct the problem. In the end, I deleted the report from CRM, created a new one, uploaded the exact same file with no changes, and everything worked. Go figure.
Is there a way to convince Crystal Reports to export a page / group / whatever to separate worksheets when exporting to Excel (Data Only)? I'm using the CR that came with VS2008 (version 10.5)
Thanks.
According to the documentation you cannot export a report directly to multiple worksheets in a single Excel workbook.
When the limit of 65,536 rows in Excel is reached, though, the exporter does create a new worksheet, but you are not in control of it :)
Update:
To create your own Excel merger:
PRE: Make sure you have the Office (Excel) SDK libraries installed.
PRE: Place the files that need to be merged in a single directory.
In a VS2008 solution:
Create a new empty Excel Workbook (variable: objNewWorkbook)
Loop through the files in the directory (where you placed the Excel files) and for each item:
Load the file as an Excel Workbook (variable: objWorkbookLoop)
Create a new Worksheet in objNewWorkbook (optionally: with the filename of objWorkbookLoop) (variable: objNewWorksheetLoop)
Copy the data from (probably sheet1 in) objWorkbookLoop to objNewWorksheetLoop
Finally save objNewWorkbook to a file.
One of the things everybody ignores is that Excel automation is not an acceptable solution. Yes, it works (almost always), but even Microsoft recommends against using it for unattended execution: http://support.microsoft.com/kb/257757
The only safe way I know to export a Crystal report to multiple worksheets is to create a grouped report and burst it using R-Tag Report Manager. This tool does not use Excel automation, so you can run your reports at any time and on the server, but if you are currently using other software to run your reports you will need to switch to this one (it is not an extension).
I know this thread is an old one, but I can see links to it without a real answer. Hopefully this will help somebody.