SSRS 2008 R2 Change Shared Data Source for Production vs Test

I am trying to figure out the best way to handle this situation. My project team consists of 3 developers, each with their own instance of SSRS installed. We have 2 external SSRS servers that we must push updates to so the customer can review and we can test, and a 3rd external server is coming online that will not be administered by us.
I have been trying to find a way to set the Shared Data Source to the current environment regardless of the system it is on. I had thought that a common naming convention for the ReportServer address would be enough, but we've already found the addresses to be inconsistent on the production and test servers. My next attempt was to specify an ODBC connection and let each person create a system DSN with their connection information, but after an entire day of messing with it and continually getting errors, I'm not convinced it's the way to go. (The most recent error being "The specified DSN contains an architecture mismatch between the Driver and Application".) I have tried going through the Windows ODBC Data Source Administrator to create the DSN and I have tried using Report Builder 3.0 to create one, and neither seems to work.
So I guess at this point I just have to ask: is there a best practice for going about this? I'd like to do local development and testing via the "Run" button inside Report Builder, then upload the file to Report Manager and have it work regardless of the Report Server's URL.

If the properties (connection string, etc.) for shared data sources don't change much on your servers, the following may work for you: in the properties for your project, set OverwriteDataSources = False for the appropriate configurations. Set it to True only temporarily, if you need to change the data source.
That way any dev can safely deploy to the servers, without affecting the data source, even if (s)he locally changed something (e.g. the connection string) to match a personal environment.
Not an optimal solution, but relatively easy to set up.
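For reference, these deployment options are stored per configuration in the SSRS project file (.rptproj). A rough sketch of the relevant fragment, assuming a hypothetical production server URL and the 2008-era element names:

<Configuration>
  <Name>Release</Name>
  <Options>
    <OutputPath>bin\Release</OutputPath>
    <TargetServerURL>http://prod-server/reportserver</TargetServerURL>
    <TargetDataSourceFolder>Data Sources</TargetDataSourceFolder>
    <!-- Leave false so deployments never clobber the server's data source -->
    <OverwriteDataSources>false</OverwriteDataSources>
  </Options>
</Configuration>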

Related

DataStage Connector Metadata import hangs when choosing any of the available connections

I have a sample job that successfully extracts data from an Oracle database via the ODBC connector, and I will mention that I manually added a column description for one of the columns in the Oracle table (I am also successful in extracting from a SQL Server db). I need to add table definitions, and I am attempting to use Import -> Table Definitions -> Start Connector Import Wizard. I receive a list of connectors to choose from, including "ODBC Connector". When I choose it and press Next, it just hangs and eventually times out, displaying a "not able to connect" message. This behavior occurs for other connectors as well, e.g. Oracle.
This has worked for us in the past; the problem has only started recently. We have tried using a user with more permissions, to no avail. Please note that the ODBC connector works fine for extracting data, just not for importing metadata. One of the team members performed a DataStage server restart, but the problem persists.
Please advise.
Thanks,
David
The obvious question you must ask, when something that used to work is no longer working, is "what changed?" (And "nothing" is not the correct answer.) Have any patches been applied, either to InfoSphere Information Server or to the operating system, for example? Your description has helped to eliminate certain possibilities, such as the Connector Access Service. It remains to do what Sherlock Holmes advised: eliminate the impossible, and whatever remains, however improbable, must be the truth. Please do examine the logs, especially /opt/IBM/WebSphere/AppServer/profiles/InfoSphere/logs/server1/SystemOut.log, to determine whether any problem has been logged at the same time as the wizard failed to return a list of connectors. You might also try to create and use a Connection within the InfoSphere Metadata Asset Manager tool.

SQL 2008 R2 - Reporting Services, using Data Model with Report Builder 3 over internet causes crash, any ideas?

Background:
I have a Windows 2008 R2 box set up with SQL 2008 R2 both the Data Engine and Reporting Services.
I have configured Reporting Services to use custom authentication (FormsAuthentication) that I wrote.
The custom authentication is told, via the Reporting Services configuration files, the name of a user to treat as the administrator, assuming they log in correctly.
When queried by Reporting Services about the current user's permissions, the custom authentication will always return true for the user configured as the administrator.
I have uploaded a Data Model to Reporting Services (using the built in Report Manager app) which uses a Data Source I added (also using built in Report Manager app) which connects to a database on the same box.
I have an ASP.NET MVC3 web app (also on the same box) that is configured to use the Reporting Services web service to do things like list existing reports, run existing reports, and provide a link to start Report Builder 3.
The ASP.NET MVC3 web app shares its user logins with Reporting Services, i.e. the custom authentication used by Reporting Services verifies user details by looking at the same data as the web app.
The ASP.NET MVC3 web app is available externally.
Problem:
If I log into the web app remotely, start Report Builder 3 via link, login as UserA, use the Report Wizard with options >> Create dataset >> select Data Model (see above) as the source of data >> choose 1 table of data (e.g. Organisations) >> click button to preview data >> click next -- BANG - REPORT BUILDER 3 HANGS.
If I log into the web app locally on the server hosting everything, start Report Builder 3 via link, login as UserA, use the Report Wizard with options >> Create dataset >> select Data Model (see above) as the source of data >> choose 1 table of data (e.g. Organisations) >> click button to preview data >> click next >> choose row/column groups and values, report style and click finish. I can then run the report and save it. -- EVERYTHING WORKS!
Things I looked at:
If I do a SQL Profiler trace against both the Reporting Services database and the database that the Data Model uses, then in the case where it hangs, Reporting Services appears to get into some kind of loop, continuously asking the Reporting Services databases whether there are any running jobs. When it works, it never asks the Reporting Services databases about running jobs at this point.
If I use the Data Source that the Data Model uses directly in Report Builder 3 as the data source, then it does work. HOWEVER, this prompts for a SQL Server login and requires that you open the SQL Server port on both the server and the remote machine! Not good.
This is driving me crazy. Anybody with elite skills in the black magic of using Reporting Services 2008 R2 and Report Builder 3 who can help me figure this out will be deserving of every computing award going.
EDIT: I found this while Googling again (forum post, Google cache) and got really excited, but I couldn't quite make sense of what the poster was saying, and the changes I made trying to follow it just broke all Reporting Services access, so I rolled back the changes. Unfortunately the forum is archived (and a bit rubbish) so I can't contact or leave a message for the poster.
Peter, my experience with Reporting Services is extremely limited; I've been building reports for SQL Azure Reporting Services using BIDS.
I had a problem whereby BIDS was crashing when I attempted to preview a report and it took me hours to work out that my reports weren't checked out of version control and BIDS couldn't save them. Rather than giving me any kind of meaningful error message, Visual Studio simply crashed on me.
Maybe, just maybe, your issue is that Report Builder can't save the report and is crashing for a similar reason. However, from your description, I'd suggest that if it is the case, it's likely to be a permissions issue rather than a read-only one. I'd check that the security context for the access of data is as you expect when logging in remotely.
Sorry that I can't give you a definitive answer, but maybe it will suggest somewhere new to look.
It turns out that this is caused by a schoolboy error by Microsoft: using one connection and not opening/closing it appropriately. MS were able to give us this workaround when we called them:
Basically, after you've got the Report Builder 3 ClickOnce app installed, locate it on your computer and then add the tags below inside the system.net tag of its app.config file.
<connectionManagement>
  <add address="*" maxconnection="1024"/>
</connectionManagement>
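For context, a minimal app.config after the edit might look like the sketch below (standard .NET configuration layout; the rest of your file will differ). The default limit is 2 concurrent connections per endpoint, which presumably is what the connection leak exhausts.

<?xml version="1.0"?>
<configuration>
  <system.net>
    <connectionManagement>
      <!-- Raise the per-endpoint connection limit from the default of 2 -->
      <add address="*" maxconnection="1024"/>
    </connectionManagement>
  </system.net>
</configuration>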

Automatically triggering merge activity after remote on-site (custom) development?

In our office, the software we create is sent to our client's office along with an engineer and a laptop. They modify the code at the customer site, based on the customer's requests, and deploy the exe.
When the engineer returns to the office, the changed/latest code is not pushed back to the server, which causes us all sorts of problems in the source code on the development boxes and laptops.
I tried to use a version control system like SVN, but sometimes the engineer forgets to commit the latest code to the SVN server. Is there an automatic way so that, when the laptop connects to the domain, the version control system checks for changes and either prompts the user to update the code on the server or updates it automatically?
I think that the key to this is to require the on-site engineers to use a VCS at the customer site, and to make it a condition of their continued employment that the code at the customer site is in fact reloaded into the VCS on return to the office. You could say that the engineers sent on-site need to be trained in their duties, and they should be held accountable for not doing the complete job - the job isn't finished until the paperwork is done (where 'paperwork' in this context includes updating the source repositories with the customer's custom adaptations of the software).
It seems to me that it might be better to use a DVCS such as Git or Mercurial rather than SVN in this context. However, you should be able to work with SVN if the laptop dispatched to the customer site has a suitable working copy created for the customization work.
That said, the question is "can we make this easier and more nearly automatic". In part, that might depend on your infrastructure - it also might depend on Windows capabilities about which I'm clueless. There might be a way to get a particular program to run when the laptop connects to a new domain. An alternative (Unix-ish) approach would be to use some regularly scheduled job that runs, say, every hour and looks to see whether it is on the home domain and whether there are changes that should be submitted to the main repository.
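As a rough illustration of that scheduled-job idea, here is a sketch in C# (the home domain name and working-copy path are hypothetical placeholders, and it assumes the svn command-line client is on the PATH):

using System;
using System.Diagnostics;

class PendingChangeCheck
{
    static void Main()
    {
        const string homeDomain = "CORP";                    // hypothetical office domain
        const string workingCopy = @"C:\work\customer-code"; // hypothetical working copy

        // Only check when the laptop is back on the office domain.
        if (!string.Equals(Environment.UserDomainName, homeDomain,
                           StringComparison.OrdinalIgnoreCase))
            return;

        // Ask svn for uncommitted local modifications in the working copy.
        var psi = new ProcessStartInfo("svn", "status \"" + workingCopy + "\"")
        {
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (var svn = Process.Start(psi))
        {
            string output = svn.StandardOutput.ReadToEnd();
            svn.WaitForExit();
            if (output.Trim().Length > 0)
                Console.WriteLine("Uncommitted on-site changes detected:" +
                                  Environment.NewLine + output);
        }
    }
}

Schedule it with Windows Task Scheduler (or cron, for the Unix-ish variant) and replace the Console.WriteLine with whatever prompt or automatic commit suits your policy.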

How To Deploy Web Application

We have an internal web system that handles the majority of our company's business. Hundreds of users use it throughout the day; it's very high priority and must always be running. We're looking at moving to ASP.NET MVC 2; at the moment we use Web Forms. The beauty of using Web Forms is that we can instantaneously release a single web page as opposed to deploying the entire application.
I'm interested to know how others are deploying their applications whilst still keeping them accessible to users. Using the deployment tool in Visual Studio would supposedly bring the site to a halt. I'm looking for a method that's super quick.
If you had high-priority bug fixes, for example, would it be wise to mix Web Forms with MVC and temporarily replace the view with a code-behind Web Form until you make the next proper release, which isn't a Web Form?
I've also seen other solutions, on the same server, of having the same web application running side by side and either changing the root directory in IIS or changing the web.config to point to a different folder, but the problem with this is that you have to do an entire build and deploy even for a simple bug fix.
EDIT: To elaborate: how do you deploy the application without causing any disruption to users?
How is everyone else doing it?
I guess you could also run the MVC application uncompiled, and just replace .cs files/views and such on the fly.
A web setup uninstall/install is very quick, but it kills the application pool, which might cause problems, depending on how your site is built.
The smoothest way is to run it on two servers and store the sessions in SQL Server or shared state. Then you can just bring S1 down and patch it => bring S1 back up again and bring S2 down => patch S2 and then bring it up again. Although this might not work if you make any major changes to the session parts of the code.
Have multiple instances of your website running on multiple servers. The best way to do it is to have a production environment, a test environment, and a development environment. You can create test cases and run the load every time you have a new build; if it gets through all the tests, move the version into production ;).
You could have two physical servers each running IIS and hosting a copy of the site. OR you could run two copies of the site under different IIS endpoints on the SAME server.
Either way you cut it you are going to need at least two copies of the site in production.
I call this an A<->B switch method.
Firstly, have each production site on a different IP address. In your company's DNS, add an entry set to one of the IPs and give it a really short TTL. Then you can update site B and also pre-test/warm-up the site by hitting the IP address. When it's ready to go, get your DNS switched to the new site B. Once your TTL has expired you can take down site A and update it.
Using a shared session state will help to minimise the transition of users between sites.
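As a sketch of that shared-session idea: ASP.NET can keep session state in SQL Server so both sites see the same sessions. Assuming you have first created the session database with aspnet_regsql (the server name below is a placeholder), the web.config fragment might look like:

<system.web>
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=DBSERVER;Integrated Security=True"
                timeout="20" />
</system.web>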

Crystal Reports 9 Database Connection Issue

Crystal Reports 9 seems to save the database connection information inside the report file itself, and I am having an issue changing that connection. I work with a team of developers who all have their own copy of a database on the same server. We are using trusted connections to the db. When we need to make changes to a Crystal report and we click the lightning bolt to execute it, Crystal does not ask for login information to the database; it actually ends up connecting to the last database that was used when the report was last saved.
We came up with 2 workarounds:
Take the database that crystal thinks it should connect to offline, then crystal will ask for login info.
Remove permissions for the username that is making the crystal change.
Neither of these is acceptable for us. Does anyone know how to remove the Crystal connection from the report file?
We have tried "Log Off Datasource Location" and all of the settings in the Database Expert.
UPDATE
I still have not found a solution that fits my case, but our newest workaround is to load up a Crystal report and, just before you click the lightning bolt (to run the report against the database), unplug your Ethernet cable. Then, when Crystal cannot find the database, plug the Ethernet cable back in and it will allow you to choose a different database server and name.
You could use a .dsn datasource file in a user-specific location (i.e. the same path for every user, but a different physical location) and point Crystal Reports at that. For example, on everyone's C drive: C:\DSNs\db.dsn, or on a network drive that is mapped to a different location for each user.
You can get more info on .dsn files on MSDN:
http://msdn.microsoft.com/en-us/library/ms710900(VS.85).aspx
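For illustration, a file DSN is just a small text file; a minimal trusted-connection example (the server and database names are placeholders) might look like:

[ODBC]
DRIVER=SQL Server
SERVER=DEVSERVER
DATABASE=MyAppDb
Trusted_Connection=Yes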
We are using this approach (with SQL authentication, however):
open report
database - log on server
database - set datasource location
refresh/preview
You may disable your domain user's access to the dev database; that should help too :)
I am probably answering too late to have any chance at the bounty, but I'll offer an answer anyway.
If you are running the Crystal report directly or with Crystal Enterprise, then the only way I can think of to do this is by using a DSN, as paulmorriss mentions. The drawback to this is that you'd be using ODBC, which I believe is generally slower and thought of as outdated.
If you are using this in an application, then you can simply change the database connection settings in code. Then everyone can develop the report against their own test database, and you can point it at the production database at runtime (assuming the developer's database is up to date and contains the same fields as the production database).
To do this you should be able to use a function like the following:
private void SetDBLogonForReport(CrystalDecisions.Shared.ConnectionInfo connectionInfo, CrystalDecisions.CrystalReports.Engine.ReportDocument reportDocument)
{
    CrystalDecisions.CrystalReports.Engine.Tables tables = reportDocument.Database.Tables;
    foreach (CrystalDecisions.CrystalReports.Engine.Table table in tables)
    {
        // Copy the new connection details into each table's logon info
        // and apply it back to the table.
        CrystalDecisions.Shared.TableLogOnInfo tableLogonInfo = table.LogOnInfo;
        tableLogonInfo.ConnectionInfo = connectionInfo;
        table.ApplyLogOnInfo(tableLogonInfo);
    }
}
For this to work you need to pass in a ConnectionInfo object (which will contain all of your login information) and the report document to apply it to. Hope this helps.
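For example, the caller might build the ConnectionInfo along these lines (the server and database names are placeholders; with a trusted connection you set IntegratedSecurity instead of a user ID and password):

CrystalDecisions.Shared.ConnectionInfo connectionInfo = new CrystalDecisions.Shared.ConnectionInfo();
connectionInfo.ServerName = "PRODSERVER";   // placeholder server name
connectionInfo.DatabaseName = "MyAppDb";    // placeholder database name
connectionInfo.IntegratedSecurity = true;   // trusted connection, as in the question

SetDBLogonForReport(connectionInfo, reportDocument);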
EDIT - Another option, that I can't believe I haven't thought of until now, is that if you are using SQL Server you can make sure that all of the development database names are the same, then use "." or "(local)" for the server along with integrated security, so that everyone effectively has the same connection info locally. I think this is probably the best way to go, assuming you can get all of the developers to use the same setup.
EDIT Again :)
After reading some of the comments on the other answers, I think I may have misunderstood the question. There is no reason that I can think of why you wouldn't be able to do the steps in Arvo's answer outside of not having rights to edit the report, but I'm assuming that you've been able to make other changes so I doubt that is it. I assumed that to get the report to work for each developer you had been doing these steps all along.
Yeah, I agree, Crystal Reports is a pain. I have run into the same problem in the applications I have built where I was forced to use it.
1. Log off the server (inside Crystal, right-click the database and log off)
2. Click on the database and change the database location
If you are logged on when you change the database location, it doesn't seem to stick.
You can set the logon at runtime. See this question...
How do I change a Crystal Report's ODBC database connection at runtime?
If you used ODBC, each dev could point their DSN at the appropriate database. Essentially pushing the connection string into the DSN and out of the crystal report.