I have been tasked with finding a way to deploy the Pentaho BI CE server and Kettle jobs/transformations from testing to production environment servers.
It is pretty clear how to deploy the Pentaho BI server itself, but I do not have a clear understanding of how to transfer all my content (schemas, CDE dashboards, Saiku queries, etc.) from one environment to another. Should I download everything and simply upload it to the production server? That does not seem like a viable solution.
For Kettle jobs/transformations the best approach seems to be to use repositories, but I still cannot figure out how they could be moved from the testing to the production environment.
So my question is - what is the best approach to deploy Pentaho to different environments?
P.S. I am still very new to Pentaho and all BI stuff, so if I made some technical mistakes, please correct me.
Download/Upload is the usual procedure. You can also take a look at the Repository Synchronizer in the marketplace. That might be helpful to ease part of the work.
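For the Kettle jobs/transformations, one common pattern (just a sketch, and not the only option) is to keep the jobs in a repository or as .kjb/.ktr files and externalize all environment-specific settings into kettle.properties, switching them per server via KETTLE_HOME. The paths and job name below are made up for illustration:

```powershell
# Same job on every environment; only KETTLE_HOME differs, so each server
# picks up its own .kettle\kettle.properties (DB hosts, credentials and paths
# are referenced as ${VARIABLES} inside the jobs/transformations).
$env:KETTLE_HOME = "D:\pentaho\config\prod"   # e.g. D:\pentaho\config\test on the test server

# Paths and the job name below are hypothetical.
$kitchen = "C:\pentaho\data-integration\Kitchen.bat"
& $kitchen /file:"D:\etl\jobs\load_dwh.kjb" /level:Basic "/param:RUN_DATE=2015-01-01"
```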
I've gone down a rabbit hole and am finding it hard to know where to go now.
I have a report project I'm trying to script up for deployment to a SharePoint site on a highly controlled production server. On my dev box I can just deploy my project from BIDS and the reports run. If I upload my RDLs, datasets and data source to the document library directly, they don't. I've done some digging and found that the uploaded files aren't linked in any way, and that BIDS does some extra steps to set the DataSource for the Shared DataSets and then sets the references to these DataSets on the RDLs.
So I've been poking around and can see that I need to call SetItemReferences on ReportService2010.asmx to define the links, but I'm lost using PowerShell. Some scripts I've found are focussed on setting DataSources, so I'm trying to adapt them using bits from other scripts, but I'm getting lost. One example does $Reference = New-Object -TypeName SSRS.ReportingService2010.ItemReference but I don't know where they're getting the SSRS. namespace from.
Incidentally, the structure I have is:
- One Shared DataSource points to a SharePoint List
- One DataSet pointing to the shared DataSource
- Four reports with NO embedded DataSources and five embedded DataSet references each, all pointing to the shared DataSet with various filters applied.
Is there already a built in way to do this so I can avoid hassles?
Requirements here are that I need something extremely simple that doesn't require extra PowerShell modules to be installed (if possible). The network is highly controlled and it's difficult enough to get scripts we run ourselves approved, let alone some third-party module installed on the farm of machines in Prod. Basically it will take at least six months to scan, test and formally approve any add-ons, but if we write a very simple script it's much easier.
Yes - deploy with your browser. I have written three separate report projects with SSRS for a highly controlled SharePoint 2010 production environment, and I have deployed each one of them using the browser.
Deploying with your browser is simpler than PowerShell. Follow the general steps outlined in the last part of this thread. Doing it via PowerShell is possible, but it is a far more difficult task.
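For reference, if you do end up needing the PowerShell route, here is a rough sketch of the idea. The server, library and dataset names are placeholders, and the SSRS. prefix you saw in other scripts is simply whatever was passed to New-WebServiceProxy -Namespace:

```powershell
# Create a proxy for the SSRS ReportingService2010 endpoint. The -Namespace
# argument controls the .NET namespace the generated proxy types live in,
# which is where the "SSRS.ReportingService2010." prefix comes from.
$proxy = New-WebServiceProxy `
    -Uri "http://sharepoint.example.com/_vti_bin/ReportServer/ReportService2010.asmx?WSDL" `
    -Namespace "SSRS.ReportingService2010" `
    -UseDefaultCredential

# Point a report's embedded dataset reference at the shared dataset.
# Name = the dataset name inside the RDL, Reference = URL of the shared dataset item.
$ref = New-Object SSRS.ReportingService2010.ItemReference
$ref.Name      = "MyDataSet"                                                  # placeholder
$ref.Reference = "http://sharepoint.example.com/ReportLibrary/MyDataSet.rsd"  # placeholder

$proxy.SetItemReferences("http://sharepoint.example.com/ReportLibrary/Report1.rdl", @($ref))
```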
If the admins keep this production environment so highly controlled, then there should be a parallel staging environment that is kept in precisely the same configuration as production and is available to you for DevOps of your SSRS reports. You should request to test your install on the staging environment in order to work out your deployment issues (either by browser or by PowerShell). If that request is denied, request again; it is impossible to get the deployment right if you cannot rehearse it on a similar system.
DevOps on these reports is the last mile of the race and can be difficult if you are the first to do it at your organization. You can do it; just keep going and your reports will be installed. Keep good notes so that when you develop future reports you can repeat this process, and you will be the go-to person for getting it done. Don't lose faith.
I'm looking for the easiest way to import data from SQL Server to SQL Azure.
I'd like to work locally; is there a way to keep my local database synchronized to SQL Azure all the time?
The thing is, I wouldn't want to have to update SQL Azure manually each time I add a table to my local database.
I HIGHLY recommend using the SQL Database Migration Wizard: http://sqlazuremw.codeplex.com/ - it is the best free tool I've used so far. It is simple and much easier to work with than the built-in SSMS and VS tools. I think the Red Gate tools now work well with SQL Azure too, but I haven't had a chance to use them.
Have you looked at SQL Data Sync? A new October update just came out today.
http://msdn.microsoft.com/en-us/library/hh456371
SQL Server Data Tools (SSDT) was developed by Microsoft to make it easy to deploy your database to Azure.
Here's a detailed explanation: http://msdn.microsoft.com/en-us/library/windowsazure/jj156163.aspx or http://blogs.msdn.com/b/ssdt/archive/2012/04/19/migrating-a-database-to-sql-azure-using-ssdt.aspx
and here's how to automate the process of publishing: http://www.anujchaudhary.com/2012/08/sqlpackageexe-automating-ssdt-deployment.html
Also take a look at the SQL Server Data Tools Team Blog.
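If you want to script that publish step, a minimal sketch looks something like this (it assumes you have already built a .dacpac from your SSDT project; the SqlPackage path, server, database and credentials are placeholders):

```powershell
# Publish an SSDT-built dacpac to SQL Azure with SqlPackage.exe.
# The SqlPackage path, server, database and credentials are placeholders.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe"

& $sqlPackage /Action:Publish `
    /SourceFile:"C:\build\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"myadmin@myserver" `
    /TargetPassword:"<password>"
```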
There are a few ways to migrate databases; I would recommend doing it with the Generate Scripts wizard.
Here are the steps to follow
http://msdn.microsoft.com/en-us/library/windowsazure/ee621790.aspx
There are also other tools, such as the Microsoft Sync Framework.
You'll find more information about it here:
http://msdn.microsoft.com/en-us/library/windowsazure/ee730904.aspx
I have some questions around the best mechanism for deploying MVC web applications to different environments. Previously I used setup projects (.msi files), but as these have been discontinued in VS2012 I am looking for an alternative.
Let me explain my current setup. I currently have a CI setup using TFSBuild 2010 with Team Foundation Server for source control.
A number of developers work on their local machines and check in to the TFS server. We regularly deploy to a single-server dev environment and a load-balanced QA environment with two servers. Our current process involves installing an MSI which carries out the following custom actions:
- brings the current app offline with the app_offline.htm file
- runs database scripts (from the database project in the solution)
- modifies web.config (different for each QA web server)
- labels the code
- warms up each deployed page via an HTTP request (see the sketch after this list)
- etc.
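(For illustration, a minimal sketch of that warm-up step; the URLs are placeholders and it assumes PowerShell 3.0+ for Invoke-WebRequest.)

```powershell
# Warm up the freshly deployed app by requesting each page once, so the
# first real user doesn't pay the compilation cost. URLs are placeholders.
$pages = @(
    "http://qa-web01.example.local/",
    "http://qa-web01.example.local/Account/Login",
    "http://qa-web01.example.local/Orders"
)

foreach ($url in $pages) {
    try {
        Invoke-WebRequest -Uri $url -UseBasicParsing | Out-Null
        Write-Host "Warmed $url"
    }
    catch {
        Write-Warning "Warm-up request to $url failed: $_"
    }
}
```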
This is the current process. Now I would like to make some changes. Firstly, I need an alternative to MSIs. From some research I believe that Web Deploy via IIS, using MSDeploy, is the best alternative. I can use web.config transforms for the web.config modifications. Is this correct, and if so, could I get an outline of what I need to do?
Secondly, I want to set up continuous delivery via TFS Build. I have no idea how this may be achieved; could I get an outline of how it can be integrated into my current setup? Rather than being check-in driven, I would like it to be user driven following check-in. Also, would it be possible for this to run the database scripts from the database project in the solution?
Finally, there is also a production environment, but I would like to deploy to it manually - can my process also produce an artifact that I can install by hand?
Vishal Joshi has some information on his blog that is reasonably good: http://vishaljoshi.blogspot.com/2010/11/team-build-web-deployment-web-deploy-vs.html. It does have the downside that your deployment password is included in the properties you pass to MSBuild.
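As a rough sketch of what those MSBuild properties look like (the QA configuration, paths and server name are placeholders; the same package can double as the manual-install artifact for production):

```powershell
# Build the MVC project and produce a Web Deploy (MSDeploy) package,
# applying the web.config transform for the chosen configuration.
# The QA configuration, paths and server name are placeholders.
$msbuild = "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe"

& $msbuild "MyWebApp\MyWebApp.csproj" `
    /p:Configuration=QA `
    /p:DeployOnBuild=true `
    /p:WebPublishMethod=Package `
    /p:PackageAsSingleFile=true `
    /p:PackageLocation="C:\drops\MyWebApp.zip"

# The resulting package can be pushed to a server with the generated
# MyWebApp.deploy.cmd, or kept as the artifact you install manually on production.
& "C:\drops\MyWebApp.deploy.cmd" /Y /M:qa-web01.example.local
```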
Sayed Hashimi has also posted some information on this in another question, Team Build: Publish locally using MSDeploy.
What is the best practice for deploying changes to SAP BusinessObjects reports? Can they be deployed from a development to a production environment via a command-line operation? The wizard approach seems prone to mistakes.
The wizard approach is not error prone. It's true that even when a report is deployed without any errors, it pops up an error; you just need to ignore that. It is something the SAP BO team should work on.
Command-line deployment, as far as I know, is achievable using the Java SDK. Also read the documentation for the Version Manager that you are using for deployment.
I am looking at getting a continuous integration/continuous deployment environment set up for my Windows Azure project, and I was wondering whether anyone has managed to (or can point me in the right direction for) building and deploying a Windows Azure cloud service using PowerShell and Hudson, and perhaps has sample scripts.
I can get the project to build using MSBuild64 (I'm running x64 Windows 2008 R2 Standard).
I know 32-bit development works, but I assume 64-bit development is better, as I understand it, because problems will be ironed out on my local box rather than only once the project is deployed to the Azure environment, which I believe is 64-bit. (Please feel free to correct my thinking here.)
I assume I have to use cspack.exe to package the deployment first, as in a manual deployment via the developer portal.
Ideally I would like to deploy it locally (in the development emulator), run unit tests against it (perhaps against cloud storage for integration tests), deploy it to staging (and run the acceptance/BDD tests) and then switch from staging to production.
Any help from anyone with experience in this that will speed up this research for me would be appreciated.
Many Thanks
Mark
This may help to get you started: http://blog.smarx.com/posts/building-running-and-packaging-windows-azure-applications-from-the-command-line
Also see http://scottdensmore.typepad.com/blog/2010/04/windows-azure-deployment-for-your-build-server-part-2-deploy-certs.html.
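As a starting point for the packaging step those posts describe, a minimal sketch (the SDK path, role name and project layout are placeholders; adjust them to your solution):

```powershell
# Package the cloud service from the command line (the equivalent of
# "Package" in Visual Studio). The SDK path, role name and project
# layout below are placeholders; adjust them to your solution.
$cspack = "C:\Program Files\Windows Azure SDK\v1.6\bin\cspack.exe"

& $cspack "MyCloudService\ServiceDefinition.csdef" `
    "/role:MyWorkerRole;MyWorkerRole\bin\Release" `
    "/out:C:\drops\MyCloudService.cspkg"

# ServiceConfiguration.cscfg travels alongside the .cspkg and is supplied
# at deployment time (portal, PowerShell or the Service Management REST API).
```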
Mark, I am looking at something similar. The best solution at this point appears to be a custom solution that uses the REST APIs.
Erick
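For anyone going down that REST route, a rough sketch of creating a deployment in the staging slot with the Service Management API (the subscription ID, names, certificate thumbprint and URLs are placeholders; the .cspkg is assumed to already be in blob storage, and Invoke-RestMethod needs PowerShell 3.0+):

```powershell
# Create a deployment in the staging slot via the Service Management REST API,
# authenticating with a management certificate. All identifiers are placeholders.
$subscriptionId = "00000000-0000-0000-0000-000000000000"
$serviceName    = "mycloudservice"
$cert = Get-Item "Cert:\CurrentUser\My\0123456789ABCDEF0123456789ABCDEF01234567"

$config = [Convert]::ToBase64String([IO.File]::ReadAllBytes("C:\drops\ServiceConfiguration.cscfg"))
$label  = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("build-42"))

$body = @"
<CreateDeployment xmlns="http://schemas.microsoft.com/windowsazure">
  <Name>build-42</Name>
  <PackageUrl>https://mystorage.blob.core.windows.net/deployments/MyCloudService.cspkg</PackageUrl>
  <Label>$label</Label>
  <Configuration>$config</Configuration>
  <StartDeployment>true</StartDeployment>
</CreateDeployment>
"@

Invoke-RestMethod -Method Post `
    -Uri "https://management.core.windows.net/$subscriptionId/services/hostedservices/$serviceName/deploymentslots/staging" `
    -Certificate $cert `
    -Headers @{ "x-ms-version" = "2011-08-01" } `
    -ContentType "application/xml" `
    -Body $body
```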