Copy List & Label server content from DEV to QA to PROD

Reproduce List & Label content in different environments.
I tried enabling the List & Label API and using its methods, but the input parameters require an existing ParentFolderID, etc., which has to be created explicitly on the server.
I am new to List & Label Report Server.
My requirement is to create all reports (data sources, etc.) on the DEV server.
Afterwards, for QA and PROD, I need to be able to 'export' all existing reports (data sources, etc.) from DEV.
My background is more with Oracle BI Publisher.
On a BIP server I have an option to download (zip) all the contents of a server folder (where I keep all the reports), and I can then log in to another BIP server and upload the zipped file from the original server.
This way I have exactly the same reports on all servers.
Is there any feature like this in List & Label Report Server?
Essentially, I need a kind of report installer for List & Label Report Server.
Regards,
ccarvalho

Currently, there is no such feature. The suggested procedure would be to copy the entire database from Dev to Production if you're working with the Report Server on both ends. Would that work for you?
Other than that, if you're designing your reports from a desktop app and want to transfer them to the Server, you could use the REST API of the Report Server. Here is a blog post covering the basics of this API. The setup also contains a sample ("C# Client API Sample").
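If the database-copy route works for you: assuming the Report Server keeps its repository in a SQL Server database (the server and database names below are hypothetical; check your installation's configuration), a minimal sketch with a standard backup/restore would be:

    # Sketch only: back up the Report Server repository on DEV and restore it on PROD.
    # Requires the SqlServer PowerShell module; all names below are hypothetical.
    $backup = "\\fileshare\transfer\ReportServerRepo.bak"

    # Back up on the DEV database server
    Invoke-Sqlcmd -ServerInstance "DEV-SQL" `
        -Query "BACKUP DATABASE [ReportServerRepo] TO DISK = N'$backup' WITH INIT"

    # Stop the Report Server services on PROD so nothing holds the database, then restore
    Invoke-Sqlcmd -ServerInstance "PROD-SQL" `
        -Query "RESTORE DATABASE [ReportServerRepo] FROM DISK = N'$backup' WITH REPLACE"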

Related

Integration and data upload - MuleSoft CloudHub to Tableau Server

I am using MuleSoft CloudHub with Runtime v4.4 to upload CSV data to Tableau Server. The documentation page https://docs.mulesoft.com/tableau-specialist-connector/1.1/ confirms that it is not possible to use the Hyper configuration and its operations in CloudHub, because CloudHub does not allow executing external code for security reasons. That is why I am trying to use the REST configurations and their available operations.
With all my attempts, I am able to connect to Tableau Server and successfully perform simple operations like Initial file upload, Query project, Append to file upload, etc. But these operations do not help me publish my CSV content to Tableau. The Publish workbook operation also requires file content in *.twbx format, and I am not sure how to convert CSV/JSON/XML to TWBX content.
I have referred to some websites and MuleSoft technical videos where an HTTPS connection is used to upload data to Tableau, but those examples use a *.hyper file from the classpath.
So, basically, I am stuck between two different paths:
- How can I transform CSV content into Hyper file content in a Mule flow? If this transformation is possible, then I can use an HTTPS connection to Tableau and upload my data.
- Using the MuleSoft Tableau connector v1.1 with the REST configurations, is it possible to upload data to Tableau?
If there is any other solution, then I am happy to change my implementation strategy. Can somebody please point me in the right direction?
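For reference, the raw REST flow I am trying to wrap with the connector starts with the sign-in below (sketched in PowerShell outside Mule, just to show the request shapes; server, site, and credentials are made up, and the later upload/publish calls still require a .hyper or .twbx payload):

    # Sketch of the Tableau Server REST API sign-in; all values are hypothetical.
    $server = "https://tableau.example.com"
    $body = '<tsRequest><credentials name="publisher" password="secret">' +
            '<site contentUrl="mysite" /></credentials></tsRequest>'
    $resp = Invoke-RestMethod -Uri "$server/api/3.4/auth/signin" -Method Post `
            -Body $body -ContentType "application/xml"
    $token  = $resp.tsResponse.credentials.token
    $siteId = $resp.tsResponse.credentials.site.id

    # Later calls pass the token in the X-Tableau-Auth header, e.g. to start a file upload:
    $headers = @{ "X-Tableau-Auth" = $token }
    $upload  = Invoke-RestMethod -Uri "$server/api/3.4/sites/$siteId/fileUploads" `
               -Method Post -Headers $headers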

SSRS 2008 R2 to SP integrated deployment scripting

I've gone down a rabbit hole and am finding it hard to know where to go now.
I have a report project I'm trying to script up for deployment to a SharePoint site on a highly controlled production server. On my dev box I can just deploy my project from BIDS and the reports run. If I upload my RDLs, datasets, and data source to the document library directly, they don't. I've done some digging and found that the uploaded files aren't linked in any way: BIDS does some extra steps to set the data source for the shared datasets and then sets the references to these datasets on the RDLs.
So I've been poking around and can see that I need to call SetItemReferences on ReportService2010.asmx to define the links, but I'm lost using PowerShell. Some scripts I've found are focused on setting data sources, so I'm trying to adapt one using bits from other scripts, but I'm getting lost. One example does $Reference = New-Object -TypeName SSRS.ReportingService2010.ItemReference, but I don't know where the SSRS. namespace comes from.
Incidentally, the structure I have is:
- One shared DataSource pointing to a SharePoint list
- One DataSet pointing to the shared DataSource
- Four reports with NO embedded DataSources and five embedded DataSet references each, all pointing to the shared DataSet with various filters applied
Is there already a built-in way to do this so I can avoid the hassle?
Requirements here are that I need something extremely simple that doesn't require extra PowerShell modules to be installed (if possible). The network is highly controlled, and it's difficult enough to get scripts we run ourselves approved, let alone a third-party module installed on the farm of machines in Prod. Basically, it will take at least six months to scan, test, and formally approve any add-ons, but if we write a very simple script it's much easier.
Yes - deploy with your browser. I have written three separate report projects with SSRS on a highly controlled SharePoint 2010 production environment, and I deployed each of them using the browser.
Deploying with your browser is simpler than PowerShell. Follow the general steps outlined in the last part of this thread. Doing it via PowerShell is possible, but it is a far more difficult task.
If the admins keep this production environment so highly controlled, then there should be a parallel staging environment, kept in exactly the same configuration as production and available to you, for doing DevOps on your SSRS reports. You should request to test your install on the staging environment in order to work out your deployment issues (whether by browser or PowerShell). If that request is denied, request again: it's impossible to get the deployment right if you can't practice on a similar system.
DevOps on these reports is the last mile of the race and can be difficult if you are the first to do it at your organization. You can do it; just keep going and your reports will be installed. Keep good notes so that when you develop future reports you can repeat the process, and you will be the go-to person for getting it done in the future. Don't lose faith.
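If you do end up needing the PowerShell route after all: the SSRS. prefix in the script you found is not a module; it is the namespace that New-WebServiceProxy (built into PowerShell, so nothing extra to install) generates for the proxy types. A rough sketch, with hypothetical URLs and item paths:

    # Sketch: wire up shared dataset references after uploading the files.
    # New-WebServiceProxy ships with PowerShell, so no third-party modules are needed.
    $uri = "http://sp.example.com/_vti_bin/ReportServer/ReportService2010.asmx?wsdl"
    $rs  = New-WebServiceProxy -Uri $uri -Namespace "SSRS.ReportingService2010" -UseDefaultCredential

    # Ask the server which dataset references the report expects, then satisfy each one.
    $reportPath = "http://sp.example.com/ReportLibrary/MyReport.rdl"        # hypothetical
    $refs = foreach ($expected in $rs.GetItemReferences($reportPath, "DataSet")) {
        $r = New-Object SSRS.ReportingService2010.ItemReference
        $r.Name      = $expected.Name                                       # dataset name inside the RDL
        $r.Reference = "http://sp.example.com/ReportLibrary/SharedDS.rsd"   # hypothetical shared dataset
        $r
    }
    $rs.SetItemReferences($reportPath, $refs)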

Tableau Desktop Inside Tableau Server

Is there a Tableau Desktop executable inside the Tableau Server installation?
I have a system where Tableau Server is in the cloud, and I would want to use Tableau Desktop on the same server. Is that feasible?
Tableau Server and Desktop are two different products and Server does not ship with a copy of Desktop.
They can both be installed on the same Windows machine, but I would never do that except for troubleshooting purposes (ideally you should install Tableau Server on a dedicated machine so that it does not have to fight anything else for resources).
Tableau Server lets you make limited edits to existing workbooks, but you cannot create new workbooks directly.
However, if you want to install Tableau Desktop separately, on the same cloud server that hosts your Tableau Server, it may (or may not) be doable depending on the specifications of the cloud server.
At my previous organization, we always had a Desktop version installed on the VM running our Tableau Server. This was useful for making connections to data sources that required firewall rules, since the VM's IP was static. Extracts could then be scheduled for refreshes.
So yes, it is feasible, but as others have said, Desktop is a separate product.
Please note: make sure you understand the implications of editing an existing view.
The workbook owner, project owner, or site admin may grant you rights to do the editing. However, you will be overwriting the existing workbook (you can't "save as...").
Also, the edit function on the server is limited to visualizations (sheets) and doesn't work with dashboards (to be improved in the next release, as announced).
Tableau Desktop and Tableau Server are two different products.
Each has its own executable.
Desktop is built for development, while Server is built for sharing and authentication.
You can make some edits on Server, but you cannot create a new dashboard there.
As others mentioned, Tableau Desktop and Server are separate products with separate executables. We used to have Tableau Desktop installed on the Server machine to publish and manage extracts that were developed using the API.
Another thought: Tableau Server provides permissioned users with the ability to leverage Web Authoring to create/edit server content. Web authoring has the same look/feel as desktop, and has most of the features.
Many go this route as it comes with your server license, so the additional desktop purchase is not necessary.
More Info Here
In my current project, both Tableau Server and Tableau Desktop are hosted on the same server. You need to analyse the data volume and the traffic to workbooks to come up with the right RAM size. I would recommend a minimum of 25 GB of RAM, assuming close to 20 users accessing Tableau Server and no huge volume of data refreshes or connections.
The desktop-like experience "inside server" comes with Creator and Explorer licenses. If you have one of these, you can create a sheet/dashboard on Tableau Server through your browser.

How can I set up continuous deployment with TFSBuild for an MVC app?

I have some questions around the best mechanism to deploy MVC web applications to different environments. Previously I used setup projects (.msi files), but as these have been discontinued in VS2012, I am looking for an alternative.
Let me explain my current setup. I currently have CI set up using TFSBuild 2010, with Team Foundation Server for source control.
A number of developers work on their local machines and check in to the TFS server. We regularly deploy to a single-server dev environment and a load-balanced QA environment with two servers. Our current process includes installing an MSI which carries out some of the following custom actions:
- brings the current app offline with the app_offline.htm file
- runs database scripts (from the database project in the solution)
- modifies web.config (different for each QA web server)
- labels the code
- warms up each deployed file via an HTTP request
- etc.
This is the current process. Now I would like to make some changes. Firstly, I need an alternative to MSIs. From some research I believe that Web Deploy via IIS, using MSDeploy, is the best alternative. I can use web.config transforms for the web.config modifications (my understanding of what a per-environment transform looks like is sketched at the end of this question). Is this correct, and if so, could I get an outline of what I need to do?
Secondly, I want to set up continuous delivery via TFSBuild. I have no idea how this may be achieved; would it be possible to get an outline of how it can be integrated into my current setup? Rather than being check-in driven, I would like it to be user driven following check-in. Also, would it be possible for this to run the database scripts from the database project in the solution as well?
Finally, there is also a production environment, but I would like to deploy to it manually - can my process also produce an artifact that I can install by hand?
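The per-environment transform mentioned above is, as far as I understand it, something like this (paired with a QA build configuration; the connection string name and server are made up):

    <!-- Web.QA.config: sketch of a transform applied when publishing with the QA configuration.
         All names are hypothetical. -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <add name="AppDb"
             connectionString="Data Source=QA-SQL;Initial Catalog=AppDb;Integrated Security=True"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>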
Vishal Joshi has some reasonably good information on his blog: http://vishaljoshi.blogspot.com/2010/11/team-build-web-deployment-web-deploy-vs.html. It does have the downside that your deployment password is included in the properties you pass to MSBuild.
Sayed Hashimi has also posted some information on this in another question: Team Build: Publish locally using MSDeploy.
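In practice that approach boils down to passing Web Deploy publish properties to MSBuild from the build definition. A sketch of the equivalent command line (service URL, site name, and credentials are placeholders; note the password caveat above):

    # Sketch: push a web app with Web Deploy straight from an MSBuild invocation.
    # Service URL, site name, and credentials are placeholders.
    msbuild MyMvcApp.csproj /p:Configuration=QA `
        /p:DeployOnBuild=true `
        /p:DeployTarget=MSDeployPublish `
        /p:MSDeployServiceURL=https://qa-web01:8172/msdeploy.axd `
        /p:DeployIisAppPath="Default Web Site/MyMvcApp" `
        /p:UserName=deployuser /p:Password=secret `
        /p:AllowUntrustedCertificate=True
    # For the manual production step, /p:DeployTarget=Package instead produces a
    # .zip plus a .deploy.cmd you can hand to the admins and run later.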

How can I change the data source of an rpt report in Crystal Reports Server

I have 3 different machines with Crystal Reports Server XI R2 installed. They are on different subnets, pointing to different Oracle databases with the same schema definition but different names (dev, test, production).
I have an rpt file that was created in the development environment, pointing to the "dev" schema.
All went fine; the report executed successfully.
But when I took the same rpt and published it on the test server, I could not change the datasource location. It appears that the rpt file keeps the datasource that was used at creation time fixed.
Does anyone know how I can change the datasource of an rpt file, making it independent of the database location and of the database used by the designer?
Thanks
I assume from the question that the different Oracle databases have different database names, but the same schema name - something like OperationalDB on dev, test and live? (It's more complicated if they have different schema names.)
If so, then it depends on what sort of Driver you're using.
If using an ODBC driver, then simply set up different ODBC sources (pointing to the appropriate database) with the same data source name on each of the Crystal machines.
If using a native Oracle driver, then I suggest editing the TNSNAMES.ORA file on each of the machines so that they each have the same TNS name pointing to the appropriate database (see the sample entry at the end of this answer).
If using a native Oracle driver and you are unable to edit the TNSNAMES.ORA file (for example, if it is a shared network file instead of being located on each of the Crystal machines), then you will need to change the datasource location in the Report Designer (or via the API, if accessing Crystal through one) each time you transfer a report from one machine to another.
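To illustrate the TNS-name approach, each machine's TNSNAMES.ORA would carry the same alias pointing at a different host/service (the names here are made up):

    # Same alias on every Crystal machine; only HOST and SERVICE_NAME differ per environment.
    REPORTDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = dev-db-host)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = DEVDB))
      )

The reports then always connect via the REPORTDB alias, regardless of which environment they run in.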