Failed to deploy Analysis Services tabular project in Visual Studio 2019 - ssas-tabular

I am unable to deploy an Analysis Services tabular model from VS 2019. The error message returned is "1500 is not a valid value for this element." I also checked the deployment analysis server in SSMS and found that its supported compatibility levels do not include 1500 (the maximum level it supports is 1480 and the default is 1200). I'm not sure whether this is related to the issue. Could someone help? Thanks.

It looks like you are trying to deploy a tabular model with a compatibility level of 1500 (the latest version of Tabular) to an instance that only supports up to compatibility level 1480, which I think is a SQL Server 2019 Analysis Services CTP 2.x instance. Here are some details of the compatibility level options and how to set them.
You will have to do one of two things, depending on the features used in the data model.
If you are using an SSAS 2019 feature such as calculation groups in your data model, you will have to update the SSAS instance from the CTP to the RTM release.
If you are not using any features that require compatibility level 1500, you can set your data model's compatibility level to 1400.
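To confirm what the target instance actually supports before deploying, you can query it with AMO. A minimal sketch, assuming the Microsoft.AnalysisServices client libraries are installed; the instance name is a placeholder for your deployment target:

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
    $server = New-Object Microsoft.AnalysisServices.Server
    $server.Connect("localhost\TABULAR")    # placeholder - use your deployment server
    $server.SupportedCompatibilityLevels    # comma-separated list, e.g. "1100,1103,1200,1400,1480"
    $server.DefaultCompatibilityLevel       # e.g. 1200
    $server.Disconnect()

In SSDT the model's compatibility level is changed from the Model.bim Properties pane.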
Hope that helps

Related

SSRS 2008 R2 to SP integrated deployment scripting

I've gone down a rabbit hole and am finding it hard to know where to go now.
I have a report project I'm trying to script up for deployment to a SharePoint site on a highly controlled production server. On my dev box I can just deploy my project from BIDS and the reports run. If I upload my RDLs, datasets and data source to the document library directly, they don't. I've done some digging and found that the uploaded files aren't linked in any way, and that BIDS performs some extra steps to set the DataSource for the shared DataSets and then sets the reference to these DataSets on the RDLs.
So I've been poking around and can see that I need to call SetItemReferences on ReportingService2010.asmx to define the links, but I'm lost using PowerShell. Some scripts I've found are focused on setting DataSources, so I'm trying to adapt them using bits from other scripts but getting lost. One example does $Reference = New-Object -TypeName SSRS.ReportingService2010.ItemReference but I don't know where they're getting the SSRS. namespace from.
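From other examples I've since found, I suspect the SSRS. prefix is just whatever namespace the script author passed to New-WebServiceProxy. Something like this untested sketch (site URLs and item names are placeholders for my setup):

    $uri = "http://sharepoint/sites/reports/_vti_bin/ReportServer/ReportService2010.asmx?WSDL"
    $proxy = New-WebServiceProxy -Uri $uri -Namespace SSRS.ReportingService2010 -UseDefaultCredential

    # Point one of a report's DataSet references at the shared DataSet
    $ref = New-Object SSRS.ReportingService2010.ItemReference
    $ref.Name      = "MyDataSet"                                           # DataSet name as it appears in the RDL
    $ref.Reference = "http://sharepoint/sites/reports/Docs/MyDataSet.rsd"  # full URL of the shared DataSet
    $proxy.SetItemReferences("http://sharepoint/sites/reports/Docs/MyReport.rdl", @($ref))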
Incidentally, the structure I have is:
- One Shared DataSource points to a SharePoint List
- One DataSet pointing to the shared DataSource
- Four reports with NO embedded DataSources and five embedded DataSet references each pointing to the shared DataSet applying various filters.
Is there already a built-in way to do this so I can avoid hassles?
Requirements here are that I need something extremely simple that doesn't require extra PowerShell modules to be installed (if possible). The network is highly controlled and it's difficult enough to get scripts we run ourselves approved, let alone some third-party module installed on the farm of machines in Prod. Basically it will take at least six months to scan, test and formally approve any add-ons, but if we write a very simple script it's much easier.
Yes - deploy with your browser. I have written three separate report projects with SSRS on a highly controlled SharePoint 2010 production environment, and I deployed each one of them using the browser.
Deploying via your browser is simpler than PowerShell. Follow the general steps outlined in the last part of this thread. Doing it via PowerShell is possible, but a far more difficult task.
If the admins have this production environment so highly controlled, then there should exist a parallel staging environment that is kept in exactly the same configuration as production and is available to you for DevOps work on your SSRS reports. You should request to test your install on the staging environment in order to work out your deployment issues (either by browser or PowerShell). If you are denied this request, then request again; it's impossible to get it perfect if you don't have access to develop on a similar system.
DevOps on these reports is the last mile of the race and can be difficult if you are the first to do it at your organization. You can do it; just keep going and your reports will be installed. Keep good notes so that when you develop future reports you can repeat this process and become the go-to person for getting it done in the future. Don't lose faith.

MS Project Server 2016: update custom fields on tasks

For Project Server 2013 we’ve been using the SOAP API’s QueueUpdateProjectRequest to achieve this, but in 2016 we can’t even check out the project using SOAP.
We try to POST to /PWA/_vti_bin/psi/Project.asmx:
<?xml version='1.0' encoding='UTF-8' ?>
<ns2:Envelope xmlns:ns3="http://schemas.microsoft.com/office/project/server/webservices/Project/"
              xmlns:ns2="http://schemas.xmlsoap.org/soap/envelope/">
  <ns2:Header></ns2:Header>
  <ns2:Body>
    <ns3:CheckOutProject>
      <ns3:projectUid>7475f3ef-226e-e611-80d3-0050568a983b</ns3:projectUid>
      <ns3:sessionUid>c430ce2b-057e-4990-b5b6-9c6f28415739</ns3:sessionUid>
      <ns3:sessionDescription></ns3:sessionDescription>
    </ns3:CheckOutProject>
  </ns2:Body>
</ns2:Envelope>
and get:
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Body>
    <s:Fault>
      <faultcode xmlns:a="http://Microsoft.Office.Project.Server">a:ProjectServerFaultCode</faultcode>
      <faultstring>Unhandled Communication Fault occurred</faultstring>
      <detail>
        <string xmlns="http://schemas.microsoft.com/2003/10/Serialization/">Incorrect inproc routing. No inproc host is available for Project.</string>
      </detail>
    </s:Fault>
  </s:Body>
</s:Envelope>
We’ve also tried writing the custom field values, using the custom fields’ internal names, when merge-posting to /ProjectServer/Projects('{#project}')/Draft/Tasks('{#Id}').
The server seems to ignore the custom field values while correctly updating system field values.
There is documentation for updating custom fields on Project, but not on Task: https://github.com/OfficeDev/Project-REST-Basic-Operations/blob/master/updateprojectcustomfieldvalues.ps1
What is the proper way to update custom fields on tasks in Project Server 2016?
According to Microsoft, there is no Project class in the PSI anymore:
https://technet.microsoft.com/en-us/library/mt422816(v=office.16).aspx#Anchor_2
Project Server Interface (PSI) Project class removed
The Project class in the PSI is not supported in Project Server 2016. For all new development, use the Project Client Side Object Model (CSOM).
I'm getting the same error when calling PSI functions from the Project class.
I'm not 100% sure, but I guess that on the server itself the REST/SOAP operations still use the PSI in the end, so you get the same error.
I have no idea whether you can still achieve what you need with REST/SOAP.
The solution will be to use the CSOM (as suggested by Microsoft), but I don't know if it fits your application.
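If it helps, here is a rough, untested sketch of what the CSOM route could look like from PowerShell. The DLL path, PWA URL, task name and the custom field internal name are all placeholders, and I am assuming the task indexer accepts the field's InternalName the same way the project-level sample linked above does:

    # Requires Microsoft.ProjectServer.Client.dll plus its SharePoint client dependencies
    Add-Type -Path "C:\CSOM\Microsoft.ProjectServer.Client.dll"

    $ctx = New-Object Microsoft.ProjectServer.Client.ProjectContext("https://server/PWA")
    $published = $ctx.Projects.GetByGuid([guid]"7475f3ef-226e-e611-80d3-0050568a983b")
    $draft = $published.CheckOut()      # replaces the PSI CheckOutProject call
    $ctx.Load($draft.Tasks)
    $ctx.ExecuteQuery()

    $task = $draft.Tasks | Where-Object { $_.Name -eq "My Task" }     # hypothetical task
    $task["Custom_000039b78bbe4ceb82c4fa8c0c400284"] = "New value"    # placeholder InternalName
    $draft.Update()
    $draft.Publish($true) | Out-Null    # check in and publish
    $ctx.ExecuteQuery()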

Is there a way to configure sonar-maven-plugin to store reports via REST API

We are using the Maven plugin org.sonarsource.scanner.maven.sonar-maven-plugin to store our reports in our SonarQube instance. For company reasons our CI server is in a different network zone than this instance, and the Postgres default port is closed. I wonder if there is an option to store the reports some other way than having them written directly to the database via JDBC, as opening ports is a tedious task here ;)
Furthermore, we also have some older pieces of software that need to be analyzed with a local Sonar runner instance, and the same question applies there (i.e. whether there is another way to store the reports).
Since version 5.2 there is no connection to the database from the scanners, so the easiest/safest course of action here should be to upgrade to the latest LTS (5.6 at the time of writing).
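On 5.2+ the scanner only needs HTTP(S) access to the SonarQube web server, so no JDBC properties (and no open Postgres port) are required. A minimal sketch; the host URL and token are placeholders:

    mvn sonar:sonar `
        "-Dsonar.host.url=https://sonarqube.example.com" `
        "-Dsonar.login=my-auth-token"    # only needed if the instance requires authentication

The same applies to your older projects: a recent standalone scanner pointed at sonar.host.url also talks to the server over HTTP only.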

Slow migration to Visual Studio Online from TFS 2013.4 (OpsHub)

I am trying to migrate from TFS 2013 Update 4 to Visual Studio Online. I have had real difficulties using the OpsHub Migration Utility, so I have come here seeking advice on how I may go forward.
I have 65 Team Projects to migrate, some with source code, others fairly empty with just a few work items, so they differ in size a fair bit. I want to maintain the branches we currently have, which cross Team Projects, so I initially tried selecting all 65 projects for migration. This took 11 hours at the utility's "creating configuration" stage, and then at the end the tool complained it could not communicate with the OpsHub Visual Studio Online Migration Utility service. So back to square one.
My next approach was to batch the projects into sets of 10. The validation and configuration-creation stages were done in 10 minutes for each set, which was a big leap forward, and the migration actually started. It has currently been running for about 17 hours. It has done 29,000 work item revisions out of 480,000 across all Team Projects, so I think this may well take a couple of weeks to clear. It hasn't started on the version control data yet, which I'm hopeful it will do at a later stage, but this says "not running" at the moment.
This is running on a fast i7 box with 16 GB RAM and SSDs, on a 100 Mbps internet connection.
I have emailed OpsHub to find out if the commercial version will perform better, but any suggestions on what I could do better are welcome, including any alternatives to the migration tool.
The configuration creation taking a long time is assumed to be caused by an issue communicating with the OpsHub Visual Studio Online service and waiting for the timeout.
The work item migration time seems to be in line with expectations; migration of version control data will start once the work item migration is completed.
Also, as there is a large amount of data it takes time, but the migration will keep running in the background, and you can use your source TFS instance (if required) without stopping the migration.
For more details on Professional Services you can drop an email to support#opshub.com
There is another tool named TFS Integration Platform, developed by Microsoft for TFS migration. You can try it to see if it can migrate the data faster.

AppFabric Dashboard Throwing Powershell Error

I have installed AppFabric on Server 2012. I checked that the prerequisites were installed first and have installed cumulative update 3. I have an instance of SQL Server 2012 at SP1, and the OS indicates that all updates have been installed.
When viewing the dashboard I am receiving the error:
Unable to cast object of type
'System.Management.Automation.PSCustomObject' to type
'Microsoft.ApplicationServer.Management.Data.Group...'
I've found minimal information in my Google searches, none of which provides a solution to the problem.
Can anyone help with this?
I had the same issue and found that applying Windows Update KB2932678 (Cumulative update package 5 for Microsoft AppFabric 1.1 for Windows Server) stopped the error from coming up.
EDIT: KB3092423 (Cumulative Update 7) is the latest one I could find, so if you are going to apply an update you might as well apply the latest one.
I've had the same issue. I read somewhere (I don't have the link) that it appears because there are no records in the AppFabric monitoring DB's table that contains the instances (of the service).
Once you have at least one record there, the error will disappear and the Dashboard will show the WF instances correctly.
This worked in my case.
UPDATE:
To be precise, the table that matters for making the AppFabric Dashboard stop showing the aforementioned error is the AppFabric persistence DB's table [System.Activities.DurableInstancing].[InstancesTable]. This table should contain at least one record related to the WF service(s) that the current AppFabric Dashboard is shown for.
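A quick way to verify is to count the rows in that table. A sketch, assuming Invoke-Sqlcmd from the SQL Server PowerShell module is available; the server and database names are placeholders for your persistence DB:

    Invoke-Sqlcmd -ServerInstance "localhost" -Database "AppFabricPersistenceDb" `
        -Query "SELECT COUNT(*) AS InstanceCount FROM [System.Activities.DurableInstancing].[InstancesTable]"

If it returns 0, run one of your WF services once so that an instance row gets persisted, then reload the dashboard.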