BizTalk Server - Can't configure Groups - sql-server-2008-r2

I am trying to configure BizTalk Server 2010 along with SQL Server 2008 R2 with SP3. I am able to configure Enterprise SSO and the Business Rules Engine, but I cannot configure Groups. When configuring Enterprise SSO, it successfully creates the SSO database along with the respective accounts (SSO Administrators and SSO Affiliate Administrators). But for Groups, it is only able to create the administrative roles (BizTalk Administrators Group, Operators Group, and B2B Operators Group). It fails when trying to create the databases (BizTalkMgmtDb, BizTalkMsgBoxDb, BizTalkDTADb).
Here are a few errors I am getting from the log file:
2015-06-09 07:50:23:4123 [Info] CfgExtHelper Checking the connection to the BizTalk Management Database: BizTalkMgmtDb on server *****
2015-06-09 07:50:23:4748 [Info] CfgExtHelper The BAM Primary Import Database found from the BizTalk Management Database BizTalkMgmtDb on server ***** is not compatible.
2015-06-09 07:50:23:4748 [Info] CfgExtHelper Connecting to the BAM Primary Import Table Database BAMPrimaryImport on server *****
2015-06-09 07:50:23:4905 [Info] CfgExtHelper The BAM Primary Import Database found from the BizTalk Management Database BAMPrimaryImport on server ***** is not compatible.

BizTalk 2010 won't work with newer SQL Server releases, and it sounds like a newer version is also installed on that server. What seems to be happening is that BizTalk is connecting to the server, and the default instance is likely running the newer version (hence the message "Database x on server y is not compatible"). You have to configure BizTalk to connect to the correct instance, or make the SQL 2008 R2 instance the default instance.
An ideal fix would be to remove the newer SQL Server version from the server. BizTalk makes very heavy use of the MessageBox and Management databases, and best practice is to put them on their own dedicated SQL Server (or servers).
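To confirm which version the instance BizTalk points at is actually running, you can query it directly. This is just a quick check (run it against whatever server/instance name you entered in the BizTalk Configuration wizard); BizTalk 2010 expects SQL Server 2008 or 2008 R2, i.e. a ProductVersion starting with 10.
-- Quick check of the instance BizTalk is configured against
SELECT @@SERVERNAME AS ServerInstance,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('Edition') AS Edition;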

If at all possible, the best setup for BizTalk development is a VM with the entire stack (Windows Server, SQL Server, Visual Studio, and BizTalk Server) all installed locally.

Related

Deploying ASP.NET Core Web API to Azure - with Azure Key Vault and SQL Database dependency

I was able to deploy my Web API to Azure. I had to configure for 'Azure Key Vault', 'Azure SQL DB' and 'Microsoft Identity platform' before deployment as they were dependencies.
I configured Azure Key Vault and while configuring for SQL database, I selected the option to save connection string value in 'Azure Key Vault' instead of 'Azure app settings' (please see screenshot below).
When I press configure for 'Microsoft Identity platform', the page shows a loading symbol but never finishes and doesn't show any tenant ID or details for configuration. Please see screenshot below.
So I had to ignore configuring Microsoft Identity Platform for now.
I was able to deploy the app to Azure, but when I try accessing the Azure endpoint, I get an error
HTTP ERROR 500 (Trouble retrieving data from database).
Log file message below while accessing the Azure endpoint:
An error occurred using the connection to database 'databasename' on server 'servername'.
An exception occurred while iterating over the results of a query for context type 'ItemApp.Infrastructure.Repository.ItemDBContext'.
Microsoft.Data.SqlClient.SqlException (0x80131904):
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections...
Can anyone point out where I am going wrong? Thanks in advance.
Check to see if your database engine can handle remote connections:
Start > All Programs > SQL Server 2005 > Configuration Tools > SQL Server Surface Area Configuration
Click on Surface Area Configuration for Services and Connections
Select the instance that is having a problem > Database Engine > Remote Connections
Enable local and remote connections
Restart instance
Check if your SQL Server services are up and running properly:
Go to All Programs > Microsoft SQL Server 2008 > Configuration Tools > SQL Server Configuration Manager > SQL Server Services
Check to make sure the SQL Server service status is Running.
In addition, ensure that your remote server is in the same network.
Run sqlcmd -L to ascertain if your server is included in your network list.
You may need to create an exception on the firewall for the SQL Server instance and port you are using:
Start > Run > Firewall.cpl
Click on the Exceptions tab
Add sqlservr.exe (typically located in C:\Program Files (x86)\Microsoft SQL Server\MSSQL.x\MSSQL\Bin, check your installs for the actual folder path) and port (default is 1433)
Check your connection string as well.
Please also check the Microsoft documentation on troubleshooting this error.
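For reference, a typical Azure SQL connection string follows this pattern; the server, database, user, and password below are placeholders, not values from your setup:
Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=yourdatabase;User ID=youruser;Password=yourpassword;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
Since you chose to store the connection string in Azure Key Vault, also verify that the deployed app can actually read that secret (for example through a managed identity that has been granted access to the vault); if the secret cannot be read, the app ends up without a usable connection string.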

How do I identify the Initial Catalog (name of the collection database) in a migration step from Azure DevOps Server to Services?

I am an administrator of Azure DevOps Server 2019 Update 1.1 in an organization.
I will migrate our collection from the on-premises server to Azure DevOps Services.
Currently, I am on the step of using SqlPackage.exe to generate a DACPAC file.
https://learn.microsoft.com/en-us/azure/devops/migrate/migration-import?view=azure-devops
According to this reference, the command example to generate DACPAC is as below.
SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;Initial Catalog=Foo;Integrated Security=True" /targetFile:C:\DACPAC\Foo.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory
However, I cannot understand what the Initial Catalog is.
The reference says Initial Catalog - Name of the collection database.
But I could not find the name of the collection database in the Azure DevOps Server management console.
I referred to another article on dev.to:
https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-6-import-2obc
According to this article, Initial Catalog=[COLLECTION_NAME],
and the collection name in my Azure DevOps Server is "DefaultCollection" (the default name).
Then I tried the following command, which failed.
C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130> ./SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;Initial Catalog=DefaultCollection;Integrated Security=True" /targetFile:C:\DefaultCollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory
Connecting to database 'DefaultCollection' on server 'localhost'.
Extracting schema
Extracting schema from database
*** Error extracting database:Could not connect to database server.
(provider: Named Pipes Provider, error: 40
Is this error caused by a wrong Initial Catalog?
How do I find the correct Initial Catalog - Name of the collection database?
Environment and pre-conditions
Windows 10 Pro
SqlPackage.exe installed from SSDT for Visual Studio 2017
The machine where the commands are executed and the machine where Azure DevOps Server is running are the same,
so Data Source=localhost should be correct, I think
Detached my collection by Azure DevOps Server management console
SQL Server Express for my Azure DevOps server is running
Look at the admin console on your app tier. That shows you all of the databases.
For what it's worth, the standard name for the default collection database is Tfs_DefaultCollection. It may be different in your case, but that's a safe bet.
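If you have SQL Server Management Studio handy, you can also list the candidate databases with a query along these lines (the LIKE patterns are just the usual naming conventions, not a guarantee):
-- Collection databases are typically named Tfs_<CollectionName> or AzureDevOps_<CollectionName>
SELECT name
FROM sys.databases
WHERE name LIKE 'Tfs[_]%' OR name LIKE 'AzureDevOps[_]%';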
Resolved.
The database name in this case is "AzureDevOps_DefaultCollection".
It can be found in the Azure DevOps Server management console via the attach feature (Application Tier -> Team Project Collections).
It can also be found by using SQL Server Management Studio.
Also, in my case, Data Source=localhost is wrong; Data Source=<hostname>\SQLEXPRESS is correct.
I noticed this when I connected to my database with SQL Server Management Studio.
Finally, I successfully generated a DACPAC file.
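For completeness, the working command presumably looked something like this (the host name placeholder and output folder are taken from the log below):
SqlPackage.exe /sourceconnectionstring:"Data Source=<hostname>\SQLEXPRESS;Initial Catalog=AzureDevOps_DefaultCollection;Integrated Security=True" /targetFile:C:\DACPAC\DefaultCollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory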
Connecting to database 'AzureDevOps_DefaultCollection' on server '<my machine host name>\SQLEXPRESS'.
Extracting schema
Extracting schema from database
Resolving references in schema model
Validating schema model for data package
Validating schema
Exporting data from database
Exporting data
Processing Export.
Processing Table '[dbo].[tbl_PropertyValue]'.
Processing Table '[Task].[tbl_LibraryItemString]'.
...
...
...
Processing Table '[Search].[tbl_FileMetadataStore]'.
Processing Table '[dbo].[SequenceIds]'.
Successfully extracted database and saved it to file 'C:\DACPAC\DefaultCollection.dacpac'.
Thanks so much! Mario Dietner, Daniel Mann.

Does IIS Metadata Asset Manager v11.5 support SQL Server 2014 Reporting Services (SSRS 2014)?

As per the technote for IIS v11.5, the metadata bridge (Microsoft SQL Server Analysis and Reporting Services (Repository) bridge reference) doesn't support SQL Server 2014.
However, in IIS v11.5.0.1 there is an additional option to select SQL Server 2014.
Could someone please verify whether IIS v11.5.0.1 supports the reporting services of SQL Server 2014 or not?
SQL Server 2014 is supported in 11.5 Fix Pack 2. Please install the latest fix pack to have it supported.

Error code 40 when running SSRS reports from Internet Explorer (run as administrator)

We deployed a VB.Net application on a customer's computer that contains SSRS reports.
The application connects to the SQL Server database in the app without any problems. We installed SQL Server Data Tools so we could deploy the report (.rdl) and data source (.rds) files up to the report server. These deploy without any problems.
In SQL Server Data Tools we can "Preview" the reports without any problems as well.
We do run into a problem when attempting to view the report from Internet Explorer (run as an administrator).
We get the following error:
Cannot create a connection to data source 'DataSourceReports'
(this is the name we used for the TargetDataSourceFolder)
error:40 - Could not open a connection to SQL Server
We also get the same error when the app we deployed runs the reports.
Please let us know what is not set up correctly on the SQL Server side.
A likely possibility is that you are experiencing a double hop authentication problem. It's not clear from your explanation, but is the SQL Server database on a separate server from the report server? If so, then your credentials allow you to connect to the report server but Windows integrated security does not pass those credentials on to the SQL Server database if you are using NTLM on the report server. The report server tries to use Kerberos on your network to authenticate by way of ticketing to the SQL Server database, but you must have this configured correctly on your network. See this article if you want to use Kerberos: http://technet.microsoft.com/en-us/library/ff679930(v=sql.100).aspx.
Another (easier) solution is to open the data source on the report server and change the authentication to use stored credentials. Make sure the credentials you use have read permission on the SQL Server database. The downside of this approach is that you cannot use row-level security in your report by user unless you design your report to capture user information and set up the query or a filter on the dataset to restrict data by user. If that's not a concern, the stored credentials are easy to set up and maintain - and you're going to have to do this anyway if you want to use caching, snapshots, or subscriptions. For more information on stored credentials, see http://msdn.microsoft.com/en-us/library/ms159736.aspx.

SQL Server Linked Server error

I am using SQL Server 2008 on Windows Server 2003. I want to use a linked server to open a Visual FoxPro DBF file, using the Microsoft OLE DB Provider for Visual FoxPro from the SQL Server 2008 linked server feature.
When I establish the linked server connection using the Microsoft OLE DB Provider for Visual FoxPro to open a FoxPro DBF file, I get the following error. Any ideas what is wrong?
Cannot retrieve required data from this request (Microsoft.SqlServer.Management.Sdk.Sfc)
An exception occurred while executing a Transact-SQL statement or batch (Microsoft.SqlServer.ConnectionInfo)
Error from the OLE DB provider "VFPOLEDB" for linked server "DBFServer": access is denied.
Cannot obtain the required interface "IID_IDBSchemaRowset" from OLE DB provider "VFPOLEDB" for linked server "DBFServer"
(Microsoft SQL Server error 7399)
regards,
George
Check 'Allow Inprocess' on the VFPOLEDB provider:
See Cindy Winegarden's answer on http://social.msdn.microsoft.com/forums/en-US/sqlreportingservices/thread/e54d20dd-b65b-4cff-9349-6499e6e069e2 for how to do it.
Edit: this is the relevant part of the answer:
Here's what Stephanie posted to her thread in the microsoft.public.data.oledb NNTP newsgroup on April 10:
"Finally, I found an option 'Allow inprocess' in linked server -> providers -> VFPOLEDB in MSSQL2005. With this option enabled, I can connect to VFP with OLE DB. While disabling this option, it works only 50% with successful connection....
And I have another MSSQL2005 with the same setup (except the option 'allow inprocess' disabled), all the connections made to VFP are 100% successful... "
I looked and found that, as she said, it is under Server Objects > Linked Servers > Providers > VFPOLEDB > General tab > Provider options > Allow inprocess. Also, you can change the InProcess setting with the following code:
USE [master]
GO
EXEC master.dbo.sp_MSset_oledb_prop N'VFPOLEDB', N'AllowInProcess', 1
GO
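Once Allow InProcess is enabled, a minimal sketch of creating and querying the linked server could look like this (the folder path and table name are examples; substitute your own DBF location):
-- Point the linked server at a folder of free DBF tables (example path)
EXEC master.dbo.sp_addlinkedserver
    @server = N'DBFServer',
    @srvproduct = N'Visual FoxPro',
    @provider = N'VFPOLEDB',
    @datasrc = N'C:\Data\FoxProTables';
GO
-- Test the provider by querying one of the DBF tables (example table name)
SELECT * FROM OPENQUERY(DBFServer, 'SELECT * FROM customers');
GO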