Unable to configure TFS backup using Backup wizard - sql-server-2008-r2

When trying to configure the TFS 2010 backup using the TFS Power Tools, I kept running into the following error message:
Account TFS\tfsadmin failed to create backups using path \\tfs-xxxxxxx.local\TFSBackups
The strange thing is that TFS\TFSAdmin has full permissions on both the share and the file system, and the share path doesn't contain any spaces (thanks to the MSDN forums for pointing that out).
I tried backing up through SQL Server Management Studio, and sure enough, the backups failed there too.

It turns out that while the backup job is started using the account specified in the Create Backup Wizard of the TFS Power Tools, SQL Server will try to write the files to the share using its own service account.
So in addition to whoever needs access to the share, you need to add the service account running SQL Server to that share as well. In this case it was running under NETWORK SERVICE, so adding MACHINENAME$ to the share's list of permitted users did wonders.
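As a rough sketch, assuming the share is hosted on a file server with the SmbShare module available (Windows Server 2012 or later; on 2008 R2 the same grants can be made through the share's Properties dialog), that the machine running SQL Server has the hypothetical name TFSSQL, and that D:\TFSBackups is the folder backing the share:

    # Run on the file server hosting the TFSBackups share.
    # Because SQL Server runs as NETWORK SERVICE, it authenticates over
    # the network as the machine account TFS\TFSSQL$ (hypothetical name).
    Grant-SmbShareAccess -Name 'TFSBackups' -AccountName 'TFS\TFSSQL$' -AccessRight Full -Force

    # The NTFS permissions on the underlying folder must allow the
    # account as well; Modify is enough to write backup files.
    icacls 'D:\TFSBackups' /grant 'TFS\TFSSQL$:(OI)(CI)M'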

Related

How to take backup of Tableau Server Repository (PostgreSQL)

We are using version 2018.3 of Tableau Server. Server stats such as user logins are logged into the PostgreSQL DB, and they are cleared regularly after one week.
Is there any API available in Tableau to connect to the DB and back up the data somewhere such as HDFS or any other location on a Linux server?
Kindly let me know if there is any way other than an API as well.
Thanks.
You can enable access to the underlying PostgreSQL repository database with the tsm command. Here is a link to the documentation for your (older) version of Tableau:
https://help.tableau.com/v2018.3/server/en-us/cli_data-access.htm#repository-access-enable
It would be good security practice to limit access to only the machines that need it (whitelisted), to create or use an existing read-only account to access the repository, and ideally to disable access when your admin tasks are complete (i.e., enable access, do your query, disable access).
This way you can have any SQL client code you wish query the repository, create a mirror, create reports, run auditing procedures - whatever you like.
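As a minimal sketch of that enable/query/disable cycle, run from PowerShell on the Tableau Server node ('readonly' is the built-in read-only repository user, 'workgroup' is the repository database, and 8060 is its default port; the host name, password, and query are hypothetical):

    # Open up repository access for the built-in read-only user.
    tsm data-access repository-access enable --repository-username readonly --repository-password 'S3cret!'

    # Query the repository with any PostgreSQL client, e.g. psql, and
    # export a table to CSV for archiving elsewhere (HDFS, etc.).
    $env:PGPASSWORD = 'S3cret!'
    psql -h tableau.example.com -p 8060 -U readonly -d workgroup `
        -c "\copy (SELECT * FROM historical_events) TO 'historical_events.csv' CSV HEADER"

    # Close access again once the export is done.
    tsm data-access repository-access disable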
Personally, before writing significant custom code, I'd first see if the info you want is already available another way: in one of the built-in admin views, via the REST API, using the freely available LogShark or TabMon tools, with the Server Management Add-on (for more recent versions of Tableau), or possibly via the new Data Catalog.
I know at least one server admin who somehow clones the whole Postgres repository database periodically so he can analyze stats offline. Not sure what approach he uses to clone. So you have several options.

Microsoft SSIS Package running successfully manually from Visual Studio and SSMS Integration Services Catalog, but not via SQL Server Agent

I have a complex SSIS package that detects the file extension of each file in a folder and loads the file into a SQL Server table. A For Each Loop Container iterates over the files in this folder location and loads each one into the table.
After each file is loaded into the SQL Server table, the package runs a File System Task in the Control Flow; this task first creates an archive folder and then moves each file into that archive folder.
I am using Environment Variables in the SSMS Integration Services Catalog, to map to the parameters in the SSIS package/project.
The entire process succeeds when I run the SSIS package manually from the SSMS Integration Services Catalog, but when I run it via a SQL Server Agent job, the data loading and the File System Task's folder creation succeed, while the File System Task's file move does not. (The Agent job step is run as the SQL Server Agent Service Account.)
I get the following error when I see the execution report in the Integration Services Catalog in the SSMS:
File System Task - Move Files:Error: An error occurred with the following error message: "Access to the path is denied."
While the SQL Server Agent is able to create a folder using a File System Task successfully, it is not able to move the file into this new folder location.
Inside the SQL Server Agent History, I see this in the job step:
Execute as user: NT Service\SQLSERVERAGENT. Microsoft(R) SQL Server Execute Package Utility Version 14.0.2002.14 for 64-bit.
... Package execution on IS Server failed. Execution ID: 30449, Execution Status: 4.
I am not familiar with this kind of permission issue in SQL Server Agent. I read about proxy settings and the like, but was not able to make sense of them.
Is there a step-by-step solution you can provide me to fix this issue?
The SQL Agent job executes the package using the SQL Agent service account. When you run the package manually, the package executes using the credentials you used to sign in. Most likely the SQL Agent service account does not have enough access to the directory, especially if it has just been created. Make sure the service account has "Full Control" of the directory the package is referencing. To test whether it is an access issue, log on to the server using the service account credentials and manually run the package from the SSIS catalog. If it fails for the same reason, you know you need to look at file system access for the service account.
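As a sketch of both routes (the archive path, domain account, and password below are hypothetical; NT Service\SQLSERVERAGENT is the account shown in your job history):

    # Route 1: grant the SQL Agent service account Full Control of the
    # folder tree; (OI)(CI) makes the grant inherit to new subfolders
    # such as the archive folders the package creates.
    icacls 'D:\Import\Archive' /grant 'NT Service\SQLSERVERAGENT:(OI)(CI)F'

    # Route 2: the proxy approach you read about - run the job step
    # under a dedicated Windows account instead of the service account.
    sqlcmd -S . -Q "
    CREATE CREDENTIAL SsisFileCredential WITH IDENTITY = N'DOMAIN\ssis_runner', SECRET = N'<password>';
    EXEC msdb.dbo.sp_add_proxy @proxy_name = N'SsisFileProxy', @credential_name = N'SsisFileCredential', @enabled = 1;
    EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SsisFileProxy', @subsystem_name = N'SSIS';
    "
    # Then select SsisFileProxy in the job step's 'Run as' drop-down.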

SQL Server 2012 - SSAS Deployment Failed: File System Error, Access is Denied

The context is OLAP cube development. After configuring my project through SQL Server Data Tools (SSDT, the new BIDS), I am unable to deploy the project.
Every time the deployment process is started I get an error like the one below:
File system error: The following error occurred while opening the file '\\?\D:\[...]\database\mssql\tmpdb\MDTempStore_1864_9_no8wd.tmp': Access is denied.
(The [...] denotes a part of the path I omitted for brevity.)
I always get the same error, indicating that some .tmp file could not be accessed.
My environment:
OS: Windows Server 2008 R2 Standard, SP1
SQL Server: SQL Server 2012 (v11.0.2100.60), running on localhost
What I tried:
I have the file system access rights for the folder in question (at some point I even tried with admin privileges on the machine, which didn't help)
I tried deactivating the anti-virus in case it was performing on-access scanning (still didn't help)
Attempts to deploy/process individual dimensions cause the same problem
Deploying dimensions or cubes programmatically through SMO (instead of SSDT) runs into the same problem
Deploying DataSource objects as well as DataSourceView objects works fine
Maybe some of you have faced similar issues or have further suggestions/ideas?
Thanks for your help!
So, I finally figured it out.
As expected, it was a permission issue, but although the error message hints at missing file system permissions, the cause of the problem was the user I had configured the Data Source with.
The SQL User I specified was given the roles
db_datareader
db_datawriter
db_ddladmin
on the source database, but this doesn't seem to be enough. When I gave that user the server role sysadmin, it started working.
This is probably overkill; one could fine-tune the role assignment further, but for now it works this way.
Just a suggestion here - have you tried running SSDT as an administrator? That is, right-click on SSDT and click Run As Administrator. Then try to deploy your project. It definitely sounds like a permissions issue.
The exact reason is that the SSAS service user does not have access to the folders specified in the SSAS configuration (here, the error states it is the Temp folder). I think it is not directly related to SQL Server, because it is just a file access error; it is thrown before the request ever reaches SQL Server.
Give the SSAS service user full permissions on those folders, as sketched below.
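As a sketch (the folder paths are hypothetical; the real ones are the DataDir, TempDir, and LogDir values in the instance's msmdsrv.ini, and NT Service\MSSQLServerOLAPService is the default account for a default SSAS 2012 instance):

    # Grant the SSAS service account full control of its data folders.
    $ssasAccount = 'NT Service\MSSQLServerOLAPService'
    foreach ($dir in 'D:\OLAP\Data', 'D:\OLAP\Temp', 'D:\OLAP\Log') {
        icacls $dir /grant "${ssasAccount}:(OI)(CI)F"
    }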

Error code 40 when running SSRS reports from Internet Explorer (run as administrator)

We deployed a VB.Net application on a customer's computer that contains SSRS reports.
The application connects to its SQL Server database without any problems. We installed SQL Server Data Tools so we could deploy the report (rdl) and data source (rds) files up to the report server. These deploy without any problems.
In SQL Server Data Tools we can "Preview" the reports without any problems as well.
We do run into a problem when attempting to view the report from Internet Explorer (run as an administrator).
We get the following error:
Cannot create a connection to data source 'DataSourceReports'
(this is the name we used for the TargetDataSourceFolder)
error:40 - Could not open a connection to SQL Server
We also get the same error when the app we deployed runs the reports.
Please let us know what is not set up correctly on the SQL Server side.
A likely possibility is that you are experiencing a double hop authentication problem. It's not clear from your explanation, but is the SQL Server database on a separate server from the report server? If so, then your credentials allow you to connect to the report server but Windows integrated security does not pass those credentials on to the SQL Server database if you are using NTLM on the report server. The report server tries to use Kerberos on your network to authenticate by way of ticketing to the SQL Server database, but you must have this configured correctly on your network. See this article if you want to use Kerberos: http://technet.microsoft.com/en-us/library/ff679930(v=sql.100).aspx.
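If you pursue the Kerberos route, the usual first step is registering SPNs for the SQL Server service account and then enabling delegation for the report server's account in Active Directory; the host and account names below are hypothetical:

    # Register SPNs for the account SQL Server runs under; -S refuses
    # to add a duplicate SPN.
    setspn -S MSSQLSvc/sqlhost.example.com:1433 DOMAIN\sqlservice
    setspn -S MSSQLSvc/sqlhost:1433 DOMAIN\sqlservice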
Another (easier) solution is to open the data source on the report server and change the authentication to use stored credentials. Make sure the credentials you use have read permission on the SQL Server database. The downside of this approach is that you cannot use row-level security in your report by user unless you design your report to capture user information and set up the query or a filter on the dataset to restrict data by user. If that's not a concern, the stored credentials are easy to set up and maintain - and you're going to have to do this anyway if you want to use caching, snapshots, or subscriptions. For more information on stored credentials, see http://msdn.microsoft.com/en-us/library/ms159736.aspx.
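Stored credentials can be set in Report Manager, or scripted against the ReportingService2010 SOAP endpoint; in this sketch the report server URL, item path, and account are hypothetical:

    # Switch the shared data source to stored credentials.
    $rs = New-WebServiceProxy -Uri 'http://reportserver/ReportServer/ReportService2010.asmx?wsdl' -UseDefaultCredential
    $path = '/DataSourceReports/DataSourceReports'   # hypothetical item path
    $def = $rs.GetDataSourceContents($path)
    $def.CredentialRetrieval = 'Store'               # store credentials on the report server
    $def.WindowsCredentials = $true                  # treat them as a Windows account
    $def.UserName = 'DOMAIN\report_reader'           # hypothetical account with read permission on the database
    $def.Password = '<password>'
    $rs.SetDataSourceContents($path, $def)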

Restore sharepoint 2010 web application on different domain

We made a backup of a web application through Central Administration in order to move it to a different server on a different domain (the destination server is actually a domain controller).
We then ran a restore operation on the destination server from Central Administration, but never managed to get it to succeed.
We get errors like: Object failed in event OnRestore. For more information, see the spbackup.log or sprestore.log file located in the backup directory. SPException: The specified user or domain group was not found.
I tried every user account possible with no success. Any clues?
Two things:
Did you try the "New Configuration" option while restoring? I believe the problem is related to the users/groups added to the site; those users do not exist in the new environment!
Also, can you try restoring using PowerShell with the -Force switch parameter and see if that is successful?
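A sketch of that PowerShell route, run from the SharePoint 2010 Management Shell on the destination server (the backup directory and item name are hypothetical):

    # First list the backup's contents to find the right -Item value.
    Restore-SPFarm -Directory '\\server\Backups' -ShowTree

    # Restore the web application as a new configuration (-RestoreMethod
    # New), with the -Force switch as suggested above.
    Restore-SPFarm -Directory '\\server\Backups' -Item 'SharePoint - 80' -RestoreMethod New -Force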