Do Azure Analysis Services roles conflict with roles in the source database? - postgresql

I'm migrating an on-premises application (basically a database + a dashboard) to Azure. In my on-premises database, which is Postgres, I had set up some access rules and RLS (Row-Level Security) for different user profiles; those were done at the database level.
Now, on the Azure platform, I have to add an intermediate layer, Azure Analysis Services, between my Postgres database and the dashboard. I want to know how to keep my database access management and RLS in place with the addition of Azure Analysis Services.
Do I need to replicate it in Analysis Services, or will it still work without me doing anything?
Thanks

As I understand it, in the initial design you did not have AAS and the RLS was at the database level. Once you moved to Azure you introduced AAS (maybe to boost performance); to me it should still work fine. You will, however, have to think about how you will refresh the data from the database into AAS, and for that you may have to add a user.
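For that refresh connection, a dedicated read-only account on the Postgres side is a reasonable approach. A minimal sketch, with placeholder names (the database, schema, role name and password are all made up; adjust to your environment):
-- All names and the password below are placeholders.
CREATE ROLE aas_refresh WITH LOGIN PASSWORD 'replace-with-a-strong-secret';
GRANT CONNECT ON DATABASE analytics TO aas_refresh;
GRANT USAGE ON SCHEMA reporting TO aas_refresh;
GRANT SELECT ON ALL TABLES IN SCHEMA reporting TO aas_refresh;
-- Keep tables created later readable as well.
ALTER DEFAULT PRIVILEGES IN SCHEMA reporting GRANT SELECT ON TABLES TO aas_refresh;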

Related

Postgres with Azure Active Directory Authentication

In our organization, we have common credentials for accessing the Postgres databases, which every developer knows because they are hardcoded in the application's connection string. Because of this, whenever a DML/DDL change happens on the databases it is hard for us to trace, as the developers make changes on their own. We can't have individual logins for each developer, which would be tedious to manage.
Note: we also can't ensure that the credentials won't be shared with peer developers.
To get rid of this, we thought of integrating Postgres with Azure Active Directory for authentication.
If we can map Azure AD groups/users to Postgres, security will be tightened and the maintenance overhead will also be reduced.
But I couldn't find an article on how to implement this, since most articles describe the integration of Azure-managed PostgreSQL with Azure AD, not Postgres running on VMs.
Can anyone guide me or share a detailed article on implementing Azure AD integration for Postgres running on a VM (IaaS)?
In the Azure portal, go to the PostgreSQL server, select Authentication, and set an Active Directory admin.
You can specify an Azure AD group instead of an individual user to have multiple administrators.
Connecting to PostgreSQL:
1. Log in to your Azure subscription.
2. Get an access token for the PostgreSQL server using the command below:
az account get-access-token --resource https://ossrdbms-aad.database.windows.net
3. Use that token as the password when logging in to the PostgreSQL server.
Creating a user:
CREATE USER "user1@yourtenant.onmicrosoft.com" IN ROLE azure_ad_user;
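As I understand it, the same mechanism also works for an Azure AD group if you create the role using the group's display name, which is what you are after. A sketch with placeholder names ('Dev DB Readers', the database and the schema are made up; the group name must match the Azure AD display name exactly):
-- 'Dev DB Readers', mydb and the schema are placeholders.
CREATE USER "Dev DB Readers" IN ROLE azure_ad_user;
GRANT CONNECT ON DATABASE mydb TO "Dev DB Readers";
GRANT USAGE ON SCHEMA public TO "Dev DB Readers";
GRANT SELECT ON ALL TABLES IN SCHEMA public TO "Dev DB Readers";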
Token validation:
Token is signed by Azure AD and has not been tampered with
Token was issued by Azure AD for the tenant associated with the server
Token has not expired
Token is for the Azure Database for PostgreSQL resource (and not another Azure resource)
Reference Link: Use Azure Active Directory - Azure Database for PostgreSQL - Single Server | Microsoft Learn
Using Azure Active Directory is a great idea for the reasons you specified, but unfortunately there's no native support for connecting to Azure Active Directory from a local Postgres database (which is essentially what you have with Postgres in a VM). It can be done through the LDAP protocol, however.
FULL DISCLOSURE: I haven't actually done this part myself (or used the steps in the tutorial link), but this is my understanding from working with system operators. Use LDAP to connect to Azure AD, then configure Postgres to authenticate via LDAP. More information on LDAP authentication in Postgres can be found here.
Bhavani's answer is about Azure Database for PostgreSQL, which is an Azure-native database service. That part I have used and I highly recommend it; you get Azure AD integration and can manage the database's performance and connectivity specifically, without also having to manage VM performance. Note that their steps show the Flexible Server experience while the reference link says 'Single Server'; I recommend Flexible Server.

Queries on understanding the necessary roles required for migrating Azure DevOps Server to Services using the Data Migration Tool

This relates to the documentation that is available at the link below.
https://learn.microsoft.com/en-us/azure/devops/migrate/migration-overview?view=azure-devops
My question is: what is the minimum role a user needs in order to complete the migration successfully and without any permission issues?
For example, what roles and permissions does the user need on both Azure DevOps Server and Azure DevOps Services?
According to the Data Migration Utility Guide, the user who uses this tool must possess the following:
SQL Server's TFSEXECROLE role, and
Access rights to the TFS collection and configuration databases.
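For context, my understanding is that granting that role per database would look roughly like the T-SQL below (the database names and the account are placeholders I made up, and the Windows login is assumed to already exist on the SQL Server instance):
-- Placeholders: adjust the database names and the Windows account.
USE [Tfs_DefaultCollection];
CREATE USER [CONTOSO\MigrationUser] FOR LOGIN [CONTOSO\MigrationUser];
ALTER ROLE TFSEXECROLE ADD MEMBER [CONTOSO\MigrationUser];
-- Repeat for the configuration database.
USE [Tfs_Configuration];
CREATE USER [CONTOSO\MigrationUser] FOR LOGIN [CONTOSO\MigrationUser];
ALTER ROLE TFSEXECROLE ADD MEMBER [CONTOSO\MigrationUser];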
My understanding, for example:
Azure DevOps Server: if we add the user to the Team Foundation Administrators group on the Azure DevOps Server, does that fulfill the role requirement?
Azure DevOps Services: if we assign the same user who performs the migration the "Azure DevOps Administrator" role on Azure DevOps Services, does that fulfill the requirement?
Also, it would be useful if you could specify the maximum size of the DACPAC backup file that the Data Migration Tool supports (i.e. the maximum size of the project collection backup) for the migration to go through properly.
What permissions does the same user that runs the Data Migration Tool need in SQL Server to run the SqlPackage.exe command?
Thanks in advance for the help; it would help us understand how to make better use of the Data Migration Tool.
Many thanks!
Best regards

Create Service Principal Connection from Crystal Reports to Azure Synapse Analytics

I have data held in an Azure Data Lake Gen 2 storage container. I would like to provision this data for an existing report authored in Crystal Reports using SQL on demand.
During development I used my own Azure AD login via an ODBC connection on my local machine. I have access to the Synapse environment and also to the data lake. This worked successfully and, although slow, pulled all the information required.
To deploy this solution correctly I need to remove my AAD credentials and use a provisioned service principal. I have given the service principal read access to the data lake and also added the principal to the SQL database. Now I am stuck on how to use the principal to connect from Crystal Reports.
I have tried the same authentication type as with my AAD login, but now I am using a client ID, not an email address. So when the system prompts for connection details it wants you to sign in and does not accept the client ID.
Does anyone have any suggestions on how to make this connection from Crystal Reports, either this way or any other way?
Also: my org does not want this user or app registration to have unrestricted permissions, so adding them to the "Synapse Administrator" RBAC role won't work.
Thanks
Tom
Found a way around this.
Create a service account user in the Azure portal. Then head to Synapse Analytics and open a blank SQL script to give the user minimal permissions.
-- In the master database:
USE [master]
CREATE LOGIN [serviceaccountsynapseuser@company.onmicrosoft.com] FROM EXTERNAL PROVIDER
GRANT CONNECT ANY DATABASE TO [serviceaccountsynapseuser@company.onmicrosoft.com]
GRANT SELECT ALL USER SECURABLES TO [serviceaccountsynapseuser@company.onmicrosoft.com]

-- In the [Reporting] database (serverless SQL DB):
USE [Reporting]
CREATE USER [serviceaccountsynapseuser@company.onmicrosoft.com] FROM EXTERNAL PROVIDER
ALTER ROLE db_datareader ADD MEMBER [serviceaccountsynapseuser@company.onmicrosoft.com]
Finally, head to the storage account and give the user the Storage Blob Data Reader role.
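As an aside, if you later want to use the app registration itself instead of a service account, my understanding is that a service principal can be mapped the same way by its display name (a sketch; 'my-crystal-reports-app' is a placeholder, and I have not verified this end to end with Crystal Reports):
-- Run against the [Reporting] serverless database; the name is the app registration's display name.
CREATE USER [my-crystal-reports-app] FROM EXTERNAL PROVIDER
ALTER ROLE db_datareader ADD MEMBER [my-crystal-reports-app]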

EF Code First Migration, performance issues when not using admin account in SQL Azure

We've built a web tool (C# Web API) to administer the migration of SQL Azure databases with Entity Framework Code First Migrations.
By default, when we create databases we also create logins and user accounts per database.
These accounts get db_datareader and db_datawriter permissions.
We use these accounts from the web app to connect to the database to get current migrations and if there are any pending migrations, apply them.
For some reason, this operation takes about 10 seconds every time (without applying any updates).
If we instead use the admin account (the one associated with the SQL server when it was set up in Azure), the time drops to less than a second.
I've come to the conclusion that there must be some kind of permission issue that causes the drop in performance.
I've added the db_ddladmin role as explained here, without any success.
We use the DbMigrator class in Entity Framework Migrations to get pending migrations.
After some more investigation I found that the solution to the problem is to create SQL users in the master database as well.
Previously we only created logins and a corresponding user in the application database.
According to this post, SQL Azure doesn't support a default database, so the connection defaults to the master database, where I didn't have any rights.
Adding the user to the master database solved the "performance" issue.
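A minimal sketch of what that looks like, with placeholder names (the login name is made up; use a real strong password):
-- Connected to the master database as the server admin:
CREATE LOGIN MigrationsApp WITH PASSWORD = '<strong password>';
CREATE USER MigrationsApp FOR LOGIN MigrationsApp; -- the extra user in master, so the initial connection has somewhere to land
-- Connected to the application database:
CREATE USER MigrationsApp FOR LOGIN MigrationsApp;
ALTER ROLE db_datareader ADD MEMBER MigrationsApp;
ALTER ROLE db_datawriter ADD MEMBER MigrationsApp;
ALTER ROLE db_ddladmin ADD MEMBER MigrationsApp; -- only if this account also applies migrations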

Connect to backend of VSO

Is there a way to get the server info of my VSO account and access it using SQL Server?
I've tried logging in using the URL
{account}.visualstudio.com
but I got a "server not found" error.
No, the back-end databases are SQL Azure instances, different from the on-premises TFS databases. I cannot see MS ever giving you access to the database - maybe the data, but not the database.
You can only use the API (old and new REST) and Power BI tools to perform queries.
If you have a specific problem you are trying to solve, post it as a new question because it may be possible without database access.