Terraform-provisioned private AKS cluster unable to deploy application from Azure Pipelines - azure-devops

We are trying to deploy an application to a private AKS cluster provisioned with Terraform from Azure DevOps; when we try to deploy a Helm chart or access the cluster, we get an error.

As you did not provide much information, I will do my best to help you:
It seems that the user or service principal running the pipeline has subscription-level permissions to create the AKS cluster, but not enough permissions to create anything inside Kubernetes.
You can leverage Kubernetes RBAC together with Azure AD and Azure RBAC. With Terraform you can specify admin_group_object_ids inside the azure_active_directory_role_based_access_control block; assign an admin group there and add the pipeline user / service principal to that group.
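A minimal Terraform sketch, assuming the azurerm provider; the resource names and the admin-group variable are placeholders, and on azurerm v3 you may also need managed = true inside the RBAC block:

```hcl
resource "azurerm_kubernetes_cluster" "aks" {
  name                    = "private-aks"
  location                = azurerm_resource_group.rg.location
  resource_group_name     = azurerm_resource_group.rg.name
  dns_prefix              = "privateaks"
  private_cluster_enabled = true

  default_node_pool {
    name       = "default"
    node_count = 2
    vm_size    = "Standard_D2s_v3"
  }

  identity {
    type = "SystemAssigned"
  }

  azure_active_directory_role_based_access_control {
    azure_rbac_enabled = true
    # AAD group whose members become cluster admins; put the
    # pipeline user / service principal in this group.
    admin_group_object_ids = [var.aks_admin_group_object_id]
  }
}
```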
Alternatively, you can use Azure built-in roles such as the Azure Kubernetes Service Cluster Admin Role and assign your user / SP to them.
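A sketch of that built-in-role approach; var.pipeline_sp_object_id is a placeholder for your pipeline identity's object ID:

```hcl
# Grant the pipeline's identity the built-in admin role on the cluster.
resource "azurerm_role_assignment" "pipeline_aks_admin" {
  scope                = azurerm_kubernetes_cluster.aks.id
  role_definition_name = "Azure Kubernetes Service Cluster Admin Role"
  principal_id         = var.pipeline_sp_object_id
}
```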

Related

Azure DevOps environment with private EKS cluster

I am currently using an EKS private cluster with a public API server endpoint in order to use Azure DevOps environments (with a Kubernetes service connection).
I have a requirement to make everything private in EKS.
Once the EKS cluster becomes private, everything in Azure DevOps breaks because it can no longer reach the API server.
Any suggestion on how to let Azure DevOps communicate with the private Kubernetes API server would be appreciated.
If you're trying to target the cluster for deployment, you need a self-hosted agent that has a network route to your cluster (see the sketch after this answer).
The other capabilities exposed by the environment feature of Azure DevOps (i.e. monitoring the state of the cluster via the environment view) will not work; they require a public-facing Kubernetes API.
If you don't mind the additional cost, a VPN can be used to establish a connection to the private EKS cluster.
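A hypothetical Terraform sketch of the self-hosted agent approach: a small EC2 instance inside the EKS VPC that registers itself as an Azure DevOps agent. The AMI ID, subnet reference, agent version and download URL, organization URL, pool name, and var.ado_pat are all placeholders to replace:

```hcl
resource "aws_instance" "ado_agent" {
  ami           = "ami-0123456789abcdef0"   # placeholder: any Linux AMI
  instance_type = "t3.small"
  subnet_id     = aws_subnet.eks_private.id # placeholder: subnet in the EKS VPC

  user_data = <<-EOF
    #!/bin/bash
    set -e
    mkdir -p /opt/ado-agent && cd /opt/ado-agent
    # Agent version/URL are placeholders; check the "New agent" dialog
    # in Azure DevOps for the current download link.
    curl -fsSL -o agent.tar.gz \
      https://vstsagentpackage.azureedge.net/agent/3.236.1/vsts-agent-linux-x64-3.236.1.tar.gz
    tar -xzf agent.tar.gz
    export AGENT_ALLOW_RUNASROOT=1
    ./config.sh --unattended \
      --url https://dev.azure.com/YOUR_ORG \
      --auth pat --token "${var.ado_pat}" \
      --pool self-hosted-eks
    ./svc.sh install && ./svc.sh start
  EOF
}
```

Pipeline jobs that target the private cluster then run on the self-hosted-eks pool instead of a Microsoft-hosted agent.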

How to create an Azure DevOps Kubernetes service connection to access a private AKS cluster?

Creating a service connection to access a non-private AKS cluster is straightforward, but if I want to create a service connection for a private AKS cluster, is that possible from Azure DevOps?
You can create a new Kubernetes service connection using the KubeConfig option and click the dropdown arrow to choose Save without verification (a Terraform sketch follows below).
Also see Deploying to Private AKS Cluster
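If you would rather script this than click through the portal, the azuredevops Terraform provider can create the same kind of endpoint and skips the UI's verification step. A minimal sketch assuming a kubeconfig file on disk; the project reference, endpoint name, API server URL, and file path are placeholders:

```hcl
resource "azuredevops_serviceendpoint_kubernetes" "private_aks" {
  project_id            = azuredevops_project.example.id
  service_endpoint_name = "private-aks-kubeconfig"
  apiserver_url         = "https://my-aks.privatelink.westeurope.azmk8s.io"
  authorization_type    = "Kubeconfig"

  kubeconfig {
    # Kubeconfig exported with: az aks get-credentials --admin ...
    kube_config            = file("${path.module}/kubeconfig")
    accept_untrusted_certs = true
  }
}
```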
Please use the link below:
https://techcommunity.microsoft.com/t5/fasttrack-for-azure/using-azure-devops-to-deploy-an-application-on-aks-private/ba-p/2029630
I have implemented this solution myself. We had a private AKS cluster and were unable to make a service connection from Azure DevOps to Kubernetes,
so we created a self-hosted Linux agent in the subnet where the Kubernetes cluster is and used that agent to run the build and release pipelines.

Spinnaker ECS account won't select

I created an ECS account linked to an AWS account and enabled ECS. Now, when creating a new server group and selecting ECS, the ECS account name appears under the account section, but it won't let me select it.
If you are using Spinnaker < 1.19.x, then the AWS ECS provider depends on the AWS EC2 provider and the AWS IAM structure.
Please read the AWS Providers Overview to understand the AWS IAM structure that is required (an AWS managing account and AWS managed accounts through the AssumeRole action).
Then you can set up an AWS EC2 provider following this easy getting-started guide by Armory.
Finally, set up the AWS ECS provider with the legacy instructions found at spinnaker.io.
If you are using Spinnaker > 1.19.x, then you must use AWS ECS service-linked roles.
One very important step is tagging the AWS VPC subnets so that Spinnaker can access them (see the sketch below).
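A minimal Terraform sketch of that tagging, assuming the immutable_metadata tag convention from the Spinnaker AWS docs; the VPC reference, CIDR, and purpose/target values are illustrative:

```hcl
resource "aws_subnet" "spinnaker_external" {
  vpc_id     = aws_vpc.main.id # placeholder VPC
  cidr_block = "10.0.1.0/24"

  tags = {
    # Spinnaker discovers subnets via this tag; adjust purpose/target
    # to match your own naming scheme.
    immutable_metadata = jsonencode({
      purpose = "external"
      target  = "elb"
    })
  }
}
```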

Connecting to AKS API (via HTTP) from an Azure VM

I would like to connect to the AKS API from a script on an Azure VM in order to scrape some metrics, check some stats, etc. of the cluster.
Is there any approach (like an Azure policy or role attached to the VM) other than creating a user in Azure AD, or a service account in AKS with a ClusterRole bound to it and referencing the certs/tokens from the VM?
Thank you

How to automate Azure DevOps Kubernetes Service Connection to Cluster?

To deploy services via Azure DevOps to my Kubernetes cluster, I need to create a Kubernetes service connection manually. I want to automate this by creating the service connection dynamically in Azure DevOps, so I can delete and recreate the cluster and deployment. Is this possible? How can I do this?
You can create the service endpoint using the Azure DevOps API (a Terraform sketch follows below).
Check this out for API details.
This might be related.
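The azuredevops Terraform provider wraps that REST API, so one way to create the connection dynamically is declarative. A minimal sketch for an AKS target using AzureSubscription authorization; every ID, name, and URL here is a placeholder:

```hcl
resource "azuredevops_serviceendpoint_kubernetes" "aks" {
  project_id            = azuredevops_project.example.id
  service_endpoint_name = "aks-dynamic"
  apiserver_url         = "https://my-aks.hcp.westeurope.azmk8s.io"
  authorization_type    = "AzureSubscription"

  azure_subscription {
    subscription_id   = var.subscription_id
    subscription_name = "My Subscription"
    tenant_id         = var.tenant_id
    resourcegroup_id  = "my-aks-rg" # the resource group *name*, despite the field name
    namespace         = "default"
    cluster_name      = "my-aks"
  }
}
```

Applying this after cluster creation (and destroying it before teardown) keeps the service connection in lockstep with the cluster's lifecycle.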