Transfer tool for unmanaged users GCP - google-workspace

Is there any way to export the list of consumer accounts from the Google Cloud admin console (Transfer tool for unmanaged users), or do I have to copy/paste?
I was looking for a way to get: user email // Request status // Last sent // Request already sent.
Thanks a lot.

If you want to export all the accounts in the project you can do the following:
gcloud projects get-iam-policy PROJECT-ID
To get all the accounts:
gcloud projects get-iam-policy PROJECT_ID | awk -F ":" '{print $2}' | grep '@'
To get the ones that are not service accounts (not managed):
gcloud projects get-iam-policy PROJECT_ID | awk -F ":" '{print $2}' | grep '@' | grep --invert-match "serviceaccount.com"
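As a side note, gcloud can do the filtering itself with its --flatten and --format flags, which avoids the awk/grep pipeline; this is a sketch based on gcloud's generic output-formatting options, not on anything specific to the transfer tool:
# sketch: list non-service-account members using gcloud's own output formatting
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --format="value(bindings.members)" \
  | grep --invert-match "serviceaccount.com" | sort -u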

There's no API for the transfer tool for unmanaged users, and there's no option to export the list from the UI.

Related

Is there a way to batch archive GitHub repositories based off of a search?

From the answer to a related question I know it's possible to batch clone repositories based on a GitHub search result:
# cheating knowing we currently have 9 pages
for i in {1..9}
do
curl "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
| jq -r '.items[].ssh_url' >> urls.txt
done
cat urls.txt | xargs -P8 -L1 git clone
I also know that the Hub client allows me to make API calls.
hub api [-it] [-X METHOD] [-H HEADER] [--cache TTL] ENDPOINT [-F FIELD|--input FILE]
I guess the last step is, how do I archive a repository with Hub?
You can update a repository using the Update a Repository API call.
I put all my repositories in a TMP variable (a sample of it is shown below) and ran the following:
echo $TMP | xargs -P8 -L1 hub api -X PATCH -F archived=true
Here is a sample of what the $TMP variable looked like:
echo $TMP
/repos/amingilani/9bot
/repos/amingilani/advent-of-code-2019
/repos/amingilani/alan
/repos/amingilani/annotate_models
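The answer doesn't show how $TMP was populated. One way to build it from the same search used in the question is sketched below; deriving the /repos/owner/name endpoints from each result's full_name field is my assumption, chosen to match the sample above:
# hypothetical sketch: build $TMP from the search results, then archive each repo
TMP=$(for i in {1..9}; do
  curl -s "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
    | jq -r '.items[].full_name'
done | sed 's|^|/repos/|')
echo "$TMP" | xargs -P8 -L1 hub api -X PATCH -F archived=true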

SumoLogic dashboards - how do I automate?

I am getting some experience with SumoLogic dashboards and alerting, and I would like to have as much of the configuration as possible in code. Does anyone have experience with automating SumoLogic configuration? At the moment I am using Ansible for general server and infra provisioning.
Thanks for all info!
Best Regards,
Rafal.
(The dashboards, alerts, etc. are referred to as Content in Sumo Logic parlance)
You can use the Content Management API, especially the content-import-job. I am not an expert in Ansible, but I am not aware of any way to plug that API into Ansible.
Also there's a community Terraform provider for Sumo Logic and it supports content:
resource "sumologic_content" "test" {
parent_id = "%s"
config =
{
"type": "SavedSearchWithScheduleSyncDefinition",
"name": "test-333",
"search": {
"queryText": "\"warn\"",
"defaultTimeRange": "-15m",
[...]
Disclaimer: I am currently employed by Sumo Logic
Below is a shell script to import the dashboards. This example uses the Sumo Logic AU instance, e.g. https://api.au.sumologic.com/api; the endpoint will change based on your region.
Note: you can export all of your dashboards as JSON files.
#!/usr/bin/env bash
set -e
# if you are using AWS parameter store
# accessKey=$(aws ssm get-parameter --name path_to_your_key --with-decryption --query 'Parameter.Value' --region=ap-southeast-2 | tr -d \")
# accessSecret=$(aws ssm get-parameter --name path_to_your_secret --with-decryption --query 'Parameter.Value' --region=ap-southeast-2 | tr -d \")
# yourDashboardFolderName="xxxxx" # this is the folder id in the sumologic where you want to create dashboards
# if you are using just the key and secret
accessKey="your_sumologic_key"
accessSecret="your_sumologic_secret"
yourDashboardFolderName="xxxxx" # this is the folder id in Sumo Logic
# place all the dashboard json files in the ./Sumologic/Dashboards folder
for f in $(find ./Sumologic/Dashboards -name '*.json'); do
  curl -X POST "https://api.au.sumologic.com/api/v2/content/folders/$yourDashboardFolderName/import" \
    -H "Content-Type: application/json" \
    -u "$accessKey:$accessSecret" \
    -d @"$f"
done
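Note that the import call is asynchronous: it returns a job id, and the content only shows up once that job succeeds. If you want the loop above to verify each import, the Content Management API exposes a job-status endpoint; a minimal sketch of the idea, replacing the plain curl call inside the loop (the exact paths and response fields are worth double-checking against the API docs for your deployment):
# sketch: capture the import job id and poll its status (assumes jq is installed)
jobId=$(curl -s -X POST "https://api.au.sumologic.com/api/v2/content/folders/$yourDashboardFolderName/import" \
  -H "Content-Type: application/json" -u "$accessKey:$accessSecret" -d @"$f" | jq -r '.id')
curl -s -u "$accessKey:$accessSecret" \
  "https://api.au.sumologic.com/api/v2/content/folders/$yourDashboardFolderName/import/$jobId/status"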

Transfer all GitHub repositories from one user to another user

Is there a way to transfer all GitHub repositories owned by one user to another user? Is this functionality accessible by an admin (eg. on Enterprise, if the user can no longer access GitHub)?
GitHub has a convenient command-line tool, hub, found at https://hub.github.com.
I've written an example to move all repos from all your organisations to my_new_organisation_name:
#!/usr/bin/env bash
orgs="$(hub api '/user/orgs' | jq -r '.[] | .login')";
repos="$(for org in $orgs; do hub api '/orgs/'"$org"'/repos' | jq -r '.[] | .name';
done)"
for org in $orgs; do
for repo in $repos; do
( hub api '/repos/'"$org"'/'"$repo"'/transfer' \
  -F 'new_owner'='my_new_organisation_name' | jq . ) &
done
done
For users rather than organisations, set my_new_organisation_name to the replacement username, remove the outer loop, and replace the repos= line with:
repos="$(hub api /users/SamuelMarks/repos | jq -r '.[] | .name')"
EDIT: Found a GUI if you prefer https://stackoverflow.com/a/54549899
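For completeness, the same transfer endpoint can be called with the official gh CLI instead of hub; a sketch, with my_new_organisation_name again standing in for the real target owner (gh api switches to a POST request once -f fields are supplied):
# hypothetical equivalent using gh instead of hub
gh api "repos/$org/$repo/transfer" -f new_owner='my_new_organisation_name'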

ECS Service - Automating deploy with new Docker image

I want to automate the deployment of my application by having my ECS service launch with the latest Docker image. From what I've read, the way to deploy a new image version is as follows:
1. Create a new task revision (after updating the image on your Docker repository).
2. Update the service and specify the new revision.
This seems to work, but I want to do this all through the CLI so I can script it. #2 seems easy enough to do through the AWS CLI with update-service, but I don't see a way to do #1 without specifying the entire task JSON all over again, as with register-task-definition (my JSON will include credentials in environment variables, so I want to have that in as few places as possible).
Is this how I should be automating deployment of my ECS Service updates? And if so, is there a "good" way to have the Task Definition launch a new revision (i.e. without duplicating everything)?
Yes, that is the correct approach.
And no, with the current API, you can't register a new revision of an existing task definition without duplicating it.
If you didn't use the CLI to generate the original task definition (or don't want to reuse the original commands that generated it), you could try something like the following through the CLI:
OLD_TASK_DEF=$(aws ecs describe-task-definition --task-definition <task_family_name>)
NEW_CONTAINER_DEFS=$(echo $OLD_TASK_DEF | jq '.taskDefinition.containerDefinitions' | jq '.[0].image="<new_image_name>"')
aws ecs register-task-definition --family <task_family_name> --container-definitions "'$(echo $NEW_CONTAINER_DEFS)'"
Not 100% secure, as the last command's --container-definitions argument (which includes "environment" entries) will still be visible through processes like ps. One of the AWS SDKs would give better peace of mind.
The answer provided by Matt Callanan did not work for me: I received an error on this part:
--container-definitions "'$(echo $NEW_CONTAINER_DEFS)'"
Resulted in: Error parsing parameter '--container-definitions': Expected: '=', received: ''' for input:
'{ environment: [ { etc etc....
What I did to resolve it was:
TASK_FAMILY=<task_family_name>
DOCKER_IMAGE=<new_image_name>
LATEST_TASK_DEFINITION=$(aws ecs describe-task-definition --task-definition ${TASK_FAMILY})
echo $LATEST_TASK_DEFINITION \
| jq '{containerDefinitions: .taskDefinition.containerDefinitions, volumes: .taskDefinition.volumes}' \
| jq '.containerDefinitions[0].image='\"${DOCKER_IMAGE}\" \
> /tmp/tmp.json
aws ecs register-task-definition --family ${TASK_FAMILY} --cli-input-json file:///tmp/tmp.json
I take both the containerDefinitions and volumes elements from the original JSON document because my containerDefinition uses these volumes (the volumes element isn't needed if you don't use volumes).
#!/bin/bash
SERVICE_NAME="your service name"
IMAGE_VERSION="v_"${BUILD_NUMBER}
TASK_FAMILY="your task defination name"
CLUSTER="your cluster name"
REGION="your region"
echo "=====================Create a new task definition for this build==========================="
sed -e "s;%BUILD_NUMBER%;${BUILD_NUMBER};g" taskdef.json > ${TASK_FAMILY}-${IMAGE_VERSION}.json
echo "=================Resgistring the task defination==========================================="
aws ecs register-task-definition --family ${TASK_FAMILY} --cli-input-json file://${TASK_FAMILY}-${IMAGE_VERSION}.json --region ${REGION}
echo "================Update the service with the new task definition and desired count================"
TASK_REVISION=`aws ecs describe-task-definition --task-definition ${TASK_FAMILY} --region ${REGION} | egrep "revision" | tr "/" " " | awk '{print $2}' | sed 's/"$//'`
DESIRED_COUNT=`aws ecs describe-services --cluster ${CLUSTER} --services ${SERVICE_NAME} --region ${REGION} | jq .services[].desiredCount`
if [ ${DESIRED_COUNT} = "0" ]; then
DESIRED_COUNT="1"
fi
echo "===============Updating the service=============================================================="
aws ecs update-service --cluster ${CLUSTER} --service ${SERVICE_NAME} --task-definition ${TASK_FAMILY}:${TASK_REVISION} --desired-count ${DESIRED_COUNT} --region ${REGION}
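If this script runs in CI and you want the job to fail when the rollout does, the AWS CLI also ships a waiter that blocks until the service reaches a steady state; this is an optional addition, not part of the original script:
# optional: wait until the new task definition has fully rolled out (exits non-zero on timeout)
aws ecs wait services-stable --cluster ${CLUSTER} --services ${SERVICE_NAME} --region ${REGION}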

List all GitHub repos for an organization - INCLUDING those in teams

Here's my query to the GitHub API
curl -i -u {user} https://api.github.com/orgs/{org}/repos?type=all
But this does not list all repos for this organization that I have access to. Specifically, it does not list repos in the organization that are part of a team that I am a member of.
If I were to query
curl -i -u {user} https://api.github.com/teams/{teamid}/repos
I would see the missing repos. If I were to navigate to github.com, I would see both private organization repos and my team repos next to each other on the same page. Is there a way to get all of these repos in the same API query?
You can use the command below:
gh repo list {organization-name}
Before this, log in with:
gh auth login
See github.com/cli/cli.
I apologize. It was listing all my repos...on subsequent pages. Learning about "page=" and "per_page=" was all I needed, and now I can see all of the repos I need.
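If you prefer to stay with plain curl, the pagination can also be scripted; a minimal sketch that keeps the {user} and {org} placeholders from the question, asks for 100 repos per page (the API maximum), and stops at the first empty page:
page=1
while : ; do
  repos=$(curl -s -u {user} "https://api.github.com/orgs/{org}/repos?type=all&per_page=100&page=$page" \
    | jq -r '.[].full_name')
  [ -z "$repos" ] && break
  echo "$repos"
  page=$((page+1))
done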
To add to the original answer, the command below can be used if there are many repositories and you want to fetch a specific number of them. Without the -L flag it returns 30 items by default.
gh repo list <org> -L <#>
If you have gh, the github cli (https://cli.github.com/) and jq (https://stedolan.github.io/jq/) you can do this:
gh repo list $ORG -L $COUNT --json name | jq '.[].name' | tr -d '"'
where $ORG is your organization name and $COUNT is the max number of repos returned
Download the official GitHub CLI, gh (https://cli.github.com/):
gh repo list $ORG -L $COUNT --json name --jq '.[].name'
Set $ORG to your organization name, and $COUNT to the number of repos you want to list. (Set $COUNT equal to the number of repos in the organization if you want to list them all.)
curl -i -u "username":"password" https://your_git_url.com/organization_name | grep "codeRepository" | awk -F '"' '{print $6}'
Have you tried setting the "per_page" attribute to "0"? I have seen some APIs use a default value of, for example, 20, but if you actively set it to 0, like ?per_page=0, you get all pages.