Google user accounts impersonating a Service Account vs. using Google Groups
I have seen IAM configurations on bastion hosts where the local root user was set up with gcloud auth activate-service-account by passing --key-file (a sketch of this setup is shown after the list below). The project owner then deleted the key file on the host for security reasons; Google user accounts with the Service Account User role were instructed to log into the bastion hosts and execute sudo su - to run gcloud commands as the service account. There are several downsides to this approach that make it unsuitable for production:
- There is no Cloud Audit Logging, even when audit logs are enabled for the Identity and Access Management (IAM) API.
- The generated service account key is used only once and, after deletion, is either lost or may already have leaked.
- Users have to run as root on the bastion host, defeating the purpose of having multiple individual users on the host. Each user has to persist their working directory under some arbitrary name, e.g.
/root/john-smith/gke-deployment.yaml
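For reference, the setup described above boils down to activating a downloaded key as root, roughly like this (the key file path is a placeholder):
gcloud auth activate-service-account --key-file=/root/sa-key.json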
What’s the alternative solution to this problem? I’d say Google Groups. However, in an enterprise, asking the security team to create a Google Group and maintain its members may take weeks and may not even happen. The Google Groups approach means adding Google user accounts to a group such as devops-aiml@company.com and granting the group the IAM roles needed to perform the allowed tasks. Users would inherit the IAM roles to execute the gcloud commands, and the enabled audit logs would generate proper logging.
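As a rough sketch of the group-level binding (the project ID is a placeholder and roles/compute.viewer is just an example role; grant whatever roles your team actually needs):
gcloud projects add-iam-policy-binding [project ID] --member="group:devops-aiml@company.com" --role="roles/compute.viewer"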
What if the Google Groups approach isn’t feasible? The solution is to use service account impersonation, which generates an access token for the impersonated service account. The required IAM roles are Service Account Token Creator and Service Usage Consumer. Try the following commands to run gcloud commands as the Compute Engine default service account:
gcloud compute zones list --impersonate-service-account=[project number]-compute@developer.gserviceaccount.com --project=[project ID]
# The newer method
gcloud config set auth/impersonate_service_account [project number]-compute@developer.gserviceaccount.com
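If the Token Creator role has not been granted yet, a sketch of the binding on the service account itself would look like this (the user email is a placeholder):
gcloud iam service-accounts add-iam-policy-binding [project number]-compute@developer.gserviceaccount.com --member="user:john.smith@company.com" --role="roles/iam.serviceAccountTokenCreator"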
My recent testing shows that the Service Account User role is not required; that role is intended for users who need to gcloud compute ssh into an instance (Compute Engine or AI Platform Notebooks) or to attach service accounts to resources.
If the command fails with the error below, verify the following for your Google account:
- Service Account Token Creator role in the service account’s project or on the service account. You can verify with gcloud iam service-accounts get-iam-policy $GSA
- Service Usage Consumer role in the resource project specified by --project=[project ID]
- The project specified by --project=[project ID], for which the service account is listing zones, has the IAM Service Account Credentials API enabled (a quick check is sketched after this list)
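To quickly check whether the API is enabled, and to enable it if not (the project ID is a placeholder):
gcloud services list --enabled --project=[project ID] | grep iamcredentials
gcloud services enable iamcredentials.googleapis.com --project=[project ID]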
ERROR: (gcloud.compute.zones.list) Error 403 (Forbidden) - failed to impersonate [project number]-compute@developer.gserviceaccount.com. Make sure the account that's trying to impersonate it has access to the service account itself and the "roles/iam.serviceAccountTokenCreator" role
If you still encounter an error, append --log-http to the gcloud command and inspect the output. If the HTTP response is redacted, you may need to execute gcloud config set log_http_redact_token false first.
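For example, re-running the earlier command with the flag:
gcloud compute zones list --impersonate-service-account=[project number]-compute@developer.gserviceaccount.com --project=[project ID] --log-http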
"error": {
"code": 403,
"message": "Caller does not have required permission to use project [project ID]. Grant the caller the Owner or Editor role, or a custom role with the serviceusage.services.use permission, by visiting https://console.developers.google.com/iam-admin/iam/project?project=[project ID] and then retry (propagation of new permission may take a few minutes).",
"status": "PERMISSION_DENIED",
The next step would be to enable audit logs for the Identity and Access Management (IAM) API: Admin Read, Data Read, and Data Write.
In Stackdriver Logging, select the Service Account resource or use the filter protoPayload.serviceName="iamcredentials.googleapis.com" protoPayload.methodName="GenerateAccessToken" to inspect the logs:
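The same filter also works from the command line with gcloud logging read (a sketch; the project ID is a placeholder):
gcloud logging read 'protoPayload.serviceName="iamcredentials.googleapis.com" AND protoPayload.methodName="GenerateAccessToken"' --project=[project ID] --limit=10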
The downside is that you can’t use the environment variable GOOGLE_APPLICATION_CREDENTIALS to run Python or Java client applications that require the service account key JSON file. However, you can use the gsutil command with -i [service_account] to access Cloud Storage: gsutil -i storagereader@[project].iam.gserviceaccount.com ls gs://path/. Thanks to Google Cloud free tier technical support for the --log-http hint, and many thanks to Brent Elphick in https://googlecloud-community.slack.com/ for providing the logging filter.
If the Python code runs in a Cloud Function or in Cloud Shell as a user, granting the Service Account Token Creator IAM role on the impersonated service account to the user or to the Cloud Function’s service account is enough. Here’s sample code for a Cloud Function, where $impersonated_PROJECT_ID is the project that hosts the impersonated service account. The code also works in Cloud Shell for a user impersonating a service account. A sketch of the corresponding role binding follows the code.
import google.auth
import google.auth.impersonated_credentials
from google.cloud import storage

def hello_world(request):
    target_scopes = ["https://www.googleapis.com/auth/cloud-platform"]
    # Default credentials: the Cloud Function's service account, or the user in Cloud Shell
    creds, pid = google.auth.default()
    print(f"Obtained default credentials for the project {pid}")
    # Exchange the default credentials for short-lived credentials of the impersonated service account
    tcreds = google.auth.impersonated_credentials.Credentials(
        source_credentials=creds,
        target_principal="hil-tmp-storage-admin@$impersonated_PROJECT_ID.iam.gserviceaccount.com",
        target_scopes=target_scopes)
    # The storage client now acts as the impersonated service account
    client = storage.Client(credentials=tcreds)
    buckets = client.list_buckets(project="$impersonated_PROJECT_ID")
    bucket_names = [bucket.name for bucket in buckets]
    # print(bucket_names)
    return str(bucket_names)
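As a rough sketch of the role binding described above (the Cloud Function’s runtime service account email is a placeholder):
gcloud iam service-accounts add-iam-policy-binding hil-tmp-storage-admin@$impersonated_PROJECT_ID.iam.gserviceaccount.com --member="serviceAccount:[cloud function runtime service account]" --role="roles/iam.serviceAccountTokenCreator"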