I am running a Cloud Function across two different projects: the Cloud Function lives in project A, and the MySQL instance lives in project B.
To let the Cloud Function in project A access the Cloud SQL instance and run import jobs in project B, I added the Cloud Function's service account as a principal in project B and granted it the roles "Cloud SQL Client" and "Cloud SQL Editor" on project B. I followed this guide: https://www.cloudquery.io/blog/creating-cross-project-service-accounts-in-gcp
However, when I run the function in project A, I still get this error:
googleapiclient.errors.HttpError:
https://sqladmin.googleapis.com/sql/v1beta4/projects/astute-coda-410816/instances/test_database/import?alt=json returned "The client is not authorized to make this request.". Details: "[{'message': 'The client is not authorized to make this request.', 'domain': 'global', 'reason': 'notAuthorized'}]">
Here is the code I am using:
import os
import sqlalchemy
from googleapiclient import discovery
import google.auth
import functions_framework
from google.cloud import storage
from google.cloud.sql.connector import Connector

storage_client = storage.Client()
credentials, project = google.auth.default()
service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)

# MySQL connection
def getconn():
    connector = Connector()
    conn = connector.connect("astute-coda-410816:europe-west2:test_database",
                             "pymysql",
                             user=os.environ["DB_USER"],
                             password=os.environ["DB_PASS"],
                             db="product")
    return conn

pool = sqlalchemy.create_engine("mysql+pymysql://", creator=getconn)
import time

def move_data(table: str) -> None:
    # list the files that need to be imported from GCS
    bucket = storage_client.get_bucket('bucket_data')
    blobs = bucket.list_blobs(prefix=table)
    files = []
    for blob in blobs:
        files.append(blob.name)
    # import each file into MySQL
    for i in files:
        instances_import_request_body = {
            "importContext": {
                "uri": f"gs://bucket_data/{i}",
                "kind": "sql#importContext",
                "database": "product",
                "fileType": "CSV",
                "csvImportOptions": {"table": table}
            }
        }
        request = service.instances().import_(project="astute-coda-410816",
                                              instance="test_database",
                                              body=instances_import_request_body)
        response = request.execute()
        # poll the operation until the import finishes
        process_status = False
        while not process_status:
            resp = service.operations().get(project="astute-coda-410816",
                                            operation=response['name']).execute()
            if resp['status'] == "DONE":
                print("finish")
                process_status = True
            else:
                time.sleep(1)  # avoid hammering the Admin API while polling
@functions_framework.cloud_event
def transfer(cloudevent):
    # get the status of the google.cloud.bigquery.v2.JobService.InsertJob event
    # from the Cloud Audit Log
    payload = cloudevent.data.get("protoPayload")
    status = payload.get("status")
    if not status:  # if status is empty, the job succeeded and files can be copied over to MySQL
        move_data("product_data")
Does anyone know why this is happening?
API activation was my first guess, but that's not the issue here.
Reading the error more carefully: you are trying to perform an import. This operation requires the
cloudsql.instances.import
permission, which is only included in the Cloud SQL Admin role (or the project Editor role -> bad practice, don't use it).
There are more details in the documentation: https://cloud.google.com/sql/docs/mysql/iam-roles
(read the section on the Cloud SQL Editor role; it explicitly mentions that import is not covered by it)
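As a minimal sketch of the fix: grant the Cloud SQL Admin role to the function's service account on project B. The project ID and service account email below are placeholders — substitute your own values.

```shell
# Grant the Cloud SQL Admin role (which carries cloudsql.instances.import)
# to the Cloud Function's service account, on the project that hosts the instance.
# "PROJECT_B" and the service account email are placeholders.
gcloud projects add-iam-policy-binding PROJECT_B \
  --member="serviceAccount:my-function-sa@project-a.iam.gserviceaccount.com" \
  --role="roles/cloudsql.admin"
```

Note that the binding goes on project B (where the instance lives), not on project A where the function runs.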