57 commits
7dca4f2
docs(samples): add samples for IN, NOT_IN, and != operators. (#312)
jlara310 Jun 16, 2022
4165569
fix: require python 3.7+ (#332)
gcf-owl-bot[bot] Jul 10, 2022
b8fc4a7
chore(deps): update all dependencies (#340)
renovate-bot Aug 2, 2022
9ac64db
docs: Move the schedule_export samples from python-docs-samples (#344)
Mariatta Aug 9, 2022
4c1a86e
docs(samples): Add an example of using read_time in queries and get()…
jlara310 Aug 12, 2022
5ba9865
chore(deps): update dependency google-cloud-datastore to v2.8.1 (#348)
renovate-bot Aug 17, 2022
9bc603d
chore(deps): update dependency pytest to v7.1.3 (#359)
renovate-bot Sep 6, 2022
926a83e
chore: detect samples tests in nested directories (#360)
gcf-owl-bot[bot] Sep 13, 2022
8acb74c
samples: Update the read_time snippet. (#363)
jlara310 Oct 3, 2022
420626c
chore(deps): update dependency google-cloud-datastore to v2.8.2 (#369)
renovate-bot Oct 4, 2022
5d87c1d
chore(deps): update dependency backoff to v2.2.1 (#371)
renovate-bot Oct 6, 2022
db4e1e5
chore(deps): update dependency google-cloud-datastore to v2.8.3 (#375)
renovate-bot Oct 18, 2022
d2a500c
chore(deps): update dependency google-cloud-datastore to v2.9.0 (#376)
renovate-bot Oct 19, 2022
5b20685
chore(deps): update dependency pytest to v7.2.0 (#377)
renovate-bot Oct 26, 2022
d548451
chore(deps): update dependency google-cloud-datastore to v2.10.0 (#381)
renovate-bot Nov 9, 2022
882a641
chore(python): drop flake8-import-order in samples noxfile (#387)
gcf-owl-bot[bot] Nov 25, 2022
bbf0e63
chore(deps): update dependency google-cloud-datastore to v2.11.0 (#389)
renovate-bot Dec 1, 2022
9bd061d
samples: Add snippets and samples for Count query (#383)
Mariatta Dec 9, 2022
332b10c
chore(deps): update dependency google-cloud-datastore to v2.11.1 (#394)
renovate-bot Jan 4, 2023
3a596ff
chore(python): add support for python 3.11 (#395)
gcf-owl-bot[bot] Jan 6, 2023
49582bd
chore(deps): update dependency google-cloud-datastore to v2.12.0 (#399)
renovate-bot Jan 10, 2023
c0eded0
chore(deps): update dependency pytest to v7.2.1 (#403)
renovate-bot Jan 14, 2023
5b29ad6
chore(deps): update dependency google-cloud-datastore to v2.13.0 (#405)
renovate-bot Jan 18, 2023
35a74b6
chore(deps): update dependency google-cloud-datastore to v2.13.1 (#409)
renovate-bot Jan 23, 2023
f1af17a
chore(deps): update dependency google-cloud-datastore to v2.13.2 (#411)
renovate-bot Jan 24, 2023
063ae13
chore(deps): update dependency google-cloud-datastore to v2.14.0 (#423)
renovate-bot Mar 1, 2023
4b2c1cf
chore(deps): update dependency pytest to v7.2.2 (#424)
renovate-bot Mar 3, 2023
34854c2
chore(deps): update dependency google-cloud-datastore to v2.15.0 (#426)
renovate-bot Mar 17, 2023
40f91e9
chore(deps): update dependency google-cloud-datastore to v2.15.1 (#431)
renovate-bot Apr 6, 2023
7afd3f9
chore(deps): update dependency pytest to v7.3.1 (#433)
renovate-bot Apr 18, 2023
7e8f382
chore(deps): update dependency google-cloud-datastore to v2.15.2 (#438)
renovate-bot Jun 1, 2023
2d9a735
chore(deps): update dependency pytest to v7.3.2 (#445)
renovate-bot Jun 12, 2023
f8664ee
chore(deps): update all dependencies (#449)
renovate-bot Jul 5, 2023
c932e56
chore(deps): update dependency google-cloud-datastore to v2.16.1 (#454)
renovate-bot Jul 5, 2023
8164e8d
chore(deps): update dependency google-cloud-datastore to v2.17.0 (#469)
renovate-bot Aug 9, 2023
4141fee
chore(deps): update all dependencies (#473)
renovate-bot Sep 5, 2023
d30a1a8
chore(deps): update all dependencies (#475)
renovate-bot Sep 15, 2023
9277a67
chore(deps): update all dependencies (#483)
renovate-bot Sep 18, 2023
62b7aec
samples: Add snippets for sum and avg (#480)
jlara310 Oct 3, 2023
9d881ec
chore(deps): update all dependencies (#493)
renovate-bot Oct 30, 2023
60ebad6
feat: Add support for Python 3.12 (#498)
gcf-owl-bot[bot] Dec 1, 2023
324bb24
chore(deps): update dependency google-cloud-datastore to v2.19.0 (#508)
renovate-bot Dec 12, 2023
702775e
chore(deps): update dependency pytest to v7.4.4 (#511)
renovate-bot Jan 22, 2024
dc7a4ad
feat: implement query profiling (#542)
daniel-sanche Aug 7, 2024
b705ba1
chore(deps): update all dependencies (#519)
renovate-bot Aug 14, 2024
9840d5d
chore(deps): update all dependencies (#563)
renovate-bot Sep 19, 2024
5ee3fdb
chore(python): update dependencies in .kokoro/docker/docs (#574)
gcf-owl-bot[bot] Nov 15, 2024
1ad71d8
chore(python): Add support for Python 3.14 (#644)
chalmerlowe Nov 11, 2025
9a4720d
chore(deps): update all dependencies (#660)
renovate-bot Feb 13, 2026
a0ce1b4
chore(deps): update all dependencies (#670)
renovate-bot Feb 13, 2026
38c1c94
Merge remote-tracking branch 'migration/main' into python-datastore-m…
chalmerlowe Feb 18, 2026
1ba5e19
fix: can't claim all rights reserved and be Apache 2.
iennae Feb 19, 2026
17a1442
fix: Update requirements-test.txt
iennae Feb 19, 2026
ee7a265
Delete datastore/samples/snippets/schedule-export/noxfile.py
chalmerlowe Feb 20, 2026
b6bd254
Delete datastore/samples/snippets/schedule-export/noxfile_config.py
chalmerlowe Feb 20, 2026
73555cf
Delete datastore/samples/snippets/noxfile.py
chalmerlowe Feb 20, 2026
854904c
Delete datastore/samples/snippets/noxfile_config.py
chalmerlowe Feb 20, 2026
7 changes: 7 additions & 0 deletions datastore/samples/snippets/requirements-test.txt
@@ -0,0 +1,7 @@
backoff===1.11.1; python_version < "3.7"
backoff==2.2.1; python_version >= "3.7"
pytest===7.4.3; python_version == '3.7'
pytest===8.3.5; python_version == '3.8'
pytest===8.4.2; python_version == '3.9'
pytest==9.0.2; python_version >= '3.10'
flaky==3.8.1
1 change: 1 addition & 0 deletions datastore/samples/snippets/requirements.txt
@@ -0,0 +1 @@
google-cloud-datastore==2.23.0
5 changes: 5 additions & 0 deletions datastore/samples/snippets/schedule-export/README.md
@@ -0,0 +1,5 @@
# Scheduling Datastore exports with Cloud Functions and Cloud Scheduler

This sample application demonstrates how to schedule exports of your Datastore entities. To deploy this sample, see:

[Scheduling exports](https://cloud.google.com/datastore/docs/schedule-export)
57 changes: 57 additions & 0 deletions datastore/samples/snippets/schedule-export/main.py
@@ -0,0 +1,57 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import base64
import json
import os

from google.cloud import datastore_admin_v1

project_id = os.environ.get("GCP_PROJECT")
Contributor

high

Using os.environ.get("GCP_PROJECT") will result in project_id being None if the environment variable is not set, which will cause a failure later in the function. It's better to fail early with a clear error message if this required environment variable is missing. Accessing it directly with os.environ["GCP_PROJECT"] will raise a KeyError if it's not set, which is a more robust way to handle required environment variables.

Suggested change
project_id = os.environ.get("GCP_PROJECT")
project_id = os.environ["GCP_PROJECT"]
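One way to keep that fail-fast behavior while still surfacing a readable error is to wrap the lookup; the helper name and message below are illustrative, not part of the PR:

```python
import os


def get_project_id():
    # os.environ[...] raises KeyError when the variable is unset; re-raise
    # with an actionable message so the failure is obvious in function logs.
    try:
        return os.environ["GCP_PROJECT"]
    except KeyError:
        raise RuntimeError("GCP_PROJECT environment variable must be set") from None
```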

client = datastore_admin_v1.DatastoreAdminClient()


def datastore_export(event, context):
"""Triggers a Datastore export from a Cloud Scheduler job.

Args:
event (dict): event[data] must contain a json object encoded in
base-64. Cloud Scheduler encodes payloads in base-64 by default.
Object must include a 'bucket' value and can include 'kinds'
and 'namespaceIds' values.
context (google.cloud.functions.Context): The Cloud Functions event
metadata.
"""
if "data" in event:
# Triggered via Cloud Scheduler, decode the inner data field of the json payload.
json_data = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
else:
# Otherwise, for instance if triggered via the Cloud Console on a Cloud Function, the event is the data.
json_data = event

bucket = json_data["bucket"]
Contributor

high

Directly accessing json_data['bucket'] will raise a KeyError if 'bucket' is not in the payload, causing the function to fail with an unhandled exception. It's safer to use .get() and validate that the bucket is provided.
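A minimal sketch of the validation this comment suggests (the helper name and error message are illustrative):

```python
def get_bucket(json_data):
    # .get() avoids an opaque KeyError on a malformed payload; raise a
    # clear error instead when the required 'bucket' value is missing.
    bucket = json_data.get("bucket")
    if not bucket:
        raise ValueError("Event payload must include a 'bucket' value")
    return bucket
```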

entity_filter = datastore_admin_v1.EntityFilter()

if "kinds" in json_data:
entity_filter.kinds = json_data["kinds"]

if "namespaceIds" in json_data:
entity_filter.namespace_ids = json_data["namespaceIds"]

export_request = datastore_admin_v1.ExportEntitiesRequest(
project_id=project_id, output_url_prefix=bucket, entity_filter=entity_filter
)
operation = client.export_entities(request=export_request)
response = operation.result()
print(response)
Contributor

medium

For Cloud Functions, it is a best practice to use the standard Python logging module instead of print(). Logs from the logging module are automatically sent to Cloud Logging with structured data, which makes them easier to search and analyze. You will need to add import logging at the top of the file.

Suggested change
print(response)
logging.info("Export operation response: %s", response)
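For context, the decode path that `datastore_export` implements can be exercised locally with a Cloud Scheduler-style payload; the bucket and kind names below are illustrative:

```python
import base64
import json

# Cloud Scheduler delivers the job body base64-encoded under event["data"]
payload = {"bucket": "gs://my-bucket", "kinds": ["Users", "Tasks"]}
event = {"data": base64.b64encode(json.dumps(payload).encode("utf-8"))}

# Mirror the decode logic at the top of datastore_export
json_data = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
assert json_data == payload
```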

@@ -0,0 +1,2 @@
pytest===8.4.2; python_version == '3.9'
pytest==9.0.2; python_version >= '3.10'
@@ -0,0 +1 @@
google-cloud-datastore==2.23.0
@@ -0,0 +1,73 @@
# Copyright 2019 Google LLC All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import base64
from unittest.mock import Mock

import main

mock_context = Mock()
mock_context.event_id = "617187464135194"
mock_context.timestamp = "2020-04-15T22:09:03.761Z"


def test_datastore_export(capsys):
# Test an export without an entity filter
bucket = "gs://my-bucket"
json_string = '{{ "bucket": "{bucket}" }}'.format(bucket=bucket)
Contributor

medium

Constructing JSON strings with string formatting can be error-prone and hard to read. Using json.dumps() is a safer and more idiomatic way to create JSON strings in Python. You'll need to add import json at the top of the file.

Suggested change
json_string = '{{ "bucket": "{bucket}" }}'.format(bucket=bucket)
json_string = json.dumps({"bucket": bucket})
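Both forms produce equivalent JSON, which a quick sanity check (not part of the PR) confirms by parsing each string:

```python
import json

bucket = "gs://my-bucket"
formatted = '{{ "bucket": "{bucket}" }}'.format(bucket=bucket)
dumped = json.dumps({"bucket": bucket})

# The strings differ only in whitespace; both parse to the same object
assert json.loads(formatted) == json.loads(dumped) == {"bucket": bucket}
```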


# Encode data like Cloud Scheduler
data = bytes(json_string, "utf-8")
data_encoded = base64.b64encode(data)
event = {"data": data_encoded}

# Mock the Datastore service
mockDatastore = Mock()
main.client = mockDatastore

# Call tested function
main.datastore_export(event, mock_context)
out, err = capsys.readouterr()
Contributor

medium

The out and err variables are captured from capsys but are never used. If you don't intend to make assertions on the captured output, this line can be removed.

export_args = mockDatastore.export_entities.call_args[1]
# Assert request includes test values
assert export_args["request"].output_url_prefix == bucket


def test_datastore_export_entity_filter(capsys):
# Test an export with an entity filter
bucket = "gs://my-bucket"
kinds = "Users,Tasks"
namespaceIds = "Customer831,Customer157"
json_string = '{{ "bucket": "{bucket}", "kinds": "{kinds}", "namespaceIds": "{namespaceIds}" }}'.format(
bucket=bucket, kinds=kinds, namespaceIds=namespaceIds
)
Comment on lines +52 to +54
Contributor

medium

Constructing JSON strings with string formatting can be error-prone and hard to read. Using json.dumps() is a safer and more idiomatic way to create JSON strings in Python. You'll need to add import json at the top of the file.

Suggested change
json_string = '{{ "bucket": "{bucket}", "kinds": "{kinds}", "namespaceIds": "{namespaceIds}" }}'.format(
bucket=bucket, kinds=kinds, namespaceIds=namespaceIds
)
json_string = json.dumps(
{"bucket": bucket, "kinds": kinds, "namespaceIds": namespaceIds}
)


# Encode data like Cloud Scheduler
data = bytes(json_string, "utf-8")
data_encoded = base64.b64encode(data)
event = {"data": data_encoded}

# Mock the Datastore service
mockDatastore = Mock()
main.client = mockDatastore

# Call tested function
main.datastore_export(event, mock_context)
out, err = capsys.readouterr()
Contributor

medium

The out and err variables are captured from capsys but are never used. If you don't intend to make assertions on the captured output, this line can be removed.

export_args = mockDatastore.export_entities.call_args[1]
# Assert request includes test values

assert export_args["request"].output_url_prefix == bucket
assert export_args["request"].entity_filter.kinds == kinds
assert export_args["request"].entity_filter.namespace_ids == namespaceIds