Commit 43e2805

Copybara import of the project:
--
c76a319 by Anthonios Partheniou <[email protected]>:

chore: regenerate code with gapic-generator-python 1.4.4

--
9c55be3 by Yu-Han Liu <[email protected]>:

chore: regenerate with gapic-generator-python 1.4.4
COPYBARA_INTEGRATE_REVIEW=#1779 from googleapis:regenerate-code-with-gapic-1-4-4 9c55be3
PiperOrigin-RevId: 488641639
parthea authored and copybara-github committed Nov 15, 2022
1 parent 2377606 commit 43e2805
Showing 48 changed files with 681 additions and 312 deletions.
325 changes: 177 additions & 148 deletions .kokoro/requirements.txt

Large diffs are not rendered by default.

@@ -1593,7 +1593,7 @@ async def sample_create_feature():
become the final component of the Feature's resource
name.
- This value may be up to 60 characters, and valid
+ This value may be up to 128 characters, and valid
characters are ``[a-z0-9_]``. The first character cannot
be a number.
@@ -1859,7 +1859,7 @@ def sample_create_feature():
become the final component of the Feature's resource
name.
- This value may be up to 60 characters, and valid
+ This value may be up to 128 characters, and valid
characters are ``[a-z0-9_]``. The first character cannot
be a number.
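
Both samples above reflect the relaxed feature ID limit (60 → 128 characters). A minimal sketch of exercising it, assuming an existing EntityType; the project, location, and resource IDs below are placeholders:

```python
# Sketch only: feature_id may now be up to 128 characters of [a-z0-9_] and must
# not start with a digit. All resource names here are placeholders.
from google.cloud import aiplatform_v1

client = aiplatform_v1.FeaturestoreServiceClient()

operation = client.create_feature(
    parent=(
        "projects/my-project/locations/us-central1/"
        "featurestores/my_featurestore/entityTypes/users"
    ),
    feature=aiplatform_v1.Feature(value_type=aiplatform_v1.Feature.ValueType.DOUBLE),
    feature_id="a_feature_id_that_is_longer_than_sixty_characters_but_still_under_one_hundred_twenty_eight",
)
print(operation.result())  # wait for the long-running create to finish
```
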
@@ -1280,10 +1280,9 @@ async def sample_list_tensorboard_experiments():
The request object. Request message for
[TensorboardService.ListTensorboardExperiments][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardExperiments].
parent (:class:`str`):
- Required. The resource name of the
- Tensorboard to list
+ Required. The resource name of the Tensorboard to list
TensorboardExperiments. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}'
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -2014,10 +2013,9 @@ async def sample_list_tensorboard_runs():
The request object. Request message for
[TensorboardService.ListTensorboardRuns][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardRuns].
parent (:class:`str`):
- Required. The resource name of the
- TensorboardExperiment to list
- TensorboardRuns. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}'
+ Required. The resource name of the TensorboardExperiment
+ to list TensorboardRuns. Format:
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -2745,10 +2743,9 @@ async def sample_list_tensorboard_time_series():
The request object. Request message for
[TensorboardService.ListTensorboardTimeSeries][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardTimeSeries].
parent (:class:`str`):
- Required. The resource name of the
- TensorboardRun to list
- TensorboardTimeSeries. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}'
+ Required. The resource name of the TensorboardRun to
+ list TensorboardTimeSeries. Format:
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -3222,7 +3219,7 @@ async def sample_read_tensorboard_blob_data():
time_series (:class:`str`):
Required. The resource name of the TensorboardTimeSeries
to list Blobs. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}/timeSeries/{time_series}'
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}/timeSeries/{time_series}``
This corresponds to the ``time_series`` field
on the ``request`` instance; if ``request`` is provided, this
21 changes: 9 additions & 12 deletions google/cloud/aiplatform_v1/services/tensorboard_service/client.py
@@ -1566,10 +1566,9 @@ def sample_list_tensorboard_experiments():
The request object. Request message for
[TensorboardService.ListTensorboardExperiments][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardExperiments].
parent (str):
- Required. The resource name of the
- Tensorboard to list
+ Required. The resource name of the Tensorboard to list
TensorboardExperiments. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}'
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -2312,10 +2311,9 @@ def sample_list_tensorboard_runs():
The request object. Request message for
[TensorboardService.ListTensorboardRuns][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardRuns].
parent (str):
- Required. The resource name of the
- TensorboardExperiment to list
- TensorboardRuns. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}'
+ Required. The resource name of the TensorboardExperiment
+ to list TensorboardRuns. Format:
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -3059,10 +3057,9 @@ def sample_list_tensorboard_time_series():
The request object. Request message for
[TensorboardService.ListTensorboardTimeSeries][google.cloud.aiplatform.v1.TensorboardService.ListTensorboardTimeSeries].
parent (str):
- Required. The resource name of the
- TensorboardRun to list
- TensorboardTimeSeries. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}'
+ Required. The resource name of the TensorboardRun to
+ list TensorboardTimeSeries. Format:
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}``
This corresponds to the ``parent`` field
on the ``request`` instance; if ``request`` is provided, this
@@ -3554,7 +3551,7 @@ def sample_read_tensorboard_blob_data():
time_series (str):
Required. The resource name of the TensorboardTimeSeries
to list Blobs. Format:
- 'projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}/timeSeries/{time_series}'
+ ``projects/{project}/locations/{location}/tensorboards/{tensorboard}/experiments/{experiment}/runs/{run}/timeSeries/{time_series}``
This corresponds to the ``time_series`` field
on the ``request`` instance; if ``request`` is provided, this
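
The hunks above mainly reflow these docstrings and switch the resource-name formats to literal (double-backtick) markup. For orientation, a short sketch of how those parent names are used, assuming an existing Tensorboard; the IDs are placeholders:

```python
# Sketch only: listing experiments and runs with the parent formats documented above.
from google.cloud import aiplatform_v1

client = aiplatform_v1.TensorboardServiceClient()

# projects/{project}/locations/{location}/tensorboards/{tensorboard}
tensorboard = "projects/my-project/locations/us-central1/tensorboards/1234567890"

for experiment in client.list_tensorboard_experiments(parent=tensorboard):
    # .../tensorboards/{tensorboard}/experiments/{experiment}
    for run in client.list_tensorboard_runs(parent=experiment.name):
        print(run.name)
```
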
4 changes: 3 additions & 1 deletion google/cloud/aiplatform_v1/types/artifact.py
@@ -85,7 +85,9 @@ class Artifact(proto.Message):
metadata store.
metadata (google.protobuf.struct_pb2.Struct):
Properties of the Artifact.
- The size of this field should not exceed 200KB.
+ Top level metadata keys' heading and trailing
+ spaces will be trimmed. The size of this field
+ should not exceed 200KB.
description (str):
Description of the Artifact
"""
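
The new wording documents that top-level metadata keys are trimmed of surrounding whitespace and that the field is capped at 200KB. A hedged sketch of populating ``Artifact.metadata`` (the proto-plus wrapper accepts a plain dict for the Struct field); resource names are placeholders:

```python
# Sketch only: Artifact.metadata is a Struct capped at 200KB; leading/trailing
# spaces in top-level keys are trimmed by the service. Names are placeholders.
from google.cloud import aiplatform_v1

client = aiplatform_v1.MetadataServiceClient()

artifact = client.create_artifact(
    parent="projects/my-project/locations/us-central1/metadataStores/default",
    artifact=aiplatform_v1.Artifact(
        display_name="training-dataset",
        metadata={"rows": 100000, "source": "gs://my-bucket/data.csv"},
    ),
    artifact_id="training_dataset_v1",
)
print(artifact.name)
```
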
14 changes: 14 additions & 0 deletions google/cloud/aiplatform_v1/types/batch_prediction_job.py
@@ -103,6 +103,16 @@ class BatchPredictionJob(proto.Message):
DEDICATED_RESOURCES this config may be provided (and the job
will use these resources), if the Model doesn't support
AUTOMATIC_RESOURCES, this config must be provided.
+ service_account (str):
+ The service account that the DeployedModel's container runs
+ as. If not specified, a system generated one will be used,
+ which has minimal permissions and the custom container, if
+ used, may not have enough permission to access other GCP
+ resources.
+ Users deploying the Model must have the
+ ``iam.serviceAccounts.actAs`` permission on this service
+ account.
manual_batch_tuning_parameters (google.cloud.aiplatform_v1.types.ManualBatchTuningParameters):
Immutable. Parameters configuring the batch behavior.
Currently only applicable when
@@ -437,6 +447,10 @@ class OutputInfo(proto.Message):
number=7,
message=machine_resources.BatchDedicatedResources,
)
+ service_account = proto.Field(
+ proto.STRING,
+ number=29,
+ )
manual_batch_tuning_parameters = proto.Field(
proto.MESSAGE,
number=8,
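
A hedged sketch of the new ``service_account`` field on a batch prediction job; the model, bucket, and account names are placeholders, and the caller must hold ``iam.serviceAccounts.actAs`` on the account:

```python
# Sketch only: run the DeployedModel container as a dedicated service account.
from google.cloud import aiplatform_v1

client = aiplatform_v1.JobServiceClient()

job = aiplatform_v1.BatchPredictionJob(
    display_name="nightly-batch-predict",
    model="projects/my-project/locations/us-central1/models/1234567890",
    input_config=aiplatform_v1.BatchPredictionJob.InputConfig(
        instances_format="jsonl",
        gcs_source=aiplatform_v1.GcsSource(uris=["gs://my-bucket/instances.jsonl"]),
    ),
    output_config=aiplatform_v1.BatchPredictionJob.OutputConfig(
        predictions_format="jsonl",
        gcs_destination=aiplatform_v1.GcsDestination(
            output_uri_prefix="gs://my-bucket/predictions/"
        ),
    ),
    # New field in this release: the account the container runs as (placeholder).
    service_account="[email protected]",
)

created = client.create_batch_prediction_job(
    parent="projects/my-project/locations/us-central1",
    batch_prediction_job=job,
)
print(created.name)
```
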
4 changes: 3 additions & 1 deletion google/cloud/aiplatform_v1/types/context.py
@@ -78,7 +78,9 @@ class Context(proto.Message):
metadata store.
metadata (google.protobuf.struct_pb2.Struct):
Properties of the Context.
- The size of this field should not exceed 200KB.
+ Top level metadata keys' heading and trailing
+ spaces will be trimmed. The size of this field
+ should not exceed 200KB.
description (str):
Description of the Context
"""
17 changes: 17 additions & 0 deletions google/cloud/aiplatform_v1/types/dataset.py
@@ -166,6 +166,18 @@ class ImportDataConfig(proto.Message):
labels specified inside index file referenced by
[import_schema_uri][google.cloud.aiplatform.v1.ImportDataConfig.import_schema_uri],
e.g. jsonl file.
+ annotation_labels (Mapping[str, str]):
+ Labels that will be applied to newly imported Annotations.
+ If two Annotations are identical, one of them will be
+ deduped. Two Annotations are considered identical if their
+ [payload][google.cloud.aiplatform.v1.Annotation.payload],
+ [payload_schema_uri][google.cloud.aiplatform.v1.Annotation.payload_schema_uri]
+ and all of their
+ [labels][google.cloud.aiplatform.v1.Annotation.labels] are
+ the same. These labels will be overridden by Annotation
+ labels specified inside index file referenced by
+ [import_schema_uri][google.cloud.aiplatform.v1.ImportDataConfig.import_schema_uri],
+ e.g. jsonl file.
import_schema_uri (str):
Required. Points to a YAML file stored on Google Cloud
Storage describing the import format. Validation will be
@@ -185,6 +197,11 @@
proto.STRING,
number=2,
)
+ annotation_labels = proto.MapField(
+ proto.STRING,
+ proto.STRING,
+ number=3,
+ )
import_schema_uri = proto.Field(
proto.STRING,
number=4,
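
A hedged sketch of the new ``annotation_labels`` map on an import request; the dataset name, bucket, and schema URI are placeholders:

```python
# Sketch only: labels applied to Annotations created by this import.
from google.cloud import aiplatform_v1

client = aiplatform_v1.DatasetServiceClient()

config = aiplatform_v1.ImportDataConfig(
    gcs_source=aiplatform_v1.GcsSource(uris=["gs://my-bucket/import_file.jsonl"]),
    import_schema_uri="gs://my-bucket/schemas/my_import_schema.yaml",  # placeholder
    annotation_labels={"my-key": "my-value"},
)

operation = client.import_data(
    name="projects/my-project/locations/us-central1/datasets/1234567890",
    import_configs=[config],
)
operation.result()  # block until the long-running import completes
```
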
4 changes: 3 additions & 1 deletion google/cloud/aiplatform_v1/types/execution.py
@@ -81,7 +81,9 @@ class Execution(proto.Message):
metadata store.
metadata (google.protobuf.struct_pb2.Struct):
Properties of the Execution.
- The size of this field should not exceed 200KB.
+ Top level metadata keys' heading and trailing
+ spaces will be trimmed. The size of this field
+ should not exceed 200KB.
description (str):
Description of the Execution
"""
10 changes: 6 additions & 4 deletions google/cloud/aiplatform_v1/types/featurestore.py
@@ -61,10 +61,12 @@ class Featurestore(proto.Message):
System reserved label keys are prefixed with
"aiplatform.googleapis.com/" and are immutable.
online_serving_config (google.cloud.aiplatform_v1.types.Featurestore.OnlineServingConfig):
- Optional. Config for online storage
- resources. If unset, the featurestore will not
- have an online store and cannot be used for
- online serving.
+ Optional. Config for online storage resources. The field
+ should not co-exist with the field of
+ ``OnlineStoreReplicationConfig``. If both of it and
+ OnlineStoreReplicationConfig are unset, the feature store
+ will not have an online store and cannot be used for online
+ serving.
state (google.cloud.aiplatform_v1.types.Featurestore.State):
Output only. State of the featurestore.
encryption_spec (google.cloud.aiplatform_v1.types.EncryptionSpec):
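
The reworded ``online_serving_config`` docs say the online store only exists when the config is set. A hedged sketch; project, location, and IDs are placeholders:

```python
# Sketch only: create a Featurestore with an online store; omit
# online_serving_config entirely for an offline-only featurestore.
from google.cloud import aiplatform_v1

client = aiplatform_v1.FeaturestoreServiceClient()

featurestore = aiplatform_v1.Featurestore(
    online_serving_config=aiplatform_v1.Featurestore.OnlineServingConfig(
        fixed_node_count=1
    ),
)

operation = client.create_featurestore(
    parent="projects/my-project/locations/us-central1",
    featurestore=featurestore,
    featurestore_id="my_featurestore",
)
print(operation.result().name)
```
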
13 changes: 12 additions & 1 deletion google/cloud/aiplatform_v1/types/featurestore_service.py
@@ -548,6 +548,12 @@ class BatchReadFeatureValuesRequest(proto.Message):
[BatchReadFeatureValuesRequest.entity_type_specs] must have
a column specifying entity IDs in the EntityType in
[BatchReadFeatureValuesRequest.request][] .
+ start_time (google.protobuf.timestamp_pb2.Timestamp):
+ Optional. Excludes Feature values with
+ feature generation timestamp before this
+ timestamp. If not set, retrieve oldest values
+ kept in Feature Store. Timestamp, if present,
+ must not have higher than millisecond precision.
"""

class PassThroughField(proto.Message):
@@ -629,6 +635,11 @@ class EntityTypeSpec(proto.Message):
number=7,
message=EntityTypeSpec,
)
+ start_time = proto.Field(
+ proto.MESSAGE,
+ number=11,
+ message=timestamp_pb2.Timestamp,
+ )


class ExportFeatureValuesRequest(proto.Message):
@@ -1109,7 +1120,7 @@ class CreateFeatureRequest(proto.Message):
Required. The ID to use for the Feature, which will become
the final component of the Feature's resource name.
- This value may be up to 60 characters, and valid characters
+ This value may be up to 128 characters, and valid characters
are ``[a-z0-9_]``. The first character cannot be a number.
The value must be unique within an EntityType.
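
A hedged sketch of the new ``start_time`` field on a batch read, which drops feature values generated before the given timestamp; all resource names, URIs, and IDs below are placeholders:

```python
# Sketch only: exclude feature values generated before start_time
# (at most millisecond precision).
from google.cloud import aiplatform_v1
from google.protobuf import timestamp_pb2

client = aiplatform_v1.FeaturestoreServiceClient()

request = aiplatform_v1.BatchReadFeatureValuesRequest(
    featurestore="projects/my-project/locations/us-central1/featurestores/my_featurestore",
    csv_read_instances=aiplatform_v1.CsvSource(
        gcs_source=aiplatform_v1.GcsSource(uris=["gs://my-bucket/read_instances.csv"])
    ),
    destination=aiplatform_v1.FeatureValueDestination(
        bigquery_destination=aiplatform_v1.BigQueryDestination(
            output_uri="bq://my-project.my_dataset.my_table"
        )
    ),
    entity_type_specs=[
        aiplatform_v1.BatchReadFeatureValuesRequest.EntityTypeSpec(
            entity_type_id="users",
            feature_selector=aiplatform_v1.FeatureSelector(
                id_matcher=aiplatform_v1.IdMatcher(ids=["age", "country"])
            ),
        )
    ],
    start_time=timestamp_pb2.Timestamp(seconds=1668470400),  # example cutoff
)

operation = client.batch_read_feature_values(request=request)
operation.result()  # wait for the export to the BigQuery destination
```
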
2 changes: 1 addition & 1 deletion google/cloud/aiplatform_v1/types/index_endpoint.py
@@ -87,7 +87,7 @@ class IndexEndpoint(proto.Message):
are mutually exclusive.
`Format <https://cloud.google.com/compute/docs/reference/rest/v1/networks/insert>`__:
- projects/{project}/global/networks/{network}. Where
+ ``projects/{project}/global/networks/{network}``. Where
{project} is a project number, as in '12345', and {network}
is network name.
enable_private_service_connect (bool):
14 changes: 10 additions & 4 deletions google/cloud/aiplatform_v1/types/metadata_service.py
@@ -359,7 +359,8 @@ class ListArtifactsRequest(proto.Message):
``in_context("projects/<project_number>/locations/<location>/metadataStores/<metadatastore_name>/contexts/<context-id>")``
Each of the above supported filter types can be combined
- together using logical operators (``AND`` & ``OR``).
+ together using logical operators (``AND`` & ``OR``). Maximum
+ nested expression depth allowed is 5.
For example:
``display_name = "test" AND metadata.field1.bool_value = true``.
@@ -667,7 +668,8 @@ class ListContextsRequest(proto.Message):
"projects/<project_number>/locations/<location>/metadataStores/<metadatastore_name>/contexts/<context_id>"
Each of the above supported filters can be combined together
- using logical operators (``AND`` & ``OR``).
+ using logical operators (``AND`` & ``OR``). Maximum nested
+ expression depth allowed is 5.
For example:
``display_name = "test" AND metadata.field1.bool_value = true``.
@@ -1103,7 +1105,10 @@ class ListExecutionsRequest(proto.Message):
``in_context("projects/<project_number>/locations/<location>/metadataStores/<metadatastore_name>/contexts/<context-id>")``
Each of the above supported filters can be combined together
- using logical operators (``AND`` & ``OR``). For example:
+ using logical operators (``AND`` & ``OR``). Maximum nested
+ expression depth allowed is 5.
+ For example:
``display_name = "test" AND metadata.field1.bool_value = true``.
order_by (str):
How the list of messages is ordered. Specify the values to
@@ -1523,7 +1528,8 @@ class QueryArtifactLineageSubgraphRequest(proto.Message):
``metadata.field_1.number_value = 10.0``
Each of the above supported filter types can be combined
- together using logical operators (``AND`` & ``OR``).
+ together using logical operators (``AND`` & ``OR``). Maximum
+ nested expression depth allowed is 5.
For example:
``display_name = "test" AND metadata.field1.bool_value = true``.
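
The filter docs above now state the maximum nested expression depth (5). A minimal sketch using the documented example filter when listing Artifacts; the metadata store name is a placeholder:

```python
# Sketch only: combine filter clauses with AND/OR (nesting at most 5 levels deep).
from google.cloud import aiplatform_v1

client = aiplatform_v1.MetadataServiceClient()

request = aiplatform_v1.ListArtifactsRequest(
    parent="projects/my-project/locations/us-central1/metadataStores/default",
    filter='display_name = "test" AND metadata.field1.bool_value = true',
)

for artifact in client.list_artifacts(request=request):
    print(artifact.name)
```
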
13 changes: 11 additions & 2 deletions google/cloud/aiplatform_v1/types/model.py
@@ -50,9 +50,9 @@ class Model(proto.Message):
version_aliases (Sequence[str]):
User provided version aliases so that a model version can be
referenced via alias (i.e.
- projects/{project}/locations/{location}/models/{model_id}@{version_alias}
+ ``projects/{project}/locations/{location}/models/{model_id}@{version_alias}``
instead of auto-generated version id (i.e.
- projects/{project}/locations/{location}/models/{model_id}@{version_id}).
+ ``projects/{project}/locations/{location}/models/{model_id}@{version_id})``.
The format is [a-z][a-zA-Z0-9-]{0,126}[a-z0-9] to
distinguish from version_id. A default version alias will be
created for the first version of the model, and there must
@@ -280,6 +280,11 @@ class Model(proto.Message):
be automl training pipeline, custom training
pipeline, BigQuery ML, or existing Vertex AI
Model.
+ metadata_artifact (str):
+ Output only. The resource name of the Artifact that was
+ created in MetadataStore when creating the Model. The
+ Artifact resource name pattern is
+ ``projects/{project}/locations/{location}/metadataStores/{metadata_store}/artifacts/{artifact}``.
"""

class DeploymentResourcesType(proto.Enum):
@@ -454,6 +459,10 @@ class ExportableContent(proto.Enum):
number=38,
message="ModelSourceInfo",
)
+ metadata_artifact = proto.Field(
+ proto.STRING,
+ number=44,
+ )


class PredictSchemata(proto.Message):
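
The version-alias format above (``{model_id}@{version_alias}``) and the new output-only ``metadata_artifact`` field can be seen together in a quick, hedged sketch; the project, model ID, and alias are placeholders:

```python
# Sketch only: resolve a model version by alias and read its metadata Artifact name.
from google.cloud import aiplatform_v1

client = aiplatform_v1.ModelServiceClient()

model = client.get_model(
    name="projects/my-project/locations/us-central1/models/1234567890@my-alias"
)
print(model.version_id)
print(model.metadata_artifact)  # .../metadataStores/{metadata_store}/artifacts/{artifact}
```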