
Commit cf80af5
fix: failing sphinx tests
Committed Jan 10, 2024
1 parent a25643c

3 files changed (+13, -16 lines)
‎doc/conf.py

Lines changed: 1 addition & 1 deletion

@@ -94,7 +94,7 @@
 }
 
 # Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {"http://docs.python.org/": None}
+intersphinx_mapping = {"python": ("http://docs.python.org/", None)}
 
 # -- Options for autodoc ----------------------------------------------------
 # https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#configuration
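For context on the change above: recent Sphinx releases require each `intersphinx_mapping` entry to be keyed by a project name and mapped to a `(target_uri, inventory)` tuple; the older URL-keyed form on the removed line is no longer accepted, which is what broke the tests. A minimal sketch of the corrected `conf.py` entry (the `None` inventory value means "fetch `objects.inv` from the target"):

```python
# conf.py (sketch) -- intersphinx configuration in the format
# required by newer Sphinx: {name: (target_uri, inventory)}.
intersphinx_mapping = {
    "python": ("http://docs.python.org/", None),  # None: use the default objects.inv
}
```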

‎src/sagemaker/feature_store/feature_processor/feature_processor.py

Lines changed: 2 additions & 3 deletions

@@ -45,8 +45,8 @@ def feature_processor(
 
     If the decorated function is executed without arguments then the decorated function's arguments
     are automatically loaded from the input data sources. Outputs are ingested to the output Feature
-    Group. If arguments are provided to this function, then arguments are not automatically loaded
-    (for testing).
+    Group. If arguments are provided to this function, then arguments are not automatically
+    loaded (for testing).
 
     Decorated functions must conform to the expected signature. Parameters: one parameter of type
     pyspark.sql.DataFrame for each DataSource in 'inputs'; followed by the optional parameters with
@@ -96,7 +96,6 @@ def transform(input_feature_group, input_csv):
            development phase to ensure that data is not used until the function is ready. It also
            useful for users that want to manage their own data ingestion. Defaults to True.
        spark_config (Dict[str, str]): A dict contains the key-value paris for Spark configurations.
-
    Raises:
        IngestionError: If any rows are not ingested successfully then a sample of the records,
            with failure reasons, is logged.
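The docstring being reworded above describes a dispatch behavior worth spelling out: calling the decorated function with no arguments auto-loads its inputs from the configured data sources, while passing arguments explicitly (e.g. in unit tests) skips auto-loading. A toy decorator illustrating just that dispatch logic — this is not the SageMaker implementation, and the `s3://` path is a made-up placeholder:

```python
# Hypothetical sketch of the "auto-load unless arguments are given"
# behavior described in the feature_processor docstring.
import functools


def feature_processor_sketch(inputs):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            if args:
                # Caller supplied its own data (the testing path):
                # skip auto-loading entirely.
                return func(*args)
            # No arguments: load one value per configured input source.
            # (Stand-in for reading real DataFrames from data sources.)
            loaded = [f"loaded:{src}" for src in inputs]
            return func(*loaded)
        return wrapper
    return decorator


@feature_processor_sketch(inputs=["s3://bucket/data.csv"])
def transform(df):
    return df
```

Called as `transform()` the wrapper loads from the configured source; called as `transform(my_test_df)` it uses the supplied value unchanged.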

‎src/sagemaker/session.py

Lines changed: 10 additions & 12 deletions

@@ -4565,20 +4565,18 @@ def update_inference_component(
         Args:
             inference_component_name (str): Name of the Amazon SageMaker ``InferenceComponent``.
             specification ([dict[str,int]]): Resource configuration. Optional.
-                Example: {
-                        "MinMemoryRequiredInMb": 1024,
-                        "NumberOfCpuCoresRequired": 1,
-                        "NumberOfAcceleratorDevicesRequired": 1,
-                        "MaxMemoryRequiredInMb": 4096,
-                },
-
+                Example: {
+                    "MinMemoryRequiredInMb": 1024,
+                    "NumberOfCpuCoresRequired": 1,
+                    "NumberOfAcceleratorDevicesRequired": 1,
+                    "MaxMemoryRequiredInMb": 4096,
+                },
             runtime_config ([dict[str,int]]): Number of copies. Optional.
-                Default: {
-                        "copyCount": 1
-                }
-
+                Default: {
+                    "copyCount": 1
+                }
             wait: Wait for inference component to be created before return. Optional. Default is
-                    True.
+                True.
 
         Return:
             str: inference component name
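The Example and Default blocks reindented in the docstring above can be written out as plain Python dicts. A sketch of the two arguments as they would be passed to `update_inference_component` — the numeric values are the illustrative ones from the docstring, not recommendations:

```python
# Resource configuration for the inference component, mirroring the
# "Example" block in the update_inference_component docstring.
specification = {
    "MinMemoryRequiredInMb": 1024,
    "NumberOfCpuCoresRequired": 1,
    "NumberOfAcceleratorDevicesRequired": 1,
    "MaxMemoryRequiredInMb": 4096,
}

# Copy count, mirroring the "Default" block in the docstring.
runtime_config = {"copyCount": 1}
```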

0 commit comments
