Migrate base inference toolkit scripts and unit tests #157


Merged
sachanub merged 3 commits into aws:master from merge_base_toolkit on Oct 19, 2023

Conversation

@sachanub (Contributor) commented on Oct 16, 2023

Issue #, if available:

Description of changes:

The objective of this PR is to migrate the base inference toolkit scripts and unit tests to the PyTorch inference toolkit to remove the dependency on the base inference toolkit repository.

  • Base inference toolkit package src/sagemaker_inference:
    • __init__.py (original script): No change.

    • utils.py (original script): No change.

    • logging.py (original script): No change.

    • errors.py (original script): No change.

    • content_types.py (original script): No change.

    • encoder.py (original script): No change.

    • decoder.py (original script): No change.

    • parameters.py (original script):

    • environment.py (original script): Any methods/attributes corresponding to parameters retained in parameters.py will be retained. The rest will be removed. Specifically, we remove the following:

      • DEFAULT_STARTUP_TIMEOUT, self._startup_timeout, self.startup_timeout()
      • self._model_server_timeout_seconds, self.model_server_timeout_seconds()
      • DEFAULT_VMARGS, self._vmargs, self.vmargs()
      • DEFAULT_MAX_REQUEST_SIZE, self._max_request_size_in_mb, self.max_request_size()
    • model_server.py (original script):

      • Attributes:
        • Retained: We retained the attributes which are used to install requirements from requirements.txt in torchserve.py.
          • REQUIREMENTS_PATH
        • Removed: The attributes mentioned below were removed because either their equivalents are present in torchserve.py or they are used by legacy methods:
          • MMS_CONFIG_FILE
          • DEFAULT_HANDLER_SERVICE
          • DEFAULT_MMS_CONFIG_FILE
          • MME_MMS_CONFIG_FILE
          • DEFAULT_MMS_LOG_FILE
          • DEFAULT_MMS_MODEL_EXPORT_DIRECTORY
          • DEFAULT_MMS_MODEL_NAME
          • ENABLE_MULTI_MODEL, MODEL_STORE
          • PYTHON_PATH_ENV
          • MMS_NAMESPACE
      • Methods:
        • Retained: We retained the methods which are used to install requirements from requirements.txt in torchserve.py.
          • _install_requirements()
          • _get_codeartifact_index()
        • Removed: The methods mentioned below have been removed because either their equivalents are present in torchserve.py or they are legacy methods.
          • _start_model_server()
          • _adapt_to_mms_format()
          • _set_python_path()
          • _create_model_server_config_file()
          • _generate_mms_config_properties()
          • _add_sigterm_handler()
          • _retry_retrieve_mms_server_process()
          • _retrieve_mms_server_process()
          • _reap_children()
          • _add_sigchild_handler()
    • default_handler_service.py (original script): No change.

    • default_inference_handler.py (original script): No change.

    • transformer.py (original script): No change.

All the unit tests from the base inference toolkit have been moved to the PyTorch inference toolkit. Unit tests covering functions and attributes that were removed from the base inference toolkit scripts have also been removed. Specifically, we will have the following:

  • Unit tests test/unit:
    • Renamed test_default_inference_handler.py to test_default_pytorch_inference_handler.py.
    • Renamed test_model_server.py to test_torchserve.py.
    • test_decoder.py (original script): No change.
    • test_default_handler_service.py (original script): No change.
    • test_default_inference_handler.py (original script): No change.
    • test_encoder.py (original script): No change.
    • test_environment.py (original script): In the test_env(), removed the assertions for the following:
      • env.max_request_size
      • env.vmargs
      • env.startup_timeout
      • env.model_server_timeout_seconds
    • test_model_server.py (original script):
      • Retained:
        • test_install_requirements()
        • test_install_requirements_installation_failed()
        • test_install_requirements_codeartifact_invalid_arn_installation_failed()
        • test_install_requirements_codeartifact()
      • Removed:
        • test_start_model_server_default_service_handler()
        • test_start_model_server_custom_handler_service()
        • test_adapt_to_mms_format()
        • test_adapt_to_mms_format_existing_path()
        • test_set_existing_python_path()
        • test_new_python_path()
        • test_create_model_server_config_file()
        • test_generate_mms_config_properties()
        • test_generate_mms_config_properties_default_workers()
        • test_add_sigterm_handler()
        • test_retrieve_mms_server_process()
        • test_retrieve_mms_server_process_no_server()
        • test_retrieve_mms_server_process_too_many_servers()
        • test_retry_retrieve_mms_server_process()
    • test_transformer.py (original script): No change.
    • test_utils.py (original script): No change.
  • tox.ini: Modify the coverage run command to include sagemaker_inference in the source.
  • setup.py: Remove sagemaker_inference as a dependency and include the base toolkit’s dependencies.
  • test/container: Uninstall sagemaker_inference in the following Dockerfiles.
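For context, the retained requirements-installation path can be sketched roughly as follows; the path constant, function shape, and error handling here are illustrative assumptions, not the toolkit's actual torchserve.py code:

```python
import os
import subprocess
import sys

# Assumed location of user-supplied requirements; the real toolkit defines
# its own REQUIREMENTS_PATH constant.
REQUIREMENTS_PATH = "/opt/ml/model/code/requirements.txt"

def install_requirements(requirements_path=REQUIREMENTS_PATH):
    """Install model dependencies with pip if a requirements file is present."""
    if not os.path.exists(requirements_path):
        return False  # nothing to install
    try:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "-r", requirements_path]
        )
    except subprocess.CalledProcessError as e:
        # The real toolkit raises a toolkit-specific error type here.
        raise RuntimeError("failed to install required packages") from e
    return True
```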

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@sachanub sachanub changed the title Merge base inference toolkit scripts and unit tests Migrate base inference toolkit scripts and unit tests Oct 16, 2023
@namannandan (Contributor) left a comment:

Overall LGTM, apart from a few minor comments/questions.

# distributed on an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
from __future__ import absolute_import
Contributor:

Why do we need this package? Is it for backwards compatibility with older python versions?
Same applies to following files in which this line is added.

Contributor (Author):

This link explains the relevance of from __future__ import absolute_import:
https://portingguide.readthedocs.io/en/latest/imports.html

Also, this line is required for the flake8 checks.
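For readers unfamiliar with the import: on Python 2 it opts in to Python 3's absolute-import semantics, which matters here because the migrated package itself ships a logging.py that could otherwise shadow the standard library module. A minimal illustration:

```python
# Without this import, Python 2 code inside a package containing a local
# logging.py would resolve "import logging" to that local file (implicit
# relative import). With it, imports are absolute unless written with an
# explicit leading dot, matching Python 3 behavior.
from __future__ import absolute_import

import logging  # guaranteed to be the standard library module

logger = logging.getLogger("sagemaker_inference")
```

On Python 3 the line is a harmless no-op, so keeping it only affects older interpreters (and, as noted above, the flake8 configuration expects it).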

@@ -3,6 +3,6 @@ FROM 763104351884.dkr.ecr.$region.amazonaws.com/pytorch-inference:2.0.0-cpu-py31

COPY dist/sagemaker_pytorch_inference-*.tar.gz /sagemaker_pytorch_inference.tar.gz

-RUN pip uninstall -y sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
Contributor:

Nit: package name

Suggested change
-RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker-inference sagemaker_pytorch_inference && \

Same applies to the remaining references to sagemaker-inference where we uninstall it.

Although the package name spelling does not seem to affect the container build, the command does successfully uninstall sagemaker-inference from the base image.

Contributor (Author):

Maybe I am missing something here, but I think we can specify either sagemaker-inference or sagemaker_inference when attempting pip uninstall. I tried uninstalling the base toolkit with both names and it works successfully with both.

ubuntu@ip-172-31-31-136:~/code$ pip uninstall -y sagemaker-inference 
Found existing installation: sagemaker-inference 1.10.0
Uninstalling sagemaker-inference-1.10.0:
  Successfully uninstalled sagemaker-inference-1.10.0

ubuntu@ip-172-31-31-136:~/code$ pip uninstall -y sagemaker_inference 
Found existing installation: sagemaker-inference 1.10.0
Uninstalling sagemaker-inference-1.10.0:
  Successfully uninstalled sagemaker-inference-1.10.0

Also, according to the logs while building the container, the base toolkit is successfully uninstalled:

#8 [3/3] RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference &&     pip install --upgrade --no-cache-dir /sagemaker_pytorch_inference.tar.gz &&     rm /sagemaker_pytorch_inference.tar.gz
--
945 | #8 8.150 Found existing installation: sagemaker-inference 1.9.2
946 | #8 8.152 Uninstalling sagemaker-inference-1.9.2:
947 | #8 8.153   Successfully uninstalled sagemaker-inference-1.9.2
948 | #8 8.155 Found existing installation: sagemaker-pytorch-inference 2.0.14
949 | #8 8.157 Uninstalling sagemaker-pytorch-inference-2.0.14:
950 | #8 8.159   Successfully uninstalled sagemaker-pytorch-inference-2.0.14
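Both spellings resolve to the same distribution because pip canonicalizes package names per PEP 503, collapsing runs of hyphens, underscores, and dots into a single hyphen. A sketch of that normalization:

```python
import re

def canonicalize_name(name):
    # PEP 503 name normalization: case-insensitive, with runs of
    # hyphens, underscores, and dots collapsed to a single hyphen.
    return re.sub(r"[-_.]+", "-", name).lower()

print(canonicalize_name("sagemaker_inference"))  # sagemaker-inference
print(canonicalize_name("sagemaker-inference"))  # sagemaker-inference
```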

@@ -3,6 +3,6 @@ FROM 763104351884.dkr.ecr.$region.amazonaws.com/pytorch-inference:2.0.0-gpu-py31

COPY dist/sagemaker_pytorch_inference-*.tar.gz /sagemaker_pytorch_inference.tar.gz

-RUN pip uninstall -y sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
Contributor:

Suggested change
-RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker-inference sagemaker_pytorch_inference && \

@@ -3,6 +3,6 @@ FROM 763104351884.dkr.ecr.$region.amazonaws.com/pytorch-inference:2.0.1-cpu-py31

COPY dist/sagemaker_pytorch_inference-*.tar.gz /sagemaker_pytorch_inference.tar.gz

-RUN pip uninstall -y sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
Contributor:

Suggested change
-RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker-inference sagemaker_pytorch_inference && \

@@ -3,6 +3,6 @@ FROM 763104351884.dkr.ecr.$region.amazonaws.com/pytorch-inference:2.0.1-gpu-py31

COPY dist/sagemaker_pytorch_inference-*.tar.gz /sagemaker_pytorch_inference.tar.gz

-RUN pip uninstall -y sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
Contributor:

Suggested change
-RUN pip uninstall -y sagemaker_inference sagemaker_pytorch_inference && \
+RUN pip uninstall -y sagemaker-inference sagemaker_pytorch_inference && \

@sachanub sachanub merged commit 1e8e043 into aws:master Oct 19, 2023
@sachanub sachanub deleted the merge_base_toolkit branch October 20, 2023 18:07