Add wrapper for LDA. #56
Merged
Changes from all commits
Commits (11)
e2b8f0d  Add wrapper for LDA. (lukmis)
33fc211  Fix values used in unit tests to be matching the type of the parameter. (lukmis)
30c65e0  Revert previous values changes in tests and fix the validation order … (lukmis)
e32aac7  Fix docstring. (lukmis)
b8658b3  Fix file reading mode. (lukmis)
4e1c134  Merge branch 'master' into add_lda (lukmis)
324655d  Update after merge from master. (lukmis)
5094e97  Comments. (lukmis)
238db37  Fix flake. (lukmis)
7cc945a  Comments. (lukmis)
1dfc4c3  Move construction of the RecordSet to test part until a more complete… (lukmis)
@@ -0,0 +1,127 @@
# Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
from sagemaker.amazon.amazon_estimator import AmazonAlgorithmEstimatorBase, registry
from sagemaker.amazon.common import numpy_to_record_serializer, record_deserializer
from sagemaker.amazon.hyperparameter import Hyperparameter as hp  # noqa
from sagemaker.amazon.validation import gt
from sagemaker.predictor import RealTimePredictor
from sagemaker.model import Model
from sagemaker.session import Session


class LDA(AmazonAlgorithmEstimatorBase):

    repo_name = 'lda'
    repo_version = 1

    num_topics = hp('num_topics', gt(0), 'An integer greater than zero', int)
    alpha0 = hp('alpha0', gt(0), 'A positive float', float)
    max_restarts = hp('max_restarts', gt(0), 'An integer greater than zero', int)
    max_iterations = hp('max_iterations', gt(0), 'An integer greater than zero', int)
    tol = hp('tol', gt(0), 'A positive float', float)

    def __init__(self, role, train_instance_type, num_topics,
                 alpha0=None, max_restarts=None, max_iterations=None, tol=None, **kwargs):
        """Latent Dirichlet Allocation (LDA) is an :class:`Estimator` used for unsupervised learning.

        Amazon SageMaker Latent Dirichlet Allocation is an unsupervised learning algorithm that attempts to describe
        a set of observations as a mixture of distinct categories. LDA is most commonly used to discover
        a user-specified number of topics shared by documents within a text corpus.
        Here each observation is a document, the features are the presence (or occurrence count) of each word, and
        the categories are the topics.

        This Estimator may be fit via calls to
        :meth:`~sagemaker.amazon.amazon_estimator.AmazonAlgorithmEstimatorBase.fit`. It requires Amazon
        :class:`~sagemaker.amazon.record_pb2.Record` protobuf serialized data to be stored in S3.
        There is a utility :meth:`~sagemaker.amazon.amazon_estimator.AmazonAlgorithmEstimatorBase.record_set` that
        can be used to upload data to S3 and create a :class:`~sagemaker.amazon.amazon_estimator.RecordSet` to be
        passed to the `fit` call.

        To learn more about the Amazon protobuf Record class and how to prepare bulk data in this format, please
        consult the AWS technical documentation: https://docs.aws.amazon.com/sagemaker/latest/dg/cdf-training.html

        After this Estimator is fit, model data is stored in S3. The model may be deployed to an Amazon SageMaker
        Endpoint by invoking :meth:`~sagemaker.amazon.estimator.EstimatorBase.deploy`. As well as deploying an
        Endpoint, deploy returns a :class:`~sagemaker.amazon.lda.LDAPredictor` object that can be used
        for inference calls using the trained model hosted in the SageMaker Endpoint.

        LDA Estimators can be configured by setting hyperparameters. The available hyperparameters for
        LDA are documented below.

        For further information on the AWS LDA algorithm,
        please consult the AWS technical documentation: https://docs.aws.amazon.com/sagemaker/latest/dg/lda.html

        Args:
            role (str): An AWS IAM role (either name or full ARN). The Amazon SageMaker training jobs and
                APIs that create Amazon SageMaker endpoints use this role to access
                training data and model artifacts. After the endpoint is created,
                the inference code might use the IAM role, if it accesses AWS resources.
            train_instance_type (str): Type of EC2 instance to use for training, for example, 'ml.c4.xlarge'.
            num_topics (int): The number of topics for LDA to find within the data.
            alpha0 (float): Optional. Initial guess for the concentration parameter.
            max_restarts (int): Optional. The number of restarts to perform during the Alternating Least Squares (ALS)
                spectral decomposition phase of the algorithm.
            max_iterations (int): Optional. The maximum number of iterations to perform during the
                ALS phase of the algorithm.
            tol (float): Optional. Target error tolerance for the ALS phase of the algorithm.
            **kwargs: base class keyword argument values.
        """
        # this algorithm only supports single instance training
        super(LDA, self).__init__(role, 1, train_instance_type, **kwargs)
        self.num_topics = num_topics
        self.alpha0 = alpha0
        self.max_restarts = max_restarts
        self.max_iterations = max_iterations
        self.tol = tol

    def create_model(self):
        """Return a :class:`~sagemaker.amazon.lda.LDAModel` referencing the latest
        s3 model data produced by this Estimator."""
        return LDAModel(self.model_data, self.role, sagemaker_session=self.sagemaker_session)

    def fit(self, records, mini_batch_size, **kwargs):
        # mini_batch_size is required; prevent explicit calls with None
        if mini_batch_size is None:
            raise ValueError("mini_batch_size must be set")
        super(LDA, self).fit(records, mini_batch_size, **kwargs)


class LDAPredictor(RealTimePredictor):
    """Transforms input vectors to lower-dimensional representations.

    The implementation of :meth:`~sagemaker.predictor.RealTimePredictor.predict` in this
    `RealTimePredictor` requires a numpy ``ndarray`` as input. The array should contain the
    same number of columns as the feature-dimension of the data used to fit the model this
    Predictor performs inference on.

    :meth:`predict()` returns a list of :class:`~sagemaker.amazon.record_pb2.Record` objects, one
    for each row in the input ``ndarray``. The topic mixture result is stored in the ``topic_mixture``
    key of the ``Record.label`` field."""

    def __init__(self, endpoint, sagemaker_session=None):
        super(LDAPredictor, self).__init__(endpoint, sagemaker_session, serializer=numpy_to_record_serializer(),
                                           deserializer=record_deserializer())


class LDAModel(Model):
    """Reference LDA s3 model data. Calling :meth:`~sagemaker.model.Model.deploy` creates an Endpoint and returns
    a Predictor that transforms vectors to a lower-dimensional representation."""

    def __init__(self, model_data, role, sagemaker_session=None):
        sagemaker_session = sagemaker_session or Session()
        repo = '{}:{}'.format(LDA.repo_name, LDA.repo_version)
        image = '{}/{}'.format(registry(sagemaker_session.boto_session.region_name, LDA.repo_name), repo)
        super(LDAModel, self).__init__(model_data, image, role, predictor_cls=LDAPredictor,
                                       sagemaker_session=sagemaker_session)
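For context, a minimal end-to-end usage sketch of the wrapper added in this file. The role name, topic count, and toy data below are illustrative assumptions, not part of the PR; record_set, fit, deploy, and predict follow the base-class APIs referenced in the docstring above.

import numpy as np
import sagemaker
from sagemaker import LDA

# Hypothetical session and training data: 250 documents over a 1000-word vocabulary.
session = sagemaker.Session()
train_data = np.random.rand(250, 1000).astype('float32')

lda = LDA(role='SageMakerRole', train_instance_type='ml.c4.xlarge',
          num_topics=10, sagemaker_session=session)

# record_set() serializes the array to protobuf Records, uploads them to S3,
# and returns a RecordSet pointing at the uploaded prefix.
records = lda.record_set(train_data)

# mini_batch_size is required by the fit() override above.
lda.fit(records, mini_batch_size=100)

# deploy() creates an Endpoint and returns an LDAPredictor.
predictor = lda.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')
results = predictor.predict(train_data[:5])
print(results[0].label['topic_mixture'])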
Binary file not shown.
@@ -0,0 +1,77 @@
# Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
import boto3
import numpy as np
import os
from six.moves.urllib.parse import urlparse

import sagemaker
from sagemaker import LDA, LDAModel
from sagemaker.amazon.amazon_estimator import RecordSet
from sagemaker.amazon.common import read_records
from sagemaker.utils import name_from_base, sagemaker_timestamp

from tests.integ import DATA_DIR, REGION
from tests.integ.timeout import timeout, timeout_and_delete_endpoint_by_name


def test_lda():
    with timeout(minutes=15):
        sagemaker_session = sagemaker.Session(boto_session=boto3.Session(region_name=REGION))
        data_path = os.path.join(DATA_DIR, 'lda')
        data_filename = 'nips-train_1.pbr'

        with open(os.path.join(data_path, data_filename), 'rb') as f:
            all_records = read_records(f)

        # all records share the same feature dimension
        feature_num = int(all_records[0].features['values'].float32_tensor.shape[0])

        lda = LDA(role='SageMakerRole', train_instance_type='ml.c4.xlarge', num_topics=10,
                  sagemaker_session=sagemaker_session, base_job_name='test-lda')

        record_set = _prepare_record_set_from_local_files(data_path, lda.data_location,
                                                          len(all_records), feature_num, sagemaker_session)
        lda.fit(record_set, 100)

    endpoint_name = name_from_base('lda')
    with timeout_and_delete_endpoint_by_name(endpoint_name, sagemaker_session, minutes=20):
        model = LDAModel(lda.model_data, role='SageMakerRole', sagemaker_session=sagemaker_session)
        predictor = model.deploy(1, 'ml.c4.xlarge', endpoint_name=endpoint_name)

        predict_input = np.random.rand(1, feature_num)
        result = predictor.predict(predict_input)

        assert len(result) == 1
        for record in result:
            assert record.label["topic_mixture"] is not None


def _prepare_record_set_from_local_files(dir_path, destination, num_records, feature_dim, sagemaker_session):
    """Build a :class:`~RecordSet` by pointing to local files.

    Args:
        dir_path (string): Path to local directory from where the files shall be uploaded.
        destination (string): S3 path to upload the files to.
        num_records (int): Number of records in all the files.
        feature_dim (int): Number of features in the data set.
        sagemaker_session (sagemaker.session.Session): Session object to manage interactions with Amazon SageMaker APIs.
    Returns:
        RecordSet: A RecordSet specified by S3Prefix to be used in training.
    """
    key_prefix = urlparse(destination).path
    key_prefix = key_prefix + '{}-{}'.format("testfiles", sagemaker_timestamp())
    key_prefix = key_prefix.lstrip('/')
    uploaded_location = sagemaker_session.upload_data(path=dir_path, key_prefix=key_prefix)
    return RecordSet(uploaded_location, num_records, feature_dim, s3_data_type='S3Prefix')
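As a rough illustration of what the helper above computes before uploading (the bucket name and timestamp here are made up for the example; the real values come from lda.data_location and sagemaker_timestamp()):

from six.moves.urllib.parse import urlparse

destination = 's3://example-bucket/sagemaker-record-sets/'   # hypothetical lda.data_location
key_prefix = urlparse(destination).path                      # '/sagemaker-record-sets/'
key_prefix = key_prefix + '{}-{}'.format("testfiles", '2017-12-01-00-00-00-000')
key_prefix = key_prefix.lstrip('/')                          # 'sagemaker-record-sets/testfiles-2017-12-01-00-00-00-000'
# upload_data() then copies every file under dir_path to
# s3://example-bucket/<key_prefix>/ and that uploaded prefix becomes the
# S3Prefix data location of the RecordSet consumed by lda.fit().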
Add a link to the docs for this.
It also indicates that it only supports CPU instances for training. That seems like it would be good to validate.
The training job will not fail if it is not started on a CPU instance, so I think we shouldn't fail too fast here. Will add a comment/link.
Yea, that makes sense. I misunderstood.
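If a non-fatal check along these lines is ever wanted, one possible sketch (purely illustrative; the instance-family list and message are assumptions and not part of the merged change) would be to warn rather than raise:

import logging

logger = logging.getLogger(__name__)

# Hypothetical list of CPU instance-type prefixes; not exhaustive.
_CPU_INSTANCE_FAMILIES = ('ml.m4.', 'ml.c4.')

def _warn_if_not_cpu(train_instance_type):
    # Warn only: per the discussion above, the training job does not fail
    # outright on a non-CPU instance, so failing fast would be too strict.
    if not train_instance_type.startswith(_CPU_INSTANCE_FAMILIES):
        logger.warning("LDA training is documented for CPU instances; got %s.",
                       train_instance_type)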