Build: do not upload build.tool to production S3 #9098

Merged
merged 10 commits on Apr 14, 2022
92 changes: 28 additions & 64 deletions scripts/compile_version_upload_s3.sh
@@ -23,56 +23,20 @@
# PRODUCTION ENVIRONMENT
#
# To create a pre-compiled cached version and make it available on production,
# **the script must be run from a builder (build-default or build-large)** and
# several environment variables must be set for an IAM user with permissions
# on the ``readthedocs(inc)-build-tools-prod`` S3 bucket. Also, note that in
# production we need to install the `awscli` Python package to run the script.
# We can do this in a separate virtualenv to avoid conflicts with the
# builder's code.
#
# The whole process would be something like:
#
# ssh util01
# ssh `scaling status -s build-default -q | head -n 1`
#
# sudo su - docs
# TOOL=python
# VERSION=3.10.0
#
# cd /home/docs/checkouts/readthedocs.org/scripts
# virtualenv venv
# source venv/bin/activate
# pip install awscli==1.20.34
#
# export AWS_REGION=...
# export AWS_ACCESS_KEY_ID=...
# export AWS_SECRET_ACCESS_KEY=...
# export AWS_BUILD_TOOLS_BUCKET=readthedocs(inc)-build-tools-prod
#
# ./compile_version_upload.sh $TOOL $VERSION
#
#
# ONE-LINE COMMAND FROM UTIL01 PRODUCTION
#
# TOOL=python
# VERSION=3.10.0
# AWS_BUILD_TOOLS_BUCKET=readthedocs(inc)-build-tools-prod
#
# ssh `scaling status -s build-default -q | head -n 1` \
# "cd /home/docs && \
# sudo -u docs virtualenv --python python3 /home/docs/buildtools && \
# sudo -u docs /home/docs/buildtools/bin/pip install awscli==1.20.34 && \
# sudo -u docs env AWS_REGION=$AWS_REGION \
# AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
# AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
# AWS_BUILD_TOOLS_BUCKET=$AWS_BUILD_TOOLS_BUCKET \
# PATH=/home/docs/buildtools/bin:${PATH} \
# /home/docs/checkouts/readthedocs.org/scripts/compile_version_upload_s3.sh $TOOL $VERSION"
# we use a CircleCI job
# (https://github.com/readthedocs/readthedocs-docker-images/blob/main/.circleci/config.yml).
# It requires several environment variables to be set for an IAM user with
# permissions on the ``readthedocs(inc)-build-tools-prod`` S3 bucket. These
# variables are defined via the CircleCI UI under the `readthedocs-docker-image`
# project.
#
# Note that if, for some reason, you need to run this command *outside CircleCI*,
# you can find more information in this comment:
# https://github.com/readthedocs/readthedocs-ops/issues/1155#issuecomment-1082615972
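#
# As a rough illustration (this is not the literal CircleCI configuration),
# the job makes the IAM credentials available and invokes this script with a
# single TOOL-VERSION argument, e.g.:
#
#   AWS_ACCESS_KEY_ID=...       # defined in the CircleCI project settings
#   AWS_SECRET_ACCESS_KEY=...   # defined in the CircleCI project settings
#   ./scripts/compile_version_upload_s3.sh python-3.10.0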
#
# USAGE
#
# ./scripts/compile_version_upload.sh $TOOL $VERSION
# ./scripts/compile_version_upload.sh $TOOL-$VERSION
#
# ARGUMENTS
#
@@ -81,17 +45,18 @@
#
# EXAMPLES
#
# ./scripts/compile_version_upload.sh python 3.9.7
# ./scripts/compile_version_upload.sh nodejs 14.17.6
# ./scripts/compile_version_upload.sh python-3.9.7
# ./scripts/compile_version_upload.sh nodejs-14.17.6
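#
# In development, for example, the python-3.9.7 invocation above is expected
# to end up roughly like this (a sketch based on the defaults defined further
# below; adjust if your setup differs):
#
#   ./scripts/compile_version_upload.sh python-3.9.7
#   # -> produces ubuntu-22.04-python-3.9.7.tar.gz on the host
#   # -> uploads it to s3://build-tools on the local development object storage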

set -e # Stop on errors
set -x # Echo commands

# Define variables
SLEEP=350 # Container timeout
OS="${OS:-ubuntu-22.04}" # Docker image name
TOOL=$1
VERSION=$2

TOOL=$(echo $1 | cut -d- -f1)
VERSION=$(echo $1 | cut -d- -f2-)
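# The single TOOL-VERSION argument (e.g. "python-3.9.7") is split on the
# first "-" above: everything before it is the tool name, everything after
# it is the version (so versions that themselves contain "-" stay intact).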

# https://stackoverflow.com/questions/59895/how-can-i-get-the-source-directory-of-a-bash-script-from-within-the-script-itsel
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
@@ -138,22 +103,21 @@ docker cp $CONTAINER_ID:/home/docs/$OS-$TOOL-$VERSION.tar.gz .
# Kill the container
docker container kill $CONTAINER_ID

# Upload the .tar.gz to S3
AWS_ENDPOINT_URL="${AWS_ENDPOINT_URL:-http://localhost:9000}"
AWS_BUILD_TOOLS_BUCKET="${AWS_BUILD_TOOLS_BUCKET:-build-tools}"
AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-admin}"
AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-password}"

if [[ -z $AWS_REGION ]]
if [[ -z $CIRCLECI ]]
Contributor

I noticed this in the shell scripting on the ops side of this change as well, and in a few spots above, but just to note: when using variables in sh/bash, it's best to quote them:

https://tldp.org/LDP/abs/html/quotingvar.html

We don't expect any problematic characters in these variables, so this isn't an issue.

I've definitely been bit by unquoted variables in the past, usually around string comparisons.

Suggested change
if [[ -z $CIRCLECI ]]
if [[ -z "$CIRCLECI" ]]

then
# Development environment
# Upload the .tar.gz to the S3 development environment
echo "Uploading to dev environment"

AWS_ENDPOINT_URL="${AWS_ENDPOINT_URL:-http://localhost:9000}"
AWS_BUILD_TOOLS_BUCKET="${AWS_BUILD_TOOLS_BUCKET:-build-tools}"
AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-admin}"
AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-password}"
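# The defaults above are assumed to match the local development object storage
# (e.g. a MinIO container listening on port 9000); override them if your
# setup differs.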

aws --endpoint-url $AWS_ENDPOINT_URL s3 cp $OS-$TOOL-$VERSION.tar.gz s3://$AWS_BUILD_TOOLS_BUCKET

# Delete the .tar.gz file from the host
rm $OS-$TOOL-$VERSION.tar.gz
else
# Production environment does not require `--endpoint-url`
echo "Uploading to prod environment"
aws s3 cp $OS-$TOOL-$VERSION.tar.gz s3://$AWS_BUILD_TOOLS_BUCKET
echo "Skip uploading .tar.gz file because it's being run from inside CircleCI."
echo "It should be uploaded by orbs/aws automatically."
fi

# Delete the .tar.gz file from the host
rm $OS-$TOOL-$VERSION.tar.gz