Commit 63abbe4

Matthew Lurie authored and jreback committed
ENH: add s3_host from env variables
closes pandas-dev#12198
1 parent c4efe3a commit 63abbe4

File tree

2 files changed (+5 −3 lines)

doc/source/whatsnew/v0.18.0.txt (+1)

@@ -193,6 +193,7 @@ Other enhancements
 - Handle truncated floats in SAS xport files (:issue:`11713`)
 - Added option to hide index in ``Series.to_string`` (:issue:`11729`)
 - ``read_excel`` now supports s3 urls of the format ``s3://bucketname/filename`` (:issue:`11447`)
+- add support for ``AWS_S3_HOST`` env variable when reading from s3 (:issue:`12198`)
 - A simple version of ``Panel.round()`` is now implemented (:issue:`11763`)
 - For Python 3.x, ``round(DataFrame)``, ``round(Series)``, ``round(Panel)`` will work (:issue:`11763`)
 - ``DataFrame`` has gained a ``_repr_latex_`` method in order to allow for automatic conversion to latex in a ipython/jupyter notebook using nbconvert. Options ``display.latex.escape`` and ``display.latex.longtable`` have been added to the configuration and are used automatically by the ``to_latex`` method. (:issue:`11778`)

pandas/io/common.py (+4 −3)

@@ -274,14 +274,15 @@ def get_filepath_or_buffer(filepath_or_buffer, encoding=None,
             import boto
         except:
             raise ImportError("boto is required to handle s3 files")
-        # Assuming AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
+        # Assuming AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_S3_HOST
         # are environment variables
         parsed_url = parse_url(filepath_or_buffer)
+        s3_host = os.environ.get('AWS_S3_HOST', 's3.amazonaws.com')

         try:
-            conn = boto.connect_s3()
+            conn = boto.connect_s3(host=s3_host)
         except boto.exception.NoAuthHandlerFound:
-            conn = boto.connect_s3(host=s3_host, anon=True)
+            conn = boto.connect_s3(host=s3_host, anon=True)

         b = conn.get_bucket(parsed_url.netloc, validate=False)
         if compat.PY2 and (compression == 'gzip' or
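The heart of the patch is the endpoint lookup: read ``AWS_S3_HOST`` from the environment and fall back to the standard ``s3.amazonaws.com`` endpoint when it is unset, then pass the result to ``boto.connect_s3`` (with ``anon=True`` when no credentials are found). A minimal sketch of that fallback pattern, using a hypothetical ``resolve_s3_host`` helper not present in the commit:

```python
import os

# Default public endpoint, matching the fallback in the patch.
DEFAULT_S3_HOST = 's3.amazonaws.com'


def resolve_s3_host(environ=os.environ):
    """Return the S3 endpoint to connect to.

    Mirrors the lookup added to get_filepath_or_buffer: the
    AWS_S3_HOST environment variable wins when set; otherwise the
    standard AWS endpoint is used.
    """
    return environ.get('AWS_S3_HOST', DEFAULT_S3_HOST)


# Without the variable set, the default AWS endpoint is used.
print(resolve_s3_host({}))
# With it set (e.g. pointing at an S3-compatible store), it takes priority.
print(resolve_s3_host({'AWS_S3_HOST': 's3.example.internal'}))
```

Because the lookup happens on every call rather than at import time, changing ``AWS_S3_HOST`` between reads takes effect immediately, which is convenient for pointing pandas at a private S3-compatible service in tests.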
