ENH: SAS7BDAT reader #12015

Closed · wants to merge 1 commit
19 changes: 19 additions & 0 deletions LICENSES/SAS7BDAT_LICENSE
@@ -0,0 +1,19 @@
Copyright (c) 2015 Jared Hobbs

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
20 changes: 19 additions & 1 deletion asv_bench/benchmarks/packers.py
@@ -318,6 +318,24 @@ def remove(self, f):
pass


class packers_read_sas7bdat(object):

def setup(self):
self.f = 'data/test1.sas7bdat'

def time_packers_read_sas7bdat(self):
pd.read_sas(self.f, format='sas7bdat')


class packers_read_xport(object):

def setup(self):
self.f = 'data/paxraw_d_short.xpt'

def time_packers_read_xport(self):
pd.read_sas(self.f, format='xport')


class packers_write_csv(object):
goal_time = 0.2

@@ -854,4 +872,4 @@ def remove(self, f):
try:
os.remove(self.f)
except:
pass
pass
29 changes: 16 additions & 13 deletions doc/source/io.rst
@@ -4564,24 +4564,25 @@ easy conversion to and from pandas.

.. _io.sas_reader:

SAS Format
----------
SAS Formats
-----------

.. versionadded:: 0.17.0

The top-level function :func:`read_sas` currently can read (but
not write) SAS xport (.XPT) format files. Pandas cannot currently
handle SAS7BDAT files.
The top-level function :func:`read_sas` can read (but not write) SAS
`xport` (.XPT) and `SAS7BDAT` (.sas7bdat) format files.

XPORT files only contain two value types: ASCII text and double
precision numeric values. There is no automatic type conversion to
integers, dates, or categoricals. By default the whole file is read
and returned as a ``DataFrame``.
SAS files only contain two value types: ASCII text and floating point
values (usually 8 bytes but sometimes truncated). For xport files,

[Review comment from a Contributor] double backtick around xport and SAS7BDAT so they stand out a bit

there is no automatic type conversion to integers, dates, or
categoricals. For SAS7BDAT files, the format codes may allow date
variables to be automatically converted to dates. By default the
whole file is read and returned as a ``DataFrame``.

Specify a ``chunksize`` or use ``iterator=True`` to obtain an
``XportReader`` object for incrementally reading the file. The
``XportReader`` object also has attributes that contain additional
information about the file and its variables.
Specify a ``chunksize`` or use ``iterator=True`` to obtain reader
objects (``XportReader`` or ``SAS7BDATReader``) for incrementally
reading the file. The reader objects also have attributes that
contain additional information about the file and its variables.
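
For orientation, a minimal sketch of the two reading modes this paragraph describes. The file name example.sas7bdat is only a placeholder, and the reader.read(nrows) call assumes the new SAS7BDATReader exposes the same read method as the existing XportReader; this is an illustrative sketch, not an example taken from the PR itself.

import pandas as pd

# Whole-file read: returns a single DataFrame.
df = pd.read_sas('example.sas7bdat', format='sas7bdat')

# Incremental read: passing chunksize (or iterator=True) returns a reader
# object instead of a DataFrame; read(nrows) pulls the next block of rows.
reader = pd.read_sas('example.sas7bdat', format='sas7bdat', chunksize=1000)
first_rows = reader.read(1000)  # DataFrame with (up to) the first 1000 rows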

Read a SAS XPORT file:

@@ -4602,6 +4603,8 @@ web site.

.. _specification: https://support.sas.com/techsup/technote/ts140.pdf

No official documentation is available for the SAS7BDAT format.

.. _io.perf:

Performance Considerations
7 changes: 7 additions & 0 deletions doc/source/whatsnew/v0.18.0.txt
@@ -403,6 +403,13 @@ For example, if you have a jupyter notebook you plan to convert to latex using n
Options ``display.latex.escape`` and ``display.latex.longtable`` have also been added to the configuration and are used automatically by the ``to_latex``
method. See the :ref:`options documentation<options>` for more info.

SAS7BDAT files
^^^^^^^^^^^^^^

Pandas can now read SAS7BDAT files, including compressed files. The

[Review comment from a Contributor] add the issue number here

files can be read in their entirety, or incrementally. For full details see
:ref:`here <io.sas>`. (:issue:`4052`)

.. _whatsnew_0180.enhancements.other:

Other enhancements
2 changes: 1 addition & 1 deletion pandas/io/api.py
@@ -11,7 +11,7 @@
from pandas.io.json import read_json
from pandas.io.html import read_html
from pandas.io.sql import read_sql, read_sql_table, read_sql_query
from pandas.io.sas import read_sas
from pandas.io.sas.sasreader import read_sas
from pandas.io.stata import read_stata
from pandas.io.pickle import read_pickle, to_pickle
from pandas.io.packers import read_msgpack, to_msgpack
Empty file added pandas/io/sas/__init__.py