read_csv returns wrong dataframe when setting skiprows #12775
Comments
Very tricky, as you have embedded newlines within quoted fields. I guess the skip_lines logic is not accounting for the quoted fields (and is ignoring them).
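A quick way to see why this trips up a purely line-based skip (the sample data here mirrors the reproduction further down and is an assumption of this write-up):

```python
# A quote-unaware skipper counts physical lines, so a quoted field that
# spans two lines looks like two separate rows to it.
data = '1,"line 11\nline 12",2\n3,"line 21\nline 22",4\n'
print(data.splitlines())
# ['1,"line 11', 'line 12",2', '3,"line 21', 'line 22",4']
```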
This is related to #10911.
cc @mdmueller
Thank you. This problem appeared in my real project when someone gave me a large CSV file containing a text column (text from news websites). Because the file is big, I only want to read part of it using the skiprows parameter, but it doesn't work as I expect.
Using the Python engine instead of the faster C engine works for the data given above.
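A minimal sketch of that workaround, assuming the same sample data as in the code sample below (the exact data layout, header=None, and skiprows=1 are assumptions):

```python
import pandas as pd
from io import StringIO

# Same quoted-newline data as in the reproduction below (assumed layout).
data = '1,"line 11\nline 12",2\n3,"line 21\nline 22",4\n'

# The pure-Python parser tracks quoting while skipping rows, so the whole
# first record is skipped, embedded newline included.
df = pd.read_csv(StringIO(data), skiprows=1, header=None, engine='python')
print(df)
```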
Patches a bug in the C engine CSV parser in which quotation marks were not being respected in skipped rows. Closes pandas-devgh-10911. Closes pandas-devgh-12775.
Code Sample, a copy-pastable example if possible
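The original snippet was not preserved here; the following is a minimal reconstruction based on the expected output described below. The second data record, header=None, and skiprows=1 are assumptions.

```python
import pandas as pd
from io import StringIO

# First record contains a newline inside a quoted field.
data = '1,"line 11\nline 12",2\n3,"line 21\nline 22",4\n'

# skiprows=1 should skip the entire first record, but the C engine skips
# only the physical line 1,"line 11 and then misparses what remains of
# the quoted field.
df = pd.read_csv(StringIO(data), skiprows=1, header=None)
print(df)
```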
Expected Output
It should skip the whole record '1,"line 11\nline 12",2', but instead it skips only '1,"line 11'.
Output of pd.show_versions():
INSTALLED VERSIONS
commit: None
python: 2.7.10.final.0
python-bits: 64
OS: Linux
OS-release: 4.2.3-300.fc23.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
pandas: 0.18.0
nose: 1.3.7
pip: 8.1.1
setuptools: 18.0.1
Cython: None
numpy: 1.11.0
scipy: 0.14.1
statsmodels: 0.6.1
xarray: None
IPython: 3.2.1
sphinx: 1.2.3
patsy: 0.4.1
dateutil: 2.5.2
pytz: 2016.3
blosc: None
bottleneck: 0.6.0
tables: 3.2.2
numexpr: 2.4.6
matplotlib: 1.4.3
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: None
httplib2: None
apiclient: None
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: 2.8
boto: None