pd.read_csv()'s skip_blank_lines parameter defaults to True
I'm not sure that is a good idea.

If I write a dataframe with one column containing N null values to a CSV, without writing the index (index=False), I do not expect the result of reading that CSV back to be N rows shorter than the original.

For example, I was puzzled when I wrote a column with 1000 rows to CSV, then read that column back from the CSV to find it had only 37 rows.
joristork changed the title from "skip_blank_lines" to "pd.read_csv() skip_blank_lines defaults to True" on Feb 25, 2016

joristork changed the title from "pd.read_csv() skip_blank_lines defaults to True" to "pd.read_csv(): silently skips null values from single column CSV" on Feb 25, 2016

jorisvandenbossche changed the title from "pd.read_csv(): silently skips null values from single column CSV" to "read_csv: silently skips null values from single column CSV" on Feb 25, 2016
@jreback: I don't quite understand this issue because "null value" is extremely vague. Also, pandas generally respects null values and doesn't skip them. I would vote to close this unless an example can be provided (which hasn't happened, and this issue is almost six months old).
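For what it's worth, a sketch of why pandas does respect null values in the common case, and why this report is specific to single-column frames: with more than one column a row of NaNs still contains the delimiter, so it is never a blank line and survives the round trip even under the default skip_blank_lines=True. (The frame below is illustrative, not from the original report.)

```python
import io

import pandas as pd

# Two-column frame where every row has a NaN somewhere.
df = pd.DataFrame({"a": [1.0, None], "b": [None, 2.0]})

buf = io.StringIO()
df.to_csv(buf, index=False)  # rows like "1.0," and ",2.0" — not blank
buf.seek(0)

back = pd.read_csv(buf)
print(len(back))  # still 2 rows; NaNs are preserved
```

Only when a frame has a single column does a NaN row collapse to a genuinely empty line, which skip_blank_lines then removes.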