* Update changelog with all changes since 0.4.1.
* Address warnings when building the docs due to malformed links.
* Remove some old references to streaming, which are no longer
  relevant.
If the ``if_exists`` argument is set to ``'append'``, the destination
dataframe will be written to the table using the defined table schema and
column types. The dataframe must contain fields (matching name and type)
currently in the destination table.
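As a minimal sketch of the append behavior described above: the project ID, table name, and column names below are hypothetical, and actually running the append requires Google Cloud credentials, so the call is wrapped in a helper function rather than executed.

```python
# Sketch of appending rows to an existing BigQuery table with pandas-gbq.
# The project, dataset, table, and column names are hypothetical examples.
import pandas as pd

# New rows to append; column names and dtypes must match the fields
# already defined in the destination table.
df = pd.DataFrame(
    {
        "name": ["alice", "bob"],
        "count": [10, 20],
    }
)

def append_rows(frame: pd.DataFrame) -> None:
    """Append ``frame`` to a BigQuery table (requires credentials)."""
    from pandas_gbq import to_gbq  # deferred so the sketch imports cleanly

    to_gbq(
        frame,
        destination_table="my_dataset.my_table",  # hypothetical table
        project_id="my-project",                  # hypothetical project
        if_exists="append",  # write using the existing table schema
    )
```

With ``if_exists='append'``, a schema mismatch between the dataframe and the destination table raises an error instead of silently altering the table.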
.. note::

   While BigQuery uses SQL-like syntax, it has some important differences
   from traditional databases both in functionality, API limitations (size
   and quantity of queries or uploads), and how Google charges for use of
   the service. You should refer to the `Google BigQuery documentation
   <https://cloud.google.com/bigquery/docs>`__ often, as the service is
   always evolving. BigQuery is best for analyzing large sets of data
   quickly, but it is not a direct replacement for a transactional database.