* Use more descriptive link texts
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Fix link target name
* Fix typos
---------
Co-authored-by: remigathoni <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
examples to the existing documentation. This can be done by following the :ref:`guidelines for contributing to the documentation <contributing.documentation>`.
doc/user-guide/dask.rst (+6 -6)
@@ -39,7 +39,7 @@ The actual computation is controlled by a multi-processing or thread pool,
 which allows Dask to take full advantage of multiple processors available on
 most modern computers.

-For more details on Dask, read `its documentation <https://docs.dask.org/>`__.
+For more details, read the `Dask documentation <https://docs.dask.org/>`__.
 Note that xarray only makes use of ``dask.array`` and ``dask.delayed``.

 .. _dask.io:
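For context (not part of the diff): a minimal sketch of what "only makes use of ``dask.array``" looks like in practice; the file and variable names are placeholders, assuming any netCDF file opened with ``chunks``::

    import xarray as xr

    # Opening with ``chunks`` wraps each variable in a dask array instead of NumPy.
    ds = xr.open_dataset("example.nc", chunks={"time": 100})  # hypothetical file
    print(type(ds["temperature"].data))  # dask.array.core.Array, not numpy.ndarray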
@@ -234,7 +234,7 @@ disk.
 .. note::

     For more on the differences between :py:meth:`~xarray.Dataset.persist` and
-    :py:meth:`~xarray.Dataset.compute` see this `Stack Overflow answer <https://stackoverflow.com/questions/41806850/dask-difference-between-client-persist-and-client-compute>`_ and the `Dask documentation <https://distributed.dask.org/en/latest/manage-computation.html#dask-collections-to-futures>`_.
+    :py:meth:`~xarray.Dataset.compute` see this `Stack Overflow answer on the differences between client persist and client compute <https://stackoverflow.com/questions/41806850/dask-difference-between-client-persist-and-client-compute>`_ and the `Dask documentation <https://distributed.dask.org/en/latest/manage-computation.html#dask-collections-to-futures>`_.

 For performance you may wish to consider chunk sizes. The correct choice of
 chunk size depends both on your data and on the operations you want to perform.
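For context (not part of the diff), a rough sketch of the ``persist`` / ``compute`` distinction referenced above; the file, variable names, and chunk sizes are assumptions for illustration only::

    import xarray as xr

    ds = xr.open_dataset("example.nc", chunks={"time": 100})  # hypothetical file
    anomaly = ds - ds.mean("time")   # lazy; nothing is computed yet

    result = anomaly.compute()       # evaluate and return NumPy-backed data
    anomaly = anomaly.persist()      # evaluate but keep the result as dask arrays in memory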
@@ -549,7 +549,7 @@ larger chunksizes.

 .. tip::

-    Check out the dask documentation on `chunks <https://docs.dask.org/en/latest/array-chunks.html>`_.
+    Check out the `dask documentation on chunks <https://docs.dask.org/en/latest/array-chunks.html>`_.


 Optimization Tips
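For context (not part of the diff), one way chunk sizes are typically steered; the chunk values and dimension names below are arbitrary placeholders, not recommendations::

    import xarray as xr

    # Chunk along time at load time; dimension names and sizes are illustrative.
    ds = xr.open_dataset("example.nc", chunks={"time": 365})

    # Existing objects can also be re-chunked after the fact.
    ds = ds.chunk({"time": 100, "lat": 50, "lon": 50})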
@@ -562,7 +562,7 @@ through experience:
 1. Do your spatial and temporal indexing (e.g. ``.sel()`` or ``.isel()``) early in the pipeline, especially before calling ``resample()`` or ``groupby()``. Grouping and resampling triggers some computation on all the blocks, which in theory should commute with indexing, but this optimization hasn't been implemented in Dask yet. (See `Dask issue #746 <https://github.com/dask/dask/issues/746>`_).

 2. More generally, ``groupby()`` is a costly operation and will perform a lot better if the ``flox`` package is installed.
-   See the `flox documentation <flox.readthedocs.io/>`_ for more. By default Xarray will use ``flox`` if installed.
+   See the `flox documentation <https://flox.readthedocs.io>`_ for more. By default Xarray will use ``flox`` if installed.

 3. Save intermediate results to disk as a netCDF files (using ``to_netcdf()``) and then load them again with ``open_dataset()`` for further computations. For example, if subtracting temporal mean from a dataset, save the temporal mean to disk before subtracting. Again, in theory, Dask should be able to do the computation in a streaming fashion, but in practice this is a fail case for the Dask scheduler, because it tries to keep every chunk of an array that it computes in memory. (See `Dask issue #874 <https://github.com/dask/dask/issues/874>`_)
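For context (not part of the diff), a short sketch of tips 1-3 above; the variable, dimension, and file names are assumptions rather than anything from the original text::

    import xarray as xr

    ds = xr.open_dataset("example.nc", chunks={"time": 365})  # hypothetical file

    # Tip 1: subset *before* grouping or resampling.
    subset = ds.sel(lat=slice(30, 60), lon=slice(-10, 40))
    monthly = subset.groupby("time.month").mean()  # uses flox automatically if installed (tip 2)

    # Tip 3: write an intermediate result (the temporal mean) to disk and read it back.
    subset.mean("time").to_netcdf("temporal_mean.nc")   # hypothetical output path
    temporal_mean = xr.open_dataset("temporal_mean.nc")
    anomaly = subset - temporal_mean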
@@ -572,6 +572,6 @@ through experience:

 6. Using the h5netcdf package by passing ``engine='h5netcdf'`` to :py:meth:`~xarray.open_mfdataset` can be quicker than the default ``engine='netcdf4'`` that uses the netCDF4 package.

-7. Some dask-specific tips may be found `here <https://docs.dask.org/en/latest/array-best-practices.html>`_.
+7. Find `best practices specific to Dask arrays in the documentation <https://docs.dask.org/en/latest/array-best-practices.html>`_.

-8. The dask `diagnostics <https://docs.dask.org/en/latest/understanding-performance.html>`_ can be useful in identifying performance bottlenecks.
+8. The `dask diagnostics <https://docs.dask.org/en/latest/understanding-performance.html>`_ can be useful in identifying performance bottlenecks.
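For context (not part of the diff), a hedged sketch of tips 6 and 8; the file glob is a placeholder and this assumes the ``h5netcdf`` package is installed::

    import xarray as xr
    from dask.diagnostics import ProgressBar

    # Tip 6: h5netcdf as the I/O engine, often quicker than the default netCDF4 engine.
    ds = xr.open_mfdataset("data/*.nc", engine="h5netcdf", parallel=True)

    # Tip 8: ProgressBar is one of the dask diagnostics for the local schedulers.
    with ProgressBar():
        result = ds.mean("time").compute()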
doc/user-guide/weather-climate.rst (+3 -3)
@@ -10,7 +10,7 @@ Weather and climate data

     import xarray as xr

-Xarray can leverage metadata that follows the `Climate and Forecast (CF) conventions`_ if present. Examples include automatic labelling of plots with descriptive names and units if proper metadata is present (see :ref:`plotting`) and support for non-standard calendars used in climate science through the ``cftime`` module (see :ref:`CFTimeIndex`). There are also a number of geosciences-focused projects that build on xarray (see :ref:`ecosystem`).
+Xarray can leverage metadata that follows the `Climate and Forecast (CF) conventions`_ if present. Examples include :ref:`automatic labelling of plots <plotting>` with descriptive names and units if proper metadata is present and support for non-standard calendars used in climate science through the ``cftime`` module (explained in the :ref:`CFTimeIndex` section). There are also a number of :ref:`geosciences-focused projects that build on xarray <ecosystem>`.

 .. _Climate and Forecast (CF) conventions: https://cfconventions.org
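For context (not part of the diff), a brief illustration of the non-standard calendar support mentioned above, assuming the ``cftime`` package is installed; the dates and calendar choice are arbitrary::

    import xarray as xr

    # Monthly timestamps on a no-leap calendar, represented with cftime objects.
    times = xr.cftime_range("2000-01-01", periods=6, freq="MS", calendar="noleap")
    da = xr.DataArray(range(6), coords={"time": times}, dims="time")
    print(da.indexes["time"])  # a CFTimeIndex, usable for indexing and resampling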
@@ -49,10 +49,10 @@ variable with the attribute, rather than with the dimensions.
 CF-compliant coordinate variables
 ---------------------------------

-`MetPy`_ adds a ``metpy`` accessor that allows accessing coordinates with appropriate CF metadata using generic names ``x``, ``y``, ``vertical`` and ``time``. There is also a `cartopy_crs` attribute that provides projection information, parsed from the appropriate CF metadata, as a `Cartopy`_ projection object. See `their documentation`_ for more information.
+`MetPy`_ adds a ``metpy`` accessor that allows accessing coordinates with appropriate CF metadata using generic names ``x``, ``y``, ``vertical`` and ``time``. There is also a `cartopy_crs` attribute that provides projection information, parsed from the appropriate CF metadata, as a `Cartopy`_ projection object. See the `metpy documentation`_ for more information.
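For context (not part of the diff), a rough sketch of the accessor usage described above, assuming MetPy is installed; the file and variable names are hypothetical, and the exact import that registers the accessor may vary between MetPy versions, so treat the metpy documentation as authoritative::

    import xarray as xr
    import metpy.xarray  # noqa: F401  -- assumed to register the ``.metpy`` accessor

    ds = xr.open_dataset("forecast.nc")        # hypothetical CF-compliant file
    temp = ds.metpy.parse_cf("Temperature")    # hypothetical variable name

    x, y = temp.metpy.x, temp.metpy.y          # generic coordinate access
    crs = temp.metpy.cartopy_crs               # Cartopy projection from CF grid-mapping metadata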