37 Commits

Author SHA1 Message Date
Jake VanderPlas
ba83ab5d3b Add expanded contributing documentation 2021-04-27 15:00:21 -07:00
Peter Hawkins
aafe8870ae Document that JAX follows the NEP-29 deprecation policy.
Remove the "experimental" disclaimer from the concurrency documentation.
2021-04-21 11:12:41 -04:00
Adam Paszke
40253f471d Make xmap tutorial public
Note that xmap is still experimental, so treat this more as a preview
than as documentation.
2021-04-08 15:56:58 +00:00
Jake VanderPlas
749ad95514 DOC: add transformations doc to HTML & reorganize contents 2021-03-08 16:25:04 -08:00
Roy Frostig
9c420653c3 move changelog to top level 2021-03-08 10:44:52 -08:00
Jake VanderPlas
12c84e7a50 Add jax.errors submodule & error troubleshooting docs 2021-03-03 12:39:12 -08:00
Jake VanderPlas
8f5038d4b9 DOC: add initial JAX glossary 2021-03-01 17:56:43 -08:00
Jake VanderPlas
5396f04c65 DOC: add first JAX-101 notebook 2021-02-26 09:22:34 -08:00
Tom Hennigan
090dd2117d Add autodidax to dev docs toctree and ignore md file to silence warning.
Fixes CI failure from google/jax#5827.
2021-02-25 10:50:10 +00:00
Jake VanderPlas
4c83723d11 DOC: link to RTD version of neural network notebook 2021-02-22 09:19:41 -08:00
Jake VanderPlas
236ba14585 move convolutions from common gotchas to its own file 2021-02-16 17:21:56 -08:00
Jake VanderPlas
d85d204897 DOC: change build from nbsphinx to myst 2021-02-16 10:28:39 -08:00
Jake VanderPlas
94484d85aa Migrate CHANGELOG.rst -> CHANGELOG.md 2021-02-12 17:03:53 -08:00
Jake VanderPlas
5d16ab03b5 Minor doc formatting fixes 2021-01-29 16:43:27 -08:00
Jake VanderPlas
967f3ac7a3 Add thinking_in_jax.ipynb 2021-01-26 12:08:37 -08:00
Jake VanderPlas
cfe934c053 Fix some doc build warnings 2021-01-25 14:08:57 -08:00
Peter Hawkins
5116fd47aa Add a heap profiler API and document it. (#3576) 2020-06-26 17:09:09 -04:00
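The heap profiler referenced in this commit is what the JAX docs now call the device memory profiler. A minimal sketch, assuming the current `jax.profiler.save_device_memory_profile` API (which may differ in name from the API added in this commit):

```
import jax
import jax.numpy as jnp
import jax.profiler

# Allocate some device buffers so the profile has something to report.
x = jnp.ones((1024, 1024))
y = jnp.dot(x, x).block_until_ready()

# Write a pprof-format snapshot of live device memory to disk;
# inspect it later with the pprof tool.
jax.profiler.save_device_memory_profile("memory.prof")
```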
Skye Wanderman-Milne
66ba734882 Add note to docs describing how pytree arguments work. (#3284)
Addresses #3095. I'm not sure whether we want to link to this from the API
docstrings.

This also subsumes the original pytrees notebook (see the sketch below).
2020-06-03 09:46:00 -07:00
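The sketch referenced above: a rough, hypothetical illustration (the function and parameter names are not taken from the docs) of how pytree arguments, i.e. nested lists, tuples, and dicts of arrays, can be passed to transformed functions.

```
import jax
import jax.numpy as jnp

# Parameters packaged as a pytree (here, a dict of arrays).
params = {"w": jnp.ones((3, 3)), "b": jnp.zeros(3)}

@jax.jit
def predict(params, x):
    # JAX flattens the dict into its array leaves before tracing.
    return jnp.dot(x, params["w"]) + params["b"]

x = jnp.arange(3.0)
print(predict(params, x))

# Transformations return outputs with the same pytree structure:
grads = jax.grad(lambda p: predict(p, x).sum())(params)
print(jax.tree_util.tree_map(jnp.shape, grads))
```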
George Necula
6f2f779a3d Started a FAQ for JAX 2020-03-25 11:55:24 +02:00
Matthew Johnson
7e480fa923 add custom_jvp / vjp, delete custom_transforms 2020-03-21 22:08:03 -07:00
George Necula
89514f9278 Moved CHANGELOG to docs (#2252)
* Moved CHANGELOG to docs
* Moved CHANGELOG to docs

This also puts the changelog on RTD, with a TOC.
Also changed its format to .rst, for consistency.
Added GitHub links to the change log.

* Actually add the CHANGELOG.rst

* Added reminder comments to the CHANGELOG.rst
2020-02-23 19:18:06 +01:00
George Necula
a5c3468c93 Added the first draft of the Jaxpr documentation.
This replaces the previous Google Doc version, and is now
updated with the latest changes in Jaxpr.
2020-02-12 13:01:43 +01:00
Peter Hawkins
d958f3007d Change JAX type promotion to prefer inexact types. (#1815)
Change the JAX type promotion table to prefer inexact types during type promotion.

NumPy's type promotion rules tend to promote aggressively to float64, which is not accelerator-friendly: not all accelerators (e.g., TPUs) support 64-bit floating point types, and even on accelerators that do support them (e.g., GPUs), promotion to a 64-bit type comes with a significant performance cost.

This change makes JAX type promotion between inexact and exact types closer to PyTorch's promotion semantics, which are a better fit for modern accelerators. For example:

```
import numpy as onp
from jax import numpy as np

In [1]: onp.promote_types(onp.float32, onp.int32)   
Out[1]: dtype('float64')

In [2]: onp.promote_types(onp.float16, onp.int64)   
Out[2]: dtype('float64')

In [3]: np.promote_types(onp.float32, onp.int32)    
Out[3]: dtype('float32')

In [4]: np.promote_types(onp.float16, onp.int64)    
Out[4]: dtype('float16')
```

This change is in preparation for enabling x64 mode by default on all platforms.
2019-12-05 10:57:23 -05:00
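As a follow-up to the commit above, here is an array-level illustration of the same semantics under current jax.numpy behavior (not part of the original commit message):

```
import numpy as onp
from jax import numpy as np

a = onp.arange(3, dtype=onp.int32)   # exact (integer) operand
b = np.ones(3, dtype=np.float32)     # inexact (float) operand

# NumPy widens the mixed operation to float64 ...
print((onp.asarray(b) + a).dtype)    # float64
# ... while JAX keeps the 32-bit float width.
print((b + a).dtype)                 # float32
```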
George Necula
4e89d43a75 Added JAX pytrees notebook
Also added docstrings to the tree_util module.
2019-11-24 20:29:07 +01:00
Sharad Vikram
5d56999913 Add custom interpreter notebook 2019-10-28 13:58:55 -07:00
George Necula
0ffcd769ef Add sklearn to Travis, for documentation building. (#1547)
* Add sklearn to Travis, for documentation building.
* Add score_matching to auto-built notebooks
2019-10-21 23:24:16 +02:00
George Necula
eae59d0b2c Moved all notebooks to docs/notebooks. (#1493)
* Moved all notebooks to docs/notebooks.

Now all notebooks are in the same place, so all are subject
to auto-doc generation at readthedocs.io and to automated testing
with Travis.

Some notebooks are too slow; they are excluded via docs/conf.py:exclude_patterns.

Cleaned up the section headings in the notebooks a bit so that they render
well on readthedocs.io.

* Increase the cell timeout for executing notebooks
* Also exclude the neural network notebook from auto-generation (it times out)
* Disable the score_matching notebook in auto-doc (Travis does not have sklearn)
2019-10-17 08:58:25 +02:00
George Necula
e42c010605 Create developer documentation.
* Moved some developer-only content out of README.md to docs/developer.rst.
* Added documentation about building the documentation.
2019-10-09 17:24:01 +02:00
George Necula
75c2236063 Addressed comments for the Colab.
* Cleaned up use of section levels
* Renamed ma to multiply_add and sq_add to square_add
* Other minor clarifications
* Separated the Colabs into Tutorials and Advanced Tutorials
2019-10-03 11:20:04 +02:00
George Necula
454320e9c9 Added How_JAX_primitives_work colab 2019-10-02 14:42:01 +02:00
Stephan Hoyer
9bd7330e1f Notebooks on RTD (#1121) 2019-09-30 11:00:02 -07:00
Matthew Johnson
3c2a73592c improve rank promotion warning, add doc page 2019-08-25 14:28:53 -07:00
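The warning mentioned in this commit is controlled by a config option; a minimal sketch, assuming the current `jax_numpy_rank_promotion` flag and its "allow"/"warn"/"raise" values:

```
import jax
import jax.numpy as jnp

# Warn (or use "raise" to error) whenever implicit rank promotion occurs.
jax.config.update("jax_numpy_rank_promotion", "warn")

x = jnp.ones((3, 4))
y = jnp.ones(4)
# Broadcasting y against x promotes its rank from 1 to 2,
# which triggers the warning configured above.
print((x + y).shape)
```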
Peter Hawkins
6dc730a5f4 Make JAX tracer state thread-local, allowing traces to be performed in separate threads.
Using threading within a traced context still won't work, but that is perhaps less important than the ability to call JIT-ted computations from separate threads (see the sketch below).

(Revives https://github.com/google/jax/pull/734.)
2019-08-09 13:55:20 -04:00
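The sketch referenced above: a minimal, hypothetical example (the function, array sizes, and thread count are illustrative) of calling a JIT-ted computation from several threads.

```
import threading
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    return (jnp.sin(x) ** 2 + jnp.cos(x) ** 2).sum()

def worker(i, results):
    # Each thread can trace and execute independently because
    # the tracer state is thread-local.
    x = jax.random.normal(jax.random.PRNGKey(i), (1000,))
    results[i] = float(f(x))

results = [None] * 4
threads = [threading.Thread(target=worker, args=(i, results)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # each entry should be close to 1000.0
```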
Peter Hawkins
bcacdfe315 Add some brief documentation about how to profile/trace JAX programs. 2019-08-08 21:02:41 -04:00
Skye Wanderman-Milne
5b9849f177 Add docs on GPU memory allocation. 2019-07-30 15:49:35 -07:00
Peter Hawkins
91e6ba322c Add documentation on asynchronous dispatch. 2019-06-04 10:09:43 -04:00
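For context on the documentation added here: JAX dispatches device computations asynchronously, so benchmarks usually call `block_until_ready()`. A hedged sketch (the array sizes are illustrative):

```
import time
import jax.numpy as jnp

x = jnp.ones((2000, 2000))

start = time.perf_counter()
y = jnp.dot(x, x)            # returns quickly; the work is only enqueued
enqueue = time.perf_counter() - start

y.block_until_ready()        # wait for the computation to actually finish
total = time.perf_counter() - start

print(f"enqueue: {enqueue:.4f}s, total: {total:.4f}s")
```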
Peter Hawkins
86d8915c3d Add Sphinx-generated reference documentation for JAX. 2019-01-16 09:13:31 -05:00