Mirror of https://github.com/ROCm/jax.git, synced 2025-04-15 19:36:06 +00:00
DOC: update JAX quick description
parent 57e34e1a2c
commit 72b2321750
@@ -17,9 +17,8 @@
 
 ## What is JAX?
 
-JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla),
-brought together for high-performance numerical computing, including
-large-scale machine learning research.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.
 
 With its updated version of [Autograd](https://github.com/hips/autograd),
 JAX can automatically differentiate native
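For context on the wording in the hunk above: "automatically differentiate native" refers to JAX transforming plain Python functions. A minimal sketch (illustrative only, not part of this commit; assumes a standard `jax` install):

```python
# Differentiate a native Python function with jax.grad.
import jax
import jax.numpy as jnp

def tanh(x):
    # Plain Python code operating on arrays.
    y = jnp.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = jax.grad(tanh)  # program transformation: f -> df/dx
print(grad_tanh(1.0))       # ~0.4199743, i.e. 1 - tanh(1)**2
```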
@@ -1,7 +1,8 @@
 JAX: High-Performance Array Computing
 =====================================
 
-JAX is Autograd_ and XLA_, brought together for high-performance numerical computing.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.
 
 If you're looking to train neural networks, use Flax_ and start with its documentation.
 Some associated tools are Optax_ and Orbax_.
@@ -93,8 +94,6 @@ For an end-to-end transformer library built on JAX, see MaxText_.
    glossary
 
 
-.. _Autograd: https://github.com/hips/autograd
-.. _XLA: https://openxla.org/xla
 .. _Flax: https://flax.readthedocs.io/
 .. _Orbax: https://orbax.readthedocs.io/
 .. _Optax: https://optax.readthedocs.io/
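Likewise, "accelerator-oriented array computation and program transformation" in the new description covers compilation as well as differentiation. A minimal sketch (illustrative only, not part of this commit):

```python
# jax.jit compiles a Python function with XLA so it can run
# efficiently on CPU, GPU, or TPU.
import jax
import jax.numpy as jnp

@jax.jit
def selu(x, alpha=1.67, lam=1.05):
    return lam * jnp.where(x > 0, x, alpha * (jnp.exp(x) - 1))

x = jnp.arange(5.0)
print(selu(x))  # traced and compiled on first call, cached thereafter
```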