Mirror of https://github.com/ROCm/jax.git, synced 2025-04-18 21:06:06 +00:00
Update README.md and CHANGELOG.md.
This commit is contained in:
parent 21551c2378
commit 0fe601227f

CHANGELOG.md (30 changed lines)
@@ -2,12 +2,38 @@
 
 These are the release notes for JAX.
 
-## jax 0.1.58 (unreleased)
+## jax 0.1.59 (unreleased)
+
+## jax 0.1.58
+
+### Breaking changes
+
+* JAX has dropped Python 2 support, because Python 2 reached its end of life on
+  January 1, 2020. Please update to Python 3.5 or newer.
 
 ### New features
 
-- Forward AD of while loop (https://github.com/google/jax/pull/1980)
+* Forward-mode automatic differentiation (`jvp`) of while loop
+  (https://github.com/google/jax/pull/1980)
+* New NumPy and SciPy functions:
+  * `jax.numpy.fft.fft2`
+  * `jax.numpy.fft.ifft2`
+  * `jax.numpy.fft.rfft`
+  * `jax.numpy.fft.irfft`
+  * `jax.numpy.fft.rfft2`
+  * `jax.numpy.fft.irfft2`
+  * `jax.numpy.fft.rfftn`
+  * `jax.numpy.fft.irfftn`
+  * `jax.numpy.fft.fftfreq`
+  * `jax.numpy.fft.rfftfreq`
+  * `jax.numpy.linalg.matrix_rank`
+  * `jax.numpy.linalg.matrix_power`
+  * `jax.scipy.special.betainc`
+* Batched Cholesky decomposition on GPU now uses a more efficient batched
+  kernel.
 
 ### Notable bug fixes
 
+* With the Python 3 upgrade, JAX no longer depends on `fastcache`, which should
+  help with installation.
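
For context on the `jvp`-of-while-loop entry in the hunk above, here is a minimal sketch of what that feature enables; `repeated_halving` is a hypothetical example for illustration, not code from the PR:

```python
import jax
from jax import lax

def repeated_halving(x):
    # Halve x until it drops to 1.0 or below; the trip count
    # depends on the runtime value of the input.
    return lax.while_loop(lambda v: v > 1.0, lambda v: v / 2.0, x)

# Forward-mode AD pushes a tangent through the data-dependent loop.
# For x = 8.0 the body runs three times, so the output is x / 8.
primal, tangent = jax.jvp(repeated_halving, (8.0,), (1.0,))
print(primal, tangent)  # 1.0 0.125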
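A short sketch exercising a few of the newly listed NumPy and SciPy functions; the array values are arbitrary illustrative inputs:

```python
import jax.numpy as jnp
from jax.scipy.special import betainc

x = jnp.arange(8.0).reshape(2, 4)

# Real-input 2-D FFT and its inverse, round-tripping back to x.
freqs = jnp.fft.rfft2(x)
recovered = jnp.fft.irfft2(freqs, s=x.shape)

m = jnp.array([[2.0, 0.0],
               [0.0, 3.0]])
print(jnp.linalg.matrix_rank(m))      # 2
print(jnp.linalg.matrix_power(m, 3))  # [[ 8.  0.] [ 0. 27.]]

# Regularized incomplete beta function: the CDF of Beta(2, 3) at 0.5.
print(betainc(2.0, 3.0, 0.5))         # ~0.6875
```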
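The batched-Cholesky entry concerns how decompositions over a leading batch dimension are dispatched on GPU; user code is unchanged. A minimal sketch:

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
a = jax.random.normal(key, (16, 4, 4))
# Build a batch of symmetric positive-definite matrices.
spd = a @ jnp.swapaxes(a, -1, -2) + 4.0 * jnp.eye(4)

# jnp.linalg.cholesky maps over leading batch dimensions, as in NumPy;
# the more efficient batched GPU kernel is an implementation detail underneath.
l = jnp.linalg.cholesky(spd)
print(l.shape)  # (16, 4, 4)
```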
README.md
@@ -9,6 +9,13 @@
 | [**Install guide**](#installation)
 | [**Reference docs**](https://jax.readthedocs.io/en/latest/)
 
+## Announcements
+
+* `jax` 0.1.58 has been released. As of `jax` 0.1.58, JAX has dropped Python 2
+  support. Please update to Python 3.5 or newer.
+
 ## What is JAX?
 
 JAX is [Autograd](https://github.com/hips/autograd) and
 [XLA](https://www.tensorflow.org/xla),
 brought together for high-performance machine learning research.