We don't have a formal support policy for scipy versions, but scipy 1.5 dates from around the same time as the oldest NumPy release (1.20) that NEP-29 would have us support.
This is being used in the following ways in this CL:
* To dump IR, you can now pass paths prefixed with `gs://` or `cns`, and the HLO will be dumped to those paths (see the sketch after this list).
* Removing the TF dep from gda serialization.
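A hedged sketch of the IR-dump path in action, assuming the standard `XLA_FLAGS=--xla_dump_to=...` mechanism is how the dump destination is chosen (the bucket name is illustrative):
```python
import os

# Set the dump destination before importing jax so XLA picks it up.
# `gs://my-bucket/hlo-dumps` is an illustrative path; per this CL,
# `cns` paths work analogously.
os.environ["XLA_FLAGS"] = "--xla_dump_to=gs://my-bucket/hlo-dumps"

import jax
import jax.numpy as jnp

# Compiling anything now dumps its HLO to the configured bucket.
jax.jit(lambda x: x + 1)(jnp.ones(3))
```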
PiperOrigin-RevId: 452117007
--
391dea76bc8fe264cf26ec93d42147f87847894d by Peter Hawkins <phawkins@google.com>:
Update version numbers after jax/jaxlib 0.3.7 release.
COPYBARA_INTEGRATE_REVIEW=https://github.com/google/jax/pull/10324 from hawkinsp:jaxlib 391dea76bc8fe264cf26ec93d42147f87847894d
PiperOrigin-RevId: 442311051
Over time JAX has sprouted many variants of XLA translation rules, each with slightly different but overlapping arguments. This change consolidates them into a new xla.TranslationRule signature:
`rule(ctx, avals_in, avals_out, *args, **params)`
where ctx contains the parts of the other signatures that were typically not specific to a particular equation.
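As a hedged sketch of a rule written against this signature: `my_prim` below is a hypothetical primitive, and several details assumed here (that `ctx` exposes the XLA builder, that rules return one op per output aval, and the `xla.register_translation` helper) go beyond what this message states.
```python
import numpy as np
from jax import core
from jax.interpreters import xla
from jax.lib import xla_client as xc

my_prim = core.Primitive("my_prim")  # hypothetical primitive

def _my_prim_translation(ctx, avals_in, avals_out, x, *, scale):
    # `x` is the XLA op for the sole input; build scale * x using the
    # builder assumed to be carried on ctx.
    const = xc.ops.Constant(ctx.builder, np.float32(scale))
    return [xc.ops.Mul(x, const)]  # one op per output aval

xla.register_translation(my_prim, _my_prim_translation)
```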
Since there are many JAX rules to migrate, and even a number of translation rules belonging to projects downstream of JAX, we leave backwards compatibility shims around `xla.translations`, `xla.backend_specific_translations`, and `xla.call_translations` which seem to be the only ones used outside JAX itself.
In passing, this change alters the semantics of `backend` arguments to nested `jit` blocks. We now always canonicalize the backend to a specific backend at the outermost `jit`, and do not complain if an inner `jit` has an explicit `backend` that matches the current default choice.
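For example (a minimal sketch; `backend=` on `jit` is the existing per-call backend override):
```python
import functools
import jax

@functools.partial(jax.jit, backend="cpu")
def outer(x):
    # Under the new semantics, an inner backend that matches the
    # canonicalized outer choice is accepted rather than raising.
    inner = jax.jit(lambda y: y + 1, backend="cpu")
    return inner(x) * 2

print(outer(1.0))  # 4.0
```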
PiperOrigin-RevId: 403607667
1. In cloud_tpu_init.py, check whether we're on a Cloud TPU VM by
looking for the libtpu Python package, instead of /lib/libtpu.so
(which isn't necessarily present in a Docker container). JAX now
relies on the libtpu package instead of the system libtpu.so, so
this makes more sense either way. This means we'll try/catch an
ImportError in all non-TPU environments when importing jax, which
hopefully isn't noticeably slow. (See the sketch after this list.)
2. Add requests as a jax[tpu] dependency, since it's needed by
cloud_tpu_init.py. This comes pre-installed on Cloud TPU VMs, but
may not be installed in Docker containers, virtualenvs, etc.
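A minimal sketch of the check described in (1); the real logic lives in jax's cloud_tpu_init.py and may differ in detail:
```python
def running_on_cloud_tpu_vm() -> bool:
    # The libtpu module is installed by the libtpu-nightly package and
    # is only expected to be present on Cloud TPU VMs.
    try:
        import libtpu  # noqa: F401
    except ImportError:
        return False
    return True
```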
I manually tested by creating the following Dockerfile on a Cloud TPU VM:
```
FROM ubuntu:18.04
# git to fetch the branch under test, pip to install JAX.
RUN apt update && apt install git python3-pip -y
RUN git clone https://github.com/skye/jax && cd jax && git checkout tpu_docker
WORKDIR jax
RUN python3 -m pip install --upgrade pip
# Install the checkout with the [tpu] extra; -f points pip at the index
# that serves libtpu wheels.
RUN python3 -m pip install .[tpu] -f https://storage.googleapis.com/jax-releases/libtpu_releases.html
# Smoke test: print the number of visible TPU devices.
CMD ["python3", "-c", "import jax; print(jax.device_count())"]
```
And then running the following commands:
```
$ sudo docker build -t jax-test .
$ sudo docker run --privileged jax-test
8
```
Note the `--privileged` flag is necessary to let the container access
the TPU devices in /dev.
* Updates jax_releases.html index to include libtpu wheels
* Change [tpu] extras to specify `libtpu-nightly` instead of a wheel URL (sketched below)
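An illustrative sketch (not the actual setup.py) of what the `[tpu]` extra looks like after this change:
```python
from setuptools import setup

setup(
    name="jax",
    # Illustrative: the [tpu] extra now names the libtpu-nightly package
    # (resolved via the -f wheel index) instead of a hard-coded wheel URL.
    extras_require={
        "tpu": ["libtpu-nightly"],
    },
)
```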
The full install command will now be:
`pip install jax[tpu] -f https://storage.googleapis.com/jax-releases/jax_releases.html`
(similar to the CUDA install commands)
I've already pushed an updated jax_releases.html to the jax-releases GCS bucket.
This can be used on Cloud TPU VMs to automatically install compatible
versions of jax, jaxlib, and libtpu (the low-level library JAX uses to
access the TPU on Cloud TPU VMs).
The new install command requires a new jax release (`>=0.2.15`) and a new
jaxlib release (`>=0.1.68`) to work, since it needs both cdfbd9dde1 and
ce2bc24996 to pick up the pip-installed libtpu. I'll update the README and
Cloud TPU VM documentation once these releases are out.