diff --git a/README.md b/README.md
index c99d3db10..26c0797db 100644
--- a/README.md
+++ b/README.md
@@ -390,6 +390,7 @@ Some standouts:
 | Google TPU | yes | n/a | n/a | n/a | n/a | n/a |
 | AMD GPU | yes | no | experimental | n/a | no | no |
 | Apple GPU | n/a | no | n/a | experimental | n/a | n/a |
+| Intel GPU | experimental | n/a | n/a | n/a | no | no |
 
 
 ### Instructions
@@ -401,6 +402,7 @@ Some standouts:
 | Google TPU | `pip install -U "jax[tpu]" -f https://storage.googleapis.com/jax-releases/libtpu_releases.html` |
 | AMD GPU (Linux) | Use [Docker](https://hub.docker.com/r/rocm/jax-community/tags), [pre-built wheels](https://github.com/ROCm/jax/releases), or [build from source](https://jax.readthedocs.io/en/latest/developer.html#additional-notes-for-building-a-rocm-jaxlib-for-amd-gpus). |
 | Mac GPU | Follow [Apple's instructions](https://developer.apple.com/metal/jax/). |
+| Intel GPU | Follow [Intel's instructions](https://github.com/intel/intel-extension-for-openxla/blob/main/docs/acc_jax.md). |
 
 See [the documentation](https://jax.readthedocs.io/en/latest/installation.html)
 for information on alternative installation strategies. These include compiling
diff --git a/docs/installation.md b/docs/installation.md
index 5b8893628..7cf649557 100644
--- a/docs/installation.md
+++ b/docs/installation.md
@@ -35,6 +35,7 @@ The table below shows all supported platforms and installation options. Check if
 | Google Cloud TPU | {ref}`yes ` | n/a | n/a | n/a | n/a | n/a |
 | AMD GPU | {ref}`experimental ` | no | {ref}`experimental ` | n/a | no | no |
 | Apple GPU | n/a | no | n/a | {ref}`experimental ` | n/a | n/a |
+| Intel GPU | {ref}`experimental <install-intel-gpu>`| n/a | n/a | n/a | no | no |
 
 
 (install-cpu)=
@@ -230,6 +231,17 @@ JAX has experimental ROCm support. There are two ways to install JAX:
 * Use [AMD's Docker container](https://hub.docker.com/r/rocm/jax); or
 * Build from source (refer to {ref}`building-from-source` — a section called _Additional notes for building a ROCM `jaxlib` for AMD GPUs_).
 
+(install-intel-gpu)=
+## Intel GPU
+
+Intel provides an experimental OneAPI plugin, `intel-extension-for-openxla`, for Intel GPU hardware. For more details and installation instructions, refer to one of the following two methods:
+1. Pip installation: follow [JAX acceleration on Intel GPU](https://github.com/intel/intel-extension-for-openxla/blob/main/docs/acc_jax.md).
+2. Docker: use [Intel's XLA Docker container](https://hub.docker.com/r/intel/intel-optimized-xla).
+
+Please report any issues related to:
+* JAX: [JAX issue tracker](https://github.com/jax-ml/jax/issues).
+* Intel's OpenXLA plugin: [Intel-extension-for-openxla issue tracker](https://github.com/intel/intel-extension-for-openxla/issues).
+
 ## Conda (community-supported)
 
 ### Conda installation
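
As a follow-up to the installation methods documented above, here is a minimal sketch of how one might confirm that JAX sees an Intel GPU once the plugin is installed. The `xpu` platform name is an assumption about how `intel-extension-for-openxla` registers its devices and is not stated in this change; adjust the check to whatever platform name your installed plugin reports.

```python
# Rough sanity check after installing JAX plus the Intel OpenXLA plugin
# (via pip or the Docker container described above).
# Assumption: the plugin registers Intel GPUs under the "xpu" platform name.
import jax
import jax.numpy as jnp

# List visible devices; with the plugin loaded, Intel GPU devices should appear here.
print(jax.devices())

# Run a small jit-compiled computation to exercise the default backend.
x = jnp.arange(8.0)
print(jax.jit(lambda v: (v * 2.0).sum())(x))
```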