Mirror of https://github.com/ROCm/jax.git, synced 2025-04-16 03:46:06 +00:00
Update outdated comment Pmap_Cookbook
Colab now has TPU VMs, so this notebook works on Colab!
parent 32922f61e9
commit c980dc4e2a
@@ -19,7 +19,7 @@
     "\n",
     "This notebook is an introduction to writing single-program multiple-data (SPMD) programs in JAX, and executing them synchronously in parallel on multiple devices, such as multiple GPUs or multiple TPU cores. The SPMD model is useful for computations like training neural networks with synchronous gradient descent algorithms, and can be used for data-parallel as well as model-parallel computations.\n",
     "\n",
-    "**Note:** To run this notebook with any parallelism, you'll need multiple XLA devices available, e.g. with a multi-GPU machine, a Google Cloud TPU or a Kaggle TPU VM. The required features are not supported by the Google Colab TPU runtime at this time.\n",
+    "**Note:** To run this notebook with any parallelism, you'll need multiple XLA devices available, e.g. with a multi-GPU machine, a Colab TPU, a Google Cloud TPU or a Kaggle TPU VM.\n",
     "\n",
     "The code in this notebook is simple. For an example of how to use these tools to do data-parallel neural network training, check out [the SPMD MNIST example](https://github.com/google/jax/blob/main/examples/spmd_mnist_classifier_fromscratch.py) or the much more capable [Trax library](https://github.com/google/trax/)."
    ]
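
For context, a minimal sketch of the kind of SPMD program the edited notebook introduces, assuming a host with one or more XLA devices visible to JAX; the function and array names here are illustrative and not taken from the notebook itself.

import jax
import jax.numpy as jnp

# Per-device computation: normalize one slice of the batch.
def normalize(x):
    return x / jnp.sum(x)

# One leading axis entry per local device; pmap maps the function
# over that axis, running each slice on a separate device in parallel.
n_devices = jax.local_device_count()
x = jnp.arange(n_devices * 4.0).reshape(n_devices, 4)
y = jax.pmap(normalize)(x)
print(y.shape)  # (n_devices, 4)

On a single-device machine this still runs (with a leading axis of length 1); the parallelism the commit message refers to only appears when several GPUs or TPU cores are available.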