From 0f8710a6e341f5e7b130867ce4e8d48091e12a76 Mon Sep 17 00:00:00 2001
From: Matthew Johnson
Date: Mon, 10 Dec 2018 07:13:51 -0800
Subject: [PATCH] add another link to Autograd in the readme

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 28e90f3e8..fd3f1f51c 100644
--- a/README.md
+++ b/README.md
@@ -8,7 +8,8 @@
 JAX is [Autograd](https://github.com/hips/autograd) and
 [XLA](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/compiler/xla/g3doc/overview.md),
 brought together for high-performance machine learning research.
-With its updated version of Autograd, JAX can automatically differentiate native
+With its updated version of [Autograd](https://github.com/hips/autograd),
+JAX can automatically differentiate native
 Python and NumPy functions. It can differentiate through loops, branches,
 recursion, and closures, and it can take derivatives of derivatives of
 derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
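The README paragraph this hunk edits describes differentiating native Python control flow and taking derivatives of derivatives. A minimal sketch of that behavior, assuming a working `jax` installation (the function `f` here is illustrative, not from the README):

```python
import jax

def f(x):
    # Native Python control flow: a branch on the input value.
    # jax.grad traces with concrete values, so plain `if` works here
    # (unlike under jit, which requires structured control flow).
    if x > 0.0:
        return x ** 3
    return -x

df = jax.grad(f)    # reverse-mode differentiation (a.k.a. backpropagation)
ddf = jax.grad(df)  # grad composes: a derivative of a derivative

print(df(2.0))   # d/dx x^3 at 2.0 -> 3 * 2^2 = 12.0
print(ddf(2.0))  # d^2/dx^2 x^3 at 2.0 -> 6 * 2 = 12.0
print(df(-3.0))  # negative branch, d/dx (-x) -> -1.0
```

Composing `jax.grad` with itself as above is what the README means by "derivatives of derivatives of derivatives": each application returns another differentiable Python function.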