2394 Commits

Author SHA1 Message Date
Stephan Hoyer
515e07bdf2 White background 2019-08-05 13:08:17 -07:00
Stephan Hoyer
8c8222a319 Add the JAX logo to sphinx docs 2019-08-05 12:34:29 -07:00
James Bradbury
a26963fe87
Merge pull request #1106 from google/jb/bool-reduction
fix jax.numpy reduction init_val for bools
2019-08-05 10:45:17 -07:00
Matthew Johnson
498ff8e54f
Merge pull request #1111 from cclauss/master
Mark instances of 'long' with # noqa
2019-08-05 10:29:06 -07:00
Matthew Johnson
7b9ccea210 bump version for pypi 2019-08-04 18:37:02 -07:00
cclauss
f17f2b976f Mark instances of 'long' with # noqa 2019-08-05 00:22:41 +02:00
Peter Hawkins
cfc4df38ec
Merge pull request #1109 from hawkinsp/xlabridge
Cleanups to xla_bridge.py
2019-08-04 14:20:52 -04:00
Peter Hawkins
0ef05d7586 Cleanups to xla_bridge.py
Remove stringification of dtypes. The NumPy dtype handling bug has to do with types with different hashes comparing as equal. This does not happen when both values are np.dtype objects; it is sufficient to simply ensure we actually have an np.dtype rather than something dtype-like (e.g., a string or NumPy type object).
Remove xla_bridge.infeed_put, which is unused.
Remove xla_bridge.Shape (use xla_client.Shape instead).
Remove xla_bridge.dtype_to_etype_exact (use xla_client.dtype_to_etype instead).
Remove xla_bridge.device_put (inlined the definition into its callers).
Remove xla_bridge.make_tuple (inlined the definition into its callers).
2019-08-04 12:52:39 -04:00
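For context on the dtype cleanup above, a minimal sketch (not from the commit; the example values are illustrative) of the hash/equality mismatch that normalizing to np.dtype avoids without stringification:

    import numpy as np

    # A dtype-like value (here a string) compares equal to the real dtype...
    assert np.dtype('float32') == 'float32'
    # ...but the two hash differently, so a dict keyed on dtypes can miss.
    print(hash('float32') == hash(np.dtype('float32')))  # typically False

    # Normalizing through np.dtype(...) yields a canonical descriptor, so
    # equality and hashing agree and no string keys are needed.
    canonical = np.dtype(np.float32)
    assert canonical == np.dtype('float32')
    assert hash(canonical) == hash(np.dtype('float32'))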
Peter Hawkins
5a348107bb
Merge pull request #1108 from hawkinsp/fixes
Suppress flake8 warning from __version__ logic.
2019-08-04 12:14:18 -04:00
Peter Hawkins
8e66d29c45 Suppress flake8 warning from __version__ logic. 2019-08-04 12:12:53 -04:00
Peter Hawkins
d344635042
Merge pull request #1107 from hawkinsp/fixes
Fix flake8 warnings, including a few real bugs.
2019-08-04 11:32:47 -04:00
Peter Hawkins
e71474d84c Fix flake8 warnings, including a few real bugs.
Exceptions: long (Python 2 compat) and __version__ (flake8 doesn't understand eval).

Fixes #317.
2019-08-04 10:25:48 -04:00
James Bradbury
72c0dcf1cf
Merge pull request #1102 from georgedahl/fix
Fix pack_optimizer_state to correctly use tuples everywhere in the pa…
2019-08-03 21:40:33 -07:00
James Bradbury
d0c9f45349 fix jax.numpy reduction init_val for bools 2019-08-03 21:27:06 -07:00
George Dahl
444aced791 Fix pack_optimizer_state to correctly use tuples everywhere in the packed state and add a unit test to check round trip unpack/pack. 2019-08-02 16:42:17 -07:00
Matthew Johnson
fd4b84bd95 Merge branch 'master' of github.com:google/jax 2019-08-02 11:26:49 -07:00
Matthew Johnson
3168006f4a fix np.var dtype bug 2019-08-02 11:26:17 -07:00
Peter Hawkins
f8fa1a60f4 Disable gradient test for entr. 2019-08-02 14:17:31 -04:00
Peter Hawkins
6bc476261b More build formatting fixes. 2019-08-02 13:32:14 -04:00
Peter Hawkins
e0b31ac310 Build formatting fixes. 2019-08-02 13:29:52 -04:00
Peter Hawkins
e2d6d3acc9
Merge pull request #1099 from hawkinsp/cusolver
Add support for linear algebra ops on GPU using Cusolver:
2019-08-02 11:53:37 -04:00
Peter Hawkins
99735958e2 Remove commented lines from test. 2019-08-02 11:48:12 -04:00
Peter Hawkins
ed3e2308c1 Add support for linear algebra ops on GPU using Cusolver:
* LU decomposition
* Symmetric (Hermitian) eigendecomposition
* Singular value decomposition.

Make the LU decomposition tests less sensitive to the exact decomposition: check that the result is a valid decomposition, not precisely the same one scipy returns.
2019-08-02 11:16:15 -04:00
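A hedged sketch of the reconstruction-style check described in the commit above (the use of jax.scipy.linalg.lu and the tolerance are illustrative, not the exact test code):

    import numpy as np
    from jax.scipy.linalg import lu

    # Verify the LU property p @ l @ u == a instead of comparing the factors
    # elementwise against scipy's output, which may pivot differently.
    a = np.random.RandomState(0).randn(5, 5).astype(np.float32)
    p, l, u = lu(a)
    np.testing.assert_allclose(p @ l @ u, a, atol=1e-4)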
Peter Hawkins
7c060435bb
Merge pull request #1098 from hawkinsp/fixes
Fix test breakage at head.
2019-08-02 08:56:07 -04:00
Peter Hawkins
b45d1ec1dd Fix test breakage at head.
Add new numpy/scipy functions to documentation.
2019-08-02 08:55:22 -04:00
Matthew Johnson
1dfdd8dafe
Merge pull request #1094 from fehiepsi/mvngamma
Add sp.multigammaln, sp.entr
2019-08-01 20:59:19 -07:00
Matthew Johnson
1f3b4ae97e
Merge pull request #1091 from fehiepsi/tril
expose tril_indices, triu_indices similar to diag_indices
2019-08-01 20:58:22 -07:00
Matthew Johnson
7673285202
Merge pull request #1089 from fehiepsi/nonlinear
Add some nonlinearities: stax.Elu, stax.LeakyRelu
2019-08-01 20:58:06 -07:00
Matthew Johnson
fd98f957a9
Merge pull request #1088 from fehiepsi/median
Add numpy.median and support ddof for numpy.var
2019-08-01 20:57:28 -07:00
Matthew Johnson
e597b22240
Merge pull request #1086 from fehiepsi/sort
Add batching rule for lax.sort
2019-08-01 20:56:19 -07:00
Matthew Johnson
db8116d05d
Merge pull request #1083 from j-towns/patch-1
Correct jax.numpy.pad signature
2019-08-01 20:51:15 -07:00
Matthew Johnson
0134e2e2f2
Merge pull request #1087 from ibab/master
Add rmsprop_momentum optimizer (same as TF RMSProp)
2019-08-01 20:50:52 -07:00
Peter Hawkins
1f267cf6d4
Merge pull request #1096 from hawkinsp/fixes
Add --spawn_strategy from TensorFlow configuration.
2019-08-01 22:06:07 -04:00
Peter Hawkins
40c517e9ef Add --spawn_strategy from TensorFlow configuration. 2019-08-01 21:43:03 -04:00
Peter Hawkins
24d4aaf3e2
Merge pull request #1095 from hawkinsp/fixes
Fix test failures due to Numpy 1.17.
2019-08-01 21:24:35 -04:00
Peter Hawkins
021d93a3cd Fix test failures due to Numpy 1.17. 2019-08-01 21:16:05 -04:00
fehiepsi
7f4dc87a4c add multigammaln, entr 2019-08-01 19:12:03 -04:00
fehiepsi
7a5aecea31 expose tril_indices triu_indices 2019-08-01 17:35:36 -04:00
Peter Hawkins
dce27bfca6
Merge pull request #1090 from hawkinsp/pytree
Fix test failures.
2019-08-01 17:21:20 -04:00
Peter Hawkins
c302e38880 Fix test failures. 2019-08-01 17:20:27 -04:00
fehiepsi
5ffe2ae5dd expose sigmoid too 2019-08-01 17:11:31 -04:00
Peter Hawkins
129af6a5f8
Merge pull request #1085 from hawkinsp/pytree
Make the C++ version of tree_multimap accept tree suffixes of the primary tree.
2019-08-01 16:50:20 -04:00
Peter Hawkins
cb53ca876f Address review comments. 2019-08-01 16:48:18 -04:00
fehiepsi
688d77f432 better test str message 2019-08-01 16:41:06 -04:00
fehiepsi
2836d03b5e add some nonlinearity 2019-08-01 16:39:08 -04:00
fehiepsi
45c5bd4fba support ddof for var 2019-08-01 16:20:08 -04:00
Igor Babuschkin
628f87d365 Add rmsprop_momentum optimizer (same as TF RMSProp)
The TensorFlow RMSProp optimizer supports an additional momentum
parameter, which allows adding momentum to the RMSProp update.
Having momentum requires keeping around additional state, which might
not be desirable when using the standard RMSProp optimizer, so I've
created an additional optimizer for this case.
RMSProp with momentum can be necessary to reproduce some research papers.
2019-08-01 20:15:59 +01:00
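A minimal sketch of the update rule the commit above describes (hyperparameter names and defaults are illustrative, not necessarily the optimizer's exact API):

    import jax.numpy as jnp

    def rmsprop_momentum_step(x, g, avg, mom,
                              lr=1e-3, gamma=0.9, momentum=0.9, eps=1e-8):
      # Squared-gradient moving average, as in plain RMSProp.
      avg = gamma * avg + (1. - gamma) * jnp.square(g)
      # Extra momentum state layered on top of the RMSProp-scaled step.
      mom = momentum * mom + lr * g / jnp.sqrt(avg + eps)
      return x - mom, avg, mom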
fehiepsi
98152d9d07 add numpy.median 2019-08-01 14:19:41 -04:00
Peter Hawkins
c41677fac7
Merge pull request #1073 from hawkinsp/deviceget
Avoid building an identity computation in jax.device_get().
2019-08-01 12:47:04 -04:00
fehiepsi
1b490fb5e0 Merge remote-tracking branch 'upstream/master' into sort 2019-08-01 12:39:53 -04:00