rocm_jax/build/BUILD.bazel

# Copyright 2018 The JAX Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# JAX is Autograd and XLA
load("@bazel_skylib//rules:common_settings.bzl", "bool_flag")
load("@local_config_cuda//cuda:build_defs.bzl", "if_cuda")
load("@local_config_rocm//rocm:build_defs.bzl", "if_rocm")
load("//jaxlib:jax.bzl", "if_windows")

licenses(["notice"])  # Apache 2

package(default_visibility = ["//visibility:public"])

bool_flag(
    name = "enable_remote_tpu",
    build_setting_default = False,
)

config_setting(
    name = "remote_tpu_enabled",
    flag_values = {
        ":enable_remote_tpu": "True",
    },
)
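
# Usage sketch (added, not part of the original file): assuming this package is
# //build in the workspace, the bool_flag above can be flipped on the command
# line so the config_setting pulls the remote TPU client into the wheel, e.g.:
#   bazel run --//build:enable_remote_tpu=True //build:build_wheel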

py_binary(
    name = "build_wheel",
    srcs = ["build_wheel.py"],
    data = [
        "LICENSE.txt",
        "//jaxlib",
        "//jaxlib:README.md",
        "//jaxlib:setup.py",
        "//jaxlib:setup.cfg",
        "@org_tensorflow//tensorflow/compiler/xla/python:xla_client",
    ] + if_windows([
        "//jaxlib/mlir/_mlir_libs:jaxlib_mlir_capi.dll",
    ]) + select({
        ":remote_tpu_enabled": ["@org_tensorflow//tensorflow/compiler/xla/python/tpu_driver/client:py_tpu_client"],
        "//conditions:default": [],
    }) + if_cuda([
        "//jaxlib/cuda:cuda_gpu_support",
        "@local_config_cuda//cuda:cuda-nvvm",
    ]) + if_rocm([
        "//jaxlib/rocm:rocm_gpu_support",
    ]),
    deps = ["@bazel_tools//tools/python/runfiles"],
)
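
# Assumed invocation sketch (added, not from the original file): this target is
# normally driven by the top-level build.py helper rather than run by hand,
# roughly:
#   bazel run [--config=cuda | --config=rocm] //build:build_wheel -- <output args>
# where the arguments after "--" are parsed by build_wheel.py itself; the exact
# flags depend on that script and on the workspace's .bazelrc configs.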