Alex Flint

Reputation: 6706

Local cache for bazel remote repos

We are using Codeship to run CI for a C++ project. Our CI build consists of a Docker image into which we install system dependencies, followed by a bazel build step that builds our tests.

Our bazel WORKSPACE file pulls in various external dependencies, such as gtest:

new_http_archive(
  name = "gtest",
  url = "https://github.com/google/googletest/archive/release-1.7.0.zip",
  build_file = "thirdparty/gtest.BUILD",
  strip_prefix = "googletest-release-1.7.0",
  sha256 = "b58cb7547a28b2c718d1e38aee18a3659c9e3ff52440297e965f5edffe34b6d0",
)

During CI builds, a lot of time is spent downloading these files. Is it possible to set up Bazel to use a local cache for these archives?

Upvotes: 2

Views: 2222

Answers (1)

hlopko

Reputation: 3270

I think Bazel already caches external repositories in the output_base (it should; if not, that's a bug worth reporting). Is it an option for you to keep the cache warm in the Docker container, e.g. by fetching the code and running `bazel fetch //...` (or some more specific target)? Note that you can also specify where Bazel's output_base lives by using `bazel --output_base=/foo build //...`. You might find this doc section relevant.
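A sketch of how the warm-cache approach might look in the CI Dockerfile (the `/var/bazel` path and the `COPY` layout are assumptions for illustration, not Bazel defaults):

```
# Dockerfile sketch: pre-fetch external repositories at image build time
# so CI builds start with a warm Bazel output base.
# /var/bazel is an assumed cache location.
COPY WORKSPACE thirdparty/ /src/
WORKDIR /src
RUN bazel --output_base=/var/bazel fetch //...

# CI builds then reuse the same output base:
#   bazel --output_base=/var/bazel build //...
```

As long as later builds pass the same `--output_base`, the archives fetched at image-build time are not re-downloaded.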

[EDIT: Our awesome Kristina comes to save the day]:

You can use --experimental_repository_cache=/path/to/some/dir
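For example, the flag could be set once in a `.bazelrc` so every command picks it up (the `/bazel-repo-cache` path is an assumption; note also that in later Bazel releases the flag was renamed `--repository_cache`):

```
# .bazelrc sketch -- /bazel-repo-cache is an assumed directory
# that should persist across CI runs (e.g. a mounted volume).
fetch --experimental_repository_cache=/bazel-repo-cache
build --experimental_repository_cache=/bazel-repo-cache
```

The cache is keyed by the archive's sha256, so entries like the gtest zip above are reused across workspaces that declare the same checksum.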

Does this help?

Upvotes: 3
